Search Immortality Topics:



3 Stocks to Grab Now to Ride the Artificial Intelligence Chip Boom to Riches – InvestorPlace

Posted: April 27, 2024 at 2:41 am

Data analytics company GlobalData projects that the AI market will grow 35% annually over the next few years, reaching $909 billion by 2030. Naturally, that's made AI chip stocks extremely popular with investors.
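As a rough sanity check on that projection (purely illustrative; the article does not state GlobalData's base year or methodology), compounding at 35% a year back from 2030 implies a present-day market in the neighborhood of $150 billion:

```python
# Back-of-the-envelope check of the GlobalData projection quoted above.
# Assumption (not stated in the article): $909 billion is the 2030 figure
# and growth compounds at 35% per year from a 2024 base.
target_2030 = 909.0            # billions of dollars
annual_growth = 0.35
years = 2030 - 2024

implied_2024_base = target_2030 / (1 + annual_growth) ** years
print(f"Implied 2024 AI market size: ~${implied_2024_base:.0f} billion")
# Prints roughly $150 billion, the base a 35% CAGR would need to reach $909B by 2030.
```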

Google the words "AI chip stocks" in quotation marks, and you will get 39,600 results. AI is undoubtedly a priority subject for investors at the moment.

On April 1, Barron's reported that Microsoft (NASDAQ:MSFT) and OpenAI plan to build a $100 billion AI data center, an investment equal to Microsoft's capital spending over the past four years.

Investors are stoked because the number of AI chips required to power such a large data center would be enormous. That's good news for Nvidia (NASDAQ:NVDA) and every other major AI player.

Bank of America's Global Research analyst Vivek Arya has buy ratings on Nvidia and four other AI chip stocks. I'd normally include Nvidia in any AI-related recommendation, but I'll go with three stocks he does not mention in his article.

To make my selections, I looked at the holdings of the Horizons Global Semiconductor Index ETF, which trades on the Toronto Stock Exchange.


I'll admit that my picks aren't the most original. However, that doesn't make them any less actionable.

Taiwan Semiconductor Manufacturing (NYSE:TSM) makes the cut because of its commitment to American manufacturing. TSM is building an Arizona plant that will go live in 2025, and it plans to start making its most advanced chips there in 2028. By 2030, it plans to have three fabrication plants open in the U.S., at a cost of roughly $65 billion to get them up and running.

Now, big business gets done with some help from the federal government, which is chipping in $11.6 billion in grants and loans. That pales in comparison to the nearly $20 billion being thrown Intel's (NASDAQ:INTC) way under the CHIPS Act, which is intended to bring 20% of the world's advanced semiconductor manufacturing back to the U.S.

I've always thought globalization worked best when companies manufactured products in the country where the products are intended to be sold. Good for TSM.


As I write this, ASML Holding (NASDAQ:ASML) stock is falling. The Dutch maker of chipmaking equipment reported weaker-than-expected sales in Q1 2024.

Analysts expected revenue of 5.39 billion euros ($5.73 billion), but ASML delivered 5.29 billion euros ($5.63 billion), 2% shy of the mark. However, its net income was 1.22 billion euros ($1.30 billion), 14% higher than Wall Street's predictions.

ASML produces extreme ultraviolet lithography machines, which are used to make technologically advanced chips. Lower consumer demand for smartphones and laptops has had a knock-on effect on the company's revenues. Sales and profits were down 21.6% and 37.4%, respectively, in Q1 2024. Bookings were also down 4% year over year.

Despite the miss, ASML reiterated its 2024 revenue guidance, which calls for sales similar to 2023's. The company suggests that 2025 will be its breakout year as both TSM and Intel increase their U.S. production.

"I think by 2025 you will see all three of those coming together: new fab openings, strong secular trends and the industry in the midst of its upturn," CFO Roger Dassen said in an interview with CNBC.

ASML is a buy below $900.


Qualcomm (NASDAQ:QCOM) stock is up more than 19% year to date and more than 49% since November lows.

Qualcomm launched its AI Hub in early March. It includes over 75 popular AI and generative AI models, such as Whisper, ControlNet, Stable Diffusion and Baichuan 7B, which give developers high performance and low power consumption when creating applications.

In a February interview from the 2024 Mobile World Congress in Barcelona, Qualcomm Chief Financial Officer and Chief Operating Officer Akash Palkhiwala spoke with Yahoo Finance host Brad Smith about Qualcomm's AI Hub and its role in generative AI.

"And you could take those models, build it into an application, test it on a device, and deploy it into an application store, all in one go right at the website. So it just makes it very easy for the developers to take advantage of the hardware that we've put forward. And we're excited that this broadens the reach of our products. And it makes it very easy for developers to access them."

Smartphone makers will launch devices with full AI capabilities integrated into them in 2024 and 2025. Qualcomm's Snapdragon 8 Gen 3 chip will help manufacturers deliver these capabilities.

This is a big positive for the company and its stock.

On the date of publication, Will Ashworth did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Will Ashworth has written about investments full-time since 2008. Publications where he's appeared include InvestorPlace, The Motley Fool Canada, Investopedia, Kiplinger, and several others in both the U.S. and Canada. He particularly enjoys creating model portfolios that stand the test of time. He lives in Halifax, Nova Scotia.


Artificial Intelligence Has Come for Our…Beauty Pageants? – Glamour

Posted: April 27, 2024 at 2:41 am

Hence the creation of the Miss AI pageant, in which AI-generated contestants will be judged on some of the classic aspects of pageantry and the skill and implementation of the AI tools used to create the contestants. Also being considered is the AI creators' social media clout, meaning they're not just crowning the most beautiful avatar but also the most influential.

So, do we think Amazon's Alexa will compete? (Sorry.)

All jokes aside, both Fanvue and the WAICAs are being met with criticism, especially since real beauty pageants are so problematic as is. "Concern for the impact of beauty pageants on mental health has been well documented and includes poor self-esteem, negative body image, and disordered eating," says Ashley Moser, a licensed therapist and clinical education specialist at The Renfrew Center, and upping the ante by digitizing contestants' perfection and beauty could set a dangerous precedent.

"These issues arise from the literal crowning of the best version of what women should be, specifically, beautiful and thin," Moser adds. What's more, it feels regressive, and quite frankly offensive, to combine something so superficial and archaic with what's an otherwise cutting-edge technological innovation.


"I support the recognition and awarding of women in tech and would hope that those skills could be celebrated without having to include beauty and appearance as a qualifying factor," Moser says. Can't we celebrate women for their abilities without making it about looks?

WAICA says it's not like that, though. "The WAICA awards aim to raise the standard of the industry, focusing on celebrating diversity and realism," the spokesperson says. "This isn't about pushing unrealistic standards but realistic models that represent real people. We want to see AI models of all shapes, sizes, and backgrounds entering the awards, and that's what the judges will be looking for."


Pope Francis to participate in G7 session on AI – Vatican News – English

Posted: April 27, 2024 at 2:41 am

Pope Francis will take part in the upcoming G7 session on Artificial Intelligence under Italy's presidency of the group.

By Vatican News

The Holy See Press Office on Friday confirmed that Pope Francis will speak at the G7 Summit in Italy's southern Puglia region, during the session devoted to Artificial Intelligence (AI).

The confirmation of the Holy Father's participation in the Summit, which will take place from June 13 to 15 at Borgo Egnazia in Puglia, follows the announcement made by Italian Prime Minister Giorgia Meloni.

"This is the first time in history that a pontiff will participate in the work of a G7," she said, adding that the Pope would attend the "outreach session" for guest participants at the upcoming Group of Seven industrialised nations meeting.

Alongside host Italy, the Summit will bring together the United States, Canada, France, the United Kingdom, Germany, and Japan.

"I heartily thank the Holy Father for accepting Italy's invitation. His presence honours our nation and the entire G7," Meloni explained, emphasizing how the Italian government intends to enhance the contribution given by the Holy See on the issue of artificial intelligence, particularly with the "Rome Call for AI Ethics of 2020," promoted by the Pontifical Academy for Life, in a process "that leads to the concrete application of the concept of algorithmic ethics, namely giving ethics to algorithms."

"I am convinced," she added, "that the Pope's presence will provide a decisive contribution to defining a regulatory, ethical, and cultural framework for artificial intelligence, because on this ground, on the present and future of this technology, our capacity will once again be measured, the capacity of the international community to do what another Pope, Saint John Paul II, recalled on October 2, 1979, in his famous speech to the United Nations."

"Political activity, whether national or international, comes from man, is exercised by man, and is for man," Meloni quoted.

Pope Francis dedicated his Message for the 57th World Day of Peace on 1 January 2024 to "Artificial Intelligence and Peace," urging humanity to cultivate "wisdom of the heart," which, he says, can help us to put systems of artificial intelligence "at the service of a fully human communication."


iOS 18 could be loaded with AI, as Apple reveals 8 new artificial intelligence models that run on-device – TechRadar

Posted: April 27, 2024 at 2:41 am

Apple has released a set of several new AI models that are designed to run locally on-device rather than in the cloud, possibly paving the way for an AI-powered iOS 18 in the not-too-distant future.

The iPhone giant has been doubling down on AI in recent months, with a carefully split focus across cloud-based and on-device AI. We saw leaks earlier this week indicating that Apple plans to make its own AI server chips, so this reveal of new local large language models (LLMs) demonstrates that the company is committed to both breeds of AI software. I'll dig into the implications of that further down, but for now, let's explain exactly what these new models are.

The suite of AI tools contains eight distinct models, called OpenELMs (Open-source Efficient Language Models). As the name suggests, these models are fully open-source and available on the Hugging Face Hub, an online community for AI developers and enthusiasts. Apple also published a whitepaper outlining the new models. Four were pre-trained using CoreNet (previously CVNets), Apple's library for training AI models, while the other four have been instruction-tuned by Apple, a process by which an AI model's learning parameters are carefully honed to respond to specific prompts.
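Because the models sit on the Hugging Face Hub, trying one out takes only a few lines of Python. The snippet below is a minimal sketch rather than anything from Apple's documentation; the model ID and the use of the (gated) Llama 2 tokenizer are assumptions based on the public release.

```python
# Minimal sketch: loading an OpenELM checkpoint from the Hugging Face Hub.
# "apple/OpenELM-270M" and the Llama-2 tokenizer are assumed identifiers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-270M",       # smallest of the released checkpoints (assumed ID)
    trust_remote_code=True,     # OpenELM ships custom modeling code
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # gated repo; requires access

inputs = tokenizer("On-device language models could", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```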

Releasing open-source software is a somewhat unusual move for Apple, which typically retains quite a close grip on its software ecosystem. The company claims to want to "empower and enrich" public AI research by releasing the OpenELMs to the wider AI community.

Apple has been seriously committed to AI recently, which is good to see as the competition is fierce in both the phone and laptop arenas, with stuff like the Google Pixel 8's AI-powered Tensor chip and Qualcomm's latest AI chip coming to Surface devices.

By putting its new on-device AI models out to the world like this, Apple is likely hoping that some enterprising developers will help iron out the kinks and ultimately improve the software - something that could prove vital if it plans to implement new local AI tools in future versions of iOS and macOS.

It's worth bearing in mind that the average Apple device is already packed with AI capabilities, with the Apple Neural Engine found on the company's A- and M-series chips powering features such as Face ID and Animoji. The upcoming M4 chip for Mac systems also appears to sport new AI-related processing capabilities, something that's swiftly becoming a necessity as more-established professional software implements machine-learning tools (like Firefly in Adobe Photoshop).


In other words, we can probably expect AI to be the hot-button topic for iOS 18 and macOS 15. I just hope it's used for clever and unique new features, rather than Microsoft's constant Copilot nagging.


Machine learning and experiment | symmetry magazine – Symmetry magazine

Posted: April 27, 2024 at 2:41 am

Every day in August of 2019, physicist Dimitrios Tanoglidis would walk to the Plein Air Café next to the University of Chicago and order a cappuccino. After finding a table, he would spend the next several hours flipping through hundreds of thumbnail images of white smudges recorded by the Dark Energy Camera, an instrument that at the time had observed 300 million astronomical objects.

For each white smudge, Tanoglidis would ask himself a simple yes-or-no question: Is this a galaxy? "I would go through about 1,000 images a day," he says. "About half of them were galaxies, and the other half were not."

After about a month, Tanoglidis, who was a University of Chicago PhD student at the time, had built up a catalogue of 20,000 low-brightness galaxies.

Then Tanoglidis and his team used this dataset to create a tool that, once trained, could evaluate a similar dataset in a matter of moments. "The accuracy of our algorithm was very close to the human eye," he says. "In some cases, it was even better than us and would find things that we had misclassified."

The tool they created was based on machine learning, a type of software that learns as it digests data, says Aleksandra Ciprijanovic, a physicist at the US Department of Energy's Fermi National Accelerator Laboratory who at the time was one of Tanoglidis's research advisors. "It's inspired by how neurons in our brains work," she says, adding that this added brainpower will be essential for analyzing exponentially larger datasets from future astronomical surveys. "Without machine learning, we'd need a small army of PhD students to go through the same type of dataset."
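The classifier at the heart of a tool like this is conceptually simple: a small convolutional network that maps a thumbnail image to a single galaxy-or-not score. The sketch below is illustrative only; it is not the collaboration's code, and the image size, architecture, and stand-in data are assumptions.

```python
# Illustrative sketch of a binary "galaxy vs. not a galaxy" image classifier.
# Architecture, 64x64 thumbnail size, and the random stand-in data are assumptions.
import torch
import torch.nn as nn

class GalaxyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),   # one logit: galaxy vs. not
        )

    def forward(self, x):
        return self.head(self.features(x))

model = GalaxyClassifier()
thumbnails = torch.randn(8, 1, 64, 64)        # stand-in for 64x64 grayscale cutouts
labels = torch.randint(0, 2, (8, 1)).float()  # human yes/no labels, like the hand-built catalogue
loss = nn.BCEWithLogitsLoss()(model(thumbnails), labels)
loss.backward()                               # gradients for one training step
```

Once trained on a hand-labeled catalogue, the same forward pass can score millions of new thumbnails in moments, which is the speed-up described above.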

Today, the Dark Energy Survey collaboration has a catalogue of 700 million astronomical objects, and scientists continue to use (and improve) Tanoglidis's tool to analyze images that could show previously undiscovered galaxies.

"In astronomy, we have a huge amount of data," Ciprijanovic says. "No matter how many people and resources we have, we'll never have enough people to go through all the data."

Classification ("this is probably a photo of a galaxy" versus "this is probably not a photo of a galaxy") was one of machine learning's earliest applications in science. Over time, its uses have continued to evolve.

Machine learning, which is a subset of artificial intelligence, is a type of software that can, among other things, help scientists understand the relationships between variables in a dataset.

According to Gordon Watts, a physicist at the University of Washington, scientists traditionally figured out these relationships by plotting the data and looking for the mathematical equations that could describe it. "Math came before the software," Watts says.

This math-only method is relatively straightforward when looking for the relationship between only a few variables: the pressure of a gas as a function of its temperature and volume, or the acceleration of a ball as a function of the force of an athlete's kick and the ball's mass. But finding these relationships with nothing but math becomes nearly impossible as you add more and more variables.

"A lot of the problems we're tackling in science today are very complicated," Ciprijanovic says. "Humans can do a good job with up to three dimensions, but how do you think about a dataset if the problem is 50- or 100-dimensional?"

This is where machine learning comes in.

"Artificial intelligence doesn't care about the dimensionality of the problems," Ciprijanovic says. "It can find patterns and make sense of the data no matter how many different dimensions are added."
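A toy example makes the point concrete. With 50 input variables there is no plot a human could stare at, but a small neural network can still recover a hidden relationship; everything below is synthetic data invented for illustration.

```python
# Toy 50-dimensional regression: a relationship no pairwise plot would reveal.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 50))                                         # 50 input variables
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=5000)   # hidden relationship + noise

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
model.fit(X[:4000], y[:4000])
print("held-out R^2:", model.score(X[4000:], y[4000:]))                 # close to 1 if the pattern is found
```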

Some physicists have been using machine-learning tools since the 1950s, but their widespread use in the field is a relatively new phenomenon.

"The idea to use a [type of machine learning called a] neural network was proposed to the CDF experiment at the Tevatron in 1989," says Tommaso Dorigo, a physicist at the Italian National Institute for Nuclear Physics, INFN. "People in the collaboration were both amused and disturbed by this."

Amused because of its novelty; disturbed because it added a layer of opacity into the scientific process.

Machine-learning models are sometimes called "black boxes" because it is hard to tell exactly how they are handling the data put into them; their large number of parameters and complex architectures are difficult to understand. Because scientists want to know exactly how a result is calculated, many physicists have been skeptical of machine learning and reluctant to implement it into their analyses. "In order for a scientific collaboration to sign off on a new method, they first must exhaust all possible doubts," Dorigo says.

Scientists found a reason to work through those doubts after the Large Hadron Collider came online, an event that coincided with the early days of the ongoing boom in machine learning in industry.

Josh Bendavid, a physicist at the Massachusetts Institute of Technology, was an early adopter. "When I joined CMS, machine learning was a thing, but seeing limited use," he says. "But there was a big push to implement machine learning into the search for the Higgs boson."

The Higgs boson is a fundamental particle that helps explain why some particles have mass while others do not. Theorists predicted its existence in the 1950s, but finding it experimentally was a huge challenge. That's because Higgs bosons are both incredibly rare and incredibly short-lived, quickly decaying into other particles such as pairs of photons.

In 2010, when the LHC experiments first started collecting data for physics, machine learning was widely used in industry and academia for classification ("this is a photo of a cat" versus "this is not a photo of a cat"). Physicists were using machine learning in a similar way ("this is a collision with two photons" versus "this is not a collision with two photons").

But according to Bendavid, simply finding photons was not enough. Pairs of photons are produced in roughly one out of every 100 million collisions in the LHC. But Higgs bosons that decay into pairs of photons are produced in only one of 500 billion. To find Higgs bosons, scientists needed to find sets of photons that had a combined energy close to the mass of the Higgs. This means they needed more complex algorithms, ones that could not only recognize photons, but also interpret the energy of photons based on how they interacted with the detector. "It's like trying to estimate the weight of a cat in a photograph," Bendavid says.

That became possible when LHC scientists created high-quality detector simulations, which they could use to train their algorithms to find the photons they were looking for, Bendavid says.

Bendavid and his colleagues simulated millions of photons and looked at how they lost energy as they moved through the detector. According to Bendavid, the algorithms they trained were much more sensitive than traditional techniques.
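In spirit, this is a regression problem: train on simulated photons, where the true energy is known, to learn a correction from what the detector measures back to that true energy. The sketch below is a hedged illustration of that idea with invented features and a generic model, not the actual CMS analysis.

```python
# Hedged sketch: learn a photon-energy calibration from simulation, then apply it.
# Features, numbers, and the model choice are invented for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 20000
true_energy = rng.uniform(10, 200, size=n)                     # simulated photon energies (GeV)
cluster_energy = true_energy * rng.normal(0.92, 0.03, size=n)  # measured energy after losses
shower_depth = rng.normal(0.5, 0.1, size=n)                    # toy position/shower feature
X = np.column_stack([cluster_energy, shower_depth])

reg = GradientBoostingRegressor().fit(X, true_energy)          # learn the energy correction
print("calibrated energy for one photon:", reg.predict(X[:1])[0])
```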

And the algorithms worked. In 2012, the CMS and ATLAS experiments announced the discovery of the Higgs boson, just two years into studying particle collisions at the LHC.

"We would have needed a factor of two more data to discover the Higgs boson if we had tried to do the analysis without machine learning," Bendavid says.

After the Higgs discovery, the LHC research program saw its own boom in machine learning. "Before 2012, you would have had a hard time to publish something which used neural networks," Dorigo says. "After 2012, if you wanted to publish an analysis that didn't use machine learning, you'd face questions and objections."

Today, LHC scientists use machine learning to simulate collisions, evaluate and process raw data, tease signal from background, and even search for anomalies. While these advancements were happening at the LHC, scientists were watching closely from another, related field: neutrino research.

Neutrinos are ghostly particles that rarely interact with ordinary matter. According to Jessie Micallef, a fellow at the National Science Foundation's Institute for Artificial Intelligence and Fundamental Interactions at MIT, early neutrino experiments would detect only a few particles per year. With such small datasets, scientists could easily reconstruct and analyze events with traditional methods.

That is how Micallef worked on a prototype detector as an intern at Lawrence Berkeley National Laboratory in 2015. "I would measure electrons drifting in a little tabletop detector, come back to my computer, and make plots of what we saw," they say. "I did a lot of programming to find the best fit lines for our data."

But today, their detectors and neutrino beams are much larger and more powerful. "We're talking with people at the LHC about how to deal with pileup," Micallef says.

Neutrino physicists now use machine learning both to find the traces neutrinos leave behind as they pass through the detectors and to extract their properties, such as their energy and flavor. These days, Micallef collects their data, imports it into their computer, and starts the analysis process. But instead of toying with the equations, Micallef says that they let machine learning do a lot of the analysis for them.

"At first, it seemed like a whole new world," they say, but it wasn't a magic bullet. Then there was validating the output. "I would change one thing, and maybe the machine-learning algorithm would do really good in one area but really bad in another.

"My work became thinking about how machine learning works, what its limitations are, and how we can get the most out of it."

Today, Micallef is developing machine-learning tools that will help scientists with some of the unique challenges of working with neutrinos, including using gigantic detectors to study not just high-powered neutrinos blasting through from outside the Milky Way, but also low-energy neutrinos that could come from nearby.

Neutrino detectors are so big that the sizes of the signals they measure can be tiny by comparison. For instance, the IceCube experiment at the South Pole uses about a cubic kilometer of ice peppered with 5,000 sensors. But when a low-energy neutrino hits the ice, only a handful of those sensors light up.

"Maybe a dozen out of 5,000 detectors will see the neutrino," Micallef says. "The pictures we're looking at are mostly empty space, and machine learning can get confused if you teach it that only 12 sensors out of 5,000 matter."

Neutrino physicists and scientists at the LHC are also using machine learning to give a more nuanced interpretation of what they are seeing in their detectors.

"Machine learning is very good at giving a continuous probability," Watts says.

For instance, instead of classifying a particle in a binary method ("this event is a muon neutrino" versus "this event is not a muon neutrino"), machine learning can provide an uncertainty associated with its assessment.

"This could change the overall outcome of our analysis," Micallef says. "If there is a lot of uncertainty, it might make more sense for us to throw that event away or analyze it by hand. It's a much more concrete way of looking at how reliable these methods are and is going to be more and more important in the future."
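In code, the difference is simply between asking a classifier for a hard label and asking it for a probability, then deciding what to do with the low-confidence events. Here is a minimal sketch with invented data and thresholds.

```python
# Minimal sketch: continuous probabilities instead of hard yes/no labels,
# with low-confidence events flagged for hand analysis. Data and the 0.9/0.1
# thresholds are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 5))                       # toy event features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)        # toy "muon neutrino" truth labels

clf = LogisticRegression().fit(X, y)
probs = clf.predict_proba(X)[:, 1]                   # P(muon neutrino) for each event

confident = (probs > 0.9) | (probs < 0.1)
print(f"{(~confident).sum()} of {len(probs)} events flagged for hand analysis")
```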

Physicists use machine learning throughout almost all parts of data collection and analysis. But what if machine learning could be used to optimize the experiment itself? "That's the dream," Watts says.

Detectors are designed by experts with years of experience, and every new detector incrementally improves upon what has been done before. But Dorigo says he thinks machine learning could help detector designers innovate. "If you look at calorimeters designed in the 1970s, they look a lot like the calorimeters we have today," Dorigo says. "There is no notion of questioning paradigms."

Experiments such as CMS and ATLAS are made from hundreds of individual detectors that work together to track and measure particles. Each subdetector is enormously complicated, and optimizing each one's design, not as an individual component but as part of a complex ecosystem, is nearly impossible. "We accept suboptimal results because the human brain is incapable of thinking in 1,000 dimensions," Dorigo says.

But what if physicists could look at the detector holistically? According to Watts, physicists could (in theory) build a machine-learning algorithm that considers physics goals, budget, and real-world limitations to choose the optimal detector design: a symphony of perfectly tailored hardware all working in harmony.

Scientists still have a long way to go. "There's a lot of potential," Watts says. "But we haven't even learned to walk yet. We're only just starting to crawl."

They are making progress. Dorigo is a member of the Southern Wide-field Gamma-ray Observatory, a collaboration that wants to build an array of 6,000 particle detectors in the highlands of South America to study gamma rays from outer space. The collaboration is currently assessing how to arrange and place these 6,000 detectors. "We have an enormous number of possible solutions," Dorigo says. "The question is: how to pick the best one?"

To find out, Dorigo and his colleagues took into account the questions they wanted to answer, the measurements they wanted to take, and the number of detectors they had available to use. This time, though, they also developed a machine-learning tool that did the same, and found that it agreed with them.

They plugged a number of reasonable initial layouts into the program and allowed it to run simulations and gradually tweak the detector placement. "No matter the initial layout, every simulation always converged to the same solution," Dorigo says.
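The workflow Dorigo describes can be pictured as gradient-free optimization: start from several candidate layouts, let an optimizer nudge detector positions to improve a figure of merit computed from simulation, and check that the different starts land on the same answer. The sketch below uses a made-up surrogate cost in place of the observatory's real, simulation-based figure of merit.

```python
# Illustrative sketch: optimize detector positions from several starting layouts
# and compare where they converge. The surrogate cost is a stand-in, not the
# collaboration's actual simulation-based figure of merit.
import numpy as np
from scipy.optimize import minimize

def surrogate_cost(flat_positions):
    pos = flat_positions.reshape(-1, 2)
    pairwise = [np.linalg.norm(a - b) for i, a in enumerate(pos) for b in pos[i + 1:]]
    spread = -np.log(min(pairwise) + 1e-6)               # reward keeping detectors apart
    compactness = np.mean(np.linalg.norm(pos, axis=1))   # reward staying near the array center
    return spread + compactness

rng = np.random.default_rng(3)
final_costs = []
for trial in range(3):                                   # three different initial layouts
    start = rng.uniform(-1, 1, size=(10, 2)).ravel()     # 10 detectors on a plane
    result = minimize(surrogate_cost, start, method="Nelder-Mead",
                      options={"maxiter": 5000})
    final_costs.append(round(result.fun, 3))
print("final costs from different starting layouts:", final_costs)
```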

Even though he knows there is still a long way to go, Dorigo says that machine-learning-aided detector design is the future. "We're designing experiments today that will operate 10 years from now," he says. "We have to design our detectors to work with the analysis tools of the future, and so machine learning has to be an ingredient in those decisions."


What We Learned From Big Tech Earnings This Week – Investopedia

Posted: April 27, 2024 at 2:41 am

Key Takeaways

Artificial intelligence (AI) was in focus as Meta Platforms (META), Google-parent Alphabet (GOOGL), and Microsoft (MSFT) reported earnings this week, but investors weren't easily impressed despite better-than-expected results posted by all three tech giants.

Meta shares plunged after the company emphasized increased spending to invest in AI. Meanwhile, Alphabet shares surged and Microsoft shares gained as cloud strength seemed to ease investors' concerns about the increased AI spending.

Big tech earnings demonstrated that companies' enterprise customer businesses were key to AI monetization last quarter. The emphasis on enterprise offerings persisted with a focus on cloud segments.

Meta's earnings beat was overshadowed by the company's plans to increase spending on AI investments, which sent the stock tumbling. The worry for investors in the near term was perhaps how quickly the investment would yield returns, even as analysts said it could boost Meta's position in the long term.

However, investors didn't seem to feel that way about Meta's counterparts.

Alphabet noted increased spending fueled by AI investments. AI-related growth in Google Cloud and YouTube "support the notion that Google is seeing AI tailwinds across the business," analysts at Raymond James wrote.

Microsoft's chief financial officer Amy Hood said the company expects "capital expenditures to increase materially on a sequential basis driven by cloud and AI infrastructure investments," during the company's earnings call.

Hood said while the company expects capital expenditures to be higher in the 2025 fiscal year than in 2024, "these expenditures over the course of the next year are dependent on demand signals and adoption of [Microsoft's] services."

While Meta has highlighted its early success in leveraging its AI tech, analysts say investors are looking for more clarity on how it can contribute to the company's existing structure.

"Upside in the near term may be limited," Wedbush analysts wrote in a note, adding that investors are waiting for "more clarity on potential 2025 spending levels," evidence that the company can meet growth expectations despite harder comparables, and sustainable user and advertiser engagement with new AI offerings.

The company generates almost all of its revenue from advertising and has been increasingly looking at ways to leverage AI to boost that revenue. Meta reported that 30% of the content users see on Facebook and 50% on Instagram is delivered by its AI recommendation engines, which improve engagement and increase ad efficiency.

Alphabet also has set its sights on AI-driven advertising revenue growth. The company's Chief Business Officer (CBO) Philipp Schindler spoke during its earnings call about how generative AI helps advertisers target their audience better, and how tools like Gemini could also aid in creating the images and text they need for those ads.

At Alphabet's recent Google Cloud Next conference, hundreds of the company's enterprise customers spoke about using the cloud platform's genAI tools, with some notable business users including Mercedes Benz and Walmart (WMT).

Alphabet CEO Sundar Pichai said the company is "committed to making the investments required to keep [it] at the leading edge in technical infrastructure" as increased capital expenditures "will fuel growth in Cloud, help [the company] push the frontiers of AI models, and enable innovation across our services, especially in Search."

Pichai outlined the company's "clear paths to AI monetization through Ads and Cloud." He said the "cloud business continues to grow as we bring the best of Google AI to enterprise customers."

While AI initiatives are top of mind for investors, Microsoft's cloud strength fueled its third-quarter earnings beat.

"Cloud and AI continued to fuel upside for Microsoft," Bank of America analysts wrote, saying they "believe Azure strength is enough to drive total revenue growth higher for now."

Microsoft's Hood said "I know it isn't as exciting as talking about all the AI projects," but Azure "is still really foundational" to the company's enterprise customers.


