It’s no secret that Nvidia (NASDAQ: NVDA) is dominating the artificial intelligence (AI) chip market, especially the massive data center segment — think server farms built by companies like Amazon and Alphabet. Last quarter, the company brought in $26.5 billion in revenue from the lucrative segment. That’s 10 times more than its biggest competitor, AMD, managed. And Nvidia isn’t just selling more; it’s selling at far higher margins. In the same period, the company’s net income from the segment was a whopping 25 times AMD’s.
Media attention has been focused almost entirely on this segment. That makes perfect sense; those numbers are staggering. Make no mistake, data centers are the beating heart of Nvidia’s explosive growth, and that isn’t likely to change. But an exciting AI opportunity exists outside these colossal server farms: Your personal computer could play a crucial role in the near future.
That’s why Nvidia is hosting its RTX AI PC Day this Oct. 19 and 20. The event will showcase how the company is incorporating AI and machine learning capabilities into its consumer-facing GPUs and the future it envisions for AI-powered PCs. What could this mean for investors?
Pushing AI to the edge can save time and money
In tech, the last two decades have been defined by the cloud. Advances in internet bandwidth allowed much of the actual work of computation and storage to move from your local device to sprawling server farms hidden away in mostly rural areas. Out of sight, out of mind, as they say. AI is, by and large, no different. ChatGPT doesn’t run on your computer; it’s likely running on Microsoft servers thousands of miles away. You’re using your computer simply to communicate with it.
However, that is beginning to change. In some cases, computing is moving back to the “edge,” as the techies call it. In other words, a growing share of computing will again happen locally on your device. Some believe this could be a big trend in AI. But why, you ask? Great question.
Some AI applications are just too intensive to run anywhere but in a data center equipped with thousands of, say, Nvidia’s H100s. At $25,000 a pop, the cost is enormous. That’s why data centers are built only by companies that can invest billions of dollars. However, not all AI applications need that much power. Some models, specific tools, or other limited applications of AI could run locally with a powerful enough GPU. But just because they could, does that mean they should? What advantages does edge computing offer here?
First, where edge computing lacks power, it makes up for it in speed. Cloud computing introduces latency — a delay — that crunching the numbers locally doesn’t. Those milliseconds could make all the difference for some applications of the technology. Second, given the cost of building and operating large-scale data centers, using them is anything but cheap. Customers who opt for an edge approach may pay more upfront, but it could easily be the cheaper option over time. Another big one? If your internet connection is unreliable or limited, you don’t want to have to rely on the cloud.
AI PCs will be used for everything from enhancing video game graphics and expanding what’s possible for content creation to bringing a personal AI assistant to your car.
AI PCs could be a big market for Nvidia
This could be a major leap for the whole PC industry. The CEO of Qualcomm called it “as significant as Windows 95.” No one wants to be left behind here and that means Nvidia has stiff competition from a host of other chipmakers all vying for a piece of the pie that research firm Canalys expects to grow at a 44% compound annual growth rate for the next four years. The firm believes that more than 200 million AI PCs will ship in 2028.
Given the competition, it may be unlikely that Nvidia will capture as much of this market as it has the data center market. Given the scale, however, it doesn’t need to. A modest share of this market will still boost its bottom line significantly.
Nvidia’s event will help raise the profile of AI PCs and could impact the stock price, but I wouldn’t hold my breath for anything significant. The event won’t move the needle in and of itself. What’s important is what it represents: another massive opportunity for a company firing on all cylinders.
Should you invest $1,000 in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Nvidia made this list on April 15, 2005… if you invested $1,000 at the time of our recommendation, you’d have $831,707!*
Stock Advisor provides investors with an easy-to-follow blueprint for success, including guidance on building a portfolio, regular updates from analysts, and two new stock picks each month. The Stock Advisor service has more than quadrupled the return of the S&P 500 since 2002*.
*Stock Advisor returns as of October 14, 2024
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Johnny Rice has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Microsoft, Nvidia, and Qualcomm. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.
Oct. 19 Is Almost Here. Could Nvidia’s AI PC Day Be a Game-Changer for Investors? was originally published by The Motley Fool