Nvidia Doubles Data Center Sales As CEO Dismisses Fears Of 'AI Wall'
Nvidia’s third-quarter data center revenue jumped 112% year-over-year, driving a strong earnings report that stoked investor confidence in artificial intelligence and, at least for now, tamped down emerging fears that Big Tech’s trillion-dollar data center build-out may not be yielding the promised results.
Data center sales accounted for the vast majority of Nvidia’s $35.1B in quarterly revenue, a record three months in which the artificial intelligence computing giant saw revenue grow 17% sequentially and 94% year-over-year.
The results exceeded investor expectations, the sixth straight quarter Nvidia’s revenue surpassed its quarterly guidance by $2B or more — a run that has made it the world's most valuable company.
Industry analysts have pointed to the results as an indicator that Big Tech’s AI arms race, and with it the global data center building boom, isn't likely to slow in the coming months.
But as fears of an “AI bubble” grow, Nvidia CEO Jensen Huang, speaking on the company's earnings call Wednesday, emphasized what he called the durability of an AI infrastructure gold rush that he said is still “in full steam.”
“AI is transforming every industry, company and country,” Huang said. “The age of AI is upon us, and it's large and diverse.”
As the maker of the vast majority of the graphics processing unit-based computing systems used for AI, Nvidia occupies a unique position as a measuring stick for the unprecedented growth of AI and a bellwether for the data center development needed to support it.
Less than two years ago, its quarterly chip sales totaled $4B; they have since grown to more than eight times that, with skyrocketing demand expected to drive global data center capital expenditures into the trillions within the decade.
In the hours after Nvidia released its quarterly numbers, Wall Street largely agreed with Huang’s assessment that demand for AI chips and the data centers to house them is in its early innings, with few signs of an imminent market correction. The company's stock price fluctuated early Thursday but ended the day up around 1%.
In notes to investors, Goldman Sachs pointed to “growing demand for AI infrastructure across all customer groups,” while Jefferies said Nvidia’s momentum “should accelerate from here.”
Morningstar strategist Brian Colello projects Nvidia’s revenue jumping 141% in 2025. At the moment, Nvidia’s data center sales are artificially suppressed because the company can’t manufacture its products quickly enough to keep up with demand from hyperscale data center users, demand that continues to accelerate, he said.
“[Data center] revenue remains supply-constrained and near-term revenue will rise as more supply comes online,” Colello wrote in a note shared with Bisnow. “The main driver of this growth is an ongoing increase in capital expenditures in data centers at leading enterprise and cloud computing customers.”
The optimism in the wake of Nvidia’s quarterly results stands in stark contrast to ominous reports about the company and the broader AI landscape that had emerged in the days prior.
Last week, The Information reported Nvidia was struggling to solve an overheating issue with its next-generation Blackwell processors, a product line expected to be released this year and central to the chipmaker’s fortunes.
Nvidia had been forced to redesign server racks repeatedly to address the problem, according to The Information, stoking fears of a delayed rollout that could jeopardize Nvidia’s bottom line and the pace of AI development.
But on its earnings call Wednesday, Nvidia’s Huang said the Blackwell overheating problem had been resolved and that no delays were expected. Business Insider reported Tuesday the Blackwell issues had been fixed months ago.
Far more existential questions about the future of AI have also been swirling across the data center and tech landscape following reports that the billions of dollars being spent on chips and data centers may not be yielding the promised results.
One of the assumptions underpinning Big Tech’s massive AI infrastructure spending is the idea that the foundational AI models will become far more advanced and capable as the computing power used to train them scales up.
The validity of these “scaling laws” had seemingly been demonstrated over the past two years by the largest AI players, as firms like OpenAI saw the models behind applications like ChatGPT take massive leaps forward as the infrastructure footprint grew.
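These scaling laws are typically framed as an empirical power-law relationship between a model's performance and the resources used to train it. As a rough illustration, not a formula attributed to Nvidia or any of the firms cited here, the relationship is often written as:

$$ L(C) \approx \left(\frac{C_0}{C}\right)^{\alpha} $$

where $L$ is the model's error (lower is better), $C$ is the compute used for training, and $C_0$ and $\alpha$ are constants fitted to experimental results. Because $\alpha$ is small, each doubling of compute buys a progressively smaller improvement, which is the diminishing-returns dynamic at the heart of the current debate.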
The economic promise of AI is largely predicated on the idea that world-changing commercial uses will emerge as more and more computing power makes AI smarter. But a Bloomberg report last week raised doubts about whether more computing and more data centers are actually moving AI forward.
The latest AI models trained by OpenAI, Google and Anthropic, three of the leading AI firms, failed to make significant improvements over earlier versions, despite being trained on far more powerful computing infrastructure, according to Bloomberg. These diminishing returns compared to past training cycles have reportedly come as a surprise. Developers have attributed the stalled progress to a shortage of good data to train the AI models on.
The news has set off alarm bells in some parts of the tech sector over whether AI progress is beginning to plateau. If more compute doesn’t equate to better AI, the logic fueling the global data center boom could be left in shambles.
“There’s a lot of debate: have we hit the wall with scaling laws?” Microsoft CEO Satya Nadella said at the firm’s annual conference this week.
Huang defended the scaling-law thesis, which he said “is intact and continuing,” and dismissed fears that AI development had plateaued.
“The evidence is that it continues to scale,” Huang said on the earnings call.
Huang argued that while a lack of available training data is making it harder to improve the newest AI models, new approaches to training will ultimately move those models forward. Those approaches also require AI firms to continue scaling up their computing infrastructure and the data centers that host it, Huang said.
Not everyone is a believer, however. Experts like Margaret Mitchell, chief ethics scientist at AI firm Hugging Face, say it is becoming increasingly apparent that humanlike artificial general intelligence, known as AGI, is much further away than the industry has portrayed and that the firms spending billions on AI chips and data centers will ultimately determine that the juice isn’t worth the squeeze.
“The AGI bubble is bursting a little bit,” Mitchell told Bloomberg.