
Big Tech Firms, Blackstone Tamp Down DeepSeek Data Center Fears

DeepSeek’s artificial intelligence model has spooked some investors about the future of data center demand. But leaders of the largest data center tenants and providers say any potential impact should be less dramatic than many have feared.

Meta founder and CEO Mark Zuckerberg speaks at a conference in 2018.

Top executives at Meta, Microsoft and data center investment behemoth Blackstone all discussed the emergence of DeepSeek on quarterly earnings calls this week, addressing the potential implications of the more efficient AI model for their massive investments in digital infrastructure.

While not dismissive of the long-term implications for data center demand, the heads of all three firms made it clear that DeepSeek’s advances haven't instantly caused them to abandon their data center plans.

The tech giants anticipate spending more on data center development than ever in the year ahead. And even as they acknowledged that the future data center landscape has become murkier, they said a drop-off in demand is far from a certainty. 

“It's probably too early to really have a strong opinion on what this means for the trajectory around infrastructure and capex and things like that,” Meta CEO Mark Zuckerberg said. “I continue to think that investing very heavily in capex and infra is going to be a strategic advantage over time. It's possible that we'll learn otherwise at some point, but I just think it's way too early to call that.”

On Monday, share prices of tech, data center and energy firms plummeted after the emergence of Chinese AI firm DeepSeek’s R1 model raised questions about the hundreds of billions of dollars in capital expenditures major tech firms are deploying toward AI infrastructure. While the best models from U.S. tech firms still outperform R1, the efficiency of the Chinese model has generated amazement from tech leaders like venture capitalist Marc Andreessen, who wrote on X that it is “one of the most amazing and impressive breakthroughs I’ve ever seen.”

DeepSeek claims that training one of its AI models cost just $5.6M, whereas American AI firm Anthropic has spent nearly $1B training a single model, with even its cheapest models costing close to $100M. DeepSeek reportedly trained its model on just 2,000 older Nvidia chips, while U.S. tech firms routinely use tens of thousands of the latest processors.

The model's low price tag and development on fewer, less powerful chips have suddenly cast doubt on a fundamental assumption underpinning Big Tech’s AI spending spree: that massive quantities of energy-intensive chips are needed to develop the more advanced AI models that would lead to commercially viable products.

The sell-off in data center and energy stocks earlier this week showed investors were concerned that more efficient AI training could mean far less demand for data center capacity than previously expected. But at least in the immediate future, there is no indication that the tech giants driving the data center building boom have any intention of cutting their spending on digital infrastructure.

Meta reported this week that it plans to increase its capex to as much as $65B this year, with CEO Mark Zuckerberg announcing a new 2-gigawatt AI data center project on top of the gigawatt of data center capacity the social media giant will bring online in 2025. 

Microsoft, whose capex soared last quarter, also told analysts this week it has no plans to shift course with its spending on data centers and other digital infrastructure to support AI. The company’s capex is expected to stay near current levels for at least the next two quarters. 

Leaders at both firms were emphatic that the rapid build-out of data centers is vital to their short-term competitiveness.

Microsoft blamed its underwhelming AI growth largely on not being able to expand its data center footprint fast enough to keep up with cloud customers’ needs, with Chief Financial Officer Amy Hood saying the firm is “working from a pretty constrained-capacity place.”

Such backlogs have become the norm in a challenging data center development landscape where the supply of new data centers being built by major tech firms and third-party providers trails demand significantly. The result has been a leasing market with vacancy rates below 3% in major markets and new data centers being preleased two years before their completion. 

Blackstone President Jon Gray alluded to these market dynamics Thursday as he sought to alleviate investor concerns about DeepSeek’s implications for the firm’s $80B portfolio of leased data centers. While acknowledging the potential for a leap forward in computing efficiency to reshape the market, he suggested there is no immediate impact or risk for an existing portfolio and pipeline that is largely committed to investment-grade tech giants.

“We've obviously been spending a lot of time the last week looking at the impact of DeepSeek,” Gray said on Blackstone's earnings call. “The good news in that business is these are long-term leased data centers with some of the biggest companies in the world, and we do not build data centers speculatively anywhere in the world.”


While both Meta and Microsoft acknowledged that a long-term reduction in data center spending due to more efficient AI training is a possibility, they also emphasized that it is far from a certainty. 

While AI training workloads have contributed to growing demand and captured headlines through megaprojects like Oracle and OpenAI’s Stargate, they are just one part of the data center demand picture. Meta executives said most of the company’s planned capex isn’t for training generative AI models but for the various products and services that make up its core business.

That tracks with a broader trend across the hyperscale data center space, with all generative AI workloads expected to still account for less than half of all data center capacity by 2030.   

And even if the efficiency gains unveiled by DeepSeek do lead tech firms to cut spending on computing capacity, leaders across the data center and tech spaces say it could ultimately lead to an increase in data center demand. 

Microsoft CEO Satya Nadella is one of a chorus of voices suggesting that if AI training becomes significantly cheaper, it will drive down the cost of adoption for enterprises and consumers and produce a more vibrant AI economic ecosystem. That would mean more AI startups, more companies incorporating AI into their existing businesses and more people using AI integrated into consumer products. The new demand for data center capacity that results, the thinking goes, would ultimately exceed any demand lost to a pullback in hyperscale training spending.
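A rough way to see that logic is a toy constant-elasticity model of compute demand; the numbers below are hypothetical illustrations, not figures cited by any of these firms. If demand for AI compute D scales with its effective price p, and total compute spending is S = pD, then

\[
D \propto p^{-e},
\qquad
\frac{S_{\text{new}}}{S_{\text{old}}}
= \frac{p_{\text{new}}\,D_{\text{new}}}{p_{\text{old}}\,D_{\text{old}}}
= \left(\frac{p_{\text{new}}}{p_{\text{old}}}\right)^{1-e}.
\]

Under those assumptions, a 10x drop in the price of compute with an assumed demand elasticity of e = 1.3 gives a spend ratio of 10^{0.3}, roughly 2, so total spending on computing capacity would double even as each unit of compute becomes far cheaper. An elasticity above 1 is the condition for that kind of rebound; below 1, cheaper AI really would translate into less data center demand.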

But how that demand manifests across the data center landscape could look very different.

“Maybe there's a little less training that's done as a result of less [capital] intensity, but at the same time, there's more inference, maybe there's more cloud, maybe there's more to do with enterprise,” Blackstone’s Gray said. “There's a belief that as usage goes up significantly, there's still a vital need for data centers. The form of that use may change.” 

Should this phenomenon emerge, a version of what economists call the Jevons paradox, in which efficiency gains end up increasing total consumption, industry executives say there will be a shift in which segments of the data center industry absorb that demand. As adoption increases, the need for computing capacity would shift away from AI training and toward inference, the computing through which end users interact with an AI model.

Such a shift could have a dramatic impact on the data center development landscape. A surge in demand for inference computing would require more capacity close to large population centers where the people and companies using those AI products and services are located, Harrison Street Head of Digital Assets Michael Hochanadel told Bisnow this week.

That would mean more demand for capacity in major markets and areas with the robust network infrastructure needed for low latency. 

At the same time, it would mean demand may fade for the data centers specifically used for AI training that have emerged over the past three years. These facilities, some of them former bitcoin mining operations, have massive amounts of power but not the fiber connectivity or proximity to population hubs needed to be viable for other data center use cases. 

“Even before this DeepSeek announcement, there were some people saying that once the AI models are trained, those types of facilities would become superfluous and would have less defensible residual value,” Hochanadel said.