'The Numbers Start To Add Up': Global Data Center Capacity Expected To Triple In Next 6 Years
The dramatic impact artificial intelligence will have on the data center industry’s energy footprint is starting to come into focus.
The launch of ChatGPT last November kicked off a Big Tech AI arms race, with companies like Microsoft, Google and Meta scrambling to build out the infrastructure needed to integrate generative AI across their various products and services. Leaders from both the data center and tech sectors were quick to predict that the rapid adoption of AI technology would boost demand for data center capacity: There would be more data centers, and they would use more power.
Almost a year later, a clearer picture is emerging of the massive scale at which AI is driving increases in data center capacity and power consumption.
A study published Tuesday by Synergy Research Group projects that global hyperscale data center capacity will triple over the next six years, with average power demand from these facilities doubling even sooner.
A separate report published a week earlier in the research journal Joule says global AI computing alone will use more electricity than some Western European nations within the next four years, a dramatic spike in energy consumption that could exacerbate power challenges already plaguing the sector.
“AI-related electricity consumption, even in a moderate scenario, could escalate quite rapidly in the coming years, with potentially as much electricity consumption from AI devices as from a country like the Netherlands by 2027,” said Alex de Vries, author of the Joule study and the founder of Digiconomist, a research firm focused on the environmental impact of digital trends. “That’s a pretty significant expansion of data center energy use.”
Such growth would also represent a 50% increase in the data center industry’s share of global electricity consumption. Traditional data centers have accounted for roughly 1% of global electricity use for years, according to de Vries, but AI computing alone could realistically push that figure to 1.5% by 2027. And that is before the significant amount of power needed to cool data centers is taken into account.
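A rough consistency check shows how those figures fit together. The sketch below assumes global electricity consumption of roughly 25,000 TWh per year and Dutch consumption of about 110 TWh per year; neither figure appears in the article, and both are approximate. The 134 TWh value is the upper end of the AI server estimate cited later in this story.

```python
# Back-of-the-envelope check of the de Vries projection (approximate, assumed inputs).
GLOBAL_ELECTRICITY_TWH = 25_000   # assumed: rough global annual electricity consumption
NETHERLANDS_TWH = 110             # assumed: rough annual electricity use of the Netherlands
AI_2027_TWH = 134                 # upper end of the AI server estimate cited below

added_share = AI_2027_TWH / GLOBAL_ELECTRICITY_TWH   # extra share of global electricity
new_share = 0.01 + added_share                        # on top of the ~1% baseline

print(f"AI vs. Netherlands: {AI_2027_TWH / NETHERLANDS_TWH:.1f}x")    # ~1.2x, i.e. comparable
print(f"Added share of global electricity: {added_share:.2%}")         # ~0.5 percentage points
print(f"Data center share: 1.0% -> {new_share:.1%} (~50% relative increase)")
```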
The Joule study projects future AI energy demand largely through an analysis of the supply chain for the power-intensive, high-performance servers needed for AI — particularly those produced by Nvidia, which accounts for roughly 95% of the market. These AI servers, built around graphics processing units, or GPUs, draw far more power than previous generations of data center IT equipment.
According to de Vries, an AI-powered Google search utilizing these servers can use 30 times more electricity than a traditional Google search.
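To put that multiplier in context, here is a back-of-the-envelope illustration. The per-query energy and the daily search volume are assumptions made for the sake of the example (roughly 0.3 Wh per conventional search, a figure Google has published in the past, and about 9 billion searches a day); neither number appears in the article.

```python
# Illustrative only: what a 30x per-query multiplier could mean at Google-search scale.
WH_PER_STANDARD_SEARCH = 0.3          # assumed: oft-cited figure for a regular Google search
AI_MULTIPLIER = 30                    # from the article: AI search can use ~30x more electricity
SEARCHES_PER_DAY = 9_000_000_000      # assumed: rough global daily Google search volume

wh_per_ai_search = WH_PER_STANDARD_SEARCH * AI_MULTIPLIER          # ~9 Wh per AI-assisted query
annual_twh = wh_per_ai_search * SEARCHES_PER_DAY * 365 / 1e12      # Wh -> TWh over a year
print(f"~{wh_per_ai_search:.0f} Wh per AI search, ~{annual_twh:.0f} TWh/yr if every search used it")
```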
Big Tech is loading up on this hardware. Nvidia saw a 141% increase in data center segment sales between the first two quarters of 2023. And although de Vries said he expects supply chain constraints and economic factors to slow this growth in the months ahead, he said it is reasonable to expect sales to reach 15 times their current levels by 2027. Running just those servers would use an estimated 134 terawatt-hours of electricity annually — more than 65% of the estimated combined power consumption of all the data centers in the world today.
“Each of those servers can already consume as much power as five U.S. households combined,” de Vries said. “We’re not talking about just a few of these rolling off the manufacturing belts in the coming years. We’re talking about hundreds of thousands of units, potentially up to 1.5 million by 2027, so the numbers start to add up really fast.”
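The arithmetic behind that quote can be sketched quickly. The per-server power draw below is an assumption (roughly 10 kW for a high-end AI server, in line with current flagship systems and not stated in the article); the 1.5 million-unit figure comes from the article.

```python
# Rough fleet-level estimate, assuming ~10 kW per AI server running around the clock.
SERVERS_2027 = 1_500_000        # from the article: up to 1.5 million units by 2027
KW_PER_SERVER = 10              # assumed: order-of-magnitude draw of a flagship AI server
HOURS_PER_YEAR = 8_760

annual_twh = SERVERS_2027 * KW_PER_SERVER * HOURS_PER_YEAR / 1e9   # kWh -> TWh
print(f"~{annual_twh:.0f} TWh per year")   # ~131 TWh, consistent with the ~134 TWh cited above
```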
Tuesday’s report from Synergy Research Group paints a similar picture of AI’s role in driving dramatic data center capacity growth. Drawing on development pipeline data for 19 of the world’s largest data center users, Synergy’s model forecasts that global hyperscale data center capacity will triple before the end of the decade, a surge the report’s authors attribute primarily to the rapid adoption of AI.
But this growing energy footprint doesn’t necessarily mean more data centers. While the total number of hyperscale data centers is expected to keep growing at close to the present rate of approximately 100 new facilities each year, the individual facilities themselves are becoming significantly larger and more power-hungry.
According to Synergy Chief Analyst and Research Director John Dinsdale, the average capacity of the hyperscale data centers opening over the next six years will be more than double that of current facilities, in order to accommodate the power-intensive IT equipment needed for AI.
“When it comes to hyperscale operators and the global network of large data centers that they continue to build out, we are seeing some changes to plans for future deployment that are being driven by generative AI technology and services,” Dinsdale said in an email to Bisnow.
“These changes are not so much to do with the number of new data centers that will be launched, but rather the capacity and power density of those facilities, as GPUs are deployed in ever-greater numbers,” he added. “While the trend was already there for new hyperscale data centers to increase in size over time, generative AI has provided a substantial boost to that trend.”
This shift toward a significantly larger energy footprint threatens to worsen the power constraints that have emerged as perhaps the most significant impediment to data center development. Key markets such as Northern Virginia and Silicon Valley are already experiencing years-long delays for power connections, largely because of a shortage of the transmission infrastructure needed to deliver energy to clusters of digital infrastructure. Now AI is causing that power demand to spike.
Still, Dinsdale said data centers supporting AI workloads can often be built outside of major data center hubs where these energy constraints are most acute. Many current AI applications don’t require the low latency that makes physical proximity to other data centers or robust fiber networks necessary, so developers have started building wherever they can find power quickly.
“Many AI workloads are not as latency sensitive as other workloads, so it can enable the operator to place data centers in more distant and less expensive locations,” Dinsdale said. “For example, we were already seeing hyperscale data center growth in the US Midwest outpacing growth in other regions such as Northern Virginia and Silicon Valley. We fully expect that trend to continue.”