Why Apple’s Big AI Push Could Change Where Data Centers Are Built
Apple this week launched a major effort to deploy artificial intelligence across its products, and it may mark the start of a significant shift in the data center landscape.
While other tech giants like Microsoft, Google and Meta went all-in on generative AI over the past two years, driving an unprecedented wave of data center demand as they scrambled for the infrastructure to support these technologies, Apple had remained on the sidelines of the AI arms race.
That was until this week, when the company unveiled what it has branded Apple Intelligence, a range of generative AI tools and functionalities it is incorporating into the operating systems for iPhones and other Apple consumer products.
Apple’s AI integration puts generative AI capabilities, from image creation to ChatGPT, at the fingertips of billions of iPhone users. In doing so, it marks a potential inflection point in turning technologies that have thus far seen only scattered adoption into an integral part of consumers’ everyday digital lives — a normal and perhaps unnoticed part of how they text, take photos, post on social media or make dinner reservations.
The need for computing power to support AI has already fundamentally changed where and how data centers are built. But a meaningful increase in the number of people regularly using AI on their phones and other devices would likely herald another seismic shift, said Ali Greenwood, executive director of Cushman & Wakefield’s data center group.
While the initial AI wave disproportionately drove the growth of data center build-out and leasing in small markets and rural areas, the computing requirements to support widespread consumer adoption of AI are expected to drive a new wave of demand in primary markets, the major cities and population centers where the bulk of users live and work.
“It’s going to mean data center demand in Tier 1 cities with large population bases,” Greenwood said. “I think you’re going to see a tremendous increase in demand around these real-life rollouts of AI tools that are going into the consumer’s hands.”
Apple Intelligence is slated to start appearing in the company’s operating systems this fall. New capabilities include several AI writing tools integrated across Apple’s various apps, along with tools that use AI to transcribe voice memos, retouch photos and write software code, according to the company. There are also generative image creation features built in, with the company highlighting a tool that creates new emoji from written prompts.
Apple has also promised a vastly more intelligent Siri with AI integrations that allow the digital assistant to do things like identify images of specific individuals or objects in a user’s photo library or proactively flag potential scheduling conflicts.
To some observers, Apple is tacking on relatively simple AI tools that already exist elsewhere. Indeed, competitors like Samsung have already released phones with similar AI integrations, to far less fanfare.
But Apple is unique in its ability to drive adoption of new technology, analysts say. The company has a track record of pushing existing tech into the mainstream. Apple didn’t make the first digital music player or smartphone, but it was the iPod and iPhone that made those devices ubiquitous.
Some expect the same to occur with consumer AI, with Apple Intelligence pushing other phone-makers to prioritize AI integrations, as well as driving increased AI functionality in third-party apps as users become accustomed to these tools.
“Once Apple enters [the AI phone sector], we expect AI to immediately become a must-have feature in all mid-to-premium smartphone launches starting 2025,” said Tarun Pathak, research director at technology market research firm Counterpoint Research, according to Light Reading.
If generative AI becomes part of the everyday functionality of mobile devices and the apps that run on them, experts say it will disproportionately drive demand for AI-capable data center capacity toward major markets.
Apple says the bulk of the data processing for its new AI tools will occur on the phone itself, but more complex tasks requiring more computing power will be processed in the cloud. Most of the time, this means computing located at data centers where Apple operates the infrastructure for the private cloud it developed specifically for these workloads. Other tasks will be handled at OpenAI’s data centers, if users allow.
This kind of AI computing, allowing users to interact with massive AI models, has vastly different siting considerations from the AI computing that has primarily driven the data center demand surge so far.
To date, the bulk of AI-driven data center demand has revolved around training the large language models behind products like ChatGPT. AI training doesn’t need the low-latency connections — minimal delay in transferring data — that have traditionally required data centers to be located close to major population hubs. This has allowed the industry’s most robust growth over the past 24 months to occur in smaller markets and rural areas that were previously well off the data center map.
But once an AI model has been created, it runs on a different set of infrastructure for users to interact with it in real time. This latter stage, where the AI is actually applied, is known as inference. Inference computing typically needs to be located closer to the user to keep latency within acceptable limits.
A sudden surge in inference demand from billions of scattered users adopting generative AI technologies through their phones and other devices would be a game-changer, Greenwood said, pushing a wave of demand for capacity in or near major population centers where the bulk of users live.
“It’s maybe the hardest test case because the consumer has so many different needs and wants compared to building business AI tools meant to do a certain task,” Greenwood said. “It’s going to have to do a lot of things for a lot of people, and therefore, it needs to be as close as possible to those consumers.”
While Apple’s AI push alone hasn’t yet moved the needle in driving demand toward primary markets, according to Greenwood, she expects mobile AI adoption to reshape the industry much as the rapid growth of demand from streaming and social media services did nearly a decade ago.
The computing infrastructure needed to seamlessly stream content on demand or play mobile games with no lag has similar geographic requirements to what’s needed to support device-based AI tools. The rapid adoption of those services changed the leasing landscape, fueling a significant boost in demand for data center space to host computing nodes close to major population hubs.
“What we're hearing in the marketplace is that it's going to be very similar to how the data center supports content delivery,” Greenwood said. “Think Netflix or think gaming.”
Jürgen Hatheier, the international chief technology officer at optical equipment provider Ciena, also predicted the move would have major geographic implications for data centers.
“This move will also bring massive demand to communication service provider networks and AI inference sites, be it on device, on-premises, at the network edge or in a metro data center,” Hatheier said in an emailed statement reported by Light Reading.