Michael Dempsey, Technology reporter

It is a number so large it is hard to imagine. Globally, around $3 trillion (£2.2 trillion) will be spent on AI-enabled data centers between now and 2029.
That estimate comes from investment bank Morgan Stanley, which adds that roughly half of that amount will go toward construction costs and half toward expensive equipment supporting the artificial intelligence revolution.
For comparison, that is roughly what the entire French economy was worth in 2024.
In the UK alone, an estimated 100 more data centers will be built over the next few years to meet demand for AI processing.
Some of them will be built for Microsoft, which earlier this month announced a $30bn (£22bn) investment in the UK's artificial intelligence sector.
So what is so special about artificial intelligence data centers? How do they differ from the traditional buildings full of computer servers that quietly run our personal photos, social media accounts and work apps?
And are they worth such a huge expense?
Data centers have been growing in size for years. The term “hyperscale” was coined in the technology industry to describe facilities whose power demands reached tens of megawatts, before gigawatts (each a thousand megawatts) came onto the scene.
But AI has supercharged this trend. Most artificial intelligence models rely on expensive computer chips from Nvidia to process tasks.
Nvidia's chips come in large cabinets that cost about $4m each, and those cabinets hold the key to understanding why AI data centers are different from others.
The Large Language Models (LLMs) behind AI software must, during training, break language down into the tiniest possible elements of meaning. That is only possible with a network of computers working in unison, and in very close proximity.
Why does proximity matter so much? Every meter of distance between two chips adds a nanosecond, a billionth of a second, to processing time.
That may not sound like much, but across a warehouse full of computers these microscopic delays add up, sapping the performance AI needs.
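A back-of-envelope sketch shows how those delays compound. The figures below assume the article's rough rule of thumb of one nanosecond per meter of separation, and an illustrative (hypothetical) count of a million chip-to-chip exchanges in a training run:

```python
# Rough sketch: how per-meter delays accumulate across many chip-to-chip
# exchanges. The 1 ns/meter figure is the article's rule of thumb; the
# hop count is purely illustrative.
NS_PER_METER = 1.0  # nanoseconds of delay added per meter between chips

def total_delay_ns(distance_m: float, exchanges: int) -> float:
    """Cumulative latency for `exchanges` chip-to-chip round trips
    over links of `distance_m` meters each."""
    return NS_PER_METER * distance_m * exchanges

# Compare densely packed racks with chips spread ten meters apart,
# over a million synchronization exchanges.
close = total_delay_ns(distance_m=1, exchanges=1_000_000)
spread = total_delay_ns(distance_m=10, exchanges=1_000_000)

print(f"1 m apart:  {close / 1e6:.0f} ms of accumulated delay")
print(f"10 m apart: {spread / 1e6:.0f} ms of accumulated delay")
```

On those assumptions, packing the chips ten times closer saves nine milliseconds per million exchanges, a gap that repeats billions of times over a training run.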
AI processing cabinets are packed tightly together to eliminate that latency, enabling what the technology sector calls parallel processing: the racks work as one huge computer. All of this means density, the magic word in AI circles.
Density eliminates the processing bottlenecks that occur in conventional data centers when dealing with processors located several meters apart.

However, these dense rows of cabinets draw gigawatts of power, and training an LLM causes spikes in electricity demand.
These bursts are like thousands of households switching their kettles on and off in sync every few seconds.
This type of intermittent demand on the local network requires careful management.
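The scale of the kettle analogy is easy to check. Assuming a typical kettle draws about 3 kW (an assumption, not a figure from the article), the arithmetic looks like this:

```python
# Rough scale of the kettle analogy: how many kettles switching in sync
# would match a given swing in data-center power demand.
KETTLE_KW = 3.0  # assumed draw of a typical electric kettle, in kilowatts

def equivalent_kettles(swing_megawatts: float) -> int:
    """Number of kettles whose combined draw matches the demand swing."""
    return round(swing_megawatts * 1000 / KETTLE_KW)

print(equivalent_kettles(30))    # a 30 MW swing, roughly 10,000 kettles
print(equivalent_kettles(1000))  # a 1 GW swing, over 300,000 kettles
```

Even a modest training-induced swing is equivalent to a small town's worth of kettles cycling on and off, which is why grid operators take notice.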
Daniel Bizo of data center design consultancy The Uptime Institute analyzes data centers for a living.
“Conventional data centers are constant background noise compared to the demands placed on the network by the AI workload,” he says.
Like those synchronized kettles, the sudden bursts of artificial intelligence demand present a particular challenge, according to Mr Bizo.
“A special workload of this magnitude is unheard of,” he says, “an extreme engineering challenge like the Apollo program.”
Data center operators are addressing the energy problem in a variety of ways.
Speaking to the BBC earlier this month, Nvidia CEO Jensen Huang said that in the UK in the short term he hoped more gas turbines could be used “off-grid so we don't burden people who are on-grid.”
He said AI itself would help develop more efficient gas turbines, solar panels, wind turbines and fusion power to produce more cost-effective and sustainable energy.
Microsoft is investing billions of dollars in energy projects, including a deal with Constellation Energy that will see nuclear power produced once again at Three Mile Island.
Google, owned by Alphabet, is also investing in nuclear energy as part of its strategy to run on carbon-free energy by 2030.
Meanwhile, Amazon Web Services (AWS), part of the retail giant Amazon, says it is already the largest corporate buyer of renewable energy in the world.

The data center industry is keenly aware that lawmakers are watching these power-hungry sites closely because of their potential impact on local infrastructure and the environment.
One of those environmental consequences is the vast quantity of water used to cool the chips.
In the US state of Virginia, where a growing number of data centers support the business of tech giants like Amazon and Google, legislation is being considered that would tie the approval of new sites to water consumption metrics.
Meanwhile, a proposed artificial intelligence site in north Lincolnshire in the UK has faced objections from Anglian Water, which supplies water in the area of the proposed site.
Anglian Water notes that it is under no obligation to supply water for non-domestic use, and suggests cooling the site with recycled water from the final stage of wastewater treatment rather than drinking water.
Given the practical challenges and huge costs facing AI data centers, is this whole movement really one big bubble?
One speaker at a recent data center conference coined the term “bragawatts” to describe how the industry exaggerates the scale of its AI sites.
Zal Limbuwala is a data center specialist at technology investment firm DTCP. He admits there are big questions over the future of AI data center spending.
“The current trajectory is very hard to believe. There has certainly been a lot of bragging. But investments must generate returns, otherwise the market will correct itself.”
Taking these caveats into account, he still believes that AI deserves a special place in terms of investment. “AI will have a greater impact than previous technologies, including the Internet. So it’s entirely possible that we’ll need all those gigawatts.”
He notes that, if anything, AI data centers “are the real estate of the tech world.” Speculative technology bubbles, such as the dot-com boom of the 1990s, lacked solid foundations; AI data centers are tangible, physical assets. But the spending boom behind them cannot last forever.