AI boom could drive 500% surge in data-centre power demand by 2040

The rapid expansion of AI is turning data centres into a major new electricity load, with global power demand from the sector projected to rise 500% by 2040 and more than 70 per cent of that demand still expected to be met by hydrocarbons, executives and policymakers heard during Abu Dhabi Sustainability Week.

The warning underscores a reality that energy planners are now forced to model explicitly: even if AI helps industries cut waste, the compute behind it is becoming a power-hungry, always-on layer of the economy. That creates a direct challenge for grid capacity, peak management, and national net-zero pathways, especially in markets accelerating hyperscale build-outs.

“An energy play” with infrastructure-scale consequences

Dr Sultan Al Jaber, Minister of Industry and Advanced Technology and Chairman of Masdar, framed the issue as an unavoidable demand shock that energy systems must design around, not deny. “Over 70 per cent of this energy will still come from hydrocarbons,” he said, as demand accelerates across economies and new infrastructure is built to support the AI cycle.

He also pointed to the scale of what the world must build to keep up, citing the need for $4 trillion in annual investment to expand grids and data-centre infrastructure.

AI can optimise consumption, but it also adds a huge new load

David Auriau, CEO of Positive Zero, said AI’s promise is real, but so is the strain it creates on power systems.

“AI represents a tremendous opportunity to optimize energy consumption. But it also drives a huge increase in electricity demand from data centers, putting national sustainability goals under pressure,” Auriau said. “The way forward is to accelerate deployment of renewable energy and use AI to boost efficiency across buildings and grid networks. That is how the UAE can meet rising demand, while capturing the economic upside of the AI boom.”

His argument is essentially a grid strategy: add clean supply faster, then use software and retrofits to reduce waste and flatten peaks across buildings and networks, so the system can absorb new compute demand without constant emergency expansion.

Why measurement standards now matter

As energy becomes the constraint on AI scale, industry leaders are also pushing for efficiency metrics that translate directly into electricity use, not marketing claims.

Fred Lherault, CTO EMEA/Emerging at Pure Storage, said one industry standard needs urgent updating: how storage efficiency is measured.

“In the face of increased energy demands for data centres, an industry standard which should be updated is the way data storage efficiency is measured. Terabytes per Watt (TBe/W) measures the amount of data stored per unit of energy and should be introduced,” Lherault said. “This is a relevant and clear measurement that captures real-world energy use, and is a simple, vendor neutral, and accurate benchmark. This approach could reduce the impact of increases in energy prices, enhance energy security, and relieve pressure on overstretched infrastructure.”

The push for Terabytes per Watt is a practical bid to make energy efficiency legible to buyers, regulators and utilities. If adopted widely, it would make it easier to compare storage architectures by their actual power draw, and harder to hide inefficient design behind headline performance.
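
As a rough illustration of how such a benchmark could work, the short Python sketch below compares two hypothetical storage systems by effective terabytes stored per watt of steady-state power draw. The array names and figures are invented for illustration only; they are not drawn from the article, Pure Storage, or any vendor specification.

```python
# Illustrative sketch: comparing storage systems by terabytes per watt.
# All figures below are hypothetical examples, not real vendor data.

def terabytes_per_watt(effective_capacity_tb: float, power_draw_w: float) -> float:
    """Effective capacity stored per watt of steady-state power draw."""
    return effective_capacity_tb / power_draw_w

# Hypothetical systems: (name, effective capacity in TB, steady-state draw in W)
systems = [
    ("Array A", 2_000, 1_500),
    ("Array B", 2_000, 900),
]

for name, capacity_tb, watts in systems:
    ratio = terabytes_per_watt(capacity_tb, watts)
    print(f"{name}: {ratio:.2f} TB/W")
```

On these invented numbers, Array B stores the same data on roughly 40% less power, which is exactly the kind of difference a vendor-neutral TB/W figure would surface to buyers and utilities at a glance.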

The global trajectory is already steep

The energy implications of the AI build-out are increasingly reflected in international forecasts. The International Energy Agency expects global electricity demand from data centres to more than double by 2030 to around 945 TWh, driven largely by AI workloads, with AI-optimised data centres projected to more than quadruple their electricity consumption by 2030.

For energy systems, this is not just about adding generation. It is about transmission, substation capacity, interconnections, and the ability to deliver power reliably to dense clusters of compute, often on accelerated timelines.

What “keeping up” looks like in practice

Across the Gulf, the conversation is shifting from broad sustainability narratives to the mechanics of meeting load growth without losing credibility on emissions. That means:

  • More clean power, faster, because data-centre demand does not wait for long policy cycles.

  • Grid upgrades at scale, because generation alone does not solve bottlenecks if transmission and distribution cannot deliver.

  • Efficiency standards that bite, because the cheapest megawatt is the one not consumed, and the sector needs benchmarks that reflect real-world energy use.

Auriau’s point is that AI should be deployed as a system-wide efficiency tool, not only as a driver of additional demand. Lherault’s point is that without better measurement, the industry cannot manage what it refuses to quantify.

The through-line is blunt: the AI boom is no longer only a technology story. It is a power-system story, and the winners will be the markets that can deliver reliable electricity at scale while squeezing waste out of every layer of the stack.
