A modern hyperscale data centre can consume as much electricity as a city of 100,000 people. That was startling enough when these facilities mainly handled streaming, cloud storage and e-commerce.
But the rapid shift toward artificial intelligence – especially generative AI and large-scale inference – is driving energy consumption into unprecedented territory.
OpenAI, Nvidia, Google, Microsoft, Amazon and national governments around the world are preparing to build a new class of compute infrastructure: AI data centres, which are larger, hotter and far more power-hungry than anything the tech sector has deployed before.
The question now is simple, unsettling and unavoidable: How will humanity generate enough electricity to keep the AI era running?
The energy footprint of today’s data centres
Data centres already consume roughly 2-3 percent of global electricity, according to the International Energy Agency (IEA). Individual sites typically draw 30-50 megawatts (MW), with some hyperscale sites at 100 MW or more.
Ireland, one of the world’s most concentrated data-centre markets, now projects the sector will consume up to 30 percent of the country’s electricity by 2030.
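The "city of 100,000 people" comparison can be sanity-checked with simple arithmetic. A minimal sketch, assuming a mid-range 40 MW site and an illustrative 4,000 kWh per household per year (both figures are assumptions for illustration, not measurements of any specific facility):

```python
# Rough conversion of a data centre's continuous draw into annual consumption,
# compared against households. All inputs are illustrative assumptions.
SITE_MW = 40.0                   # mid-range hyperscale draw from the figures above
HOUSEHOLD_KWH_PER_YEAR = 4000    # assumed average household consumption

annual_mwh = SITE_MW * 8760                        # MW x hours in a year
households = annual_mwh * 1000 / HOUSEHOLD_KWH_PER_YEAR  # MWh -> kWh, then divide

print(round(annual_mwh))    # 350400 MWh per year
print(round(households))    # 87600 households
```

Under these assumptions, a single 40 MW site consumes roughly as much electricity as 87,600 households, which is consistent with the city-scale comparison.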
AI training workloads make these figures look modest. A single training run for a frontier model – something in the class of GPT-5, Claude 3, Gemini Ultra or future successors – can consume several gigawatt-hours (GWh) of energy on its own.
And that is before factoring in the much larger energy burden of AI inference, the real-time processing powering chatbots, copilots, factory systems, autonomous vehicles and future humanoid robots.
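Where do "several gigawatt-hours" come from? A back-of-envelope sketch, assuming a hypothetical run of 10,000 accelerators at 700 W each over 60 days with a PUE of 1.2 (all assumed values, not figures for any specific model):

```python
# Back-of-envelope estimate of a frontier-model training run's energy use.
# Every input here is an illustrative assumption, not a published figure.
def training_energy_gwh(num_gpus: int, gpu_watts: float, days: float, pue: float) -> float:
    """Total facility energy in GWh: accelerator load scaled by PUE overhead."""
    hours = days * 24
    it_energy_wh = num_gpus * gpu_watts * hours   # accelerator energy only
    return it_energy_wh * pue / 1e9               # PUE adds cooling/overhead; Wh -> GWh

# Hypothetical run: 10,000 accelerators at 700 W for 60 days, PUE 1.2
print(round(training_energy_gwh(10_000, 700, 60, 1.2), 1))  # 12.1 GWh
```

Even this modest configuration lands in the double-digit GWh range; larger clusters and longer runs scale the figure linearly.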
AI’s projected demand: A 10x leap in electricity
Energy analysts, grid operators and investment banks agree on one theme: AI workloads are growing so quickly that demand is outpacing the rate at which new clean generation capacity can be built.
Forecasts vary, but most converge on the same broad trend:
- AI electricity use could triple by 2030.
- Frontier AI could require 2-4 percent of global electricity on its own.
- Total AI compute demand could grow 10x this decade.
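It is worth unpacking what growth rates these forecasts imply. A quick sketch (the 6- and 10-year horizons are assumptions chosen to match "by 2030" and "this decade"):

```python
# What annual growth rates do the headline forecasts imply?
# The horizons (6 and 10 years) are illustrative assumptions.
def implied_cagr(multiple: float, years: int) -> float:
    """Compound annual growth rate that yields `multiple` growth over `years`."""
    return multiple ** (1 / years) - 1

print(f"triple in 6 years -> {implied_cagr(3, 6):.1%}/yr")    # ~20.1%/yr
print(f"10x in 10 years   -> {implied_cagr(10, 10):.1%}/yr")  # ~25.9%/yr
```

Sustaining 20-26 percent annual growth in electricity demand is far beyond the historical build rate of most national grids, which is precisely why watts become the bottleneck.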
As Nvidia, OpenAI and others release roadmaps for exponentially larger models, energy becomes the new limiting factor. The bottleneck is no longer algorithms – it’s watts.
The billion-dollar race to build AI megacentres
OpenAI “Stargate”
OpenAI is reportedly planning a $100 billion “Stargate” super-data-centre requiring up to 5 GW of power – roughly the output of five nuclear reactors.
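The "five nuclear reactors" comparison checks out arithmetically. A sketch, assuming a typical large reactor delivers roughly 1 GW of electrical output (an illustrative round number):

```python
# Sanity-checking the "five nuclear reactors" comparison.
# Reactor size and continuous utilisation are illustrative assumptions.
REACTOR_GW = 1.0     # assumed output of one large reactor
site_gw = 5.0        # reported Stargate power requirement

reactors_needed = site_gw / REACTOR_GW
annual_twh = site_gw * 8760 / 1000   # continuous draw over a year; GWh -> TWh

print(reactors_needed)           # 5.0
print(round(annual_twh, 1))      # 43.8 TWh/yr at full utilisation
```

Run continuously, a 5 GW campus would consume on the order of 44 TWh per year, comparable to the annual electricity use of a mid-sized country.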
Nvidia sovereign AI
Nvidia is working with nations to build sovereign AI supercomputing hubs, each expected to require tens to hundreds of megawatts.
Microsoft
Microsoft has announced a nuclear-energy strategy team and is openly exploring small modular reactors (SMRs) for future AI data centres.
Google
Google is integrating geothermal systems, heat-recovery cooling and long-duration storage technologies into new facilities.
Amazon
Amazon continues to buy more renewable energy than any corporation in history – yet its future AI centres may also require nuclear or hybrid grids to maintain reliability.
The world’s largest technology companies are quietly transforming into quasi-energy utilities.
The town-next-door problem
When a hyperscale AI data centre appears on a municipal planning map, the local community often reacts less to the promise of “digital transformation” and more to the blunt reality: It will use more power than all of us combined.
This has already happened:
- Some towns in the US and Europe have experienced grid strain because nearby data centres signed priority contracts.
- Residents worry that during heatwaves or supply disruptions, the “lights go out for the people first, not the processors”.
- Planning disputes now focus on water use, noise, transformers, energy reliability and taxpayer-funded grid upgrades.
The world could soon face a strange situation where AI has more reliable electricity than the humans living next to it.
Governments step in – awkwardly
United States
Federal and state officials are now assessing whether AI’s energy needs can be met without significant new power-generation projects. Some counties have paused or restricted new data-centre construction.
Europe
Ireland temporarily froze new data-centre approvals. The EU is examining whether AI’s electricity footprint is compatible with its climate goals.
Middle East
Here lies one of the most important shifts:
- Saudi Arabia and the UAE are rapidly entering the AI-infrastructure race.
- Cheap energy (gas plus vast solar expansion) makes them natural hubs.
- Mega-projects such as NEOM explicitly plan huge AI data-centre footprints.
They are also investing heavily in renewables – because in the Gulf, the sun is the one thing that never runs short.
Asia
Singapore, after partially freezing data-centre growth, now allows only energy-efficient AI facilities. China is pushing inland “computing clusters” powered by hydro and wind.
Government involvement is no longer optional – AI infrastructure is becoming a national-scale energy policy issue.
Renewables: Helpful, but not enough
Tech companies proudly advertise wind and solar purchasing programmes. But the maths is unforgiving:
- Wind and solar are intermittent, not 24/7.
- AI data centres require constant baseload power.
- Annual renewable matching does not equal real-time carbon neutrality.
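The gap between annual matching and real-time matching can be shown with a toy model: a data centre drawing a flat load, supplied by solar sized so that daily totals match exactly. The load level and triangular solar shape are illustrative assumptions, not real grid data:

```python
# Why annual (or daily) renewable matching is not 24/7 carbon-free power.
# Toy model: flat 100 MW load vs an idealised triangular solar profile
# sized so TOTAL generation equals total load. Night hours still need the grid.
flat_load_mw = 100.0
hours = range(24)

# Illustrative solar shape: zero at night, peaking at noon (6am-6pm triangle)
solar_shape = [max(0.0, 1 - abs(h - 12) / 6) for h in hours]
scale = flat_load_mw * 24 / sum(solar_shape)      # size solar to match daily totals
solar_mw = [s * scale for s in solar_shape]

# Hour-by-hour, only the overlap of generation and load is truly matched
matched = sum(min(flat_load_mw, s) for s in solar_mw)
print(f"daily totals match, but only {matched / (flat_load_mw * 24):.0%} "
      f"of load is met hour-by-hour")   # ~43%
```

In this toy case the operator could claim "100 percent renewable" on paper while more than half of its actual consumption, by hour, comes from whatever else is on the grid.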
This is why nuclear – once politically toxic – is back in favour.
Is nuclear clean?
In carbon terms, yes. In safety terms, modern reactors and SMRs are dramatically safer than older designs. The IEA classifies nuclear as low-carbon, and countries from the US to China are planning expansions.
For AI, nuclear provides the one thing wind and solar cannot: uninterrupted power.
Beyond Earth: The space data-centre experiment
Companies such as Thales Alenia Space are now studying the feasibility of orbital data centres powered by continuous solar radiation outside Earth's day-night cycle.
Key challenges:
- Radiation-hardened compute hardware
- Thermal management in vacuum
- Uplink/downlink latency
- Orbital debris
Still, the idea is clear: if Earth struggles to power AI, move the computation somewhere with unlimited sunlight.
Humanity’s coming energy war with AI
AI is accelerating faster than global energy systems can adapt. To power this new era, governments may need to build:
- More nuclear plants
- Massive solar and wind farms
- Reinforced transmission grids
- New storage systems
- Better cooling technologies
- Smarter scheduling and load-balancing algorithms
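The last item on that list is the cheapest lever to pull. A minimal sketch of carbon-aware scheduling: run deferrable batch jobs in the hours when grid carbon intensity is lowest. The intensity profile below is hypothetical, not data from any real grid:

```python
# A minimal sketch of carbon-aware scheduling: place deferrable batch work
# in the lowest-carbon hours of the day. Intensity values are illustrative.
def schedule_jobs(carbon_by_hour: list[float], hours_needed: int) -> list[int]:
    """Return the `hours_needed` cleanest hours, in chronological order."""
    ranked = sorted(range(len(carbon_by_hour)), key=lambda h: carbon_by_hour[h])
    return sorted(ranked[:hours_needed])

# Hypothetical gCO2/kWh profile: cleaner at midday (solar), dirtier evenings
intensity = [450, 440, 430, 420, 410, 400, 350, 300, 250, 200, 150, 120,
             110, 130, 180, 240, 320, 400, 470, 500, 510, 500, 480, 460]
print(schedule_jobs(intensity, 6))   # -> [9, 10, 11, 12, 13, 14]
```

Training checkpoints, batch inference and index rebuilds can all be deferred this way; only latency-sensitive inference truly needs round-the-clock power.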
Tech giants are already acting like new-age power companies, securing priority electricity contracts, buying entire wind farms, and exploring nuclear partnerships.
But humanity now faces an uncomfortable question: What happens if we can’t generate enough electricity for both AI and ourselves?
