Not since the advent of virtualisation has the data centre world seen such a profound shift. Virtualisation, which allowed one physical server to act as many, transformed the efficiency and economics of computing in the 2000s.
Today, data centres stand at a similar inflection point – only this time, the change is being driven by artificial intelligence, supercomputing, and quantum technologies.
Every week brings new announcements: IBM teaming with AMD on “quantum-centric supercomputing”, Nokia and nScale building AI-first networks, Fujitsu working with Japan’s National Institute of Advanced Industrial Science and Technology (AIST) on quantum competitiveness.
On the AI side, Nvidia and OpenAI have unveiled infrastructure plans so large they rival the scale of national utilities. Oracle and SoftBank are pouring billions into new data centre campuses.
What ties all these developments together is the recognition that yesterday’s data centres – designed to host websites, emails, and enterprise applications – cannot cope with the demands of tomorrow’s computing.
From conventional to specialised
Traditional data centres were built around the CPU (central processing unit), the all-purpose workhorse of computing. These facilities ran the cloud services most people are familiar with: file storage, banking apps, e-commerce, and streaming.
Power consumption was heavy but stable, with most of the electricity spent on running the machines and keeping them cool.
In the past five years, however, workloads have shifted. The rise of artificial intelligence, particularly generative AI, has changed the equation.
Training large models such as those behind ChatGPT or image-generation systems is not work CPUs are well suited to. These tasks demand parallel processing – thousands of calculations happening at once.
AI data centres and the GPU revolution
Enter the GPU (graphics processing unit). Originally designed for rendering video game graphics, GPUs are ideal for AI because they can crunch vast amounts of data simultaneously.
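The "vast amounts of data simultaneously" point comes down to independence: a matrix multiplication, the core operation of AI training, breaks into many dot products that don't depend on each other, so they can all run at the same time. A minimal pure-Python sketch (illustrative only – real speedups come from thousands of GPU cores, not Python threads):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy 2x2 matrix multiply: each output cell is an independent dot
# product, so all four can be computed concurrently. This shows the
# *independence* GPUs exploit, not an actual speedup.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

def cell(i, j):
    # Dot product of row i of A with column j of B
    return sum(A[i][k] * B[k][j] for k in range(2))

with ThreadPoolExecutor() as pool:
    futures = {(i, j): pool.submit(cell, i, j)
               for i in range(2) for j in range(2)}
    C = [[futures[(i, j)].result() for j in range(2)] for i in range(2)]

print(C)  # [[19, 22], [43, 50]]
```

A GPU runs the same idea at scale: thousands of cores each handle one slice of the computation in lockstep.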
Today, AI-specialised data centres look very different on the inside: racks filled with GPUs, high-speed interconnects linking them, and advanced cooling systems – often liquid-based – to deal with the heat.
This transformation is not subtle. Nvidia, which dominates the GPU market, is at the heart of it:
- It has agreed to invest $5 billion in Intel, co-developing custom CPUs and hybrid chips designed specifically for AI and data centre workloads.
- In partnership with OpenAI, Nvidia is deploying 10 gigawatts of computing capacity – an infrastructure build roughly equivalent to the output of ten large nuclear reactors, at about a gigawatt each.
- Alongside Oracle and SoftBank, OpenAI is expanding the so-called Stargate project, adding five new AI-focused data centre sites.
- Nvidia and OpenAI have described their efforts as the largest AI infrastructure deployment in history.
All of this means more electricity. Analysts point out that a single AI data centre can draw as much power as a small town.
There are attempts to make systems more efficient. For example, some AI models, like DeepSeek, consume less energy but may sacrifice certain capabilities such as image generation. The trade-off between capability and efficiency is becoming a central debate in AI infrastructure.
Supercomputing meets commercial AI
Traditionally, “supercomputers” were government or research machines used for climate models, nuclear simulations, or advanced physics. They often made headlines as the “fastest in the world” based on FLOPS (floating-point operations per second).
But the distinction between supercomputers and AI data centres is blurring. Both rely heavily on GPUs and high-speed networking. Today’s largest AI training clusters rival or exceed traditional supercomputers in raw capability.
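One way to see why AI clusters now rival supercomputers is a back-of-envelope compute estimate using the commonly cited 6×N×D rule of thumb for training FLOPs (N parameters, D tokens). All the numbers below are illustrative assumptions, not figures from this article:

```python
# Back-of-envelope training-compute sketch using the common 6*N*D
# heuristic. Every number here is an illustrative assumption.
params = 70e9        # a 70-billion-parameter model (assumption)
tokens = 1.4e12      # 1.4 trillion training tokens (assumption)
train_flops = 6 * params * tokens  # total floating-point operations

# Suppose the cluster sustains 1 exaFLOP/s (1e18) of effective compute:
cluster_flops_per_sec = 1e18
days = train_flops / cluster_flops_per_sec / 86400

print(f"{train_flops:.2e} FLOPs, ~{days:.1f} days on a 1 exaFLOP/s cluster")
```

Even under these rough assumptions, a single training run demands sustained exascale throughput for days – the territory that was, until recently, exclusive to national-lab machines.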
The difference is that instead of being confined to national labs, these GPU-powered giants are now deployed by tech companies for commercial use – serving billions of users through cloud platforms.
Quantum computing in the data centre
Quantum represents a different leap altogether. Instead of bits (0 or 1), quantum computers use qubits that can exist in multiple states at once. This allows them to explore solutions to certain problems much faster than classical machines.
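The “multiple states at once” idea can be made concrete with a toy state-vector sketch: n qubits are described by 2**n complex amplitudes, so the description doubles with every qubit added. This is both the source of quantum’s promise and the reason classical simulation hits a wall (the code is a simplified illustration, not how real quantum hardware works):

```python
import math

# Toy state-vector sketch: n qubits need 2**n complex amplitudes.
n = 3
dim = 2 ** n  # 8 basis states for 3 qubits

# An equal superposition over all 2**n basis states:
amp = 1 / math.sqrt(dim)
state = [complex(amp, 0)] * dim

# Squared amplitudes are probabilities, and they must sum to 1.
probs = [abs(a) ** 2 for a in state]
print(len(state), round(sum(probs), 6))  # 8 states, total probability 1.0
```

At 50 qubits the state vector already needs 2**50 amplitudes – petabytes of memory – which is why certain problems that overwhelm classical machines are natural fits for quantum hardware.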
The catch is that qubits are delicate. They must be cooled to near absolute zero, isolated from vibrations, and shielded from interference. As a result, quantum hardware doesn’t sit in rows of servers like GPUs. It requires specialised enclosures with cryogenic systems.
Companies are now working to integrate quantum into mainstream computing:
- IBM and AMD are collaborating on architectures for quantum-centric supercomputing, aiming to blend quantum and classical systems.
- Fujitsu and AIST are developing hybrid platforms to strengthen industrial competitiveness in fields such as materials science and logistics.
For now, quantum remains experimental, but many expect it will eventually coexist inside or alongside conventional data centres – much like GPUs once did.
Having said that, “quantum data centres” already exist in several countries – our main picture shows former Chancellor of Germany, Olaf Scholz, with the quantum system at the IBM facility in Germany last year.
The networking layer
Powerful chips are only one part of the puzzle. Moving data quickly between them is just as critical. AI training requires thousands of GPUs to work together seamlessly. That means ultra-fast, low-latency networks inside the data centre and across global sites.
This is where companies like Nokia are stepping in: its partnership with nScale focuses on building AI-first data centre infrastructure with high-performance connectivity. Networking, once a supporting role, is becoming a central design challenge in next-generation facilities.
Why the money is pouring in
The scale of investment is staggering. Nvidia, Intel, OpenAI, Oracle, SoftBank, IBM, Fujitsu, and Nokia are all pouring billions into infrastructure. Governments are funding quantum research as part of broader industrial strategies.
Why? Because data centres are no longer just warehouses of servers. They are becoming strategic assets – the equivalent of digital power plants. Countries and companies see them as critical to economic competitiveness, national security, and scientific progress.
The power problem
All this raises an uncomfortable question: can the power grid keep up?
- AI-focused facilities consume megawatts to gigawatts of electricity.
- Cooling methods are evolving, from air conditioning to liquid immersion and direct-to-chip cooling.
- Quantum paradoxically uses little power at the chip level but enormous amounts for cooling infrastructure.
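The megawatt-to-gigawatt figures above can be put in household terms with a quick scale comparison. The household figure below is an illustrative assumption, not a number from this article:

```python
# Rough scale comparison; figures are illustrative assumptions.
facility_gw = 10              # the announced 10 GW AI build-out
avg_household_kw = 1.2        # assumed average continuous household draw

# 1 GW = 1e6 kW, so:
households = facility_gw * 1e6 / avg_household_kw
print(f"~{households / 1e6:.1f} million households")
```

By this rough arithmetic, a 10 GW build-out draws as much continuous power as several million homes – which is why grid capacity, not chip supply, may become the binding constraint.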
Some companies are exploring ways to build renewable generation or even small modular nuclear reactors directly next to data centres. Otherwise, the world risks a clash between energy supply and the hunger for computation.
Why we need them
It’s fair to ask: why build machines that use the power of a small town just to answer prompts or generate pictures?
The truth is that the applications go far beyond chatbots and images. AI systems are already being used for:
- Drug discovery and medical diagnostics.
- Financial risk analysis.
- Autonomous vehicles.
- Industrial automation.
Supercomputers continue to be essential for climate modelling, defence simulations, and physics. Quantum could one day revolutionise logistics, energy systems, and cryptography.
At stake is not just convenience, but leadership in science, industry, and security.
The new industrial infrastructure
Data centres are in the midst of their biggest transformation since virtualisation. AI, supercomputing, and quantum are reshaping not just how they are built, but why they exist.
They are no longer background utilities but the factories of the digital economy. Whoever controls the most advanced data centres controls the future of computing – and by extension, the trajectory of industries and nations.
For readers, the plain-English takeaway is this: the world is building new kinds of data centres because yesterday’s infrastructure cannot handle tomorrow’s problems. And the scale of this transformation will touch everything from how we consume energy to how we live and work.