Nuclear data centers are a waste of time – even as AI power demand skyrockets

(Image credit: Getty Images)

Plentiful energy is in sight – we’re entering what some clean tech experts have dubbed the ‘era of energy abundance’. As tech companies pursue greener data centers and clean power to meet spiking generative AI energy demands and rapid data center expansion, some have turned back to nuclear power as the answer.

Some of the biggest names in tech have spent the last year making sizable investments in nuclear power, including AWS, Google, Microsoft, and Oracle. But while these are eye-catching developments, I think that time will see them go the way of Microsoft’s Project Natick – novel pursuits that were beaten out by cheaper and more efficient approaches.

I believe that by 2030, data centers relying on solar, wind, and geothermal energy will have the flexibility and scalability that operators seek. Nuclear, by contrast, just can’t keep up.

Hyperscaler nuclear hype is a dead end

It’s clear to me that the hyperscalers are simply trying to diversify their energy investments. If you have hundreds of billions of dollars to invest, a few billion in nuclear alongside renewable energy is small change for any tech giant.

This is no bad thing on paper, as any company would be foolish to put all its eggs in one basket when it comes to critical infrastructure such as its energy supply. But the sheer amount of noise and money going into nuclear seems a foolish overcorrection: the tech giants are about to discover just how much of a nightmare nuclear is to construct at any kind of scale.

If you want to build a nuclear reactor as fast as possible, you go to Japan, South Korea, or China. All three have a record of building reactors in under five years.

Sizewell B, situated in Suffolk, England, was built in just eight years. But over those years its cost was revised three times, and the plant ended up 35% over budget by the time it came online in 1995. The US’ newest nuclear reactor, Vogtle 4, took 11 years to build.

Including planning and consultation time, both projects took closer to 15 years to come to fruition. This is not the kind of time frame that big tech can stomach, particularly for AI expansion. As disagreements continue over whether AI training has hit a wall – that is, whether the scaling laws that tie how sophisticated generative AI models can be to how much data is used to train them still hold – no one can say for certain what the AI data center demands of 2026 will look like, never mind 2035.

Solar farms, on the other hand, can be constructed in just eight to eighteen months according to US Light Energy, with very low operating costs and no requirement to import fissile material.

Then there’s the potential for nuclear reliance to drive up emissions.

If these nuclear projects overrun – and I see no reason to suspect they won’t – data centers will be forced to run on ‘temporary’ fossil fuel generators. This isn’t a theoretical worry. Elon Musk’s xAI has been running its ‘Colossus’ supercomputer, an array of 100,000 Nvidia H100s intended to train advanced LLMs, on gas combustion turbines since the facility opened.

The site has been accused of worsening smog in Memphis, per CNBC, with the nonprofit Southern Environmental Law Center further alleging xAI is running the gas turbines without a permit and demanding an EPA investigation. The firm may yet add more gas turbines as it balances its 150MW of grid access, agreed with the Tennessee Valley Authority (TVA), against a push to multiply its processing power ten times over, as reported by the FT.

Critics of renewable energy like to cite ‘dunkelflaute’, a German term for those periods of both low wind and low sunlight in which renewable energy struggles to meet the power needs of users. There's no doubt data centers can't rely solely on the weather, but battery energy storage systems (BESS), which store renewable energy to be accessed when the sun isn’t shining and the wind isn’t blowing, greatly reduce this risk.

BESS costs have plunged in the past decade, with analysis by the think tank Ember measuring a drop from $450 per kWh in 2021 to $200 per kWh in 2024. As the price continues to drop, driven by advances in battery technology, big tech will see renewables as a safer and cheaper option.

Nuclear, meanwhile, is sold as the ultimate source of reliable energy – despite notable periods in which it’s proved anything but. Just ask France, which in 2022 suffered widespread nuclear power plant outages due to maintenance requirements, making it a net importer of electricity for the first time in decades. On average, France’s nuclear fleet has an energy availability factor, a measure of how much of the time a power plant is actually generating power, of around 74%.

Worldwide, the highest score is held by Finland at 91.4%, meaning that even in the best case, big tech would need backup power for roughly a tenth of data center operations. The stats underline both the need to diversify power sources and the overstated degree to which nuclear is ‘firm’ power.

Meanwhile, the sun keeps shining and the wind keeps blowing. With enough generating capacity and energy storage – the kind that could be built using, say, billions of dollars from tech companies desperate to expand data center campuses – you can lock in 24/7 power without having to split any atoms.

Jam tomorrow

Following its AI Growth Zone announcement, the UK government has reaffirmed its backing for more small modular reactors (SMRs), factory-built reactors that can in theory be assembled more quickly and cheaply than conventional plants, then strategically deployed alongside factories, refineries, and data centers.

When they start rolling off the production lines, these could make a genuine difference for sites such as data centers, allowing industry to run off-grid without relying on dirty power generators. The problem? We’re years away from pinning down which design to pursue and even building the factories that will eventually put SMRs together.

It’s possible that AI could shorten the design times for reactors, or lead to breakthroughs in nuclear energy that make it a more viable long-term bet. But the same is true of renewable energy, with potential for predictive AI to make solar and wind energy output easier to forecast and even use stored energy more efficiently.

Put simply, nuclear power cannot be stood up in time to make any difference to data center demand. If your answer to “how can we scramble to meet AI energy demand?” is “give us the best part of a decade to build this reactor”, you’re already on a path to spiraling costs and unreliable services.

Rory Bathgate
Features and Multimedia Editor

Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.

In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory at rory.bathgate@futurenet.com or on LinkedIn.