Data center power demand in US set to nearly triple amid AI boom


Data centers currently chew through 4.4% of US electricity, and that share could nearly triple to 12% by 2028.

That's according to Lawrence Berkeley National Laboratory (Berkeley Lab), which has analyzed data center electricity use on behalf of the US government, including servers, storage, network equipment, and other infrastructure.

The report comes as the AI boom is driving power demand at data centers, with Gartner predicting 40% of operators will face power constraints. That's led big tech companies to investigate alternative power sources, including nuclear.

According to the report, data center energy use remained stable from 2014 to 2016 at about 60TWh, with little growth seen since 2010. But in 2017 something changed: the adoption of GPUs to power AI kicked off a steady rise in energy consumption, reaching 76TWh in 2018 and 176TWh in 2023, or 4.4% of total US electricity use.

Given that growth, the report said it's likely that power demand for data centers will increase to between 325TWh and 580TWh in the next four years alone, making up between 6.7% and 12% of US electricity consumption.
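As a rough sanity check on these figures, the implied total US electricity consumption can be back-calculated from the report's numbers. This is a minimal sketch using only the values quoted in this article; the derived totals are not stated in the report itself:

```python
# Back-calculate implied total US electricity consumption from the
# report's figures as quoted in this article (derived, not official).

USE_2023_TWH = 176   # data center electricity use in 2023
SHARE_2023 = 0.044   # 4.4% of total US electricity

# Implied total US electricity consumption in 2023
total_2023 = USE_2023_TWH / SHARE_2023  # roughly 4,000 TWh

# Projected 2028 range from the report
low_twh, high_twh = 325, 580
low_share, high_share = 0.067, 0.12

# Implied total US consumption under each 2028 scenario
total_low = low_twh / low_share     # roughly 4,850 TWh
total_high = high_twh / high_share  # roughly 4,830 TWh

print(f"Implied 2023 US total: {total_2023:.0f} TWh")
print(f"Implied 2028 US total: about {total_low:.0f}-{total_high:.0f} TWh")
```

Both 2028 scenarios imply a similar total US consumption of roughly 4,800-4,850TWh, so the low and high projections differ mainly in the data center share, not in assumptions about overall grid growth.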

That still trails other countries, however. Data centers already eat up around 18% of Ireland's electricity.

The range is so wide, the report said, because it captures different scenarios depending on future equipment shipments, variations in cooling, and more.

"The equipment variations are based on the assumed number of GPUs shipped each year, which depends on the future GPU demand and the ability of manufacturers to meet those demands," the report explains.

"Average operational practices for GPU-accelerated servers represent how much computational power, and how often AI hardware in the installed base is used, to meet AI workload demand," the report adds.

"Cooling energy use variations are based on scenarios in cooling system selection type and efficiency of those cooling systems, such as shifting to liquid base cooling or moving away from evaporative cooling."

Cloud history

The report noted this isn't the first time tech sector innovation has led to an increase in data center use — but not necessarily energy use.

Between 2000 and 2005, data center electricity use roughly doubled. But through the early and mid-2010s, the shift from on-premise infrastructure to the cloud did not lead to a significant increase in energy use, largely because colocation and cloud facilities tended to be more efficient than facilities run at corporate headquarters.

"The efficiency strategies that allowed the industry to avoid increased energy needs during this period included improved cooling and power management, increased server utilization rates, increased computational efficiencies, and reduced server idle power," the report noted.

That might not be possible this time around, despite the many claims that AI will be able to address climate crisis-related challenges, because the AI boom requires new types of hardware.

"Most notably, the rapid growth in accelerated servers has caused current total data center energy demand to more than double between 2017 and 2023, and continued growth in the use of accelerated servers for AI services could cause further substantial increases by the end of this decade," the report notes.

But the industry should be pushing harder on data center efficiency, and that effort would be helped by greater transparency in data center energy use and by benchmarking.

"The current and possible near-future surge in energy demand highlights the need for future research to understand the early-stage, rapidly changing AI segment of the data center industry and identify new efficiency strategies to minimize the resource impacts of this growing and increasingly significant sector in our economy," the report adds.