Google DeepMind's chief scientist says AI energy use not as bad as it looks

Jeff Dean, chief scientist at Google DeepMind and Google Research, speaking at an event in 2020.
(Image credit: Getty Images)

Google's head of AI has said energy demands for the technology aren't as big of a problem as critics suggest.

Jeff Dean, chief scientist at Google DeepMind and Google Research, was speaking after Google released a report on its own rising greenhouse gas emissions, which have increased by almost half over the last five years, a rise largely pinned on AI-driven growth in data center energy consumption.

Dean argued that although AI energy demands in data centers are rising rapidly, they started from a very small base. "There’s been a lot of focus on the increasing energy usage of AI, and from a very small base that usage is definitely increasing," Dean said at a conference, according to Fortune.

"But I think people often conflate that with overall data center usage — of which AI is a very small portion right now but growing fast — and then attribute the growth rate of AI based computing to the overall data center usage," he added.

Dean's employer is one of those sounding a note of concern. “As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute, and the emissions associated with the expected increases in our technical infrastructure investment," Google said in its report.

Microsoft co-founder Bill Gates has also recently suggested we need not worry about AI's energy demands, as efficiencies created by the technology would solve the problem. Like Dean, he believes it's not that big of a problem to start with.

"Let’s not go overboard on this," Gates said at a conference in London, per The Guardian. "Data centers are, in the most extreme case, a 6% addition [in energy demand] but probably only 2% to 2.5%. The question is, will AI accelerate a more than 6% reduction? And the answer is: certainly."

More efficient systems through AI

Dean said Google remained on track to source all of its energy from renewable sources by the end of 2030, noting that the company is waiting for more clean energy providers to come online. Once they do, the share of Google's energy that counts as "clean" will improve immediately.

That said, Dean admitted that efficiency should be improved too, saying: "We also want to focus on making our systems as efficient as possible."

Back in 2020, Google researchers warned in a paper about the environmental impact of AI development. They were told not to put their names on the paper if it was published, and two of them left Google amid conflicting accounts, including Timnit Gebru, then co-lead of the company's ethical AI team. At the time, Dean said in an internal email that the paper "didn't meet our bar for publication" and failed to include recent findings on how models could be made more efficient.

Still awaiting solutions for AI

Despite those findings from four years ago, AI continues to have high energy demands. A recent study from researchers at machine learning company Hugging Face showed that a generative AI system can use as much as 33 times the energy of task-specific software performing the same job.

Part of that is down to the design of these massive systems. "Every time you query the model, the whole thing gets activated, so it’s wildly inefficient from a computational perspective," Hugging Face researcher Sasha Luccioni told the BBC.

One study suggested that generating a single image with AI uses as much energy as charging a smartphone, per Technology Review, while cloud-based tools such as ChatGPT continue to drive up energy demand at data centers.