AI model training could consume 4 gigawatts by 2030

AI model training could consume more than 4 gigawatts of power by 2030—enough to power entire cities—as energy demands for frontier AI development continue doubling annually, according to a new report from Epoch AI, a research institute that studies the trajectory of AI, and the Electric Power Research Institute (EPRI), an independent nonprofit. This exponential growth in power consumption poses significant challenges for utility companies and could derail tech giants’ climate commitments, even as companies explore distributed training and flexible power solutions to manage these unprecedented energy demands.

What you should know: Recent frontier training runs, such as the one behind Elon Musk’s Grok AI, already require 100-150 megawatts, but power demands are accelerating rapidly.

  • By 2028, each frontier AI training run is projected to consume 1-2 gigawatts of power.
  • Individual training runs could hit 4 gigawatts by 2030, approaching the power draw of some U.S. states.
  • Power demands have been more than doubling every year, despite significant efficiency improvements in chips and cooling systems (see the projection sketch below).
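
For a sense of how quickly annual doubling compounds, here is a minimal Python sketch of the projection. The 150 MW baseline in 2025 and the fixed 2x growth factor are illustrative assumptions, not figures from the report:

```python
# Minimal sketch of the annual-doubling trend described above.
# The 150 MW / 2025 baseline and the fixed 2x growth factor are
# illustrative assumptions, not figures from the Epoch AI/EPRI report.

def projected_power_gw(base_mw: float, base_year: int, target_year: int,
                       annual_growth: float = 2.0) -> float:
    """Project training-run power in gigawatts, assuming fixed annual growth."""
    return base_mw * annual_growth ** (target_year - base_year) / 1000

for year in (2026, 2028, 2030):
    print(f"{year}: ~{projected_power_gw(150, 2025, year):.1f} GW")
# Prints roughly 0.3 GW (2026), 1.2 GW (2028), and 4.8 GW (2030),
# in line with the 1-2 GW and 4 GW projections cited above.
```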

Why efficiency gains aren’t helping: Tech companies are reinvesting efficiency improvements into larger-scale operations rather than reducing overall energy consumption.

  • “AI companies tend to just reinvest those efficiency gains into scaling up,” Joshua You, a data analyst at Epoch AI, told Newsweek. “So that tends to swamp these efficiency improvements.”
  • This reflects a modern version of the Jevons paradox, in which efficiency improvements paradoxically lead to greater overall consumption (a worked example follows this list).
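
A back-of-the-envelope calculation makes the dynamic concrete. The growth factors below are purely hypothetical, chosen only to illustrate how scaling can swamp efficiency:

```python
# Illustrative Jevons-style arithmetic: power consumption still grows
# whenever compute scales up faster than efficiency improves.
# Both growth factors are hypothetical, for illustration only.

efficiency_gain = 1.5   # assume hardware delivers 1.5x more compute per watt yearly
compute_scaleup = 3.0   # assume labs deploy 3x more total compute yearly

net_power_growth = compute_scaleup / efficiency_gain
print(f"Net annual power growth: {net_power_growth:.1f}x")  # prints 2.0x
```

Even with a 50 percent yearly efficiency gain, tripling compute still doubles power draw, which matches the trend the report describes.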

Potential solutions emerging: Companies are testing distributed training methods and flexible power systems to manage peak demand.

  • EPRI, Oracle, NVIDIA, and startup Emerald AI demonstrated shifting AI computing work between data centers in Phoenix to avoid straining local power grids during peak demand.
  • The DCFlex partnership, launched last year, explores on-site power generation and large-scale battery storage systems for data centers.
  • “We have technical demonstrations where you can distribute the training,” Tom Wilson, principal technical executive at EPRI, explained.
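
The orchestration logic behind the Phoenix demonstration hasn’t been published. Purely as a hypothetical sketch of the load-shifting idea, a grid-aware scheduler might route training jobs away from stressed sites; every name, threshold, and policy below is invented for illustration:

```python
# Hypothetical sketch of grid-aware workload shifting between data centers.
# Site names, the stress threshold, and the scheduling policy are invented;
# they do not describe the actual EPRI/Oracle/NVIDIA/Emerald AI demo.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    grid_load_pct: float   # current utilization of the local grid (percent)

STRESS_THRESHOLD = 85.0    # above this, shed training load from the site

def pick_training_site(sites: list[Site]) -> Site:
    """Route the next training job to the least grid-stressed eligible site."""
    eligible = [s for s in sites if s.grid_load_pct < STRESS_THRESHOLD]
    if not eligible:
        raise RuntimeError("All grids stressed; defer training to off-peak hours")
    return min(eligible, key=lambda s: s.grid_load_pct)

sites = [Site("phoenix-1", 92.0), Site("phoenix-2", 70.0)]
print(pick_training_site(sites).name)  # prints "phoenix-2"
```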

Climate implications: The AI boom is undermining tech companies’ net-zero commitments as they turn to both renewable and fossil fuel sources.

  • Tech companies are already among the world’s biggest purchasers of renewable energy.
  • However, many are also choosing fossil fuel sources for new data center developments.
  • The energy surge has knocked major players like Google and Microsoft off target for their ambitious climate goals.

The big picture: This research focused specifically on training new AI models rather than broader AI infrastructure energy use, providing utilities with crucial insights for planning future power capacity and grid management strategies.

