A new report from EPRI and Epoch AI, Scaling intelligence: the exponential growth of AI’s power needs, forecasts that by 2030, training a leading large-scale AI model could require more than 4 GW of power.

Despite rapid efficiency gains, the power demanded by training a leading model has more than doubled every year for the past decade, according to the report. AI companies have found that increasing model size and complexity improves performance, which in turn drives the need for more compute and more electrical power.

The report finds that the AI industry will likely continue to scale up its models in the coming years, despite recent computational efficiency breakthroughs.

Total AI power capacity in the USA is estimated at around 5 GW today and could reach more than 50 GW by 2030 — matching total global demand from data centres today and comprising a rapidly growing share of overall data centre power demands.
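The figures above imply steep compound growth. As a rough sketch (assuming "today" means 2025, which the article does not state), a decade of annual doubling and the projected rise from 5 GW to 50 GW work out as follows:

```python
# Rough arithmetic behind the article's figures.
# Assumption (not stated in the article): "today" = 2025, so 5 years to 2030.

# Training power more than doubling every year for a decade
# means at least a ~1000x increase over that period.
decade_multiplier = 2 ** 10  # 1024

# Implied compound annual growth rate (CAGR) of US AI power capacity,
# from ~5 GW today to ~50 GW by 2030.
current_gw = 5.0
target_gw = 50.0
years = 5

cagr = (target_gw / current_gw) ** (1 / years) - 1

print(f"Decade of doubling: at least {decade_multiplier}x growth")
print(f"Implied annual capacity growth: {cagr:.0%}")  # roughly 58% per year
```

Stretching the window to six years would soften the implied rate to roughly 47% per year, so the conclusion is not very sensitive to the exact start date.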

“This report offers a rigorous, data-driven look at these trends and where they’re headed,” said Jaime Sevilla, director of Epoch AI.