Data center growth, spurred largely by generative AI, is expected to cause global energy demand to surge in the coming years, and a report this week from the International Energy Agency provides new estimates on just how much.
The amount of electricity needed for data centers worldwide is projected to more than double to roughly 945 terawatt-hours in 2030, which is more than the entire country of Japan consumes today, according to the report.
The popularity of gen AI tools like OpenAI’s ChatGPT and Google’s Gemini has soared in the past few years. These large language models and their kin require a huge amount of computing power, running on high-end graphics processing units like those manufactured by Nvidia. Those chips not only need a lot of electricity to operate but also generate heat, meaning even more energy is required to keep them cool. All of that adds up quickly.
About half of the increase in US electricity demand by 2030 is expected to come from data centers. By then, processing data is expected to require more electricity than manufacturing all “energy-intensive goods” combined, including aluminum, steel, cement and chemicals, the IEA said.
“AI is one of the biggest stories in the energy world today — but until now, policymakers and markets lacked the tools to fully understand the wide-ranging impacts,” IEA Executive Director Fatih Birol said in a press release.
AI’s energy use has policymakers worried
The anticipated energy demand for AI has been well-documented and on the minds of policymakers and experts for years. Just this week, the House Energy and Commerce Committee held a hearing on how to provide enough energy for data centers as members of Congress try to understand the issue.
Experts said the US electricity system will need improvements and changes to meet rising demand after decades of relatively flat consumption. One area of focus for the committee was providing “baseload power,” a consistent electricity supply throughout the day. That desire for consistency prompted questions from members about whether renewable energy sources like solar and wind would suffice or whether the rising demand would have to be met by new power plants burning fossil fuels like natural gas. The burning of those fossil fuels is a major driver of climate change.
Data centers like this one in Ashburn, Virginia, are becoming increasingly numerous as demand for computing power grows to keep up with new generative AI models.
There’s also the question of whether the existing US power grid can handle the demand and additional supply necessary.
“Even without the anticipation of rapidly increasing electricity demand, the US power grid is in need of modernization investments,” testified Melissa Lott, a professor at the Climate School at Columbia University. “The recent forecast for rapidly increasing power demand makes these investments even more urgent and necessary.”
Matching electricity supply to demand, given surges during events like heat waves and the daily rise and fall of solar generation, is a problem that can be addressed in several ways. Lott pointed to energy efficiency efforts, like Energy Star appliances, and demand-response programs, also known as virtual power plants. Those can reduce energy demand and level out the peaks and valleys throughout the day.
Can AI solve its own energy problem?
The IEA report suggests generative AI could, over time, help fix the problem caused by its energy demand. If AI accelerates science and technology improvements, it could lead to better solar panels and batteries, for example. “Energy innovation challenges are characterized by the kinds of problems AI is good at solving,” the IEA report said.
But another solution might lie in how AI data centers use power. A report earlier this year from researchers at Duke University suggested that AI data centers can more easily be turned off and on to adjust to the needs of the electrical system, allowing grid operators to accommodate growth more easily.
“This analysis should not be interpreted to suggest the United States can fully meet its near and medium-term electricity demands without building new peaking capacity or expanding the grid,” the researchers wrote. “Rather, it highlights that flexible load strategies can help tap existing headroom to more quickly integrate new loads, reduce the cost of capacity expansion, and enable greater focus on the highest-value investments in the electric power system.”