AI Uses Energy, AI Saves Energy
The International Energy Agency examines the energy costs and potential savings of the AI boom

(Figure: Bar chart comparing electricity use by text-generation models of various sizes: very small, small, medium-sized, large MoE, and large reasoning.)

AI’s thirst for energy is growing, but the technology also could help produce huge energy savings over the next five to 10 years, according to a recent report.

What’s new: The International Energy Agency (IEA), which advises 44 countries on energy policy, performed a comprehensive analysis of AI’s energy consumption including energy required to obtain critical materials needed for chips and data centers. The report sees dark clouds ahead but also silver linings.

Dark clouds: The report, which is based on interviews with officials in government, energy, and technology, makes four projections for AI’s energy consumption. In the base scenario, future growth and efficiency gains are similar to those of the past five years. The agency also plots a “take-off” scenario in which AI adoption happens faster, a “high efficiency” scenario with lower energy needs, and a “headwinds” scenario in which adoption of AI slows or infrastructure bottlenecks impede construction. Among the conclusions:

  • Demand for electricity by data centers worldwide will more than double by 2030 in the base scenario, growing from 415 terawatt-hours (TWh) today to 945 TWh, around 2.5 percent of current global electricity consumption. By 2035, projections range from 700 TWh to 1,700 TWh depending on the scenario.
  • By 2030, data centers outfitted with AI accelerator chips will consume four times the energy they do today.
  • The United States, China, and Europe host more data centers (and use more electricity to run them) than the rest of the world. As in many countries, their data centers cluster in a few geographic regions that draw on the same power sources, which eventually will strain local electrical grids. Together, the U.S. and China will account for 80 percent of global growth in data-center electricity consumption by 2030. Japan and Malaysia will also see strong growth.
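The projections above hang together arithmetically. A quick sanity check, using only the figures cited in the report (the calculations are illustrative, not part of the IEA's analysis):

```python
# Sanity check of the IEA base-scenario figures cited above.
# All inputs come from the article; the arithmetic is illustrative.

current_twh = 415    # global data-center electricity demand today
base_2030_twh = 945  # projected demand in 2030, base scenario

growth_factor = base_2030_twh / current_twh
print(f"2030 demand is {growth_factor:.1f}x today's")  # ~2.3x, i.e., "more than double"

# Spread of the 2035 projections across the four scenarios
low_2035_twh, high_2035_twh = 700, 1700
print(f"2035 high scenario is {high_2035_twh / low_2035_twh:.1f}x the low one")  # ~2.4x
```

The roughly 2.3x growth factor is what the report summarizes as "more than double," and the 2.4x spread between the 2035 scenarios shows how much the outlook depends on adoption speed and infrastructure.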

Silver linings: AI already makes energy generation, distribution, and use more efficient. The authors expect these savings to accelerate. 

  • Existing AI algorithms predict energy generation and consumption. This makes it easier to integrate renewable energy sources into the grid, which reduces reliance on fossil fuels and cuts the resulting pollutants and greenhouse gases. Extending existing programs to increase use of renewables by 1 percent would reduce CO2 emissions by 120 megatons by 2035, which is roughly 40 percent of the projected emissions attributable to data centers.
  • Widespread adoption of existing AI applications that streamline energy consumption in industry, transportation, and buildings could reduce CO2 emissions by 1.4 gigatons by 2035, nearly five times the projected emissions attributable to data centers. For example, scaling up existing AI optimization of heating, ventilation, and air-conditioning systems would save 300 TWh, about one-third of the total energy used by data centers.
  • AI and cloud-computing companies continue to negotiate long-term purchase agreements that can secure renewable and zero-emissions energy for as much as 20 years. Data center operators are responsible for most of the long-term contracts that have been announced, nearly all of them for solar energy. Consequently, renewables generation is projected to grow by over 450 TWh by 2035.
  • The energy costs of training, inference, and cooling hardware are expected to fall further thanks to trends in AI models (fewer parameters, more efficient algorithms, task-specific models), hardware (more energy-efficient chips, improved cooling methods), and usage (batch processing, running smaller models locally rather than in the cloud).
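The savings figures above are mutually consistent. A brief check, using the article's numbers (the implied data-center baseline is inferred from them, not stated directly):

```python
# Cross-check the savings figures quoted above against the implied
# data-center emissions baseline. Inputs come from the article;
# the baseline itself is inferred, not quoted.

renewables_saving_mt = 120    # CO2 cut from +1% renewables use, by 2035 (megatons)
share_of_dc_emissions = 0.40  # "roughly 40 percent" of projected data-center emissions

implied_dc_emissions_mt = renewables_saving_mt / share_of_dc_emissions
print(f"Implied data-center emissions: {implied_dc_emissions_mt:.0f} Mt")  # 300 Mt

# The 1.4 Gt efficiency figure should then be "nearly five times" that baseline.
broad_saving_mt = 1400
print(f"Ratio: {broad_saving_mt / implied_dc_emissions_mt:.1f}x")  # ~4.7x
```

Both savings estimates point to the same baseline of roughly 300 megatons of projected data-center emissions by 2035, which matches the report's "nearly five times" framing.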

Yes, but: The authors concede that, per the Jevons paradox, lower energy costs for AI likely will spur much greater usage, so more-efficient models and hardware may increase energy consumption overall.

Behind the news: Data centers were growing rapidly prior to the boom in generative AI. Data centers’ electricity use doubled between 2000 and 2005 and again between 2017 and 2022, driven by the growth of cloud computing and data storage, streaming and social media, and cryptocurrency mining. However, these periods of accelerating growth were followed by periods of slower growth as efforts to cut costs led to more-efficient software and hardware. The authors expect this pattern to hold.

Why it matters: The IEA report is a first-of-its-kind analysis of AI’s energy requirements, how they’re likely to grow, and the technology’s own potential to reduce them. It confirms that AI is poised to consume huge amounts of energy. However, it also suggests that today’s energy costs will be tomorrow’s energy savings as AI makes energy generation, distribution, and use more efficient across a wide variety of industries.

We’re thinking: While demand for electricity for data centers is growing rapidly, calibrating the right level of investment is tricky. High levels of growth come with high levels of hype that can lead analysts to overestimate future demand. For example, Microsoft, after examining its forecasts, canceled data-center projects that would have consumed 2 gigawatts.
