
AI's Pursuit of Pancake Recipes Consumes Energy

Rapidly escalating global energy usage driven by advanced AI language models is prompting researchers to explore potential solutions.

AI consumption in pancake recipe searches: Turns out it's an energy guzzler


The energy consumption of AI systems is significantly increasing the electricity demand of data centers globally, making AI a major driver of rapid growth in data center power usage. According to recent findings, data centers consumed around 460 terawatt-hours (TWh) of electricity in 2022, a figure expected to rise to approximately 945 TWh by 2030. This is roughly equivalent to the total electricity consumption of a large country like Japan.
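As a rough sanity check on these figures, growing from roughly 460 TWh to 945 TWh over eight years implies an average annual growth rate of about 9 to 10 percent. The short sketch below makes that arithmetic explicit; it is illustrative only, using the article's round numbers.

```python
# Back-of-envelope check: what average annual growth rate turns
# ~460 TWh (2022) into ~945 TWh (2030)?
def implied_cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate implied by a start and end value."""
    return (end_twh / start_twh) ** (1 / years) - 1

growth = implied_cagr(460, 945, 8)
print(f"Implied annual growth: {growth:.1%}")  # roughly 9-10% per year
```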

AI's high energy consumption stems primarily from the computational intensity of model training, the sheer volume of inference requests, and the use of specialized hardware such as NVIDIA H100 GPUs. As a result, AI workloads consume disproportionately more power than traditional data center workloads, with an individual AI query using 10 to 30 times more electricity than a standard internet search.
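To get a feel for what a 10 to 30 times multiplier means at scale, the sketch below multiplies it out. The 0.3 Wh baseline per conventional search and the daily query volume are hypothetical round numbers chosen purely for illustration, not measured values.

```python
# Illustrative arithmetic only: the 10-30x multiplier is from the text;
# the 0.3 Wh search baseline and the query volume are assumed values.
SEARCH_WH = 0.3                # assumed energy per standard internet search (Wh)
AI_MULTIPLIER = 10             # lower bound of the 10-30x range cited above
QUERIES_PER_DAY = 100_000_000  # hypothetical daily AI query volume

ai_wh_per_query = SEARCH_WH * AI_MULTIPLIER  # 3 Wh per AI query
daily_mwh = ai_wh_per_query * QUERIES_PER_DAY / 1_000_000
print(f"{daily_mwh:.0f} MWh per day")  # 300 MWh/day under these assumptions
```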

The International Energy Agency's (IEA) executive director, Fatih Birol, recently stated that worldwide data center electricity demand will more than double over the next five years, and AI could account for up to 40% of global data center power demand by 2026. Data centers in the U.S. and China are expected to absorb much of this demand, with U.S. data centers projected to use 12% of the country's electricity by 2028.

However, the trend does not have to continue unchecked. Researchers point to several ways to make AI more energy-efficient.

Optimizing AI models to reduce parameter counts and computational requirements without sacrificing accuracy lowers the energy used per query. More energy-efficient AI-specific chips deliver higher performance per watt. Powering data centers with renewable energy and improving cooling efficiency mitigate the carbon footprint despite high power consumption. Smarter workload scheduling and resource sharing maximize hardware utilization and reduce idle energy waste. Finally, algorithmic advances such as pruning, quantization, and low-precision arithmetic decrease the compute energy required per task.
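As one concrete illustration of the algorithmic techniques mentioned above, the sketch below shows symmetric 8-bit quantization of a small list of weights. It is a minimal, dependency-free example, not any particular framework's implementation: storing weights as int8 instead of float32 cuts memory (and the associated data-movement energy) by roughly 4x, at the cost of a small rounding error.

```python
def quantize_int8(weights):
    """Symmetric linear quantization: map floats to int8 in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero input
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [q * scale for q in quantized]

q, scale = quantize_int8([0.5, -1.0, 0.25])
restored = dequantize(q, scale)  # close to the originals, small rounding error
```

In practice, frameworks combine this idea with calibration data and per-channel scales, but the energy argument is the same: fewer bits moved and multiplied per weight means less energy per inference.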

Governments and industries are also focusing on grid upgrades and energy infrastructure expansion to meet the surging demand safely while maintaining grid reliability. The energy mix will play a crucial role in determining the impact of AI on climate goals.

A recent study by researchers at the University of Cambridge calculated that Big Tech's energy demand will increase at least fivefold over the next 15 years due to AI. A separate study led by Maximilian Dauner of Munich University of Applied Sciences shows that the energy consumption of AI models varies not only between models but also with the topic of the query. Some AI tasks can also trigger rebound effects that add to overall consumption, such as generating a profile picture in the style of the Japanese animation studio Ghibli.

In conclusion, while the energy consumption of AI systems poses a significant challenge, innovations in model efficiency, hardware, renewable energy integration, and smarter data center management are key to curbing AI’s energy footprint while supporting its expansion. By focusing on energy efficiency, we can ensure that AI continues to be a powerful tool for progress without compromising our planet's future.


  1. AI's energy consumption is driven by compute-intensive training and inference on specialized hardware such as NVIDIA H100 GPUs; curbing it will require optimized models, renewable energy integration, smarter data center management, and advances in algorithm efficiency.
  2. The energy consumption of AI models varies significantly with their complexity, parameter count, the topic of the query, and the optimization strategies employed; some tasks, such as generating a profile picture in the style of the Japanese animation studio Ghibli, can trigger rebound effects that increase consumption.
