AI's Rise Drives Data Center Boom, Powering Generative Models Like DeepSeek R1
The data center landscape has transformed dramatically over the past decade: hyperscalers are now planning AI data center campuses with more than 1 GW of power demand, whereas a 30-MW facility was considered large ten years ago. Meanwhile, DeepSeek R1, a generative AI model from a Chinese startup, matches top U.S. models such as ChatGPT and Gemini while running on cheaper, less powerful hardware, making AI more affordable and accessible.
The surge in AI's popularity and usage has sharply increased power demand for data centers. By 2030, U.S. data centers alone are projected to need an additional 18 GW of power capacity, roughly three times New York City's current power demand. This growth is driven by reasoning AI models, which require more computing power and run longer, more resource-intensive inference cycles.
Power companies are responding by aligning capital investments with technology-driven demand, modernizing transmission and distribution networks, and tapping unused generation capacity. Global data center electricity consumption is projected to double by 2030, with hyperscalers, data center operators, and asset managers investing in larger, more modern facilities. Oracle and OpenAI, for instance, are planning AI data centers with a combined capacity of up to 4.5 GW in the U.S. as part of the Stargate AI infrastructure project.
The global race to build data centers is on, fueled by the rapid deployment of generative AI. While natural gas is expected to play a key role in meeting short-term energy demand due to its scalability and reliability, the long-term impact of AI on energy consumption remains uncertain. The Jevons Paradox suggests that as AI becomes more efficient, overall consumption of computing power and electricity could increase instead of decrease, highlighting the need for sustainable and efficient data center solutions.
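The Jevons Paradox argument above can be made concrete with a little arithmetic. The sketch below is purely illustrative: the query counts, efficiency factor, and demand-elasticity values are assumptions chosen to show the mechanism, not measurements of any real AI workload.

```python
# Illustrative sketch of the Jevons Paradox applied to AI compute.
# All numbers here are hypothetical assumptions, not real-world data.

def total_energy(baseline_queries, energy_per_query, efficiency_gain, demand_elasticity):
    """Total electricity use after an efficiency improvement.

    efficiency_gain: factor by which energy per query falls (2.0 = half the energy).
    demand_elasticity: how strongly usage grows as compute gets cheaper
    (1.0 = usage grows exactly in proportion to the efficiency gain).
    """
    new_energy_per_query = energy_per_query / efficiency_gain
    new_queries = baseline_queries * efficiency_gain ** demand_elasticity
    return new_queries * new_energy_per_query

# Baseline: 1M queries at 1 unit of energy each, no efficiency change.
baseline = total_energy(1_000_000, 1.0, 1.0, 1.0)

# Jevons case: hardware/models get 2x more efficient, but cheaper queries
# spur disproportionate demand growth (elasticity > 1), so total energy rises.
rebound = total_energy(1_000_000, 1.0, 2.0, 1.5)

# Counter-case: with weak demand growth (elasticity < 1), efficiency
# gains do reduce total consumption.
savings = total_energy(1_000_000, 1.0, 2.0, 0.5)
```

Under these assumptions, `rebound` exceeds `baseline` while `savings` falls below it: whether AI efficiency gains cut or swell overall electricity use hinges on how elastic demand for compute turns out to be, which is exactly the uncertainty the article describes.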