Businesses' Limited Assessment of AI Energy Consumption Raises Concern
AI's Power Consumption: A Blind Spot in Business
Many business leaders worry about AI's energy demands, yet few actually monitor them. According to a survey, seven out of ten leaders understand that AI models consume substantial energy, and half are anxious about the efficiency challenges that come with it. However, only 13% are tracking their AI systems' power consumption.
Six out of ten acknowledge that energy efficiency will play a significant role in future strategic planning, driven by both cost management and operational scalability. Rodrigo Liang, CEO of SambaNova Systems, said the study paints a concerning picture of AI adoption: companies are rushing to embrace the technology while neglecting to manage its energy impact.
Liang expects that by 2027, more than 90% of leaders will be worried about AI's power demands and will monitor consumption as a Key Performance Indicator (KPI) that corporate boards will closely track. Among those companies that have significantly deployed AI, over three-quarters are actively pursuing methods to cut power usage.
Popular strategies include hardware and software optimization (40%), adopting energy-efficient processors (39.3%), and investing in renewable energy (34.9%). Rising power costs are already a significant issue for one-fifth of companies; 37% report growing stakeholder pressure to improve AI energy efficiency, and a further 42% expect such demands to surface soon.
However, while seven out of ten leaders recognize the energy-intensive nature of training Large Language Models (LLMs), only six out of ten are aware of the significant power demands of inference. This is a critical gap: as agentic AI scales, inference workloads are poised to become AI's primary usage.
To address this issue, businesses can consider various strategies, such as streamlining AI workflows, transitioning to smaller, more efficient AI models, performing workload analysis, adopting cloud and edge computing, implementing advanced cooling systems, building green data centers, forging strategic partnerships, and investing in sustainable practices.
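One of the strategies above, transitioning to smaller, more efficient models, lends itself to a quick back-of-envelope estimate. The per-query energy figures and query volume below are illustrative assumptions, not measurements from the article:

```python
# Hypothetical sketch: energy impact of moving an inference workload
# from a large model to a smaller one. All figures are assumptions.

QUERIES_PER_DAY = 1_000_000     # assumed daily query volume
WH_PER_QUERY_LARGE = 3.0        # assumed energy per query, large model (Wh)
WH_PER_QUERY_SMALL = 0.5        # assumed energy per query, small model (Wh)

def annual_mwh(wh_per_query: float, queries_per_day: int) -> float:
    """Annual energy in MWh for a given per-query cost and daily volume."""
    return wh_per_query * queries_per_day * 365 / 1e6

large = annual_mwh(WH_PER_QUERY_LARGE, QUERIES_PER_DAY)
small = annual_mwh(WH_PER_QUERY_SMALL, QUERIES_PER_DAY)
print(f"Large model: {large:.0f} MWh/year")   # 1095 MWh/year
print(f"Small model: {small:.0f} MWh/year")   # 182 MWh/year (rounded)
print(f"Savings:     {1 - small / large:.0%}")
```

Under these assumed numbers, the smaller model cuts annual inference energy by more than 80%, which is why model right-sizing appears alongside hardware measures in the list above.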
The environmental stakes are substantial. The International Energy Agency (IEA) has found that a query to a service like ChatGPT uses nearly ten times more electricity than a standard Google search. Training a large language model consumes almost 1,300 MWh of electricity, roughly the annual consumption of 130 US homes. And if ChatGPT-level processing were applied to the nine billion searches conducted daily, electricity demand would rise by about 10 terawatt-hours per year.
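The IEA figures above hang together arithmetically; a short sanity check makes this visible. The per-query energy values are assumptions consistent with the ten-to-one ratio the article cites, and the ~10 MWh-per-home figure is a rough US average, not from the article:

```python
# Back-of-envelope check of the cited IEA figures. Per-query energy
# values and the per-home average are labeled assumptions.

WH_PER_CHATGPT_QUERY = 2.9   # assumed Wh per ChatGPT query
WH_PER_GOOGLE_SEARCH = 0.3   # assumed Wh per Google search

# Ratio: roughly the "ten times more electricity" claim
print(f"ChatGPT vs. search: ~{WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH:.0f}x")

# Training: 1,300 MWh vs. an assumed ~10 MWh per US home per year
TRAINING_MWH = 1_300
MWH_PER_US_HOME_YEAR = 10
print(f"Training = {TRAINING_MWH / MWH_PER_US_HOME_YEAR:.0f} homes' annual use")

# 9 billion daily searches at ChatGPT-level energy, in TWh per year
SEARCHES_PER_DAY = 9e9
annual_twh = SEARCHES_PER_DAY * WH_PER_CHATGPT_QUERY * 365 / 1e12
print(f"Annual demand: ~{annual_twh:.0f} TWh")
```

With these inputs the three headline numbers (about 10x per query, about 130 homes, about 10 TWh per year) all fall out of the same few assumptions.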
As businesses increasingly integrate AI, addressing energy efficiency and infrastructure readiness will be crucial for long-term success. This could potentially drive a shift in the AI hardware landscape, favoring solutions that deliver high performance without exorbitant energy demands.
- The concern about AI's energy demands in business is not just about cost management; it is also environmental, as a query to a service like ChatGPT uses roughly ten times more electricity than a standard Google search.
- With the scaling of agentic AI, inference workloads are poised to become AI's primary usage, yet awareness lags: seven out of ten leaders recognize the energy-intensive nature of training Large Language Models, while only six out of ten appreciate the power demands of inference.
- To manage AI's power consumption effectively, businesses can invest in renewable energy, adopt energy-efficient processors, streamline AI workflows, transition to smaller, more efficient models, implement advanced cooling systems, build green data centers, and forge strategic partnerships.