Artificial intelligence systems could account for nearly half of all data center power consumption by the end of this year, according to a recent analysis.
The estimate, from Alex de Vries-Gao, founder of the Digiconomist tech sustainability website, echoes a prediction from the International Energy Agency (IEA) that by the end of the decade AI could require almost as much energy as Japan uses today.
De Vries-Gao’s calculations, published in the sustainable energy journal Joule, are based on the power consumed by chips made by companies such as Nvidia and Advanced Micro Devices (AMD) that are used to train and operate AI models. The study also factors in the energy consumption of chips from other providers, such as Broadcom.
The IEA reported that all data centers (excluding those used for cryptocurrency mining) consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao says AI already accounts for 20% of that total.
His calculation accounts for variables such as the energy efficiency of data centers and the electricity required by the cooling systems that manage AI workloads. Data centers are the central nervous system of AI technology, making their energy consumption a significant sustainability issue for the development and use of AI.
De Vries-Gao projects that by the end of 2025, AI systems could account for up to 49% of total data center power consumption, reaching 23 gigawatts (GW), twice the total energy consumption of the Netherlands.
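As a rough sanity check, the 23GW figure is consistent with applying the projected 49% share to last year’s 415 TWh total. A back-of-envelope sketch (all input values taken from the article; it assumes total data center consumption stays roughly flat, which the article does not state):

```python
# Back-of-envelope check of the figures reported above.
TOTAL_DATA_CENTER_TWH = 415  # IEA figure for last year, excluding crypto mining
AI_SHARE = 0.49              # projected AI share by end of 2025
HOURS_PER_YEAR = 8760

ai_twh = TOTAL_DATA_CENTER_TWH * AI_SHARE      # ~203 TWh per year
avg_power_gw = ai_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then divide by hours

print(f"AI energy: {ai_twh:.0f} TWh/yr, about {avg_power_gw:.1f} GW of average draw")
```

The result, roughly 23 GW of continuous average draw, matches the figure De Vries-Gao cites.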
However, De Vries-Gao noted that several factors could dampen hardware demand, including waning interest in applications such as ChatGPT. Geopolitical tensions that restrict AI hardware production, such as export controls, are another hurdle. He pointed to Chinese companies’ curtailed access to chips, which preceded the release of the DeepSeek R1 AI model, whose developers said it required fewer chips to train.
“These innovations could help decrease both AI processing and energy costs,” said De Vries-Gao.
That said, he noted that greater efficiency could also encourage wider AI adoption. In addition, a trend known as “sovereign AI,” in which countries seek to build their own AI systems, could further boost hardware demand. De Vries-Gao cited the US data center startup Crusoe Energy, which has secured 4.5GW of gas-powered energy capacity, with OpenAI among its potential customers through the Stargate venture.
“These early indicators suggest that [Stargate] data centers may increase our reliance on fossil fuels,” noted De Vries-Gao.
On Thursday, OpenAI unveiled its Stargate project in the United Arab Emirates, marking its expansion outside the United States.
Last year, Microsoft and Google acknowledged that AI is putting their internal environmental targets at risk.
De Vries-Gao said information about AI’s power demands has become increasingly scarce, describing the industry as “opaque.” The EU AI Act requires AI firms to disclose the energy consumed in training a model, but not the energy used in day-to-day operation.
Professor Adam Sobey, mission director for sustainability at the UK’s Alan Turing Institute, stressed the need for greater transparency about the energy AI systems consume, and said AI could deliver savings by helping to cut carbon emissions in sectors such as transport and energy.
Sobey remarked, “We don’t necessarily need an extensive number of compelling use cases for AI to offset the energy costs incurred upfront.”
Source: www.theguardian.com