Growing Pains: How AI Is Challenging Decarbonization

According to experts at the World Economic Forum (WEF), the computational power used in the artificial intelligence industry is doubling every 100 days. This growth corresponds with a significant increase in the electricity consumed by data centers, presenting a complex paradox.
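To put that growth rate in perspective (this arithmetic is our illustration, not a figure from the WEF): a quantity that doubles every 100 days multiplies by roughly 12.6 over a year.

```python
# Illustrative compounding of a "doubles every 100 days" growth rate.
# The 100-day doubling period comes from the WEF figure cited above;
# the annualized factor is derived, not quoted.
DOUBLING_PERIOD_DAYS = 100
DAYS_PER_YEAR = 365

annual_factor = 2 ** (DAYS_PER_YEAR / DOUBLING_PERIOD_DAYS)
print(f"Annual growth factor: {annual_factor:.1f}x")
```

In other words, sustained 100-day doubling implies an order-of-magnitude increase in computational demand every single year.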
AI Energy Consumption: Forecasts
For AI systems to get better, they will need more training, a stage that involves bombarding the software with data. And that's going to run up against the limits of energy capacity,
states Rene Haas, CEO of Arm, a global leader in processor design.
The WEF estimates that the energy needed for AI is growing by about a third each year. By 2028, it predicts, the global AI industry will consume more energy annually than Iceland, a country better known for its low temperatures (the average July temperature in Reykjavik is +11 °C).

However, even more drastic forecasts exist. Rene Haas worries that power suppliers may not be able to meet the surging demand from expanding server farms. According to the WEF, achieving a tenfold increase in AI platform efficiency would require a 10,000-fold increase in computational power. Haas's projections suggest that by 2030, data centers could be consuming as much energy as India, the world's most populous country.

These figures have led the WEF to scrutinize AI's escalating ecological impact and to doubt whether its continued development can be sustainable.

AI Threatens Global Transition to Renewable Energy

The reality of the challenge is apparent from the operations of global cloud service providers. Amazon Web Services (AWS), the largest cloud provider, has run into difficulties in Virginia and Oregon, which host the majority of its server farms. In Virginia, the utility Dominion Energy, struggling with the increased demand, has halted power grid connections for new AWS data centers.

In Oregon, Amazon's electricity consumption surpassed what the local utilities could secure from hydroelectric plants, which produce renewable energy. To cover the shortfall, those utilities had to fall back on electricity generated by burning natural gas, a non-renewable resource. This episode captures the broader problem.

The concern extends beyond a simple shortage of electricity for artificial intelligence. The core issue lies in the fact that to meet data center demands, energy from fossil fuel sources may become increasingly necessary, thereby contradicting the global strategy aimed at transitioning to renewable energy. 

The term "energy transition" refers to the shift from reliance on fossil fuels to renewable energy sources, ideally leading to an entirely carbon-neutral economy. 

Regrettably, the surge in generative AI has not only revitalized old, environmentally harmful power plants but has also led to arguments supporting the construction of new such facilities. 

A Dose of Skepticism

The WEF highlights a paradox: AI technology itself could facilitate the sought-after sustainable development of energy. AI is already at the forefront of developing new energy storage solutions, enhancing energy planning, and more. In this way, AI serves as a vital tool for achieving the immediate objectives of the energy transition, such as tripling renewable energy capacities and doubling the energy efficiency of key manufacturing processes by 2030.

As a proposed compromise, the WEF recommends reducing energy consumption during the AI model training phase—which could result in a 12–15% energy savings—and optimizing the workloads of data centers. Strategies include scheduling shorter tasks for nighttime and planning major projects during the winter months, when the need for data center cooling is reduced. 

Moreover, transitioning to mega data centers instead of individual company infrastructures could be highly beneficial. Such facilities, due to their scale, can achieve lower costs per unit of production. Ideally, these centers would be integrated with renewable energy infrastructure to mitigate their environmental impact.

However, independent researcher Jonathan Koomey believes the WEF may be exaggerating the issue. Today, AI accounts for merely 0.1% of global electricity production, and even with the expected increase in computational power, that share is projected to rise only to 0.5%. For context, Koomey points out that a complete transition to electric vehicles in the U.S. would likely increase national electricity consumption by 20–25%.
Everyone needs to calm the heck down. I think there will be local impacts. There will be some places where AI and data centers drive load growth. In Ireland, which gets 17–18% of its load from data centers, that could give them real challenges. Same thing in Loudoun County, Virginia. But you really do have to separate the global story from the local story,
Koomey asserts.
Jesse Jenkins, a professor of energy system engineering at Princeton University, also emphasizes that the discussion about data centers' rising energy consumption often overlooks broader technological advancements.

He highlights the launch of Nvidia’s new chip for generative AI, which prioritizes energy efficiency: training AI models with the new chip is expected to require 73% less energy than with older models. This suggests a promising trend toward enhanced energy efficiency in future hardware generations.
That’s just one generation of GPU with a reduction of nearly three-quarters in energy consumption,
Jesse Jenkins notes.