Prof. Babak Falsafi was consulted by the Neue Zürcher Zeitung

Training and using artificial intelligence consumes huge amounts of energy. Even more efficient algorithms are unlikely to change this.

More and better artificial intelligence (AI) is the goal of the major tech companies in the USA. In recent weeks, OpenAI, Amazon and Meta have announced billions in investments in AI. Even Donald Trump spoke about the topic on his second day in office. A large part of this money is going into the infrastructure that forms the backbone of AI: data centers.

OpenAI and Microsoft have earmarked $100 billion for the “Stargate” project, Alphabet is planning to invest $75 billion in the development of AI this year, Amazon is investing $86 billion in infrastructure and Meta $65 billion. New data centers are also to be built in Europe, particularly in France: French President Emmanuel Macron announced investments of 109 billion euros at an AI summit in Paris. Energy requirements will rise with the expansion of the AI infrastructure, but by how much is a matter of debate.

There are far more data centers in the USA than in Europe. There, highly specialized chips train and run AI models around the clock. These data centers consume a lot of electricity. In the state of Virginia, data centers already account for a quarter of total electricity demand. But where is this energy supposed to come from?

Special cooling required

AI consumes electricity at several stages: first, large amounts are needed to train the AI model. Then more is required every time a user sends the model a request. Depending on whether text, images or videos are to be generated, each request to a chatbot such as ChatGPT consumes ten to thirty times as much energy as a query to an online search engine.

The computer chips needed to train and run AI are also far bigger power guzzlers than the conventional chips used, for example, for cloud applications. To train an AI model or process a query, a chip has to perform calculations rather than just store information, and those calculations generate heat. That is why the data centers need special cooling, which requires a lot of additional electricity, especially in hot regions such as Texas or Arizona.

This is also reflected in projections of the future power demand of data centers. A study by the consulting firm McKinsey estimates it at 80 gigawatts in the USA in 2030, up from 25 gigawatts last year. The Boston Consulting Group (BCG) likewise expects demand to triple; the fact that AI is becoming increasingly efficient has been factored into that calculation, BCG reported.

Data centers are not only being built in the USA; countries and companies around the world are investing in their expansion. The consulting firm Bain writes that the energy consumption of data centers worldwide increased by 72 percent between 2019 and 2023 and is expected to double again by 2027.

Today, data centers are responsible for one percent of global electricity demand. If the estimates are correct, they will account for a full 2.6 percent in 2027. That is still a small share, but the rapid increase highlights the need for reliable energy sources.

In principle, chips have become increasingly efficient in recent years. For AI chips in particular, however, the trend is toward higher power consumption. Nvidia’s latest Blackwell chip will draw 14 kilowatts of power per server – that’s 1,800 watts per chip. Fill an entire data center with such chips, and it will easily consume as much electricity as a medium-sized city. Babak Falsafi is a professor at the École Polytechnique Fédérale de Lausanne (EPFL) and works on the efficiency of data centers. He says: “The energy consumption of chips developed specifically for AI doubles with each new generation.”
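A rough back-of-envelope calculation illustrates the scale (the server count here is a hypothetical assumption, not a figure from the article): at 14 kilowatts per server and 1,800 watts per chip, each server holds roughly eight chips. A hypothetical data center with 10,000 such servers would draw 10,000 × 14 kW = 140 megawatts of continuous power – on the order of the average electricity demand of about 100,000 households.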

A year ago, Sam Altman, CEO of OpenAI, warned that an energy bottleneck would jeopardize the development of AI.

DeepSeek breaks the trend

The success of the Chinese start-up DeepSeek has cast doubt on the assumption that AI really needs ever more computing capacity. According to the company, DeepSeek trained its chatbot with fewer and less powerful chips and still achieved performance similar to that of OpenAI’s top model. In a research paper, DeepSeek explains the techniques it used to make its AI more efficient. Other AI developers can adopt these innovations for their own models.

If computing power can be saved, data centers consume less electricity. Babak Falsafi explains: “Improvements to algorithms could make them more efficient and thus save energy.” Yet algorithms that train AI more efficiently do not necessarily reduce the overall energy required for AI applications. They make AI cheaper and therefore more attractive to users, and if more people and companies use AI, power consumption rises again – a classic rebound effect. Costs and energy requirements then merely shift from training to everyday use.

It could be years or even decades before new forms of energy production provide sufficient power

Microsoft is therefore turning to nuclear energy and is financing the recommissioning of the Three Mile Island nuclear power plant, which was shut down in 2019 because it was no longer profitable. In the fall, Amazon and Google announced large investments in so-called small modular reactors (SMRs). These small, modular nuclear power plants generate up to 300 megawatts of power and can supply data centers directly with electricity. None of these mini reactors is yet connected to the grid in the USA, and none is anywhere close to being commissioned. Sam Altman himself is backing start-ups such as Oklo, which is developing small nuclear reactors that run on nuclear waste as fuel, and he is also investing in Helion, a company that specializes in nuclear fusion. Altman is spending hundreds of millions of dollars on these energy bets in the hope of a breakthrough.

It could be years or even decades before new forms of energy production supply sufficient electricity. Until then, data centers will often run on power from fossil fuels, because the AI boom is consuming electricity today.


By Lena Waltle and Anna Weber – original story in German published on 22 February 2025 in the Neue Zürcher Zeitung

Photo: Babak Falsafi, 2024