Swiss national radio channel RTS spoke to Dr. Xavier Ouvrard of EcoCloud, and Prof. David Atienza, Associate Vice-President for Centers and Platforms at EPFL, about current trends in data center technology at the EcoCloud research facility.
You can listen to the interview in French, or read the transcript in English, below.
7.08.2025 - Interview for RTS Première, Les Énergiques by François Jeannet
Transcription in English:
François Jeannet: Servers have to run at all times, and you can probably hear it: these servers are running, lights are blinking everywhere… and they generate a lot of heat, which means they need to be cooled.
I’m here with Professor David Atienza, who runs EPFL’s data center and research platform, and with Dr. Xavier Ouvrard, IT systems specialist at EcoCloud.
Server Consumption
François Jeannet: Mr. Ouvrard, how much energy do these servers consume?
Xavier Ouvrard: The servers we have in the experimental area consume very little, up to 150 kW.
But in EPFL’s main data center, we’re currently at around 900 kW, with an installed capacity of 2 MW.
François Jeannet: How does that compare to the consumption of a city or a neighborhood?
Xavier Ouvrard: This isn’t a huge data center, not even a medium-sized one: it’s small, even very small.
Its consumption is equivalent to that of 2,500 Swiss households per year.
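The household equivalence above can be sanity-checked with simple arithmetic (a sketch: the implied per-household figure is derived from the quoted 900 kW and 2,500 households, not stated in the interview):

```python
# Back-of-envelope check of the figures quoted above.
AVG_POWER_KW = 900         # current draw of EPFL's main data center
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

annual_kwh = AVG_POWER_KW * HOURS_PER_YEAR  # total energy over a year
per_household_kwh = annual_kwh / 2_500      # implied consumption per household

print(f"Annual consumption: {annual_kwh:,.0f} kWh")         # 7,884,000 kWh
print(f"Implied per household: {per_household_kwh:,.0f} kWh/year")  # ~3,154 kWh
```

The implied ~3,150 kWh per household per year is in line with typical Swiss household electricity consumption, so the comparison holds up.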
François Jeannet: Why does it need cooling?
Xavier Ouvrard: Because when you do calculations, computers heat up. All electricity is transformed into heat.
Without cooling, the temperature in the room would rise quickly, which would be unbearable with full rows of servers.
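To see how quickly the temperature would rise, here is a rough illustration (the room volume is an assumed figure, not from the interview; air properties are standard values):

```python
# How fast an uncooled machine room heats up, assuming all server
# power becomes heat in the room air (as stated in the interview).
P_WATTS = 150_000   # 150 kW of server load (experimental area figure)
ROOM_M3 = 500       # assumed machine-room volume (hypothetical)
AIR_DENSITY = 1.2   # kg/m^3, air at room temperature
AIR_CP = 1005       # J/(kg*K), specific heat of air

air_mass = ROOM_M3 * AIR_DENSITY            # 600 kg of air
heat_capacity = air_mass * AIR_CP           # ~603 kJ per kelvin
kelvin_per_minute = P_WATTS / heat_capacity * 60

print(f"{kelvin_per_minute:.0f} K per minute")  # ~15 K/min without cooling
```

Even ignoring heat absorbed by walls and equipment, the air alone would warm by on the order of 15 °C per minute, which is why continuous cooling is essential.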
Energy Optimization
François Jeannet: At EPFL, you conduct research to optimize this energy management. How?
Xavier Ouvrard: We created EcoCloud, a research center dedicated to sustainable computing.
One of our flagship projects, Heating Bits, takes a holistic approach:
- managing the energy supplied to data centers,
- storing renewable energy when it’s available,
- optimizing cooling.
We use lake water for cooling, then recover the heat to:
- heat buildings in winter,
- possibly co-generate electricity in summer.
Security and Redundancy
François Jeannet: In case of a breakdown, how do you prevent data loss?
Xavier Ouvrard: We have redundancy in the main data center, battery storage for temporary backup, and diesel generators for extended outages.
National Scale
François Jeannet: There is also a data center in Gland, the largest in Switzerland, whose annual consumption is equivalent to that of a city of 30,000 inhabitants.
Impact of Artificial Intelligence
François Jeannet: Will AI require even more energy?
David Atienza: Yes, but it depends on how we use it.
Many applications can run on smaller, less energy-hungry models.
We plan to double our data center’s capacity to 4 MW in the coming years.
For comparison, some centers in the United States or China reach 150 to 600 MW.
Without optimization, AI could account for up to 10% of global energy consumption.
With current efforts, we think we can keep the increase to around 4%.
David Atienza: To limit this growth, we can:
- recover heat for heating,
- adjust consumption based on energy availability,
- use data centers as regulators for power grids,
- manage loads with AI to optimize operation.
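One of the levers above, adjusting consumption based on energy availability, can be sketched in a few lines (an illustrative toy scheduler, not EPFL's actual system; the forecast values are hypothetical):

```python
# Toy sketch of load shifting: run deferrable compute jobs in the
# hours when forecast renewable supply is highest.
renewable_kw = [120, 400, 950, 800, 300]  # hypothetical hourly forecast
deferrable_jobs = 3                       # jobs that can run in any hour

# Pick the greenest hours, one per deferrable job.
greenest_hours = sorted(range(len(renewable_kw)),
                        key=lambda h: renewable_kw[h],
                        reverse=True)[:deferrable_jobs]

print(sorted(greenest_hours))  # hours 1, 2 and 3 in this toy forecast
```

Real grid-aware schedulers weigh many more factors (deadlines, electricity prices, cooling capacity), but the principle is the same: move flexible load toward the hours when clean energy is abundant.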
Consumption: AI vs Google Search
François Jeannet: What’s the difference between a Google search and an AI query (Gemini, ChatGPT…)?
David Atienza: An AI query consumes more because it has to pass through very large neural network models.
But companies are starting to integrate smaller models directly into our devices, which will reduce consumption.
François Jeannet: Sam Altman, CEO of OpenAI (ChatGPT), estimates that one AI query consumes about 0.3 Wh.
One week of the Paléo Festival consumes 250,000 kWh, or some 735 million possible queries.
That would allow its 230,000 festivalgoers to make around 3,200 queries each for the same energy.
David Atienza: Indeed, a festival like Paléo consumes almost 10 times more than our data center.
For comparison: a household consumes about 10 Wh per minute, so one AI query (0.3 Wh) represents only a couple of seconds of domestic energy consumption.
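The arithmetic behind these comparisons can be checked directly (a sketch using the figures quoted in the interview; 0.3 Wh per query is Sam Altman's estimate):

```python
# Sanity-check of the Paléo and household comparisons.
WH_PER_QUERY = 0.3        # Sam Altman's estimate for one AI query
FESTIVALGOERS = 230_000
QUERIES_EACH = 3_200
HOUSEHOLD_WH_PER_MIN = 10  # typical household draw quoted above

total_queries = FESTIVALGOERS * QUERIES_EACH           # 736 million
energy_kwh = total_queries * WH_PER_QUERY / 1_000      # energy for all queries
seconds_per_query = WH_PER_QUERY / HOUSEHOLD_WH_PER_MIN * 60

print(f"Total queries: {total_queries:,}")             # 736,000,000
print(f"Energy: {energy_kwh:,.0f} kWh")                # ~220,800 kWh
print(f"One query = {seconds_per_query:.1f} s of household use")  # 1.8 s
```

The ~735 million queries quoted indeed cost on the order of the festival's 250,000 kWh, and a single query amounts to only a couple of seconds of a household's consumption.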
Conclusion
AI is not “ultra energy-hungry” per query, but massive use multiplies the impact.
Research at EPFL aims to keep optimizing this consumption, notably through heat recovery, load modulation, and local integration of AI models.