News - June 15, 2025

AI, Data Centres, and Their Expanding Impact on the Environment

- Photo by Joshua Sortino

Written by Solar Impulse Foundation | 9 min read

The Surge of AI-Specific Data Centres

Since the beginning of the 21st century, artificial intelligence has had a profound impact on the evolution of ICT (Information and Communication Technologies) and, more recently, on our day-to-day habits. Whether to find sources for a bachelor's thesis, generate images for social media, or even assist in the interpretation of an X-ray, AI technology can now provide quick and reliable answers with remarkable efficiency.

We are witnessing enormous public and private investments aimed at further elevating AI deployment globally: in February, the European Union launched the InvestAI programme, aiming to mobilise €200 billion for AI applications. Saudi Arabia has also announced Project Transcendence, a $100 billion project aiming to position the country as a worldwide AI leader.

AI could strengthen solutions to global warming and support the development of mitigation and adaptation measures in response to our changing climate. For example, Netcarbon Farming uses satellite data and AI algorithms to precisely measure carbon capture in rural and urban areas, enabling the monitoring of carbon storage projects anywhere on Earth. However, AI also creates new challenges for our efforts to reduce the CO2 emissions and energy waste caused by data centres' electricity consumption. By 2028, estimates from the Lawrence Berkeley National Laboratory suggest that electricity demand for AI-specific purposes could rise to between 165 and 326 terawatt-hours per year, more than all the electricity currently used by US data centres. As AI adoption expands at an unprecedented pace, we must carefully consider an essential question: how can we continue to develop AI while respecting our environmental goals?

Visual Representation of Electricity Consumption at Night, NASA

 

How AI Works and Why It Impacts The Environment

A broad understanding of the steps needed to develop AI models helps explain AI's influence on resource use and climate change.

First of all, the goal of the AI needs to be defined: will it understand and generate human language, or make meteorological predictions? Billions to trillions of words are then collected from books, websites or public code, depending on the AI's target. These datasets, along with the model's architecture (number of layers, parameters), determine how deep and complex the neural network will be (GPT-3 contains 175 billion parameters; GPT-4 is likely much larger, with some estimates putting it close to a trillion parameters)1.
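
To give a sense of scale, the rough rule of thumb below shows how GPT-3's publicly reported depth (96 layers) and width (hidden size 12,288) translate into its headline parameter count. It is a simplified approximation that ignores embeddings, not an official figure:

```python
# Back-of-envelope estimate of a transformer's size from its depth and width.
# Rule of thumb: ~12 * layers * width^2 parameters (attention + feed-forward blocks),
# ignoring embeddings. GPT-3's reported configuration is used for illustration.
n_layers = 96
d_model = 12_288
approx_params = 12 * n_layers * d_model ** 2
print(f"~{approx_params / 1e9:.0f} billion parameters")  # ~174 billion, close to GPT-3's 175B
```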

Then comes the most expensive and energy-intensive step: training the model. Using GPUs (Graphics Processing Units) such as the NVIDIA A100 and supercomputing clusters (Microsoft Azure in OpenAI's case), massive amounts of text are fed into the model, which learns by predicting the next word and comparing its predictions to the actual text to adjust its parameters. This process is repeated for weeks or months.
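
To make this loop concrete, here is a toy sketch of next-word (next-token) prediction training using PyTorch. The model, data and sizes are invented for illustration; real training runs involve billions of parameters, trillions of tokens and thousands of GPUs, which is exactly where the energy cost comes from:

```python
# A toy next-token-prediction training loop: guess the next token, compare the guess
# to the real text, adjust the parameters, repeat. Purely illustrative sizes and data.
import torch
import torch.nn as nn

vocab_size, embed_dim = 100, 32                 # tiny, made-up model dimensions
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),        # turn token ids into vectors
    nn.Linear(embed_dim, vocab_size),           # score every possible next token
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab_size, (1000,))  # stand-in for a real text corpus

for step in range(100):                         # real models repeat this for weeks or months
    inputs, targets = tokens[:-1], tokens[1:]   # each token should predict the one after it
    logits = model(inputs)                      # the model's predictions
    loss = loss_fn(logits, targets)             # how far the predictions are from the real text
    optimizer.zero_grad()
    loss.backward()                             # work out how each parameter should change
    optimizer.step()                            # nudge the parameters, then repeat
```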

Subsequently, the model is fine-tuned: toxic or biased content is filtered out, more specific datasets are introduced, and some models use RLHF (Reinforcement Learning from Human Feedback), through which they learn to produce safer, better-aligned answers based on human feedback. The model is then optimised for faster use and deployed on cloud servers (such as Microsoft Azure), where it becomes available to users via APIs or a web interface. Models also continue to improve over time through feedback collection, bug fixes, new knowledge, and gains in efficiency and safety.

As noted above, training the model is what costs the most financially, environmentally and energetically. Thousands of GPUs running in parallel 24/7 for months draw a gigantic amount of energy, and the data centres, the temperature-controlled buildings that house this computing infrastructure, need vast amounts of water for cooling. Running the model itself is less demanding, but given the number of people now using AI, the cumulative energy cost is huge and will only keep expanding if the current trend continues.
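
To give an idea of the scale involved, here is a purely illustrative calculation; the cluster size, per-GPU power draw, overhead factor and duration are assumptions rather than figures from any real training run:

```python
# Illustrative arithmetic for the energy draw of a GPU cluster running around the clock.
# All numbers below are assumptions chosen only to show the order of magnitude.
n_gpus = 1_000            # assumed cluster size (large runs use many more)
power_per_gpu_kw = 0.4    # ~400 W per data-centre GPU
overhead = 1.5            # servers, networking and cooling on top of the GPUs themselves
hours = 24 * 30 * 2       # running 24/7 for two months
energy_kwh = n_gpus * power_per_gpu_kw * overhead * hours
print(f"{energy_kwh:,.0f} kWh")  # 864,000 kWh for this hypothetical run
```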

These energy costs translate directly into CO2 emissions. When AI models are powered by grids running on fossil fuels such as coal, their operation can result in substantial carbon emissions, an environmental impact that will only grow as AI adoption continues to expand.

AI workloads consumed up to 20% of global data centre electricity in 2024. Specialists predict this figure will rise to nearly 50% by the end of 2025.2

Figures

Before we dive into the numbers associated with the implementation of AI models, one must keep in mind that, to this day, companies like OpenAI, Google, and Anthropic do not publicly disclose the energy consumption of their models. We lack precise information about how many GPU hours were used, the number of training runs performed, or a given data centre's efficiency. However, we can make educated estimates based on research papers, leaks and expert analyses. For example, we know that ChatGPT was trained on NVIDIA A100 GPUs, and we know roughly how much raw computing work the model needed. Using this knowledge, we can convert FLOPs (the number of operations performed) into GPU time and ultimately into kilowatt-hours (kWh) to obtain a range for the electricity required for AI (a rough version of this conversion is sketched after the list below). The following data illustrates AI's resource use and environmental impact:

  • 51,772,500-62,318,750 kWh to train GPT-4, around 40-48 times the electricity required for GPT-3 (1,287,000 kWh), equivalent to the annual consumption of about 3,600 U.S. homes.3

  • Each interaction with AI models such as GPT can use up to 10 times more electricity than a standard Google search.4

  • Despite pledging to become carbon negative by 2030, Microsoft’s total emissions have risen by 23.4%, with a significant increase in Scope 3 emissions due to AI infrastructure expansion.5

  • By 2027, global AI demand is projected to account for 4.2-6.6 billion cubic meters of water withdrawal per year, surpassing the total annual water withdrawal of countries like Denmark.6

  • The lifecycle of AI models, from chip manufacturing to data centre operations, degrades air quality through emissions of pollutants like fine particulate matter, impacting public health.7

  • Approximately 315 million people worldwide were using AI tools (writing, image & design, video & audio, business & productivity) in 2024, with projections estimating this number will more than double to over 700 million users by 2030.8
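
For readers who want to see the FLOPs-to-kWh conversion mentioned above in action, here is a rough sketch applied to GPT-3's published parameter and token counts. The utilisation rate and per-GPU power figures are assumptions chosen to land in the right order of magnitude, not disclosed values:

```python
# Rough conversion of training FLOPs into GPU time and electricity for GPT-3.
# The utilisation and per-GPU power values are assumptions, not published figures.
params = 175e9                       # GPT-3's parameter count
tokens = 300e9                       # training tokens reported for GPT-3
flops = 6 * params * tokens          # common approximation: ~6 FLOPs per parameter per token
a100_peak = 312e12                   # NVIDIA A100 peak throughput (FLOP/s, BF16)
utilisation = 0.30                   # assumed real-world efficiency of the hardware
gpu_hours = flops / (a100_peak * utilisation) / 3600
kw_per_gpu = 1.3                     # assumed all-in draw per GPU, incl. server and cooling
energy_kwh = gpu_hours * kw_per_gpu
print(f"~{gpu_hours:,.0f} GPU-hours, ~{energy_kwh:,.0f} kWh")
# ~935,000 GPU-hours and ~1.2 million kWh: the same order of magnitude as the
# 1,287,000 kWh cited for GPT-3 above.
```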

Having witnessed AI’s numerous benefits to our society, we cannot imagine reducing its application. However, the way we develop AI technologies and facilities can evolve and become cleaner and more sustainable. Let’s explore the possibilities.  

Bristol Robotics Laboratory, Louis Reed

 

Reducing AI’s Impact on Natural Resources and The Environment

Various methods for reducing the energy consumption, waste and carbon footprint caused by AI technology are already accessible on a global scale and being implemented. Among these solutions, some have received the Solar Impulse Efficient Solution Label. Here are concrete actions that have been taken:


  • Data Centres running on renewable energies like solar, hydro and wind:

    • Since 2017, Meta (formerly Facebook) has reduced its operational emissions by 94% by powering its data centres and offices with 100% renewable energy.


  • Increase computational hardware’s energy efficiency:

    • NVIDIA’s H100 GPUs provide performance and efficiency improvements over the A100 for AI workloads. With 67% greater memory bandwidth, which speeds up data transfer, and a new Transformer Engine that accelerates the training of transformer-based models, they could be a very efficient solution.


  • Optimise Data Centres by improving Power Usage Effectiveness (PUE) and by developing them in colder areas to reduce cooling needs (a brief PUE calculation follows this list):

    • Beeyon’s Papillon solution, labelled by the Solar Impulse Foundation since 2019, identifies energy-saving actions on any server type and operating system and reduces capital costs by up to 30%.

    • Similarly, Heex Technologies’ Smart Data Management Services offer a platform for organisations to transition from Big Data to Smart Data, providing an opportunity to reduce the energy consumed by data processing by 30%.

    • The Verne Global Data Centre in Keflavik, Iceland, operates with a very low PUE thanks to natural cold air, which removes the need for mechanical chillers. In addition, it has a minimal carbon footprint as it runs on renewable energy. Customers include NVIDIA and Dell Technologies, companies with a significant presence in the world of AI.


  • Recycle Heat and Water for cooling purposes or to heat buildings:

    • Deutsche Telekom and Cloud&Heat have come up with a solution that reduces data centres’ CO2 emissions by 41%. Their cooling system combines direct hot-water cooling with waste heat recovery: 78% of the dissipated heat is fed into the building’s hot water circuit for reuse, and energy costs can be cut by up to 40%.

    • Hyperion’s Immersion Cooling Solution for Data Centres immerses servers in a 100% biodegradable heat-transfer fluid, avoiding the need for air conditioning systems. Last year, this cloud technology was adopted in France by the community of communes of Ernée to reduce the electricity consumption caused by data hosting and to reuse the heat produced by the computer servers.

    • The QH-1 by Qarnot Computing reduces the carbon footprint of computations by an estimated 70-75%! It captures the heat lost by data centres and uses it as a heat source to warm rooms. Qarnot’s technology was implemented six years ago by the Département de la Gironde in Bordeaux and is recognised under French law as renewable energy. In Bordeaux, 6,000 m² of buildings are fully heated by QH-1 units, providing renewable, free heat to users. It was the first building in the world to be entirely heated by computers.

Deutsche Telekom and Cloud&Heat's Solution
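
For readers unfamiliar with PUE, the metric mentioned in the list above, it is simply the total energy a facility consumes divided by the energy that actually reaches the IT equipment. The short sketch below uses invented figures to show why moving from a typical PUE of around 1.6 towards 1.1 matters:

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# The annual figures below are invented, purely to illustrate the size of the saving.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

it_load_kwh = 1_000_000                  # assumed annual IT load of a facility
typical_site = it_load_kwh * 1.6         # PUE of 1.6, common for older data centres
optimised_site = it_load_kwh * 1.1       # PUE near 1.1, e.g. free cooling in a cold climate
print(f"{pue(typical_site, it_load_kwh):.2f}")                        # 1.60
print(f"saved: {typical_site - optimised_site:,.0f} kWh per year")    # 500,000 kWh
```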

 

Solar Impulse Foundation’s AI platform 

At the Solar Impulse Foundation, we aim to benefit from the advantages of AI without losing sight of our mission for sustainability. Therefore, we have taken some measures to reduce our AI’s environmental footprint.

Our AI and platform infrastructure are hosted on Microsoft Azure, a cloud provider committed to carbon neutrality. Azure offsets the CO2 emissions generated by its data centres, helping ensure that our operations contribute to a more sustainable future.

We use OpenAI’s GPT-4o mini, a smaller, less resource-intensive model that reduces energy consumption during operations yet still delivers effective responses, giving us a balance between quality and environmental responsibility.

Our token usage has been optimised by summarising interactions and streamlining queries. By limiting token usage, reducing computational resource requirements and designing interactions to be as efficient as possible, we can lower our environmental footprint.
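
As an illustration of what this kind of token limiting can look like in practice, here is a minimal sketch assuming the OpenAI Python SDK. The function name and the history-trimming rule are invented for the example; it is not our actual implementation:

```python
# A minimal token-limiting wrapper: keep only the most recent turns of the conversation
# and cap the answer length, so each request stays small. Illustrative example only.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

def ask(history: list[dict], question: str, max_history: int = 6) -> str:
    # Trim the conversation history so older turns no longer consume tokens.
    messages = history[-max_history:] + [{"role": "user", "content": question}]
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # the smaller, less resource-intensive model mentioned above
        messages=messages,
        max_tokens=300,        # cap the length of the reply as well
    )
    return response.choices[0].message.content
```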

Additionally, we are exploring opportunities to adopt open-source large language models (LLMs) that offer better transparency and environmental-impact metrics; these could provide a viable alternative to proprietary models like OpenAI’s and help us pursue our path towards sustainable development.

 

The Expansion of AI as an Example of Sustainable Development

The leaders of the AI industry have an opportunity to prove that expanding and advancing one of the most exciting innovations of our century does not have to come at the cost of environmental destruction and pollution. Embracing a mindset focused on efficiency and clean production could reduce costs, both financial and environmental, without diminishing AI’s capabilities. It could also serve as a global example of how innovation and sustainability can go hand in hand, and show that we have truly learned from the past mistakes of mass production at any cost!

Smart buildings, as captured here in Singapore by Felix Fuchs, should serve as an inspiration for the development of the AI industry!