AI’s dirty secret: the environmental cost of intelligence
Artificial intelligence holds enormous potential to help solve environmental problems… But there is a growing recognition that the technology also has an environmental footprint!
Hello everyone,
Artificial intelligence holds enormous potential to help solve environmental problems, from detecting methane leaks to managing smart grids… But there is a growing recognition that the technology itself, and generative AI in particular, also has an environmental footprint of its own!
Training and deploying advanced AI systems like large language models require substantial energy and water resources, and their use is expanding fast. As these models become embedded in daily applications, from search engines to productivity tools, their environmental impact is also scaling with them.
So how bad is it, really? The answer depends on where, when, and how AI is used, and who's keeping track.
What we know: energy, water, and waste
⚡️ AI’s environmental impact starts with the training phase, especially for large models like GPT-3 or GPT-4. Training consumes large bursts of energy (GPT-3’s training reportedly used over 1.2 GWh, roughly the annual electricity usage of 120 U.S. households).
Because hardware energy consumption keeps rising ⇒
And because models keep getting bigger ⇒
More power is needed to train them (bigger models on more energy-intensive hardware) ⇒
ℹ️ Beware that the y-axis is logarithmic in the last three graphics.
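To get a feel for where numbers like GPT-3’s reported ~1.2 GWh come from, here is a minimal back-of-envelope sketch. All inputs (GPU count, power draw, duration, PUE) are illustrative assumptions, not official figures:

```python
# Back-of-envelope training energy estimate (illustrative numbers only):
# energy = GPUs * power per GPU * hours * data-center overhead (PUE).
def training_energy_mwh(num_gpus, gpu_power_kw, hours, pue=1.2):
    """Return estimated training energy in MWh, including data-center
    overhead via the Power Usage Effectiveness (PUE) factor."""
    return num_gpus * gpu_power_kw * hours * pue / 1000

# Hypothetical run: 10,000 GPUs drawing 0.4 kW each for 300 hours.
energy = training_energy_mwh(10_000, 0.4, 300)
print(f"~{energy:.0f} MWh")
```

With these assumed inputs the estimate lands around 1,440 MWh, the same order of magnitude as the figures reported for GPT-3.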
But the training phase is not the only one! The operational phase, known as inference (each time you prompt an LLM like ChatGPT), can quickly become the larger contributor as usage scales up. This is even more true for generative AI, which consumes much more energy than, for example, a Google search (which also uses AI):
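A back-of-envelope comparison shows how quickly inference can overtake training once a model is widely used. Both the per-query energy and the traffic volume below are illustrative assumptions:

```python
# When does cumulative inference energy overtake training energy?
# All numbers are illustrative assumptions, not measured values.
TRAINING_ENERGY_KWH = 1_300_000   # GPT-3-scale training (reported order)
ENERGY_PER_QUERY_KWH = 0.003      # assumed inference energy per query
QUERIES_PER_DAY = 10_000_000      # assumed daily traffic

daily_inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
days_to_match = TRAINING_ENERGY_KWH / daily_inference_kwh
print(f"Inference matches training energy after ~{days_to_match:.0f} days")
```

Under these assumptions, day-to-day inference equals the entire training budget in about a month and a half, and everything after that is pure operational overhead.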
💧 Then there is water! Data centers often rely on water-intensive cooling systems. Estimates suggest that training GPT-3 may have directly evaporated over 700,000 liters of freshwater. And when we use LLMs, each inference also runs on GPUs that need to be cooled down. Here is an example from Li et al. of a “data center’s operational water usage: on-site scope-1 water usage for data center cooling (via cooling towers in the example), and off-site scope-2 water usage for electricity generation” ⇒
In total, AI demand is projected to account for 4.2 to 6.6 billion cubic meters of water withdrawal by 2027, depending on usage patterns and infrastructure choices.
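The scope-1 / scope-2 split described by Li et al. can be sketched with a simple Water Usage Effectiveness (WUE) calculation. The WUE values below are illustrative assumptions chosen to show the mechanics, not measured figures for any real data center:

```python
# Rough water-footprint sketch following the scope-1 / scope-2 split:
# on-site water evaporated for cooling, off-site water withdrawn to
# generate the electricity. WUE values are illustrative assumptions.
def water_usage_liters(energy_kwh, wue_onsite, wue_grid):
    """wue_onsite: liters evaporated per kWh (cooling towers, scope 1).
    wue_grid: liters withdrawn per kWh at the power plant (scope 2)."""
    onsite = energy_kwh * wue_onsite
    offsite = energy_kwh * wue_grid
    return onsite, offsite

# Hypothetical 1,300 MWh training run with assumed WUEs:
onsite, offsite = water_usage_liters(1_300_000, wue_onsite=0.55, wue_grid=3.1)
print(f"on-site: {onsite:,.0f} L, off-site: {offsite:,.0f} L")
```

Note how the off-site, electricity-related water can dwarf the on-site cooling water, which is why headline “evaporated liters” figures only tell part of the story.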
♻️ Hardware production also plays a role. The GPUs used in AI training require rare earth elements, whose extraction can be environmentally damaging... Add to this the issue of electronic waste (if not recycled), and the picture becomes even more complex.
Different types of AI models and tasks vs energy consumption
Different types of AI models and tasks use varying amounts of energy, depending on their complexity and purpose. In machine learning, simpler models like regression, which predicts a value from input data, or decision trees, which split data into branches to make decisions, are lightweight and use relatively little energy. These models are efficient for straightforward tasks such as forecasting or basic classification. In contrast, more advanced systems like generative AI, which can create text, images, or music, require large amounts of data and computing power. These models are built on deep learning and take much more energy to train and run. As AI applications grow, especially in areas like natural language processing and image generation, it is important to consider the environmental impact of their energy use!
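The gap between a simple model and a generative one is easiest to see in compute per prediction. A rough FLOPs sketch, using the common ~2 × parameters approximation for a transformer forward pass (model sizes below are illustrative):

```python
# Order-of-magnitude FLOPs per prediction: a linear model vs. an LLM.
# Uses the common ~2 * parameters FLOPs-per-token approximation for
# the transformer; sizes are illustrative assumptions.
def linear_flops(n_features):
    return 2 * n_features            # one dot product per prediction

def transformer_flops(n_params, n_tokens):
    return 2 * n_params * n_tokens   # forward-pass approximation

print(f"regression, 100 features: {linear_flops(100):.0e} FLOPs")
print(f"7B-parameter LLM, 500 tokens: {transformer_flops(7e9, 500):.0e} FLOPs")
```

That is roughly ten orders of magnitude between the two: the kind of gap that makes model choice an environmental decision, not just a technical one.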
Local vs global: who bears the cost?
One key insight emerging from recent studies is that AI’s environmental impact is unevenly distributed. While the majority of emissions may occur in high-income countries that host large data centers, the mining of raw materials and water stress can disproportionately affect regions with fewer resources and less regulatory capacity.
This raises questions not just of environmental sustainability, but also of fairness. Who gets the benefit of AI applications, and who pays the price in terms of water stress, pollution, or energy instability?
You can click the ❤️ and/or share button at the start or end of the newsletter if you like the content, it would help me a lot!
Uncertainty and complexity in measurement
Despite the alarming headlines, it is worth noting that measuring the environmental impact of AI is still an emerging science! We often lack standard reporting practices or reliable data, particularly for the inference stage or across different geographic contexts.
Studies have tried to estimate the carbon footprint per image generated or per query made to an AI system, but figures vary depending on factors like server location, power source, and model architecture. Even metrics like “carbon per query” can obscure broader systemic impacts over time.
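One reason “carbon per query” figures diverge so much is grid carbon intensity: the same query can emit an order of magnitude more CO2 depending on where the server sits. A small sketch with an assumed per-query energy and approximate published grid averages:

```python
# Same query, different grids: carbon per query depends heavily on
# grid carbon intensity, which is one reason published figures vary.
ENERGY_PER_QUERY_KWH = 0.003   # assumed inference energy per query

GRID_INTENSITY_G_PER_KWH = {   # approximate typical grid averages
    "France (nuclear-heavy)": 60,
    "US average": 370,
    "Coal-heavy grid": 800,
}

for grid, intensity in GRID_INTENSITY_G_PER_KWH.items():
    grams = ENERGY_PER_QUERY_KWH * intensity
    print(f"{grid}: ~{grams:.2f} g CO2e per query")
```

A ~13x spread from the grid alone, before even touching model architecture or hardware differences.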
This means any numbers should be interpreted with caution, but that shouldn’t delay efforts to measure, benchmark, and improve.
Access to electricity is becoming more and more competitive
AI is a (small) part of the expected growth in electricity demand! Access to power is becoming more and more competitive, which is why we are seeing Microsoft, OpenAI, Amazon, and Alphabet (Google) investing in nuclear energy!
ℹ️ Note that not all of the growth in data center electricity demand can be attributed to AI.
Are there solutions? Some emerging practices
Researchers and companies are beginning to explore ways to reduce AI’s environmental cost.
Efficient models
Some models, like BLOOM, developed in 2022 by Hugging Face, were trained on more efficient chips and infrastructure, leading to substantially lower emissions than comparable models.
Green scheduling
Flexible workloads can be timed to coincide with periods of surplus renewable energy or cooler outdoor temperatures, helping to reduce both emissions and water usage.
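The idea behind green scheduling fits in a few lines: given a carbon-intensity forecast for the local grid, shift a flexible job to the cleanest window. The forecast values below are made up for illustration:

```python
# Minimal green-scheduling sketch: pick the cleanest start hour for a
# flexible training job from a (made-up) carbon-intensity forecast.
forecast_g_per_kwh = {0: 420, 3: 380, 6: 310, 9: 190, 12: 150, 15: 170,
                      18: 350, 21: 410}

def greenest_hour(forecast):
    """Return the start hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

best = greenest_hour(forecast_g_per_kwh)
print(f"Schedule the job at hour {best} "
      f"({forecast_g_per_kwh[best]} gCO2/kWh)")
```

Real schedulers add constraints (deadlines, job duration, data locality), but the core signal, a grid carbon-intensity forecast, is exactly this simple.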
Inference optimization
Developers are being encouraged to use smaller, more efficient models where possible, and to reduce unnecessary computations.
On the governance side, the OECD and IEEE (Institute of Electrical and Electronics Engineers) are working on technical standards to help quantify AI’s environmental footprint and support more transparent reporting.
DeepSeek: a case study in energy-efficient AI
DeepSeek, a Chinese AI startup, has garnered attention for its energy-efficient approach to training and running large language models. The company claims to have trained its R1 model for approximately $5.6 million using 2,000 Nvidia H800 GPUs, significantly less than the costs reported by competitors for similar models!
DeepSeek's architecture employs a Mixture-of-Experts design, activating only a subset of its parameters during inference, which reduces computational overhead. The company also uses techniques that streamline computation and data handling, further improving efficiency.
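Why Mixture-of-Experts saves compute is just parameter accounting: only a few experts fire per token. The sizes below are illustrative, loosely in the spirit of DeepSeek-style MoE models, not official figures:

```python
# Mixture-of-Experts parameter accounting (illustrative sizes, not
# official DeepSeek figures): only a few experts run per token.
def moe_params(shared, per_expert, n_experts, n_active):
    """Return (total parameters, parameters active per token)."""
    total = shared + n_experts * per_expert
    active = shared + n_active * per_expert
    return total, active

total, active = moe_params(shared=10e9, per_expert=4e9,
                           n_experts=160, n_active=6)
print(f"total: {total/1e9:.0f}B, active per token: {active/1e9:.0f}B "
      f"({active/total:.1%} of parameters)")
```

With these assumed sizes, each token touches only ~5% of the parameters, which is the lever that cuts per-query compute without shrinking overall model capacity.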
While these innovations suggest a lower energy footprint per operation, some experts caution that increased efficiency could lead to higher overall usage, potentially offsetting environmental gains… This phenomenon is known as the Jevons paradox.
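The Jevons paradox is just arithmetic: total energy is energy-per-query times query volume, so an efficiency gain is wiped out whenever usage grows faster. Illustrative numbers:

```python
# Jevons paradox in two lines of arithmetic: a 10x efficiency gain is
# overwhelmed by a 50x usage increase. Numbers are illustrative.
old_kwh_per_query, old_queries = 0.003, 1e7    # before: kWh, queries/day
new_kwh_per_query, new_queries = 0.0003, 5e8   # after: 10x efficiency, 50x use

old_total = old_kwh_per_query * old_queries
new_total = new_kwh_per_query * new_queries
print(f"before: {old_total:,.0f} kWh/day, after: {new_total:,.0f} kWh/day")
```

Despite each query costing ten times less, total daily energy is five times higher under these assumptions.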
Governance gaps and next steps
Current AI governance frameworks rarely incorporate environmental metrics. Model documentation typically omits water usage and offers little data on operational emissions. This makes it difficult for policymakers or even companies themselves to understand the full picture.
However, some ideas are gaining traction:
Require the disclosure of energy and water use in public sector AI procurement
Fund “green AI” research and infrastructure
Align global sustainability goals with national AI strategies
But above all, we need better data, clearer standards, and cross-sector collaboration!
Final thought
AI does not have to be an environmental villain. But it is not automatically a hero either! Like any powerful tool, it needs to be wielded thoughtfully, balancing innovation with impact and ambition with accountability. Moreover, we should always ask ourselves, as companies or individuals, whether AI is necessary for what we want to do… And if yes, whether generative AI is necessary… A lot of the time it won't be. Or generative AI can be used for the prototype, and we can then move to another machine learning model that consumes less.
Sources
MIT Technology Review: https://www.technologyreview.com/2023/12/05/1084417/ais-carbon-footprint-is-bigger-than-you-think/
Wikipedia: https://en.wikipedia.org/wiki/Environmental_impact_of_artificial_intelligence
Harvard Business Review: https://hbr.org/2024/07/the-uneven-distribution-of-ais-environmental-impacts
OECD: https://oecd.ai/en/wonk/the-hidden-cost-of-ai-energy-and-water-footprint
MIT: https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
Stanford HAI: https://hai.stanford.edu/ai-index/2025-ai-index-report