Business | SaaS | Technology
Dec 6, 2025

The Hidden Environmental Footprint of AI Including Water Consumption and Carbon Emissions

AUTHOR

James A. Wondrasek

You probably know AI infrastructure eats electricity. Data centres consumed 4.4% of U.S. electricity in 2023, heading to 9% by 2030. But here’s what you’re missing—water consumption.

A typical AI-focused data centre burns through 300,000 gallons of water daily. That’s the hidden cost. And it’s projected to jump 870% by 2030.

This article is part of our comprehensive guide to understanding AI data centre energy consumption and sustainability challenges, where we explore the full environmental footprint beyond just electricity usage.

So you need frameworks to measure the total footprint—energy, water, and carbon. The difference between training and inference matters for your resource planning. And your cooling technology choices create trade-offs between water and energy use.

This article gives you practical measurement approaches using PUE and WUE metrics, shows you the real impacts using data from Cornell, Google, and MIT research, and walks you through optimisation strategies with 100x energy and 1000x carbon reduction potential.

What is the environmental footprint of AI infrastructure?

AI infrastructure hits the environment in three ways: electricity, water, and carbon.

By 2030, AI data centres will emit 24 to 44 million metric tons of carbon dioxide annually—that’s like adding 5 to 10 million cars to U.S. roadways. Water usage? Just as bad: 731 to 1,125 million cubic metres per year. That’s what 6 to 10 million Americans use at home annually.

Why does AI eat so much? GPU infrastructure throws off way more heat per rack than traditional computing—30-100 kW versus 5-15 kW. All that concentrated heat needs serious cooling.

The carbon footprint breaks down into operational emissions (Scope 2—electricity you buy), facility emissions (Scope 1—backup generators and refrigerants), and supply chain emissions (Scope 3—chip manufacturing, building construction, transport). And here’s a kicker: each kilowatt-hour of energy a data centre consumes requires two litres of water for cooling.

But here’s what makes measurement tricky: production systems require 15-30% idle capacity sitting around ready for spikes and failover. That overhead burns energy. You can’t just measure active computation.

How much water do AI data centres consume daily?

The typical number is 300,000 gallons (1,135 cubic metres) daily for an AI-focused data centre. That’s driven by evaporative cooling systems needed to dump GPU heat.

At the individual query level? Google’s Gemini consumes approximately 0.26 mL water per median text prompt. Tiny per query. But billions of daily queries add up fast to facility-scale volumes.

The 870% growth projection between now and 2030 comes from AI adoption accelerating and GPU density increasing. More heat, more cooling, more water.

Water Usage Effectiveness (WUE) measures litres of water per kilowatt-hour of IT equipment energy. The typical ratio is approximately 2 litres per kilowatt-hour. Average WUE across data centres is 1.8 L/kWh, while best-in-class facilities get below 0.5 L/kWh.

In water-scarce regions, water consumption competes with agricultural and residential use. Geographic variation matters—desert facilities versus humid climate facilities have different water needs.

What is the difference between training and inference carbon emissions?

AI training is a one-time, computationally intensive hit with concentrated carbon cost. Inference is the ongoing operational cost.

Training a large language model generates 25-500 tonnes CO2e depending on model size and how long training takes. Big upfront hit.

Inference generates 0.001-0.01 gCO2e per query. Tiny. But it stacks up across billions of daily interactions.

Here’s the thing: cumulative inference emissions often exceed training costs within 6-12 months for popular models. The ongoing cost overtakes the upfront cost faster than you’d think.
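You can sanity-check that break-even with back-of-envelope arithmetic. The sketch below uses illustrative figures drawn from the ranges above (a 500-tonne training run, a mid-range 0.005 gCO2e per query, and an assumed one billion queries a day):

```python
# Back-of-envelope break-even: when do cumulative inference emissions
# overtake the one-time training cost? All inputs are illustrative,
# taken from the ranges discussed above.

training_tonnes = 500    # large-model training cost, tonnes CO2e
per_query_g = 0.005      # mid-range inference cost, gCO2e per query
queries_per_day = 1e9    # assumed traffic for a popular model

daily_tonnes = queries_per_day * per_query_g / 1e6  # grams -> tonnes
breakeven_days = training_tonnes / daily_tonnes

print(f"Inference emits {daily_tonnes:.0f} t CO2e/day; "
      f"break-even after {breakeven_days:.0f} days")  # ~100 days
```

At these assumed numbers the crossover lands around three months; with smaller training runs or lighter traffic it stretches toward the 6-12 month figure.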

There’s another wrinkle. Generative AI models have a short shelf-life driven by rising demand for new applications. Companies release new models every few weeks, so energy used to train prior versions goes to waste.

Training optimisation through efficient architectures and renewable energy timing offers 100-1000x carbon reduction potential. Selecting efficient ML model architectures such as sparse models can cut computation by approximately 5 to 10 times. Tech giants are also increasingly turning to nuclear and renewable energy sources to reduce the carbon intensity of the electricity they buy.

How do data centres use water for cooling AI servers?

Most data centres use a combination of chillers and on-site cooling towers to stop chips from overheating.

Evaporative cooling through cooling towers gives you the highest efficiency but consumes water that cannot be reclaimed. The water evaporates—it’s gone for good.

Direct-to-chip liquid cooling delivers liquid coolant directly to GPUs and CPUs. Closed-loop systems cut facility water use and let you pack in higher density racks.

Immersion cooling submerges servers in specialised dielectric fluid. Near-zero water use. But immersion cooling entails higher upfront costs despite giving you significant energy savings.

Water-cooled data centres use less energy than air-cooled ones, so every cooling approach trades water consumption against energy use.

Geographic context matters. In water-stressed regions, priority should be low- to zero-water cooling systems to reduce direct use. In wetter regions with carbon-intensive grids, priority should be reducing power use to lower overall water consumption. These considerations tie directly into efficiency strategies when choosing your cooling approach.

What is Power Usage Effectiveness (PUE) and why does it matter?

PUE measures data centre energy efficiency as the ratio of total facility energy to IT equipment energy.

Here’s the formula: Total Facility Energy (IT equipment + cooling + lighting + overhead) ÷ IT Equipment Energy.

A perfect score of 1.0 means every watt goes directly to computing. Average PUE in 2022 was approximately 1.58, though high-efficiency facilities hit 1.2 or better. Industry-leading hyperscale data centres achieve PUE of 1.1-1.2.

Lower PUE means less energy wasted on non-computing stuff. Every 0.1 PUE improvement cuts energy costs proportionally. And PUE directly multiplies grid carbon intensity impact.
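The formula is trivial to script. Here's a minimal sketch using a hypothetical facility sitting at the 2022 average:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 10 GWh/yr to IT equipment, 5.8 GWh/yr of
# cooling, lighting, and other overhead.
it_kwh = 10_000_000
total_kwh = it_kwh + 5_800_000

print(pue(total_kwh, it_kwh))  # 1.58 -- the 2022 average

# Energy saved by improving PUE from 1.58 to 1.2 at the same IT load:
saved_kwh = it_kwh * (1.58 - 1.2)
print(f"{saved_kwh:,.0f} kWh/yr saved")  # 3,800,000
```

Notice the leverage: at a fixed IT load, dropping from average to high-efficiency PUE recovers nearly 40% of the IT energy again in avoided overhead.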

But PUE has limitations. Traditional metrics for data centre efficiency like PUE are insufficient for measuring AI workloads because they don’t account for energy efficiency at the intersection of software, hardware, and system levels.

What is Water Usage Effectiveness (WUE) and how is it measured?

WUE measures water efficiency as litres of water per kilowatt-hour of IT equipment energy.

The calculation: Annual Water Consumption (litres) ÷ Annual IT Equipment Energy (kWh).

Lower WUE is better water efficiency. Best-in-class facilities achieve WUE below 0.5 L/kWh. Average WUE across data centres is 1.8 L/kWh—that’s your baseline to beat.
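The calculation works the same way as PUE. A quick sketch using the average and best-in-class figures above (the 10 GWh annual IT load is hypothetical):

```python
def wue(annual_water_litres: float, annual_it_kwh: float) -> float:
    """Water Usage Effectiveness: litres of water per kWh of IT energy."""
    return annual_water_litres / annual_it_kwh

# Hypothetical facility drawing 10 GWh/yr of IT equipment energy:
it_kwh = 10_000_000

print(wue(18_000_000, it_kwh))  # 1.8 L/kWh -- the industry average
print(wue(4_500_000, it_kwh))   # 0.45 L/kWh -- best-in-class (<0.5)
```

At that load, the gap between average and best-in-class is 13.5 million litres of water a year.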

WUE complements PUE by capturing the non-energy environmental dimension people overlook in efficiency discussions. It’s an emerging metric gaining importance as water scarcity increases. When evaluating the complete sustainability challenges of AI infrastructure, both metrics are essential for comprehensive assessment.

Geographic context matters a lot. WUE of 2.0 is acceptable in a water-abundant region but problematic in drought areas. Same number, different environmental impact.

There’s a distinction between consumption and withdrawal. Water withdrawal is total water taken from sources; water consumption is the portion not returned. Evaporative cooling consumes water permanently through evaporation. Closed-loop systems withdraw water but return most of it.

How do cooling technology trade-offs affect environmental footprint?

Evaporative cooling gives you the best energy efficiency (lowest PUE) but the highest water consumption (highest WUE). Dry cooling eliminates water use but increases energy use 15-25%, raising the carbon footprint.

Direct-to-chip liquid cooling cuts facility-level consumption while letting you deploy higher-density GPUs. Immersion cooling offers 45% energy reduction with near-zero water use but requires operational changes.

Here’s what each technology looks like:

Evaporative cooling: 2+ L/kWh water use, PUE 1.1-1.3, proven technology with geographic limitations.

Dry cooling: near-zero water, PUE 1.3-1.6, energy penalty, works best in cool climates.

Direct-to-chip: 0.5-1.0 L/kWh water, PUE 1.1-1.2, enables 100+ kW racks, higher complexity.

Immersion: near-zero water, PUE 1.05-1.15, 45% energy savings, operational transformation required.
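To compare these options for a given workload, you can plug the figures above into a quick script. This sketch uses midpoint PUE values, representative water figures, and a hypothetical 10 GWh annual IT load:

```python
# Rough side-by-side of the four cooling approaches, using midpoint
# PUE values and representative water-use figures from the list above.
# The 10 GWh annual IT load is hypothetical.
COOLING = {
    # name: (water L/kWh, midpoint PUE)
    "evaporative":    (2.0,  1.2),
    "dry cooling":    (0.0,  1.45),
    "direct-to-chip": (0.75, 1.15),
    "immersion":      (0.0,  1.1),
}

IT_GWH = 10  # annual IT equipment energy, GWh

for name, (water_l_kwh, pue) in COOLING.items():
    water_megalitres = IT_GWH * 1e6 * water_l_kwh / 1e6  # kWh x L/kWh
    total_gwh = IT_GWH * pue
    print(f"{name:14} {water_megalitres:5.1f} ML/yr water, "
          f"{total_gwh:5.2f} GWh/yr total energy")
```

The output makes the trade-off concrete: evaporative cooling saves a few GWh over dry cooling but costs 20 megalitres of water a year at this load.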

Geographic location influences your optimal choice. In water-stressed regions, priority should be low- to zero-water cooling systems. In wetter regions with carbon-intensive grids, priority should be reducing power use. For actionable approaches to reducing your environmental footprint, consider both technology selection and workload optimisation strategies.

How can organisations measure their AI environmental impact?

You need to calculate total facility energy (PUE), water consumption (WUE), and carbon emissions (Scope 1/2/3).

Google’s comprehensive approach covers: active computation + idle capacity + CPU/RAM overhead + data centre overhead (PUE) + water consumption (WUE). Their methodology estimates a median Gemini text prompt uses 0.24 Wh energy, 0.03 gCO2e, and 0.26 mL water.

Why the comprehensive approach matters: production systems keep idle capacity provisioned for traffic spikes and failover, and that capacity consumes energy you need to factor into the total footprint.
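Here's what that style of accounting looks like as a sketch. The per-query energy, idle fraction, PUE, and WUE values below are hypothetical placeholders for illustration, not Google's actual inputs:

```python
# A comprehensive per-query estimate: active compute, plus provisioned
# idle capacity, plus facility overhead (PUE), plus water (WUE).
# All input values here are hypothetical placeholders.

active_wh = 0.15      # assumed energy for the computation itself, Wh
idle_fraction = 0.20  # idle capacity provisioned (the 15-30% range)
pue = 1.15            # facility overhead multiplier
wue = 1.0             # litres of water per kWh of IT energy

it_wh = active_wh * (1 + idle_fraction)  # computation + idle capacity
total_wh = it_wh * pue                   # add cooling/lighting overhead
water_ml = it_wh / 1000 * wue * 1000     # Wh -> kWh -> litres -> mL

print(f"{total_wh:.3f} Wh and {water_ml:.3f} mL water per query")
```

The point of the structure, not the numbers: measuring only `active_wh` would undercount this hypothetical query's energy by nearly 40%.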

Tools you can use: CodeCarbon estimates emissions during ML model training. MLCarbon is the most comprehensive framework for LLMs supporting end-to-end phases: training, inference, experimentation, storage.

Carbon accounting framework: Scope 1 (direct facility emissions—backup generators, refrigerants), Scope 2 (purchased electricity—the biggest chunk for AI), Scope 3 (supply chain emissions—chip manufacturing, facility construction, equipment transport, end-of-life disposal).

Common measurement mistakes: active-machine-only calculations, ignoring the water dimension, missing Scope 3.

Understanding the full environmental impact of AI infrastructure requires measurement across electricity, water, and carbon dimensions. For a complete overview of all sustainability challenges facing AI data centres, including grid stress and emerging energy solutions, see our comprehensive guide to AI data centre energy consumption and sustainability challenges.

FAQ Section

How much water does ChatGPT use per query?

Based on Google’s published Gemini metrics (0.26 mL per median text prompt), similar AI assistants likely use 0.2-0.5 mL water per query. At billions of daily queries, it adds up to facility-scale volumes.

Why does AI use so much water?

AI models run on GPU/TPU processors generating significantly more heat per rack than traditional computing (30-100 kW versus 5-15 kW). All that concentrated heat needs substantial cooling, mostly through water-based evaporative systems consuming 2+ litres per kilowatt-hour.

Is using AI bad for the environment?

AI has measurable environmental impact. But impact varies dramatically based on infrastructure efficiency (PUE 1.1 versus 1.8), renewable energy usage, cooling technology, and geographic location. Combined optimisation strategies show 100x energy and 1000x carbon reduction potential.

What is the environmental impact of using AI every day?

Individual AI queries have small per-interaction impact (0.001-0.01 gCO2e, 0.2-0.5 mL water), but cumulative effect at scale is substantial. A ChatGPT query consumes about five times more electricity than a simple web search. If you make 50 daily queries you’re generating roughly 180-365 gCO2e annually.
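The annual figure is simple arithmetic. A quick sketch using the upper end of the per-query range quoted above:

```python
# Annualising per-query impact, using the upper end of the
# 0.001-0.01 gCO2e per-query range quoted above.
queries_per_day = 50
g_per_query = 0.01

annual_g = queries_per_day * 365 * g_per_query
print(f"~{annual_g:.1f} gCO2e per year")  # the low end of 180-365
```
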

How can organisations reduce AI carbon footprint?

Key strategies: 1) Smart siting in renewable energy regions (73% carbon reduction potential), 2) Model optimisation through selecting efficient architectures (5-10x efficiency gains), 3) Workload scheduling during high renewable energy availability, 4) Infrastructure efficiency improvements (PUE reduction), 5) Renewable energy procurement through PPAs.

What is the biggest environmental problem with AI?

Cumulative scale represents the primary challenge. Individual improvements get offset by exponential growth in AI usage. Even if each kilowatt-hour gets cleaner, total emissions can rise if AI demand grows faster than the grid decarbonises. Projected 2030 impact: 24-44 million metric tons CO2 and 731-1,125 million cubic metres water annually for U.S. AI data centres alone.

Are AI models environmentally sustainable?

Current trajectory is unsustainable without intervention. However, combined optimisation strategies show 100x energy and 1000x carbon reduction potential through efficient model architectures, renewable energy scheduling, and geographic smart siting. Sustainability requires: renewable energy transition, cooling technology innovation, model efficiency improvements, and geographic smart siting.

How do tech companies measure AI carbon emissions?

Leading companies use comprehensive lifecycle assessment including: operational energy (Scope 2), facility direct emissions (Scope 1), and supply chain/manufacturing (Scope 3). CodeCarbon measures training emissions while cloud dashboards track inference. Transparent reporting includes PUE, WUE, renewable energy percentage, and progress toward net-zero targets.

What is the difference between water consumption and water withdrawal?

Water withdrawal is total water taken from sources; water consumption is the portion not returned. Evaporative cooling consumes water permanently through evaporation. Closed-loop systems withdraw water but return most of it, resulting in low consumption despite high withdrawal.

Can data centres operate without using water?

Yes, through dry cooling or immersion cooling technologies. Dry cooling uses air convection (near-zero water) but increases energy consumption 15-25%. Immersion cooling submerges servers in dielectric fluid, eliminating water cooling while cutting energy 45%. Trade-off is higher capital cost and operational complexity.

How does data centre location affect environmental impact?

Location determines grid carbon intensity, water scarcity impact, cooling efficiency, and renewable energy access. Midwest and windbelt states deliver the best combined carbon-and-water profile. A Cornell study identifies smart siting as the most important factor: 73% carbon reduction and 86% water reduction potential through optimal location selection.

What are Scope 1, 2, and 3 emissions for AI infrastructure?

Scope 1 covers direct facility emissions (backup generators, refrigerants). Scope 2 is purchased electricity—the biggest component for AI. Scope 3 includes supply chain emissions like chip manufacturing, facility construction, equipment transport, and end-of-life disposal. You need all three scopes for comprehensive accounting.
