
How Much Water Does ChatGPT Use? The Complete 2025 Guide to AI Water Consumption

Have you ever stopped to think about what happens behind the scenes every time you chat with ChatGPT or another AI assistant?

The question “how much water does ChatGPT use per day” has become increasingly urgent as AI usage explodes globally. With over 1 billion queries processed daily, the daily water consumption is staggering – far more than most people realise.

How much water does ChatGPT use? The answer has sparked heated debate since 2023, with estimates ranging from a few drops to an entire water bottle per conversation. In June 2025, OpenAI CEO Sam Altman finally provided an official answer—but it conflicts sharply with academic research.

This comprehensive guide cuts through the confusion to reveal the truth about ChatGPT’s water consumption, why the numbers vary so wildly, and what it means for our planet as AI usage explodes globally.

Updated February 2026 – Latest data on how much water ChatGPT uses daily

How Much Water Does ChatGPT Use Per Query? The Competing Answers

The short answer: It depends entirely on who you ask—and what they’re measuring.

Sam Altman’s Official Claim (June 2025)

In a blog post titled “The Gentle Singularity,” OpenAI CEO Sam Altman stated:

“The average query uses about 0.34 watt-hours… It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon.”

Sam Altman’s figure: 0.3 millilitres (0.000085 gallons) per ChatGPT query

That’s smaller than an eyedrop. Barely noticeable. Almost negligible.

But there’s a critical catch: Altman’s figure only includes direct, on-site water cooling at OpenAI’s data centres. It excludes:

  • Water used for electricity generation (power plants)
  • Water consumed in manufacturing GPUs and servers
  • Indirect water footprints across the supply chain

Academic Research Estimates

Independent researchers using full lifecycle analysis (LCA) tell a dramatically different story:

University of California, Riverside (2023): 500 millilitres per 20-50 queries
Approximately 10-25 ml per individual query

Washington Post Analysis (2024): 519 millilitres to generate a 100-word email using GPT-4
Approximately 5+ ml per query

NIAIS Report: 5 millilitres per query (hybrid cooling systems)

European Climate Studies: 6-9 millilitres per query (accounting for energy grid water use)

GPT-3 Era Research: 10-12 millilitres per query (older, less efficient systems)

The Reality: Both Are Technically Correct

Sam Altman’s 0.3 ml and researchers’ 5-10 ml estimates are both accurate—they’re just measuring different things:

  • Altman measures: Direct data centre cooling water only
  • Researchers measure: Full lifecycle, including electricity generation, manufacturing, and indirect impacts

Think of it like calculating a car’s carbon footprint: Do you only count the exhaust, or also the emissions from manufacturing the car, refining the oil, and building the roads?

For this article, we’ll reference both figures but focus primarily on full lifecycle estimates (5-10 ml range) as they provide the complete environmental picture.


Why Do ChatGPT Water Usage Estimates Differ So Dramatically?

Understanding why estimates range from 0.3 ml to 25 ml per query requires understanding what each measurement includes—and excludes.

What Sam Altman’s 0.3 ml Figure Includes

Scope: On-site operational water only
Includes:

  • Cooling towers at OpenAI’s data centres
  • Liquid chillers
  • Evaporative cooling systems
  • Direct HVAC water circulation

Excludes:

  • Water used in off-site electricity generation
  • GPU and server manufacturing water
  • Supply chain water footprints
  • Indirect water consumption

System boundary: The physical walls of OpenAI’s facilities

This narrow scope makes Altman’s number look incredibly small—but it’s incomplete from an environmental perspective.

What Academic Estimates Include

Scope: Full lifecycle analysis (cradle-to-grave)
Includes everything Altman counts, PLUS:

  • Scope 2: Water used at power plants generating electricity for data centres
  • Scope 3: Water consumed in manufacturing semiconductors and servers
  • Supply chain: Water embedded in materials, transport, and infrastructure
  • Indirect effects: Water used for cooling power transmission equipment

System boundary: The entire AI infrastructure from chip fabrication to query response

This comprehensive approach reveals AI’s true water footprint—but generates much higher numbers.

The Methodology Debate

OpenAI’s Position:
“We measure what we directly control. Our on-site water efficiency has improved dramatically.”

Researchers’ Counter:
“Environmental impact doesn’t stop at your property line. Full accountability requires measuring the entire supply chain.”

The Truth:
Both perspectives have merit. Altman’s figure shows operational improvements. Academic estimates show the total environmental impact. The 5-10 ml range using full lifecycle analysis is the most honest representation of ChatGPT’s actual water consumption.

Model Efficiency Improvements

There’s also genuine progress to acknowledge:

  • GPT-3 (2020): ~10-12 ml per query
  • GPT-3.5 (2022): ~6-8 ml per query
  • GPT-4o (2024): ~3.5-5 ml per query (medium response)
  • Latest optimisations (2025): Approaching Altman’s claimed ~0.3 ml operational water

Newer models are genuinely more water-efficient, but they’re also being used far more frequently—meaning total water consumption is still rising.


How Does AI Actually Consume Water?

Every ChatGPT conversation triggers a complex chain of water-intensive processes. Let’s break down exactly how and where water gets used.

1. Data Centre Cooling (On-Site Water Use)

When you send a query to ChatGPT, powerful servers process billions of calculations. These servers generate immense heat—enough to fry the hardware without cooling.

Primary cooling methods:

Evaporative Cooling (Water-Intensive)

  • Water absorbs heat from servers
  • Hot water flows to cooling towers
  • Water evaporates, carrying heat away
  • Water loss: ~70-80% evaporates (not returned to system)
  • Usage: 2-10 litres per kWh, depending on climate

An average 100-megawatt data centre consumes approximately 2 million litres of water daily just for cooling—equivalent to 6,500 UK households.

Air Cooling with Evaporation

  • Uses water to cool intake air in hot/dry climates
  • Consumes roughly 70% of the withdrawn water
  • More efficient than full water cooling, but still water-intensive

Liquid Cooling (Emerging Technology)

  • Direct-to-chip liquid cooling using synthetic fluids
  • Dramatically reduces water consumption
  • Microsoft’s zero-water designs (launched August 2024) eliminate evaporation entirely
  • Still uncommon—most facilities use traditional water cooling

Water Quality Requirements:
Data centres typically require potable (drinking-quality) water because impurities damage sensitive equipment. This means AI competes directly with human needs for clean water.

2. Electricity Generation (Scope 2 Water Use)

Data centres consume massive amounts of electricity. In 2024, ChatGPT used approximately 39.98 million kWh daily to serve queries.

That electricity must be generated somewhere—and most power plants are water-intensive:

Coal Power Plants:

  • 19,185 gallons (72,628 litres) per megawatt-hour
  • Most water-intensive energy source
  • 40% of all U.S. water withdrawals in 2022 came from coal and gas plants

Natural Gas Plants:

  • 2,800 gallons (10,597 litres) per MWh
  • Still substantial water demand

Nuclear Power:

  • 10,000-40,000 gallons per MWh for cooling
  • Massive water requirements

Hydroelectric:

  • Water use through reservoir evaporation and flow requirements
  • Variable by location and design

The Math:
If 1 kWh requires 9 litres of water to generate (average), and ChatGPT uses 40 million kWh daily, that’s 360 million litres of water daily just for electricity—before counting direct cooling.

This is why academic estimates include Scope 2 water—it’s a direct consequence of AI operation.
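The arithmetic above is easy to reproduce; note that the 9 litres/kWh grid average is this article's assumption, not a measured constant:

```python
# Scope 2 water: back-of-envelope check of the figures above.
# Assumptions (from the text): ~9 litres of water per kWh generated
# on an average grid, and ~40 million kWh of electricity per day.
WATER_PER_KWH_LITRES = 9          # assumed grid average
DAILY_ENERGY_KWH = 40_000_000     # ~ChatGPT's reported daily draw

daily_scope2_water = WATER_PER_KWH_LITRES * DAILY_ENERGY_KWH
print(f"{daily_scope2_water:,} litres/day")  # 360,000,000 litres/day
```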

3. Hardware Manufacturing (Scope 3 Water Use)

Before a single query runs, enormous water investments go into creating AI infrastructure.

GPU and Chip Manufacturing:

  • 8-10 litres of ultra-pure water per single microchip
  • Approximately 2,200 gallons (8,328 litres) per advanced GPU
  • Water needed for:
    • Cooling machinery
    • Cleaning wafer sheets
    • Preventing contamination
    • Chemical processes

Semiconductor Fabrication:
Modern AI chips require cutting-edge semiconductor manufacturing—one of the most water-intensive industrial processes globally.

Server Production:
Apple reports that 99% of its water footprint comes from its supply chain, not operations—mostly from manufacturing. The same applies to AI hardware.

Low Recycling Rates:
Unlike data centres that can recycle some cooling water, chip manufacturing has poor water recycling rates, making it especially wasteful.

4. Infrastructure and Supply Chain Water

Additional hidden water costs include:

  • Mining rare earth minerals for chips
  • Manufacturing and cooling warehouse equipment
  • Transporting hardware globally
  • Building and maintaining data centre facilities
  • Producing renewable energy equipment (solar panels, wind turbines)

The same principle applies across AI’s supply chain: most water is consumed before your query even runs.


How Much Water Does ChatGPT Use Per Day? Daily Numbers Revealed

Individual queries may seem small, but scale transforms them into something massive.

Daily Water Consumption

So how much water does ChatGPT use per day exactly? The answer depends on which methodology you use:

ChatGPT serves approximately 1 billion queries daily (as of December 2025, from OpenAI’s 300 million weekly active users generating 1 billion+ messages per day).

Using different calculation methods:

Sam Altman’s Figure (0.3 ml operational water only):
1 billion queries × 0.3 ml = 300,000 litres daily
Equivalent to: 1,000 UK households

Academic Full Lifecycle (5-10 ml per query):
1 billion queries × 7.5 ml average = 7,500,000 litres daily
Equivalent to: 25,000 UK households or filling 3 Olympic swimming pools

With Electricity Generation Included:
Approximately 39.16 million gallons (148 million litres) daily, when including all water used in the electricity supply chain.

Visualisation:
ChatGPT’s daily water use equals everyone in Taiwan flushing their toilet simultaneously.

When people ask “how much water does ChatGPT use per day,” this 7.5 million litre figure represents the most honest answer. It accounts for direct cooling, electricity generation, and the full infrastructure supporting ChatGPT’s daily operations.

Quick Answer: How much water does ChatGPT use per day? 7.5 million litres using full lifecycle accounting, or 300,000 litres counting only direct operational water. Both figures are correct for what they measure; the lifecycle number gives the more complete picture.
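The two daily totals follow from the same query volume and different per-query figures; this sketch just multiplies them out (all inputs are the article's estimates):

```python
# Daily water totals under the two methodologies discussed above.
QUERIES_PER_DAY = 1_000_000_000   # ~1 billion queries/day (article estimate)

ML_PER_QUERY_OPERATIONAL = 0.3    # Sam Altman's on-site-only figure
ML_PER_QUERY_LIFECYCLE = 7.5      # midpoint of the 5-10 ml academic range

def daily_litres(ml_per_query: float) -> float:
    """Total daily water in litres for a given per-query footprint."""
    return QUERIES_PER_DAY * ml_per_query / 1000  # ml -> litres

print(f"Operational only: {daily_litres(ML_PER_QUERY_OPERATIONAL):,.0f} L/day")  # 300,000
print(f"Full lifecycle:   {daily_litres(ML_PER_QUERY_LIFECYCLE):,.0f} L/day")    # 7,500,000
```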

Annual Water Consumption

Low estimate (operational only): 109 million litres annually
Mid-range estimate (full lifecycle): 2.7 billion litres annually
High estimate (with full grid impacts): 54 billion litres annually

That’s enough to:

  • Fill Central Park Reservoir (New York) 7 times (high estimate)
  • Supply roughly 1,000 to 490,000 UK households for a year (low to high estimate)
  • Grow 270,000-2.7 million kg of wheat (range)
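Each annual figure is simply the corresponding daily estimate from the previous section times 365:

```python
# Annual totals: each estimate is its daily figure times 365 days.
DAYS_PER_YEAR = 365
daily_estimates_litres = {
    "operational only": 300_000,       # Altman's on-site figure
    "full lifecycle": 7_500_000,       # academic mid-range
    "with grid impacts": 148_000_000,  # electricity supply chain included
}
annual_litres = {k: v * DAYS_PER_YEAR for k, v in daily_estimates_litres.items()}
for label, litres in annual_litres.items():
    print(f"{label}: {litres / 1e9:.2f} billion litres/year")
```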

Context: Is This A Lot?

Compared to other industries:

Residential lawn watering (U.S.): 34 billion litres daily
ChatGPT water use: 0.15-7.5 million litres daily

Percentage: ChatGPT represents approximately 0.02-0.4% of U.S. residential lawn watering

For now, AI’s daily water totals are small compared with common uses like lawns, showers and laundry.

However, this doesn’t tell the full story:

  1. AI is growing exponentially. Lawn watering is stable; AI demand doubles yearly.
  2. Location matters. Many data centres sit in water-stressed regions where every litre counts more.
  3. It’s cumulative. ChatGPT + Google Gemini + Anthropic Claude + Microsoft Copilot + Meta AI + image generators + video generators = massive combined footprint.
  4. It uses potable water. Lawns can use greywater; data centres need drinking-quality water.

Projected Growth

Morgan Stanley projects AI data centres’ global annual water consumption will reach 1,068 billion litres by 2028, roughly 11 times the 2024 level.

By 2030, the International Energy Agency estimates global data centres could consume 1,200 billion litres annually—more than double the 560 billion used in 2022.

The trajectory is alarming. Even if per-query efficiency improves, total consumption skyrockets as AI becomes ubiquitous.


Comparing ChatGPT to Other AI Models and Activities

How does ChatGPT’s water consumption stack up against alternatives?

ChatGPT vs. Other AI Models

| AI Model/Service | Water Per Query (Lifecycle) | Notes |
| --- | --- | --- |
| ChatGPT (GPT-3) | 10-12 ml | Older, less efficient |
| ChatGPT (GPT-3.5) | 6-8 ml | Improved efficiency |
| ChatGPT (GPT-4) | 8-10 ml | Larger model, more computation |
| ChatGPT (GPT-4o) | 3.5-5 ml | Most efficient OpenAI model (2024) |
| Google Gemini | 0.26 ml (Google claim) | Median text prompt; unclear scope |
| Google Search | 0.2-0.5 ml | Traditional search is far more efficient |
| AI image generation (DALL-E, Midjourney) | 50-100 ml | Much more water-intensive than text |
| AI video generation (5 seconds) | 500-1,000 ml | Extremely resource-heavy |

Key insights:

  • Text generation is relatively efficient
  • Image generation uses 10-20x more water
  • Video generation is the most water-intensive AI task
  • Google’s figures are even lower than OpenAI’s, but may have similar scope limitations
  • Traditional search engines use a fraction of what AI chatbots consume

ChatGPT vs. Everyday Activities

| Activity | Water Used | Comparison to 1 query |
| --- | --- | --- |
| 1 ChatGPT query | 5-10 ml | baseline |
| Flushing a toilet | 6,000-9,000 ml | ~1,000x more |
| 2-minute shower | 15,000-30,000 ml | ~2,000-4,000x more |
| Washing machine load | 50,000-100,000 ml | ~7,000-15,000x more |
| Google search | 0.2-0.5 ml | ~15x less |
| Watering lawn (10 min) | 60,000-90,000 ml | ~10,000x more |
| Making 1 cup of coffee | 140,000 ml (lifecycle) | ~20,000x more |
| Producing 1 kg of beef | 15,400,000 ml | ~2 million x more |

Context matters:
Yes, ChatGPT uses far less water than most daily activities. But:

  • You use ChatGPT dozens to hundreds of times daily
  • It uses potable water in drought-stressed regions
  • The number of users and queries is exploding
  • It’s an additional water demand on top of existing needs

30 Minutes of ChatGPT Use

If you have an average conversation of 30 queries in 30 minutes:

Operational water (Altman): 9 ml (less than 2 teaspoons)
Full lifecycle (academic): 150-300 ml (half to full water bottle)

Multiply by daily users (millions), and the numbers become staggering.
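The session arithmetic above is just the per-query figures times 30:

```python
# Water for a 30-query, 30-minute session under each accounting method.
QUERIES = 30
operational_ml = QUERIES * 0.3              # Altman's on-site-only figure
lifecycle_ml = (QUERIES * 5, QUERIES * 10)  # academic 5-10 ml lifecycle range

print(f"Operational: {operational_ml:.0f} ml")                  # ~9 ml
print(f"Lifecycle:   {lifecycle_ml[0]}-{lifecycle_ml[1]} ml")   # 150-300 ml
```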


Factors That Dramatically Change ChatGPT’s Water Usage

Water consumption varies wildly based on when, where, and how you use ChatGPT.

1. Geographic Location of Data Centres

Cool, humid regions (Ireland, Scandinavia):

  • Can often rely on outside air or chillers for months at a time
  • Minimal water use (about 1.8 litres/kWh at Microsoft’s Irish facilities)

Hot, dry regions (Arizona, Nevada, Texas):

  • Heavy reliance on evaporative cooling, especially during summer heat
  • Massive water consumption (up to about 12 litres/kWh at Microsoft’s Washington state facility)

Water-stressed areas:

  • About 20% of US data centres already pull from sources that are moderately to highly stressed
  • Google’s data centre project in Santiago, Chile: 7.6 million litres of potable water daily, in a region suffering severe drought
  • Google’s Mesa, Arizona, facility initially used up to 15 million litres daily

Emerging markets:
Arid regions like Saudi Arabia, the UAE, China, and India are building massive new AI data centres in already water-scarce areas.

2. Seasonal and Time-of-Day Variations

Summer vs. Winter:

  • A University of Massachusetts Amherst study found that a data centre might use only half as much water in winter as in summer
  • Midday heatwaves push cooling systems to work overtime
  • Nighttime queries use less water due to cooler ambient temperatures

Best time for lower water usage: Winter evenings in cool climates

3. Query Complexity and Length

Not all ChatGPT interactions are equal:

Short query (20-50 words):

  • Energy: ~0.3-0.5 Wh
  • Water: ~3-5 ml

Medium query (100-300 words input/output):

  • Energy: ~0.5-1.5 Wh
  • Water: ~5-10 ml

Long query (analysing a paper, 10,000 tokens):

  • Energy: ~2.5 Wh
  • Water: ~20-30 ml

Very long query (processing 100,000 tokens, ~200 pages):

  • Energy: ~40 Wh
  • Water: ~300-400 ml
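Across these four tiers, water tracks energy at roughly 8-10 ml per Wh on a full-lifecycle basis. This rough interpolation sketch uses that inferred intensity factor (my reading of the tiers above, not an official figure):

```python
# Rough per-query water estimate from energy use, assuming a
# full-lifecycle intensity of ~8-10 ml of water per Wh.
# The intensity range is inferred from the query tiers above.
ML_PER_WH_LOW, ML_PER_WH_HIGH = 8, 10

def water_range_ml(energy_wh: float) -> tuple:
    """Return a (low, high) water estimate in ml for a query's energy draw."""
    return (energy_wh * ML_PER_WH_LOW, energy_wh * ML_PER_WH_HIGH)

for label, wh in [("short", 0.4), ("medium", 1.0), ("long", 2.5), ("very long", 40)]:
    lo, hi = water_range_ml(wh)
    print(f"{label:9s} ~{wh} Wh -> ~{lo:.0f}-{hi:.0f} ml")
```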

Tasks that use more water:

  • Code generation
  • Document analysis
  • Long conversations
  • Image generation (if using DALL-E)
  • Complex problem-solving

Tasks that use less water:

  • Simple questions
  • Short translations
  • Quick factual lookups

4. Infrastructure and Cooling Technology

Traditional water cooling: High consumption
Advanced liquid cooling: Moderate consumption
Immersion cooling: Minimal water (uses synthetic oils)
Microsoft’s zero-water design (2024+): Eliminates evaporation entirely

5. Energy Grid Carbon and Water Intensity

Renewable-powered data centres (hydro, wind, solar):

  • Lower water footprint (but not zero—manufacturing and some operations still use water)

Coal-heavy grids:

  • Massive indirect water consumption from power generation

Natural gas grids:

  • Moderate water intensity

Nuclear-powered regions:

  • Very high water consumption for cooling reactors

Where your query is processed matters enormously for total water impact.


The Hidden Water Costs of AI Nobody Talks About

Beyond per-query consumption, AI has several hidden water impacts:

Training Large Language Models

While this guide focuses on inference (answering queries), training AI models is even more water-intensive.

Training GPT-3 in Microsoft’s data centres consumed an estimated 700,000 litres of water, The Washington Post reported.

GPT-4 likely required several times more. Future models will be even larger.

Training happens once; inference happens billions of times. Over an AI model’s lifetime, inference actually consumes more total water—but training is a massive upfront cost.

Semiconductor Manufacturing

Every GPU powering AI represents thousands of litres of ultra-pure water consumed during fabrication.

Manufacturing the GPUs that fill NVIDIA’s AI servers could consume hundreds of millions of litres of water annually for chip production alone.

Water vs. Carbon Tradeoffs

Efforts to reduce AI’s carbon footprint sometimes increase water consumption:

  • Solar panels in hot regions require more cooling
  • Switching from coal to natural gas reduces carbon but doesn’t eliminate water use
  • Renewable energy manufacturing is water-intensive

Efforts to reduce carbon emissions are often in conflict with reducing water use—creating difficult environmental tradeoffs.

Water Withdrawal vs. Water Consumption

Water withdrawal: Total water taken from sources
Water consumption: Water evaporated or permanently removed

Data centres consume roughly 70% of the water withdrawn through evaporation. Only 30% returns to treatment systems.

This matters because evaporated water is lost from local ecosystems—it doesn’t immediately return to local water supplies.

Social Justice Implications

In Aragón, Spain, protests erupted under the slogan “Your cloud is drying my river,” reflecting tensions between global tech giants and local residents competing for the same water.

The ethical question: Is it legitimate to allocate millions of litres to train AI in drought-prone regions while communities face scarcity?

Data centres are being built in water-stressed areas globally:

  • US Southwest: Severe drought regions hosting major AI facilities
  • Chile: Santiago’s Google centre in drought conditions
  • India, China: Massive new data centres in already water-scarce areas

UNESCO has reaffirmed that access to water is a human right. The debate isn’t just about efficiency—it’s about justice.


Lack of Transparency and Accountability

There is no requirement for data centres to report how much water they are consuming—making independent verification difficult.

Companies can claim “water positive” status without meaningful accountability:

  • Companies such as Google and Microsoft claim they will be “water positive” by 2030…However, there is little accountability or evidence…and water replenishment often happens in different locations than where it was extracted

Replenishing water in Oregon doesn’t help a drought-stricken Arizona community.


Solutions: How to Reduce AI’s Water Footprint

The good news? Technology and policy can dramatically reduce how much water ChatGPT and other AI systems consume.

Corporate Solutions Already in Progress

1. Zero-Water Cooling Systems

Starting in August 2024, Microsoft launched a new data centre design that optimises AI workloads and consumes zero water for cooling.

These systems:

  • Recycle water through closed loops
  • Avoid evaporation entirely
  • Save more than 125 million litres per data centre annually
  • Will debut in Phoenix, Arizona and Mt. Pleasant, Wisconsin in 2026

2. Immersion and Direct-to-Chip Cooling

Immersion cooling submerges servers in fluids that don’t conduct electricity, such as synthetic oils, reducing water evaporation almost entirely.

The global data centre liquid cooling market is exploding from $5.7 billion (2024) to a projected $48.4 billion by 2034—a 23.96% annual growth rate.
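Those market endpoints imply the growth rate quoted; a quick compound-annual-growth check:

```python
# CAGR implied by the liquid-cooling market figures above.
start_bn, end_bn, years = 5.7, 48.4, 10   # $bn in 2024 -> projected $bn in 2034
cagr = (end_bn / start_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # close to the quoted ~24% annual growth
```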

Benefits:

  • Up to 50% improvement in energy efficiency
  • Dramatically reduced water consumption
  • Allows higher-density computing (more AI power per square metre)

3. Strategic Data Centre Location

Moving facilities to:

  • Cooler climates (Scandinavia, Ireland, Canada)
  • Regions with abundant renewable energy
  • Areas NOT facing water stress

4. Renewable Energy Transition

Powering data centres with wind and solar (combined with battery storage) eliminates water use from fossil fuel power generation.

5. Water Replenishment and Conservation

As of July 2024, Microsoft has invested more than $34 million in 76 replenishment projects around the world…estimated to provide more than 100 million cubic metres in volumetric water benefit.

In 2024, Google reported replenishing 4.5 billion gallons of water, roughly 64% of its freshwater consumption.

Google and Microsoft both aim for “water positive” status by 2030—returning more water than they consume.

However, replenishment in one location doesn’t solve water stress in another. Real solutions require reducing consumption, not just offsetting it elsewhere.

6. AI-Powered Water Conservation

Paradoxically, AI itself can help conserve water:

  • Leak detection: UK utility SES Water…used satellite analytics to achieve a 24% reduction in water loss through leak detection
  • Agricultural optimisation: Vintel app helps winegrowers decrease irrigation by 45-80%
  • Predictive flood management: Predictive flood models…have stored 58,000 cubic metres of stormwater, recharging aquifers

Since 2023, Microsoft has been working…with water utilities…helping to reduce water loss from public pipeline networks using AI—nearly a third of piped water globally is lost before reaching people.

Policy and Regulatory Solutions

1. Mandatory Water Disclosure

The European Commission’s Water Resilience Strategy aims to strengthen water governance…including new performance standards for high-water-use industries such as data centres.

The UK has introduced new reporting requirements for data centre consumption.

Under the EU AI Act, high-risk AI systems will soon require annual reports on energy and resource usage.

2. Water Use Limits in Stressed Regions

Restricting or banning new data centres in water-scarce areas unless they use zero-water cooling.

3. Water Pricing Reform

Many data centres pay minimal rates for water. Pricing that reflects true scarcity would incentivise conservation.

4. Right-to-Know Laws

Communities deserve transparency about local data centre water usage before facilities are built.

What Model Developers Can Do

1. Optimise Model Efficiency

Smaller, more efficient models that maintain quality while requiring less computation.

GPT-4o uses ~30% less energy than GPT-4 while maintaining performance—proving efficiency gains are possible.

2. Inference Optimisation

  • Caching common queries
  • Batch processing
  • Smart workload distribution
  • Running queries during off-peak hours in cooler conditions

3. Hybrid Approaches

Using smaller models for simple tasks, reserving powerful models for complex queries.

4. Transparency

Publishing detailed water consumption data, methodologies, and improvements over time.


What You Can Do: Reducing Your AI Water Footprint

Individual actions matter—especially as they influence corporate behaviour.

1. Use AI More Intentionally

Before using ChatGPT, ask:

  • Could I find this answer with a traditional search? (15x less water)
  • Is this query necessary?
  • Can I combine multiple questions into one conversation?

Be concise:

  • Shorter prompts use less water
  • Avoid asking ChatGPT to repeat itself
  • Skip pleasantries like “thank you” at the end (Altman has said these extra tokens cost OpenAI “tens of millions of dollars” in electricity, with corresponding water use)

2. Choose Lower-Impact AI Tasks

Lower water use:

  • Text-based queries
  • Simple questions
  • Factual lookups
  • Brief summaries

Higher water use:

  • AI image generation (10-20x more)
  • AI video generation (100x+ more)
  • Long document analysis
  • Code generation
  • Extended conversations

Reserve AI for tasks where it genuinely adds value.

3. Support Sustainable AI Companies

Choose AI services from companies that:

  • Publish water consumption data transparently
  • Use renewable energy
  • Invest in water-efficient cooling
  • Build data centres in non-water-stressed regions
  • Commit to water replenishment

4. Advocate for Policy Change

  • Support mandatory water disclosure laws
  • Contact representatives about data centre regulations in water-stressed areas
  • Join or support organisations pushing for environmental accountability in tech

5. Spread Awareness

Share information about AI’s water footprint. Many people have no idea their ChatGPT conversations consume water.

6. Balance Your Overall Water Footprint

If you use AI frequently, compensate by:

  • Reducing shower time
  • Fixing leaks
  • Choosing water-efficient appliances
  • Eating less water-intensive foods (beef, almonds)
  • Supporting water conservation in your community

Remember: Generative AI does use water, but – at least for now – its daily totals are small compared with other common uses such as lawns, showers and laundry.

AI isn’t the biggest water problem—but it’s a growing one, and collective awareness drives corporate change.


Frequently Asked Questions

How much water does ChatGPT use per day in 2026?

As of February 2026, ChatGPT uses approximately 7.5 million litres of water daily when accounting for full lifecycle impacts (direct cooling, electricity generation, and infrastructure). This is based on serving roughly 1 billion queries per day and using the academic estimate of 5-10 ml per query. OpenAI’s own figures (counting only operational cooling water) suggest 300,000 litres daily. The true answer to “how much water does ChatGPT use per day” lies between these figures, with 7.5 million litres being the most environmentally honest estimate.

How much water does ChatGPT really use per query?

Official answer (Sam Altman): 0.3 millilitres (operational water only)
Academic estimate (full lifecycle): 5-10 millilitres per query, including electricity generation and indirect impacts
Most honest answer: 5-10 ml when accounting for the complete environmental footprint. Altman’s 0.3 ml only measures direct data centre cooling, excluding the water used to generate electricity and manufacture hardware. Both figures are technically correct—they just measure different things.

Is 5-10 ml of water per ChatGPT query a lot?

Per individual query, no—it’s less than two teaspoons. But scaled to billions of daily queries globally, it becomes massive. ChatGPT alone uses approximately 7.5 million litres daily (full lifecycle) or 2.7 billion litres annually. When you include all AI services (Google Gemini, Claude, image generators, etc.), projected global AI water consumption will reach 1,068 billion litres by 2028—11 times higher than in 2024.

Does ChatGPT use more water than a Google search?

Yes, significantly. A traditional Google search uses about 0.2-0.5 ml of water, while a ChatGPT query uses 5-10 ml (full lifecycle)—approximately 15-20 times more. This is because AI language models require far more computational power than retrieving indexed search results. Simple searches are much more water-efficient than AI chatbots.

Where does the water go when ChatGPT uses it?

Most water (70-80%) used in data centre cooling evaporates into the atmosphere and is temporarily removed from local water systems. The remaining 20-30% is discharged to wastewater treatment facilities. The evaporated water eventually returns through the water cycle, but not necessarily to the same region that supplied it—which is why data centres in drought areas create local water stress.

Why do estimates of ChatGPT’s water use vary so wildly?

Different studies measure different things. Sam Altman’s 0.3 ml includes only operational cooling water at OpenAI’s data centres. Academic estimates of 5-10 ml include the full lifecycle: direct cooling, electricity generation (power plants use enormous amounts of water), GPU manufacturing, and supply chain impacts. The variation reflects different “system boundaries”—what gets counted and what doesn’t.

Do all AI models use the same amount of water as ChatGPT?

No. Water usage varies dramatically by model size and task. GPT-4o (ChatGPT’s latest model) uses about 3.5-5 ml per medium query. GPT-4 uses 8-10 ml. AI image generation (DALL-E, Midjourney) uses 50-100 ml. AI video generation uses 500-1,000 ml for just 5 seconds. Text-based AI is the most water-efficient; visual AI is far more resource-intensive.

Is ChatGPT’s water consumption getting better or worse?

Per query: Getting better. GPT-4o uses roughly 60% less water per query than GPT-3 (3.5-5 ml vs 10-12 ml). New cooling technologies (Microsoft’s zero-water systems, immersion cooling) are dramatically reducing operational water use.
Total consumption: Getting worse. Despite efficiency improvements, total water use is skyrocketing because billions more people are using AI far more frequently. Efficiency gains are being overwhelmed by exponential growth in demand.

Can ChatGPT use recycled or greywater instead of drinking water?

Theoretically, yes, but currently no. Data centres require ultra-pure, contaminant-free water because impurities damage sensitive equipment. Most facilities use potable (drinking-quality) water. Some newer systems can use non-potable water with advanced filtration, but this isn’t yet standard. The competition for clean water is a major ethical concern.

Does using ChatGPT at night use less water?

Potentially yes. Water consumption is lower during:

  • Cooler nighttime temperatures (less cooling needed)
  • Winter months vs. summer (50% less in some locations)
  • Queries processed in cool-climate data centres (Ireland vs. Arizona)

However, you have no control over which data centre processes your query, so timing your usage has minimal practical impact.

How does ChatGPT’s water use compare to training the model?

Training GPT-3 consumed approximately 700,000 litres of water—a massive one-time cost. However, inference (answering your queries) happens billions of times daily. Over a model’s lifetime, inference consumes far more total water than training. Both training and inference are significant water consumers.

Is AI’s water consumption more important than its energy consumption?

Both matter, but they’re interconnected. Data centres use energy to run servers and water to cool them. Additionally, generating electricity consumes water (especially coal and nuclear plants). Reducing energy consumption automatically reduces water consumption. They’re two sides of the same environmental impact coin.

Are there any AI chatbots that use zero water?

No mainstream AI chatbot uses truly zero water when you account for electricity generation and hardware manufacturing. However, Microsoft’s new zero-water data centre designs (launched August 2024) eliminate operational cooling water evaporation. Combined with renewable energy and efficient hardware, future AI could approach near-zero water consumption—but we’re not there yet.

What’s worse for the environment: using ChatGPT or taking a longer shower?

A 2-minute shower uses about 15,000-30,000 ml of water—equivalent to 2,000-4,000 ChatGPT queries at full-lifecycle estimates. However, most shower water drains back into local treatment systems and re-enters the water cycle, while data centre cooling water often evaporates, permanently removing potable supplies from drought-stressed regions. Both matter—they’re not mutually exclusive. Reduce both where possible.
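The shower comparison is simple arithmetic, shown below using the article’s own figures: a 15-30 litre shower divided by a mid-range full-lifecycle estimate of 7.5 ml per query (an assumed midpoint of the 5-10 ml academic range).

```python
# Sanity-check the shower-vs-ChatGPT comparison using the article's figures.
shower_ml = (15_000, 30_000)  # 2-minute shower, in millilitres
query_ml = 7.5                # assumed mid-range full-lifecycle estimate per query

low = shower_ml[0] / query_ml   # 2000.0 queries
high = shower_ml[1] / query_ml  # 4000.0 queries
print(f"One 2-minute shower ~ {low:.0f}-{high:.0f} ChatGPT queries")
```

Using Sam Altman’s 0.3 ml operational figure instead would put the same shower at 50,000-100,000 queries, which is why the choice of per-query estimate changes the story so dramatically.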

Do “water positive” commitments from tech companies actually help?

Mixed results. Google and Microsoft pledged to replenish more water than they consume by 2030, investing millions in water projects. However, replenishing water in Oregon doesn’t help a drought-stricken Arizona community where a data centre operates. True impact requires reducing consumption where water stress exists, not just offsetting it elsewhere. Transparency and local solutions matter more than global offsets.

Should I stop using ChatGPT because of its water footprint?

Not necessarily. ChatGPT’s per-query water use is modest compared to daily activities like showers, laundry, or lawn watering. However, use it intentionally rather than thoughtlessly. Reserve AI for tasks where it genuinely adds value. Choose text over image/video generation. Support companies investing in water-efficient technology. Individual awareness drives collective pressure for corporate change.

How much water does generating an AI image use?

AI image generation (DALL-E, Midjourney, Stable Diffusion) uses approximately 50-100 ml per image, 10-20 times more water than a text query. Generating images requires significantly more computational power than processing text, translating to higher energy and water consumption. If you use AI for images frequently, that’s where your water footprint is concentrated.

Will AI’s water use become a bigger problem in the future?

Yes, dramatically. Morgan Stanley projects AI data centres’ global annual water consumption will reach 1,068 billion litres by 2028—11 times higher than in 2024. The International Energy Agency forecasts 1,200 billion litres by 2030. Even with efficiency improvements, explosive growth in AI usage (billions more people, more frequent use, more powerful models) will overwhelm gains unless radical infrastructure changes happen.
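The scale of that projection is easier to grasp when unpacked. The sketch below derives the implied 2024 baseline and annual growth rate from the two numbers the Morgan Stanley projection provides (1,068 billion litres by 2028, 11 times the 2024 level); the derived values are arithmetic consequences, not figures from the report itself.

```python
# Unpack the Morgan Stanley projection cited above:
# 1,068 billion litres by 2028, described as 11x the 2024 level.
litres_2028 = 1_068e9
growth_factor = 11
years = 2028 - 2024

litres_2024 = litres_2028 / growth_factor    # implied 2024 baseline
cagr = growth_factor ** (1 / years) - 1      # implied compound annual growth

print(f"Implied 2024 baseline: {litres_2024 / 1e9:.0f} billion litres")
print(f"Implied annual growth rate: {cagr:.0%}")
```

An implied growth rate of roughly 80% per year explains why per-query efficiency gains of 40% cannot, on their own, keep total consumption from rising.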

What’s the single biggest thing that could reduce ChatGPT’s water footprint?

Short-term: Deploying Microsoft’s zero-water cooling systems globally would eliminate operational water evaporation—the most visible water use.
Long-term: Transitioning to 100% renewable energy eliminates water consumption from fossil fuel power generation (Scope 2), which is actually larger than direct cooling water. Combined with sustainable hardware manufacturing and strategic data centre locations in cool, non-water-stressed regions, AI’s water footprint could drop by 80-90%.


Final Thoughts: The Hidden Cost of Every AI Conversation

Every time you chat with ChatGPT, a complex chain of water-intensive processes unfolds invisibly behind the scenes.

The truth about how much water ChatGPT uses is complicated:

  • Sam Altman’s 0.3 ml is technically correct for direct operational water
  • Academic estimates of 5-10 ml are more environmentally honest
  • Both numbers matter—they tell different parts of the story

The bigger truth? AI’s water footprint is growing exponentially, even as per-query efficiency improves. What seems negligible at the individual level becomes massive at the global scale.

But here’s what matters most: You now know something millions of ChatGPT users don’t. That knowledge gives you power:

  • To use AI more intentionally
  • To pressure companies for transparency
  • To support better policies
  • To make informed choices about when AI genuinely adds value

AI isn’t going away—and it shouldn’t. It offers transformative benefits. But those benefits don’t require sacrificing water sustainability.

The question isn’t whether to use AI, but how to use it responsibly while pushing for the infrastructure changes that make AI truly sustainable: zero-water cooling, renewable energy, strategic locations, and radical transparency.

Your ChatGPT conversation might use just a few millilitres of water. But your voice, combined with millions of others demanding better, can save billions of litres.

Every conversation matters—both the ones with AI, and the ones we have with each other about its true cost.


Data accurate as of December 2025. AI technology and water efficiency strategies are rapidly evolving. Check OpenAI, Google, and Microsoft’s latest sustainability reports for current figures.
