On March 5, Amazon, Google, Meta, Microsoft, OpenAI, Oracle, and xAI gathered at the White House to sign the “Ratepayer Protection Pledge” - a commitment that AI’s power demands won’t raise electricity bills for ordinary Americans. Under the pledge, the companies will “build, bring, or buy” their own power.
What went unmentioned: the water those data centers consume, much of it in regions already facing drought.
The Numbers
The International Energy Agency projects data centers will consume 1,000 terawatt-hours of electricity in 2026 - roughly equivalent to Japan’s entire national consumption. A single large AI data center can draw as much electricity as 100,000 households.
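The household comparison can be sanity-checked with back-of-envelope arithmetic. The per-household consumption figure below (~10,800 kWh per year, a commonly cited U.S. average) is an assumption, not from the article:

```python
# Rough check: what does "100,000 households" mean as a continuous load?
# Assumed value (not from the article): ~10,800 kWh/year per U.S. household.
KWH_PER_HOUSEHOLD_YEAR = 10_800
HOUSEHOLDS = 100_000
HOURS_PER_YEAR = 8760

annual_twh = HOUSEHOLDS * KWH_PER_HOUSEHOLD_YEAR / 1e9               # kWh -> TWh
avg_load_mw = HOUSEHOLDS * KWH_PER_HOUSEHOLD_YEAR / HOURS_PER_YEAR / 1e3  # avg kW -> MW

print(f"~{annual_twh:.2f} TWh/yr, ~{avg_load_mw:.0f} MW continuous")
```

That works out to roughly 120 MW of round-the-clock draw per facility - the scale of a small power plant, which is why siting dominates the grid conversation.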
U.S. data centers now account for 4.4% of national electricity consumption, up from 1.9% in 2018. By 2028, that could reach 12%.
Water tells a similar story. According to Cornell research, AI data centers will consume 731-1,125 million cubic meters of water annually by 2030 - equivalent to household water usage for 6-10 million Americans. Larger facilities already require up to 5 million gallons daily, matching the daily use of a city of 50,000 people.
By 2028, data centers nationwide could collectively use as much water as 18.5 million households.
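The people-equivalents in the Cornell projection can be reproduced from one assumed input. The per-capita figure below (~82 gallons of household water per person per day, a commonly cited U.S. average) is an assumption, not from the study:

```python
# Back-of-envelope check on the Cornell water figures.
GALLONS_PER_CUBIC_METER = 264.172
PER_CAPITA_GAL_PER_DAY = 82  # assumed U.S. household average per person

def people_equivalent(annual_m3: float) -> float:
    """People whose annual household water use matches annual_m3."""
    annual_gallons = annual_m3 * GALLONS_PER_CUBIC_METER
    return annual_gallons / (PER_CAPITA_GAL_PER_DAY * 365)

low = people_equivalent(731e6)     # lower bound of the projection
high = people_equivalent(1_125e6)  # upper bound

print(f"{low/1e6:.1f} to {high/1e6:.1f} million people")
```

Under that assumption the 731-1,125 million cubic meter range lands at roughly 6.5 to 10 million people, consistent with the article's figure.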
Where the Thirst Is Worst
The problem isn’t just volume - it’s location. Two-thirds of data centers built since 2022 sit in water-stressed regions.
Texas: Data centers across the state used 49 billion gallons of water in 2025, with most of the state under drought conditions. No immediate relief is in sight.
Arizona: One Meta facility in drought-stricken Mesa consumes as much water as 10,000 homes. Tempe and Phoenix have begun limiting water-cooling capabilities for new data centers.
Virginia: Northern Virginia hosts the world’s densest concentration of data centers - roughly 300 facilities in a handful of counties. Their water use nearly doubled from 2019 to 2023, reaching almost 2 billion gallons annually.
The pattern is consistent: tech companies chase cheap land and friendly regulations, often in regions least able to afford the water cost.
The Energy Per Query
How much does your ChatGPT conversation actually consume?
According to Carbon Credits analysis, GPT-4o uses about 0.30 watt-hours per request, producing 0.13 grams of CO2 on a typical electricity grid. Claude 3 Opus runs higher at 4.05 Wh per request - roughly 13.5 times more - with 1.80 grams of CO2. Claude’s lighter Haiku model drops to 0.22 Wh.
These numbers seem small until you scale them. ChatGPT handles over 1 billion queries daily; at GPT-4o’s per-request figures, that works out to roughly 300 megawatt-hours of electricity and about 130,000 kilograms of CO2 every day.
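The daily totals follow directly from the per-request figures quoted above; a minimal sketch using only those numbers:

```python
# Daily totals implied by the per-query figures quoted above.
QUERIES_PER_DAY = 1e9      # ChatGPT's reported daily volume
WH_PER_QUERY = 0.30        # GPT-4o watt-hours per request
CO2_G_PER_QUERY = 0.13     # grams CO2 per request on a typical grid

daily_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1e6        # Wh -> MWh
daily_co2_kg = QUERIES_PER_DAY * CO2_G_PER_QUERY / 1e3  # g -> kg

print(f"~{daily_mwh:.0f} MWh and ~{daily_co2_kg:,.0f} kg CO2 per day")
```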
Inference now accounts for 80-90% of AI’s total energy consumption. Training GPT-4 required an estimated 50-62 gigawatt-hours. But the daily grind of answering questions dwarfs it.
Epoch AI projects the world could see 329 billion AI prompts per day by 2030 - about 38 queries per person alive on the planet.
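Combining that projection with the per-query figure from the Carbon Credits analysis gives a rough ceiling for global inference energy. Assuming GPT-4o-class efficiency still holds in 2030 is a strong assumption (models and hardware will change), so treat this as illustrative:

```python
PROMPTS_PER_DAY_2030 = 329e9   # Epoch AI projection
WORLD_POPULATION = 8.5e9       # assumed ~2030 global population
WH_PER_QUERY = 0.30            # assumes GPT-4o-class efficiency persists

prompts_per_person = PROMPTS_PER_DAY_2030 / WORLD_POPULATION   # ~39
annual_twh = PROMPTS_PER_DAY_2030 * WH_PER_QUERY * 365 / 1e12  # Wh -> TWh

print(f"~{prompts_per_person:.1f} prompts/person/day, ~{annual_twh:.0f} TWh/yr of inference")
```

Even under that frozen-efficiency assumption, inference alone would approach 36 TWh per year - a few percent of the IEA's 1,000 TWh data-center total, with the rest going to training, other workloads, and overhead.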
The Carbon Math
By 2030, AI data centers are projected to emit 24-44 million metric tons of CO2 annually - equivalent to adding 5-10 million cars to U.S. roads.
Even optimistic scenarios leave problems. An aggressive push to renewables would drop emissions roughly 15% from baseline, but approximately 11 million tons of residual emissions would remain. Offsetting that would require about 28 gigawatts of wind or 43 gigawatts of solar capacity.
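The wind and solar figures imply the two capacities would generate the same annual energy. With assumed capacity factors (roughly 35% for onshore wind and 23% for utility-scale solar, typical U.S. values not taken from the study), they do land within a few percent of each other:

```python
HOURS_PER_YEAR = 8760
CF_WIND = 0.35    # assumed typical U.S. onshore wind capacity factor
CF_SOLAR = 0.23   # assumed typical U.S. utility-scale solar capacity factor

def annual_twh(capacity_gw: float, cf: float) -> float:
    """Annual energy from a nameplate capacity at a given capacity factor."""
    return capacity_gw * cf * HOURS_PER_YEAR / 1000  # GWh -> TWh

wind_twh = annual_twh(28, CF_WIND)    # ~86 TWh
solar_twh = annual_twh(43, CF_SOLAR)  # ~87 TWh
```

Both work out to roughly 86 TWh per year, consistent with the article treating the two buildouts as interchangeable offsets.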
The tech companies have announced clean energy commitments. Amazon’s agreement with Energy Northwest aims for 5 gigawatts of small modular reactor capacity by 2039. Microsoft is reviving Three Mile Island. Google’s partnership with Kairos Power adds 500 megawatts of future nuclear capacity.
But these projects take years. The data centers are being built now.
The $650 Billion Push
The ratepayer pledge came as Big Tech announced $650 billion in combined AI spending for 2026. Amazon projects $200 billion in capital expenditures (up from $131 billion in 2025). Google estimates $175-185 billion (up from $91 billion).
AWS CEO Matt Garman called the pledge “an important baseline that will protect ratepayers.” Microsoft’s Brad Smith said it ensures data centers won’t contribute to higher electricity prices.
Neither mentioned water.
What Could Help
Cornell researchers identified three strategies that, combined, could reduce carbon by 73% and water consumption by 86%:
Smart siting matters most for water. Prioritizing Midwest and “windbelt” states like Texas (northern regions), Montana, Nebraska, and South Dakota could cut water use by 52%. New York’s nuclear-hydro-renewables mix makes it a climate-friendly option. Avoiding water-stressed regions is the single biggest lever.
Grid decarbonization cuts carbon 15% in optimistic scenarios. This requires accelerating clean energy deployment wherever AI facilities expand.
Operational efficiency adds another 7% carbon reduction and 29% water savings. Advanced liquid cooling and improved server utilization help, but they’re incremental compared to siting decisions.
The study emphasizes this decade represents “the build-out moment” - choices made now lock in infrastructure patterns for decades.
What You Can Do
If environmental impact matters to you:
Use smaller models when possible. Claude Haiku uses roughly one-eighteenth the energy of Claude Opus. GPT-4o mini exists for a reason. For simple questions, lighter models work fine.
Run local when practical. A model on your laptop uses your local grid’s energy mix, not a data center in Arizona pulling from drought-stricken aquifers.
Ask fewer throwaway questions. The “just curious” prompt to a frontier model has a real footprint. Not a reason to stop using AI, but worth considering which questions actually need the biggest models.
Support transparency. Exact figures for model training and system usage are often not publicly disclosed. Push companies to report comprehensive environmental impact data.
The Bottom Line
Big Tech signed a pledge that their AI expansion won’t raise your electricity bill. They didn’t promise the same for water tables in drought regions, or for carbon emissions that will persist even with aggressive renewable deployment.
The environmental cost of AI is real and growing faster than the mitigation efforts. Whether that changes depends on choices being made right now about where to build and how to power the infrastructure for the next decade of AI.