Sam Altman thinks AI’s energy footprint is overblown. “It also takes a lot of energy to train a human,” he argued last week, comparing ChatGPT’s electricity consumption to the energy spent raising a child over 20 years.
The numbers tell a different story. Global data center electricity consumption is on track to more than double by 2030, reaching 945 terawatt-hours annually - as much electricity as Japan uses today. Microsoft’s carbon emissions are up 23.4% since 2020. Google’s greenhouse gases have risen 48% since 2019.
Both companies still claim they’ll hit their 2030 climate goals. The math suggests otherwise.
The Energy Picture
Data centers already consume about 1-2% of global electricity. The International Energy Agency projects that share will approach 3% by 2030, with AI workloads driving most of the growth.
In the United States, the situation is more acute. Power consumption by data centers is on course to account for nearly half of the country’s electricity demand growth through 2030. In Ireland, data centers could consume 32% of national electricity by year’s end.
Goldman Sachs estimates that about $720 billion in grid spending will be needed through 2030 just to support this growth. The aging US power grid wasn’t designed for this.
Training vs. Inference
Here’s what rarely gets discussed: training AI models gets all the headlines, but inference - actually running the models - consumes far more energy over time.
One widely cited analysis finds that inference costs 15 times more than training over a model’s lifetime. GPT-4’s inference bill alone hit an estimated $2.3 billion in 2024. By 2026, inference demand is projected to outstrip training demand by a factor of 118.
Training OpenAI’s GPT-4 consumed an estimated 50 gigawatt-hours of energy. But that’s a one-time cost. Every query you send adds to the cumulative total.
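A quick sanity check using only the figures above - the 50 GWh training estimate and the 15-to-1 lifetime ratio, both estimates rather than measurements - shows how fast the running total overtakes the one-time cost:

```python
# Back-of-envelope: lifetime inference energy implied by the cited figures.
TRAINING_GWH = 50        # estimated energy to train GPT-4 (from the text)
LIFETIME_MULTIPLE = 15   # inference-to-training energy ratio over a model's lifetime

lifetime_inference_gwh = TRAINING_GWH * LIFETIME_MULTIPLE
print(f"Implied lifetime inference energy: {lifetime_inference_gwh} GWh")  # 750 GWh
```

On those numbers, GPT-4-class inference would consume around 750 gigawatt-hours over the model’s lifetime - fifteen trainings’ worth.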
Per-Query Costs
The widely cited claim that ChatGPT uses 10 times more energy than a Google search? That’s based on 2023 data. Today, both hover around 0.3 watt-hours per query - ChatGPT has gotten more efficient, while Google search has incorporated more AI.
But aggregate numbers matter more than per-query efficiency. ChatGPT handles over 912 billion queries annually. Even small per-query costs scale massively.
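To see what that means, here is a minimal sketch multiplying the two approximate figures above (0.3 watt-hours per query, 912 billion queries per year):

```python
# Aggregate annual energy from small per-query costs (approximate figures from the text).
WH_PER_QUERY = 0.3         # watt-hours per ChatGPT query
QUERIES_PER_YEAR = 912e9   # annual query volume

annual_gwh = WH_PER_QUERY * QUERIES_PER_YEAR / 1e9   # Wh -> GWh
print(f"~{annual_gwh:.0f} GWh per year")             # ~274 GWh, about 0.27 TWh
```

Roughly 274 gigawatt-hours a year from chat queries alone - on the order of 25,000 US homes’ annual usage (assuming a typical ~10,800 kWh per home), and still only one slice of AI’s total load.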
The Water Problem
Data centers need cooling. Cooling needs water.
A recent report projects that annual data center water consumption in the United States could double or quadruple by 2028, reaching 150-280 billion liters per year. That’s additional pressure on regional water systems already stressed by drought.
The location problem is worse than it sounds. Roughly two-thirds of data centers built since 2022 are in water-stressed regions - including hot, dry climates like Arizona. More than 160 new AI data centers have appeared in the past three years in areas with scarce water resources.
Per-Query Water Use
Estimates vary wildly depending on the model and data center:
- GPT-4o: approximately 3.5-5 ml per medium query
- GPT-4: 8-10 ml per medium query
- GPT-5: approximately 39 ml per medium query
- Google Gemini: about 0.26 ml per query
The often cited “500 ml per ChatGPT query” figure comes from including the water intensity of electricity generation, not just direct cooling. More recent calculations put direct data center water use much lower, at around 5 ml per query for current models.
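The same aggregate arithmetic applies to water. A sketch using the ~5 ml direct-cooling estimate and the query volume cited earlier (both approximate):

```python
# Direct cooling water aggregated over a year (approximate figures from the text).
ML_PER_QUERY = 5           # direct data center water per query, in milliliters
QUERIES_PER_YEAR = 912e9   # annual query volume

annual_liters = ML_PER_QUERY * QUERIES_PER_YEAR / 1000        # ml -> liters
print(f"~{annual_liters / 1e9:.1f} billion liters per year")  # ~4.6 billion
```

About 4.6 billion liters a year from chat queries alone - and still a small fraction of the projected 150-280 billion liters, which also counts training, non-chat workloads, and conventional data center use.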
Training consumes far more. GPT-3’s training required an estimated 700,000 liters of water. GPT-4? About 10 times that.
Altman called water concerns “completely untrue, totally insane,” claiming OpenAI’s data centers have moved away from water-heavy evaporative cooling. Independent researchers suggest reality is more complicated.
The Carbon Numbers
Training a single large language model produces significant emissions:
- GPT-3: approximately 500 metric tons of CO2
- Llama 2: 539 tonnes of CO2e from 1,273,000 kWh (a figure we sanity-check below)
- BLOOM: 25 metric tons - roughly doubling once equipment manufacturing and post-training operation are counted
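As a consistency check on the Llama 2 line, dividing the reported emissions by the reported energy gives the implied carbon intensity of the electricity used, which should land near typical grid averages of roughly 0.4 kg CO2e/kWh (a common ballpark; the true value depends on the grid):

```python
# Implied carbon intensity from the reported Llama 2 figures.
TONNES_CO2E = 539          # reported training emissions
TRAINING_KWH = 1_273_000   # reported training energy

kg_per_kwh = TONNES_CO2E * 1000 / TRAINING_KWH
print(f"{kg_per_kwh:.2f} kg CO2e/kWh")   # ~0.42 - close to typical grid averages
```

The figures hang together, which lends them some credibility.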
The bigger picture: Microsoft’s carbon footprint has risen 23.4% at the halfway point to its 2030 carbon-negative goal. Google’s emissions are up 48% since 2019.
Both companies cite AI and cloud data center expansion as the cause. Both say they’re still committed to their 2030 targets.
According to an analysis by two European think tanks, “Tech companies’ GHG emissions targets appear to have lost their meaning and relevance.”
The Nuclear Option
Tech companies are betting on nuclear power to square the circle.
Microsoft’s $1.6 billion deal with Constellation Energy to restart Three Mile Island is the most visible example. The 20-year agreement will bring back Unit 1’s 835 megawatts to power Microsoft’s AI infrastructure through the mid-2040s. The project is 80% staffed, on track for a 2027 restart.
Amazon followed with an agreement for 1.92 gigawatts from the Susquehanna nuclear plant and a $500 million investment in small modular reactor development.
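For scale, converting those contracted capacities into annual energy - assuming a ~90% capacity factor, typical for US nuclear plants (an assumption; actual output will vary):

```python
# Capacity -> annual energy, assuming a ~90% capacity factor (typical for US nuclear).
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.90

for name, mw in [("Three Mile Island Unit 1", 835), ("Susquehanna (Amazon)", 1920)]:
    twh = mw * HOURS_PER_YEAR * CAPACITY_FACTOR / 1e6   # MWh -> TWh
    print(f"{name}: ~{twh:.1f} TWh/year")
# Three Mile Island Unit 1: ~6.6 TWh/year
# Susquehanna (Amazon): ~15.1 TWh/year
```

Call it 22 TWh a year between the two deals - real supply, but a rounding error against a 945 TWh global projection.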
Nuclear provides carbon-free baseload power that solar and wind can’t match. But restarting retired reactors takes years, and new construction takes longer. AI demand is growing now.
What This Means
The uncomfortable reality: AI’s environmental footprint is growing faster than efficiency gains can offset.
Tech companies are investing hundreds of billions in data centers. They’re making nuclear deals. They’re improving model efficiency. But emissions keep rising, water demand keeps growing, and those 2030 climate goals keep looking less achievable.
Altman’s comparison of AI energy use to human energy use reveals the industry’s thinking: AI is worth the cost. The question is whether the rest of us agree - and whether we have any say in the matter.
What You Can Do
Choose efficient models. Smaller models like Claude Haiku or GPT-4o Mini use a fraction of the energy of frontier models. For routine tasks, they work fine.
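Most provider SDKs let you pick the model per request, so routing is a one-line decision. A minimal sketch with the OpenAI Python client (model names current as of this writing; substitute whatever small model your provider offers):

```python
# Route routine tasks to a smaller model; reserve frontier models for hard ones.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(prompt: str, hard: bool = False) -> str:
    model = "gpt-4o" if hard else "gpt-4o-mini"   # small model is the default
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(ask("Summarize this paragraph in one sentence: ..."))  # routine -> small model
```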
Run locally when possible. A local Llama model on your laptop draws from your existing power supply. No additional data center load. No water cooling. Check our guide to self-hosting.
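A minimal local-inference sketch using Ollama’s Python client (this assumes you have Ollama running and a model pulled, e.g. via `ollama pull llama3.2`):

```python
# Local inference via Ollama - your laptop's power supply, no data center load.
import ollama

response = ollama.chat(
    model="llama3.2",   # any locally pulled model works here
    messages=[{"role": "user", "content": "Draft a two-line out-of-office reply."}],
)
print(response["message"]["content"])
```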
Be aware of the tradeoffs. Every AI query has a cost. It’s small individually but massive in aggregate. The industry won’t change unless users start caring about efficiency alongside capability.
The energy transition is real, and tech companies are investing heavily in renewables and nuclear. But until supply catches up with demand, every AI query adds to the gap between climate rhetoric and climate reality.