The Bill Nobody Agreed To Pay
Training GPT-4 consumed enough electricity to power San Francisco for three days. The next generation of AI infrastructure will draw more power than some countries. Your electricity bill is already going up. Nobody voted for this.
01 — Understanding the scale
Data centres have existed for decades. What changed in 2017 was the arrival of AI-optimised hardware — GPUs and accelerators that draw dramatically more power than conventional servers. From 2005 to 2017, global data centre electricity consumption stayed roughly flat despite massive infrastructure expansion, thanks to efficiency gains. AI ended that era.
By 2030, AI-optimised servers alone are projected to consume 432 TWh — up from 93 TWh in 2025, a nearly fivefold increase in five years. GART The IEA describes AI as "the most important driver" of this growth. IEA
AI didn't just add to the trend — it broke the trend. For over a decade, efficiency gains kept data centre power flat. That era is over.
The electricity your AI query uses isn't hypothetical — it's drawing from the same grid that powers your home. And the projected 2030 demand (945 TWh) is equivalent to the entire electricity consumption of Japan. This is infrastructure-scale change happening on a consumer timeline.
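For readers who want to sanity-check that trajectory, here is the implied growth rate, assuming simple compound growth between the two IEA figures:

```python
# Implied growth in AI-optimised server demand, from the IEA figures above:
# 93 TWh in 2025 -> 432 TWh in 2030.
start_twh, end_twh, years = 93, 432, 5

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Overall increase: {end_twh / start_twh:.1f}x")         # ~4.6x
print(f"Implied compound annual growth: {cagr:.0%} per year")  # ~36%
```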
02 — Your electricity bill is going up
Data centres don't pay for their own infrastructure upgrades in isolation — those costs are distributed across the grid and ultimately billed to consumers. The pattern is already visible in US and European electricity markets.
You're already paying for AI infrastructure — whether you use AI or not. Data centre power costs are socialised across the grid.
If you live in Virginia, Ireland, or near any major data centre cluster, your electricity bill has already risen. By 2030, the average US household could see an 8% increase directly attributable to data centres. In hotspots, it's 25%. This isn't a future problem — it's a current one.
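To put those percentages in dollar terms, here is a minimal sketch assuming a hypothetical $140/month baseline bill (the baseline is our illustrative assumption, not a figure from the sources above):

```python
# Illustrative only: what an 8% average / 25% hotspot increase means
# on a hypothetical $140/month household electricity bill.
baseline_monthly = 140.00  # assumed baseline, not a figure from this article

for label, increase in [("US average", 0.08), ("data-centre hotspot", 0.25)]:
    extra_per_year = baseline_monthly * increase * 12
    print(f"{label}: +${extra_per_year:.0f}/year")
```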
03 — The water problem nobody's talking about
AI's electricity story has received enormous attention. Its water story has not. Data centres use water the same way power plants do — evaporative cooling that removes heat from racks of processors. The water doesn't come back in usable form. It evaporates.
"All water is local. What goes unacknowledged, from a natural systems perspective, is that the rules and the norms and the prices are set based on a previous reality."
— Peter Colohan, Director of Partnerships, Lincoln Institute of Land Policy, 2025

Google paid $100,000 in legal costs fighting a public records request from a local Oregon newspaper asking how much water its data centres were using. When disclosed, the answer was 355 million gallons — a quarter of the city of The Dalles' annual water use in 2021. Less than a third of data centre operators track their water consumption at all. EESI
Every AI query has a water cost — and it's drawn from the same supply you drink, cook with, and use to grow food.
This isn't a problem that affects "other people." 160+ new data centres have been built in water-scarce areas in the US alone. If your region is already managing drought conditions, an AI data centre is now competing for the same water. And most operators don't even disclose how much they use.
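The Dalles disclosure implies the city's total, which is worth making explicit. A quick back-of-envelope from the numbers above:

```python
# Back-of-envelope from the disclosure above: 355 million gallons was
# a quarter of The Dalles' 2021 water use, so the city total follows.
google_gallons = 355e6
city_annual_gallons = google_gallons / 0.25
litres = google_gallons * 3.785  # US gallons to litres

print(f"City annual use: ~{city_annual_gallons / 1e9:.2f} billion gallons")
print(f"Google's share:  ~{litres / 1e9:.2f} billion litres")
```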
04 — The pledge vs. the reality
Every major tech company made ambitious climate pledges between 2019 and 2021. Then the AI buildout began. The gap between what was promised and what the numbers show is now documented in their own sustainability reports.
| Company | Pledge | What the numbers say | Status |
|---|---|---|---|
| Microsoft | Carbon negative by 2030. Remove all historical emissions by 2050. | Emissions grew 23–40% since 2020 baseline. Location-based Scope 2 emissions more than doubled (4.3M → 10M metric tons CO₂). Water use up 87% since 2020. MSFT | Off Track |
| Google | Net-zero emissions by 2030. Previously held "carbon neutral" status since 2007. | Announced 48% GHG increase in 2024. Quietly removed net-zero pledge language from website in September 2025. Data centre electricity up 27% in a single year. GOOG | Goal Removed |
| Amazon | Net-zero carbon by 2040 (10 years longer than Paris Agreement). | Continuing major data centre expansion. In Spain, applied to increase water permits by 48% citing climate-driven heat waves as justification for needing more water. ETHGEO | Under Pressure |
| Meta | Net-zero across value chain by 2030. Water-positive by 2030. | Hyperion data centre in Louisiana will draw more than twice the power of New Orleans when complete. Wyoming data centre will use more electricity than all homes in the state combined. LINC | Scale vs. Pledge |
The companies building AI infrastructure have, by their own reporting, blown past their own climate targets. Google has removed its net-zero pledge entirely.
When you see a tech company claim "100% renewable energy," ask whether that's market-based or location-based accounting. The difference matters. Market-based accounting lets you buy certificates while your data centre runs on gas. Location-based accounting tells you what's actually powering the racks.
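To make the difference concrete, here is a minimal sketch of both accounting methods. Every number in it (grid intensity, residual mix, certificate volume) is an illustrative assumption, not data from any company's report:

```python
# Location-based vs market-based Scope 2 accounting, illustrative numbers only.
consumption_mwh = 1_000_000      # hypothetical data centre load
grid_intensity = 0.45            # tCO2/MWh where the racks actually run (assumed)
residual_mix_intensity = 0.50    # tCO2/MWh for un-contracted supply (assumed)
certificates_mwh = 950_000       # renewable certificates purchased (assumed)

# Location-based: what the local grid actually emits to serve the load.
location_based = consumption_mwh * grid_intensity

# Market-based: certificate-covered MWh count as zero; the rest uses residual mix.
market_based = max(consumption_mwh - certificates_mwh, 0) * residual_mix_intensity

print(f"Location-based: {location_based:,.0f} tCO2")   # 450,000 tCO2
print(f"Market-based:   {market_based:,.0f} tCO2")     # 25,000 tCO2
```

Same facility, same electrons, two very different headline numbers.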
05 — The efficiency trap
From 2005 to 2017, data centre efficiency improvements meant total electricity consumption stayed flat despite rapid expansion. AI broke this dynamic — not because efficiency stopped improving, but because demand grew exponentially faster than efficiency gains.
This is the Jevons paradox: when technology makes a resource more efficient to use, total consumption of that resource tends to increase, because lower cost drives higher demand. Applied to AI: DeepSeek's release in January 2025 showed models could be trained for 3% of the cost of GPT-4. The result was not a reduction in AI compute. It was a surge in AI use. More efficient AI means cheaper AI. Cheaper AI means more queries. More queries means more power. The paradox holds.
Modern AI chips are 99% more efficient than 2008 models — but demand grew exponentially faster than efficiency gains. Efficiency is necessary but not sufficient. IEA
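A toy model makes the trap explicit. The 10x efficiency gain and 50x demand growth below are illustrative assumptions, not measurements:

```python
# Toy rebound-effect model: energy per query falls, but demand grows
# faster, so total consumption still rises. Numbers are illustrative.
energy_per_query_before = 1.0   # arbitrary units
efficiency_gain = 10            # queries get 10x cheaper to serve (assumed)
demand_growth = 50              # usage grows 50x because it's cheaper (assumed)

energy_per_query_after = energy_per_query_before / efficiency_gain
total_before = 1 * energy_per_query_before
total_after = demand_growth * energy_per_query_after

print(f"Total consumption change: {total_after / total_before:.0f}x")  # 5x
```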
What could actually help
Don't count on efficiency alone to solve this. History shows that when AI gets cheaper, people use more of it — not less.
The solutions are structural: carbon-aware scheduling, liquid cooling, and mandatory disclosure. None of these happen automatically. They require regulatory pressure and consumer demand. If you're building with AI, your model selection is now both a cost decision and an environmental one.
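As a sketch of what carbon-aware scheduling looks like in practice: hold deferrable batch work until grid carbon intensity drops below a threshold. The `fetch_carbon_intensity` function below is a stub standing in for whatever regional signal you can get, and the threshold is an assumption:

```python
import time

THRESHOLD_G_PER_KWH = 200   # assumed cutoff: run batch work below this intensity
POLL_SECONDS = 15 * 60

def fetch_carbon_intensity(region: str) -> float:
    """Stub: return current grid carbon intensity in gCO2/kWh.
    Replace with a real regional signal (grid operator feed, commercial API)."""
    return 180.0  # stubbed value so the sketch runs end to end

def run_when_grid_is_clean(job, region: str = "us-east"):
    # Poll the intensity signal and hold deferrable work until the grid is clean.
    while True:
        intensity = fetch_carbon_intensity(region)
        if intensity <= THRESHOLD_G_PER_KWH:
            return job()
        print(f"Grid at {intensity:.0f} gCO2/kWh, deferring {POLL_SECONDS}s...")
        time.sleep(POLL_SECONDS)

print(run_when_grid_is_clean(lambda: "batch job done"))
```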
Here's what you can do to reduce your AI footprint.
Subscribe to Veltrix Collective to stay informed. AI's environmental impact is moving fast — 945 TWh by 2030, water use doubling by 2028. We track the data every week so you don't have to piece it together from corporate sustainability reports.
Choose the smallest model that does the job. Not every task needs a frontier model. Use Claude Haiku or GPT-4o mini for quick queries — they use a fraction of the compute of Claude Opus or GPT-4. Match the model to the task. Smaller models mean less energy per query.
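One way to encode that habit is a crude router that sends short, simple prompts to a small model and reserves the frontier model for the rest. The heuristic and model names below are placeholders, not vendor guidance:

```python
# Crude task-to-model router: heuristic and model names are placeholders.
SMALL_MODEL = "small-fast-model"      # e.g. a Haiku / mini-class model
LARGE_MODEL = "large-frontier-model"  # reserve for genuinely hard tasks

def pick_model(prompt: str) -> str:
    # Naive proxy for task complexity: length plus a few "hard task" cues.
    hard_cues = ("prove", "refactor", "multi-step", "analyse", "plan")
    looks_hard = len(prompt) > 2000 or any(cue in prompt.lower() for cue in hard_cues)
    return LARGE_MODEL if looks_hard else SMALL_MODEL

print(pick_model("Summarise this paragraph in one line."))  # -> small model
```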
Run local models where you can. Tools like Ollama let you run models like Llama 3 or Mistral locally on your machine. A local inference uses roughly 1,000× less energy than a frontier API call. For repetitive tasks, batch them with n8n or Make instead of running live queries.
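As a minimal example, Ollama exposes a local HTTP API on port 11434 by default; assuming you've already pulled a model (e.g. `ollama pull llama3`), a single non-streaming call looks like this:

```python
import requests

# Ollama serves a local HTTP API on port 11434 by default.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # any model you've pulled locally
        "prompt": "Summarise: data centres and water use.",
        "stream": False,     # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```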
Ask your cloud provider about their energy sourcing. Check whether your AI vendor reports location-based or market-based emissions. If they only report market-based figures, they may be buying certificates while running on gas. Demand transparency — it's your Scope 3 footprint too.
Support mandatory disclosure legislation. California's data centre disclosure bill and the EU's Energy Efficiency Directive are the two most important regulatory pushes right now. Without mandatory reporting, there's no baseline — and without a baseline, improvement can't be measured. Write to your representative if you're in a data-centre-heavy region (Virginia, Arizona, Ireland).
06 — Don't miss the next bill
Energy, water, jobs, regulation: the forces reshaping AI, told with data, in your inbox every Tuesday. Data-driven analysis, no hype, no corporate PR.