Veltrix
09 / 12 · March 14, 2026

AI and the Environment

The energy bill nobody agreed to pay. What running AI at scale costs the planet.

Data centres have existed for decades. What changed in 2017 was the arrival of AI-optimised hardware — GPUs and accelerators that draw dramatically more power than conventional servers. From 2005 to 2017, global data centre electricity consumption stayed roughly flat despite massive infrastructure expansion, thanks to efficiency gains. AI ended that era.

Global — 2024
415 TWh
Consumed by all data centres — roughly 1.5% of global electricity. Comparable to the annual demand of France. IEA
USA — 2024
183 TWh
Consumed by US data centres alone — more than 4% of the country's electricity. Roughly equal to Pakistan's national demand. PEW
One single query
10×
A ChatGPT query uses roughly 10× the electricity of a Google search, per Goldman Sachs analysis. The model answers, the meter runs. GS
[Chart: Global data centre electricity demand, TWh, reported vs projected (IEA + Gartner, 2025)]

By 2030, AI-optimised servers alone are projected to consume 432 TWh — up from 93 TWh in 2025, a nearly fivefold increase in five years. GART The IEA describes AI as "the most important driver" of this growth. IEA

So what does this mean?

AI didn't just add to the trend — it broke the trend. For over a decade, efficiency gains kept data centre power flat. That era is over.

The electricity your AI query uses isn't hypothetical — it's drawing from the same grid that powers your home. And the projected 2030 demand (945 TWh) is equivalent to the entire electricity consumption of Japan. This is infrastructure-scale change happening on a consumer timeline.

Data centres don't pay for their own infrastructure upgrades in isolation — those costs are distributed across the grid and ultimately billed to consumers. The pattern is already visible in US and European electricity markets.

Northern Virginia
Data Centre Alley
The world's largest data centre market, with approximately 4,000 MW of capacity. Dominion Energy forecasts its peak demand will rise by over 75% by 2039 due to data centres — just 10% without them. Power costs have become a major issue in state elections. Virginia's data centres already consume 26% of the state's electricity. BLOOM
Dublin, Ireland
Tech Hub
Data centres consume approximately 21% of Ireland's national electricity, projected to reach 32% by 2026. In Dublin itself, the figure is already 79% of the city's electricity — a level that has prompted Ireland's grid operator to block new data centre connections in the Greater Dublin Area. LINC
PJM Region
Illinois → North Carolina
Data centres added an estimated $9.3 billion to the 2025–26 capacity market run by PJM Interconnection, the regional grid operator. Average residential bills are expected to rise by $18/month in western Maryland and $16/month in Ohio as a direct result. Wholesale electricity prices near data centre clusters have risen by up to 267% since 2020. BLOOM
USA Average
2030 Projection
Carnegie Mellon University researchers project that data centres and cryptocurrency mining combined could increase the average US electricity bill by 8% by 2030 — and by more than 25% in the highest-demand markets of central and northern Virginia. US data centre consumption is projected to grow 133% to 426 TWh by 2030. CMU
So what does this mean?

You're already paying for AI infrastructure — whether you use AI or not. Data centre power costs are socialised across the grid.

If you live in Virginia, Ireland, or near any major data centre cluster, your electricity bill has already risen. By 2030, the average US household could see an 8% increase directly attributable to data centres. In hotspots, it's 25%. This isn't a future problem — it's a current one.

AI's electricity story has received enormous attention. Its water story has not. Data centres use water the same way power plants do — evaporative cooling that removes heat from racks of processors. The water doesn't come back in usable form. It evaporates.

5M
gallons of water consumed per day by a single large AI-focused data centre. Equivalent to the daily water needs of a town of 50,000 people. EESI
Sources: Brookings Institution, Lincoln Institute, EESI — 2025
500M
litres of drinking water required annually to cool one data centre in Aragon, Spain — Europe's largest. Amazon applied to increase its permit by 48% in December 2024. ETHGEO
Source: EthicalGEO / France 24, 2025 — Region is in drought
163B
gallons of water consumed by US data centres annually (2021 baseline). US AI and data centre water use projected to double or quadruple by 2028. EESI
Sources: EESI, NetZeroInsights, DOE — 2025
160+
new AI data centres built across the US in the past three years in areas already facing water scarcity. Arizona has restricted home construction to conserve groundwater — while approving data centres. ELI
Source: Environmental Law Institute — 2025

"All water is local. What goes unacknowledged, from a natural systems perspective, is that the rules and the norms and the prices are set based on a previous reality."

— Peter Colohan, Director of Partnerships, Lincoln Institute of Land Policy, 2025

Google spent $100,000 in legal costs fighting a public records request from a local Oregon newspaper asking how much water its data centres were using. When the figure was finally disclosed, the answer was 355 million gallons — a quarter of the annual water use of the city of The Dalles in 2021. Fewer than a third of data centre operators track their water consumption at all. EESI

So what does this mean?

Every AI query has a water cost — and it's drawn from the same supply you drink, cook with, and use to grow food.

This isn't a problem that affects "other people." 160+ new data centres have been built in water-scarce areas in the US alone. If your region is already managing drought conditions, an AI data centre is now competing for the same water. And most operators don't even disclose how much they use.

Every major tech company made ambitious climate pledges between 2019 and 2021. Then the AI buildout began. The gap between what was promised and what the numbers show is now documented in their own sustainability reports.

Microsoft
Pledge: Carbon negative by 2030; remove all historical emissions by 2050.
What the numbers say: Emissions grew 23–40% since the 2020 baseline. Location-based Scope 2 emissions more than doubled (4.3M → 10M metric tons CO₂). Water use up 87% since 2020. MSFT
Status: Off Track

Google
Pledge: Net-zero emissions by 2030. Previously held "carbon neutral" status since 2007.
What the numbers say: Announced a 48% GHG increase in 2024. Quietly removed net-zero pledge language from its website in September 2025. Data centre electricity up 27% in a single year. GOOG
Status: Goal Removed

Amazon
Pledge: Net-zero carbon by 2040 (10 years beyond the Paris Agreement timeline).
What the numbers say: Continuing major data centre expansion. In Spain, applied to increase water permits by 48%, citing climate-driven heat waves as justification for needing more water. ETHGEO
Status: Under Pressure

Meta
Pledge: Net-zero across the value chain by 2030. Water-positive by 2030.
What the numbers say: The Hyperion data centre in Louisiana will draw more than twice the power of New Orleans when complete. A Wyoming data centre will use more electricity than all homes in the state combined. LINC
Status: Scale vs. Pledge
Key distinction: Companies use market-based accounting (buying renewable energy certificates) to claim "green" electricity, while location-based emissions — what's actually happening at the grid level — tell a different story. Microsoft's location-based Scope 2 emissions more than doubled in four years. The grid runs on what the grid runs on. A certificate doesn't change the physical electrons.
So what does this mean?

The companies building AI infrastructure have, by their own reporting, blown past their own climate targets. Google has removed its net-zero pledge entirely.

When you see a tech company claim "100% renewable energy," ask whether that's market-based or location-based accounting. The difference matters. Market-based accounting lets you buy certificates while your data centre runs on gas. Location-based accounting tells you what's actually powering the racks.
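The difference between the two accounting methods can be made concrete with a toy calculation. All the numbers below are invented for illustration: a hypothetical load, grid intensity, and certificate purchase, not any company's actual figures.

```python
# Toy comparison of location-based vs market-based Scope 2 accounting.
# All inputs are hypothetical, chosen only to show how the two methods diverge.

def location_based(kwh, grid_intensity_kg_per_kwh):
    """Emissions implied by the physical grid mix where the load actually runs."""
    return kwh * grid_intensity_kg_per_kwh

def market_based(kwh, grid_intensity_kg_per_kwh, rec_covered_kwh):
    """Emissions after subtracting kWh 'covered' by renewable energy certificates."""
    uncovered = max(kwh - rec_covered_kwh, 0)
    return uncovered * grid_intensity_kg_per_kwh

kwh = 10_000_000   # hypothetical annual data-centre load
intensity = 0.4    # kg CO2 per kWh on a gas-heavy grid
recs = 10_000_000  # certificates bought to match 100% of the load

print(location_based(kwh, intensity))       # kg CO2 physically emitted
print(market_based(kwh, intensity, recs))   # kg CO2 claimed on paper: zero
```

Same electricity, same grid: 4 million kg of CO₂ at the grid level, zero on paper once the certificates are counted. That gap is the "key distinction" above.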

From 2005 to 2017, data centre efficiency improvements meant total electricity consumption stayed flat despite rapid expansion. AI broke this dynamic — not because efficiency stopped improving, but because demand grew exponentially faster than efficiency gains.

Jevons Paradox (1865)

When technology makes a resource more efficient to use, total consumption of that resource tends to increase — because lower cost drives higher demand. Applied to AI: DeepSeek's release in January 2025 showed models could be trained for 3% of the cost of GPT-4. The result was not a reduction in AI compute. It was a surge in AI use. More efficient AI means cheaper AI. Cheaper AI means more queries. More queries means more power. The paradox holds.
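The dynamic is easy to reproduce with a toy model. The growth and efficiency rates below are invented purely to illustrate the mechanism: per-query energy falls every year, yet total consumption still rises because demand grows faster.

```python
# Illustrative Jevons-paradox model. Efficiency improves every year, but
# query volume grows faster, so total energy consumption still increases.
# All rates and baselines are hypothetical, chosen only to show the dynamic.

def total_energy(years, base_queries, base_wh_per_query,
                 demand_growth, efficiency_gain):
    """Annual energy use (Wh) as per-query cost falls but volume grows."""
    series = []
    for t in range(years):
        queries = base_queries * (1 + demand_growth) ** t
        wh_per_query = base_wh_per_query * (1 - efficiency_gain) ** t
        series.append(queries * wh_per_query)
    return series

# 30% annual efficiency gain, outpaced by 80% annual demand growth.
usage = total_energy(years=6, base_queries=1e9, base_wh_per_query=3.0,
                     demand_growth=0.80, efficiency_gain=0.30)
print(f"Year 0: {usage[0]:.2e} Wh, Year 5: {usage[-1]:.2e} Wh")
```

Each year the per-query cost drops to 70% of the year before, but volume grows to 180%, so the product rises by a factor of 1.26 annually. Efficiency improves; consumption climbs anyway.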

Modern AI chips are 99% more efficient than 2008 models — but demand grew exponentially faster than efficiency gains. Efficiency is necessary but not sufficient. IEA

What could actually help

Fix 01
Carbon-Aware Computing
Schedule AI training and non-time-critical inference workloads for when the grid runs on cleaner electricity. Large model training runs can last weeks — when they happen matters as much as how efficient the chips are.
STATUS: Early deployment — not yet standard practice
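The core of carbon-aware scheduling is simple: given a carbon-intensity forecast for the grid, shift a deferrable job into the cleanest window. The sketch below uses a made-up 24-hour forecast; a real deployment would pull forecasts from a grid operator or a carbon-data API.

```python
# Sketch of carbon-aware scheduling: choose the contiguous window with the
# lowest average grid carbon intensity for a deferrable workload.
# The forecast values are invented; real systems would fetch a forecast
# from a grid operator or carbon-data provider.

def best_window(forecast_g_per_kwh, hours_needed):
    """Return (start_hour, avg_intensity) of the cleanest contiguous window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast_g_per_kwh) - hours_needed + 1):
        window = forecast_g_per_kwh[start:start + hours_needed]
        avg = sum(window) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Hypothetical 24-hour forecast (gCO2/kWh): cleaner around midday
# when solar output peaks, dirtier overnight.
forecast = [520, 510, 500, 490, 480, 450, 400, 340, 280, 230, 200, 190,
            185, 195, 240, 300, 370, 430, 470, 500, 515, 525, 530, 535]
start, avg = best_window(forecast, hours_needed=4)
print(f"Run the 4-hour job starting at hour {start} (~{avg:.0f} gCO2/kWh)")
```

A multi-week training run gives the scheduler even more slack: checkpoint-and-resume lets the job pause through dirty hours entirely rather than merely picking a start time.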
Fix 02
Liquid & Immersion Cooling
Direct-to-chip liquid cooling and immersion cooling can eliminate evaporative water use entirely and reduce cooling energy by 30–40%. Nvidia's 2027 Rubin Ultra racks are designed around liquid cooling as default.
STATUS: Technically proven — adoption accelerating
Fix 03
Mandatory Disclosure
California is considering legislation requiring data centre operators to disclose energy and water use annually. The EU's Energy Efficiency Directive is moving toward mandatory reporting. A coalition of 230 environmental groups called for a moratorium on new data centre construction in December 2025.
STATUS: Legislation pending (CA), EU EED in force
So what does this mean?

Don't count on efficiency alone to solve this. History shows that when AI gets cheaper, people use more of it — not less.

The solutions are structural: carbon-aware scheduling, liquid cooling, and mandatory disclosure. None of these happen automatically. They require regulatory pressure and consumer demand. If you're building with AI, your model selection is now both a cost decision and an environmental one.

945 TWh
Projected global data centre electricity demand by 2030. Equivalent to Japan's entire annual consumption. IEA
87%
Increase in Microsoft's water consumption since 2020 — driven almost entirely by AI data centre expansion. MSFT
267%
Rise in wholesale electricity prices near US data centre clusters since 2020. BLOOM
5 things you can do this week
to reduce your AI footprint.
1.

Subscribe to Veltrix Collective to stay informed. AI's environmental impact is moving fast — 945 TWh by 2030, water use doubling by 2028. We track the data every week so you don't have to piece it together from corporate sustainability reports.

2.

Choose the smallest model that does the job. Not every task needs a frontier model. Use Claude Haiku or GPT-4o mini for quick queries — they use a fraction of the compute of Claude Opus or GPT-4. Match the model to the task. Smaller models mean less energy per query.

3.

Run local models where you can. Tools like Ollama let you run models like Llama 3 or Mistral locally on your machine. Local inference uses roughly a thousandth of the energy of a frontier API call. For repetitive tasks, batch them with n8n or Make instead of running live queries.

4.

Ask your cloud provider about their energy sourcing. Check whether your AI vendor reports location-based or market-based emissions. If they only report market-based figures, they may be buying certificates while running on gas. Demand transparency — it's your Scope 3 footprint too.

5.

Support mandatory disclosure legislation. California's data centre disclosure bill and the EU's Energy Efficiency Directive are the two most important regulatory pushes right now. Without mandatory reporting, there's no baseline — and without a baseline, improvement can't be measured. Write to your representative if you're in a data-centre-heavy region (Virginia, Arizona, Ireland).
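Tips 2 and 3 amount to a routing decision: send each task to the smallest model that can handle it. Here is a minimal sketch of that idea. The tiers, model names, and word-count threshold are assumptions for illustration, not vendor guidance; substitute whatever models and limits fit your stack.

```python
# Hypothetical "smallest model that does the job" router (tips 2 and 3).
# The tier names and the 50-word threshold are illustrative assumptions.

def pick_model(task: str, needs_reasoning: bool) -> str:
    """Route a task to the cheapest tier that can plausibly handle it."""
    words = len(task.split())
    if not needs_reasoning and words < 50:
        return "local-llama3"   # short, simple task: run locally via Ollama
    if not needs_reasoning:
        return "claude-haiku"   # longer but simple: small hosted model
    return "claude-opus"        # genuinely hard task: frontier model

print(pick_model("Summarise this paragraph in one line.", needs_reasoning=False))
```

Even a crude router like this shifts the bulk of everyday queries off frontier models, which is where most of the per-query energy goes.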

Data centres consumed 415 TWh in 2024. By 2030, demand is projected to reach 945 TWh. Efficiency alone won't close that gap. Informed choices will.

Sources

IEA
IEA, "Energy and AI" Special Report, April 2025. 415 TWh global data centre consumption (2024); 945 TWh projection (2030); AI as primary growth driver.
GART
Gartner, "Data Center Electricity Demand to Double by 2030," November 2025. 448 TWh (2025) to 980 TWh (2030) projection; AI-optimised servers 93 → 432 TWh.
PEW
Pew Research Center, US Data Centre Energy Use, October 2025. 183 TWh US consumption; Pakistan equivalence comparison.
GS
Goldman Sachs, Data Centre Power Demand Report, 2024. ChatGPT query uses approximately 10× the electricity of a Google search.
BLOOM
Bloomberg, "AI Data Centers Are Sending Your Power Bill Soaring," September 2025. 267% wholesale price rise near data centre clusters; Virginia capacity data.
CMU
Carnegie Mellon University, US electricity bill impact study, July 2025. 8% average bill increase by 2030; 25% in Virginia hotspots.
EESI
EESI, "Data Centers and Water Consumption," 2024. 5M gallons/day per facility; 519 ml per prompt (UC Riverside); 163B gallons/year US baseline.
LINC
Lincoln Institute of Land Policy, "Land and Water Impacts of the AI Boom," October 2025. Meta Hyperion data; Dublin 79% electricity consumption figure.
ETHGEO
EthicalGEO, "The Cloud is Drying Our Rivers," July 2025. Aragon, Spain, 500M litres/year; Amazon 48% water permit increase.
MSFT
Microsoft Environmental Sustainability Report, 2024. Emissions +23–40%; water +87%; location-based Scope 2 emissions 4.3M → 10M metric tons CO₂.
GOOG
Tom's Hardware / Policyreview.info, September 2025. Google 48% GHG increase; net-zero pledge removal; data centre electricity +27% year-on-year.
ELI
Environmental Law Institute, 2025. 160+ new data centres in water-scarce US areas; Arizona groundwater policy analysis.
Veltrix Collective
AI's real costs,
told with data.

Energy, water, jobs, regulation — the forces reshaping AI land in your inbox every Tuesday. Data-driven analysis, no hype, no corporate PR.


Methodology: This analysis synthesises data from the sources listed above, published between 2024 and 2025. Energy consumption figures use IEA and Gartner estimates as the primary baseline. Water consumption data is drawn from EESI, Brookings Institution, and Lincoln Institute reports. Corporate emissions data is sourced directly from each company's most recent sustainability report. Projections represent median estimates from the cited sources and carry inherent uncertainty. All figures use the most conservative available estimates where sources diverge. This content is for informational purposes only and does not constitute environmental, legal, or investment advice.

09 / 12 — AI Data Series · Veltrix Collective · March 2026

Written by Luke Madden, founder of Veltrix Collective. Data synthesis and analysis by Vel.