The Myth of Green AI

R. Castellanos / January 20, 2026

The technology industry has constructed a compelling narrative: artificial intelligence will solve the climate crisis. Smart grids, optimized logistics, precision agriculture — the story goes that AI will reduce emissions far beyond what its own operations consume. This paper argues that this framing is not merely optimistic but structurally misleading.

The Accounting Problem

When organizations claim their AI initiatives are “carbon neutral” or even “carbon negative,” they rely on a particular kind of accounting — one that credits the potential savings of AI-optimized systems while externalizing the actual costs of training and inference.

A single large language model training run can emit as much carbon as five cars over their entire lifetimes.¹ This figure, striking as it is, understates the problem. It excludes:

  • The embodied energy of GPU manufacturing
  • The water consumption of cooling systems
  • The e-waste generated by rapid hardware obsolescence
  • The rebound effects of efficiency gains
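The gap between selective and full-lifecycle accounting can be made concrete with a little arithmetic. The sketch below is purely illustrative — every figure in it is a hypothetical placeholder, not a measurement — but it shows how an operational-only number can understate a lifecycle total once cooling overhead, embodied hardware emissions, and e-waste are included:

```python
# Illustrative sketch of selective (operational-only) vs. full-lifecycle
# carbon accounting. All figures are hypothetical placeholders.

def operational_only(training_kwh: float, grid_kg_co2_per_kwh: float) -> float:
    """The 'selective' number: electricity for the training run alone."""
    return training_kwh * grid_kg_co2_per_kwh

def full_lifecycle(training_kwh: float,
                   grid_kg_co2_per_kwh: float,
                   embodied_hardware_kg: float,
                   cooling_overhead_pue: float,
                   e_waste_kg: float) -> float:
    """Adds cooling overhead (PUE multiplier), embodied hardware
    emissions, and e-waste to the operational figure."""
    operational = training_kwh * cooling_overhead_pue * grid_kg_co2_per_kwh
    return operational + embodied_hardware_kg + e_waste_kg

# Hypothetical run: 1,000,000 kWh on a grid emitting 0.4 kg CO2/kWh.
selective = operational_only(1_000_000, 0.4)            # 400,000 kg
lifecycle = full_lifecycle(1_000_000, 0.4,
                           embodied_hardware_kg=150_000,
                           cooling_overhead_pue=1.5,
                           e_waste_kg=20_000)           # 770,000 kg
print(f"selective: {selective:,.0f} kg; full lifecycle: {lifecycle:,.0f} kg")
```

Under these made-up inputs the lifecycle total is nearly double the selective one — and the sketch still omits water consumption, which resists expression in kilograms of CO2 at all.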

Jevons Paradox, Revisited

The history of energy efficiency offers a sobering lesson, first noted by William Stanley Jevons in 1865: when we make a process more efficient, we tend to use more of it, not less. Steam engines became more efficient, and coal consumption increased. Cars became more fuel-efficient, and people drove more miles.

AI follows this pattern precisely. As inference becomes cheaper, we deploy it in more contexts — recommendation engines, content generation, surveillance, advertising optimization. Each application is individually “efficient.” Collectively, they represent an enormous and growing energy demand.

The Infrastructure Question

Every AI model requires data centers. Data centers require electricity, water, land, and rare earth minerals. The current trajectory of AI development assumes these resources are essentially unlimited — or at least that their limits are someone else’s problem.

This assumption deserves scrutiny. The regions where data centers are concentrated often face water stress. The electrical grids they depend on are frequently powered by fossil fuels. The minerals in their hardware are extracted under conditions that would be considered unacceptable if they were visible to the end user.

Toward Honest Assessment

None of this means AI is without value or that research should cease. It means that the environmental costs of AI must be assessed honestly, using full lifecycle analysis rather than selective accounting.

A genuinely sustainable approach to AI would begin with a question the industry rarely asks: do we need this model at all?

Footnotes

  1. Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and Policy Considerations for Deep Learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.