AI's Missing Piece: Why Environmental Sustainability Must Define Trustworthy AI

Artificial intelligence is often treated as weightless and immaterial, but it runs on electricity, minerals, water, and global supply chains, making environmental sustainability a critical missing pillar in how we define trustworthy AI systems. For years, discussions about trustworthy AI have focused on human-centered concerns like privacy, bias, transparency, and fairness. But as AI becomes embedded across all sectors of society, a fundamental question has been overlooked: what is the environmental cost of machines that think for us?

What Is the Real Environmental Footprint of AI Systems?

Training a single large language model (LLM), a type of AI system that powers chatbots and search tools, can consume as much electricity as hundreds of households use in a year. But training is only the beginning. The real environmental impact comes from inference: the billions of daily prompts, searches, translations, predictions, and recommendations that occur after deployment. According to the International Energy Agency, data centers already account for a significant share of global electricity demand, and this share is expected to rise sharply toward 2030 as AI adoption accelerates.
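To make that comparison concrete, a back-of-envelope calculation is enough. The two figures below are illustrative assumptions in the ballpark of commonly cited estimates, not measurements of any particular model or country:

```python
# Back-of-envelope: training energy expressed in "household-years"
# of electricity. Both inputs are illustrative assumptions.

TRAINING_ENERGY_MWH = 1_300      # assumed one-off training run, in MWh
HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed average annual household use, in kWh

# Convert MWh -> kWh, then divide by one household's annual consumption.
household_years = TRAINING_ENERGY_MWH * 1_000 / HOUSEHOLD_KWH_PER_YEAR
print(f"One training run ~ {household_years:.0f} household-years of electricity")
```

With these assumed inputs the single training run lands at roughly 120 household-years, and the number scales linearly: double the training energy, or halve the household figure, and it doubles.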

The environmental cost of AI extends far beyond energy consumption. Cooling data centers requires enormous amounts of water. In regions already experiencing drought and water stress, the expansion of AI infrastructure creates new pressure on ecosystems and communities. Additionally, the hardware needed to run AI systems depends on critical minerals extracted from vulnerable natural environments. This extraction often causes biodiversity loss, pollution, and serious human rights challenges in global supply chains. Before an AI system ever answers a question, it may already have accumulated a significant ecological debt.

Could AI Actually Delay the Green Transition Instead of Accelerating It?

One of the most uncomfortable questions facing the tech industry is whether AI could delay the green transition rather than accelerate it. Major oil and gas companies are among the most sophisticated adopters of AI. Machine learning helps them analyze seismic data, identify new reserves, and optimize extraction from mature wells. By lowering costs and improving efficiency, AI can make fossil fuels more competitive with renewables. At the same time, the explosive growth of data centers requires a stable, 24/7 electricity supply. Without sufficient production and storage for renewables, this demand risks prolonging reliance on natural gas, or even coal, to ensure necessary baseload power.

"AI can either support squeezing the last drops of carbon from the earth or help phase out fossil fuels faster. The choice is political and ethical," explained Morten Dæhlen, co-director of The Norwegian Centre of Trustworthy AI and professor in computational mathematics at the University of Oslo.


How to Build More Sustainable AI Systems

Experts argue that the dominant trend in AI, where "bigger is better," must change. Several approaches show promise for dramatically reducing the environmental impact of AI:

  • Frugal AI Models: Moving from large models to smaller, specialized models represents a shift from brute-force computing to knowledge-informed computational efficiency. Running AI on local devices ("edge intelligence") rather than at large, centralized data centers reduces both energy demand and latency, the time it takes for systems to respond.
  • Open-Source Development: International collaboration and open innovation, including the development of open-source software, can prevent redundant training of similar large models, potentially saving enormous amounts of compute, energy, and raw materials.
  • Emerging Hardware Technologies: Neuromorphic computing, inspired by the human brain, optical computing using photons rather than electrons for data transport, and quantum computing promise significant energy efficiency gains. While still developing, these technologies signal that AI does not have to be permanently tied to today's energy-intensive data and computing architectures.
  • Circular Data Centers: Data centers can be redesigned as integrated components of local energy systems rather than isolated energy sinks. Waste heat can feed district heating networks, and backup power can rely on green hydrogen instead of diesel generators.
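The case for frugal models can be sketched with simple arithmetic. In the example below, the query volume and per-query energy costs are hypothetical assumptions chosen only to show how a tenfold efficiency gain compounds at scale:

```python
# Illustrative comparison: annual inference energy for a large
# general-purpose model vs. a small specialized ("frugal") model.
# All per-query figures are hypothetical, not benchmarks.

QUERIES_PER_DAY = 1_000_000_000   # assumed daily query volume
LARGE_MODEL_WH_PER_QUERY = 3.0    # assumed energy per query, in Wh
SMALL_MODEL_WH_PER_QUERY = 0.3    # assumed: ten times more frugal

def annual_energy_gwh(wh_per_query: float, queries_per_day: int) -> float:
    """Annual inference energy in GWh for a given per-query cost."""
    return wh_per_query * queries_per_day * 365 / 1e9  # Wh -> GWh

large = annual_energy_gwh(LARGE_MODEL_WH_PER_QUERY, QUERIES_PER_DAY)
small = annual_energy_gwh(SMALL_MODEL_WH_PER_QUERY, QUERIES_PER_DAY)
print(f"Large model: {large:.0f} GWh/yr, frugal model: {small:.0f} GWh/yr")
```

Under these assumptions the frugal model saves close to a terawatt-hour per year at the stated volume, which is why per-query efficiency, not just training efficiency, dominates the long-run footprint.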

Technological innovation alone, however, will not automatically make AI sustainable. Governance is essential. Environmental impact must become a formal part of AI accountability, with Environmental, Social and Governance (ESG) frameworks including mandatory reporting of energy use, water consumption, and material footprint. Standardized metrics, labeling schemes, and easy-to-use key performance indicators can empower users to choose greener AI providers. Progressive taxation of digital resource consumption is one proposed approach: it keeps basic access and use affordable while discouraging wasteful overuse.
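What might such a standardized key performance indicator look like in practice? The sketch below is a hypothetical example of reporting energy, water, and carbon per 1,000 requests; the field names and figures are assumptions for illustration, not part of any existing ESG standard:

```python
# Hypothetical sketch of per-service sustainability KPIs of the kind
# an ESG report might disclose. All names and values are illustrative.

from dataclasses import dataclass

@dataclass
class SustainabilityKPI:
    total_requests: int
    energy_kwh: float         # metered energy attributed to the service
    water_litres: float       # cooling water attributed to the service
    grid_gco2_per_kwh: float  # carbon intensity of the local grid

    def per_1000_requests(self) -> dict:
        """Normalize the footprint to a per-1,000-requests basis."""
        scale = 1_000 / self.total_requests
        return {
            "energy_kwh": self.energy_kwh * scale,
            "water_litres": self.water_litres * scale,
            # gCO2/kWh * kWh -> grams; divide by 1,000 for kg.
            "co2_kg": self.energy_kwh * self.grid_gco2_per_kwh / 1_000 * scale,
        }

kpi = SustainabilityKPI(total_requests=2_000_000, energy_kwh=1_500,
                        water_litres=4_000, grid_gco2_per_kwh=200)
print(kpi.per_1000_requests())
```

Normalizing per request is what makes providers comparable: a user choosing between services could weigh such numbers the way energy labels are read on appliances today.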

The challenge ahead is redefining what trustworthy AI actually means. Currently, trustworthy AI is defined by its precision, fairness, and transparency. But as AI becomes increasingly resource-intensive, trustworthy AI must also be defined by whether it safeguards the ecological foundations of life itself. The future requires frugal AI: systems that are smaller, smarter, more energy-efficient, and embedded into a circular economy. When machines think, the planet should not pay the price.