The Hidden Water Crisis Behind AI: Why Data Centers Are Draining Aquifers Faster Than Energy

AI's water footprint is becoming a critical environmental concern that rivals, and in some cases exceeds, its energy consumption problem. While public attention has focused on electricity usage at data centers, a quieter crisis is unfolding: the massive amounts of water required to cool the servers powering artificial intelligence systems. Global data center water consumption currently stands at approximately 560 billion liters per year, roughly equivalent to flushing every toilet in Germany for an entire year. By 2030, that figure is projected to roughly double to over 1 trillion liters annually, creating a structural vulnerability that will worsen as climate change intensifies.

Why Is Water More Critical Than Energy for AI Data Centers?

The relationship between cooling systems and water consumption reveals a troubling trade-off that engineers cannot easily escape. Current-generation artificial intelligence server racks from Nvidia and AMD produce sustained power densities of 50 to 100 kilowatts per rack, far exceeding what traditional air cooling systems can handle. Standard air cooling, which was the default for most data centers built before 2020, becomes thermally inadequate above roughly 10 to 15 kilowatts per rack. This means facilities hosting modern AI workloads have no choice but to transition to liquid cooling systems, which require significantly more water.
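The thresholds in this paragraph lend themselves to a one-line check. A minimal sketch, where the function name and the default air-cooling limit are illustrative assumptions drawn from the 10 to 15 kilowatt-per-rack figure above:

```python
# Illustrative check of when air cooling becomes thermally inadequate,
# using the rack-density thresholds cited in the text. The 15 kW limit
# is the upper end of the ~10-15 kW/rack range; real facilities depend
# on airflow design, containment, and supply temperature.

AIR_COOLING_LIMIT_KW = 15  # assumed upper bound for air cooling

def required_cooling(rack_power_kw: float,
                     air_limit_kw: float = AIR_COOLING_LIMIT_KW) -> str:
    """Return the cooling class a rack of the given sustained power needs."""
    return "liquid" if rack_power_kw > air_limit_kw else "air"

# Current AI racks cited in the text run 50-100 kW sustained:
for kw in (10, 15, 50, 100):
    print(kw, "kW ->", required_cooling(kw))
```

Under these assumptions, every rack in the 50 to 100 kilowatt range lands on the liquid-cooling side of the threshold, which is why the transition is described as unavoidable.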

Research from Lawrence Berkeley National Laboratory published in 2025 demonstrates just how variable water consumption can be across different facilities. Water intensity varies by roughly 10,000 times, from facilities consuming less than one liter per megawatt-hour of computing load to those consuming tens of thousands of liters. The analysis identified server energy efficiency as the primary determinant of water use, followed by the water consumption factor of the local electricity grid. Cooling type, which many assume is the dominant factor, actually ranked fourth in importance.
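A minimal sketch of the two water pathways such analyses separate: on-site cooling water and the water embedded in grid electricity. All names and parameter values below are illustrative assumptions for showing how a large intensity spread arises, not figures from the LBNL study.

```python
# Water attributable to a data center's IT load has two components:
# on-site cooling water (often expressed as a WUE, L/kWh) and the
# water consumed upstream to generate the electricity it draws.
# All parameter values here are assumed for illustration.

def water_per_mwh(it_load_mwh: float,
                  pue: float,                    # facility energy / IT energy
                  onsite_wue_l_per_kwh: float,   # on-site cooling water intensity
                  grid_water_l_per_kwh: float    # water factor of the local grid
                  ) -> float:
    """Total litres of water attributable to a given IT load."""
    facility_kwh = it_load_mwh * 1000 * pue
    return facility_kwh * (onsite_wue_l_per_kwh + grid_water_l_per_kwh)

# An efficient dry-cooled site on a low-water grid vs an evaporatively
# cooled site on a thermoelectric-heavy grid (assumed values):
low = water_per_mwh(1, pue=1.1, onsite_wue_l_per_kwh=0.0, grid_water_l_per_kwh=0.001)
high = water_per_mwh(1, pue=1.5, onsite_wue_l_per_kwh=1.8, grid_water_l_per_kwh=2.0)
print(round(low, 1), "L/MWh vs", round(high), "L/MWh")
```

Even with these modest assumed parameters the two sites differ by a factor of several thousand, which is consistent with the roughly 10,000-fold spread the study reports, and it shows why the grid's water factor can outweigh the choice of cooling technology.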

The largest facilities are responsible for the overwhelming majority of water consumption. In England, which represents a best-case scenario with a temperate climate and relatively low-carbon grid, 67 percent of registered data center facilities use fewer than 1,000 cubic meters of water per year, comparable to a small office building. However, the top six facilities account for 65 percent of the sector's total water consumption, revealing a highly concentrated problem.

How Are Universities and Tech Companies Addressing the AI Sustainability Challenge?

Institutions and companies are beginning to implement practical strategies to reduce both energy and water consumption, though the solutions remain incomplete. Arizona State University has emerged as a case study in balancing AI adoption with environmental responsibility. The university has rapidly deployed AI technology across classrooms, research, and operations, while simultaneously confronting the reality that data centers in the Phoenix area are warming surrounding neighborhoods by several degrees.

ASU President Michael Crow emphasized the importance of creating data centers with "minimal environmental footprint" while acknowledging that AI will play a key role in the future. He noted that major companies like Amazon Web Services are exploring renewable energy systems and distributed data center locations to reduce environmental impact. Kyle Bowen, the deputy chief information officer at ASU, explained that the university developed CreateAI, a platform that helps users select AI models suited to specific tasks rather than defaulting to large-scale models that consume more resources. The platform displays a small green leaf next to smaller models to encourage environmentally conscious choices.


Beyond energy optimization, some companies are exploring water reuse strategies. Chloe Bethune, a junior studying sustainability and president of SustainHUB at ASU, suggested using recycled wastewater in AI facilities and data centers to reduce pressure on fresh water supplies, a technique companies like Nona Technologies are currently exploring.

Steps to Build More Sustainable AI Infrastructure

  • Implement Model Selection Tools: Deploy platforms that show users the environmental cost of different AI models and encourage selection of smaller, more efficient models through visual cues like green leaf indicators next to lower-impact options.
  • Transition to Distributed Data Center Locations: Move away from centralized cloud systems toward smaller, device-based AI models that can run on smartphones or laptops, reducing reliance on energy-intensive centralized infrastructure.
  • Explore Water Recycling and Reuse: Investigate the use of recycled wastewater and alternative cooling methods like adiabatic pre-cooling or immersion cooling to reduce freshwater consumption in data center operations.
  • Integrate Sustainability Education: Incorporate carbon awareness and model efficiency concepts into computer science and AI curricula, ensuring that engineers understand the environmental implications of their design choices from the start.
  • Foster Cross-Disciplinary Collaboration: Create spaces where engineers and environmental professionals can communicate openly about trade-offs and constraints, building mutual understanding of both technical and sustainability priorities.

What Role Is Nuclear Energy Playing in AI's Energy Future?

As data center electricity consumption accelerates, nuclear energy has emerged as a potential solution to power AI infrastructure without relying on fossil fuels. Data centers consumed roughly 183 terawatt-hours of electricity in 2024, representing 4 percent of total United States electricity consumption, a figure expected to rise to 426 terawatt-hours by 2030. Because data centers require 24/7 electricity, intermittent renewable sources such as wind and solar are not considered viable on their own, leaving fossil fuels as the dominant power source: roughly 40 percent from natural gas and 15 percent from coal.

The Trump administration set an ambitious target to increase nuclear energy capacity in the United States from 100 gigawatts in 2024 to 400 gigawatts by 2050, with significant federal and private investment flowing into the sector. Two landmark projects exemplify this shift: the Palisades nuclear plant near South Haven, Michigan, which received a $400 million Department of Energy investment and is expected to operate again in 2026, will provide clean energy to 1.4 million Michigan households and businesses. The Three Mile Island site, historically infamous for its 1979 accident, received a $1 billion loan to restart operations and is expected to open in 2027 with an 837 megawatt capacity, backed by a 20-year purchase agreement with Microsoft to power its data centers in the region.

"Some companies, like AWS for Amazon and others, and their big AI data centers, are looking at renewable energy systems, distributed formats, distributed locations. We're going to try to be a part of helping to figure out how all of that can happen," said Michael Crow, President of Arizona State University.


Tennessee is currently leading the nation with plans for at least three small modular reactors, one potential fusion reactor, and a planned microreactor factory. Alabama is receiving a $40 billion joint investment from the U.S. and Japanese governments for new reactor development, while Amazon and Dominion Energy signed an agreement to develop small modular reactors in Virginia.

Is the Individual AI Query Really an Environmental Problem?

The environmental impact of a single AI query is surprisingly modest when examined in isolation. A typical text query to ChatGPT uses roughly 0.3 watt-hours of electricity according to analysis from Epoch AI in February 2025, with OpenAI's Sam Altman confirming a similar figure of 0.34 watt-hours. Google reported in its 2025 methodology report that a median text prompt in Gemini uses roughly 0.24 watt-hours, produces 0.03 grams of carbon dioxide, and consumes 0.26 milliliters of water. These figures place AI queries in the same ballpark as a standard Google search, which uses approximately 0.3 watt-hours, though that estimate comes from a 2009 blog post and is likely lower today given improvements in hardware efficiency.

The widely cited claim that an AI query uses "ten times more electricity than a search" originated from researcher Alex de Vries' article published in Joule in October 2023 and was referenced by the International Monetary Fund, among others. However, this figure is now outdated for typical text queries, since both hardware and model architectures have become dramatically more efficient. The real environmental concern is not the individual query but the aggregate infrastructure supporting billions of queries daily.

The scale of AI adoption makes the cumulative impact impossible to ignore. ChatGPT reached 900 million weekly active users by February 2026, and AI is being integrated into search engines, email clients, office software, and autonomous agents, creating an entirely new computational layer underneath the digital economy. Five companies alone plan to spend close to 700 billion dollars on AI infrastructure in 2026. Even with impressive efficiency gains, absolute electricity consumption continues to rise in nearly all plausible future scenarios according to the International Energy Agency.
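The cumulative arithmetic is easy to sketch. Using Google's reported per-prompt Gemini figures and an assumed volume of one billion prompts per day (the volume is a hypothetical for illustration, not a reported figure):

```python
# Scaling per-prompt figures to an assumed aggregate volume, to show
# why the fleet, not the single query, drives the footprint.
# Per-prompt numbers are Google's reported Gemini medians; the daily
# prompt count is an assumption.

WH_PER_PROMPT = 0.24        # watt-hours per median text prompt
ML_WATER_PER_PROMPT = 0.26  # milliliters of water per prompt
PROMPTS_PER_DAY = 1e9       # assumed volume

daily_mwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1e6        # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1000                      # MWh/day -> GWh/yr
daily_water_m3 = ML_WATER_PER_PROMPT * PROMPTS_PER_DAY / 1e6  # mL -> m^3

print(f"{daily_mwh:.0f} MWh/day, ~{annual_gwh:.0f} GWh/yr, "
      f"{daily_water_m3:.0f} m^3 of water/day")
```

Under these assumptions a service that is negligible per query still consumes on the order of a small town's annual electricity and hundreds of cubic meters of water every day, before counting training runs or the embodied footprint of the hardware.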

The path forward requires acknowledging that AI's environmental impact is neither a simple problem nor one with a simple solution. Gunika Dhingra, a graduate student studying data science and student advisor for The AI Society at ASU, noted that AI's impact on the environment depends heavily on policy, energy sourcing, and how responsibly industries scale their use of the technology. She emphasized that there is a significant gap in education on sustainability concepts, including carbon awareness and model efficiency, which should be taught in computer science and AI classes with support from student organizations.

The challenge ahead is not choosing between technological progress and environmental protection, but rather integrating them. As Bethune observed, "I don't see technology slowing down. As the environmentalists and the sustainability professionals, we have to keep up with that pace if we want to have a balanced future of both technology and sustaining what makes us whole and alive and thriving."
