Every time you ask an AI chatbot to summarize an email or generate an image, you're triggering high-density computing in a facility hundreds of miles away, consuming far more energy than a traditional web search. As artificial intelligence infrastructure expands globally, IT leaders face a critical choice: integrate sustainability into their AI strategy from the start, or risk amplifying both environmental impact and operational costs.

New data shows datacentres will require approximately 1,200 terawatt-hours (TWh) of energy by 2030, a 20 percent increase from forecasts just one year ago, with AI-optimized servers alone expected to consume around 156 gigawatts (GW) of power.

How Much More Energy Does AI Actually Use Than Traditional Search?

The energy gap between conventional web searches and generative AI (GenAI) interactions has become substantial. While a traditional Google search draws negligible power, a single interaction with a generative AI model can consume approximately 10 times that amount. The problem intensifies when users request image or video generation: generating one high-resolution AI image consumes the equivalent of roughly half a smartphone charge.

What makes this particularly concerning is the invisibility of these costs. Most users have no idea that their casual prompts, "summarize this email" or "draw a cat in a dinner jacket," trigger cascades of compute-intensive operations. This creates what experts call a "rebound effect": because the technology feels free and effortless, people use it frivolously. The United Nations' Sustainable Development Goal 12 advocates for efficient use of natural resources, yet the current AI economy encourages high-volume, low-intent consumption.

Why Are IT Leaders Treating Sustainability as an Afterthought?

For much of the past decade, cloud computing allowed energy consumption to be abstracted away from enterprise decision-making.
Hyperscalers invested at scale, efficiencies improved, and sustainability narratives focused on relative gains compared to on-premise infrastructure. AI fundamentally changes that equation.

"Every AI model trained, every dataset retained indefinitely, every compute-intensive workload spun up without scrutiny contributes incrementally to the overall footprint. Multiply that across thousands of organisations and the cumulative effect is substantial," warned Daniel Smith, CEO of Astralis Technology.

Smith emphasizes that environmental accountability in AI is not about restraint for its own sake, but about intelligent demand management. Too often, sustainability strategies are treated as parallel initiatives: well-intentioned but secondary to core digital transformation. AI raises the stakes by amplifying both the opportunity and the risk.

Steps IT Leaders Should Take to Align AI Strategy With Sustainability Goals

- Assess AI Requirements Critically: Determine whether your organization genuinely needs any given AI workload to run continuously, and make informed choices about model selection and deployment rather than defaulting to brute-force computing approaches.
- Optimize Model Size and Training Frequency: Evaluate whether models are being optimized for efficiency or whether teams are simply scaling up compute power without scrutiny, then adjust architectural decisions accordingly.
- Rationalize Legacy Systems Alongside AI Expansion: Review whether legacy systems and data estates are being properly rationalized or whether AI capabilities are simply being layered on top of inefficient existing infrastructure.
- Integrate Sustainability Metrics Into Architecture Decisions: Make sustainability metrics influence architectural choices rather than using them merely to satisfy reporting requirements for environmental key performance indicators.
- Reassess Sustainability Roadmaps in Light of AI Growth: Update organizational sustainability strategies to account for increased AI usage rather than deferring or suspending them to build out AI capabilities unhindered by environmental concerns.

Smith urges IT leaders to "do their bit" by applying the same discipline to compute consumption that many organizations already apply to financial spend or cyber risk management.

What's Driving the Energy Crisis in AI Infrastructure?

Unlike earlier waves of cloud adoption, AI infrastructure drives sustained high-intensity compute, exacerbates peak demand pressures, and accelerates the need for grid reinforcement and transmission upgrades.

The expansion of UK datacentre capacity is unfolding in an increasingly chaotic manner. More than 60 separate planning applications for new datacentres were filed in England and Wales in 2025 alone, creating significant local strain and signaling a lack of national oversight. The geography of datacentre deployment is similarly imbalanced, with the largest clusters concentrated around London Docklands and Slough, two of Europe's most mature and interconnected digital hubs.

As AI servers become more power-dense, datacentre connection requests are placing increasing demands on electricity networks, prompting providers to explore alternative solutions that may carry environmental trade-offs.

"If we are to avoid a future where AI growth is decoupled from our planetary boundaries, we must move beyond the idea that hyperscalers are the sole curators of the carbon footprint," noted Shane Herath, chair of the Eco-Friendly Web Alliance.

Herath believes that true sustainability requires a recalibrated landscape where enterprises and individuals become active participants in a "digital diet," taking responsibility for their own AI consumption patterns rather than relying solely on large tech companies to manage environmental impact.
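The "digital diet" idea can be made concrete as a back-of-envelope energy ledger for an organization's AI usage. The sketch below is illustrative only: the function names are hypothetical, and the per-request watt-hour figures are placeholder assumptions chosen to match the rough ratios cited earlier (a GenAI interaction at about ten times a web search, an image generation at roughly half a smartphone charge), not measured values.

```python
# Illustrative "digital diet" sketch: rough per-workload energy accounting.
# All figures are placeholder assumptions, not measured or vendor data.

WH_PER_REQUEST = {          # assumed average energy per request, in watt-hours
    "web_search": 0.3,
    "genai_text": 3.0,      # ~10x a traditional search, per the estimates above
    "genai_image": 6.0,     # on the order of half a smartphone charge
}

def monthly_energy_kwh(request_counts: dict[str, int]) -> float:
    """Estimate monthly energy (kWh) from per-workload request counts."""
    total_wh = sum(
        WH_PER_REQUEST[kind] * count for kind, count in request_counts.items()
    )
    return total_wh / 1000.0

def over_budget(request_counts: dict[str, int], budget_kwh: float) -> bool:
    """Flag when estimated consumption exceeds an internal energy budget."""
    return monthly_energy_kwh(request_counts) > budget_kwh

usage = {"web_search": 100_000, "genai_text": 20_000, "genai_image": 5_000}
print(f"{monthly_energy_kwh(usage):.1f} kWh")  # 120.0 kWh
```

Even with crude inputs, a ledger like this lets compute consumption be budgeted and reviewed the way financial spend already is, which is the discipline Smith calls for above.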
Why Is Carbon Accounting for AI Still So Opaque?

A critical gap exists in how organizations measure and report the environmental impact of their digital workloads. Without standardized carbon accounting, that impact remains opaque and poorly quantified, and IT leaders struggle to understand the true cost of their AI deployments or make informed decisions about optimization.

Microsoft CEO Satya Nadella addressed energy efficiency during a keynote presentation at the Microsoft AI Summit in London, framing the challenge in terms of tokens, the smallest measurable unit of work in AI systems. The goal, according to Nadella, is not only to reduce the energy needed to process a token but to do so in a cost-effective manner. Microsoft has committed to powering all of its Azure cloud and AI datacentres with 100 percent renewable energy, though this approach alone does not address the underlying consumption challenge.

The real cost of AI infrastructure is no longer hidden from view. As organizations scale their AI capabilities, the question of who pays for grid reinforcement and transmission upgrades becomes increasingly visible at the national infrastructure level. IT leaders who integrate sustainability into their AI strategy now, rather than treating it as an afterthought, will be better positioned to manage both environmental impact and operational costs as AI workloads continue to grow.
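Nadella's token framing lends itself to a simple efficiency metric: energy per token, and from it, electricity cost per million tokens. The sketch below shows the arithmetic; the server power, throughput, and price figures are assumed example values for illustration, not Microsoft data.

```python
# Illustrative sketch of token-level efficiency accounting.
# All inputs are assumed example values, not vendor figures.

def joules_per_token(server_power_w: float, tokens_per_second: float) -> float:
    """Energy cost of one token: sustained power divided by throughput."""
    return server_power_w / tokens_per_second

def cost_per_million_tokens(
    server_power_w: float, tokens_per_second: float, price_per_kwh: float
) -> float:
    """Electricity cost (in the currency of price_per_kwh) per million tokens."""
    joules = joules_per_token(server_power_w, tokens_per_second) * 1_000_000
    kwh = joules / 3_600_000  # 1 kWh = 3.6 MJ
    return kwh * price_per_kwh

# Example: a 10 kW AI server sustaining 5,000 tokens/s at 0.25 per kWh.
print(joules_per_token(10_000, 5_000))                    # 2.0 J per token
print(round(cost_per_million_tokens(10_000, 5_000, 0.25), 3))  # 0.139
```

Tracking a metric like this over time would show whether efficiency gains (lower joules per token) are actually translating into lower cost and footprint per unit of AI work, rather than being swallowed by volume growth.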