India's $100 Billion Bet: How One Company Plans to Build AI's Green Future

India is making a historic move to become a global AI powerhouse without sacrificing its climate commitments. The Adani Group has announced a $100 billion investment to construct renewable-powered hyperscale AI data centers by 2035, a strategy designed to give India control over its own computing infrastructure while tackling one of technology's biggest environmental challenges: the staggering energy demands of artificial intelligence.

This isn't just about building bigger data centers. The initiative represents a fundamentally different approach to AI infrastructure, one that integrates renewable energy generation, transmission systems, and computing capacity into a single unified architecture. By doing so, Adani aims to create what executives call a "sovereign energy and compute backbone" that reduces India's reliance on foreign cloud providers while positioning the nation as a creator and exporter of AI technology, not merely a consumer.

Why Does AI's Energy Problem Matter So Much?

The scale of AI's energy appetite is staggering. Training a single large language model like GPT-3, which contains 175 billion parameters, consumed 1,287 megawatt-hours of electricity and generated 502 metric tons of carbon dioxide emissions. A recent inventory of 369 major generative AI models built between 2018 and 2024 estimated their combined annual energy consumption at between 25 and 41 terawatt-hours, producing 10.7 to 18.6 million tons of CO2 annually.
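The arithmetic behind these figures is a simple product of energy consumed and grid carbon intensity. A minimal sketch (the intensity value below is back-derived from the GPT-3 figures cited above, not an officially published number):

```python
def training_emissions_tons(energy_mwh: float, intensity_kg_per_kwh: float) -> float:
    """Estimate CO2 emissions (metric tons) for a training run.

    emissions = energy (kWh) * grid carbon intensity (kg CO2 per kWh),
    converted back to metric tons.
    """
    return energy_mwh * 1000 * intensity_kg_per_kwh / 1000

# GPT-3's cited 1,287 MWh and 502 t CO2 imply an effective grid
# intensity of roughly 0.39 kg CO2/kWh for that training run.
print(training_emissions_tons(1287, 0.39))  # ≈ 502
```

The same formula is why a renewable-powered facility changes the picture: holding energy constant, emissions scale linearly with the carbon intensity of the supplying grid.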

The problem is accelerating. Global data centers consumed 415 terawatt-hours in 2024, roughly 1.5% of all electricity worldwide, and the International Energy Agency projects that demand will double to 945 terawatt-hours by 2030 as AI workloads expand. In the United States alone, data centers are on track to consume between 325 and 580 terawatt-hours annually by 2028, with approximately 22% of that coming from AI-related work. Without immediate action, AI's energy explosion could undermine global climate commitments like the Paris Agreement's 2030 targets.

How Does Adani's Plan Actually Work?

The Adani Group's strategy centers on three interconnected components: renewable energy supply, advanced computing infrastructure, and domestic manufacturing. The company plans to expand its AdaniConnex data center platform from its current 2 gigawatts to 5 gigawatts of capacity, with major facilities planned in Visakhapatnam and Noida. These centers will be powered by Adani Green Energy's massive Khavda renewable project, which already has more than 10 gigawatts operational out of a planned 30 gigawatts total.

The facilities will incorporate advanced cooling systems and high-efficiency power designs specifically engineered to handle the intense energy demands of AI processing. To reduce geopolitical and supply chain vulnerabilities, Adani also plans to co-invest in domestic manufacturing of critical infrastructure components, including transformers, grid systems, power electronics, and thermal management technologies. This vertical integration approach means India could eventually produce not just AI compute capacity but the physical infrastructure that powers it.

  • Renewable Energy Integration: Drawing power from Adani Green Energy's 30 gigawatt Khavda renewable project, with plans for an additional $55 billion investment in renewable expansion and battery storage systems
  • Hyperscale Data Centers: Expanding from 2 gigawatts to 5 gigawatts of computing capacity across multiple geographic locations with partnerships from Google and Microsoft
  • Domestic Manufacturing: Co-investing in production of transformers, grid systems, power electronics, and thermal management technologies to reduce supply chain dependencies
  • Global Connectivity: Leveraging cable landing stations at Adani-operated ports to enable low-latency connections with digital networks across the Americas, Europe, Africa, and Asia

What Makes This Different From Other Data Center Expansion?

Unlike conventional data center buildouts that treat energy as a separate concern, Adani's model treats renewable power generation and computing capacity as inseparable components of a single system. The company is also reserving dedicated GPU capacity for Indian AI startups, research institutions, and deep tech innovators to address compute shortages and support domestic model development. In partnership with academic institutions, Adani plans to launch specialized AI infrastructure engineering programs, applied research labs focused on energy and logistics, and a national fellowship initiative to address skills shortages.

"The world is entering an Intelligence Revolution more profound than any previous Industrial Revolution. Nations that master the symmetry between energy and compute will shape the next decade. India is uniquely positioned to lead," stated Gautam Adani, Chairman of the Adani Group.


The investment is expected to catalyze an additional $150 billion in related industries, including server manufacturing, electrical systems, and sovereign cloud platforms, creating a projected $250 billion AI infrastructure ecosystem over the next decade. This multiplier effect suggests the initiative could reshape not just India's technological landscape but global AI supply chains and investment flows.

What About the Carbon Accounting Problem?

While Adani's renewable-powered approach addresses operational emissions, a comprehensive academic review reveals that the full carbon picture of AI systems is far more complex than most companies acknowledge. A systematic review of 62 peer-reviewed studies published in March 2026 found that algorithmic efficiency improvements like pruning, quantization, and sparsity can reduce computational burden, but they only achieve real carbon reduction when integrated with hardware and data center design decisions.

The research also identified a critical blind spot: while operational energy accounts for most emissions during large-scale model training, the embodied carbon from semiconductor fabrication plays an increasingly important role in computing fleets with frequent equipment refreshes. Deployment decisions such as data center location, carbon-aware scheduling, and cloud-to-edge workload placement drive far greater variance in real-world emissions than model-level optimization alone can achieve.
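Carbon-aware scheduling, one of the deployment levers named above, amounts to placing a deferrable workload in the region and hour with the lowest forecast grid carbon intensity. A minimal sketch; the region names and intensity values are illustrative, not real data:

```python
def pick_greenest_slot(forecasts: dict) -> tuple:
    """Return the (region, hour) pair with the lowest forecast grid
    carbon intensity (gCO2/kWh) across all regions and hours."""
    region, hour, _ = min(
        ((r, h, intensity)
         for r, series in forecasts.items()
         for h, intensity in enumerate(series)),
        key=lambda t: t[2],
    )
    return region, hour

# Illustrative 3-hour forecasts in gCO2/kWh (hypothetical regions)
forecasts = {
    "west-solar": [120.0, 95.0, 210.0],   # dips at midday solar peak
    "north-grid": [480.0, 460.0, 470.0],  # coal-heavy baseline
}
print(pick_greenest_slot(forecasts))  # → ('west-solar', 1)
```

Even this toy version shows why deployment choices dominate: moving the same job between these two grids changes its emissions by roughly a factor of four, far more than typical model-level efficiency gains.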

The review highlighted a major methodological problem: inconsistencies in carbon reporting, system boundaries, and energy telemetry across studies hinder reproducibility and comparison of findings. This fragmentation means companies often lack standardized frameworks for measuring and reporting their AI carbon footprints, making it difficult to verify climate claims or compare environmental performance across organizations.
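A standardized report would, at minimum, separate operational from embodied emissions and state its system boundary explicitly. A hypothetical sketch of such a record; the field names are assumptions for illustration, not an existing reporting standard:

```python
from dataclasses import dataclass

@dataclass
class AICarbonReport:
    """Hypothetical per-workload carbon record (not an existing standard)."""
    workload: str
    operational_kg_co2: float    # electricity used during training/inference
    embodied_kg_co2: float       # amortized share of chip fabrication, etc.
    boundary: str                # e.g. "training only" vs "full lifecycle"
    grid_intensity_kg_per_kwh: float  # makes the energy assumption auditable

    @property
    def total_kg_co2(self) -> float:
        return self.operational_kg_co2 + self.embodied_kg_co2

report = AICarbonReport("demo-train-run", 1000.0, 250.0, "training only", 0.39)
print(report.total_kg_co2)  # 1250.0
```

Recording the boundary and grid intensity alongside the totals is what would let two organizations' numbers actually be compared, which is exactly the gap the review identifies.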

Why Should Investors and Policymakers Pay Attention?

For investors, the Adani initiative signals the emergence of a major sovereign compute ecosystem designed to reduce geopolitical risk and accelerate digital growth while anchoring long-term technological competitiveness. For policymakers, it demonstrates that integrating renewable energy infrastructure with AI compute capacity is technically feasible and economically attractive at scale. The model also addresses data sovereignty concerns, allowing India to maintain control over critical digital infrastructure rather than depending on foreign cloud providers.

The timing is strategic. As AI demand accelerates globally, control over energy supply, compute infrastructure, and data governance is becoming a strategic priority for nations and corporations alike. India's integrated approach linking renewable energy, hyperscale compute, and domestic manufacturing could reshape how countries think about AI infrastructure development, moving away from the current model where energy and computing are treated as separate concerns toward a unified, carbon-conscious architecture.