AI data centers no longer have to be power-hungry drains on the electrical grid; they can now actively stabilize it by adjusting their computing workloads in real time. A groundbreaking demonstration in London proved that artificial intelligence (AI) factories equipped with the right software can detect grid stress signals and temporarily reduce power consumption without disrupting critical AI tasks, acting as shock absorbers for sudden demand spikes. This shift transforms data centers from infrastructure problems into infrastructure solutions.

Why Does Grid Stability Matter for AI Infrastructure?

The challenge is simple but urgent. When millions of people simultaneously turn on their kettles during a televised sports event, electricity demand can spike by about 1 gigawatt, equivalent to the output of a standard nuclear reactor, in just minutes. Grid operators must carefully manage these peaks to keep the system stable, and this becomes increasingly difficult as large new customers like AI data centers come online.

Traditionally, grid operators respond by building massive new infrastructure to handle worst-case demand peaks. This approach is expensive and time-consuming, and the cost is ultimately passed on to everyday consumers through higher electricity rates. But what if those new customers could be flexible and actually help during periods of peak strain? That's the premise behind power-flexible AI factories, which represent a fundamentally different approach to how data centers interact with the electrical grid.

How Can AI Factories Become Grid-Friendly Assets?

Emerald AI, in collaboration with NVIDIA, the Electric Power Research Institute (EPRI), National Grid, and Nebius, developed a platform called the Emerald AI Conductor that enables this flexibility. The system works by continuously monitoring GPU (graphics processing unit) power consumption at second-by-second resolution and receiving signals from grid operators about demand stress.
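The monitor-and-respond behavior described above can be sketched as a simple control step. The following is a minimal, hypothetical illustration, not Emerald AI's actual implementation: all names (`Job`, `respond_to_grid`) and the cap-based policy are assumptions for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    flexible: bool       # flexible jobs may be slowed during grid stress
    power_kw: float      # current power draw attributed to this job
    min_power_kw: float  # floor the job can be throttled down to

def respond_to_grid(jobs, site_cap_kw):
    """Throttle flexible jobs until total draw fits under a grid-requested cap.

    High-priority (non-flexible) jobs are never touched; flexible jobs are
    reduced toward their floors, most power-hungry first. Returns the
    resulting total site draw in kW.
    """
    total = sum(j.power_kw for j in jobs)
    excess = total - site_cap_kw
    if excess <= 0:
        return total  # no grid stress action needed
    for job in sorted(jobs, key=lambda j: j.power_kw, reverse=True):
        if excess <= 0:
            break
        if not job.flexible:
            continue  # critical workload: leave it alone
        cut = min(excess, job.power_kw - job.min_power_kw)
        job.power_kw -= cut
        excess -= cut
    return sum(j.power_kw for j in jobs)

# Example: a grid stress signal caps the site at 700 kW while 900 kW is drawn.
jobs = [
    Job("inference-prod", flexible=False, power_kw=400, min_power_kw=400),
    Job("training-batch", flexible=True, power_kw=350, min_power_kw=150),
    Job("eval-sweep", flexible=True, power_kw=150, min_power_kw=50),
]
print(respond_to_grid(jobs, site_cap_kw=700.0))  # 700.0
```

In this toy policy the production inference job keeps its full 400 kW while the training batch absorbs the entire 200 kW cut, mirroring the idea of slowing flexible work to protect critical tasks.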
When the grid faces strain, the platform can autonomously reduce power use by keeping high-priority AI workloads running while temporarily slowing more flexible jobs.

In a demonstration at Nebius' AI factory in London, researchers ran production-grade AI workloads on a cluster of 96 NVIDIA Blackwell Ultra GPUs connected through NVIDIA Quantum-X800 InfiniBand technology. EPRI and National Grid simulated real-world grid stress scenarios, including lightning strikes, periods of low wind power supply, and the famous "TV pickup" phenomenon in which millions of people turn on appliances simultaneously.

The results were striking. When the simulated tea kettle surge was about to hit the grid, the AI cluster ramped down its power use and successfully acted as a shock absorber for the abrupt surge. Remarkably, the Emerald AI Conductor achieved 100% alignment with over 200 power targets that EPRI and National Grid instructed the cluster to follow during the experiment.

What Are the Real-World Benefits of Power-Flexible AI Factories?

The implications extend far beyond a single demonstration. For grid operators and electricity consumers, power-flexible AI factories offer several concrete advantages:

- Reduced Infrastructure Costs: The grid can manage sudden demand swings using existing capacity more efficiently, reducing the need to overbuild permanent infrastructure for worst-case peaks and helping keep electricity rates affordable for everyday consumers.
- Faster Data Center Connections: AI factories equipped with this technology can connect to the grid much faster because they can tap into existing power infrastructure rather than waiting years for upgrades to be completed.
- Comprehensive System Testing: The London demonstration measured not just GPUs but also CPUs, all supporting IT equipment, and total facility power consumption, going beyond previous trials conducted in the United States.

"With this technology, AI factories become friendly and helpful grid assets. Simultaneously, the AI factories get connected much faster to the grid because they can tap into existing power grids," said Varun Sivaram, founder and CEO of Emerald AI.

How Does This Technology Unlock Economic Growth?

For countries like the United Kingdom, power-flexible AI infrastructure opens new economic opportunities. London's power grid faces a key bottleneck: the infrastructure upgrades needed to connect large new customers are heavily constrained. By optimizing the use of existing grid infrastructure through flexible AI factories, grid operators can accelerate the deployment of hyperscaler data centers without massive capital expenditures.

"We have enormous skills and potential in AI. We're never going to be on the scale of the U.S. in terms of data centers, but relative to the size of the country, we could be, and we're certainly seeing that interest from many of the hyperscalers. So it gives us the opportunity to play our part as National Grid in helping unlock that economic growth for the country," noted Steve Smith, group chief strategy officer of National Grid.

What's Next for Power-Flexible AI Infrastructure?

The London demonstration was the fourth successful proof-of-concept trial for Emerald AI's flexible grid solution; earlier trials took place in Arizona, Virginia, and Illinois. Now the technology is moving toward real-world deployment: the Aurora AI Factory in Virginia is set to open this year and will operate as the first power-flexible AI factory powered by NVIDIA GPUs in production.

This shift represents a fundamental change in how the technology industry approaches infrastructure challenges. Rather than treating data centers as isolated consumers that must be accommodated by grid upgrades, power-flexible AI factories become active participants in grid stability.
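The 100% alignment figure reported from the London trial suggests a simple way to think about scoring target-following. The sketch below is purely illustrative and assumes targets act as power caps with a small tolerance; the function name, tolerance, and sample numbers are all hypothetical, not data from the experiment.

```python
def alignment_score(targets_kw, measured_kw, tolerance_kw=10.0):
    """Fraction of grid-operator power targets the site actually met.

    A target counts as met when measured draw is at or below the target
    plus a small tolerance (targets here are treated as caps).
    """
    met = sum(
        1 for target, actual in zip(targets_kw, measured_kw)
        if actual <= target + tolerance_kw
    )
    return met / len(targets_kw)

# Four illustrative targets issued over time, and the site's measured draw.
targets = [900, 700, 650, 800]
measured = [880, 695, 655, 790]
print(alignment_score(targets, measured))  # 1.0
```

In the demonstration, the analogous check would have been applied across the 200-plus targets issued by EPRI and National Grid, with the cluster meeting every one.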
As AI computing demand continues to surge globally, this approach could prove essential for keeping electricity affordable while enabling the infrastructure buildout that AI requires.