AI data centers no longer have to wait years for grid upgrades to connect to the power system. Instead, they're becoming active participants in keeping electricity stable by adjusting their power consumption on demand, much like a shock absorber for sudden surges. A new demonstration in London proved this works at scale, with an AI cluster successfully reducing power use to absorb a simulated nationwide surge equivalent to millions of people turning on kettles simultaneously.

Why Are Grid Operators Desperate for Flexible Power Customers?

Electrical grids face a persistent problem: sudden, massive demand spikes that require expensive infrastructure to handle. During the UEFA Euro 2020 match between England and Germany (played in 2021), millions of viewers in the United Kingdom simultaneously turned on their kettles at halftime. National Grid, which operates the electricity transmission network in England and Wales, recorded a demand spike of approximately 1 gigawatt in just minutes, equivalent to the output of a standard nuclear reactor.

Grid operators must build infrastructure to handle these worst-case peaks, even though they occur infrequently. This drives up costs that ultimately get passed to everyday consumers through higher electricity rates. As AI data centers become major new customers requiring enormous amounts of power, grid operators face an even bigger challenge: connecting these facilities without massive, years-long infrastructure upgrades. The bottleneck isn't just physical space; it's the electrical capacity to serve new demand. But what if those new customers could actually help stabilize the grid instead of straining it further?

How Can AI Data Centers Help Balance the Electrical Grid?

Emerald AI, in collaboration with NVIDIA, the Electric Power Research Institute (EPRI), National Grid, and Nebius, developed a solution called the Emerald AI Conductor Platform.
This software allows AI data centers to autonomously adjust their power consumption when the grid experiences stress, without disrupting the most critical workloads.

The concept works like this: when grid operators detect a demand spike or supply shortage, they send a signal to the AI data center instructing it to reduce power use temporarily. The Conductor Platform then intelligently throttles less critical jobs while maintaining peak performance for high-priority tasks. The system uses real-time power telemetry from GPUs (graphics processing units, the specialized chips that power AI) to make these adjustments in seconds.

In a proof-of-concept trial at Nebius' new AI factory in London, researchers tested this approach with production-grade AI workloads running on a cluster of 96 NVIDIA Blackwell Ultra GPUs connected through NVIDIA's Quantum-X800 InfiniBand platform. The team simulated multiple stress scenarios, including the exact "TV pickup" phenomenon that caused the kettle surge during the Euro 2020 match. When millions of simulated tea kettles were about to turn on, the AI cluster ramped down its power use and absorbed the surge without disrupting high-priority workloads.

"With this technology, AI factories become friendly and helpful grid assets. Simultaneously, the AI factories get connected much faster to the grid because they can tap into existing power grids," said Varun Sivaram, founder and CEO of Emerald AI.

What Were the Real-World Results from the London Test?

The London demonstration went beyond previous trials conducted in the United States. Researchers tested not just GPU power consumption, but also CPU (central processing unit) power use and the total energy draw of all IT equipment in the cluster. Over the course of the experiment, the Emerald AI Conductor achieved 100% alignment with over 200 power targets that EPRI and National Grid instructed the cluster to follow.
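Emerald AI hasn't published the Conductor's internals, but the core scheduling idea behind hitting operator-issued power targets — keep critical jobs at full power and shrink flexible ones into whatever headroom remains — can be sketched in a few lines of Python. The job names, priority labels, and the 50% throttle floor below are illustrative assumptions, not the platform's actual policy.

```python
def plan_power_caps(jobs, site_target_w):
    """Assign per-job power caps so total site draw meets a grid target.

    `jobs` maps job name -> (priority, current_draw_w). High-priority jobs
    keep their full draw; flexible jobs are scaled down proportionally to
    fit the remaining headroom, but never below a 50% floor so they keep
    making progress.
    """
    critical = {n: d for n, (p, d) in jobs.items() if p == "high"}
    flexible = {n: d for n, (p, d) in jobs.items() if p != "high"}
    headroom = site_target_w - sum(critical.values())
    flex_total = sum(flexible.values())
    # Scale flexible jobs into the headroom; clamp to [0.5, 1.0] of current draw.
    scale = 1.0 if flex_total == 0 else max(0.5, min(1.0, headroom / flex_total))
    caps = dict(critical)
    caps.update({n: d * scale for n, d in flexible.items()})
    return caps

# Simulated grid stress event: the operator asks the site to drop to 300 kW.
jobs = {
    "inference-prod": ("high", 120_000),  # watts; names are hypothetical
    "training-run-a": ("flex", 150_000),
    "batch-eval":     ("flex", 100_000),
}
caps = plan_power_caps(jobs, site_target_w=300_000)
```

Here the high-priority inference job keeps its full 120 kW while the two flexible jobs are scaled to 72% of their draw, bringing the site exactly to the 300 kW target. A real controller would also have to enforce these caps on the hardware and verify them against measured telemetry.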
This level of precision matters because it proves the technology can work reliably in real-world grid conditions. Steve Smith, group chief strategy officer of National Grid, emphasized the significance of testing the entire system, not just individual components.

"We did tests that go beyond the ones that have been done so far in the U.S. because we tested not just the GPUs, but also the CPUs and everything that sits around it, as well as the total power consumption of the IT equipment. We've proved the value that this technology brings," Smith explained.

Steps to Deploy Power-Flexible AI Infrastructure

- Install Real-Time Power Monitoring: Deploy tools such as NVIDIA's System Management Interface to collect GPU power telemetry at second-level intervals, enabling the platform to respond quickly to grid signals and adjust consumption in real time.
- Implement Intelligent Workload Prioritization: Use the Emerald AI Conductor Platform to classify AI jobs by priority, allowing flexible tasks to be temporarily slowed while critical workloads maintain peak performance during grid stress events.
- Coordinate with Grid Operators: Establish communication protocols with local grid operators so they can send power-reduction signals to the data center, and the facility can confirm its ability to meet those targets reliably.

How Does This Benefit Consumers and the Economy?

The immediate benefit for everyday consumers is lower electricity rates. By using existing grid capacity more efficiently, grid operators no longer need to overbuild permanent infrastructure to handle worst-case demand peaks. This reduces the capital costs that utilities pass along to ratepayers.

For AI companies and data center operators, the benefit is faster grid connections.
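The monitoring step described above relies on per-GPU power telemetry, which NVIDIA's System Management Interface (`nvidia-smi`) can print as plain CSV from the command line. Below is a minimal sketch of polling and parsing it; the `sample_output` string is fabricated for illustration so the parser can be exercised on machines without GPUs.

```python
import subprocess

def read_gpu_power_draw(sample_output=None):
    """Return a list of per-GPU power draws in watts.

    Runs `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits`,
    which prints one wattage per line, one line per GPU. Passing
    `sample_output` skips the subprocess call so the parsing logic can be
    tested without GPU hardware.
    """
    if sample_output is None:
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    return [float(line) for line in sample_output.splitlines() if line.strip()]

# Fabricated telemetry for a hypothetical 4-GPU node.
draws = read_gpu_power_draw(sample_output="612.4\n598.1\n605.7\n620.0\n")
total_w = sum(draws)
```

A control loop would poll this every few seconds, compare `total_w` against the current grid target, and throttle flexible jobs whenever the site runs over budget.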
Instead of waiting years for infrastructure upgrades, power-flexible AI factories can connect to existing grids much more quickly, accelerating deployment and economic growth.

In the United Kingdom, this advantage is particularly significant. London's power grid constantly works to meet growing energy demands, but grid operators face a key bottleneck: infrastructure constraints that slow connections for large new customers. Power-flexible AI factories could unlock new economic opportunities by allowing hyperscalers and AI companies to establish operations without waiting for massive grid upgrades.

"We have enormous skills and potential in AI. We're never going to be on the scale of the U.S. in terms of data centers, but relative to the size of the country, we could be, and we're certainly seeing that interest from many of the hyperscalers. So, it gives us the opportunity to play our part as National Grid in helping unlock that economic growth for the country," noted Steve Smith.

What's Next for Power-Flexible AI Infrastructure?

Emerald AI and NVIDIA have completed four successful demonstrations of power-flexible AI factories. The next phase involves real-world deployment with the Aurora AI Factory in Virginia, which is set to open this year. This facility will operate under actual grid conditions, not simulated scenarios, providing crucial data on how the technology performs when managing real demand fluctuations and supply constraints.

The success of these trials suggests that power flexibility could become a standard feature of future AI infrastructure. As AI data centers continue to proliferate globally, the ability to adjust power consumption in real time could transform how grids manage demand, reduce infrastructure costs, and accelerate the deployment of AI computing capacity in regions with constrained electrical systems.