Nvidia and Emerald AI are collaborating with six major US energy companies to develop AI data centers that can adjust their electricity usage in real time to support grid stability, addressing a critical bottleneck in AI infrastructure deployment. The initiative, announced at CERAWeek in Houston, represents a fundamental shift in how the tech industry approaches the massive power demands of artificial intelligence workloads.

Why Are Energy Companies Getting Involved in AI Data Center Design?

The explosive growth of AI has created unprecedented electricity demand that conventional power grids struggle to meet. Traditional data center interconnection timelines have fallen far behind the pace of AI investment, forcing companies to seek alternative solutions. Rather than building fully off-grid facilities, the new model keeps AI data centers connected to the broader power system while allowing them to operate as flexible assets that respond to grid conditions.

The partnership brings together AES, Constellation, Invenergy, NextEra Energy, Nscale Energy and Power, and Vistra alongside Nvidia and Emerald AI. These energy providers will evaluate generation and infrastructure configurations to support the AI factory model, including hybrid setups that combine grid access with onsite power generation and battery storage.

How Do These Grid-Flexible AI Data Centers Actually Work?

- Vera Rubin DSX Architecture: Nvidia's reference design includes DSX Flex software that lets facilities adjust electricity usage in response to real-time grid conditions and demand signals.
- Emerald AI's Conductor Platform: This coordination system matches computing workloads with onsite energy resources such as generation and battery storage, balancing power demand while maintaining performance for AI operations.
- Flexible Power Supply Strategy: AI facilities initially draw power through co-located generation and storage, then connect more fully to the grid as capacity becomes available, and can supply electricity back to the grid during peak demand periods.

This approach differs fundamentally from fully off-grid data center models. By keeping energy assets connected and responsive to broader system needs, the facilities can improve utilization of existing infrastructure and reduce the need for expensive grid expansion. AI data centers can scale consumption up or down during periods of grid stress, essentially acting as flexible loads that help stabilize the entire power system.

Nvidia and Emerald AI have already tested elements of this concept at five commercial data centers globally over the past year. Commercial-scale deployment of the DSX Flex system is expected to begin later this year at an Nvidia AI research facility in Virginia, with plans to identify and advance additional projects using the architecture.

"AI factories are the engines of the intelligence era, and like any great engine, every system must be designed together: energy, compute, networking and cooling as one architecture," said Jensen Huang, Nvidia CEO.

What Problem Does This Actually Solve?

The electricity demand from AI workloads has become a genuine constraint on infrastructure expansion. Some projects have turned to dedicated generation to speed deployment, but that approach can limit broader system benefits if the power assets remain disconnected from the grid. The new model addresses this by aligning flexible operations with new generation capacity, helping meet rising electricity demand while maintaining system reliability.

By enabling AI facilities to draw power quickly through co-located resources and then integrate with the grid as capacity becomes available, the approach solves two problems simultaneously.
It shortens the time required to bring data center capacity online, which has been a major bottleneck for AI companies. At the same time, it supports grid reliability by creating facilities that can absorb or release power based on system needs rather than operating as fixed, inflexible loads.

The collaboration reflects growing recognition that conventional approaches to powering AI infrastructure are unsustainable. Rather than forcing the grid to expand indefinitely to meet AI demand, this model makes AI infrastructure itself part of the solution to grid management. The participating energy providers indicated that this approach could help them meet rising electricity demand from AI while maintaining system reliability across their service territories.

As AI continues to consume more electricity, this kind of integrated design between energy systems and computing infrastructure may become the standard rather than the exception. The announcement signals that solving AI's power problem requires collaboration between technology companies and the energy sector, not competition between them.
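To make the flexible-load idea described above concrete, the following sketch shows how a data center controller might decide how much grid power to draw based on a grid stress signal, throttling deferrable AI workloads and leaning on onsite battery capacity during stress events. This is purely illustrative: it is not Nvidia's DSX Flex or Emerald AI's Conductor (whose interfaces are not public), and every name, threshold, and number here is hypothetical.

```python
# Illustrative sketch of a grid-flexible data center power controller.
# All names and thresholds are hypothetical, not a real product API.
from dataclasses import dataclass


@dataclass
class GridSignal:
    stress: float         # 0.0 = normal conditions, 1.0 = severe grid stress
    price_usd_mwh: float  # real-time electricity price


def plan_power(signal: GridSignal, site_cap_mw: float,
               battery_mw: float, min_critical_mw: float) -> dict:
    """Decide grid draw and compute power for the current interval.

    Under severe stress, shed deferrable training jobs and cover
    critical work from the battery; under moderate stress, cap grid
    draw and fill the gap with storage; otherwise run at full power.
    """
    if signal.stress >= 0.7:
        # Severe stress: only critical load, battery covers what it can.
        grid_draw = max(min_critical_mw - battery_mw, 0.0)
        compute_mw = min_critical_mw
    elif signal.stress >= 0.3:
        # Moderate stress: cap grid draw at 60% of site capacity.
        grid_draw = 0.6 * site_cap_mw
        compute_mw = min(site_cap_mw, grid_draw + battery_mw)
    else:
        # Normal conditions: full draw from the grid.
        grid_draw = site_cap_mw
        compute_mw = site_cap_mw
    return {"grid_draw_mw": round(grid_draw, 1),
            "compute_mw": round(compute_mw, 1)}


print(plan_power(GridSignal(stress=0.9, price_usd_mwh=250.0),
                 site_cap_mw=100.0, battery_mw=30.0, min_critical_mw=40.0))
# -> {'grid_draw_mw': 10.0, 'compute_mw': 40.0}
```

In this toy scenario the facility drops from a 100 MW grid draw to 10 MW during a stress event, keeping 40 MW of critical compute running with 30 MW supplied by onsite storage, which is the "flexible load" behavior the article describes.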