Battery storage is becoming the secret weapon for AI data centers facing power grid constraints, allowing operators to run significantly more computing workloads without expanding their grid connections. Rather than treating batteries as backup power or revenue-generating assets, a new framework called Compute Per Megawatt (CPM) reframes storage as a tool that unlocks additional computing capacity by smoothing the dramatic power spikes that occur when thousands of processors activate simultaneously.

Why Are Data Centers Suddenly Obsessed With Battery Storage?

The challenge facing AI infrastructure operators is straightforward but severe: data centers running AI workloads experience extreme power fluctuations. Thousands of graphics processing units (GPUs) can spike and drop their power consumption within seconds, creating unpredictable demand patterns that utilities struggle to accommodate. Without battery storage, a data center's grid connection must be sized for the absolute worst-case peak demand, which is expensive and often leaves projects languishing in interconnection queues awaiting utility approval.

Batteries solve this problem by absorbing those spikes, acting as a buffer between the data center's variable power needs and the utility's more stable grid supply.

"Storage doesn't make individual GPUs more efficient, but it removes the constraints that prevent the IT side from using its hardware to full potential," explained Alejandro de Diego, a market analyst at Modo Energy.

This distinction is crucial: the battery itself doesn't improve processor efficiency. Instead, it lets operators run their existing hardware at higher utilization rates without triggering grid violations. The practical implication is striking.
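As a rough illustration of the buffering mechanism, here is a minimal peak-shaving sketch (all numbers are hypothetical, not from the article): the battery discharges whenever IT load exceeds the grid contract limit and recharges from spare headroom between spikes.

```python
def simulate(load_kw, grid_limit_kw, battery_kwh, step_h=1.0 / 3600):
    """Greedy peak shaving over a per-second load profile.

    The battery covers any load above the grid limit and recharges
    from spare grid headroom when load drops back below it.
    Returns True if the grid limit is never violated.
    """
    soc_kwh = battery_kwh  # state of charge, start full
    for load in load_kw:
        if load > grid_limit_kw:
            needed = (load - grid_limit_kw) * step_h  # energy the battery must supply
            if needed > soc_kwh:
                return False  # battery empty: the spike would hit the grid
            soc_kwh -= needed
        else:
            spare = (grid_limit_kw - load) * step_h
            soc_kwh = min(battery_kwh, soc_kwh + spare)
    return True

# Hypothetical cluster: 8 MW baseline with 60-second spikes to 12 MW,
# behind a 9 MW grid contract. A ~100 kWh buffer rides through every spike.
profile = ([8_000] * 300 + [12_000] * 60) * 20
assert simulate(profile, grid_limit_kw=9_000, battery_kwh=100)
```

In this toy profile the peak is 12 MW, yet a 9 MW connection suffices because each 60-second spike only drains about 50 kWh before the battery refills, which is exactly the spike-duration-versus-battery-sizing trade-off described below.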
With a battery absorbing power spikes, operators can effectively run more compute capacity than their grid contract technically allows, depending on spike duration and battery sizing. For hyperscalers facing interconnection delays, this becomes as much a time-to-market advantage as a cost-saving measure, letting projects move forward while waiting for permanent grid upgrades.

How Can Data Centers Maximize Battery Value Beyond Backup Power?

- Power Capping on IT Systems: Facilities can limit the power drawn by each processor, allowing more processors to run on the same power budget and more AI work to be processed overall, as demonstrated by research from Lawrence Berkeley National Laboratory.
- Co-Optimization Between Storage and Compute: Battery storage acts as a buffer while IT-side power management smooths the load, letting data centers use smaller batteries and apply more aggressive power caps than would otherwise be possible.
- Grid-Aware Compute Operations: By acting as a balancing mechanism for utilities, data centers can demonstrate they are "grid-aware" assets that help manage constraints rather than exacerbate them, earning both financial returns and regulatory goodwill.

The battery technology itself is well-established; the new ingredient is how the workloads are run. "The novelty here isn't the storage, but that compute workloads can be deliberately reshaped to need less of it, while still increasing total output," noted Alejandro de Diego. This co-optimization opens a new value stream that existing arbitrage and capacity payment models don't capture.

What Changes When Batteries Become Revenue Generators?

The financial case for data center batteries shifts dramatically under the CPM framework. Traditionally, battery value is determined by what electricity costs on the grid.
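The first of those levers, power capping, is easy to sketch numerically. In this hypothetical (the cap and performance figures are assumptions for illustration, not measurements from the Berkeley research), capping each GPU at 80% power while retaining roughly 95% of its performance lets more GPUs share the same budget:

```python
def fleet_throughput(budget_kw, gpu_max_kw, cap_frac, perf_at_cap):
    """Relative throughput of a GPU fleet sized to a fixed power budget.

    cap_frac:    fraction of max power each GPU is limited to (1.0 = no cap)
    perf_at_cap: fraction of full performance delivered under that cap
    """
    n_gpus = int(budget_kw / (gpu_max_kw * cap_frac))  # more GPUs fit when capped
    return n_gpus * perf_at_cap

# Hypothetical 10 MW budget and 1 kW GPUs.
uncapped = fleet_throughput(10_000, 1.0, 1.0, 1.0)   # 10,000 GPUs at full speed
capped = fleet_throughput(10_000, 1.0, 0.8, 0.95)    # 12,500 GPUs at 95% speed
# capped / uncapped ≈ 1.19: roughly 19% more work from the same power budget
```

The real cap-versus-performance trade-off varies by workload, but the arithmetic shows why capping pairs well with storage: the cap flattens the load the battery must absorb while raising total output.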
Under CPM, it is determined by what the computation the battery enables is worth. "The battery's value is no longer determined by what the electricity is worth on the grid, but by what the computation it enables is worth. That distinction changes the investment case entirely," stated Alejandro de Diego.

This reframing means batteries can generate "far more revenue per megawatt-hour" for operators than traditional grid participation models allow. Capturing that value, however, requires breaking down organizational silos that have stood for decades: most data centers treat IT workloads and facility infrastructure as separate cost centers with different incentive structures.

"Existing platforms often treat the 'White Space' (IT workloads) and 'Gray Space' (facilities and storage) as separate silos," explained Wannie Park, founder and CEO of Pado AI, a software-as-a-service startup managing distributed energy resources for data centers. Bridging the gap between revenue-driven IT teams and cost-focused facility teams is essential to realizing the full potential of battery-enabled computing.

The stakes of getting this right are high. If a hyperscaler deploys storage to unlock additional compute capacity but ends up earning revenue from wholesale electricity markets instead, the original investment thesis has failed: the financial case was built on the value of computation, not grid participation. Wholesale market revenue isn't inherently bad, but it signals that something has gone wrong with the compute story that justified the battery investment in the first place.

How Are Utilities Responding to Grid-Aware Data Centers?

Utilities are increasingly recognizing data centers with battery storage as potential solutions rather than problems.
"Utilities are facing a 'grid power wall' and are increasingly hungry for flexible resources that can mitigate the massive load growth from AI," said Wannie Park.

This shift in perspective creates opportunities for data centers to participate in grid management while simultaneously unlocking additional computing capacity. The regulatory and technical barriers to wholesale market participation remain significant, however, making CPM a more realistic path forward for most large-load users.

"With the right orchestration, [storage] isn't just a cost-center for backup. It is a revenue-generating asset that maximizes [a data center's] CPM and pays for itself through energy market participation," added Park.

This approach lets batteries serve several functions at once: enabling additional compute, stabilizing the grid, and earning revenue from energy markets.

As AI infrastructure continues to expand globally, the battery-storage-as-compute-enabler model represents a fundamental shift in how hyperscalers approach power constraints. Rather than waiting years for grid upgrades or relocating to regions with abundant power, operators can deploy batteries to unlock hidden capacity within their existing power budgets, accelerating time-to-market while helping utilities manage the unprecedented load growth from artificial intelligence workloads.
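To make the scale of the CPM reframing concrete, here is a closing back-of-envelope comparison of one day of grid arbitrage against one day of the compute a buffer-sized battery might unlock. Every figure below is a hypothetical chosen for illustration; real spreads, rental rates, and unlocked capacity vary widely.

```python
# Back-of-envelope CPM comparison. All figures are hypothetical.

# Traditional view: the battery earns the grid price spread.
battery_mwh = 10                 # usable energy per daily cycle
price_spread = 50                # $/MWh buy-low / sell-high spread
arbitrage_per_day = battery_mwh * price_spread  # one cycle per day

# CPM view: the battery is valued by the compute it enables.
extra_compute_mw = 2             # IT load unlocked by spike buffering
gpus_per_mw = 1000               # assuming roughly 1 kW per GPU
revenue_per_gpu_hour = 2.0       # $ per GPU-hour of rented compute
compute_per_day = extra_compute_mw * gpus_per_mw * revenue_per_gpu_hour * 24

print(arbitrage_per_day, compute_per_day)  # 500 96000.0
```

Under these assumed numbers, the enabled compute is worth orders of magnitude more per day than the arbitrage, which is the heart of the CPM investment case; the actual ratio depends entirely on utilization and market conditions.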