The Physics Hack That Could Cut AI's Energy Hunger by 1,000 Times
A startup backed by ex-Google researchers has created the world's first thermodynamic computing chip designed to run AI models with radically less power. Instead of fighting the natural vibrations in silicon that waste enormous amounts of energy, Normal Computing's CN101 chip works with that chaos, potentially unlocking efficiency gains that could reshape how we power artificial intelligence at scale.
Why Does AI Consume So Much Energy in the First Place?
Modern AI systems demand staggering amounts of computing power. Training and running large language models (LLMs), AI systems trained on vast amounts of text to understand and generate human language, consumes energy on a scale that dwarfs what nature achieves with biological intelligence. The fundamental problem lies in how conventional computer chips work.
Traditional processors, whether central processing units (CPUs) or graphics processing units (GPUs), are built on a strictly digital architecture. They perform calculations using billions of tiny switches that are either on (representing 1) or off (representing 0). To maintain perfect precision and avoid errors from electrical noise, these chips expend enormous energy fighting against the natural, chaotic vibrations of atoms at the quantum level. This constant battle against physics itself is where much of the power consumption comes from.
"Modern day AI requires orders of magnitude more energy than the intelligence afforded to us through nature. We know highly efficient intelligence is possible, we just don't know how to build it," explained Suraj Brahmavar, Programme Director at ARIA (Advanced Research and Invention Agency).
How Does Thermodynamic Computing Actually Work?
Normal Computing's approach inverts this entire paradigm. Rather than suppressing the natural randomness in silicon chips, the company's engineers leverage it as a computational resource. This approach, called stochastic computing, treats noise not as an enemy but as a tool. By working with the randomness instead of against it, the chips can dramatically reduce power density and consumption while avoiding the temperature limits that plague conventional processors.
The CN101 chip, unveiled in June 2025, represents the first physical proof that this concept works. Designed specifically for multi-modal diffusion generative AI (GenAI) model inference, which is the process of running trained AI models to generate images, videos, or other content, the chip demonstrated that thermodynamic computing could produce usable results. When the team ran their first test in August 2025, the chip successfully generated data output, marking a critical validation of the entire approach.
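Normal Computing has not published CN101's programming interface, but the general shape of diffusion-model inference is well established: generation starts from pure random noise, and a trained model repeatedly denoises it toward plausible output. A toy sketch of that loop, with a hypothetical `toy_denoiser` standing in for the trained model:

```python
import random

def toy_denoiser(x, step, total_steps):
    """Stand-in for a trained diffusion model: nudges the noisy
    sample toward a target value (here, 1.0). Purely illustrative."""
    target = 1.0
    return x + (target - x) / (total_steps - step)

rng = random.Random(0)
x = rng.gauss(0.0, 1.0)   # inference begins from pure Gaussian noise
steps = 50
for t in range(steps):     # each pass removes a little more noise
    x = toy_denoiser(x, t, steps)
print(round(x, 3))         # converges to the target value
```

A real diffusion model replaces `toy_denoiser` with a large neural network evaluated once per step, which is why inference is so compute-hungry and why hardware-level efficiency gains matter.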
"Stochastic computing works with a modelling of the noise in silicon chips that we're normally trying to suppress. That suppression is where a lot of the power cost comes from. We work with the randomness and that gives us an ability to reduce power density and consumption so that we can avoid hitting these maximum temperature limits," noted Marc Bright, Silicon Team Lead at Normal Computing.
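Normal Computing has not disclosed CN101's internals, but the textbook form of stochastic computing conveys the core idea of computing with randomness: a number in [0, 1] is encoded as the probability of seeing a 1 in a random bitstream, and multiplication then reduces to a cheap bitwise AND of two independent streams. A minimal sketch:

```python
import random

def encode(p, n, rng):
    """Encode a value p in [0, 1] as a bitstream of length n:
    each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def decode(bits):
    """Recover the encoded value as the fraction of 1s."""
    return sum(bits) / len(bits)

rng = random.Random(42)
n = 100_000
a = encode(0.6, n, rng)
b = encode(0.5, n, rng)

# Multiplication is just AND: P(a_i AND b_i = 1) = 0.6 * 0.5 = 0.3
product = [x & y for x, y in zip(a, b)]
print(round(decode(product), 3))  # close to 0.3
```

The trade is precision for simplicity: longer bitstreams give more accurate answers, while the logic itself stays tiny. Whether CN101 uses this exact encoding is not stated in public materials; the sketch only illustrates the family of techniques the quote refers to.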
What Makes This Different From Other AI Hardware Advances?
The thermodynamic computing approach differs fundamentally from incremental improvements in conventional chip design. Rather than squeezing more efficiency from the existing digital paradigm, Normal Computing is questioning core principles of how computers work. The company's roadmap targets efficiency gains of up to 1,000 times, along with dramatically lower latency, which is the time it takes for a system to respond to a request.
The team's technical success has already attracted significant commercial validation. Normal Computing recently announced $50 million in strategic funding led by the Samsung Catalyst Fund, bringing its total raised to more than $85 million. New investors include Galvanize, an energy-focused investment firm founded by Tom Steyer and Katie Hall that is making its first semiconductor investment, alongside existing backers such as Eric Schmidt's First Spark Ventures and Micron Ventures.
Understanding Thermodynamic Computing's Potential Impact
- Energy Efficiency Gains: The chip targets multiple orders of magnitude in efficiency improvements for datacenter silicon, potentially cutting the power needed to run AI inference workloads by factors of 100 to 1,000 compared with conventional approaches.
- Scalability Questions: The critical next phase involves determining whether thermodynamic systems can maintain useful computational signals as they grow larger, and identifying where noise enables computation versus where it degrades performance.
- Model and Workload Compatibility: Researchers must validate whether the same behavior holds across different AI models and workloads, ensuring the approach generalizes beyond the initial multi-modal diffusion GenAI applications.
- Commercial Pathway: The company uses its own AI systems to design successive generations of chips, with each iteration improving upon the last, creating a feedback loop that accelerates progress toward practical deployment.
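To put the roadmap's numbers in concrete terms, a quick back-of-the-envelope calculation (the baseline figure below is assumed purely for illustration, not a published benchmark):

```python
# Hypothetical baseline: energy per inference request on conventional
# hardware. 0.3 Wh per generated image is an illustrative assumption.
baseline_wh = 0.3

for factor in (100, 1_000):
    improved_wh = baseline_wh / factor
    print(f"{factor:>5}x efficiency -> {improved_wh * 1000:.1f} mWh per image")
```

At the upper end of the claimed range, a workload that once needed a dedicated power budget would shrink by three orders of magnitude, which is what makes the scalability questions above worth answering.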
What Challenges Remain Before Thermodynamic Chips Go Mainstream?
Despite the breakthrough, significant uncertainties persist. It remains unclear how far this approach can scale or whether the computational principles that worked in CN101 will hold as chips grow more complex. The distinction between where noise enables computation and where it begins to degrade performance is still being mapped out.
Additionally, startups developing novel AI hardware face a structural challenge: they lack a "shop window" to showcase their innovations to the broader market. This forces companies to develop components in isolation, requiring massive capital investment and dependency on hyperscalers, the large cloud computing companies that dominate the industry, to access computing resources for testing and validation.
"ARIA exists to fund the ideas where the potential impact is not marginal but transformational, even when the technical risk is high. Normal's team has taken a fundamentally unconventional approach and delivered working silicon in CN101. That is an exceptionally rare outcome for work this ambitious and we are excited to witness this next phase of the journey," stated Suraj Brahmavar.
The thermodynamic computing breakthrough matters because it addresses one of AI's most pressing practical problems: energy consumption. As demand for AI inference, the computational work of running trained models on new data, continues to explode, the power requirements threaten to become economically and environmentally unsustainable. Normal Computing's CN101 chip suggests that a fundamentally different approach to computing architecture, grounded in physics rather than conventional digital logic, might offer a path forward. The next 12 to 24 months will be critical in determining whether this early success scales to the level needed for real-world deployment in data centers worldwide.