A groundbreaking study published in the Proceedings of the National Academy of Sciences suggests that quantum computers may have a fundamental ceiling that prevents them from ever breaking modern encryption or delivering the exponential speedups long promised by the field. The research, led by Tim Palmer of the University of Oxford, introduces a framework called Rational Quantum Mechanics (RaQM) that could reshape how scientists think about quantum computing's future.

What Does This New Theory Actually Claim?

The study proposes that quantum systems don't operate with infinite mathematical flexibility as currently assumed. Instead, RaQM suggests that the underlying mathematics of quantum mechanics is fundamentally discrete rather than continuous, imposing strict limits on how much information a quantum system can encode. According to the research, this translates into a maximum number of qubits (quantum bits, the basic units of quantum information) that can be meaningfully used in computation.

The estimates are striking: the theory suggests a practical upper bound of roughly 200 to 400 qubits for current technologies, with an absolute ceiling of around 1,000 qubits under any physical implementation. Beyond that threshold, quantum computers would lose their computational advantage, even if engineers successfully build larger and more stable machines.

To understand why this matters, consider how quantum computing is supposed to work. In conventional quantum mechanics, a system of N qubits can exist in a superposition spanning 2^N basis states, a number that grows exponentially with N. This exponential scaling is the entire foundation of quantum computing's promise, enabling Shor's algorithm to factor large numbers far faster than any known classical method. RaQM challenges this by introducing a concept called "qubit information capacity" that grows only linearly with the number of qubits, not exponentially (the first sketch at the end of this section makes the contrast concrete).

Why Should You Care About RSA Encryption?

One of the most widely cited threats from quantum computing is its potential to break RSA encryption, the cryptographic system protecting everything from your bank account to government secrets. Shor's algorithm, a quantum method for factoring large integers, has long been considered the "killer app" that would make quantum computers a national security concern.

The new study directly addresses this scenario, arguing that a quantum computer capable of factoring a 2,048-bit RSA key would require more qubits than the proposed limit allows (see the second sketch below). If RaQM is correct, this means RSA encryption may remain secure not because of technological barriers, but because of fundamental physical constraints built into the universe itself. This would be a dramatic reversal of decades of cryptography research focused on preparing for the "quantum threat".
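To make the gap between these two scaling laws concrete, here is a minimal sketch. The linear rule below is an illustrative stand-in for RaQM's "qubit information capacity" (the paper's exact expression may differ), contrasted with the 2^N state-space dimension of standard quantum mechanics:

```python
# Contrast standard quantum mechanics, where n qubits span a state space
# of dimension 2**n, with a toy linear capacity in the spirit of RaQM.
# The linear form (c * n, with c = 1) is an illustrative assumption,
# not the paper's exact expression.

def hilbert_space_dimension(n_qubits: int) -> int:
    """State-space dimension under standard quantum mechanics."""
    return 2 ** n_qubits

def raqm_capacity(n_qubits: int, c: int = 1) -> int:
    """Toy linear information capacity standing in for RaQM's proposal."""
    return c * n_qubits

for n in (10, 100, 400, 1000):
    print(f"{n:>5} qubits: 2**n = {hilbert_space_dimension(n):.3e}, "
          f"linear capacity = {raqm_capacity(n)}")
```

Already at 400 qubits the standard state-space dimension (about 10^120) dwarfs any linear budget, which is why the two pictures diverge so sharply at scale.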
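Applying the same bookkeeping to the RSA scenario above: one well-known circuit for Shor's algorithm, due to Beauregard, uses 2n+3 logical qubits for an n-bit modulus (other constructions trade qubits for gates, but all need at least order-n qubits). Taking the paper's roughly 1,000-qubit ceiling at face value, a few lines of arithmetic show why a 2,048-bit key would fall outside it:

```python
# Rough qubit budget for running Shor's algorithm on an n-bit RSA modulus,
# using Beauregard's 2n+3 construction (one of several; all scale linearly in n).

def shor_qubits(modulus_bits: int) -> int:
    """Logical qubits for Shor's algorithm, per Beauregard's circuit."""
    return 2 * modulus_bits + 3

RAQM_CEILING = 1_000  # the paper's proposed absolute limit (approximate)

for bits in (1024, 2048, 4096):
    needed = shor_qubits(bits)
    verdict = "within" if needed <= RAQM_CEILING else "exceeds"
    print(f"RSA-{bits}: ~{needed} qubits needed, {verdict} the proposed ceiling")
```

For RSA-2048 this comes to roughly 4,099 qubits, about four times the proposed absolute ceiling, before any error-correction overhead is counted.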
How Does This Theory Change Our Understanding of Quantum Computing?

The framework rests on a radical reframing of quantum mechanics itself. Instead of allowing quantum states to be described by numbers that can take any value with infinite precision, RaQM restricts these parameters to rational numbers, meaning fractions that can be described using a finite amount of information. This effectively replaces the smooth, continuous structure of Hilbert space (the mathematical foundation of quantum mechanics) with a granular one.

Palmer's theory further links this discretization to gravity, proposing that gravitational effects, often considered negligible in quantum systems, play a fundamental role in determining the structure of quantum state space. Using models of gravitationally induced state reduction, the study estimates the scale at which discretization becomes relevant, yielding the proposed limits on qubit capacity.

The implications extend far beyond cryptography. If RaQM is correct, the entire trajectory of the quantum computing industry would need to shift. Rather than pursuing large-scale, fault-tolerant systems capable of running complex algorithms, the focus would need to move toward targeted applications operating within the proposed limits.

What Applications Could Still Work Under These Limits?

The good news is that near-term quantum computers, often called NISQ (noisy intermediate-scale quantum) devices, would remain useful. These systems operate with relatively small numbers of qubits and are already being explored for practical applications. According to the study, RaQM and standard quantum mechanics would produce indistinguishable predictions for these smaller systems, meaning the divergence only appears at larger scales where information capacity limits become significant.

- Chemistry Simulations: Quantum computers could continue modeling molecular behavior and chemical reactions, helping researchers design new materials and drugs without hitting the proposed qubit ceiling.
- Materials Science: Exploring properties of new materials and understanding their quantum behavior remains within the bounds of what smaller quantum systems can achieve.
- Optimization Problems: Solving complex optimization challenges in logistics, finance, and manufacturing could benefit from quantum approaches without requiring massive qubit counts.

How Could Scientists Test Whether This Theory Is Real?

The study outlines a potential experimental path forward. Quantum algorithms that require maximal entanglement across many qubits, such as the quantum Fourier transform used in Shor's algorithm, would serve as a proving ground (the first sketch after this section shows why the QFT is such a demanding benchmark). If performance plateaus or degrades beyond a certain number of qubits, it could indicate the presence of the proposed limit.

The paper acknowledges that RaQM and conventional quantum mechanics are indistinguishable for small systems, which complicates efforts to test the theory in current experimental setups. However, as quantum hardware improves and systems approach hundreds of qubits in the coming years, researchers may be able to test whether performance continues to scale as predicted by standard theory or whether it hits the ceiling RaQM proposes.

Another open question concerns the role of error correction. Modern quantum computing roadmaps rely on encoding logical qubits across many physical qubits to suppress noise. RaQM suggests that increasing the number of qubits would not circumvent the fundamental limit, but this claim remains untested; the second sketch below gives a sense of the overhead involved.
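To see why the quantum Fourier transform is such a demanding benchmark, the sketch below builds its matrix directly with NumPy. This is a plain classical construction for illustration, not the paper's test protocol; the point is that the transform acts on a space whose dimension doubles with every added qubit, which is precisely the resource RaQM says cannot be fully exploited:

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Unitary matrix of the quantum Fourier transform on n_qubits.

    The matrix acts on a 2**n dimensional space, so even writing it
    down classically becomes infeasible quickly -- which is why
    QFT-heavy algorithms stress any proposed capacity limit.
    """
    dim = 2 ** n_qubits
    omega = np.exp(2j * np.pi / dim)          # primitive root of unity
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)    # entries w^(jk) / sqrt(dim)

qft = qft_matrix(3)  # 8x8 unitary for 3 qubits
print(np.allclose(qft @ qft.conj().T, np.eye(8)))  # True: it is unitary
```

At 3 qubits this is an 8x8 matrix; at the 200-400 qubit range where RaQM predicts a divergence, no classical description is possible at all, so any observed plateau would have to show up in hardware behavior rather than simulation.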
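And to put the error-correction question in numbers: a logical qubit in the surface code is commonly estimated to need about 2d^2 - 1 physical qubits at code distance d. The code distances below are illustrative assumptions, but they show how quickly fault-tolerant designs overshoot the proposed ceiling if, as RaQM implies, the limit applies to the total physical qubit count:

```python
# Back-of-envelope surface-code overhead. The 2*d**2 - 1 figure is a
# commonly quoted estimate; the code distances chosen here are
# illustrative assumptions, not values from the paper.

def physical_qubits(logical_qubits: int, distance: int) -> int:
    """Total physical qubits to encode the given logical qubits."""
    return logical_qubits * (2 * distance ** 2 - 1)

RAQM_CEILING = 1_000
LOGICAL_FOR_RSA2048 = 4099  # 2n+3 logical qubits, from the earlier sketch

for d in (3, 17, 27):
    total = physical_qubits(LOGICAL_FOR_RSA2048, d)
    print(f"code distance {d:>2}: ~{total:,} physical qubits, "
          f"about {total // RAQM_CEILING:,}x the proposed ceiling")
```

Under these assumptions even a modest code distance pushes the total into the hundreds of thousands of physical qubits, which is why RaQM's claim that more qubits cannot buy a way around the limit would, if true, undercut the standard fault-tolerance roadmap.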
What Do Quantum Computing Companies Think About This?

While the RaQM theory remains speculative and departs from widely accepted principles of quantum mechanics, the quantum computing industry continues advancing on multiple fronts. Companies like Classiq and NVIDIA are focusing on practical hybrid quantum-classical approaches that may sidestep some of these theoretical concerns. Recent benchmark tests on a financial options-pricing application using Iterative Quantum Amplitude Estimation reduced circuit synthesis and execution time for a 31-qubit circuit from 67 minutes to 2.5 minutes using a single NVIDIA A100 GPU. This hybrid approach, combining quantum processing with classical computing power, represents a pragmatic path forward regardless of whether RaQM proves correct. By tightening the connection between quantum modeling, compilation, and execution, companies are helping researchers move faster from initial concept to runnable experiments.

Steps to Understanding Quantum Computing's Future

- Monitor Experimental Results: Watch for announcements from quantum hardware companies about performance scaling as they approach 200-400 qubits; any plateauing would support RaQM's predictions.
- Follow Cryptography Standards: Keep track of post-quantum cryptography development; if RaQM gains credibility, the urgency of transitioning away from RSA may decrease significantly.
- Explore Hybrid Quantum-Classical Applications: Focus on near-term quantum applications in chemistry, materials science, and optimization that don't require massive qubit counts and could deliver practical value regardless of theoretical limits.

The theory remains speculative and departs from widely accepted principles of quantum mechanics. Standard quantum theory has been extensively validated across a broad range of experiments, and no clear evidence has yet emerged for the type of discretization proposed in RaQM. However, the paper's proposal to test these ideas experimentally means we may have answers within the next few years as quantum hardware continues to scale.

Whether RaQM ultimately proves correct or not, it serves as a crucial reminder that quantum computing's future may look quite different from the grand promises of the past decade. The field may be entering an era where practical, targeted applications matter more than chasing theoretical supremacy.