Quantum Infeasibility (?)
AI;DR: (AI; Didn't Read) -- I doubt that Cryptographically Relevant Quantum Computing will be developed within the next 50 years. I have read a decent amount of other people's thoughts about this topic, but was a bit too lazy to synthesize all the notes I had collected into a single coherent essay myself. So I outsourced my thinking to a machine.
The content below was written 95% by ChatGPT 5. Latest Addenda by Gemini Pro 3.1.
Satoshi's coins won't be stolen within your lifetime, most likely.
The concept of quantum computing has captured scientific and public attention for decades. It promises a new form of computation based on the principles of quantum mechanics, where information is encoded in quantum bits known as qubits. These qubits can exist in superposition and can become entangled, allowing quantum computers to perform certain tasks faster than classical computers. However, the practical development of large scale quantum computers that could break modern encryption remains far from reality.
Many experts and organizations have invested heavily in quantum research. Companies like Google, IBM, Microsoft, and Amazon have all announced progress in creating small prototypes. Some predict useful quantum machines will appear within a few years, showing advantages in chemistry, materials science, or optimization. Yet, most of these claims refer to near term or hybrid systems and not to the large fault tolerant quantum computers necessary for breaking strong encryption like RSA 2048.
The major technical challenge lies in maintaining quantum coherence. Quantum states are fragile and lose coherence due to interaction with their environment. Maintaining coherence while operating on thousands or millions of qubits requires conditions of extreme precision and isolation. No experiment so far has achieved the stability needed to run deep quantum algorithms over long durations.
Another obstacle is error correction. Quantum error correction methods require many additional physical qubits to form a single logical qubit that can resist noise. Estimates suggest that millions of physical qubits would be needed to factor large integers such as those used in cryptographic keys. Currently, the largest systems contain only a few thousand noisy qubits, none of which form a fully error corrected logical unit.
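To make that overhead concrete, here is a toy calculation in the spirit of standard surface-code estimates. The 1% threshold, the 0.1 prefactor, and the ~2d^2 qubit count per patch are common textbook approximations, not figures from any specific paper, and the target error budget is an illustrative assumption.

```python
def logical_error_rate(p_phys, distance, p_threshold=1e-2, prefactor=0.1):
    """Rough logical error rate per cycle for a distance-d surface code."""
    return prefactor * (p_phys / p_threshold) ** ((distance + 1) / 2)

def physical_qubits_per_logical(distance):
    """Data plus measurement qubits for one surface-code patch: ~2*d^2."""
    return 2 * distance ** 2

def required_distance(p_phys, target, max_d=101):
    """Smallest (odd) code distance whose logical error rate beats target."""
    for d in range(3, max_d, 2):  # surface-code distances are odd
        if logical_error_rate(p_phys, d) < target:
            return d
    return None

# With 0.1% physical error and a ~3e-12 per-gate logical budget (the kind
# needed for billions of gates per run), each logical qubit costs hundreds
# of physical qubits -- and a cryptographically relevant machine needs
# thousands of logical qubits on top of that.
d = required_distance(p_phys=1e-3, target=3e-12)
print(d, physical_qubits_per_logical(d))
```

The point of the sketch is the multiplication: per-qubit overhead in the hundreds, times thousands of logical qubits, is how the "millions of physical qubits" figures arise.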
Gate fidelity is also a limiting factor. A quantum computer performs operations called gates that manipulate qubits. These gates must act with extremely high accuracy because small errors accumulate through a computation. Present devices achieve single gate fidelities near 99.9 percent for specific conditions, but complex circuits require far greater stability. Any error above a small threshold renders long algorithms unreliable.
The hardware itself remains a major barrier. Quantum processors rely on superconducting circuits, trapped ions, photons, or topological effects. Each platform faces unique engineering problems. Scaling any of these approaches to millions of qubits would demand unprecedented advances in materials, manufacturing, and control electronics. Maintaining such systems at cryogenic temperatures and ensuring signal integrity across large networks is far beyond today's capability.
Recent public demonstrations, such as Google's quantum supremacy experiment, show quantum devices performing specific sampling tasks faster than supercomputers. However, these tasks were not useful computations. They demonstrated hardware behavior rather than real algorithmic advantage. Such benchmarks do not imply progress toward useful quantum factoring or cryptographic attacks.
Skeptics argue that the field overstates its achievements. They point to the lack of progress in factoring even small numbers with general purpose quantum algorithms like Shor's. Since the first demonstration of factoring the number 15 in the early 2000s, no larger or fully functional implementations have appeared. Critics claim that many demonstrations involve partial compilation or prior knowledge of the solution, which invalidates them as true computations.
Theoretical estimates suggest that breaking 2048-bit RSA encryption would require on the order of millions of noisy physical qubits (equivalently, thousands of error-corrected logical qubits) and billions of error-free logical gate operations. To reach that level within fifty years (by 2075) would mean several orders of magnitude of improvement in coherence, error correction, and manufacturing. Even optimistic projections from major research groups do not foresee such capability within that time frame.
Some futurists anticipate breakthroughs in topological qubits or other exotic materials that could simplify error correction. These ideas remain speculative. No experiment has yet proven a stable topological qubit capable of scalable computation. If such a discovery occurred soon, new progress could follow, but nothing indicates it will arrive quickly.
A small number of scholars and engineers have questioned whether large scale quantum computation is possible in principle. They argue that real matter may not support the continuous precision required to manipulate quantum amplitudes accurately across huge state spaces. According to this view, control errors, correlated noise, and physical imperfections could make full scale quantum computing unachievable even in theory.
Most scientists remain cautiously optimistic but acknowledge that the timeline is uncertain. It is possible that small quantum devices will continue to serve in specific research fields such as quantum chemistry, materials modeling, or limited optimization tasks. These systems may offer valuable insights into physics itself but will not threaten public key cryptography.
If progress continues at the present pace, useful commercial quantum computers may appear in limited roles within two or three decades. Cryptographically relevant quantum computers, capable of undermining secure communications, are likely much farther in the future, if they ever arrive. The resources required to construct such machines are beyond foreseeable technology.
In summary, the expectations placed on quantum computing are far greater than what current science supports. Despite major investments and periodic announcements of progress, every significant barrier remains unresolved. Unless new physical phenomena or manufacturing methods are discovered, the dream of a machine that can break RSA 2048 encryption appears infeasible within our lifetime.
Sources:
- https://scottlocklin.wordpress.com/2019/01/15/quantum-computing-as-a-field-is-obvious-bullshit
- https://mattdf.xyz/why-quantum-computing-will-take-another-50-years
- https://arxiv.org/abs/2211.07629
- https://spectrum.ieee.org/the-case-against-quantum-computing
- https://www.reuters.com/technology/google-says-commercial-quantum-computing-applications-arriving-within-five-years-2025-02-05
- https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/a-game-plan-for-quantum-computing
- https://www.bcg.com/publications/2021/building-quantum-advantage
- https://cacm.acm.org/news/a-quantum-leap-forward
- https://arxiv.org/abs/2303.04061
Addendum: March, 2026
Recent work has moved the benchmark beyond the old "15 or 21" examples, but not in a way that materially changes the conclusion about Cryptographically Relevant Quantum Computers. A 2024 paper on quantum annealing reported the largest factorization problems yet encoded on a D-Wave annealer, up to a (21 x 12)-bit multiplier, and actual experimental runs on available hardware reached instances corresponding to numbers as large as (33,423,105 = 131,071 x 255), though this was done by mapping factoring into a specialized optimization problem rather than by running Shor's algorithm on a fault tolerant gate based quantum computer. A separate 2024 preprint then pushed the announced annealing result to a 29-bit instance using a hybrid classical plus quantum approach, explicitly noting that this does not change the problem's complexity class and mainly improves the practical packaging of small subproblems for present day annealers. In other words, the field has demonstrated better encodings, better embeddings, and somewhat larger toy demonstrations, but not the kind of scalable quantum factoring method that would threaten RSA 2048 in practice. (nature.com)
Over the next 25 to 50 years, the most reasonable expectation is that this line of work could produce stronger heuristic tools for optimization, better hybrid attacks on small or specially structured arithmetic problems, and perhaps useful preprocessing components inside larger classical cryptanalytic workflows. What it does not presently suggest is a credible annealing path to breaking large public keys at Internet scale. The real cryptographic threat still comes from fault tolerant gate model machines running Shor-type algorithms, and even there the discussion remains about enormous error corrected systems far beyond today's hardware. So this new work is best understood as evidence that quantum annealing is becoming more technically sophisticated, not as evidence that Bitcoin Satoshi-era keys or RSA 2048 are about to become factorable. (arxiv.org)
References:
- Ding, Spallitta, Sebastiani et al., "Effective prime factorization via quantum annealing by modular locally-structured embedding," Scientific Reports (2024). (nature.com)
- "Maximizing the practical achievability of quantum annealing attacks on factorization-based cryptography," arXiv:2410.04956 (2024). (arxiv.org)
- Craig Gidney, "How to factor 2048 bit RSA integers with less than a million noisy qubits," arXiv:2505.15917 (2025). (arxiv.org)
Addendum II: Late March, 2026
While the early March 2026 quantum annealing updates did not threaten RSA-2048, a separate wave of gate-model breakthroughs published just weeks later now forces a material reassessment of my 50-year timeline. The existential threat to cryptography has shifted entirely away from RSA and onto Elliptic Curve Cryptography (ECC-256), the standard that secures Bitcoin and Ethereum.
Because ECC relies on much smaller keys for the same level of classical security, it presents a drastically smaller quantum target. As researchers recently summarized, "ECC-256 requires roughly 100x fewer quantum operations to break than RSA-2048 at the same classical security level" [1].
By combining this smaller algorithmic footprint with new high-rate quantum error-correcting codes, researchers have shattered the "millions of physical qubits" assumption. A theoretical hardware architecture proposed by researchers from Oratomic, Caltech, and UC Berkeley demonstrates that "Shor's algorithm can be executed at cryptographically relevant scales with as few as 10,000 reconfigurable atomic qubits" [1]. They project that by scaling up slightly to roughly 26,000 neutral-atom physical qubits, the algorithm could break ECC-256 in about 10 days [1].
Simultaneously, a whitepaper from Google Quantum AI mapped the same vulnerabilities for superconducting architectures, showing that their circuits "can execute on a superconducting qubit CRQC with fewer than 500,000 physical qubits in a few minutes" [2].
However, before concluding that Satoshi's coins will be stolen tomorrow, we must contextualize these numbers within the strict physical and temporal realities of a cryptographic attack. There is a massive distinction between the time required to attack a dormant wallet versus the time required to intercept a live network transaction.
The Mempool "On-Spend" Window
To steal a live Bitcoin transaction, an attacker must execute an "on-spend" attack. When a user broadcasts a transaction, it sits unconfirmed in the public mempool for roughly 10 minutes before miners record it to the blockchain. During this window, the sender's public key is exposed. To intercept the funds, a quantum attacker must derive the private key, forge a redirect transaction, and submit it with a higher miner fee, all before the 10-minute clock expires.
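The race reduces to one inequality: key derivation plus forging plus broadcast must fit inside the confirmation window. A minimal sketch, where every timing is an assumption (in reality confirmation times are random, and the first block can arrive much sooner than 10 minutes):

```python
BLOCK_INTERVAL_S = 600  # average Bitcoin confirmation time, ~10 minutes

def on_spend_attack_feasible(key_derivation_s, forge_and_broadcast_s=30):
    """True only if the attacker's whole pipeline fits in one block window."""
    return key_derivation_s + forge_and_broadcast_s < BLOCK_INTERVAL_S

print(on_spend_attack_feasible(8 * 60))       # hypothetical 8-minute Shor run
print(on_spend_attack_feasible(10 * 86_400))  # a 10-day slow-clock machine
```

The second case is the entire argument of the next section: a machine whose Shor runtime is measured in days is simply not a participant in this race.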
The Google whitepaper correctly identifies that only "fast-clock" architectures (like superconducting qubits) could act swiftly enough to execute these real-time attacks. Google estimates that "the first fast-clock CRQCs would enable 'on-spend' attacks on public mempool transactions" [2]. But crucially, achieving this speed requires abandoning high-compression, low-qubit algorithms. A January 2026 study by Kim et al. analyzed circuits optimized specifically for execution speed, finding that breaking an elliptic curve key in 34 minutes would require 19.1 million physical qubits, while a slower 96-minute run would still require roughly 6.9 million [8]. A multi-million-qubit array therefore looks like the minimum barrier to entry for the speed required to perform a mempool heist.
Homing in on "Slow-Clock Architecture" Targets
Conversely, the much smaller 26,000-qubit neutral-atom system proposed by Oratomic is a "slow-clock" architecture. Measuring its stabilizer cycles takes roughly 1 millisecond per step, extending the time to break a key out to 10 days [1]. A machine that takes a week and a half to solve a key cannot perform a 10-minute mempool heist. Its only viable targets would be exposing the keys of "dormant digital assets" with already-known public keys, such as early-era Satoshi wallets.
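The clock-speed gap is linear arithmetic: total runtime is cycle count times cycle time. A sketch with an illustrative cycle count chosen to land near the quoted 10-day figure (the count itself is an assumption, not from the paper):

```python
def runtime_days(n_cycles, cycle_time_s):
    """Wall-clock days to execute n error-correction cycles serially."""
    return n_cycles * cycle_time_s / 86_400

N_CYCLES = 8.64e8  # illustrative count consistent with the 10-day figure

print(runtime_days(N_CYCLES, 1e-3))          # 1 ms neutral-atom cycle: ~10 days
print(runtime_days(N_CYCLES, 1e-6) * 1_440)  # 1 us superconducting: ~minutes
```

The same logical workload swings from days to minutes purely on cycle time, which is why the whitepapers treat "fast-clock" and "slow-clock" machines as qualitatively different threats.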
The Physics of Perfect Integration
Finally, we must distinguish theoretical minimums from functional hardware. Neutral atom arrays trapping upwards of 6,100 physical qubits already exist in laboratories today [1]. But perfectly integrating these innovations with deep-circuit coherence on a reliable, stable, fault-tolerant 26,000 qubit system is a monumental leap.
We know from a recent EUROCRYPT 2026 paper by the INRIA Rennes team (Chevignard et al.) that minimizing the qubit footprint requires a massive increase in the sheer volume of operations. They showed that the footprint for ECC-256 can be shrunk to just 1,098 logical qubits, but at the cost of executing upwards of 2^38.10 Toffoli gates [9]. Running Shor's algorithm at that depth for 10 consecutive days requires the quantum computer to execute billions of logical operations without a single uncorrectable error. The authors of the Oratomic paper openly acknowledge this gap, stating that "substantial engineering challenges remain" to integrate continuous large-scale trapping, universal operations, and high-rate magic state generation into one unified apparatus [1].
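A crude union bound makes that reliability requirement explicit: to keep the whole run's failure probability acceptable, the per-gate logical error rate must stay below roughly (failure budget) / (gate count). The 50% budget below is an arbitrary illustrative choice.

```python
n_toffoli = 2 ** 38   # close to the 2^38.10 Toffoli count cited above
failure_budget = 0.5  # accept a coin-flip chance that the whole run fails

# Union bound: if each gate fails with probability p, a run of n gates
# fails with probability at most n * p, so we need p < budget / n.
max_logical_error_per_gate = failure_budget / n_toffoli
print(f"{max_logical_error_per_gate:.1e}")  # per-gate budget ~1.8e-12
```

A logical error rate around 10^-12 per gate is many orders of magnitude below anything demonstrated on real hardware, which is the quantitative content of "substantial engineering challenges remain".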
Takeaways
My previous 50-year estimate is now officially dead. The theoretical barrier to breaking ECC-256 has dropped from millions of qubits to tens of thousands, placing the required hardware scale potentially within a 10-to-15 year horizon. However, maintaining fault tolerance for days on end to execute billions of gates and rob a dormant wallet, let alone building a multi-million physical qubit superconducting machine to rob in-flight transactions within 10 minutes, keeps the practical, existential threat to the crypto economy safely outside the near-term future.
Ranking of Bitcoin output and address types for long-term storage with quantum security in mind, assuming NO ADDRESS REUSE
Least suitable for quantum-conscious cold storage:
- Pay-to-Public-Key (P2PK) -- Prefix: none -- This early Bitcoin output type locks funds directly to a raw public key. If cryptographically relevant quantum computers are ever built, outputs of this type would be direct targets for Shor's algorithm because the public key is already exposed on-chain.
- Pay-to-Taproot (P2TR) -- Prefix: bc1p -- Taproot outputs commit directly to a tweaked x-only public key rather than hiding it behind a hash until spend. That means dormant unspent outputs would also be direct targets for a future Shor-capable quantum attacker, making P2TR less attractive for very long-term cold storage if quantum risk is a primary concern.
Most suitable for quantum-conscious cold storage:
- Pay-to-Public-Key-Hash (P2PKH) -- Prefix: 1 -- This legacy format hides the public key behind a 160-bit HASH160 commitment until the output is spent. That public-key-hiding property is the main quantum-relevant advantage for long-term storage.
- Pay-to-Script-Hash (P2SH) -- Prefix: 3 -- Commonly used for older multisig and script-based wallets, P2SH hides the redeem script behind a 160-bit HASH160 commitment until spend. As with other hash-committed outputs, its primary quantum advantage is that spending details are not revealed while the coins remain dormant.
- Pay-to-Witness-Public-Key-Hash (P2WPKH) -- Prefix: bc1q -- Native SegWit single-key output type. Preserves the same key-hiding benefit as P2PKH while also improving transaction weight efficiency. For unspent outputs, the public key remains hidden until it is revealed in a spending transaction.
- Pay-to-Witness-Script-Hash (P2WSH) -- Prefix: bc1q, typically longer than P2WPKH -- Native SegWit script-based output type. Hides the witness script behind a SHA-256 commitment until spend and also benefits from SegWit's transaction-weight advantages. Because it uses a 256-bit hash commitment instead of HASH160, it has a larger theoretical margin against Grover-style preimage search. However, this does not currently create a major practical quantum-security advantage over P2PKH, P2SH, or P2WPKH.
Bottom line:
For Bitcoin long-term storage under realistic quantum threat models, the dominant benefit is keeping the public key unrevealed until spend. On that dimension, P2PKH, P2SH, P2WPKH, and P2WSH all achieve the main protection. P2WSH's 256-bit hashing provides a higher theoretical margin against Grover-style preimage attacks, but this distinction is secondary in practice. The main quantum concern remains Shor's algorithm against already-exposed public keys.
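The Grover arithmetic behind that margin is simple: Grover's algorithm gives a quadratic speedup on unstructured preimage search, so an n-bit hash offers roughly n/2 bits of quantum preimage resistance. A minimal sketch:

```python
def quantum_preimage_security_bits(hash_bits):
    """Grover halves the effective exponent of brute-force preimage search."""
    return hash_bits // 2

print(quantum_preimage_security_bits(160))  # HASH160: P2PKH, P2SH, P2WPKH
print(quantum_preimage_security_bits(256))  # SHA-256: P2WSH
```

Both 2^80 and 2^128 quantum search costs remain far beyond any foreseeable hardware, which is why exposed public keys, attackable by Shor rather than Grover, dominate the threat model.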
References
1. Shor's algorithm is possible with as few as 10,000 reconfigurable atomic qubits: https://arxiv.org/abs/2603.28627
2. Securing Elliptic Curve Cryptocurrencies against Quantum Vulnerabilities: Resource Estimates and Mitigations: https://quantumai.google/static/site-assets/downloads/cryptocurrency-whitepaper.pdf
3. How to factor 2048 bit RSA integers with less than a million noisy qubits: https://arxiv.org/html/2505.15917v1
4. The Post-Quantum Clock Is Already Ticking - And Almost Nobody Is Ready: https://coderlegion.com/12840
5. Google Urges Governments to Accelerate Quantum-Resistant Encryption Adoption: https://mlq.ai/news/google-urges-governments-to-accelerate-quantum-resistant-encryption-adoption-amid-imminent-threats
6. Q-Day Revisited -- RSA-2048 Broken by 2030: Detailed Analysis: https://postquantum.com/q-day/q-day-y2q-rsa-broken-2030
7. A new era of quantum computing may pose threats closer than we think, Google warns: https://www.euronews.com/next/2026/03/27/a-new-era-of-quantum-computing-may-pose-threats-closer-than-we-think-google-warns
8. New Quantum Circuits for ECDLP: Breaking Prime Elliptic Curve Cryptography in Minutes: https://eprint.iacr.org/2026/106
9. Reducing the Number of Qubits in Quantum Discrete Logarithms on Elliptic Curves: https://eprint.iacr.org/2026/280
10. Bitcoin Address Types (Unchained): https://www.unchained.com/blog/bitcoin-address-types-compared