CONCEPT Cited by 2 sources

Cryptographically-relevant quantum computer (CRQC)

Definition

A cryptographically-relevant quantum computer is a quantum computer scaled to the point where it can execute Shor's algorithm against key sizes in actual production use: RSA-2048, RSA-3072, ECDH P-256, ECDH P-384, Ed25519. Below that threshold, a quantum computer can factor toy numbers (the published "factoring records" of small composites) but cannot threaten deployed cryptography. Q-Day names the moment the first CRQC exists.

CRQCs do not exist as of the April 2026 Cloudflare assessment, but multiple research programs are close enough that the industry's migration window is measured in years, not decades. (Source: sources/2026-04-07-cloudflare-targets-2029-for-full-post-quantum-security)

Three independent engineering fronts

Scaling a CRQC requires progress on three axes; progress on any one compounds the others (Source: sources/2026-04-07-cloudflare-targets-2029-for-full-post-quantum-security):

1. Hardware

Multiple qubit-realization paradigms are pursued in parallel:

  • Superconducting qubits (Google, IBM, Rigetti) — nearest-neighbor connectivity, mature fabrication, high control fidelity, but a large physical-to-logical-qubit overhead.
  • Neutral atoms (Oratomic, QuEra, Atom Computing, Pasqal) — reconfigurable qubit connectivity (atoms can be repositioned with optical tweezers), which enables much more efficient error-correcting codes. In 2025 the paradigm proved more scalable than expected, and in 2026 Google announced it is pursuing neutral atoms alongside superconducting qubits.
  • Ion traps (IonQ, Quantinuum) — high fidelity, slower gates, longer-term architecture risk on scale-out.
  • Photonics (PsiQuantum, Xanadu) — room-temperature operation, different error profile, scaling path still being demonstrated.
  • Topological qubits (Microsoft Station Q, others) — moonshot approach betting on intrinsic error protection; not yet at production maturity.

Complementary approaches can be combined (e.g. microwave↔optical transducers bridging superconducting + photonic). Most paradigms had open scaling challenges a few years ago; most have made demonstrable progress by 2026. To assume Q-Day is far away, you must assume every paradigm hits a wall. Cloudflare: "To ignore this progress, you'd have to believe that every single approach will hit a wall."

2. Error correction

Quantum computers are noisy. Meaningful computation requires error-correcting codes that encode one logical qubit into many physical qubits. The physical-to-logical ratio is paradigm-specific and the single biggest lever for whether Q-Day is 2030 or 2045:

Architecture                                            Physical qubits per logical qubit
Nearest-neighbor superconducting                        ~1,000 (typical)
Reconfigurable neutral atoms (Oratomic 2026 estimate)   3-4

Improved qubit connectivity enables much more efficient codes: what took ~1,000 physical qubits with a surface code on nearest-neighbor hardware can be done with 3-4 using reconfigurable codes on neutral atoms. This is the result that made 2026's Q-Day reassessment possible: a 10,000-qubit neutral-atom machine suffices to break P-256, a shockingly low number relative to prior superconducting estimates.
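The arithmetic behind that reassessment is a single multiplication. A sketch, assuming a logical-qubit budget of ~2,500 for attacking P-256 (an illustrative figure, not stated in the source, chosen to be consistent with the 10,000-physical-qubit claim at a 3-4x ratio):

```python
def physical_qubits(logical_qubits: int, ratio: int) -> int:
    """Total machine size = logical qubits x physical-per-logical ratio."""
    return logical_qubits * ratio

# Assumed logical-qubit budget for attacking P-256 (illustrative).
LOGICAL_P256 = 2_500

for arch, ratio in [
    ("nearest-neighbor superconducting (surface code)", 1_000),
    ("reconfigurable neutral atoms (Oratomic 2026 est.)", 4),
]:
    print(f"{arch}: {physical_qubits(LOGICAL_P256, ratio):,} physical qubits")
# → ... 2,500,000 physical qubits
# → ... 10,000 physical qubits
```

A 250x drop in the required machine size, with no hardware improvement at all — which is why the physical-to-logical ratio, not raw qubit count, is the lever to watch.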

3. Software / algorithms

The Shor's-family algorithm itself isn't fixed. Google's April 2026 disclosure was a major speed-up to the elliptic-curve variant, specifically against P-256. Google did not publish the algorithm, only a zero-knowledge proof that it has one. Oratomic's companion work adds architecture-specific optimizations that exploit reconfigurable-qubit connectivity on top of the algorithmic speed-up.

Algorithmic advances do not require hardware progress to take effect — they move the Q-Day goalpost independently. A hardware platform that was "10 years away" under the old algorithm may be "5 years away" under the new one.

Why public estimates become untrustworthy from 2026 onward

Prior to 2026, detailed physical-qubit, logical-qubit, and gate-count estimates for breaking RSA-2048 on specific architectures were published openly. As CRQC feasibility approaches, this is expected to go dark (and, in Cloudflare's assessment, already has): publishing detailed estimates hands too much information to adversaries.

Scott Aaronson (end of 2025), quoted in the Cloudflare post:

[A]t some point, the people doing detailed estimates of how many physical qubits and gates it'll take to break actually deployed cryptosystems using Shor's algorithm are going to stop publishing those estimates, if for no other reason than the risk of giving too much information to adversaries. Indeed, for all we know, that point may have been passed already.

Cloudflare: "That point has now passed indeed."

Operational implication: treat public Q-Day estimates as an upper bound on the date, not a calibrated forecast. The private-progress side of the curve can only surprise earlier.

What to watch for

Cloudflare's 2026 post names specific per-architecture capabilities to track, rather than factoring-record milestones, which are a poor proxy (toy-number records exercise none of the three fronts above):

  • Qubit count scaling to the low thousands → tens of thousands on each paradigm.
  • Logical-qubit fidelity and error-correction threshold crossings.
  • Reconfigurability / connectivity improvements that enable more efficient codes.
  • Architecture-specific algorithmic optimizations — these do not require hardware advances.
