
What Is Post-Quantum Cryptography?
A Complete Guide for 2026

Quantum computers will break RSA, ECC, and Diffie-Hellman. Post-quantum cryptography is the replacement. This guide covers every NIST standard, every algorithm family, real-world performance numbers, migration deadlines, and the common misconceptions that trip up engineering teams.

FIPS 203/204/205: the finalized NIST standards
~240 µs: ML-DSA sign + verify
128-bit: post-quantum security level
2035: classical public-key cryptography disallowed

Every digital system you rely on—online banking, encrypted messaging, VPN tunnels, TLS certificates, software updates, healthcare records—is protected by public-key cryptography built on two mathematical assumptions: that factoring large integers is hard and that computing discrete logarithms is hard. These assumptions have held for decades. Quantum computers will shatter both of them.

Post-quantum cryptography (PQC) is the field of cryptographic algorithms designed to resist attacks from both classical and quantum computers. Unlike quantum cryptography (which uses quantum mechanics to transmit keys), PQC algorithms run on ordinary hardware—your existing servers, laptops, and phones—but are built on mathematical problems that remain hard even for quantum processors. NIST finalized the first three PQC standards in August 2024. The migration clock is already running.

This guide covers everything a security engineer, architect, or technical leader needs to know: what quantum computers actually break, what they do not break, the four algorithm families, deep dives on each NIST standard, key-size and performance comparisons, regulatory deadlines, the hybrid approach, H33's production PQ stack, and a concrete migration roadmap.

What Quantum Computers Break

The threat is not that quantum computers are "faster" in some general sense. The threat is specific: a sufficiently large, error-corrected quantum computer running Shor's algorithm can efficiently solve the integer factorization problem and the discrete logarithm problem. These are the two mathematical foundations underlying virtually all public-key cryptography deployed today.

Shor's Algorithm: The Core Threat

Peter Shor published his algorithm in 1994. It uses quantum Fourier transforms to find the period of a modular exponential function, which directly yields the prime factors of a composite integer (breaking RSA) or the discrete logarithm (breaking Diffie-Hellman and ECC). On a classical computer, the best known factoring algorithm (General Number Field Sieve) runs in sub-exponential time. Shor's algorithm runs in polynomial time—an exponential speedup.

Everything Built on Factoring or Discrete Log Is Broken

RSA (all key sizes) — integer factorization. ECDSA / ECDH / Ed25519 / X25519 — elliptic curve discrete log. Diffie-Hellman / ElGamal / DSA — finite field discrete log. Every TLS handshake, every SSH session, every certificate chain, every code-signing key that uses these algorithms will be retrospectively compromised once a cryptographically relevant quantum computer (CRQC) exists.

Algorithm | Foundation | Broken By | Est. Logical Qubits
RSA-2048 | Integer factorization | Shor's | 1,730–6,190
RSA-4096 | Integer factorization | Shor's | ~12,000
ECDSA P-256 | Elliptic curve discrete log | Shor's | 2,330–2,619
ECDH / X25519 | Elliptic curve discrete log | Shor's | ~2,330
Diffie-Hellman 2048 | Finite field discrete log | Shor's | Similar to RSA
Ed25519 | Elliptic curve discrete log | Shor's | ~2,330

The logical qubit estimates above represent how many error-corrected qubits are needed. Physical qubit requirements depend on error rates and error-correction overhead. Craig Gidney's May 2025 paper reduced the estimated physical qubit count for RSA-2048 from ~20 million to under 1 million, narrowing the gap between current hardware and a CRQC to roughly three orders of magnitude.

What Quantum Computers Do NOT Break

One of the most common misconceptions is that quantum computers "break all encryption." They do not. The second relevant quantum algorithm is Grover's algorithm, which provides a quadratic speedup for unstructured search. Its impact on symmetric cryptography and hash functions is real but manageable.

Grover's Algorithm: Quadratic, Not Exponential

Grover's algorithm halves the effective security level of symmetric ciphers and hash functions. AES-256 drops from 256-bit security to ~128-bit equivalent. AES-128 drops to ~64-bit equivalent (insufficient). SHA-256 collision resistance drops from 128-bit to ~85-bit. The fix is straightforward: use larger key sizes. AES-256 and SHA-384/SHA-512 remain secure against quantum adversaries with comfortable margins.
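These reductions are simple arithmetic and worth sanity-checking in code. A minimal sketch (the function names are ours, not from any standard library):

```python
# Effective security under known quantum attacks:
# - Grover's key search needs ~2^(n/2) queries, halving key-search security.
# - The best known quantum collision attack (Brassard-Hoyer-Tapp) needs
#   ~2^(n/3) queries, cutting collision resistance to a third of the digest.
def grover_key_bits(n: int) -> int:
    return n // 2

def quantum_collision_bits(n: int) -> int:
    return n // 3

assert grover_key_bits(256) == 128        # AES-256: comfortable margin
assert grover_key_bits(128) == 64         # AES-128: insufficient
assert quantum_collision_bits(256) == 85  # SHA-256 collisions: ~85-bit
```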

Algorithm | Type | Classical Security | Post-Quantum Security | Verdict
AES-256 | Symmetric cipher | 256-bit | ~128-bit | Safe
AES-128 | Symmetric cipher | 128-bit | ~64-bit | Insufficient
ChaCha20-Poly1305 | AEAD cipher | 256-bit | ~128-bit | Safe
SHA-256 | Hash function | 128-bit collision | ~85-bit collision | Acceptable
SHA-384 / SHA-512 | Hash function | 192 / 256-bit | ~128 / 170-bit | Safe
SHA-3 (Keccak) | Hash function | Up to 256-bit | Up to ~128-bit | Safe
HMAC-SHA-256 | MAC | 256-bit | ~128-bit | Safe

The practical takeaway: symmetric encryption and hashing survive the quantum era with minor upgrades (move from AES-128 to AES-256, prefer SHA-384+ where collision resistance matters). The crisis is entirely in public-key cryptography—key exchange, digital signatures, and certificate chains.

NIST PQC Standardization: From Call to FIPS

The global effort to standardize quantum-resistant algorithms has been led by the National Institute of Standards and Technology (NIST). The process took eight years, involved submissions from research teams worldwide, and withstood multiple rounds of cryptanalysis by the global academic community.

December 2016
NIST Call for Proposals—Initiated the Post-Quantum Cryptography Standardization Process. 82 submissions received across key encapsulation and digital signature categories.
January 2019
Round 2—26 algorithms advanced. Eliminated candidates with insufficient security margins or impractical performance.
July 2020
Round 3—7 finalists + 8 alternates. CRYSTALS-Kyber and CRYSTALS-Dilithium emerged as frontrunners.
July 2022
Initial selections announced—Kyber (KEM), Dilithium (signatures), FALCON (signatures), SPHINCS+ (hash-based signatures). SIKE eliminated after a classical attack completely broke it.
August 13, 2024
FIPS 203, 204, 205 finalized—ML-KEM (from Kyber), ML-DSA (from Dilithium), and SLH-DSA (from SPHINCS+) become official federal standards. The post-quantum era begins.
~2026
FIPS 206 draft—FN-DSA (from FALCON). NTRU-lattice-based signatures with compact sizes but complex implementation (requires double-precision floating-point sampling).
~2027
HQC draft—Code-based KEM selected as a backup to ML-KEM, providing algorithmic diversity in case lattice-based assumptions are weakened.

The eight-year process was not bureaucratic delay. It was necessary. One of the Round 3 candidates—SIKE (an isogeny-based scheme)—was completely broken by a classical attack in 2022 after years of analysis. Rigorous, multi-round cryptanalysis is not optional for standards that will protect the world's infrastructure for decades.

The Four Families of Post-Quantum Algorithms

Post-quantum cryptography is not a single approach. It encompasses four major mathematical families, each with different security assumptions, performance profiles, and tradeoffs. Understanding these families is essential for evaluating which algorithms fit your threat model.

Lattice-Based

Hard problem: Learning With Errors (LWE) and Module-LWE. Finding a short vector in a high-dimensional lattice is believed hard for both classical and quantum computers. Underlies ML-KEM (FIPS 203) and ML-DSA (FIPS 204). Offers the best balance of security, performance, and key size. Dominates the NIST selections.

Hash-Based

Hard problem: Security of the underlying hash function (SHA-256 / SHA-3 / SHAKE). The most conservative family—security proof reduces directly to the hash function's collision and preimage resistance. Underlies SLH-DSA (FIPS 205). Signatures only (no key exchange). Larger signatures but minimal cryptographic assumptions.

Code-Based

Hard problem: Decoding a random linear code (equivalent to generic syndrome decoding). The McEliece cryptosystem has survived over 40 years of cryptanalysis since 1978. HQC (Hamming Quasi-Cyclic) selected by NIST as a backup KEM. Drawback: very large public keys (hundreds of kilobytes for McEliece).

Multivariate

Hard problem: Solving systems of multivariate quadratic polynomial equations over finite fields (MQ problem). Compact signatures but large public keys. Rainbow, a leading multivariate scheme, was broken in 2022. The family remains active in research but has no NIST finalists remaining in the current round.

Isogeny-Based: A Cautionary Tale

A fifth family—isogeny-based cryptography, built on maps between elliptic curves—was a strong NIST candidate (SIKE). In July 2022, Wouter Castryck and Thomas Decru published a classical polynomial-time attack that completely broke SIKE on a single laptop in under an hour. The lesson: any PQC scheme can fall to future cryptanalysis. This is why NIST selected algorithms from multiple families and why the hybrid approach (PQC + classical) is recommended during the transition period.

FIPS 203: ML-KEM (Kyber) — Post-Quantum Key Encapsulation

ML-KEM (Module Lattice-Based Key Encapsulation Mechanism), derived from the CRYSTALS-Kyber submission, is the standard for quantum-resistant key exchange. It replaces RSA key exchange and ECDH in TLS handshakes, VPNs, and any protocol that needs two parties to agree on a shared secret.

How ML-KEM Works

ML-KEM is a key encapsulation mechanism (KEM), not a traditional key exchange. The distinction matters:

  1. KeyGen: The receiver generates a public/private key pair. The public key is a matrix A sampled from a seed, plus a vector t = A*s + e where s is a short secret vector and e is a small error vector. The secret key is s.
  2. Encapsulate: The sender generates a random message m, computes a ciphertext that "wraps" m using the public key with fresh randomness, and derives the shared secret K from m.
  3. Decapsulate: The receiver uses the private key to unwrap the ciphertext, recovers m (or detects tampering), and derives the same shared secret K.

The security rests on the Module Learning With Errors (MLWE) problem: given A and t = A*s + e, recovering s requires finding a short vector in a lattice—a problem for which no efficient quantum algorithm is known.
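To make the t = A*s + e structure concrete, here is a deliberately tiny, insecure Regev-style bit encryption with the same shape: the kind of public-key encryption a real KEM like ML-KEM builds on before the Fujisaki-Okamoto transform turns it into encapsulation. Every parameter and function name here is illustrative only:

```python
import random

Q, N, M = 3329, 16, 32   # toy sizes (ML-KEM uses q=3329 but n=256 with module rank 2-4)

def keygen(rng):
    s = [rng.randrange(Q) for _ in range(N)]                  # secret vector
    A = [[rng.randrange(Q) for _ in range(N)] for _ in range(M)]
    e = [rng.choice([-1, 0, 1]) for _ in range(M)]            # small error
    b = [(sum(A[i][j] * s[j] for j in range(N)) + e[i]) % Q for i in range(M)]
    return (A, b), s                                          # pk encodes t = A*s + e

def encrypt(pk, bit, rng):
    A, b = pk
    r = [rng.choice([0, 1]) for _ in range(M)]                # fresh randomness
    u = [sum(r[i] * A[i][j] for i in range(M)) % Q for j in range(N)]
    v = (sum(r[i] * b[i] for i in range(M)) + bit * (Q // 2)) % Q
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(N))) % Q          # = bit*q/2 + small error
    return 1 if Q // 4 < d < 3 * Q // 4 else 0                # round to nearest multiple

rng = random.Random(0)
pk, sk = keygen(rng)
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit, rng)) == bit
```

The toy shows why decapsulation works: the accumulated error terms stay far below q/4, so rounding recovers the message. ML-KEM's real parameters and NTT-based ring arithmetic make recovering s infeasible while keeping the same correctness argument.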

Parameter Set | NIST Level | Public Key | Ciphertext | Shared Secret | Classical Equivalent
ML-KEM-512 | 1 | 800 B | 768 B | 32 B | AES-128
ML-KEM-768 | 3 | 1,184 B | 1,088 B | 32 B | AES-192
ML-KEM-1024 | 5 | 1,568 B | 1,568 B | 32 B | AES-256

ML-KEM-768 is the recommended default for most applications, providing NIST Level 3 security (equivalent to AES-192 against both classical and quantum attacks). Key sizes are larger than ECDH (32 bytes for X25519) but still practical for network protocols: a handshake adds roughly 2 KB on the wire (the 1,184-byte public key in one direction, the 1,088-byte ciphertext in the other).

ML-KEM Performance

ML-KEM-768 Benchmark (Typical Server Hardware)

KeyGen: ~35 µs
Encapsulate: ~45 µs
Decapsulate: ~40 µs
X25519 (classical baseline): ~120 µs
RSA-2048 key exchange: ~240 µs

ML-KEM is not only quantum-safe—it is faster than both RSA and ECDH for key exchange. The NTT-based polynomial arithmetic at its core is highly parallelizable and cache-friendly. There is no performance excuse for delaying ML-KEM adoption.

FIPS 204: ML-DSA (Dilithium) — Post-Quantum Digital Signatures

ML-DSA (Module Lattice-Based Digital Signature Algorithm), derived from CRYSTALS-Dilithium, is the standard for quantum-resistant digital signatures. It replaces RSA signatures, ECDSA, and EdDSA for authenticating identities, signing documents, verifying software updates, and securing certificate chains.

How ML-DSA Works

ML-DSA uses a "Fiat-Shamir with Aborts" paradigm:

  1. KeyGen: Generate a matrix A from a seed, secret vectors s1 and s2 with small coefficients, and compute t = A*s1 + s2. The public key is (seed, t). The secret key is (s1, s2).
  2. Sign: Sample a random masking vector y, compute w = A*y, hash the message with w to get a challenge c, compute z = y + c*s1. If z is "too large" (would leak information about s1), abort and retry with fresh randomness. This rejection sampling ensures signatures reveal nothing about the secret key.
  3. Verify: Recompute w' from the signature and public key. Check that the hash matches. Constant-time verification.

The "aborts" mechanism is what makes ML-DSA secure: by rejecting signatures that would leak information, the scheme achieves zero-knowledge properties. On average, signing requires approximately 4–7 iterations.

Parameter Set | NIST Level | Public Key | Signature | Secret Key | Classical Equivalent
ML-DSA-44 | 2 | 1,312 B | 2,420 B | 2,560 B | SHA-256 collision
ML-DSA-65 | 3 | 1,952 B | 3,309 B | 4,032 B | AES-192
ML-DSA-87 | 5 | 2,592 B | 4,627 B | 4,896 B | AES-256

ML-DSA-65 (NIST Level 3) is recommended for most applications. Signature sizes are larger than ECDSA (64 bytes) or Ed25519 (64 bytes), but 3.3 KB is still practical for authentication tokens, API responses, and certificate chains.

ML-DSA Performance

H33's production stack uses ML-DSA (Dilithium) for attestation signatures on every authentication. Measured on Graviton4 (c8g.metal-48xl):

ML-DSA-65 Production Performance (Graviton4)

Sign + Verify (single): ~240 µs
Batch attestation (32 users): ~240 µs total
Per-auth attestation cost: ~7.5 µs
ECDSA P-256 sign + verify: ~500 µs
RSA-2048 sign + verify: ~1,000 µs

By batching 32 users per attestation call, H33 amortizes the Dilithium overhead to ~7.5 microseconds per authentication. This is faster than classical ECDSA while providing post-quantum security.
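The amortization is plain arithmetic, shown here with the figures from the table above:

```python
def amortized_us(batch_cost_us: float, batch_size: int) -> float:
    """Spread one batched sign+verify over every authentication in the batch."""
    return batch_cost_us / batch_size

# One ~240 µs Dilithium sign+verify covers a 32-user batch
assert amortized_us(240.0, 32) == 7.5   # µs per authentication
```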

FIPS 205: SLH-DSA (SPHINCS+) — Hash-Based Backup

SLH-DSA (Stateless Hash-Based Digital Signature Algorithm), derived from SPHINCS+, provides a fundamentally different security guarantee from the lattice-based schemes. Its security relies solely on the properties of hash functions—no lattice assumptions, no number theory, no algebraic structure to potentially exploit.

Why SLH-DSA Matters

SLH-DSA exists as algorithmic insurance. If a breakthrough in lattice cryptanalysis weakens or breaks ML-KEM and ML-DSA (which would compromise both FIPS 203 and 204 simultaneously), SLH-DSA remains secure because it depends on an entirely different mathematical foundation. Any attack on SLH-DSA would require breaking SHA-256 or SHA-3—a breakthrough that would undermine virtually all of modern cryptography.
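The hash-only security claim is easiest to see in the simplest hash-based scheme, a Lamport one-time signature. SLH-DSA is vastly more elaborate (WOTS+ chains and FORS trees inside a Merkle hypertree, which is what makes it stateless and many-time), but the core reduction to hash security is the same. A sketch using only the standard library:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def lamport_keygen():
    # 256 pairs of random preimages; the public key is their hashes
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(x), H(y)) for x, y in sk]
    return sk, pk

def lamport_sign(sk, msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    # Reveal one preimage per digest bit. A Lamport key must sign ONCE only:
    # a second message would reveal more preimages and enable forgeries.
    return [sk[i][(digest >> i) & 1] for i in range(256)]

def lamport_verify(pk, msg: bytes, sig) -> bool:
    digest = int.from_bytes(H(msg), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

sk, pk = lamport_keygen()
sig = lamport_sign(sk, b"firmware v1.2")
assert lamport_verify(pk, b"firmware v1.2", sig)
assert not lamport_verify(pk, b"firmware v1.3", sig)
```

Forging a signature requires inverting SHA-256; there is no algebraic structure to attack. That is the entire security argument, and it is why hash-based schemes are the conservative fallback.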

Defense in Depth

NIST explicitly selected SLH-DSA as a diversification measure. Both ML-KEM and ML-DSA are lattice-based, meaning a single mathematical breakthrough could compromise both. SLH-DSA ensures that at least one NIST-standardized signature scheme would survive such a scenario. For high-assurance environments (government, critical infrastructure), deploying SLH-DSA alongside ML-DSA provides two independent lines of defense.

SLH-DSA Tradeoffs

The conservative security of hash-based signatures comes at a cost: larger signatures and slower signing.

Parameter Set | NIST Level | Public Key | Signature | Sign Time | Verify Time
SLH-DSA-128s (small) | 1 | 32 B | 7,856 B | ~60 ms | ~3 ms
SLH-DSA-128f (fast) | 1 | 32 B | 17,088 B | ~8 ms | ~0.5 ms
SLH-DSA-192s | 3 | 48 B | 16,224 B | ~100 ms | ~5 ms
SLH-DSA-256f | 5 | 64 B | 49,856 B | ~20 ms | ~1 ms

SLH-DSA signatures range from ~8 KB to ~50 KB, roughly 2x to 15x larger than ML-DSA's. Signing takes milliseconds rather than microseconds. For real-time authentication at scale, ML-DSA is the clear choice. SLH-DSA is best suited for use cases where signing is infrequent and verification latency is less critical: firmware updates, code signing, root certificate authorities, and long-lived document signatures.

Key Size Comparison: Classical vs. Post-Quantum

The most visible practical impact of PQC migration is increased key and signature sizes. This table provides a direct comparison across the most commonly used algorithms.

Algorithm | Type | Public Key | Private Key | Signature / CT | PQ-Secure
RSA-2048 | Signature / KEM | 256 B | ~1,200 B | 256 B | No
ECDSA P-256 | Signature | 64 B | 32 B | 64 B | No
X25519 | Key exchange | 32 B | 32 B | 32 B | No
ML-KEM-768 | Key encapsulation | 1,184 B | 2,400 B | 1,088 B | Yes
ML-DSA-65 | Signature | 1,952 B | 4,032 B | 3,309 B | Yes
SLH-DSA-128f | Signature | 32 B | 64 B | 17,088 B | Yes

Key sizes increase by roughly 5–50x compared to ECC. ML-KEM-768 public keys are 1,184 bytes versus 32 bytes for X25519. ML-DSA-65 signatures are 3,309 bytes versus 64 bytes for Ed25519. For most network protocols, this is a minor bandwidth increase. For constrained devices (IoT, embedded), it requires careful engineering.
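For a quick feel of the bandwidth impact, here is the added cost of a hybrid X25519 + ML-KEM-768 exchange computed from the sizes above (a naive accounting that ignores protocol framing):

```python
X25519_SHARE = 32      # bytes, sent in each direction
MLKEM768_PK = 1184     # client-to-server key share grows by this
MLKEM768_CT = 1088     # server-to-client reply grows by this

def hybrid_extra_bytes() -> int:
    """Bytes a hybrid exchange adds on the wire versus pure X25519."""
    return MLKEM768_PK + MLKEM768_CT

assert hybrid_extra_bytes() == 2272   # ~2.2 KB extra across both flights
```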

CNSA 2.0: The Migration Deadlines

The NSA's Commercial National Security Algorithm Suite 2.0, published in September 2022, establishes mandatory migration deadlines for all National Security Systems (NSS). These deadlines cascade into the broader federal supply chain and increasingly influence private-sector procurement requirements.

Category | Support & Prefer By | Exclusive Use By
Software & firmware signing | 2025 | 2030
Web browsers, servers, cloud | 2025 | 2033
Traditional networking (VPNs, routers) | 2026 | 2030
Operating systems | 2027 | 2033
Constrained devices, large PKI | 2030 | 2033

NIST IR 8547 (November 2024) reinforces these deadlines at the federal level: all classical public-key cryptography (RSA, ECDSA, ECDH) will be deprecated after 2030 and disallowed in federal systems after 2035.

Private Sector Impact

If you sell to the U.S. government, CNSA 2.0 compliance is not optional. If you handle healthcare data (HIPAA), financial data (SOX/PCI), or operate critical infrastructure, regulatory pressure to adopt PQC is already building. Starting January 1, 2027, all new National Security System equipment acquisitions must be CNSA 2.0-compliant by default. The procurement pipeline means you need PQC support today to be certified by then.

The Hybrid Approach: Classical + PQ for Defense in Depth

During the transition period, the recommended practice is hybrid cryptography—combining a classical algorithm (X25519/ECDH for key exchange, ECDSA/Ed25519 for signatures) with a PQC algorithm (ML-KEM for key exchange, ML-DSA for signatures) such that the system remains secure if either algorithm is broken.

Why Hybrid?

Conceptual hybrid_key_exchange.pseudo
// Hybrid key exchange: X25519 + ML-KEM-768
// Session key is secure if EITHER algorithm holds

let x25519_shared      = x25519_dh(my_ecdh_sk, peer_ecdh_pk);
let (mlkem_shared, ct) = ml_kem_768_encapsulate(peer_mlkem_pk);

// Combine both shared secrets into one session key
let session_key = hkdf_sha384(
    x25519_shared || mlkem_shared,
    "hybrid-tls-session-key"
);

// Even if ML-KEM is broken classically -> X25519 protects
// Even if X25519 is broken by Shor's -> ML-KEM protects

Chrome, Firefox, and Cloudflare have already deployed hybrid ML-KEM + X25519 key exchange in TLS 1.3. If you are browsing the web on a modern browser, you may already be using post-quantum key exchange without knowing it.

H33's Post-Quantum Stack: Production Numbers

H33's authentication infrastructure is fully post-quantum by construction. Every component in the single-API-call pipeline uses quantum-resistant algorithms.

H33 Production Pipeline (Single API Call)

FHE Biometric (BFV lattice, 32 users/batch): ~1,375 µs
ZKP STARK Lookup: 0.067 µs
Dilithium Attestation (sign + verify): ~240 µs
Total (32-user batch): ~1,615 µs
Per-authentication latency: ~50 µs

The key architectural decisions:

Rust pq_attestation.rs
// Post-quantum attestation -- Dilithium sign + verify per batch
let batch_digest = sha3_256(&batch_results);

// Sign with ML-DSA-65 (Dilithium3)
let signature = dilithium_sign(&signing_key, &batch_digest);
// ~120 us sign time

// Verify (any party can verify with the public key)
let valid = dilithium_verify(&verify_key, &batch_digest, &signature);
// ~120 us verify time
// Total: ~240 us for 32-user batch attestation

assert!(valid, "Attestation signature invalid");

Migration Roadmap for Organizations

PQC migration is not a weekend project. It requires systematic inventory, prioritization, testing, and phased deployment. Here is a concrete roadmap based on best practices from NIST, CNSA 2.0, and real-world enterprise migrations.

Phase 1: Inventory and Assessment (Months 1–3)

Phase 2: Hybrid Deployment for Data in Transit (Months 3–9)

Phase 3: Signature and Certificate Migration (Months 9–18)

Phase 4: Data at Rest and Full Cutover (Months 18–36)

Cryptographic Agility Is the Real Goal

The specific algorithms will evolve. FIPS 206 (FN-DSA / FALCON) and HQC are still in the pipeline. Future breakthroughs may require algorithm changes. The most valuable outcome of PQC migration is not deploying any single algorithm—it is building cryptographic agility into your infrastructure so that algorithm swaps can be performed without re-architecting your systems.

Common Misconceptions

PQC discussions are plagued by misunderstandings that lead to either complacency or panic. Here are the most common ones, corrected.

"Quantum computers break everything"

False. Quantum computers break public-key crypto based on factoring and discrete log (RSA, ECC, DH). Symmetric ciphers (AES-256) and hash functions (SHA-3) are safe with minor key-size increases. PQC replaces only the public-key component.

"We have plenty of time"

Dangerous assumption. Even if CRQCs are 10–15 years away, Mosca's inequality (data is at risk when its required secrecy lifetime plus your migration time exceeds the time until a CRQC) shows that long-lived data is already exposed to harvest-now-decrypt-later (HNDL) attacks. Migration itself takes 3–7 years for large organizations. The math says act now.
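Mosca's inequality is a one-liner worth encoding; the variable names here are ours:

```python
def mosca_at_risk(shelf_life_yrs: float, migration_yrs: float,
                  yrs_to_crqc: float) -> bool:
    """Mosca's inequality: data is already exposed when x + y > z, where
    x = how long the data must stay secret, y = how long migration takes,
    z = years until a cryptographically relevant quantum computer."""
    return shelf_life_yrs + migration_yrs > yrs_to_crqc

# 25-year patient records + 5-year migration vs. a CRQC in 15 years
assert mosca_at_risk(25, 5, 15)
# Short-lived session data with a fast migration is not at risk
assert not mosca_at_risk(1, 2, 15)
```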

"Just increase RSA key sizes"

Does not work. Shor's algorithm provides an exponential speedup. Doubling the RSA key size only doubles the quantum runtime. To resist Shor's, you would need RSA keys of astronomical size—far beyond practical use. Fundamentally different math is required.

"PQC is too slow for production"

Outdated. ML-KEM key exchange is faster than both RSA and ECDH. An ML-DSA sign-plus-verify at ~240 microseconds beats RSA-2048's ~1,000 microseconds. H33 runs 1.2 million PQ-authenticated calls per second on a single server. Performance is not a blocker.

"Nobody is attacking yet"

Wrong. Harvest Now, Decrypt Later attacks are already underway. Intelligence agencies from multiple nations are collecting encrypted traffic today for future quantum decryption. The attack that matters is passive and invisible—you will not know until the data is exposed decades from now.

"We'll wait for quantum computers to exist"

By then it is too late. Data harvested before your migration is permanently compromised. The entire point of PQC is to protect data before CRQCs arrive. Retroactive protection is impossible for data already in transit.

The Bottom Line

Post-quantum cryptography is not a research curiosity. It is a finalized set of federal standards (FIPS 203, 204, 205) with hard migration deadlines (CNSA 2.0, NIST IR 8547). The algorithms are fast—ML-KEM outperforms ECDH, and ML-DSA outperforms RSA. The threat is real—Shor's algorithm will break every RSA and ECC key ever generated, and adversaries are already harvesting encrypted data for future decryption.

The four families of PQC (lattice, hash, code, multivariate) provide defense in depth against both quantum and classical attacks. The hybrid approach ensures security even if one algorithm falls. And the migration itself—while substantial—is well-defined, with production-grade implementations available today in every major TLS library, browser, and cloud provider.

The question is no longer whether to migrate to post-quantum cryptography. The question is whether your organization will complete the migration before the data you are protecting today becomes worthless.


H33 provides post-quantum authentication infrastructure with BFV lattice-based FHE biometric processing, ML-DSA digital signatures (~240 microseconds), and ML-KEM key exchange—all in a single API call at ~50 microseconds per authentication. Every component in the stack is quantum-resistant by construction, not by policy. 1.2 million auths/sec sustained on Graviton4.

Start Building With Post-Quantum Security

FHE biometrics, ML-DSA attestation, and ML-KEM key exchange. One API call. ~50µs per authentication. Quantum-safe by construction.
