Recursive ZK proofs are proofs that verify other proofs. This seemingly circular concept enables remarkable scalability: arbitrarily long computations can be compressed into a single, constant-size proof. In a world where authentication systems must handle millions of requests per second, recursive composition is the mechanism that keeps verification costs flat regardless of how many operations you batch together.
The Recursion Concept
Consider a chain of proofs:
- Proof1 verifies computation C1
- Proof2 verifies C2 AND that Proof1 is valid
- Proof3 verifies C3 AND that Proof2 is valid
- ...
The final proof verifies the entire chain. A verifier checking Proof3 confirms C1, C2, and C3 -- all at once, in constant time. No matter how deep the chain grows, the verifier's work stays the same. This is the core insight: verification cost is decoupled from computation depth.
Verify a million computations by checking one proof. Each recursive step adds only constant overhead. At H33, our ZKP lookup layer resolves in 0.085 microseconds per query using an in-process DashMap cache, enabling 2,172,518 authentications per second on production hardware.
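The chain structure above can be sketched in a few lines of Python. This is only a structural toy: a SHA-256 digest stands in for each succinct proof, so it shows how Proof_i binds to Proof_{i-1} and why the artifact handed to the verifier never grows with chain depth -- but it provides none of a real SNARK's soundness (a verifier would check the proof, not recompute the chain). All names and the step function are our illustrative choices.

```python
import hashlib

def step_proof(prev_proof: bytes, new_state: int) -> bytes:
    # Stand-in for: "C_i was executed correctly AND prev_proof is valid".
    # A recursive SNARK would emit a succinct proof here; we emit a digest
    # only to show how each proof binds to its predecessor.
    return hashlib.sha256(prev_proof + new_state.to_bytes(8, "big")).digest()

def run_chain(x0: int, n_steps: int):
    state, proof = x0, b"\x00" * 32                  # genesis state and proof
    for _ in range(n_steps):
        state = (state * state + 1) % (2**61 - 1)    # the step computation C_i
        proof = step_proof(proof, state)
    return state, proof

# The object a verifier receives is the same size after 10 or 1,000 steps.
_, p10 = run_chain(3, 10)
_, p1000 = run_chain(3, 1_000)
assert len(p10) == len(p1000) == 32
```

The constant-size property is the part this toy demonstrates honestly; the constant-*time* verification is exactly what the real recursive verifier circuit (next section) buys you.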
How It Works
The circuit for recursive verification encodes three operations into a single proof step: verify the previous proof, execute a new computation, and emit the updated state as public output. Below is a simplified template illustrating the structure:
    template RecursiveStep() {
        // Previous proof and its public input (public)
        signal input previousProof;
        signal input previousPublicInputs;   // here: the accumulated state

        // New computation (private)
        signal private input newComputation;

        // Output
        signal output newAccumulatedState;

        // 1. Verify the previous proof
        component verifier = Verifier();
        verifier.proof <== previousProof;
        verifier.publicInputs <== previousPublicInputs;
        verifier.valid === 1;

        // 2. Execute the new computation
        component step = ComputationStep();
        step.previousState <== previousPublicInputs;
        step.input <== newComputation;

        // 3. Output the new state
        newAccumulatedState <== step.newState;
    }
The critical detail is step 1: the SNARK verifier algorithm itself must be expressed as an arithmetic circuit. This means elliptic curve operations, hash evaluations, and field arithmetic all become constraint rows. The verifier circuit's size directly determines how expensive each recursive step is to prove.
Technical Challenges
Verification Circuit Size
The SNARK verifier itself must be expressed as a circuit. For SNARKs with pairing operations (such as Groth16), this is expensive -- a single pairing check can require hundreds of thousands of R1CS constraints. This cost makes naive recursion impractical for pairing-based schemes.
The Cycle-of-Curves Approach
One classical solution uses cycles of elliptic curves, where the base field of one curve equals the scalar field of the other. This means the verifier for proofs on curve A can be written efficiently as a circuit over curve B, and vice versa. The Pasta curves (Pallas and Vesta), introduced by the Halo project, are the best-known example of this technique. Each recursive step alternates between curves, keeping circuit sizes manageable.
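The same relationship can be checked by brute force at miniature scale. The two toy curves below were found by exhaustive search and are emphatically not the Pasta parameters -- but they form a genuine 2-cycle: the curve over F_5 has exactly 7 points and the curve over F_7 has exactly 5, so each curve's scalar field is the other's base field.

```python
def curve_order(p: int, a: int, b: int) -> int:
    """Count the points on y^2 = x^3 + a*x + b over F_p, plus infinity."""
    count = 1                                # the point at infinity
    squares = {(y * y) % p for y in range(1, p)}
    for x in range(p):
        rhs = (x**3 + a * x + b) % p
        if rhs == 0:
            count += 1                       # single point with y = 0
        elif rhs in squares:
            count += 2                       # two square roots, ±y
    return count

# A toy 2-cycle mirroring the Pallas/Vesta relationship:
assert curve_order(5, 2, 1) == 7   # E1: y² = x³ + 2x + 1 over F_5
assert curve_order(7, 4, 1) == 5   # E2: y² = x³ + 4x + 1 over F_7
```

Scalar arithmetic for points on E1 happens modulo 7, i.e., natively in E2's base field, and vice versa -- the property that lets each curve's verifier be expressed cheaply as a circuit over the other.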
Solutions at a Glance
- Cycles of curves -- one curve's verifier is efficient in another's field
- Accumulation schemes -- defer expensive verification checks to the very end
- Folding schemes -- combine multiple proof instances into one without full verification
Accumulation and Folding
Modern approaches avoid full verification at each step, dramatically reducing per-step overhead.
Accumulation works by collecting verification obligations into a running "accumulator." Instead of checking each proof immediately, the prover appends a lightweight commitment to the accumulator. Only at the final step does a verifier perform the full check on the accumulated object. This amortizes the heavy pairing or polynomial commitment work across the entire chain.
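A minimal sketch of the deferral idea, with integers modulo a 32-bit prime standing in for the real group and a random linear combination standing in for the accumulator: many claims of the form y_i = G^a_i are collapsed so that only one full-width check runs at the end. The modulus, base, and all names are our illustrative choices, not any scheme's actual parameters.

```python
import random

P = 4294967291          # toy group modulus (a 32-bit prime)
G = 5                   # toy base element

def accumulated_check(claims, rng) -> bool:
    # Small random weights keep per-claim work light; the single
    # full-width exponentiation is deferred to the very end.
    weights = [rng.randrange(1, 2**16) for _ in claims]
    lhs_exp = sum(w * a for w, (a, _) in zip(weights, claims))
    rhs = 1
    for w, (_, y) in zip(weights, claims):
        rhs = rhs * pow(y, w, P) % P
    return pow(G, lhs_exp, P) == rhs        # the one deferred "heavy" check

claims = [(a, pow(G, a, P)) for a in (12, 99, 1234, 500_007)]
assert accumulated_check(claims, random.Random(0))

# Corrupting any single claim makes the combined check fail
# (with overwhelming probability over the random weights).
bad = claims[:2] + [(1234, 42)] + claims[3:]
assert not accumulated_check(bad, random.Random(0))
```

Real accumulation schemes defer pairing or polynomial-commitment checks rather than exponentiations, but the amortization pattern is the same: commit cheaply per step, verify expensively once.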
Folding takes a different approach: it combines two R1CS instances into a single instance of the same size. The key insight, introduced by the Nova protocol, is that folding requires only a single group scalar multiplication per step -- orders of magnitude less expensive than running an entire SNARK verifier circuit. The "relaxed R1CS" formulation absorbs cross-terms from the linear combination, keeping the folded instance well-formed.
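The folding algebra itself fits in a short script. Below is a toy fold of two relaxed R1CS instances (one constraint, w0·w0 = w1) over a small prime field: the cross term T absorbs the mixed products of the linear combination, and the folded instance still satisfies the relaxed relation (A z) ∘ (B z) = u·(C z) + E. The field, matrices, and names are illustrative, not Nova's actual parameters, and the commitments a real prover carries are omitted.

```python
P = 2**31 - 1   # small prime standing in for a real scalar field

def mat_vec(M, z):
    return [sum(m * zi for m, zi in zip(row, z)) % P for row in M]

def relaxed_r1cs_holds(A, B, C, z, u, E) -> bool:
    # Relaxed R1CS: (A z) ∘ (B z) = u·(C z) + E, componentwise.
    az, bz, cz = mat_vec(A, z), mat_vec(B, z), mat_vec(C, z)
    return all((a * b) % P == (u * c + e) % P
               for a, b, c, e in zip(az, bz, cz, E))

def fold(A, B, C, inst1, inst2, r):
    z1, u1, E1 = inst1
    z2, u2, E2 = inst2
    az1, bz1, cz1 = mat_vec(A, z1), mat_vec(B, z1), mat_vec(C, z1)
    az2, bz2, cz2 = mat_vec(A, z2), mat_vec(B, z2), mat_vec(C, z2)
    # Cross term T absorbs the mixed products of the linear combination.
    T = [(a1 * b2 + a2 * b1 - u1 * c2 - u2 * c1) % P
         for a1, b1, c1, a2, b2, c2 in zip(az1, bz1, cz1, az2, bz2, cz2)]
    z = [(a + r * b) % P for a, b in zip(z1, z2)]
    u = (u1 + r * u2) % P
    E = [(e1 + r * t + r * r * e2) % P for e1, t, e2 in zip(E1, T, E2)]
    return z, u, E

# One constraint: w0 * w0 = w1, over variable vector z = [const, w0, w1]
A, B, C = [[0, 1, 0]], [[0, 1, 0]], [[0, 0, 1]]
i1 = ([1, 3, 9], 1, [0])     # 3² = 9   (plain R1CS: u = 1, E = 0)
i2 = ([1, 5, 25], 1, [0])    # 5² = 25
folded = fold(A, B, C, i1, i2, r=7)
assert relaxed_r1cs_holds(A, B, C, *folded)
```

Note what the fold costs: a handful of field multiplications and additions, with no verifier circuit in sight -- that is the entire source of Nova's per-step savings.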
Folding-based recursion (Nova, SuperNova, HyperNova) reduces per-step prover cost from millions of constraints to a handful of operations. This is analogous to how H33's BFV FHE pipeline processes 32 users per ciphertext in ~1,109 microseconds -- batching amortizes the expensive operation so each individual authentication costs only ~42 microseconds.
IVC and PCD: The Formal Frameworks
Incrementally Verifiable Computation (IVC)
IVC formalizes the idea of a long-running computation where any observer can verify the current state without re-executing all prior steps. The prover maintains a running proof that grows by one step at a time. Each step proves: "I correctly applied function F to the previous state, and the previous state was itself correctly derived." The resulting proof is always constant-size, regardless of how many steps have elapsed.
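An IVC loop can be sketched by combining the pieces above: each step of a squaring chain s_{i+1} = s_i² yields one constraint instance, the prover folds it into a running accumulator with a single random linear combination, and one relaxed-R1CS check at the end covers every step. For compactness we track the evaluated values az, bz, cz directly (they fold linearly, just like z); the commitments a real Nova-style prover would carry are omitted, and all names are ours.

```python
import random

P = 2**31 - 1   # toy prime field

def fold_scalar(acc, new, r):
    # Instances are (az, bz, cz, u, e) for one constraint: az·bz = u·cz + e.
    a1, b1, c1, u1, e1 = acc
    a2, b2, c2, u2, e2 = new
    t = (a1 * b2 + a2 * b1 - u1 * c2 - u2 * c1) % P    # cross term
    return ((a1 + r * a2) % P, (b1 + r * b2) % P, (c1 + r * c2) % P,
            (u1 + r * u2) % P, (e1 + r * t + r * r * e2) % P)

random.seed(7)
state = 2
acc = (0, 0, 0, 0, 0)                     # trivially satisfied instance
for _ in range(1000):
    nxt = (state * state) % P
    step = (state, state, nxt, 1, 0)      # "state · state = nxt", u=1, e=0
    acc = fold_scalar(acc, step, random.randrange(P))   # one cheap fold
    state = nxt

a, b, c, u, e = acc
assert (a * b) % P == (u * c + e) % P     # one check covers all 1000 steps
```

The accumulator stays five field elements no matter how many steps elapse -- the constant-size running proof that the IVC definition demands.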
Proof-Carrying Data (PCD)
PCD generalizes IVC from a single sequential chain to an arbitrary directed acyclic graph. Multiple independent computation threads can each maintain their own proofs, and those proofs can be merged at junction points. This is essential for distributed systems where different nodes perform different computations that must eventually be reconciled into a single verifiable state.
| Framework | Topology | Use Case | Proof Size |
|---|---|---|---|
| IVC | Linear chain | Sequential computation, rollups | Constant |
| PCD | DAG (tree / graph) | Distributed consensus, multi-party | Constant |
Applications
Blockchain Scaling (Rollups)
- Aggregate thousands of transactions into a single recursive proof
- Single proof posted to the main chain, regardless of batch size
- Constant verification cost -- the L1 contract checks one proof whether the batch contains 10 or 10,000 transactions
Incrementally Verifiable Computation (IVC)
- Long-running computations with periodic checkpoints
- Anyone can verify current state quickly without replaying history
- Applicable to machine learning inference audits and regulatory compliance, where computing on encrypted data enables privacy-preserving verification
Proof Aggregation
- Combine proofs from different sources or different proof systems
- Single verification confirms all bundled proofs at once
- Efficient for batch verification in high-throughput authentication pipelines
Notable Systems
| System | Technique | Key Property |
|---|---|---|
| Nova | Folding (relaxed R1CS) | Minimal per-step cost -- one MSM per fold |
| Halo / Halo2 | Accumulation (IPA) | No trusted setup, Pasta curve cycle |
| Plonky2 | FRI + Poseidon hashing | Very fast recursive proofs over Goldilocks field |
| Mina | Pickles (Pasta curves) | Constant-size blockchain (~22 KB) |
| SuperNova | Non-uniform folding | Different circuits per step (branching programs) |
Performance Characteristics
Recursive proof systems achieve a distinctive performance profile where prover work scales linearly with computation depth, but verifier work remains constant:
- Each step: ~1-10 seconds proving time (depending on circuit size and scheme)
- Final verification: milliseconds, constant regardless of chain depth
- Proof size: constant -- typically a few hundred bytes to a few kilobytes
- Folding step (Nova): single multi-scalar multiplication, sub-millisecond
Recursive ZK proofs are to verification what SIMD batching is to FHE: the mechanism that transforms per-item cost into amortized cost. H33 applies the same principle across its full stack -- 32 biometric templates packed into one BFV ciphertext, one Dilithium attestation per batch, one DashMap lookup per cached ZKP -- to sustain 1.595 million authentications per second at ~42 microseconds each.
Post-Quantum Considerations
Most current recursive proof systems rely on elliptic curve assumptions vulnerable to quantum attack. Hash-based STARKs offer a post-quantum alternative, but their larger proof sizes (tens to hundreds of kilobytes) make naive recursion expensive. Active research on lattice-based SNARKs and hybrid approaches aims to close this gap. At H33, the ZKP verification layer uses SHA3-256 hashing with a post-quantum-secure DashMap lookup cache, ensuring the proof verification path remains quantum-resistant even as the underlying proof systems evolve.
Recursive ZK proofs unlock unlimited scalability. Verify a billion operations with the same cost as verifying one. As folding schemes mature and post-quantum constructions become practical, recursive composition will be the foundational primitive powering the next generation of verifiable computation -- from blockchain rollups to privacy-preserving authentication at scale.
Ready to Go Quantum-Secure?
Start protecting your users with post-quantum authentication today. 1,000 free auths, no credit card required.
Get Free API Key →