A decision made today on encrypted data needs to be verifiable seven years from now. That is not a theoretical requirement. It is a regulatory one. Financial institutions must retain decision records for seven years under most banking regulations. Healthcare records carry even longer retention requirements. Government records may be retained indefinitely.
Every one of those decision records is signed. And every one of those signatures -- if made with RSA or ECDSA -- will be forgeable by a sufficiently large quantum computer. The timeline for that computer is debated, but the range of credible estimates narrows every year. NIST did not standardize post-quantum cryptography because quantum computers are impossible. They standardized it because quantum computers are inevitable.
This is the problem H33's Post-Quantum Decision Engine solves. Not just encrypted computation. Not just encrypted inference. Attested decisions -- cryptographic proofs that a specific decision was made on specific encrypted data according to a specific policy -- that remain tamper-proof for the lifetime of the record, even against quantum adversaries.
The Full Stack
The PQ Decision Engine is a three-layer stack. Each layer serves a distinct purpose, and the layers compose into a system that takes encrypted data as input and produces attested decisions as output.
Layer 1: CKKS Inference (Encrypted Classification). The first layer handles the machine learning computation. Input data -- encrypted floating-point features -- passes through a neural network evaluated in the CKKS homomorphic encryption scheme. CKKS supports approximate arithmetic on encrypted real numbers, making it suitable for dense matrix operations like those in neural network layers. The output is a vector of encrypted classification scores.
Layer 2: TFHE Decision (Encrypted Control Flow). The second layer converts classification scores into discrete decisions. The encrypted score vector from CKKS is bridged into the TFHE domain, where bit-level operations enable precise threshold comparisons, argmax tournaments, and band classifications. TFHE's programmable bootstrapping refreshes noise at each operation, enabling deep decision circuits without accuracy degradation. The output is an encrypted decision: a discrete action identifier.
Layer 3: H33-74 Attestation (Post-Quantum Proof). The third layer produces a 74-byte attestation that binds the decision to the computation. The attestation is signed with three post-quantum signature schemes based on three independent hardness assumptions: ML-DSA-65 (Module Lattice Digital Signature Algorithm, based on MLWE), FALCON-512 (based on NTRU lattice short vector problems), and SLH-DSA (Stateless Hash-based Digital Signature Algorithm, based on hash function security). The attestation can be forged only if MLWE lattices, NTRU lattices, AND hash functions are ALL broken simultaneously -- three independent mathematical bets.
No decryption occurs between layers. Data enters encrypted, and what exits is a proof, not plaintext. The server never holds a secret key, and the server binary contains no decryption functions.
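The control-flow shape of Layer 2 can be shown in plaintext; in the engine the same comparisons run over TFHE ciphertexts via programmable bootstrapping. The escalation sentinel below is an illustrative choice, not part of the described system:

```python
# Plaintext analog of the Layer-2 decision circuit: an argmax
# "tournament" over classification scores followed by a threshold gate.
# In production these comparisons execute homomorphically under TFHE.
ESCALATE = -1  # illustrative sentinel for "no class cleared the bar"

def decide(scores: list[float], threshold: float) -> int:
    # Pick the winning class index, then gate it on the threshold.
    best = max(range(len(scores)), key=lambda i: scores[i])
    return best if scores[best] >= threshold else ESCALATE
```

The same two primitives (comparison and selection) compose into the band classifications mentioned above: a score is compared against several thresholds in sequence, and the matching band index is selected.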
Why Three Hardness Assumptions
The standard approach to post-quantum cryptography is to pick one NIST-standardized scheme and use it. ML-DSA is the most common choice -- it is fast, well-studied, and has clean implementations. Why use three?
The answer is survivability over time. ML-DSA's security rests on the hardness of the Module Learning With Errors (MLWE) problem. If a mathematical breakthrough makes MLWE tractable -- not just by quantum computers, but by any advance in lattice cryptanalysis -- every ML-DSA signature ever created becomes forgeable. Every decision record signed with ML-DSA loses its integrity guarantee.
This is not paranoia. It is history. The discrete logarithm problem was considered intractable for decades until index calculus methods dramatically reduced its effective difficulty. RSA key sizes have been ratcheted upward multiple times as factoring algorithms improved. Cryptographic assumptions are provisional -- they hold until they do not.
By combining three signature schemes based on three independent mathematical assumptions, H33-74 creates a defense-in-depth strategy for decision records. An attacker who breaks MLWE still faces NTRU lattices (a structurally different lattice problem) and hash function preimage resistance (a completely different mathematical domain). Breaking all three simultaneously requires advances in three independent areas of mathematics -- a dramatically lower probability than breaking any single one.
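The defense-in-depth property amounts to an AND-combiner: verification succeeds only if every scheme's signature checks out, so forgery requires breaking all three assumptions at once. The verifier callables below are stand-ins -- a real deployment would bind to ML-DSA-65, FALCON-512, and SLH-DSA implementations (for example through a liboqs binding):

```python
from typing import Callable

# A verifier takes (public_key, message, signature) and returns a bool.
# These are placeholders for the three real PQ verification routines.
Verifier = Callable[[bytes, bytes, bytes], bool]

SCHEMES = ("ml-dsa-65", "falcon-512", "slh-dsa")

def verify_attestation(message: bytes,
                       signatures: dict[str, bytes],
                       public_keys: dict[str, bytes],
                       verifiers: dict[str, Verifier]) -> bool:
    """AND-combiner: accept only if every scheme verifies, so a forger
    must defeat MLWE, NTRU, and hash-based security simultaneously."""
    return all(
        verifiers[s](public_keys[s], message, signatures[s])
        for s in SCHEMES
    )
```

Note the asymmetry of the combiner: an AND of signatures strengthens integrity (all must be forged), whereas an AND of encryptions would not strengthen confidentiality in the same way.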
This matters most for long-retention records. A financial decision made today and retained for seven years faces seven years of potential cryptanalytic progress. A government record retained for thirty years faces three decades. The three-assumption approach does not make the attestation unbreakable in some absolute sense -- nothing is. But it makes the probability of a simultaneous break across all three assumptions negligibly small for any practical retention period.
74 Bytes: Distillation, Not Compression
A raw three-key PQ signature bundle is large. ML-DSA-65 produces a 3,309-byte signature. FALCON-512 produces a 690-byte signature. SLH-DSA signatures range from 7,856 to 49,856 bytes depending on the parameter set. Even taking the smallest options, a naive concatenation of the three PQ signatures comes to nearly twelve kilobytes (11,855 bytes).
H33-74 distills these three signatures into 74 bytes. This is not compression in the traditional sense -- you cannot decompress the 74 bytes back into the original signatures. It is distillation: the 74-byte attestation captures the cryptographic binding between the decision and the three signature schemes without storing the full signature material.
The distinction matters. Compression implies reversibility -- you can recover the original from the compressed form. Distillation is a one-way process that preserves the essential property (verifiability) while discarding the bulk material. The 74-byte attestation can be verified by any system with the public keys. It cannot be expanded back into the original signatures because it was never a compressed version of them.
The practical impact is enormous. Consider a financial institution processing decisions at scale. At 2,293,766 decisions per second, a naive three-key signature bundle (nearly twelve kilobytes per decision even with the smallest parameter sets) would generate petabytes of signature data per day. At 74 bytes per attestation, the same throughput generates under fifteen terabytes per day. The headline 78x reduction compares the 42-byte compact receipt component against the 3,309 bytes of a single ML-DSA-65 signature alone.
For seven-year retention requirements, this is the difference between a storage architecture that is feasible and one that is not. A bank that processes millions of decisions per day cannot afford to store kilobytes of signature data per decision for seven years. It can afford to store 74 bytes.
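The storage arithmetic can be checked directly from the sizes quoted earlier, assuming the smallest SLH-DSA parameter set for the naive bundle:

```python
# Storage arithmetic for attested decisions, using the figures quoted
# in the text (FIPS 204/205 and FALCON-512 signature sizes).
DECISIONS_PER_SEC = 2_293_766
SECONDS_PER_DAY = 86_400

ML_DSA_65_SIG = 3_309     # bytes
FALCON_512_SIG = 690      # bytes
SLH_DSA_SIG_MIN = 7_856   # bytes, smallest SLH-DSA parameter set
ATTESTATION = 74          # bytes, distilled H33-74 attestation

# Naive bundle: concatenation of all three signatures.
naive_bundle = ML_DSA_65_SIG + FALCON_512_SIG + SLH_DSA_SIG_MIN  # 11,855

def daily_bytes(record_size: int) -> int:
    """Total bytes written per day at full throughput."""
    return record_size * DECISIONS_PER_SEC * SECONDS_PER_DAY

print(f"naive bundle per day: {daily_bytes(naive_bundle) / 1e15:.2f} PB")
print(f"74-byte attestation per day: {daily_bytes(ATTESTATION) / 1e12:.2f} TB")
```

Under these assumptions the naive bundle works out to roughly 2.3 petabytes per day against under 15 terabytes per day for the distilled attestation.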
Cross-System Portability
The 74-byte attestation is self-contained. It does not require access to the original data, the model, the policy engine, or any secret key to verify. Any system with the public verification keys can confirm that the attestation is valid -- that a real decision was made by a real computation on real encrypted data.
This enables a new pattern in multi-party decision systems: compute once, trust everywhere. Consider a credit decision workflow involving three parties:
Bank A runs the encrypted credit model on a customer's encrypted financial data. The model classifies the customer. The policy engine produces a decision. The attestation engine generates a 74-byte proof. Total computation: one model evaluation, one decision circuit, one attestation. Bank A holds the data, the model, and the policy. The attestation is the only artifact that leaves Bank A's boundary.
Credit Bureau B receives the attestation. Bureau B verifies it -- a fast, public-key operation. The verification confirms that Bank A's decision was validly computed and PQ-attested. Bureau B updates its records based on the attested decision. Bureau B never sees the customer's data, never runs the model, and never evaluates the policy. Bureau B trusts the attestation because the mathematics guarantees it.
Regulator C audits the decision. The regulator receives the same 74-byte attestation. The regulator verifies it and confirms that the decision was made according to an attested policy on encrypted data that was never exposed. The regulator does not need to access the customer's financial records. The regulator does not need to inspect Bank A's proprietary model. The attestation is sufficient.
Three parties. One computation. One 74-byte attestation. Zero data sharing. This is what cross-system portability means in practice.
The Quantum Threat to Decision Records
Let us be precise about what quantum computers threaten. A large-scale fault-tolerant quantum computer running Shor's algorithm can factor large integers and compute discrete logarithms in polynomial time. This breaks RSA and ECDSA -- the signature schemes used by virtually every production system today.
What does it mean to break a signature scheme? It means an adversary can forge signatures. They can create a valid-looking signature on any message without the signer's private key. Applied to decision records, this means an adversary can retroactively forge decision attestations -- creating false records that claim decisions were made when they were not, or altering the attested outcome of real decisions.
This is the "harvest now, decrypt later" threat applied to integrity rather than confidentiality. An adversary does not need a quantum computer today. They need only collect signed decision records today and wait for quantum capability. Once they have it, they can forge signatures on any record they collected.
For decision records with multi-year retention, this is an active threat today -- not because quantum computers exist today, but because the records exist today and will be retained into the quantum era. A credit decision signed with ECDSA in 2026 and retained until 2033 is vulnerable to any quantum computer that becomes operational before 2033. A government decision retained for thirty years is vulnerable to any quantum computer operational before 2056.
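This retention-window exposure can be expressed as a simple check, in the spirit of Mosca's inequality. The horizon year below is an illustrative assumption, not a prediction:

```python
# A classically signed record is exposed if its retention window
# reaches the (assumed) arrival of a cryptographically relevant
# quantum computer. The horizon year is illustrative only.
QUANTUM_HORIZON = 2033  # assumed earliest CRQC year, for illustration

def record_at_risk(signed_year: int, retention_years: int,
                   pq_attested: bool) -> bool:
    """True if the record's tamper-proof window reaches the horizon
    while relying on quantum-breakable signatures."""
    if pq_attested:
        return False  # PQ attestation does not depend on Shor-breakable math
    return signed_year + retention_years >= QUANTUM_HORIZON
```

Under this assumed horizon, a 2026 ECDSA-signed record with seven-year retention is already inside the threat window on the day it is created.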
PQ attestation eliminates this threat. A decision record attested with H33-74 in 2026 remains tamper-proof in 2033, 2056, and beyond -- because the attestation's security does not depend on the hardness of integer factorization or discrete logarithms. It depends on lattice problems and hash functions, which are believed to be resistant to quantum algorithms.
Regulatory Alignment
The regulatory landscape is converging on post-quantum requirements. NIST finalized its PQ standards (FIPS 203, 204, and 205) in August 2024. The NSA's CNSA 2.0 timeline phases in PQ cryptography for national security systems, with full transition targeted by 2033 and earlier deadlines for categories such as software and firmware signing. The European Union's cybersecurity regulations are being updated to reference NIST PQ standards. Financial regulators in the US, UK, and EU have issued guidance on quantum risk assessment for critical systems.
For decision records specifically, the regulatory logic is straightforward. If a regulation requires that decision records be tamper-proof for N years, and quantum computers may become operational within N years, then the decision records must be PQ-protected. The H33-74 attestation satisfies this requirement by design.
Moreover, the three-assumption approach exceeds the minimum regulatory requirements. NIST PQ standards are satisfied by a single PQ scheme. H33-74 uses three. This provides a margin of safety that anticipates regulatory tightening -- as quantum capabilities advance, regulators may require defense-in-depth approaches rather than single-scheme protection. Institutions using H33-74 are already ahead of that curve.
The Attestation Structure
The 74-byte attestation encodes several commitments in a compact format. Without disclosing the full internal structure (which is covered by pending patent claims), the attestation binds together:
- A commitment to the input data (proving the decision was made on specific encrypted inputs)
- A commitment to the computation (proving the model and policy were evaluated correctly)
- A commitment to the decision outcome (proving what the decision was, without revealing it in plaintext)
- A temporal binding (proving when the decision was made)
- A policy binding (proving which policy governed the decision)
These commitments are distilled from the three PQ signature schemes. Each scheme signs over the same commitment structure, and the three signatures are distilled into the compact 74-byte format. Verification reconstructs the commitment structure and checks it against the public keys for all three schemes.
The 74 bytes break down as 32 bytes stored on-chain (for anchoring and public verifiability) and 42 bytes stored in the Cachee layer (for fast retrieval and cross-system portability). The on-chain component provides an immutable timestamp and public anchor. The Cachee component provides the full verification material needed for cross-system trust.
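Since the real layout is undisclosed, the following is only a hypothetical sketch of how five commitments could be bound into a 32-byte anchor plus a 42-byte receipt; every tag name and derivation here is an assumption, not the patented H33-74 construction:

```python
import hashlib
import struct

def commit(tag: bytes, payload: bytes) -> bytes:
    # Domain-separated SHA-256 commitment (illustrative construction).
    return hashlib.sha256(tag + b"|" + payload).digest()

def build_attestation(input_ct: bytes, circuit_id: bytes,
                      outcome_ct: bytes, policy_id: bytes,
                      timestamp: int) -> bytes:
    # Bind the five commitments named in the text into one digest.
    anchor = hashlib.sha256(
        commit(b"input", input_ct)        # input-data commitment
        + commit(b"compute", circuit_id)  # computation commitment
        + commit(b"outcome", outcome_ct)  # decision-outcome commitment
        + commit(b"policy", policy_id)    # policy binding
        + struct.pack(">Q", timestamp)    # temporal binding
    ).digest()                            # 32 bytes: on-chain anchor
    # 42-byte receipt via an extendable-output function (SHAKE-256).
    receipt = hashlib.shake_256(b"receipt|" + anchor).digest(42)
    return anchor + receipt               # 74 bytes total
```

Even in this toy form, the key property holds: changing any bound element (data, circuit, outcome, policy, or time) changes the anchor, and with it the entire 74-byte blob.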
Performance at Scale
The PQ Decision Engine is not a research prototype. It is a production system. The full three-layer pipeline -- CKKS inference, TFHE decision, H33-74 attestation -- sustains 2,293,766 decisions per second in aggregate, at 38 microseconds of latency per decision, on production hardware.
The attestation layer is not the bottleneck. At 74 bytes per attestation, the memory bandwidth required for attestation generation is negligible compared to the FHE computation. The signing operations for the three PQ schemes are parallelized across the three independent key hierarchies, and the distillation step is a lightweight hash computation.
At 2,293,766 decisions per second with 74 bytes per attestation, the system produces approximately 170 megabytes of attestation data per second. Over a full day, that is approximately 14.7 terabytes of attested decisions -- roughly 8.3 billion decisions per hour, each individually PQ-secured with three independent hardness assumptions.
For comparison, the same throughput with undistilled three-key PQ signatures -- nearly twelve kilobytes per decision even at the smallest SLH-DSA parameter set -- would produce roughly 27 gigabytes of signature data per second, over two petabytes per day. The reduction from distillation is not an optimization. It is what makes the system viable at scale.
Decision Records as Durable Assets
When decision records are PQ-attested, they become durable assets rather than ephemeral artifacts. A seven-year-old attestation carries the same cryptographic weight as a fresh one. It cannot be forged by classical or quantum adversaries. It can be verified by any system at any time. It proves exactly what decision was made, when, on what data, under what policy.
This durability creates new possibilities for regulatory compliance, legal proceedings, and institutional trust. An institution that can produce verifiable decision records from any point in its history has a fundamentally stronger compliance posture than one relying on database logs that could have been altered.
Consider a legal dispute over a credit decision made five years ago. The plaintiff claims the decision was discriminatory. The bank produces the H33-74 attestation from five years ago. The attestation proves: the decision was made by a specific model evaluated on encrypted data, the data was never exposed (so no human bias could have entered through data inspection), the policy that governed the decision was the one attested (not a different policy substituted after the fact), and the decision outcome was the one attested (not altered retroactively).
This is not conclusive proof of fairness -- the model itself could encode biases. But it is conclusive proof of process integrity: the decision followed the stated policy on the actual data. Combined with separate model auditing (which can be done on the plaintext model without accessing individual records), this creates a complete accountability framework.
Three Markets
The PQ Decision Engine addresses three distinct markets, each with different value propositions.
Regulated Finance. Banks, insurers, and asset managers face the intersection of quantum risk and regulatory retention. Their decision records must be tamper-proof for seven-plus years, and quantum computers may arrive within that window. The PQ Decision Engine provides quantum-durable attestations that satisfy both current and anticipated regulatory requirements. The cross-system portability enables efficient multi-party workflows (credit decisions that flow between banks, bureaus, and regulators) without data sharing.
Healthcare. Medical decision records carry even longer retention requirements and higher sensitivity. A diagnostic classification, a treatment recommendation, a triage decision -- these records are both deeply personal and legally critical. The PQ Decision Engine enables encrypted diagnostic systems that produce attested decisions without exposing patient data, and the attestations remain verifiable for the lifetime of the record.
Government and Defense. Government decision records may be retained indefinitely and face the most sophisticated adversaries. The three-assumption approach provides defense-in-depth against state-level cryptanalytic capabilities. The cross-system portability enables interagency trust without interagency data sharing -- a long-standing challenge in government information systems.
Comparison to Single-Scheme Approaches
Most PQ implementations in the market use a single NIST-standardized scheme -- typically ML-DSA (Dilithium). This is a reasonable starting point. ML-DSA is fast, well-analyzed, and has clean reference implementations. For many applications, a single scheme is sufficient.
The three-scheme approach is appropriate when the decision records must be durable for years or decades. The additional cost of three-scheme signing is minimal (the FHE computation dominates the pipeline latency by orders of magnitude). The additional storage of 74 bytes versus approximately 20-30 bytes for a single-scheme compact receipt is modest. And the additional security margin -- requiring simultaneous breaks in three independent mathematical assumptions rather than one -- is substantial.
The decision of whether to use one scheme or three is a risk management question, not a technical one. For short-lived attestations (session tokens, real-time access decisions), a single scheme is fine. For long-lived decision records (credit decisions, medical records, legal documents), the three-scheme approach provides a risk margin that justifies its minimal additional cost.
Integration with Existing Systems
The PQ Decision Engine does not require greenfield deployment. It integrates with existing systems through a standard API. An existing ML pipeline sends encrypted features to the inference endpoint and receives an attested decision token. The token is a self-contained 74-byte blob that can be stored in any database, transmitted through any message queue, and verified by any system with the public keys.
The integration surface is deliberately minimal. There is one endpoint for submitting encrypted inputs and receiving attested decisions. There is one endpoint for verifying attestations. There are no complex state management requirements, no long-lived sessions, and no bidirectional dependencies. A system integrates by sending encrypted data and storing the 74-byte result.
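As an illustration of how small that surface is, client-side handling of the token needs nothing beyond length validation and an encoding for storage. The class below is a sketch under assumptions (the engine's actual SDK and endpoint names are not specified in the text):

```python
import base64
from dataclasses import dataclass

@dataclass(frozen=True)
class DecisionToken:
    """Hypothetical client-side wrapper for the 74-byte attestation
    returned by the decision endpoint."""
    blob: bytes

    def __post_init__(self) -> None:
        if len(self.blob) != 74:
            raise ValueError("attestation must be exactly 74 bytes")

    def to_db(self) -> str:
        # Base64 lets the blob live in any text column or message queue.
        return base64.b64encode(self.blob).decode("ascii")

    @staticmethod
    def from_db(encoded: str) -> "DecisionToken":
        return DecisionToken(base64.b64decode(encoded))
```

Because the token is opaque and fixed-size, downstream systems treat it like any other short identifier: store it, forward it, and hand it to the verification endpoint when trust needs to be established.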
This design reflects a philosophy: the PQ Decision Engine is a decision authority, not a platform. It does not want to own the data pipeline, the ML training infrastructure, or the business logic. It wants to receive encrypted features, produce attested decisions, and get out of the way. The 74-byte attestation is the only artifact it produces, and the only artifact downstream systems need.
Looking Forward
The quantum threat is not speculative. NIST standardized PQ cryptography. NSA mandated PQ migration timelines. Every major government and financial institution has a quantum risk assessment program. The question is not whether PQ protection is needed, but when the migration happens.
For decision records, the answer is now. A decision record signed today with ECDSA and retained for seven years is already at risk -- because the record is being created within the quantum threat horizon. Waiting to migrate means every decision record created during the delay period is unprotected.
The PQ Decision Engine makes the migration practical. It does not require replacing existing ML infrastructure. It does not require retraining models. It does not require restructuring data pipelines. It adds a PQ attestation layer on top of encrypted computation, producing durable 74-byte proofs that travel with decisions across systems and across time.
74 bytes. Three independent hardness assumptions. Any computation. Post-quantum attested. For the lifetime of the record.