April 28, 2026 · Security

Why Every Biometric Vendor Decrypts to Compare — And Why That's a Breach Waiting to Happen

Every server-side biometric system in production today follows the same pattern: encrypt at rest, decrypt to compare, re-encrypt after. The comparison step — the moment the plaintext biometric exists in server memory — is the vulnerability window that no amount of perimeter security can close. H33 eliminates it entirely. The server computes the match on encrypted ciphertext. The plaintext biometric never exists on the server. Not at rest, not in transit, not during computation.


The Industry's Dirty Secret

Pick any enterprise biometric vendor. Read their security whitepaper. You will find a consistent architecture: biometric templates are encrypted at rest using AES-256 or similar, stored in a secured database, and then — at the moment of authentication — decrypted into server memory so the matching algorithm can compute a similarity score against the enrollment template.

This is the universal standard. Every major cloud biometric service, every enterprise identity platform, and every government ID system that performs server-side matching follows this pattern. The template must be plaintext for the comparison function to operate on it.

Apple's Face ID is the notable exception — it performs matching entirely on the device's Secure Enclave and never transmits the template. But Face ID is a local authentication mechanism. It does not solve the server-side matching problem that every enterprise, every government agency, and every multi-device authentication system must address.

The moment the template is decrypted for comparison, it exists in plaintext in process memory. If an attacker has achieved memory access on the server — via kernel exploit, side-channel, memory dump, cold boot, or compromised hypervisor — they obtain the raw biometric. Not an encrypted blob. Not a hash. The actual biometric template that can reconstruct the user's face, fingerprint, or voice characteristics.

Biometrics are permanent. You cannot reset your fingerprint. You cannot change your retinal pattern. You cannot get a new face. Every breach of a plaintext biometric database creates permanent, irrevocable identity compromise for every affected user. There is no remediation. There is no "change your password." The damage is forever.

What Decryption During Matching Actually Means

To understand the vulnerability, walk through the standard biometric authentication flow step by step:

  1. Enrollment. The user provides a biometric sample (face scan, fingerprint, voice recording). The system extracts a feature vector — a numerical template — and encrypts it with a server-side key. The encrypted template is stored in a database. At this point, the data is protected.
  2. Authentication request. The user provides a new biometric sample. The system extracts a fresh feature vector and sends it to the server.
  3. Template retrieval. The server retrieves the stored encrypted template from the database and decrypts it using the server-side key. The plaintext enrollment template now exists in memory.
  4. Comparison. The server computes a similarity score between the fresh sample and the decrypted enrollment template. This is typically a cosine similarity or Euclidean distance calculation. Both templates — enrollment and fresh — are plaintext in memory during this operation.
  5. Decision. If the similarity score exceeds the threshold, authentication succeeds. The plaintext templates are (hopefully) zeroed from memory.
  6. Re-encryption. The enrollment template is re-encrypted and returned to storage.
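The decrypt-to-compare window in steps 3 through 5 fits in a few lines of Python. This is an illustrative sketch, not any vendor's actual code; `decrypt`, the threshold, and the vectors are placeholders:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Step 4: similarity on two PLAINTEXT vectors. There is no way to run
    # this function without both templates decrypted in memory.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(stored_ciphertext, fresh_template, decrypt, threshold=0.95):
    enrollment = decrypt(stored_ciphertext)   # step 3: plaintext exists from here
    score = cosine_similarity(enrollment, fresh_template)
    return score >= threshold                 # step 5: plaintext should now be zeroed
```

The vulnerability window is everything between the `decrypt` call and whenever `enrollment` is actually cleared, which Python (and most managed runtimes) gives you no reliable way to control.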

Steps 3 through 5 are the vulnerability window. The duration depends on the matching algorithm, the number of concurrent authentications, and the server's memory management. For a system handling thousands of authentications per second, plaintext biometric templates are continuously present in server memory.

TEEs Are Not a Solution

The industry's response to this vulnerability has been Trusted Execution Environments: Intel SGX, ARM TrustZone, AMD SEV. The pitch is simple — perform the decryption and comparison inside a hardware enclave, where even the operating system cannot access the memory.

TEEs are not a cryptographic guarantee. They are a hardware trust assumption. And hardware trust assumptions get broken:

  - Foreshadow (2018) used the L1 Terminal Fault speculative-execution flaw to read SGX enclave memory.
  - Plundervolt (2019) used voltage fault injection to corrupt computations running inside SGX enclaves.
  - SGAxe (2020) extracted SGX attestation keys, undermining trust in the enclave ecosystem itself.
  - ÆPIC Leak (2022) read stale enclave data directly from architecturally accessible APIC registers.

Every one of these attacks extracted data that was supposed to be protected by hardware isolation. TEEs raise the bar. They do not eliminate the vulnerability. The plaintext biometric still exists — it just exists inside an enclave instead of in general memory. When the enclave is compromised, the data is exposed.

The fundamental problem is not where the decryption happens. The fundamental problem is that the decryption happens at all.

FHE Inner Product on Ciphertext

H33 takes a different approach. The biometric template is encrypted client-side using BFV fully homomorphic encryption. The encrypted template is sent to the server. The server stores it encrypted. When authentication is requested, the server computes the similarity score directly on the encrypted ciphertext — without ever decrypting either template.

Here is the flow:

  1. Enrollment. The client extracts a biometric feature vector and encrypts each component using BFV with the client's public key. The encrypted template (a vector of BFV ciphertexts) is sent to the server and stored. The server never possesses the decryption key.
  2. Authentication request. The client extracts a fresh biometric feature vector and encrypts it with the same BFV public key. The encrypted fresh template is sent to the server.
  3. Encrypted matching. The server computes the inner product of the two encrypted vectors using FHE homomorphic multiplication and addition. The result is an encrypted similarity score — a BFV ciphertext that encodes the match result. The server performs this computation without accessing any plaintext values.
  4. Encrypted response. The server returns the encrypted similarity score to the client.
  5. Client-side decision. The client decrypts the similarity score using its private key and applies the threshold locally. Authentication succeeds or fails based on the decrypted score.
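BFV itself is too heavy for a blog snippet, but the shape of the five-step flow can be shown with a toy somewhat-homomorphic scheme over the integers, in the style of DGHV. To be clear: this is not BFV, not H33's code, and not secure. It only demonstrates the core property — the server multiplies and adds ciphertexts it cannot read:

```python
import secrets

T = 257  # toy plaintext modulus; inner products here stay well below it

def keygen() -> int:
    # Secret key: a large odd integer, known only to the client.
    return secrets.randbits(256) | 1 | (1 << 255)

def encrypt(m: int, p: int) -> int:
    # c = p*q + T*r + m: recoverable only with p; the noise term T*r hides m.
    q = secrets.randbits(120)
    r = secrets.randbits(32)
    return p * q + T * r + m

def decrypt(c: int, p: int) -> int:
    return (c % p) % T

def encrypted_inner_product(cs1: list[int], cs2: list[int]) -> int:
    # Runs on the SERVER: multiplies and sums ciphertexts only.
    # No plaintext and no secret key is ever present here.
    return sum(a * b for a, b in zip(cs1, cs2))

# Client side: enrollment and fresh templates, quantized to small ints.
sk = keygen()
enrollment = [3, 1, 4, 1]
fresh      = [3, 1, 4, 2]
enc_enroll = [encrypt(v, sk) for v in enrollment]
enc_fresh  = [encrypt(v, sk) for v in fresh]

# Server side: computes the score without the key.
enc_score = encrypted_inner_product(enc_enroll, enc_fresh)

# Client side: decrypts and applies the threshold locally.
score = decrypt(enc_score, sk)
assert score == sum(a * b for a, b in zip(enrollment, fresh))  # 28
```

The structure matches the protocol above: encryption and decryption happen only where the secret key lives, and the matching arithmetic happens only where it does not.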

At no point does the server hold a plaintext biometric. Not during enrollment. Not during storage. Not during matching. Not during the decision. The decryption key never leaves the client. The server operates exclusively on ciphertext.

This is not encrypt-at-rest. This is encrypt-always. The server is cryptographically prevented from accessing the biometric data, not just policy-prevented. Even if the server is fully compromised — root access, memory dumps, everything — the attacker obtains BFV ciphertexts. Without the client's private key, these are computationally indistinguishable from random polynomial vectors.

What a Breach Gets You

Consider two scenarios: a traditional biometric server is breached, and an H33 biometric server is breached. Both scenarios assume full server compromise — the attacker has root access, can dump memory, and can exfiltrate any data on the machine.

What the attacker obtains          | Traditional                          | H33 (FHE)
-----------------------------------|--------------------------------------|----------------------------------
Stored templates                   | Plaintext (AES key on same server)   | BFV ciphertexts
Templates in memory during match   | Plaintext                            | BFV ciphertexts
Match scores                       | Plaintext float values               | Encrypted BFV ciphertext
Decryption keys                    | On server (must be, for decryption)  | Never on server
Biometric reconstruction           | Possible from templates              | Computationally infeasible
Replay attack                      | Re-use extracted template            | Ciphertext is non-deterministic
Damage duration                    | Permanent (biometrics don't change)  | None (no biometric data obtained)

With a traditional system, a breach yields the biometric itself. The attacker can reconstruct face geometry, fingerprint minutiae, or voice characteristics. They can replay the template against any system that accepts the same biometric format. The damage is permanent because the biometric cannot be revoked or rotated.

With H33, a breach yields BFV polynomial vectors. Each ciphertext is a pair of polynomials in Z_q[x]/(x^N + 1) where N=4096 and q is a 56-bit modulus. Without the secret key, extracting the plaintext from the ciphertext requires solving the Ring Learning With Errors (RLWE) problem — a lattice problem believed to be hard even for quantum computers. The ciphertexts are useless. No biometric data is exposed. No permanent damage occurs.
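The ring arithmetic behind those ciphertexts is ordinary polynomial multiplication with a sign-flipping wraparound. A minimal sketch, using a tiny N for readability (production N is 4096):

```python
def negacyclic_mul(a: list[int], b: list[int], q: int) -> list[int]:
    # Multiply in Z_q[x]/(x^N + 1): whenever a product term reaches x^N,
    # it wraps around with a sign flip, because x^N = -1 in this ring.
    n = len(a)
    out = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < n:
                out[i + j] = (out[i + j] + ai * bj) % q
            else:
                out[i + j - n] = (out[i + j - n] - ai * bj) % q
    return out

# (1 + x^3) * x = x + x^4 = -1 + x in Z_97[x]/(x^4 + 1)
print(negacyclic_mul([1, 0, 0, 1], [0, 1, 0, 0], 97))  # [96, 1, 0, 0]
```

At the parameters in the text (N = 4096, 56-bit q), a single ciphertext pair is roughly 2 × 4096 × 56 bits, about 56 KiB, before any compression — which is all an attacker ever obtains.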

The Numbers

2,209,429
Encrypted biometric authentications per second · Sustained 120 seconds · Graviton4 c8g.metal-48xl

H33 performs encrypted biometric matching at 42 microseconds per authentication. That is the full pipeline: FHE inner product on the encrypted template pair, batch attestation with post-quantum signatures, and ZKP cached verification. Every authentication. Every time.

Pipeline Stage                                     | Latency  | % of Pipeline
---------------------------------------------------|----------|--------------
FHE batch inner product (32 users)                 | 943 µs   | 70%
Batch attestation (SHA3 + Dilithium sign + verify) | 391 µs   | 29%
ZKP cached verification (CacheeEngine)             | 0.358 µs | <1%
Total (32-user batch)                              | 1,345 µs | 100%
Per authentication                                 | 42 µs    |
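The per-authentication figure is the 32-user batch amortized. The arithmetic, using only the numbers from the table and the headline, as a back-of-envelope check rather than a statement about the deployment topology:

```python
# Numbers taken directly from the table and the headline figure.
batch_total_us = 1_345.0                       # latency of one 32-user batch
batch_size = 32
per_auth_us = batch_total_us / batch_size      # 42.03 µs -> the quoted 42 µs

throughput_per_sec = 2_209_429                 # sustained headline rate
# Pipeline-seconds of work per wall-clock second: roughly how many batch
# pipelines must run concurrently to sustain that rate on one machine.
concurrency = throughput_per_sec * per_auth_us / 1e6
print(round(per_auth_us, 2), round(concurrency))  # 42.03 93
```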

Three independent biometric modalities are supported — face, voice, and keystroke dynamics — all processed on encrypted ciphertext. Seventeen spoofing types are detected without decrypting the biometric: replay attacks, presentation attacks, synthetic face injection, voice synthesis, and twelve additional vectors, all evaluated homomorphically on the encrypted template.

Every match result is attested via H33-74: 74 bytes containing a three-family post-quantum signature (ML-DSA + FALCON + SLH-DSA) that permanently commits the authentication decision, the modality used, and the confidence score. The attestation is cryptographically bound to the computation — you cannot forge a match result without forging three independent post-quantum signatures simultaneously.

BIPA, GDPR, and CCPA: The Compliance Shift

The legal frameworks governing biometric data were written with one assumption: that the entity performing biometric matching possesses the plaintext biometric. FHE breaks that assumption.

Illinois BIPA

The Illinois Biometric Information Privacy Act (740 ILCS 14) imposes the strictest biometric data requirements in the United States. It requires written informed consent before collection, mandates a published retention and destruction schedule, and provides a private right of action with statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation.

BIPA defines "biometric identifier" as a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry. The critical question for FHE systems: is a BFV ciphertext that encodes a biometric template a "biometric identifier" under BIPA?

The server processes the ciphertext but cannot extract the biometric from it. The ciphertext is not a "scan of face geometry" — it is an encrypted polynomial that encodes face geometry in a form that is computationally inaccessible without the client's private key. The compliance posture shifts from "we protect your biometric data" to "we never possess your biometric data."

GDPR Article 32

GDPR Article 32 requires "appropriate technical and organisational measures" to ensure security appropriate to the risk, explicitly listing "encryption" as an example measure. FHE is the strongest possible implementation of this requirement because it extends encryption protection to the computation itself, not just storage and transit.

The Article 29 Working Party (now EDPB) has established that pseudonymization and encryption are key measures under the risk-based approach. Processing biometric data without ever accessing the plaintext represents the maximum possible technical safeguard — the data processor is cryptographically prevented from accessing the personal data, not merely contractually restricted.

CCPA / CPRA

The California Consumer Privacy Act and its successor (CPRA) classify biometric information as "sensitive personal information" subject to the right to limit use and disclosure. If the server-side processor never accesses the plaintext biometric, the "use" of the biometric data is limited by mathematics, not by policy. The consumer's biometric data is processed but never disclosed — even to the entity performing the processing.

This is not legal advice. These are observations about the interaction between FHE biometric processing and current regulatory frameworks. The legal interpretation of encrypted biometric processing under BIPA, GDPR, and CCPA is an emerging area. What is not emerging is the technical fact: FHE prevents the server from accessing the plaintext biometric. That technical fact changes the risk calculus under any framework.

Why This Matters Now

Biometric adoption is accelerating. The FIDO Alliance reports over 12 billion online accounts are passkey-enabled as of 2025. India's Aadhaar system holds biometric data for 1.4 billion people. The EU's Entry/Exit System will collect biometric data from every non-EU traveler. Financial institutions are deploying voice biometrics for call center authentication. Hospitals are using palm vein scanners for patient identification.

Every one of these systems creates a growing pool of permanently sensitive data. Every one of them follows the same architecture: encrypt at rest, decrypt to compare. Every one of them maintains a vulnerability window during matching that cannot be closed by firewalls, access controls, or TEEs.

The scale of the problem compounds the severity. A breach of a biometric database containing 10 million templates creates 10 million permanent identity compromises. There is no notification letter that fixes it. There is no credit monitoring that helps. The biometric is exposed forever.

The only architectural solution is to eliminate the plaintext from the server entirely. Not to protect it better. Not to isolate it in hardware. To ensure it never exists there in the first place. That is what FHE biometric matching provides.

The Three-Engine Advantage

Biometric authentication is not a single operation. It is a pipeline that requires different types of computation at different stages. H33 routes each stage to the optimal FHE engine automatically:

Stage              | Engine        | Operation                                     | Why This Engine
-------------------|---------------|-----------------------------------------------|-------------------------------------------------
Template matching  | BFV (H33-128) | Inner product on encrypted vectors            | Exact integer arithmetic, no approximation error
Threshold decision | TFHE          | Encrypted comparison: "is similarity > 0.95?" | Boolean/comparison operations on encrypted bits
Attestation        | H33-74        | Three-family PQ signature on match result     | Permanent cryptographic proof of computation

The BFV engine handles the core similarity computation. BFV operates on exact integers — no approximation, no accumulated floating-point error across the inner product. The template vectors are quantized to integers during enrollment, and the inner product is computed exactly in the encrypted domain.
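A sketch of that quantization step. The scale factor and the vectors are illustrative choices, not H33's actual parameters:

```python
SCALE = 1 << 10   # 10 fractional bits; an illustrative choice, not H33's parameter

def quantize(vec: list[float]) -> list[int]:
    # Map float features to integers once, at enrollment; every encrypted
    # operation after this point is exact integer arithmetic.
    return [round(v * SCALE) for v in vec]

a = [0.12, -0.50, 0.33]    # toy feature vectors
b = [0.11, -0.48, 0.31]
qa, qb = quantize(a), quantize(b)

int_dot = sum(x * y for x, y in zip(qa, qb))   # exact; no rounding drift
score = int_dot / SCALE**2                     # de-scaled client-side after decryption
```

The only approximation error is the one-time rounding in `quantize`; the inner product itself accumulates none, which is why BFV's exact integer semantics fit this stage.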

If the application requires an encrypted threshold decision — "is this person a match?" answered in the encrypted domain without revealing the similarity score — the TFHE engine handles the comparison. TFHE operates on encrypted bits and supports arbitrary Boolean circuits, including greater-than comparisons. The threshold decision is computed without decrypting the similarity score.
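What "greater-than as a Boolean circuit" means concretely: the comparison below uses only AND/XOR/NOT gates, evaluated here on plaintext bits for illustration. A TFHE-style scheme evaluates the same gates on encrypted bits, so neither operand nor result is ever visible to the server:

```python
def greater_than(a_bits: list[int], b_bits: list[int]) -> int:
    # Unsigned comparison, MSB first, using only AND/XOR/NOT: the gate
    # set a TFHE-style scheme can evaluate directly on encrypted bits.
    gt, eq = 0, 1
    for a, b in zip(a_bits, b_bits):
        gt ^= eq & a & (1 ^ b)     # first position where a=1 and b=0
        eq &= 1 ^ (a ^ b)          # prefixes still equal so far
    return gt

def to_bits(x: int, n: int) -> list[int]:
    return [(x >> i) & 1 for i in reversed(range(n))]

# "is similarity > threshold?" on quantized scores: 973 vs a 950 cutoff
print(greater_than(to_bits(973, 10), to_bits(950, 10)))  # 1
```

Because every gate has an encrypted counterpart, the decision bit can itself be returned as a ciphertext — the server learns neither the score nor the outcome.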

Every match — regardless of which engines are involved — produces an H33-74 attestation. The attestation binds the match result to the computation that produced it: the modality, the engine routing decision, the similarity threshold, and the authentication decision. The attestation is 74 bytes, three-family post-quantum, and permanent.

The router handles engine selection automatically. The developer submits a biometric authentication request; the system selects BFV for the inner product, optionally routes through TFHE for the threshold, and produces the H33-74 attestation. One API call. Three engines. Zero plaintext exposure.

The Server Never Sees the Face

This is the core claim, stated plainly: an H33 biometric authentication server can process millions of biometric authentications per second without ever seeing a single plaintext biometric. Not for a millisecond. Not in a TEE. Not in a temporary buffer. The plaintext does not exist on the server at any point during any operation.

The security of this claim does not depend on hardware trust assumptions (TEE integrity), operational security (key management discipline), access control (perimeter firewalls), or policy enforcement (data handling agreements). It depends on the hardness of the Ring Learning With Errors problem — a mathematical assumption that has withstood decades of cryptanalysis and is believed to resist quantum attack.

42 µs
Per-authentication latency · Full FHE pipeline · Post-quantum attested · Zero plaintext exposure

Every biometric vendor will eventually face a breach. The question is not whether the server will be compromised. The question is what the attacker gets when it is. If the answer is "BFV ciphertexts that cannot be decrypted without a key that was never on the server," the breach is a non-event. If the answer is "plaintext biometric templates for every enrolled user," the breach is permanent and irrevocable.

The architecture you choose today determines which answer your users get.


H33 encrypted biometric authentication is in production. 2,209,429 authentications per second sustained. 42 microseconds per authentication. Three modalities. Seventeen spoofing types detected on ciphertext. Post-quantum attested. The server never sees the biometric. Schedule a demo to see encrypted matching on your own biometric data.


Eric Beans
CEO, H33.ai, Inc.
Patent pending. U.S. Patent Application Nos. 19/309,560 and 19/645,499. Additional applications pending.
All benchmarks measured on AWS c8g.metal-48xl (Graviton4, 192 vCPUs, Neoverse V2), April 2026. Rust 1.94.0. 120-second sustained runs.
All NIST security tests passed: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), FIPS 205 (SLH-DSA). FIPS 140-3 KATs operational. 20,000+ tests across the platform.
H33-74 is a trademark of H33.ai, Inc. AWS and Graviton4 are trademarks of Amazon Web Services, Inc.
BIPA, GDPR, and CCPA references are for informational purposes only and do not constitute legal advice.