When your password is stolen, you change it. When your credit card is compromised, the bank issues a new one. When your Social Security number leaks, you can freeze your credit and monitor for fraud.

When your fingerprint is stolen, you are out of options for the rest of your life.

You have ten fingerprints. You have one face. You have two irises. That is the complete, non-renewable inventory of biometric credentials you will ever possess. There are no replacements. There are no resets. There is no "forgot my fingerprint" flow.

And right now, billions of biometric records are sitting in plaintext on servers around the world, protected by the same perimeter security that has failed catastrophically and repeatedly for the last two decades.

The Breach Timeline Nobody Talks About

The cybersecurity industry publishes breach reports measured in email addresses and passwords. The biometric breaches are worse, and they are permanent.

2015: U.S. Office of Personnel Management (OPM)

5.6 million fingerprints stolen from federal employee background check records. The attackers — attributed to Chinese state actors — obtained the raw fingerprint images. These are the fingerprints of people who hold security clearances. They cannot be reissued.

2019: BioStar2 / Suprema

1 million+ fingerprint records and facial recognition data exposed in an unprotected database. BioStar2 is used by thousands of organizations worldwide for building access. The data included raw fingerprint images and facial geometry — stored unencrypted in Elasticsearch.

2019: India Aadhaar

1.1 billion biometric records in the world's largest biometric database, repeatedly found accessible through unsecured API endpoints. Fingerprints and iris scans for the majority of India's population, stored centrally and accessed through a system that journalists breached multiple times for as little as $8.

2020: Clearview AI

3 billion+ facial images scraped from social media and stored in a searchable database. The client list was stolen. Law enforcement agencies, private companies, and unauthorized users had access to a facial recognition system built on data people never consented to provide.

2021: Verkada

150,000+ security cameras compromised, including cameras in hospitals, prisons, Tesla factories, and Cloudflare offices. Attackers accessed live feeds and archived footage containing facial data. The root cause: a shared admin credential found on the public internet.

2024: Outabox (Australia)

1.05 million records leaked from a facial recognition system used in clubs and pubs for identity verification. Names, driver's license numbers, addresses, dates of birth, phone numbers, and facial scans — all in one breach. A website was set up allowing anyone to check whether their face was in the database.

Every one of these breaches has something in common: the biometric data was stored in plaintext. Raw fingerprint images. Unencrypted facial geometry vectors. Iris scans sitting in databases that were accessible to anyone with the right (or wrong) credentials.

The industry's response has been to add more perimeter security. Better firewalls. Stronger access controls. Shorter session tokens. All of which address how attackers get in, but none of which address what attackers find when they do.

Why Standard Biometrics Are Architecturally Broken

The fundamental flaw in every standard biometric system is the same: the server must see your biometric in plaintext to match it.

Here is how every major biometric authentication system works today:

  1. You scan your fingerprint or face on a sensor
  2. A biometric template (mathematical representation) is generated
  3. That template is sent to a server
  4. The server compares it against a stored template in a database
  5. The server returns a match/no-match decision

Steps 3 and 4 are the problem. The template must be transmitted in a form the server can read. The stored template must be in a form the server can compare against. This means the server has access to the raw biometric data at two critical points: during transmission and during storage.
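Steps 3 through 5 reduce to a short computation. Here is a minimal sketch of what a standard server does with the plaintext vectors; the vectors, dimensionality, and threshold are illustrative, not any vendor's real parameters:

```python
import math

def cosine_similarity(a, b):
    # Standard similarity metric for biometric template vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

stored = [0.82, 0.15, 0.94, 0.32]   # plaintext template in the database
scan   = [0.80, 0.17, 0.95, 0.30]   # today's slightly different scan

MATCH_THRESHOLD = 0.99              # illustrative threshold
is_match = cosine_similarity(stored, scan) >= MATCH_THRESHOLD
print(is_match)   # True: close scans of the same finger clear the threshold
```

Note that both vectors must be plaintext for the arithmetic to run at all — which is exactly the exposure described above.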

"But we hash the template!" is the most common objection. It doesn't solve the problem. Hashing works for passwords because you can compare hash outputs directly. Biometric matching requires similarity comparison, not exact matching. Your fingerprint scan is slightly different every time — different angle, different pressure, different moisture. A hash of today's scan will never match a hash of yesterday's scan, even from the same finger. So the server needs the actual vector to compute similarity. And that means the server needs plaintext.

The Permanence Problem

A password breach affects one account. You change the password. A biometric breach affects every system that uses that biometric, forever. If your fingerprint is stolen from a building access system, it can be used to defeat the fingerprint scanner on your phone, your bank's biometric login, your government ID system, and every future system that uses fingerprints for the rest of your life. The attack surface is permanent and expanding.

How Encrypted Biometrics Work: The H33 Approach

There is exactly one way to eliminate the plaintext vulnerability: never let the server see the biometric at all.

H33 uses Fully Homomorphic Encryption (FHE) to match biometrics on encrypted data. The server performs the entire matching computation without ever decrypting the biometric template. Here is the exact pipeline:

Standard Biometrics

  1. User scans fingerprint
  2. Template generated: [0.82, 0.15, 0.94, ...]
  3. Template sent to server in plaintext
  4. Server reads stored template in plaintext
  5. Server computes similarity on plaintext vectors

Result: Server saw everything. Database has everything. Breach = permanent identity theft.

H33 Encrypted Biometrics

  1. User scans fingerprint
  2. Template generated and encrypted on-device with FHE
  3. Encrypted ciphertext sent to server: 7a3f9c...noise
  4. Server reads stored template — also encrypted
  5. Server computes similarity on encrypted vectors (FHE inner product)

Result: Server saw nothing. Database has ciphertext. Breach = useless noise.

This is not a theoretical capability. H33's FHE biometric matching runs in production at 2,209,429 operations per second on a single ARM CPU. The per-authentication latency is 35.25 microseconds. For context, a human blink takes 300 milliseconds — H33 completes 8,500 full biometric authentications in the time it takes you to blink once.

Biometric Scan → On-Device FHE Encrypt → Encrypted Match (BFV Inner Product) → ZK-STARK Proof → Dilithium Signature
// Server never sees raw biometric. Entire pipeline: 35.25μs. Post-quantum at every layer.
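The "compute on ciphertext" step can be illustrated with a much simpler homomorphic scheme. The sketch below uses textbook Paillier with toy parameters (additively homomorphic only, so unlike BFV it can multiply ciphertext by plaintext but not ciphertext by ciphertext, and it is in no way secure). It is not H33's scheme, but it shows the essential property: the server computes an inner product over an encrypted query without ever decrypting it.

```python
import math
import random

# Toy Paillier keypair. Real deployments use primes of ~1536 bits.
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

def he_add(c1: int, c2: int) -> int:
    # E(a) * E(b) mod n^2 = E(a + b): addition under encryption.
    return (c1 * c2) % n2

def he_scale(c: int, k: int) -> int:
    # E(a)^k mod n^2 = E(k * a): plaintext scalar multiplication.
    return pow(c, k, n2)

# Client side: encrypt the scanned template (small ints for the toy scheme).
query = [3, 1, 4, 1]
enc_query = [encrypt(v) for v in query]

# Server side: inner product against its stored vector, on ciphertext only.
stored = [2, 7, 1, 8]
acc = encrypt(0)
for c, w in zip(enc_query, stored):
    acc = he_add(acc, he_scale(c, w))

# Client side: only the key holder can read the similarity score.
score = decrypt(acc)
print(score)   # 3*2 + 1*7 + 4*1 + 1*8 = 25
```

The server in this sketch handled nothing but ciphertext; a full FHE scheme like BFV extends the same idea to encrypted-times-encrypted products, which is what lets both the query and the stored template stay encrypted.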

What the Server Actually Stores

In a standard biometric system, the database contains something like this:

-- Standard biometric database (what attackers steal)
user_id: 4819273
fingerprint_template: [0.8234, 0.1592, 0.9471, 0.3218, 0.7743, ...]  -- 128 floats, plaintext
face_geometry: [0.2341, 0.8872, 0.1193, 0.5567, 0.9912, ...]         -- 512 floats, plaintext
iris_pattern: [0.5521, 0.3319, 0.7789, 0.1145, 0.6623, ...]          -- 256 floats, plaintext
-- Stolen = permanent identity compromise

In H33's system, the database contains this:

-- H33 encrypted biometric database (what attackers steal)
user_id: 4819273
enrolled_template: BFV_CIPHERTEXT[
  4a9f3c2d8b1e7f6a5c0d9e8b7a6f5c4d3e2b1a0f9e8d7c6b5a4f3e2d1c0b9a
  8f7e6d5c4b3a2918f7e6d5c4b3a29180f7e6d5c4b3a29187f6e5d4c3b2a1908
  ...  -- 32KB of lattice-based ciphertext. No mathematical relationship to the biometric.
]
-- Stolen = random noise. Cannot be decrypted without the private key.
-- Cannot be used to reconstruct the biometric.
-- Cannot be replayed against other systems.

The ciphertext is computationally indistinguishable from random noise. An attacker who steals the entire database gets exactly nothing. No fingerprints. No facial geometry. No iris patterns. Just noise that cannot be reversed without the decryption key, which never exists on the server.

The Comparison That Matters

| Property | Standard Biometrics | H33 FHE Biometrics |
| --- | --- | --- |
| Server sees raw biometric | Yes — required for matching | Never — matching on ciphertext |
| Database stores plaintext | Yes — templates in readable form | Never — BFV ciphertext only |
| Breach exposes biometrics | Yes — permanent identity theft | No — ciphertext is noise |
| Template replay attack | Stolen template works on other systems | Ciphertext is key-specific, non-transferable |
| Quantum resistance | None — classical crypto only | BFV lattice (NIST L1-L5), Dilithium signatures |
| Regulatory compliance | GDPR/BIPA liability for storing biometrics | No biometric data in scope — only ciphertext |
| Matching accuracy | 99.5-99.9% (depending on vendor) | Mathematically identical — same inner product, same threshold |
| Matching latency | 50-200ms typical | 35.25μs (1,000-5,000x faster) |
| Insider threat | DBAs, engineers can access templates | Nobody can access templates — no plaintext exists on server |
| Subpoena exposure | Biometric data can be compelled | Only ciphertext exists — nothing useful to produce |

The Regulatory Tsunami

Legislators have noticed that biometric data is uniquely dangerous, and the regulatory response is accelerating. Illinois's BIPA allows private lawsuits with statutory damages per violation, and GDPR classifies biometric data as a special category requiring explicit consent and heightened protection.

The legal landscape is clear: storing biometric data in plaintext is an escalating liability. Every database that holds raw biometric templates is a lawsuit and a regulatory fine waiting for a breach to trigger it.

FHE Biometrics and BIPA/GDPR Compliance

When biometric matching happens entirely on encrypted data, the server never possesses "biometric information" as defined by BIPA or "biometric data" as defined by GDPR. The ciphertext stored on the server has no mathematical relationship to the biometric it was derived from. This is not a legal workaround — it is an architectural elimination of the data category that triggers regulatory requirements. You cannot breach data that doesn't exist in a readable form.

Why "Encrypt at Rest" Is Not the Same Thing

The most common pushback from biometric vendors is: "We encrypt biometric data at rest." This sounds reassuring. It is not.

Encryption at rest means the data is encrypted when it's sitting on disk. But to perform a biometric match, the server must decrypt the data into memory, perform the comparison, and then re-encrypt or discard it. During that window — which can last milliseconds to seconds — the plaintext biometric exists in server memory.
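The window is easy to see in code. The sketch below stands in a throwaway XOR stream cipher for AES (purely illustrative, not secure): to compare templates, the server has no choice but to decrypt into memory first.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a byte stream from the key (toy construction, NOT secure).
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Same operation encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

KEY = b"at-rest key held by the server"
template = bytes([82, 15, 94, 32])            # plaintext template
stored_at_rest = xor_cipher(template, KEY)    # what sits on disk

def match(probe: bytes, threshold: int = 50) -> bool:
    # The decryption window: the plaintext template exists in server
    # memory right here. A memory dump during this call captures it.
    plaintext = xor_cipher(stored_at_rest, KEY)
    distance = sum((a - b) ** 2 for a, b in zip(plaintext, probe))
    return distance < threshold

print(match(bytes([80, 17, 95, 30])))   # True: a close scan matches
```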

This means encryption at rest protects against someone stealing the hard drive. It does not protect against someone compromising the server. And server compromise is the actual threat model: nobody breaks into a data center to steal disk drives. They exploit a vulnerability, gain remote access, and read data from a running system.

FHE is fundamentally different. The data is never decrypted at any point in the processing pipeline. There is no "decryption window." There is no plaintext in memory. The computation happens on the ciphertext itself. The server doesn't have the decryption key. There is nothing to intercept, nothing to dump, nothing to extract.

The Performance Myth

For years, the argument against encrypted biometrics was performance. "FHE is too slow for real-time matching." That was true in 2018. It is not true in 2026.

H33's BFV-based biometric engine processes 32 users per ciphertext using SIMD batching. The FHE inner product that performs the biometric match runs in 939 microseconds per batch. Each authentication includes a full post-quantum pipeline: FHE matching, ZK-STARK proof generation, and Dilithium signature attestation.
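The amortization behind those figures is straightforward, taking the quoted batch numbers at face value:

```python
# 32 users packed into one BFV ciphertext via SIMD batching,
# 939 microseconds per batched inner product (figures quoted above).
batch_latency_us = 939
users_per_batch = 32

per_user_us = batch_latency_us / users_per_batch
print(round(per_user_us, 2))   # 29.34 μs of FHE matching work per user
```

If the 35.25 μs total is read as per-user end-to-end latency, that leaves roughly 6 μs per authentication for the ZK-STARK proof and Dilithium signature — an inference from the quoted numbers, not a published breakdown.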

Total: 35.25 microseconds per authentication. That is faster than most plaintext biometric systems, which typically require 50-200 milliseconds for template extraction, network round-trip, and server-side comparison.

The encrypted path is not just more secure. It is faster. The performance penalty that was supposed to make FHE impractical doesn't exist anymore.

What Should Have Been Done

Every biometric breach in history — OPM, BioStar2, Aadhaar, Clearview, Outabox — was preventable with one architectural decision: don't let the server see the biometric.

Not "add more firewalls." Not "encrypt at rest." Not "tokenize the template." Not "hash the features." Those are mitigations. They reduce the blast radius of a breach. They don't eliminate it.

Eliminating it requires FHE. The server matches without seeing. The database stores without knowing. The breach exposes without revealing.

Your fingerprint is permanent. Your face is permanent. Your iris is permanent. The system that protects them should make their exposure impossible, not merely unlikely.

H33 makes it impossible. One API call. 35.25 microseconds. The server never sees your face. Ever.
