Table of Contents
- Why Biometric Login in React?
- Architecture Overview
- Project Setup and Dependencies
- Initializing the H33 SDK
- Building the Camera Capture Component
- Liveness Detection
- Biometric Enrollment Flow
- Biometric Login Flow
- Auth Context and State Management
- Protected Routes and Session Handling
- Server-Side Proxy Layer
- WebAuthn Integration
- Error Handling and Edge Cases
- Security Hardening
- Testing Your Biometric Auth
- Production Deployment Checklist
Passwords are a liability. Users reuse them, phishing attacks steal them, and breaches expose them by the billions. Biometric authentication eliminates whole classes of these risks — your face is not something you can forget, share, or accidentally paste into a Slack channel. But building biometric login into a React application has historically been painful: raw WebRTC APIs, fragile camera handling, client-side template processing, and an entirely separate backend for matching.
H33 collapses all of that into a single API. Your React app captures the biometric, H33 encrypts it with fully homomorphic encryption, matches it against enrolled templates without ever decrypting the data, and returns a post-quantum signed authentication token. The entire round trip completes in under 50 microseconds of server-side compute.
This tutorial walks through every component you need, from camera capture to protected routes. All code is TypeScript, all components are production-ready, and every example is complete enough to copy and run.
1. Why Biometric Login in React?
Before writing any code, let us be clear about what biometric authentication solves and what it does not.
What Passwords Get Wrong
The fundamental problem with password-based authentication is that it relies on a shared secret. The user knows the password. The server knows a hash of the password. Every interaction revolves around proving knowledge of that secret. This model has three critical weaknesses:
- Credential stuffing — Over 15 billion leaked credentials are publicly available. If your users reuse passwords (and research shows 65% do), your application is exposed regardless of how strong your own security is.
- Phishing — Even security-conscious users can be fooled by convincing fake login pages. Passwords are inherently phishable because they are transferable.
- Harvest-now, decrypt-later — Nation-state adversaries are recording encrypted authentication traffic today, planning to decrypt it once quantum computers can break RSA and ECC. Your password hashes and JWT signatures are targets.
What Biometrics Get Right
Biometric authentication eliminates the shared-secret model. Instead of proving what you know, you prove who you are. A face cannot be phished. A face cannot be reused from a breach. A face cannot be guessed by a brute-force bot.
When combined with H33's encrypted biometric pipeline, you get additional guarantees that traditional biometric systems lack:
Encrypted Matching
Biometric templates are encrypted with FHE before leaving the client. The server matches against enrolled templates without ever seeing the raw biometric. Even a full server breach reveals nothing.
Post-Quantum Tokens
Authentication tokens are signed with Dilithium (NIST ML-DSA), making them unforgeable by both classical and quantum adversaries. Your sessions are future-proof.
Zero-Knowledge Proofs
H33 generates a ZK proof that the biometric matched without revealing which template matched or any details about the biometric itself.
Sub-Millisecond Latency
Full authentication completes in ~50 microseconds of server compute. That is faster than a blink — your users will not perceive any delay.
WebAuthn (FIDO2) is excellent for device-bound authentication, but it ties the credential to a specific hardware authenticator. If the user loses their phone, they lose access. H33 biometric auth is device-independent — your face works on any device with a camera. We will show how to combine both approaches later in this tutorial for maximum security.
2. Architecture Overview
Before diving into code, let us map the full data flow. Understanding this architecture will make every subsequent component make sense.
Enrollment Flow
1. The React app requests camera access via `getUserMedia`, captures a video frame, and extracts biometric features client-side.
2. The SDK encrypts the template with FHE in the browser, and your backend forwards it to the `/v1/biometric/enroll` endpoint. H33 stores the encrypted template and returns an enrollment ID.
Authentication Flow
1. The user identifies themselves (for example, by email), and the app captures and encrypts a fresh biometric probe.
2. Your backend calls `/v1/biometric/verify` with the encrypted biometric and the user's enrollment ID. H33 performs FHE-encrypted matching and returns a match result with a ZK proof and a post-quantum signed token.
Key Architectural Principle
Never call H33 directly from the browser. All H33 API calls go through your backend server. This keeps your API key secret and lets you enforce additional business logic (rate limiting, user lookup, audit logging) before and after the biometric check. The React client only talks to your own API.
3. Project Setup and Dependencies
We will use a standard React + TypeScript setup. The examples use Vite for fast development, but the components work identically with Next.js, Remix, or Create React App.
```bash
# Create React project with TypeScript
npm create vite@latest h33-biometric-app -- --template react-ts
cd h33-biometric-app

# Install core dependencies
npm install react-router-dom axios

# Install H33 SDK
npm install @h33/sdk @h33/react

# Install dev dependencies
npm install -D @types/react @types/react-dom
```
Create the project directory structure:
```bash
mkdir -p src/{components,hooks,context,services,pages,utils}
mkdir -p src/components/{biometric,auth,layout}
mkdir -p server/{routes,middleware,services}
```
Set up your environment variables:
```bash
# Client-side (VITE_ prefix makes these available in the browser)
VITE_API_URL=http://localhost:3001/api

# Server-side only (never exposed to the browser)
H33_API_KEY=your_h33_api_key_here
H33_API_URL=https://api.h33.ai/v1
SESSION_SECRET=generate-a-256-bit-random-secret
PORT=3001
```
Your H33_API_KEY must never appear in client-side code. Only server-side environment variables (without the VITE_ prefix) are safe. If your API key is exposed in the browser bundle, revoke it immediately at h33.ai/get-api-key and generate a new one.
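As a fail-fast guard, you can validate the required server-side variables at startup instead of discovering a missing key on the first H33 call. This is a generic sketch — the variable names match the `.env` file above, but the `missingEnv` helper is our own, not part of any SDK:

```typescript
// Return the names of required variables that are missing or empty.
function missingEnv(
  env: Record<string, string | undefined>,
  required: string[]
): string[] {
  return required.filter((name) => !env[name]);
}

// At server startup (names taken from the .env file above):
// const missing = missingEnv(process.env, [
//   'H33_API_KEY',
//   'H33_API_URL',
//   'SESSION_SECRET',
// ]);
// if (missing.length > 0) {
//   throw new Error(`Missing required env vars: ${missing.join(', ')}`);
// }
```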
4. Initializing the H33 SDK
The H33 React SDK provides hooks and components that handle the cryptographic heavy lifting. Initialize it once at the root of your application.
```tsx
import React from 'react';
import ReactDOM from 'react-dom/client';
import { BrowserRouter } from 'react-router-dom';
import { H33Provider } from '@h33/react';
import { AuthProvider } from './context/AuthContext';
import App from './App';

ReactDOM.createRoot(document.getElementById('root')!).render(
  <React.StrictMode>
    <H33Provider
      // The SDK only needs the API URL — your API key stays server-side
      apiUrl={import.meta.env.VITE_API_URL}
      config={{
        biometric: {
          captureMode: 'face',
          livenessCheck: true,
          encryptOnCapture: true, // FHE encryption before transmission
          captureQuality: 'high',
        },
      }}
    >
      <BrowserRouter>
        <AuthProvider>
          <App />
        </AuthProvider>
      </BrowserRouter>
    </H33Provider>
  </React.StrictMode>
);
```
The H33Provider wraps your application with the cryptographic context needed for biometric operations. The encryptOnCapture: true flag is critical — it ensures that biometric data is encrypted with FHE before it ever leaves the user's browser. The raw biometric template never touches your server or H33's servers in an unencrypted form.
5. Building the Camera Capture Component
The camera capture component is the heart of biometric login. It needs to handle WebRTC permissions, camera selection (front vs. rear), real-time preview, and graceful error handling when cameras are unavailable or permissions are denied.
```tsx
import React, { useRef, useState, useCallback, useEffect } from 'react';
import { useBiometricCapture } from '@h33/react';

interface BiometricCaptureProps {
  onCapture: (encryptedTemplate: Uint8Array, livenessScore: number) => void;
  onError: (error: Error) => void;
  mode: 'enroll' | 'verify';
}

type CaptureStatus =
  | 'idle'
  | 'requesting-permission'
  | 'camera-active'
  | 'capturing'
  | 'processing'
  | 'complete'
  | 'error';

export default function BiometricCapture({
  onCapture,
  onError,
  mode,
}: BiometricCaptureProps) {
  const videoRef = useRef<HTMLVideoElement>(null);
  const streamRef = useRef<MediaStream | null>(null);
  const [status, setStatus] = useState<CaptureStatus>('idle');
  const [feedback, setFeedback] = useState('');

  // H33 SDK hook — handles encryption, liveness, and template extraction
  const {
    captureFrame,
    checkLiveness,
    encryptTemplate,
    faceDetected,
    facePosition,
  } = useBiometricCapture({ videoRef });

  // Start the camera
  const startCamera = useCallback(async () => {
    try {
      setStatus('requesting-permission');
      const stream = await navigator.mediaDevices.getUserMedia({
        video: {
          facingMode: 'user', // Front camera
          width: { ideal: 1280 },
          height: { ideal: 720 },
          frameRate: { ideal: 30 },
        },
        audio: false,
      });
      streamRef.current = stream;
      if (videoRef.current) {
        videoRef.current.srcObject = stream;
        await videoRef.current.play();
      }
      setStatus('camera-active');
      setFeedback('Position your face within the oval guide.');
    } catch (err) {
      setStatus('error');
      if (err instanceof DOMException) {
        if (err.name === 'NotAllowedError') {
          setFeedback('Camera permission denied. Please allow camera access.');
        } else if (err.name === 'NotFoundError') {
          setFeedback('No camera found. Please connect a camera.');
        } else {
          setFeedback(`Camera error: ${err.message}`);
        }
      }
      onError(err as Error);
    }
  }, [onError]);

  // Capture and process biometric
  const handleCapture = useCallback(async () => {
    if (!faceDetected) {
      setFeedback('No face detected. Please look at the camera.');
      return;
    }
    try {
      setStatus('capturing');
      setFeedback('Hold still...');

      // 1. Capture the video frame
      const frame = await captureFrame();

      // 2. Run liveness detection
      setStatus('processing');
      setFeedback('Verifying liveness...');
      const livenessResult = await checkLiveness(frame);
      if (!livenessResult.isLive) {
        setFeedback('Liveness check failed. Please try again with better lighting.');
        setStatus('camera-active');
        return;
      }

      // 3. Encrypt the biometric template with FHE
      setFeedback('Encrypting biometric data...');
      const encryptedTemplate = await encryptTemplate(frame);

      // 4. Return encrypted template to parent component
      setStatus('complete');
      setFeedback('Capture complete!');
      onCapture(encryptedTemplate, livenessResult.score);
    } catch (err) {
      setStatus('error');
      setFeedback('Capture failed. Please try again.');
      onError(err as Error);
    }
  }, [faceDetected, captureFrame, checkLiveness, encryptTemplate, onCapture, onError]);

  // Clean up camera on unmount
  useEffect(() => {
    return () => {
      streamRef.current?.getTracks().forEach((t) => t.stop());
    };
  }, []);

  return (
    <div className="biometric-capture">
      <div className="camera-viewport">
        <video
          ref={videoRef}
          autoPlay
          playsInline
          muted
          // Mirror the preview so users see themselves as in a mirror
          style={{ transform: 'scaleX(-1)' }}
        />
        {status === 'camera-active' && (
          <div className={`face-guide ${faceDetected ? 'detected' : ''}`} />
        )}
      </div>

      <p className="capture-feedback">{feedback}</p>

      {status === 'idle' && (
        <button onClick={startCamera} className="btn-primary">
          Start Camera
        </button>
      )}
      {status === 'camera-active' && faceDetected && (
        <button onClick={handleCapture} className="btn-primary">
          {mode === 'enroll' ? 'Enroll My Face' : 'Verify Identity'}
        </button>
      )}
      {status === 'processing' && (
        <div className="capture-spinner">Processing...</div>
      )}
    </div>
  );
}
```
A few important details in this component:
- `scaleX(-1)` mirrors the video preview so the user sees themselves as they would in a mirror. This is purely a UX decision — the captured frame is not mirrored.
- `playsInline` is required for iOS Safari, which otherwise forces fullscreen video playback.
- Stream cleanup in the `useEffect` return is essential. Without it, the camera LED stays on and the MediaStream leaks when the component unmounts.
- Face detection feedback via the `faceDetected` flag from the SDK hook lets you guide the user in real time before they attempt capture.
6. Liveness Detection
Liveness detection is what separates a real biometric system from a glorified photo upload. Without it, an attacker can hold a printed photo or play a video of the target's face in front of the camera and authenticate as them.
The H33 SDK includes built-in liveness detection that runs client-side before the biometric is sent to the server. It uses multiple passive signals to determine liveness:
Texture Analysis
Detects the micro-texture differences between a live face and a printed photo or screen display. Paper has visible dot patterns; screens have pixel grids.
Depth Estimation
Uses monocular depth cues to verify three-dimensionality. A flat photo or screen has no depth variation across the face.
Temporal Analysis
Analyzes micro-movements across multiple frames — subtle involuntary motions like eye micro-saccades that are present in live faces but absent in static attacks.
Reflection Detection
Identifies specular reflections on glasses or screens that indicate a presentation attack using a digital display.
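The SDK fuses these signals into the single score returned by `checkLiveness`. As a mental model only — the signal names and weights below are illustrative assumptions, not H33's actual algorithm — a weighted fusion might look like this:

```typescript
// Illustrative only: fuse per-signal confidences into one liveness score.
// Signal names and weights are assumptions, not the H33 SDK's internals.
interface LivenessSignals {
  texture: number;    // 0..1, micro-texture realism
  depth: number;      // 0..1, three-dimensionality
  temporal: number;   // 0..1, micro-movement across frames
  reflection: number; // 0..1, absence of screen/glare artifacts
}

function fuseLiveness(s: LivenessSignals, threshold = 0.85) {
  const score =
    0.3 * s.texture + 0.3 * s.depth + 0.25 * s.temporal + 0.15 * s.reflection;
  return { score, isLive: score >= threshold };
}
```

A single weak signal (say, a suspicious reflection) lowers the score but does not automatically reject; only the fused score is compared against the threshold.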
If you need to customize the liveness threshold or add active challenges (blink detection, head turn), you can configure the SDK:
```tsx
import React from 'react';
import { useBiometricCapture } from '@h33/react';

export function useLiveness(videoRef: React.RefObject<HTMLVideoElement>) {
  const { checkLiveness } = useBiometricCapture({
    videoRef,
    livenessConfig: {
      // Minimum confidence score (0.0 to 1.0)
      threshold: 0.85,
      // Number of frames to analyze (more = more accurate, slower)
      frameCount: 5,
      // Optional: require an active challenge
      activeChallenge: {
        enabled: false, // Set true for high-security flows
        type: 'blink', // 'blink' | 'head-turn' | 'smile'
        timeoutMs: 5000, // Max time to complete challenge
      },
    },
  });

  return { checkLiveness };
}
```
A biometric system without liveness detection is worse than a password. An attacker who finds a photo of your user on social media can authenticate as them in seconds. Always keep livenessCheck: true in your H33Provider configuration. The latency cost is minimal (typically under 100ms client-side) and the security benefit is enormous.
7. Biometric Enrollment Flow
Enrollment is when a user registers their face for the first time. This typically happens during account creation or when an existing user adds biometric login to their account. The component ties together the camera capture, your backend API, and user feedback.
```tsx
import React, { useState } from 'react';
import { useNavigate } from 'react-router-dom';
import BiometricCapture from '../components/biometric/BiometricCapture';
import { useAuth } from '../context/AuthContext';
import { biometricApi } from '../services/api';

export default function EnrollPage() {
  const [step, setStep] = useState<'intro' | 'capture' | 'success' | 'error'>('intro');
  const [error, setError] = useState('');
  const { user } = useAuth();
  const navigate = useNavigate();

  const handleCapture = async (
    encryptedTemplate: Uint8Array,
    livenessScore: number
  ) => {
    try {
      // Send encrypted template to YOUR server (not directly to H33)
      const response = await biometricApi.enroll({
        encryptedTemplate,
        livenessScore,
        userId: user!.id,
      });
      if (response.success) {
        setStep('success');
        // Redirect to dashboard after brief success message
        setTimeout(() => navigate('/dashboard'), 2000);
      }
    } catch (err: any) {
      setError(err.response?.data?.message || 'Enrollment failed.');
      setStep('error');
    }
  };

  return (
    <div className="enroll-page">
      <h1>Set Up Face Login</h1>

      {step === 'intro' && (
        <div className="enroll-intro">
          <p>
            We will capture a short video of your face to set up biometric
            login. Your biometric data is encrypted on your device and never
            stored in a readable format.
          </p>
          <ul>
            <li>Find a well-lit area</li>
            <li>Remove sunglasses or hats</li>
            <li>Face the camera directly</li>
          </ul>
          <button onClick={() => setStep('capture')} className="btn-primary">
            Begin Enrollment
          </button>
        </div>
      )}

      {step === 'capture' && (
        <BiometricCapture
          mode="enroll"
          onCapture={handleCapture}
          onError={(err) => {
            setError(err.message);
            setStep('error');
          }}
        />
      )}

      {step === 'success' && (
        <div className="enroll-success">
          <h2>Enrollment Successful</h2>
          <p>Your face has been securely enrolled. You can now log in with your face.</p>
        </div>
      )}

      {step === 'error' && (
        <div className="enroll-error">
          <p>{error}</p>
          <button onClick={() => setStep('capture')} className="btn-secondary">
            Try Again
          </button>
        </div>
      )}
    </div>
  );
}
```
The API Service Layer
Create a thin service layer that handles all communication with your backend:
```ts
import axios from 'axios';

const api = axios.create({
  baseURL: import.meta.env.VITE_API_URL,
  withCredentials: true, // Send httpOnly cookies
  headers: { 'Content-Type': 'application/json' },
});

// Convert Uint8Array to base64 for transport. Built chunk-by-chunk:
// spreading a large array into String.fromCharCode can exceed the
// JS engine's argument limit for big encrypted templates.
function templateToBase64(template: Uint8Array): string {
  let binary = '';
  const chunkSize = 0x8000;
  for (let i = 0; i < template.length; i += chunkSize) {
    binary += String.fromCharCode(...template.subarray(i, i + chunkSize));
  }
  return btoa(binary);
}

export const biometricApi = {
  async enroll(data: {
    encryptedTemplate: Uint8Array;
    livenessScore: number;
    userId: string;
  }) {
    const res = await api.post('/auth/biometric/enroll', {
      template: templateToBase64(data.encryptedTemplate),
      livenessScore: data.livenessScore,
      userId: data.userId,
    });
    return res.data;
  },

  async verify(data: {
    encryptedTemplate: Uint8Array;
    livenessScore: number;
    userId: string;
  }) {
    const res = await api.post('/auth/biometric/verify', {
      template: templateToBase64(data.encryptedTemplate),
      livenessScore: data.livenessScore,
      userId: data.userId,
    });
    return res.data;
  },
};

export const authApi = {
  async getSession() {
    const res = await api.get('/auth/session');
    return res.data;
  },
  async logout() {
    await api.post('/auth/logout');
  },
  async refreshSession() {
    const res = await api.post('/auth/refresh');
    return res.data;
  },
};
```
8. Biometric Login Flow
The login flow mirrors enrollment but calls the verify endpoint instead. The key UX difference is that the user needs to identify themselves first (by entering a username or email), so the server knows which enrolled template to match against.
```tsx
import React, { useState } from 'react';
import { useNavigate } from 'react-router-dom';
import BiometricCapture from '../components/biometric/BiometricCapture';
import { useAuth } from '../context/AuthContext';
import { biometricApi } from '../services/api';

export default function LoginPage() {
  const [email, setEmail] = useState('');
  const [step, setStep] = useState<'email' | 'capture' | 'error'>('email');
  const [error, setError] = useState('');
  const { login } = useAuth();
  const navigate = useNavigate();

  const handleEmailSubmit = (e: React.FormEvent) => {
    e.preventDefault();
    if (!email) return;
    setStep('capture');
  };

  const handleCapture = async (
    encryptedTemplate: Uint8Array,
    livenessScore: number
  ) => {
    try {
      const result = await biometricApi.verify({
        encryptedTemplate,
        livenessScore,
        userId: email,
      });
      if (result.authenticated) {
        // The server sets an httpOnly cookie — update client auth state
        await login(result.user);
        navigate('/dashboard');
      } else {
        setError('Face did not match. Please try again.');
        setStep('error');
      }
    } catch (err: any) {
      const message =
        err.response?.status === 429
          ? 'Too many attempts. Please wait before trying again.'
          : err.response?.data?.message || 'Login failed.';
      setError(message);
      setStep('error');
    }
  };

  return (
    <div className="login-page">
      <h1>Sign In</h1>

      {step === 'email' && (
        <form onSubmit={handleEmailSubmit}>
          <label htmlFor="email">Email address</label>
          <input
            id="email"
            type="email"
            value={email}
            onChange={(e) => setEmail(e.target.value)}
            placeholder="you@example.com"
            required
            autoFocus
          />
          <button type="submit" className="btn-primary">
            Continue with Face Login
          </button>
        </form>
      )}

      {step === 'capture' && (
        <BiometricCapture
          mode="verify"
          onCapture={handleCapture}
          onError={(err) => {
            setError(err.message);
            setStep('error');
          }}
        />
      )}

      {step === 'error' && (
        <div className="login-error">
          <p>{error}</p>
          <button onClick={() => setStep('capture')} className="btn-secondary">
            Try Again
          </button>
        </div>
      )}
    </div>
  );
}
```
Why Email First?
Requiring the user to identify themselves before capturing their biometric avoids a 1:N search across all enrolled templates. Instead, H33 performs a 1:1 match against the specific user's enrolled template. This is both faster (constant time regardless of user count) and more privacy-preserving (the server does not learn which enrolled face is closest to the probe).
9. Auth Context and State Management
A React Context manages authentication state across the application. It handles session hydration on page load, login/logout state transitions, and automatic session refresh.
```tsx
import React, {
  createContext,
  useContext,
  useState,
  useEffect,
  useCallback,
} from 'react';
import { authApi } from '../services/api';

interface User {
  id: string;
  email: string;
  name: string;
  biometricEnrolled: boolean;
}

interface AuthState {
  user: User | null;
  loading: boolean;
  isAuthenticated: boolean;
  login: (user: User) => void;
  logout: () => Promise<void>;
}

const AuthContext = createContext<AuthState | undefined>(undefined);

export function AuthProvider({ children }: { children: React.ReactNode }) {
  const [user, setUser] = useState<User | null>(null);
  const [loading, setLoading] = useState(true);

  // Hydrate session on mount (reads httpOnly cookie)
  useEffect(() => {
    let cancelled = false;
    (async () => {
      try {
        const session = await authApi.getSession();
        if (!cancelled && session.user) setUser(session.user);
      } catch {
        // No valid session — user is not authenticated
      } finally {
        if (!cancelled) setLoading(false);
      }
    })();
    return () => {
      cancelled = true;
    };
  }, []);

  // Automatic session refresh every 10 minutes
  useEffect(() => {
    if (!user) return;
    const interval = setInterval(async () => {
      try {
        await authApi.refreshSession();
      } catch {
        setUser(null); // Session expired — force re-auth
      }
    }, 10 * 60 * 1000);
    return () => clearInterval(interval);
  }, [user]);

  const login = useCallback((userData: User) => {
    setUser(userData);
  }, []);

  const logout = useCallback(async () => {
    await authApi.logout();
    setUser(null);
  }, []);

  return (
    <AuthContext.Provider
      value={{ user, loading, isAuthenticated: !!user, login, logout }}
    >
      {children}
    </AuthContext.Provider>
  );
}

export function useAuth(): AuthState {
  const ctx = useContext(AuthContext);
  if (!ctx) throw new Error('useAuth must be used within AuthProvider');
  return ctx;
}
```
Key design decisions in this context:
- No tokens in React state — the auth token lives in an httpOnly cookie, set by the server. React never touches it, so an XSS attack cannot steal the session token.
- Session hydration on mount — when the app loads, it calls `/auth/session` to check whether the httpOnly cookie is still valid. If so, the user is restored without re-authentication.
- Automatic refresh — a background interval refreshes the session every 10 minutes. If the refresh fails (expired session), the user is logged out gracefully.
- Cancellation guard — the `cancelled` flag prevents state updates after unmount, avoiding the classic React memory-leak warning.
10. Protected Routes and Session Handling
With the auth context in place, protecting routes is straightforward. Any route that requires authentication redirects to the login page if the user is not authenticated.
```tsx
import { Navigate, Outlet, useLocation } from 'react-router-dom';
import { useAuth } from '../../context/AuthContext';

export default function ProtectedRoute() {
  const { isAuthenticated, loading } = useAuth();
  const location = useLocation();

  // Show nothing while checking session (prevents flash of login page)
  if (loading) return <div className="loading-spinner" />;

  if (!isAuthenticated) {
    // Preserve the intended destination for post-login redirect
    return <Navigate to="/login" state={{ from: location }} replace />;
  }

  return <Outlet />;
}
```
Wire up the router with protected and public routes:
```tsx
import { Routes, Route } from 'react-router-dom';
import ProtectedRoute from './components/auth/ProtectedRoute';
import LoginPage from './pages/LoginPage';
import EnrollPage from './pages/EnrollPage';
import DashboardPage from './pages/DashboardPage';
import ProfilePage from './pages/ProfilePage';

export default function App() {
  return (
    <Routes>
      {/* Public routes */}
      <Route path="/login" element={<LoginPage />} />

      {/* Protected routes */}
      <Route element={<ProtectedRoute />}>
        <Route path="/dashboard" element={<DashboardPage />} />
        <Route path="/profile" element={<ProfilePage />} />
        <Route path="/enroll" element={<EnrollPage />} />
      </Route>
    </Routes>
  );
}
```
11. Server-Side Proxy Layer
Your backend server sits between the React client and H33's API. It holds the API key, enforces business logic, and manages sessions. Here is a minimal Express server that handles enrollment and verification.
```ts
import express from 'express';
import axios from 'axios';

const router = express.Router();
const H33_API_URL = process.env.H33_API_URL!;
const H33_API_KEY = process.env.H33_API_KEY!;

// Helper: call H33 API with auth header
async function h33Request(path: string, data: any) {
  const res = await axios.post(`${H33_API_URL}${path}`, data, {
    headers: {
      'Authorization': `Bearer ${H33_API_KEY}`,
      'Content-Type': 'application/json',
    },
  });
  return res.data;
}

// POST /api/auth/biometric/enroll
router.post('/enroll', async (req, res) => {
  try {
    const { template, livenessScore, userId } = req.body;

    // Validate liveness score server-side
    if (livenessScore < 0.85) {
      return res.status(400).json({
        error: 'Liveness score too low. Please try again.',
      });
    }

    // Call H33 enrollment endpoint
    const result = await h33Request('/biometric/enroll', {
      encrypted_template: template,
      user_id: userId,
      liveness_score: livenessScore,
    });

    // Store enrollment ID in your database
    // await db.users.update(userId, {
    //   h33EnrollmentId: result.enrollment_id,
    //   biometricEnrolled: true,
    // });

    res.json({ success: true, enrollmentId: result.enrollment_id });
  } catch (err: any) {
    console.error('Enrollment error:', err.response?.data || err.message);
    res.status(500).json({ error: 'Enrollment failed' });
  }
});

// POST /api/auth/biometric/verify
router.post('/verify', async (req, res) => {
  try {
    const { template, livenessScore, userId } = req.body;

    if (livenessScore < 0.85) {
      return res.status(400).json({ error: 'Liveness score too low.' });
    }

    // Look up the user's enrollment ID from your database
    // const user = await db.users.findByEmail(userId);
    // if (!user?.h33EnrollmentId) { ... }

    // Call H33 verification endpoint
    const result = await h33Request('/biometric/verify', {
      encrypted_template: template,
      enrollment_id: 'user_enrollment_id_from_db', // Replace with the ID from your DB lookup
      liveness_score: livenessScore,
    });

    if (result.match && result.confidence >= 0.95) {
      // Set httpOnly session cookie
      res.cookie('session', result.session_token, {
        httpOnly: true,
        secure: process.env.NODE_ENV === 'production',
        sameSite: 'strict',
        maxAge: 15 * 60 * 1000, // 15 minutes
      });
      res.json({
        authenticated: true,
        user: { id: userId, email: userId, name: 'User', biometricEnrolled: true },
        proof: result.zk_proof, // ZK proof of match (for audit logging)
      });
    } else {
      res.status(401).json({ authenticated: false, error: 'Face did not match.' });
    }
  } catch (err: any) {
    console.error('Verify error:', err.response?.data || err.message);
    res.status(500).json({ error: 'Verification failed' });
  }
});

export default router;
```
Notice that we validate the liveness score on the server, not just the client. A determined attacker can bypass client-side checks by sending a crafted API request directly to your server. Always enforce the liveness threshold server-side as the authoritative check.
12. WebAuthn Integration
For maximum security, combine H33 biometric auth with WebAuthn (FIDO2). WebAuthn provides device-bound authentication using hardware security keys or platform authenticators (Touch ID, Windows Hello). Layering WebAuthn on top of biometric login gives you two independent factors: something you are (your face) and something you have (your device's authenticator).
```ts
import { useState, useCallback } from 'react';
import axios from 'axios';

const API_URL = import.meta.env.VITE_API_URL;

export function useWebAuthn() {
  const [supported] = useState(() => !!window.PublicKeyCredential);

  // Register a new WebAuthn credential
  const register = useCallback(async (userId: string) => {
    // 1. Get challenge from server
    const { data: options } = await axios.post(
      `${API_URL}/auth/webauthn/register-options`,
      { userId },
      { withCredentials: true }
    );

    // 2. Decode server response for the browser API
    options.challenge = base64ToBuffer(options.challenge);
    options.user.id = base64ToBuffer(options.user.id);

    // 3. Create credential via browser
    const credential = (await navigator.credentials.create({
      publicKey: options,
    })) as PublicKeyCredential;

    // 4. Send credential to server for storage
    const attestation = credential.response as AuthenticatorAttestationResponse;
    await axios.post(
      `${API_URL}/auth/webauthn/register-verify`,
      {
        id: credential.id,
        rawId: bufferToBase64(credential.rawId),
        response: {
          attestationObject: bufferToBase64(attestation.attestationObject),
          clientDataJSON: bufferToBase64(attestation.clientDataJSON),
        },
        type: credential.type,
      },
      { withCredentials: true }
    );
  }, []);

  // Authenticate with an existing credential
  const authenticate = useCallback(async (userId: string) => {
    const { data: options } = await axios.post(
      `${API_URL}/auth/webauthn/login-options`,
      { userId },
      { withCredentials: true }
    );

    options.challenge = base64ToBuffer(options.challenge);
    options.allowCredentials = options.allowCredentials.map((c: any) => ({
      ...c,
      id: base64ToBuffer(c.id),
    }));

    const assertion = (await navigator.credentials.get({
      publicKey: options,
    })) as PublicKeyCredential;

    const assertionResponse = assertion.response as AuthenticatorAssertionResponse;
    const { data } = await axios.post(
      `${API_URL}/auth/webauthn/login-verify`,
      {
        id: assertion.id,
        rawId: bufferToBase64(assertion.rawId),
        response: {
          authenticatorData: bufferToBase64(assertionResponse.authenticatorData),
          clientDataJSON: bufferToBase64(assertionResponse.clientDataJSON),
          signature: bufferToBase64(assertionResponse.signature),
        },
        type: assertion.type,
      },
      { withCredentials: true }
    );
    return data;
  }, []);

  return { supported, register, authenticate };
}

// Utility: ArrayBuffer <-> Base64
function base64ToBuffer(b64: string): ArrayBuffer {
  const bin = atob(b64);
  const buf = new Uint8Array(bin.length);
  for (let i = 0; i < bin.length; i++) buf[i] = bin.charCodeAt(i);
  return buf.buffer;
}

function bufferToBase64(buf: ArrayBuffer): string {
  return btoa(String.fromCharCode(...new Uint8Array(buf)));
}
```
You can then offer WebAuthn as a secondary factor on the login page. After biometric verification succeeds, prompt the user for a WebAuthn assertion to complete the two-factor login.
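The sequencing logic can live in a tiny state machine on the login page. This sketch is our own suggested ordering (email, then face, then WebAuthn), not an H33 API:

```typescript
// Illustrative two-factor login sequencing: biometric first, then WebAuthn.
type LoginStep = 'email' | 'biometric' | 'webauthn' | 'done';

interface LoginState {
  emailEntered: boolean;
  biometricVerified: boolean;
  webauthnVerified: boolean;
}

// Derive the next UI step from what has been completed so far.
function nextLoginStep(s: LoginState): LoginStep {
  if (!s.emailEntered) return 'email';
  if (!s.biometricVerified) return 'biometric';
  if (!s.webauthnVerified) return 'webauthn';
  return 'done';
}
```

Driving the UI from derived state like this means a failed WebAuthn assertion simply leaves the user at the `webauthn` step without discarding the already-completed biometric verification.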
13. Error Handling and Edge Cases
Biometric auth introduces failure modes that password-based systems never encounter. Your application needs to handle all of them gracefully.
| Error | Cause | User-Facing Message | Recovery |
|---|---|---|---|
| `NotAllowedError` | Camera permission denied | "Please allow camera access in your browser settings." | Show instructions for the user's browser |
| `NotFoundError` | No camera available | "No camera detected. Please connect a camera." | Offer fallback auth (password, WebAuthn) |
| `OverconstrainedError` | Requested resolution unavailable | None (retry silently with lower constraints) | Retry with relaxed `getUserMedia` constraints |
| Liveness failure | Photo attack or poor lighting | "Liveness check failed. Try better lighting." | Allow 3 retries, then offer fallback |
| Match failure | Face does not match enrolled template | "Face did not match. Please try again." | Allow 3 attempts, then lock for 15 minutes |
| Network error | Connectivity issue | "Connection error. Please check your internet." | Retry with exponential backoff |
| 429 Too Many Requests | Rate limit exceeded | "Too many attempts. Please wait." | Show countdown timer |
Here is a custom hook that encapsulates retry logic with exponential backoff:
```ts
import { useState, useCallback, useRef } from 'react';

interface RetryConfig {
  maxAttempts: number;
  baseDelayMs: number;
  maxDelayMs: number;
  onMaxAttemptsReached: () => void;
}

export function useRetry(config: RetryConfig) {
  const [attempts, setAttempts] = useState(0);
  const [isLocked, setIsLocked] = useState(false);
  // ReturnType<typeof setTimeout> works with both browser and Node typings
  const timeoutRef = useRef<ReturnType<typeof setTimeout>>();

  // Exponential backoff: 1x, 2x, 4x... the base delay, capped at maxDelayMs
  const nextDelayMs = Math.min(
    config.baseDelayMs * 2 ** attempts,
    config.maxDelayMs
  );

  const recordFailure = useCallback(() => {
    const next = attempts + 1;
    setAttempts(next);
    if (next >= config.maxAttempts) {
      setIsLocked(true);
      config.onMaxAttemptsReached();
      // Auto-unlock after cooldown
      timeoutRef.current = setTimeout(() => {
        setIsLocked(false);
        setAttempts(0);
      }, config.maxDelayMs);
    }
  }, [attempts, config]);

  const reset = useCallback(() => {
    setAttempts(0);
    setIsLocked(false);
    if (timeoutRef.current) clearTimeout(timeoutRef.current);
  }, []);

  return {
    attempts,
    isLocked,
    recordFailure,
    reset,
    nextDelayMs,
    remaining: config.maxAttempts - attempts,
  };
}
```
14. Security Hardening
Building biometric login is only half the battle. Securing the entire authentication pipeline is what separates a demo from a production system. Here are the critical hardening measures.
Token Storage
localStorage is accessible to any JavaScript running on your page. A single XSS vulnerability — in your code, a dependency, or a third-party script — can exfiltrate every token. Use httpOnly cookies exclusively. They are invisible to JavaScript and automatically included in requests by the browser.
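As a sketch, the session-cookie attributes might be grouped in one helper. The option names follow Express's `res.cookie` API; the values are illustrative defaults, not H33 requirements.

```typescript
// Options for the session-token cookie. With Express you would pass these
// as res.cookie('session', token, sessionCookieOptions(isProd)).
export function sessionCookieOptions(isProduction: boolean) {
  return {
    httpOnly: true,              // invisible to JavaScript, so XSS cannot read it
    secure: isProduction,        // only sent over HTTPS in production
    sameSite: 'strict' as const, // not sent on cross-site requests
    maxAge: 15 * 60 * 1000,      // 15-minute lifetime, paired with refresh
    path: '/',
  };
}
```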
CSRF Protection
With httpOnly cookies, you must protect against Cross-Site Request Forgery. The SameSite=Strict cookie attribute provides strong CSRF protection in modern browsers, but add a CSRF token for defense in depth:
```typescript
import crypto from 'crypto';
import { Request, Response, NextFunction } from 'express';

export function csrfProtection() {
  return (req: Request, res: Response, next: NextFunction) => {
    if (req.method === 'GET') {
      // Generate CSRF token and set as a non-httpOnly cookie
      // (JS needs to read it to include in request headers)
      const token = crypto.randomBytes(32).toString('hex');
      res.cookie('csrf-token', token, {
        httpOnly: false, // JS needs to read this
        secure: process.env.NODE_ENV === 'production',
        sameSite: 'strict',
      });
      return next();
    }

    // For mutations, verify the CSRF token header matches the cookie
    const cookieToken = req.cookies['csrf-token'];
    const headerToken = req.headers['x-csrf-token'];
    if (!cookieToken || !headerToken || cookieToken !== headerToken) {
      return res.status(403).json({ error: 'Invalid CSRF token' });
    }
    next();
  };
}
```
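On the client, the non-httpOnly `csrf-token` cookie has to be read and echoed back in the `x-csrf-token` header on every mutation. A minimal sketch, assuming same-origin requests sent with credentials (the cookie parser is generic; `csrfFetch` is an illustrative wrapper name):

```typescript
// Parses one value out of a document.cookie-style string.
export function readCookie(name: string, cookieString: string): string | null {
  for (const part of cookieString.split(';')) {
    const [key, ...rest] = part.trim().split('=');
    if (key === name) return decodeURIComponent(rest.join('='));
  }
  return null;
}

// Fetch wrapper that attaches the CSRF token header on mutations.
export async function csrfFetch(
  url: string,
  init: { method?: string; body?: string; headers?: Record<string, string> } = {}
) {
  const token = readCookie('csrf-token', document.cookie);
  return fetch(url, {
    ...init,
    credentials: 'include', // send the httpOnly session cookie too
    headers: {
      ...(init.headers ?? {}),
      ...(token ? { 'x-csrf-token': token } : {}),
    },
  });
}
```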
Security Checklist
Transport Security
- Enforce HTTPS everywhere (HSTS header)
- Set the Secure flag on all cookies
- Use TLS 1.3 minimum
- Pin your H33 API certificate
Session Security
- httpOnly + Secure + SameSite=Strict cookies
- 15-minute session lifetime with refresh
- Invalidate server-side on logout
- Rotate session ID after authentication
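The 15-minute lifetime plus refresh from the checklist reduces to two pure time checks that the server (or a client refresh hook) can share. A sketch with illustrative names; the half-life refresh threshold is a common choice, not an H33 rule:

```typescript
const SESSION_TTL_MS = 15 * 60 * 1000; // 15-minute session lifetime

// True once a session issued at `issuedAtMs` has expired by `nowMs`.
export function isSessionExpired(issuedAtMs: number, nowMs: number, ttlMs = SESSION_TTL_MS): boolean {
  return nowMs - issuedAtMs >= ttlMs;
}

// Refresh proactively once the session is past half its lifetime,
// so the user never hits a hard expiry mid-action.
export function shouldRefresh(issuedAtMs: number, nowMs: number, ttlMs = SESSION_TTL_MS): boolean {
  const age = nowMs - issuedAtMs;
  return age >= ttlMs / 2 && age < ttlMs;
}
```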
Rate Limiting
- Max 5 biometric attempts per 15 minutes
- Progressive delays: 1s, 2s, 4s, 8s, lockout
- IP-based + user-based rate limiting
- Alert on repeated failures (account takeover signal)
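The progressive delays above (1s, 2s, 4s, 8s, then lockout) come out of simple exponential doubling. A sketch:

```typescript
// Returns the delay in ms before the next allowed attempt, or null once
// the attempt budget (5 per window) is exhausted and the account should
// be locked for the remainder of the window.
export function attemptDelayMs(failedAttempts: number, maxAttempts = 5): number | null {
  if (failedAttempts >= maxAttempts) return null; // lockout
  if (failedAttempts === 0) return 0;             // first attempt is immediate
  return 1000 * 2 ** (failedAttempts - 1);        // 1s, 2s, 4s, 8s
}
```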
Content Security
- Strict CSP header (no unsafe-inline)
- X-Frame-Options: DENY
- X-Content-Type-Options: nosniff
- Helmet middleware for Express
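A sketch of a strict CSP value matching the checklist. The directive set is illustrative: tune the source lists to your actual asset hosts, and pass `apiOrigin` as whatever origin your proxy or H33 endpoint lives on.

```typescript
// Builds a strict Content-Security-Policy header value with no
// 'unsafe-inline' anywhere. Directives are a starting point, not exhaustive.
export function buildCsp(apiOrigin: string): string {
  const directives: Record<string, string[]> = {
    'default-src': ["'self'"],
    'script-src': ["'self'"],             // no 'unsafe-inline'
    'style-src': ["'self'"],
    'img-src': ["'self'", 'blob:'],       // camera frames rendered via blob URLs
    'connect-src': ["'self'", apiOrigin], // API calls to your proxy
    'frame-ancestors': ["'none'"],        // complements X-Frame-Options: DENY
  };
  return Object.entries(directives)
    .map(([name, sources]) => `${name} ${sources.join(' ')}`)
    .join('; ');
}
```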
15. Testing Your Biometric Auth
Testing biometric authentication requires a different approach than testing password-based flows. You cannot simply submit a form with credentials. Here is a testing strategy that covers unit tests, integration tests, and end-to-end tests.
Unit Testing the Auth Context
```typescript
import { renderHook, act, waitFor } from '@testing-library/react';
import { AuthProvider, useAuth } from '../AuthContext';
import { authApi } from '../../services/api';

// Mock the API layer
jest.mock('../../services/api');

describe('AuthContext', () => {
  it('hydrates session on mount', async () => {
    const mockUser = { id: '1', email: 'test@example.com', name: 'Test', biometricEnrolled: true };
    (authApi.getSession as jest.Mock).mockResolvedValue({ user: mockUser });

    const { result } = renderHook(() => useAuth(), {
      wrapper: AuthProvider,
    });

    // Initially loading
    expect(result.current.loading).toBe(true);

    await waitFor(() => {
      expect(result.current.loading).toBe(false);
      expect(result.current.user).toEqual(mockUser);
      expect(result.current.isAuthenticated).toBe(true);
    });
  });

  it('clears user on logout', async () => {
    (authApi.getSession as jest.Mock).mockResolvedValue({
      user: { id: '1', email: 'test@example.com', name: 'Test', biometricEnrolled: true },
    });
    (authApi.logout as jest.Mock).mockResolvedValue(undefined);

    const { result } = renderHook(() => useAuth(), { wrapper: AuthProvider });
    await waitFor(() => expect(result.current.isAuthenticated).toBe(true));

    await act(async () => {
      await result.current.logout();
    });

    expect(result.current.user).toBeNull();
    expect(result.current.isAuthenticated).toBe(false);
  });
});
```
Mocking Biometric Capture for Integration Tests
Since automated tests cannot access a real camera, mock the H33 SDK hooks:
```typescript
// Mock H33 React SDK for testing
import React from 'react';

export function useBiometricCapture() {
  return {
    captureFrame: jest.fn().mockResolvedValue(new Uint8Array(128)),
    checkLiveness: jest.fn().mockResolvedValue({ isLive: true, score: 0.95 }),
    encryptTemplate: jest.fn().mockResolvedValue(new Uint8Array(256)),
    faceDetected: true,
    facePosition: { x: 0.5, y: 0.5, width: 0.3, height: 0.4 },
  };
}

export function H33Provider({ children }: { children: React.ReactNode }) {
  return children;
}
```
End-to-End Testing with Cypress
For E2E tests, use a mock camera feed. Cypress and Playwright both support mocking getUserMedia to return a synthetic video stream. Test the full flow: enter email, capture biometric, verify, and confirm the user reaches the dashboard.
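One way to wire this up in Cypress, as a sketch: build a `MediaStream` from a canvas inside the test's `onBeforeLoad` hook and swap it in for `getUserMedia`. Cypress's `onBeforeLoad` and `canvas.captureStream` are real browser APIs; the `fakeGetUserMedia` helper is hypothetical.

```typescript
// Wraps a pre-built stream (e.g. from canvas.captureStream()) as a
// drop-in replacement for navigator.mediaDevices.getUserMedia.
export function fakeGetUserMedia<T>(stream: T): () => Promise<T> {
  return async () => stream;
}

// Usage in a Cypress spec (sketch):
//
// cy.visit('/login', {
//   onBeforeLoad(win) {
//     const canvas = win.document.createElement('canvas');
//     const stream = canvas.captureStream(30); // 30fps synthetic feed
//     win.navigator.mediaDevices.getUserMedia = fakeGetUserMedia(stream);
//   },
// });
```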
16. Production Deployment Checklist
Before shipping your biometric login to production, verify every item on this checklist.
Pre-Launch Verification
Performance Expectations
With H33 handling the biometric matching, your performance bottleneck will be network latency, not computation. Here is what to expect:
| Operation | Where It Runs | Typical Latency |
|---|---|---|
| Camera initialization | Browser | 200-500ms |
| Face detection (per frame) | Browser (SDK) | ~15ms |
| Liveness check (5 frames) | Browser (SDK) | ~80ms |
| FHE template encryption | Browser (SDK WASM) | ~50ms |
| Network round-trip | Client → Server → H33 | 20-100ms |
| H33 biometric match | H33 server | ~50µs |
| ZK proof generation | H33 server | ~0.07µs |
| Post-quantum token signing | H33 server | ~240µs |
| Total user-perceived | End to end | 400-800ms |
The ~50 microseconds of server-side compute is negligible. The user-perceived latency is dominated by camera initialization and network round-trips. In production, the entire experience — from button click to authenticated dashboard — takes well under a second.
Next Steps
You now have a complete, production-grade biometric login system in React. Here are some directions to explore next:
- Enrollment best practices — Multi-angle capture, re-enrollment flows, and template quality scoring.
- Template protection deep dive — How H33's FHE-based encrypted biometric matching works under the hood.
- Advanced liveness detection — Active challenges, 3D depth sensing, and anti-spoofing techniques.
- Mobile SDK integration — React Native components for iOS and Android biometric capture.
- Post-quantum cryptography — Why the tokens H33 signs are resistant to quantum attacks.
- Testing authentication flows — Comprehensive testing strategies for auth systems.
Biometric authentication is the future of identity. Passwords will join floppy disks and fax machines in the museum of technologies we tolerated for too long. With H33 and React, you can build that future today — securely, performantly, and with code you actually understand.
Ready to Go Quantum-Secure?
Start protecting your users with post-quantum biometric authentication today. Free tier includes 10,000 API calls per month, no credit card required.
Get Free API Key →