Table of Contents
- Why Authentication Matters
- Setting Up Your Node.js Project
- Password Hashing with bcrypt and Argon2
- Session-Based vs Token-Based Auth
- Implementing JWT Authentication
- OAuth 2.0 / OpenID Connect Integration
- Adding Biometric Auth with the H33 API
- Post-Quantum Security Considerations
- Rate Limiting and Brute-Force Protection
- Best Practices Checklist
- Migrating to Post-Quantum Auth with H33
Authentication is the single most critical piece of any web application. Get it wrong, and nothing else matters — your database, your business logic, your user trust, all compromised. Yet the majority of Node.js tutorials stop at bcrypt + JWT and call it a day. That was adequate in 2020. It is not adequate in 2026, when quantum computers are advancing from laboratory curiosities toward practical cryptanalytic threats.
This tutorial takes a different approach. We start with the fundamentals — password hashing, session management, JWTs, OAuth — because you need to understand classical authentication before you can appreciate why it needs to evolve. Then we layer on biometric authentication and post-quantum cryptography through H33's API. By the end, you will have a production-grade authentication system that is secure against both classical and quantum adversaries.
Every code example in this tutorial is complete and runnable. No pseudo-code, no hand-waving. Copy, paste, and build.
1. Why Authentication Matters
Authentication vulnerabilities remain the most exploited attack vector on the internet. The OWASP Top 10 (2021) lists Broken Access Control as the number one risk, and Identification and Authentication Failures at number seven. But these rankings understate the problem because authentication failures are a precondition for nearly every other category — injection, SSRF, and security misconfiguration attacks all become trivial once an attacker has authenticated as a privileged user.
The Threat Landscape in 2026
Consider what attackers are doing right now:
- Credential stuffing — Automated tools test billions of leaked username/password pairs against your login endpoint. Over 15 billion credentials are publicly available from historical breaches. If your users reuse passwords (and most do), you are exposed.
- Phishing and social engineering — Even strong passwords fail when users hand them to convincing fake login pages. SMS-based 2FA codes are interceptable via SIM swapping.
- Session hijacking — Stolen session tokens allow attackers to impersonate users without ever knowing their password. XSS vulnerabilities are the primary vector.
- Brute-force attacks — Without rate limiting, attackers can attempt millions of password guesses per hour against your API.
- Harvest-now, decrypt-later — Nation-state actors are recording encrypted traffic today, planning to decrypt it once quantum computers are powerful enough to break RSA and ECC. Your authentication tokens and TLS sessions are targets.
NIST estimates that cryptographically relevant quantum computers could arrive by the early 2030s. Every JWT you sign today with RS256 (RSA) or ES256 (ECDSA) will be forgeable by a sufficiently powerful quantum computer running Shor's algorithm. If your tokens have long lifetimes or your signed data has long-term value, you are already vulnerable to harvest-now-decrypt-later attacks.
The good news: building authentication correctly in Node.js is not difficult. It just requires discipline, the right libraries, and a willingness to move beyond passwords-only. Let us start building.
2. Setting Up Your Node.js Project
We will build a complete Express application with user registration, login, JWT-based sessions, and eventually H33 biometric authentication. Start with a clean project.
```bash
# Create and enter project directory
mkdir h33-auth-tutorial && cd h33-auth-tutorial

# Initialize Node.js project
npm init -y

# Install core dependencies
npm install express dotenv helmet cors cookie-parser

# Install authentication dependencies
npm install bcryptjs jsonwebtoken argon2 express-rate-limit

# Install H33 SDK (post-quantum auth)
npm install @h33/sdk

# Install development dependencies
npm install -D nodemon
```
Here is what each package does:
- `express` — Web framework. The foundation of our API.
- `dotenv` — Loads environment variables from `.env` files. Keeps secrets out of source code.
- `helmet` — Sets security-focused HTTP headers (CSP, HSTS, X-Frame-Options, etc.).
- `cors` — Configures Cross-Origin Resource Sharing. Required if your frontend and API are on different domains.
- `cookie-parser` — Parses cookies from incoming requests. Essential for httpOnly token storage.
- `bcryptjs` — Pure JavaScript bcrypt implementation. No native compilation required.
- `argon2` — The winner of the Password Hashing Competition. Stronger than bcrypt for new deployments.
- `jsonwebtoken` — Signs and verifies JWTs.
- `express-rate-limit` — IP-based rate limiting middleware.
- `@h33/sdk` — H33's Node.js SDK for post-quantum authentication and biometric verification.
Create the project structure:
```bash
mkdir -p src/{routes,middleware,models,services,config}
touch src/app.js src/server.js .env .env.example
```
Set up the environment file:
```bash
# Server
PORT=3000
NODE_ENV=development

# JWT Configuration
JWT_SECRET=your-256-bit-secret-replace-in-production
JWT_EXPIRY=15m
REFRESH_TOKEN_EXPIRY=7d

# H33 API (get yours at h33.ai/get-api-key)
H33_API_KEY=your_h33_api_key_here
H33_API_URL=https://api.h33.ai/v1

# Database (use your preferred DB)
DATABASE_URL=postgresql://user:pass@localhost:5432/authdb
```
Now create the Express application:
```javascript
const express = require('express');
const helmet = require('helmet');
const cors = require('cors');
const cookieParser = require('cookie-parser');
require('dotenv').config();

const authRoutes = require('./routes/auth');
const protectedRoutes = require('./routes/protected');

const app = express();

// Security middleware
app.use(helmet());
app.use(cors({
  origin: process.env.CORS_ORIGIN || 'http://localhost:3000',
  credentials: true // Required for httpOnly cookies
}));
app.use(cookieParser());
app.use(express.json({ limit: '10kb' })); // Limit body size

// Routes
app.use('/api/auth', authRoutes);
app.use('/api/protected', protectedRoutes);

// Global error handler
app.use((err, req, res, next) => {
  console.error('Unhandled error:', err);
  res.status(500).json({
    error: 'Internal server error',
    // Never leak stack traces in production
    ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
  });
});

module.exports = app;
```
```javascript
const app = require('./app');

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.log(`Auth server running on port ${PORT}`);
  console.log(`Environment: ${process.env.NODE_ENV}`);
});
```
Add the start script to your package.json:
```json
{
  "scripts": {
    "start": "node src/server.js",
    "dev": "nodemon src/server.js"
  }
}
```
3. Password Hashing with bcrypt and Argon2
Storing passwords in plaintext is an unforgivable sin. Storing them with MD5 or SHA-256 is only marginally better — general-purpose hash functions are designed to be fast, which makes them easy to brute-force. You need a password hashing function: a deliberately slow, memory-hard function designed specifically to make offline cracking expensive.
bcrypt: The Established Standard
bcrypt has been the go-to password hash since 1999. It uses a cost factor (number of rounds) that can be increased over time as hardware gets faster. A cost factor of 12 is the current minimum recommendation, producing roughly 250ms of computation per hash on modern hardware.
```javascript
const bcrypt = require('bcryptjs');

const BCRYPT_ROUNDS = 12; // ~250ms on modern hardware

/**
 * Hash a password using bcrypt.
 * Salt is generated automatically and embedded in the output.
 */
async function hashPassword(plaintext) {
  return bcrypt.hash(plaintext, BCRYPT_ROUNDS);
}

/**
 * Verify a password against a bcrypt hash.
 * Uses constant-time comparison internally.
 */
async function verifyPassword(plaintext, hash) {
  return bcrypt.compare(plaintext, hash);
}

module.exports = { hashPassword, verifyPassword };
```
Argon2: The Modern Choice
Argon2 won the Password Hashing Competition in 2015 and is recommended by OWASP for new deployments. Unlike bcrypt, Argon2 is memory-hard — it requires a configurable amount of RAM, which makes GPU and ASIC-based cracking dramatically more expensive. Argon2id is the recommended variant, combining resistance to both side-channel attacks and GPU parallelism.
```javascript
const argon2 = require('argon2');

/**
 * Hash a password using Argon2id.
 * OWASP recommended: memoryCost=19456 (19 MiB), timeCost=2, parallelism=1
 */
async function hashPassword(plaintext) {
  return argon2.hash(plaintext, {
    type: argon2.argon2id,
    memoryCost: 19456, // 19 MiB
    timeCost: 2,       // 2 iterations
    parallelism: 1,    // single-threaded
    hashLength: 32     // 256-bit output
  });
}

/**
 * Verify a password against an Argon2id hash.
 * Parameters are encoded in the hash string — no config needed.
 */
async function verifyPassword(plaintext, hash) {
  return argon2.verify(hash, plaintext);
}

/**
 * Check if a hash needs rehashing (e.g., after upgrading parameters).
 */
function needsRehash(hash) {
  return argon2.needsRehash(hash, {
    type: argon2.argon2id,
    memoryCost: 19456,
    timeCost: 2
  });
}

module.exports = { hashPassword, verifyPassword, needsRehash };
```
For new projects in 2026, use Argon2id. It is memory-hard (resistant to GPU/ASIC cracking), OWASP-recommended, and the PHC winner. Use bcrypt only if you are maintaining an existing codebase that already uses it, and plan a gradual migration to Argon2id by rehashing on login.
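The rehash-on-login migration can be sketched as follows. This is a sketch, not a drop-in module: it assumes the standard hash prefixes (`$2a$`/`$2b$`/`$2y$` for bcrypt, `$argon2id$` for Argon2id) and a `saveUser` persistence hook supplied by you.

```javascript
// src/services/password-migrate.js (sketch)
// Verify against whichever scheme produced the stored hash, then
// transparently upgrade bcrypt hashes to Argon2id on successful login.

function isBcryptHash(hash) {
  // bcrypt hashes start with $2a$, $2b$, or $2y$
  return /^\$2[aby]\$/.test(hash);
}

async function verifyAndUpgrade(user, plaintext, saveUser) {
  if (isBcryptHash(user.passwordHash)) {
    const bcrypt = require('bcryptjs');
    const ok = await bcrypt.compare(plaintext, user.passwordHash);
    if (!ok) return false;
    // Correct password, and we hold the plaintext in memory: rehash with
    // Argon2id and persist, retiring the bcrypt hash.
    const argon2 = require('argon2');
    user.passwordHash = await argon2.hash(plaintext, { type: argon2.argon2id });
    await saveUser(user);
    return true;
  }
  const argon2 = require('argon2');
  return argon2.verify(user.passwordHash, plaintext);
}

module.exports = { isBcryptHash, verifyAndUpgrade };
```

Because the rehash only happens on a successful login, the migration completes gradually as users return; accounts that never log in keep their bcrypt hashes and can be expired separately.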
Password Validation Rules
Hashing is useless if users choose `password123`. Enforce sensible minimums in line with NIST SP 800-63B: length bounds plus a breached-password check, not arbitrary composition rules:
```javascript
/**
 * Validate password strength before hashing.
 * NIST SP 800-63B: minimum 8 characters, check against breached list.
 * No arbitrary complexity rules (uppercase, special chars) — they reduce entropy.
 */
function validatePassword(password) {
  const errors = [];

  if (typeof password !== 'string') {
    errors.push('Password must be a string');
    return { valid: false, errors };
  }

  if (password.length < 8) {
    errors.push('Password must be at least 8 characters');
  }

  if (password.length > 128) {
    errors.push('Password must be 128 characters or fewer');
  }

  // Block common passwords (expand this list in production)
  const BLOCKED = [
    'password', '12345678', 'qwerty123',
    'letmein', 'admin123', 'welcome1'
  ];
  if (BLOCKED.includes(password.toLowerCase())) {
    errors.push('This password is too common');
  }

  return { valid: errors.length === 0, errors };
}

module.exports = { validatePassword };
```
4. Session-Based vs Token-Based Auth
Before writing authentication routes, you need to decide how you will track authenticated users across requests. There are two dominant patterns, each with distinct tradeoffs.
| Property | Session-Based | Token-Based (JWT) |
|---|---|---|
| State | Server-side (session store) | Client-side (token contains claims) |
| Storage | Redis, database, or memory | Cookie or header |
| Scalability | Requires shared session store | Stateless — any server can verify |
| Revocation | Instant (delete session) | Difficult (requires blocklist) |
| Payload | Session ID only (opaque) | Encoded claims (user ID, roles, expiry) |
| XSS risk | Low (httpOnly cookie) | High if stored in localStorage |
| CSRF risk | High (cookie auto-sent) | Low if sent via Authorization header |
| Best for | Traditional web apps, SSR | APIs, SPAs, microservices |
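To make the session-based column concrete, here is a minimal sketch using `express-session` (assumed installed separately with `npm install express-session`; not used in the rest of this tutorial, which takes the JWT route):

```javascript
// Sketch: session-based auth setup. The browser holds only an opaque
// session ID in an httpOnly cookie; all claims live server-side.
const SESSION_OPTIONS = {
  secret: process.env.SESSION_SECRET || 'replace-in-production',
  resave: false,            // do not rewrite unchanged sessions
  saveUninitialized: false, // no session is created until login
  cookie: {
    httpOnly: true,
    secure: process.env.NODE_ENV === 'production',
    sameSite: 'strict',
    maxAge: 24 * 60 * 60 * 1000 // 1 day
  }
};

function mountSessions(app) {
  const session = require('express-session');
  app.use(session(SESSION_OPTIONS));
}

module.exports = { SESSION_OPTIONS, mountSessions };
```

Note the default in-memory store is for development only; in production you would pass a shared store (e.g. Redis), which is exactly the scalability cost the table describes.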
Our Recommendation: JWT in httpOnly Cookies
For modern applications, we recommend JWTs stored in httpOnly, Secure, SameSite=Strict cookies. This gives you the scalability benefits of stateless tokens with the XSS protection of httpOnly cookies. Never store JWTs in localStorage — any XSS vulnerability would allow an attacker to exfiltrate the token.
localStorage is accessible to any JavaScript running on the page. A single XSS vulnerability — even in a third-party script — can steal the token. httpOnly cookies are invisible to JavaScript entirely. The browser sends them automatically with each request, and they cannot be read by client-side code. For more on secure token patterns, see our passwordless authentication guide.
5. Implementing JWT Authentication
Now let us build the actual authentication routes. We will implement registration, login, token refresh, and logout. The implementation uses short-lived access tokens (15 minutes) paired with longer-lived refresh tokens (7 days).
JWT Utility Functions
```javascript
const jwt = require('jsonwebtoken');

const JWT_SECRET = process.env.JWT_SECRET;
const JWT_EXPIRY = process.env.JWT_EXPIRY || '15m';
const REFRESH_EXPIRY = process.env.REFRESH_TOKEN_EXPIRY || '7d';

if (!JWT_SECRET) {
  throw new Error('JWT_SECRET must be defined in environment');
}

/**
 * Generate a short-lived access token.
 */
function generateAccessToken(user) {
  return jwt.sign(
    {
      sub: user.id,
      email: user.email,
      role: user.role || 'user',
      type: 'access'
    },
    JWT_SECRET,
    { expiresIn: JWT_EXPIRY, algorithm: 'HS256' }
  );
}

/**
 * Generate a long-lived refresh token.
 * Contains minimal claims — just enough to issue a new access token.
 */
function generateRefreshToken(user) {
  return jwt.sign(
    {
      sub: user.id,
      type: 'refresh',
      version: user.tokenVersion || 0 // For forced invalidation
    },
    JWT_SECRET,
    { expiresIn: REFRESH_EXPIRY, algorithm: 'HS256' }
  );
}

/**
 * Verify and decode a token. Throws on invalid/expired.
 */
function verifyToken(token) {
  return jwt.verify(token, JWT_SECRET);
}

module.exports = { generateAccessToken, generateRefreshToken, verifyToken };
```
Authentication Middleware
```javascript
const { verifyToken } = require('../services/jwt');

/**
 * Middleware: require a valid access token.
 * Checks httpOnly cookie first, then Authorization header.
 */
function requireAuth(req, res, next) {
  let token = req.cookies?.access_token;

  // Fallback: check Authorization header (for API clients)
  if (!token) {
    const authHeader = req.headers.authorization;
    if (authHeader?.startsWith('Bearer ')) {
      token = authHeader.slice(7);
    }
  }

  if (!token) {
    return res.status(401).json({ error: 'Authentication required' });
  }

  try {
    const decoded = verifyToken(token);
    if (decoded.type !== 'access') {
      return res.status(401).json({ error: 'Invalid token type' });
    }
    req.user = decoded;
    next();
  } catch (err) {
    if (err.name === 'TokenExpiredError') {
      return res.status(401).json({ error: 'Token expired', code: 'TOKEN_EXPIRED' });
    }
    return res.status(401).json({ error: 'Invalid token' });
  }
}

/**
 * Middleware: require a specific role.
 */
function requireRole(...roles) {
  return (req, res, next) => {
    if (!roles.includes(req.user?.role)) {
      return res.status(403).json({ error: 'Insufficient permissions' });
    }
    next();
  };
}

module.exports = { requireAuth, requireRole };
```
Registration and Login Routes
```javascript
const express = require('express');
const router = express.Router();
const { hashPassword, verifyPassword } = require('../services/password-argon2');
const { generateAccessToken, generateRefreshToken, verifyToken } = require('../services/jwt');
const { validatePassword } = require('../middleware/validate-password');

// In-memory store for demo purposes.
// In production, use PostgreSQL, MongoDB, or another persistent store.
const users = new Map();

// Cookie configuration — secure defaults
const COOKIE_OPTIONS = {
  httpOnly: true, // Not accessible via JavaScript
  secure: process.env.NODE_ENV === 'production', // HTTPS only in prod
  sameSite: 'strict', // No cross-site sending
  path: '/'
};

// ─────────── REGISTER ───────────
router.post('/register', async (req, res) => {
  try {
    const { email, password, name } = req.body;

    // Validate input
    if (!email || !password || !name) {
      return res.status(400).json({ error: 'Email, password, and name required' });
    }

    // Validate password strength
    const pwCheck = validatePassword(password);
    if (!pwCheck.valid) {
      return res.status(400).json({ error: 'Weak password', details: pwCheck.errors });
    }

    // Check for existing user
    if (users.has(email)) {
      return res.status(409).json({ error: 'Email already registered' });
    }

    // Hash password with Argon2id
    const passwordHash = await hashPassword(password);

    // Create user
    const user = {
      id: `user_${Date.now()}`,
      email,
      name,
      passwordHash,
      role: 'user',
      tokenVersion: 0,
      createdAt: new Date().toISOString()
    };
    users.set(email, user);

    // Generate tokens
    const accessToken = generateAccessToken(user);
    const refreshToken = generateRefreshToken(user);

    // Set httpOnly cookies
    res.cookie('access_token', accessToken, {
      ...COOKIE_OPTIONS,
      maxAge: 15 * 60 * 1000 // 15 minutes
    });
    res.cookie('refresh_token', refreshToken, {
      ...COOKIE_OPTIONS,
      maxAge: 7 * 24 * 60 * 60 * 1000, // 7 days
      path: '/api/auth/refresh' // Only sent to refresh endpoint
    });

    res.status(201).json({
      message: 'Registration successful',
      user: { id: user.id, email: user.email, name: user.name }
    });
  } catch (err) {
    console.error('Registration error:', err);
    res.status(500).json({ error: 'Registration failed' });
  }
});

// ─────────── LOGIN ───────────
router.post('/login', async (req, res) => {
  try {
    const { email, password } = req.body;

    const user = users.get(email);
    if (!user) {
      // Constant-time: hash a dummy password to prevent timing attacks
      await hashPassword('dummy-password-for-timing');
      return res.status(401).json({ error: 'Invalid credentials' });
    }

    const valid = await verifyPassword(password, user.passwordHash);
    if (!valid) {
      return res.status(401).json({ error: 'Invalid credentials' });
    }

    // Generate tokens
    const accessToken = generateAccessToken(user);
    const refreshToken = generateRefreshToken(user);

    res.cookie('access_token', accessToken, {
      ...COOKIE_OPTIONS,
      maxAge: 15 * 60 * 1000
    });
    res.cookie('refresh_token', refreshToken, {
      ...COOKIE_OPTIONS,
      maxAge: 7 * 24 * 60 * 60 * 1000,
      path: '/api/auth/refresh'
    });

    res.json({
      message: 'Login successful',
      user: { id: user.id, email: user.email, name: user.name }
    });
  } catch (err) {
    console.error('Login error:', err);
    res.status(500).json({ error: 'Login failed' });
  }
});

// ─────────── REFRESH ───────────
router.post('/refresh', async (req, res) => {
  const refreshToken = req.cookies?.refresh_token;
  if (!refreshToken) {
    return res.status(401).json({ error: 'No refresh token' });
  }

  try {
    const decoded = verifyToken(refreshToken);
    if (decoded.type !== 'refresh') {
      return res.status(401).json({ error: 'Invalid token type' });
    }

    // Look up user — verify they still exist and token version matches
    const user = [...users.values()].find(u => u.id === decoded.sub);
    if (!user || user.tokenVersion !== decoded.version) {
      return res.status(401).json({ error: 'Token revoked' });
    }

    // Issue new access token
    const accessToken = generateAccessToken(user);
    res.cookie('access_token', accessToken, {
      ...COOKIE_OPTIONS,
      maxAge: 15 * 60 * 1000
    });

    res.json({ message: 'Token refreshed' });
  } catch (err) {
    res.status(401).json({ error: 'Invalid refresh token' });
  }
});

// ─────────── LOGOUT ───────────
router.post('/logout', (req, res) => {
  res.clearCookie('access_token', COOKIE_OPTIONS);
  res.clearCookie('refresh_token', { ...COOKIE_OPTIONS, path: '/api/auth/refresh' });
  res.json({ message: 'Logged out' });
});

module.exports = router;
```
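The `tokenVersion` claim that the refresh route checks also enables a "log out everywhere" endpoint: bump the version, and every previously issued refresh token fails the version comparison. A sketch, with the route wiring expressed as a function so the `users`, `requireAuth`, and `COOKIE_OPTIONS` values from earlier sections can be passed in:

```javascript
// Sketch: revoke all outstanding refresh tokens for one user.
function revokeAllSessions(user) {
  user.tokenVersion = (user.tokenVersion || 0) + 1;
  return user.tokenVersion;
}

// Wire-up inside src/routes/auth.js, reusing values from this section
// and requireAuth from the middleware above.
function registerLogoutAll(router, { users, requireAuth, COOKIE_OPTIONS }) {
  router.post('/logout-all', requireAuth, (req, res) => {
    const user = [...users.values()].find(u => u.id === req.user.sub);
    if (user) revokeAllSessions(user);
    res.clearCookie('access_token', COOKIE_OPTIONS);
    res.clearCookie('refresh_token', { ...COOKIE_OPTIONS, path: '/api/auth/refresh' });
    res.json({ message: 'All sessions revoked' });
  });
}

module.exports = { revokeAllSessions, registerLogoutAll };
```

Existing access tokens stay valid until their 15-minute expiry (this is the revocation tradeoff from the comparison table); only the refresh path is cut off immediately.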
Protected Route Example
```javascript
const express = require('express');
const router = express.Router();
const { requireAuth, requireRole } = require('../middleware/auth');

// Any authenticated user can access their profile
router.get('/profile', requireAuth, (req, res) => {
  res.json({
    userId: req.user.sub,
    email: req.user.email,
    role: req.user.role,
    message: 'You are authenticated'
  });
});

// Only admins can access this route
router.get('/admin/dashboard', requireAuth, requireRole('admin'), (req, res) => {
  res.json({ message: 'Admin dashboard data', userId: req.user.sub });
});

module.exports = router;
```
6. OAuth 2.0 / OpenID Connect Integration
For most production applications, you will want to support social login (Google, GitHub, Apple) alongside email/password. OAuth 2.0 with PKCE is the modern standard. Here is a conceptual flow and practical implementation.
The Authorization Code Flow with PKCE
PKCE (Proof Key for Code Exchange, pronounced "pixy") prevents authorization code interception attacks. It is required for public clients (SPAs, mobile apps) and strongly recommended for all OAuth flows.
1. Your app generates a random `code_verifier` and its SHA-256 hash, the `code_challenge`.
2. The user is redirected to the OAuth provider with the `code_challenge`.
3. The user authenticates and is redirected back with an authorization `code`.
4. Your backend exchanges the `code` + `code_verifier` for tokens. The provider verifies that `SHA256(code_verifier) == code_challenge`.
```javascript
const crypto = require('crypto');

/**
 * Generate PKCE code verifier and challenge.
 * RFC 7636 compliant.
 */
function generatePKCE() {
  // 32 bytes = 43 base64url characters (meets 43-128 char requirement)
  const codeVerifier = crypto.randomBytes(32).toString('base64url');

  const codeChallenge = crypto
    .createHash('sha256')
    .update(codeVerifier)
    .digest('base64url');

  return { codeVerifier, codeChallenge };
}

/**
 * Build an OAuth authorization URL.
 */
function buildAuthURL(provider, { clientId, redirectUri, codeChallenge, state }) {
  const providers = {
    google: {
      authUrl: 'https://accounts.google.com/o/oauth2/v2/auth',
      scope: 'openid email profile'
    },
    github: {
      authUrl: 'https://github.com/login/oauth/authorize',
      scope: 'user:email'
    }
  };

  const cfg = providers[provider];
  if (!cfg) throw new Error(`Unknown provider: ${provider}`);

  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: 'code',
    scope: cfg.scope,
    state: state,
    code_challenge: codeChallenge,
    code_challenge_method: 'S256'
  });

  return `${cfg.authUrl}?${params.toString()}`;
}

module.exports = { generatePKCE, buildAuthURL };
```
OAuth Callback Route
```javascript
const express = require('express');
const router = express.Router();
const { generatePKCE, buildAuthURL } = require('../services/oauth');
const crypto = require('crypto');

// Temporary PKCE store (use Redis in production)
const pkceStore = new Map();

// ─────────── INITIATE OAUTH ───────────
router.get('/oauth/:provider', (req, res) => {
  const { codeVerifier, codeChallenge } = generatePKCE();
  const state = crypto.randomBytes(16).toString('hex');

  // Store PKCE verifier keyed by state (expires in 10 minutes)
  pkceStore.set(state, { codeVerifier, expiresAt: Date.now() + 600000 });

  const authURL = buildAuthURL(req.params.provider, {
    clientId: process.env[`${req.params.provider.toUpperCase()}_CLIENT_ID`],
    redirectUri: `${process.env.APP_URL}/api/auth/oauth/callback`,
    codeChallenge,
    state
  });

  res.redirect(authURL);
});

// ─────────── OAUTH CALLBACK ───────────
router.get('/oauth/callback', async (req, res) => {
  const { code, state } = req.query;

  const stored = pkceStore.get(state);
  if (!stored || stored.expiresAt < Date.now()) {
    return res.status(400).json({ error: 'Invalid or expired state' });
  }
  pkceStore.delete(state);

  // Exchange authorization code for tokens using code_verifier
  // (Provider-specific token exchange omitted for brevity)

  // Once you have the user's email and profile from the provider:
  // 1. Find or create user in your database
  // 2. Generate JWT tokens (same as login flow)
  // 3. Set httpOnly cookies
  // 4. Redirect to frontend

  res.redirect('/dashboard');
});

module.exports = router;
```
OAuth tells you that a user successfully authenticated with Google or GitHub. It does not tell you what they are authorized to do in your application. Always maintain your own user records and role assignments. The OAuth profile data (email, name) is just a verified identity claim.
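The token exchange elided in the callback above can be sketched for Google's token endpoint. The parameter names follow RFC 6749 and RFC 7636; other providers differ slightly, and the helper function here is illustrative rather than part of the services built earlier:

```javascript
// Sketch: exchange the authorization code + PKCE verifier for tokens.
function buildTokenRequestBody({ code, codeVerifier, clientId, clientSecret, redirectUri }) {
  return new URLSearchParams({
    grant_type: 'authorization_code',
    code,
    client_id: clientId,
    client_secret: clientSecret,
    redirect_uri: redirectUri,
    code_verifier: codeVerifier // provider checks SHA256(verifier) == challenge
  });
}

async function exchangeCodeForTokens(opts) {
  const res = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: buildTokenRequestBody(opts)
  });
  if (!res.ok) throw new Error(`Token exchange failed: ${res.status}`);
  // For OIDC providers the response includes an id_token (a JWT carrying
  // the verified email) alongside access_token; validate its signature
  // against the provider's published JWKS before trusting any claim.
  return res.json();
}

module.exports = { buildTokenRequestBody, exchangeCodeForTokens };
```

From the decoded (and verified) `id_token` you take the email, find or create the local user, and then issue your own JWTs exactly as in the login flow.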
7. Adding Biometric Authentication with the H33 API
Passwords can be stolen, phished, and brute-forced. OAuth delegates trust to third parties. Biometric authentication binds identity to biology — something the user is, not something they know or something a third party vouches for. H33's API makes this accessible in a single API call, with the biometric template processed under fully homomorphic encryption (FHE) so that raw biometric data never leaves the user's device in cleartext.
How It Works
- Enrollment — The user captures a biometric (face, fingerprint, voice). The H33 SDK on the client encrypts the biometric template using FHE before transmitting it.
- Storage — H33 stores only the encrypted template. Raw biometric data never touches your server or H33's servers in decryptable form.
- Verification — On subsequent logins, a fresh biometric capture is encrypted and sent. H33 computes a similarity match on the encrypted data using FHE, returning only a boolean match result and a confidence score.
Why FHE Matters for Biometrics
Unlike passwords, biometrics cannot be changed if compromised. If your database is breached and contains raw fingerprint or face templates, those users are permanently exposed. FHE ensures that even a complete database breach reveals nothing — the encrypted templates are computationally indistinguishable from random data without the decryption key, which never leaves the client device.
Install and Initialize the H33 SDK
```javascript
const { H33Client } = require('@h33/sdk');

/**
 * Initialize the H33 client.
 * API key from h33.ai/get-api-key — free tier includes 10,000 calls/month.
 */
const h33 = new H33Client({
  apiKey: process.env.H33_API_KEY,
  baseUrl: process.env.H33_API_URL || 'https://api.h33.ai/v1',
  timeout: 5000 // 5s timeout — auth should be fast
});

module.exports = { h33 };
```
Biometric Enrollment Endpoint
```javascript
const express = require('express');
const router = express.Router();
const { requireAuth } = require('../middleware/auth');
const { h33 } = require('../services/h33');

/**
 * POST /api/auth/biometric/enroll
 *
 * Enroll a biometric template for the authenticated user.
 * The template arrives already encrypted (FHE) from the client SDK.
 */
router.post('/enroll', requireAuth, async (req, res) => {
  try {
    const { encryptedTemplate, modality } = req.body;

    if (!encryptedTemplate || !modality) {
      return res.status(400).json({ error: 'encryptedTemplate and modality are required' });
    }

    // Call H33 API to enroll the biometric template
    const enrollment = await h33.biometric.enroll({
      userId: req.user.sub,
      encryptedTemplate: encryptedTemplate,
      modality: modality, // 'face', 'fingerprint', or 'voice'
      metadata: {
        enrolledAt: new Date().toISOString(),
        deviceId: req.headers['x-device-id'] || 'unknown'
      }
    });

    res.status(201).json({
      message: 'Biometric enrolled successfully',
      enrollmentId: enrollment.id,
      modality: enrollment.modality,
      // H33 returns the security level of the enrollment
      securityLevel: enrollment.securityLevel
    });
  } catch (err) {
    console.error('Biometric enrollment error:', err);
    res.status(500).json({ error: 'Enrollment failed' });
  }
});

/**
 * POST /api/auth/biometric/verify
 *
 * Verify a biometric sample against the enrolled template.
 * All computation happens on encrypted data via FHE.
 */
router.post('/verify', async (req, res) => {
  try {
    const { userId, encryptedSample, modality } = req.body;

    if (!userId || !encryptedSample || !modality) {
      return res.status(400).json({ error: 'userId, encryptedSample, and modality are required' });
    }

    // Call H33 API to verify biometric match (FHE computation)
    const result = await h33.biometric.verify({
      userId: userId,
      encryptedSample: encryptedSample,
      modality: modality
    });

    if (result.match) {
      // Biometric verified — issue tokens
      const { generateAccessToken, generateRefreshToken } = require('../services/jwt');

      // Look up the user (replace with your database query)
      const user = { id: userId, email: result.email, role: 'user' };

      const accessToken = generateAccessToken(user);
      const refreshToken = generateRefreshToken(user);

      res.cookie('access_token', accessToken, {
        httpOnly: true,
        secure: true,
        sameSite: 'strict',
        maxAge: 15 * 60 * 1000
      });

      res.json({
        authenticated: true,
        confidence: result.confidence,
        latencyUs: result.latencyUs, // Typically ~50µs per auth
        postQuantumSecure: true
      });
    } else {
      res.status(401).json({
        authenticated: false,
        confidence: result.confidence,
        error: 'Biometric verification failed'
      });
    }
  } catch (err) {
    console.error('Biometric verify error:', err);
    res.status(500).json({ error: 'Verification failed' });
  }
});

module.exports = router;
```
Wire the biometric routes into your app:
```javascript
const biometricRoutes = require('./routes/biometric');
app.use('/api/auth/biometric', biometricRoutes);
```
Client-Side Integration
On the frontend, the H33 client SDK handles biometric capture and FHE encryption. Here is a minimal example:
```javascript
import { H33Client } from '@h33/client-sdk';

const h33 = new H33Client({ publicKey: 'your_public_key' });

// Capture and encrypt biometric on the user's device
async function enrollBiometric() {
  // Captures face/fingerprint, encrypts via FHE before sending
  const capture = await h33.biometric.capture({ modality: 'face' });

  // Send encrypted template to your backend
  const response = await fetch('/api/auth/biometric/enroll', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'include',
    body: JSON.stringify({
      encryptedTemplate: capture.encryptedTemplate,
      modality: 'face'
    })
  });
  return response.json();
}

// Biometric login — no password required
async function biometricLogin(userId) {
  const capture = await h33.biometric.capture({ modality: 'face' });

  const response = await fetch('/api/auth/biometric/verify', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'include',
    body: JSON.stringify({
      userId,
      encryptedSample: capture.encryptedTemplate,
      modality: 'face'
    })
  });

  const result = await response.json();
  if (result.authenticated) {
    console.log(`Authenticated in ${result.latencyUs}µs (post-quantum secure)`);
  }
  return result;
}
```
H33's biometric verification pipeline runs at approximately 50 microseconds per authentication, sustaining over 1.2 million authentications per second on production hardware. The entire pipeline — FHE biometric matching, zero-knowledge proof verification, and post-quantum attestation — executes in a single API call. See the live benchmarks for current numbers.
8. Post-Quantum Security Considerations
Every authentication system built with classical cryptography has an expiration date. RSA, ECDSA, and the Diffie-Hellman key exchanges that underpin TLS, JWTs, and OAuth are all vulnerable to Shor's algorithm running on a sufficiently powerful quantum computer. The question is not whether this will happen, but when.
What Breaks and When
| Component | Classical Algorithm | Quantum Threat | Impact |
|---|---|---|---|
| JWT signing | RS256 (RSA), ES256 (ECDSA) | Shor's algorithm | Tokens forgeable |
| TLS handshake | ECDH key exchange | Shor's algorithm | Traffic decryptable |
| OAuth tokens | Provider-issued, RSA/ECDSA signed | Shor's algorithm | Identity forgeable |
| Password hashes | bcrypt, Argon2 | Grover's algorithm (quadratic speedup) | Minimal — increase cost factor |
| AES-256 | Symmetric encryption | Grover's algorithm (quadratic speedup) | Still 128-bit effective security |
Password hashing is relatively safe — Grover's algorithm provides at most a quadratic speedup, which halves the effective security bits and can be offset by raising work factors and encouraging longer passphrases. But everything that relies on public-key cryptography is at risk: your JWT signatures, your TLS connections, your OAuth provider's token signing keys.
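The arithmetic behind those table rows is easy to sanity-check in a few lines:

```javascript
// Back-of-envelope check: Grover search over a space of size N costs on
// the order of sqrt(N) evaluations, so n bits of symmetric or hash
// security drop to roughly n/2 effective bits.
function groverEffectiveBits(classicalBits) {
  return classicalBits / 2;
}

// A random 10-character lowercase password carries about 47 bits of
// entropy (10 * log2(26)), i.e. roughly 23.5 effective bits under Grover.
const lowercasePasswordBits = 10 * Math.log2(26);

module.exports = { groverEffectiveBits, lowercasePasswordBits };
```

This is why AES-256 survives (128 effective bits remain ample) while short passwords, already weak classically, become even weaker.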
The Harvest-Now, Decrypt-Later Threat
The most insidious threat is already happening. Nation-state adversaries are intercepting and storing encrypted network traffic today, planning to decrypt it once quantum computers become available. If your authentication tokens, session data, or API traffic is captured in transit, it can be decrypted retroactively. This is why NIST has been standardizing post-quantum algorithms since 2016 and why migration is urgent now, not when quantum computers are publicly demonstrated.
If an attacker records your JWTs today and later cracks the signing key with a quantum computer, they can forge tokens that appear valid for any user at any point in time. Your audit logs, your session histories, your signed documents — all retroactively compromised. Post-quantum authentication is not a future concern. The data being harvested right now is the attack surface.
NIST Post-Quantum Standards
NIST finalized three post-quantum standards in 2024:
- FIPS 203 (ML-KEM / Kyber) — Key encapsulation for secure key exchange. Replaces ECDH in TLS.
- FIPS 204 (ML-DSA / Dilithium) — Digital signatures. Replaces RSA and ECDSA for signing.
- FIPS 205 (SLH-DSA / SPHINCS+) — Hash-based signatures. Conservative alternative, very large signatures but minimal hardness assumptions.
H33's API abstracts these standards into simple API calls. You do not need to understand the lattice mathematics or manage key generation parameters. You call the API, and you get post-quantum security.
9. Rate Limiting and Brute-Force Protection
No authentication system is complete without rate limiting. Without it, an attacker can attempt millions of login guesses, enumerate valid usernames, or exhaust your API quota. Here is a layered approach that combines IP-based rate limiting with account-based lockout.
IP-Based Rate Limiting
```javascript
const rateLimit = require('express-rate-limit');

/**
 * General API rate limit: 100 requests per 15 minutes per IP.
 */
const generalLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100,
  message: { error: 'Too many requests, please try again later' },
  standardHeaders: true, // Return rate limit info in headers
  legacyHeaders: false
});

/**
 * Strict auth rate limit: 5 login attempts per 15 minutes per IP.
 * This is your primary brute-force defense.
 */
const authLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 5,
  message: {
    error: 'Too many login attempts. Try again in 15 minutes.',
    retryAfter: 900
  },
  standardHeaders: true,
  legacyHeaders: false,
  // Skip successful requests — only count failures
  skipSuccessfulRequests: true
});

/**
 * Account registration limiter: 3 accounts per hour per IP.
 */
const registrationLimiter = rateLimit({
  windowMs: 60 * 60 * 1000, // 1 hour
  max: 3,
  message: { error: 'Too many accounts created. Try again in an hour.' }
});

module.exports = { generalLimiter, authLimiter, registrationLimiter };
```
Apply Rate Limiters to Routes
```javascript
const { generalLimiter, authLimiter, registrationLimiter } = require('./middleware/rate-limit');

// Apply general limiter to all routes
app.use(generalLimiter);

// Apply strict limiters to auth endpoints
app.use('/api/auth/login', authLimiter);
app.use('/api/auth/register', registrationLimiter);
```
Account-Based Lockout
IP-based rate limiting alone is insufficient because attackers can rotate IP addresses. Complement it with account-based lockout that tracks failed attempts per user:
```javascript
// In-memory store for demo (use Redis in production)
const failedAttempts = new Map();

const MAX_ATTEMPTS = 5;
const LOCKOUT_DURATION = 15 * 60 * 1000; // 15 minutes

/**
 * Check if an account is currently locked.
 */
function isLocked(email) {
  const record = failedAttempts.get(email);
  if (!record) return false;

  // Auto-unlock after lockout duration
  if (record.lockedUntil && Date.now() > record.lockedUntil) {
    failedAttempts.delete(email);
    return false;
  }

  return record.count >= MAX_ATTEMPTS;
}

/**
 * Record a failed login attempt.
 */
function recordFailure(email) {
  const record = failedAttempts.get(email) || { count: 0 };
  record.count += 1;
  record.lastAttempt = Date.now();

  if (record.count >= MAX_ATTEMPTS) {
    record.lockedUntil = Date.now() + LOCKOUT_DURATION;
  }

  failedAttempts.set(email, record);
  return record;
}

/**
 * Clear failed attempts after successful login.
 */
function clearFailures(email) {
  failedAttempts.delete(email);
}

module.exports = { isLocked, recordFailure, clearFailures };
```
Integrate account lockout into your login route by adding these checks before and after password verification:
```javascript
const { isLocked, recordFailure, clearFailures } = require('../services/account-lockout');

const MAX_ATTEMPTS = 5; // keep in sync with the lockout service's threshold

// Inside the login handler, before password check:
if (isLocked(email)) {
  return res.status(429).json({
    error: 'Account temporarily locked due to too many failed attempts',
    retryAfter: 900
  });
}

// After failed password check:
const failure = recordFailure(email);
const remaining = MAX_ATTEMPTS - failure.count;
return res.status(401).json({
  error: 'Invalid credentials',
  attemptsRemaining: Math.max(0, remaining)
});

// After successful password check:
clearFailures(email);
```
The `attemptsRemaining` field is useful for UX but can help attackers calibrate their approach. In high-security environments, return the same generic "Invalid credentials" message regardless of whether the email exists, the password is wrong, or the account is locked. Balance this against user experience for your specific use case.
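One way to implement the generic-response approach (a sketch — the helper name and logger are ours, not part of any library): route every failure reason through a single function, so unknown email, wrong password, and locked account are indistinguishable on the wire while the real reason still reaches your logs.

```javascript
// Every auth failure produces the identical external response; the real
// reason is recorded server-side only, for monitoring and alerting.
function authFailureResponse(reason, log = () => {}) {
  log(`auth failure: ${reason}`); // internal detail never leaves the server
  return { status: 401, body: { error: 'Invalid credentials' } };
}
```

Note this trades away the `retryAfter` hint for locked accounts — the privacy gain is that an attacker can no longer confirm an account exists by observing a different status code.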
10. Best Practices Checklist
Here is a comprehensive checklist drawn from OWASP, NIST SP 800-63B, and our experience running authentication at scale. Treat this as a minimum bar, not a ceiling.
Password and Credential Storage
- Use Argon2id for new deployments; bcrypt (cost 12+) for existing systems.
- Never store passwords in plaintext, MD5, SHA-1, or unsalted SHA-256.
- Enforce minimum 8-character passwords (NIST SP 800-63B). No arbitrary complexity rules.
- Check passwords against known-breached lists (HaveIBeenPwned API).
- Never log passwords, tokens, or session identifiers.
Token and Session Management
- Store JWTs in httpOnly, Secure, SameSite=Strict cookies. Never in localStorage.
- Use short-lived access tokens (15 minutes max) with refresh token rotation.
- Validate the token `type` field to prevent refresh tokens from being used as access tokens.
- Implement token version counters for forced revocation.
- Always verify the `iss` (issuer) and `aud` (audience) claims.
Transport and Infrastructure
- Enforce HTTPS everywhere. Use HSTS headers with a minimum 1-year max-age.
- Set `Content-Security-Policy`, `X-Frame-Options`, and `X-Content-Type-Options` headers (Helmet handles this).
- Rate-limit authentication endpoints aggressively (5 attempts per 15 minutes).
- Implement account lockout after repeated failures.
- Use zero-trust architecture — verify every request, not just the first one.
Biometric and Multi-Factor
- Never store raw biometric templates. Use FHE-encrypted storage (H33 handles this).
- Support multiple biometric modalities for accessibility.
- Implement liveness detection to prevent replay attacks.
- Provide fallback authentication methods (e.g., recovery codes).
Post-Quantum Readiness
- Plan your migration timeline now. NIST guidance (IR 8547) deprecates quantum-vulnerable algorithms by 2030 and disallows them after 2035.
- Audit all cryptographic dependencies: JWT signing, TLS certificates, key exchange protocols.
- Use hybrid approaches (classical + PQ) during the transition period.
- Choose an auth provider that supports post-quantum algorithms natively.
The 30-Second Audit
Run this quick check on your current authentication: (1) Are passwords hashed with Argon2id or bcrypt? (2) Are tokens in httpOnly cookies, not localStorage? (3) Is rate limiting active on login endpoints? (4) Do you have a post-quantum migration plan? If you answered "no" to any of these, you have work to do.
11. Migrating to Post-Quantum Auth with H33
The transition from classical to post-quantum authentication does not need to be a rip-and-replace migration. H33's API is designed to be additive — you can layer post-quantum security on top of your existing authentication system in three phases.
Phase 1: Add H33 as a Verification Layer
Keep your existing JWT/password authentication intact. Add H33 verification as an additional check for sensitive operations. This is a non-breaking change that gives you post-quantum security where it matters most.
```javascript
const { h33 } = require('../services/h33');

/**
 * Middleware: add post-quantum attestation to sensitive routes.
 * This wraps existing auth — does not replace it.
 */
function requirePQAttestation(req, res, next) {
  const attestation = req.headers['x-h33-attestation'];

  if (!attestation) {
    return res.status(403).json({
      error: 'Post-quantum attestation required for this operation',
      docs: 'https://h33.ai/docs/api#attestation'
    });
  }

  h33.attestation.verify({ token: attestation, userId: req.user.sub })
    .then(result => {
      if (!result.valid) {
        return res.status(403).json({ error: 'Attestation invalid' });
      }
      req.pqAttestation = result;
      next();
    })
    .catch(err => {
      console.error('PQ attestation error:', err);
      res.status(500).json({ error: 'Attestation verification failed' });
    });
}

module.exports = { requirePQAttestation };
```
Apply it to sensitive routes:
```javascript
const { requirePQAttestation } = require('../middleware/pq-verify');

// Existing auth is still checked first (requireAuth).
// PQ attestation is an additional layer for high-value operations.
router.post('/transfer-funds',
  requireAuth,
  requirePQAttestation, // Post-quantum verification
  async (req, res) => {
    // Both classical JWT and PQ attestation verified.
    // Safe to proceed with sensitive operation.
    res.json({ message: 'Transfer authorized (PQ-secure)' });
  }
);
```
Phase 2: Add Biometric Enrollment
Offer biometric authentication as an option for users who want passwordless login. The enrollment flow is additive — users who do not enroll continue using password + JWT authentication.
```javascript
// On the settings page, offer biometric enrollment
router.post('/settings/enable-biometric', requireAuth, async (req, res) => {
  const { encryptedTemplate, modality } = req.body;

  const enrollment = await h33.biometric.enroll({
    userId: req.user.sub,
    encryptedTemplate,
    modality
  });

  // Update user record to indicate biometric is available
  // user.biometricEnabled = true;
  // user.biometricModality = modality;

  res.json({
    message: 'Biometric authentication enabled',
    enrollmentId: enrollment.id
  });
});

// Login route now checks for biometric option
router.post('/login', async (req, res) => {
  const { email, password, biometricSample, modality } = req.body;

  if (biometricSample) {
    // Biometric login path (post-quantum secure)
    const result = await h33.biometric.verify({
      userId: email,
      encryptedSample: biometricSample,
      modality
    });
    if (result.match) {
      // Issue tokens and return
    }
  } else {
    // Classical password login path (existing code)
  }
});
```
Phase 3: Full Post-Quantum Migration
Once biometric enrollment reaches critical mass, you can make post-quantum authentication the default and begin deprecating password-only login. H33's API supports signing authentication tokens with post-quantum algorithms (CRYSTALS-Dilithium) instead of classical RSA or ECDSA:
```bash
curl -X POST https://api.h33.ai/v1/auth/token \
  -H "Authorization: Bearer $H33_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "userId": "user_12345",
    "claims": { "email": "user@example.com", "role": "user" },
    "signatureAlgorithm": "dilithium3",
    "expiresIn": "15m"
  }'
```

The response is a post-quantum signed authentication token:

```json
{
  "token": "h33_pq_eyJ...",
  "signatureAlgorithm": "dilithium3",
  "postQuantumSecure": true,
  "expiresAt": "2026-02-24T15:30:00Z",
  "latencyUs": 48
}
```
The Migration Timeline
| Phase | Timeline | Action | User Impact |
|---|---|---|---|
| Phase 1 | Week 1 | Add PQ attestation to sensitive routes | None — transparent to users |
| Phase 2 | Weeks 2-4 | Offer biometric enrollment (optional) | Opt-in — biometric login available |
| Phase 3 | Months 2-3 | PQ-signed tokens as default | Transparent — tokens upgraded |
| Phase 4 | Month 6+ | Deprecate password-only login | Users prompted to add biometric |
Conclusion
Authentication in 2026 demands more than passwords and JWTs. The threat landscape has evolved — credential stuffing is industrialized, phishing is AI-enhanced, and quantum computing is advancing toward breaking the classical cryptography that underpins every authentication token in use today.
In this tutorial, we built a complete Node.js authentication system from the ground up:
- Argon2id password hashing — Memory-hard, GPU-resistant, OWASP-recommended.
- JWT access/refresh tokens in httpOnly cookies — Stateless, XSS-protected, short-lived.
- OAuth 2.0 with PKCE — Modern social login without exposing authorization codes.
- FHE biometric authentication via H33 — Passwordless login where raw biometrics never leave the device in cleartext.
- Post-quantum attestation — Authentication tokens signed with lattice-based algorithms that resist quantum attacks.
- Layered rate limiting and account lockout — Defense in depth against brute-force attacks.
The migration path is incremental. You do not need to rebuild your authentication system overnight. Start by adding H33's post-quantum attestation to your most sensitive routes (Phase 1). Offer biometric enrollment to interested users (Phase 2). Transition to PQ-signed tokens (Phase 3). Each phase is a standalone improvement that makes your system more secure.
The code in this tutorial is available, runnable, and production-oriented. The H33 free tier includes 10,000 API calls per month — enough to build, test, and deploy a fully post-quantum authentication system without spending a dollar.
The quantum clock is ticking. Start building.
Ready to Go Quantum-Secure?
Start protecting your users with post-quantum authentication today. Free tier includes 10,000 API calls/month. No credit card required.
Get Free API Key →