Post-Quantum Cryptography: A Practical Migration Guide for Engineering Teams
Prepare your systems for the quantum threat. A step-by-step guide to post-quantum cryptography migration covering NIST standards, crypto-agility, hybrid deployments, and compliance timelines.
Quantum computers capable of breaking RSA-2048 and ECC do not exist today. But adversaries are not waiting for that day to arrive. Intelligence agencies and advanced persistent threat groups are already harvesting encrypted data in transit, storing it for the day when a sufficiently powerful quantum computer can decrypt it. This strategy, known as "harvest now, decrypt later," means that any data encrypted today with classical algorithms and expected to remain confidential for 10 or more years is already at risk. NIST finalized its first set of post-quantum cryptography (PQC) standards in August 2024, and the U.S. government's CNSA 2.0 timeline requires PQC adoption for all new system acquisitions by January 2027. The migration window is not theoretical - it is closing.
This guide provides engineering teams with a concrete, step-by-step path from classical cryptography to post-quantum readiness. Every section includes practical code examples, configuration templates, and tooling recommendations you can start using today.
Why Post-Quantum Matters Now
The Harvest Now, Decrypt Later Threat
The most immediate quantum threat is not a future attack on live systems. It is the retroactive decryption of data being captured right now. Adversaries with sufficient resources - nation-state actors, well-funded criminal organizations - are recording encrypted traffic at scale. When a cryptographically relevant quantum computer (CRQC) becomes available, they will decrypt that stored data.
Consider the practical implications:
- Government classified communications intercepted today could be decrypted in 10 to 15 years
- Healthcare records transmitted over TLS today have regulatory protection requirements extending decades
- Financial transaction data and trade secrets captured now retain their value for years
- Legal communications protected by attorney-client privilege have indefinite confidentiality requirements
If your data needs to stay secret for longer than the time it takes for a CRQC to arrive, you are already late.
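This is Mosca's inequality in action: if the years data must stay confidential plus the years migration will take exceed the years until a CRQC arrives, that data is already exposed. A minimal sketch with illustrative numbers (the function name and the estimates are ours, not from any standard):

```python
def quantum_exposed(shelf_life_years: float,
                    migration_years: float,
                    crqc_eta_years: float) -> bool:
    """Mosca's inequality: data captured today is at risk if it must
    stay secret longer than the CRQC is away, once you also count
    the years the migration itself will consume."""
    return shelf_life_years + migration_years > crqc_eta_years

# Healthcare records: 50y confidentiality, ~5y migration, CRQC in ~15y
print(quantum_exposed(50, 5, 15))    # True: exposed to harvest-now
# Short-lived session tokens are fine even under aggressive estimates
print(quantum_exposed(0.001, 5, 10)) # False
```

Plug in your own estimates; the point is that migration time counts against you even before a CRQC exists.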
Quantum Computing Timeline
The arrival date of a cryptographically relevant quantum computer is a subject of active debate, but estimates from cryptographers and quantum physicists have been converging:
- IBM published a roadmap targeting 100,000+ qubits by 2033
- Google demonstrated below-threshold quantum error correction in 2024, a milestone that strengthened confidence in scaling roadmaps
- NIST estimates suggest a CRQC capable of breaking RSA-2048 could exist within 10 to 20 years
- Chinese research groups have published papers on quantum factoring advances, though claims vary in credibility
The critical point is not the exact date. It is that migration takes years, and you cannot start after the threat materializes.
NIST Finalized Standards
In August 2024, NIST published three finalized post-quantum cryptographic standards:
| Standard | Algorithm Basis | Purpose | FIPS |
|---|---|---|---|
| ML-KEM | CRYSTALS-Kyber | Key Encapsulation Mechanism | FIPS 203 |
| ML-DSA | CRYSTALS-Dilithium | Digital Signatures | FIPS 204 |
| SLH-DSA | SPHINCS+ | Hash-based Digital Signatures | FIPS 205 |
These are not draft proposals. They are finalized federal standards with assigned FIPS numbers. NIST also selected HQC as a backup KEM standard, expected to be finalized in 2027.
CNSA 2.0 Compliance Requirements
The NSA's Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) sets hard deadlines for National Security Systems (NSS):
- By 2025: Software and firmware signing must use CNSA 2.0 algorithms
- By January 2027: All new system acquisitions must support CNSA 2.0
- By 2030: Web browsers, servers, and cloud environments must exclusively use CNSA 2.0
- By 2033: Full transition for all NSS systems, including legacy
Even if you are not building for the U.S. government, these timelines signal the direction the entire industry is heading. Your customers, partners, and regulators will follow.
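To make these deadlines actionable, a deployment gate can compare algorithms in use against the CNSA 2.0 suite. The allow-list below is a simplified reading of the NSA's CNSA 2.0 advisory (ML-KEM-1024, ML-DSA-87, AES-256, SHA-384/SHA-512, and LMS/XMSS for firmware signing); the checker itself is a hypothetical sketch, not an official tool:

```python
# Simplified CNSA 2.0 allow-list; see the NSA advisory for the
# authoritative per-use-case requirements.
CNSA2_ALGORITHMS = {
    "ML-KEM-1024",         # key establishment
    "ML-DSA-87",           # digital signatures
    "AES-256",             # symmetric encryption
    "SHA-384", "SHA-512",  # hashing
    "LMS", "XMSS",         # stateful hash-based firmware signing
}

def cnsa2_compliant(algorithm: str) -> bool:
    """Return True if the algorithm name is in the CNSA 2.0 suite."""
    return algorithm.upper() in CNSA2_ALGORITHMS

for algo in ("RSA-2048", "ML-KEM-768", "ML-KEM-1024", "AES-256"):
    print(f"{algo}: {cnsa2_compliant(algo)}")
```

Note that ML-KEM-768 fails the check: CNSA 2.0 mandates the Level 5 parameter sets, which is stricter than many commercial hybrid deployments.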
Understanding the NIST PQC Standards
ML-KEM (CRYSTALS-Kyber) - Key Encapsulation
ML-KEM is the standard for key encapsulation, which is the process of securely establishing shared secret keys between parties. It replaces the key exchange functionality currently provided by RSA key exchange, ECDH, and Diffie-Hellman.
Parameter Sets:
| Parameter | Security Level | Public Key Size | Ciphertext Size | Shared Secret |
|---|---|---|---|---|
| ML-KEM-512 | NIST Level 1 (AES-128 equivalent) | 800 bytes | 768 bytes | 32 bytes |
| ML-KEM-768 | NIST Level 3 (AES-192 equivalent) | 1,184 bytes | 1,088 bytes | 32 bytes |
| ML-KEM-1024 | NIST Level 5 (AES-256 equivalent) | 1,568 bytes | 1,568 bytes | 32 bytes |
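These sizes translate directly into handshake overhead. A hybrid X25519 + ML-KEM-768 key exchange carries about 1,216 bytes of client key share and 1,120 bytes of server response, versus 64 bytes total for pure X25519 - worth budgeting for on constrained links. A back-of-envelope check using the table above (our arithmetic):

```python
# Key-share bytes on the wire: pure X25519 vs hybrid X25519+ML-KEM-768
X25519_PUB = 32                         # bytes, each direction
MLKEM768_PUB, MLKEM768_CT = 1184, 1088  # from the parameter table

classical = X25519_PUB * 2                        # client + server shares
hybrid = (X25519_PUB + MLKEM768_PUB) + (X25519_PUB + MLKEM768_CT)
print(classical, hybrid, hybrid - classical)      # 64 2336 2272
```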
When to use ML-KEM:
- TLS key exchange (replacing ECDHE)
- VPN tunnel establishment
- Secure messaging key agreement
- Any protocol that establishes shared secrets
Basic ML-KEM usage with the oqs Python library:
# pip install liboqs-python
import oqs
# Key generation
kem = oqs.KeyEncapsulation("Kyber768")  # newer liboqs releases name this "ML-KEM-768"
public_key = kem.generate_keypair()
# Encapsulation (sender side)
kem_sender = oqs.KeyEncapsulation("Kyber768")
ciphertext, shared_secret_sender = kem_sender.encap_secret(public_key)
# Decapsulation (receiver side)
shared_secret_receiver = kem.decap_secret(ciphertext)
assert shared_secret_sender == shared_secret_receiver
print(f"Public key size: {len(public_key)} bytes")
print(f"Ciphertext size: {len(ciphertext)} bytes")
print(f"Shared secret size: {len(shared_secret_sender)} bytes")

ML-DSA (CRYSTALS-Dilithium) - Digital Signatures
ML-DSA is the primary standard for digital signatures. It replaces RSA signatures, ECDSA, and EdDSA for most use cases.
Parameter Sets:
| Parameter | Security Level | Public Key Size | Signature Size |
|---|---|---|---|
| ML-DSA-44 | NIST Level 2 | 1,312 bytes | 2,420 bytes |
| ML-DSA-65 | NIST Level 3 | 1,952 bytes | 3,309 bytes |
| ML-DSA-87 | NIST Level 5 | 2,592 bytes | 4,627 bytes |
When to use ML-DSA:
- Code signing and software distribution
- TLS certificate signatures
- API request authentication
- Document signing
- JWT token signing
import oqs
# Key generation
sig = oqs.Signature("Dilithium3")  # newer liboqs releases name this "ML-DSA-65"
public_key = sig.generate_keypair()
# Signing
message = b"Critical firmware update v2.4.1"
signature = sig.sign(message)
# Verification
verifier = oqs.Signature("Dilithium3")
is_valid = verifier.verify(message, signature, public_key)
print(f"Signature valid: {is_valid}")
print(f"Public key size: {len(public_key)} bytes")
print(f"Signature size: {len(signature)} bytes")

SLH-DSA (SPHINCS+) - Hash-Based Signatures
SLH-DSA is a hash-based signature scheme. It is slower and produces larger signatures than ML-DSA, but its security relies only on the well-understood properties of hash functions, making it a conservative choice for high-assurance applications.
When to use SLH-DSA:
- Root certificate authority signatures where long-term trust is paramount
- Firmware signing for embedded devices with long lifecycles
- Situations requiring the most conservative security assumptions
- As a fallback if lattice-based cryptography (ML-DSA) faces unexpected attacks
import oqs
# SLH-DSA signing example
sig = oqs.Signature("SPHINCS+-SHA2-256f-simple")
public_key = sig.generate_keypair()
message = b"Root CA certificate payload"
signature = sig.sign(message)
verifier = oqs.Signature("SPHINCS+-SHA2-256f-simple")
is_valid = verifier.verify(message, signature, public_key)
print(f"Signature valid: {is_valid}")
print(f"Public key size: {len(public_key)} bytes")
print(f"Signature size: {len(signature)} bytes")
# Note: SLH-DSA signatures are significantly larger (up to ~49KB)

Step 1: Cryptographic Inventory
Before you can migrate, you need to know what you are migrating. Most organizations have no comprehensive inventory of where and how they use cryptography. This is the essential first step.
Scanning Your Codebase for Crypto Usage
This Python script scans your codebase for imports, function calls, and references to cryptographic libraries and algorithms:
#!/usr/bin/env python3
"""
crypto_inventory_scanner.py
Scans a codebase for cryptographic library usage and algorithm references.
Outputs a structured inventory report.
"""
import os
import re
import json
import argparse
from pathlib import Path
from collections import defaultdict
from datetime import datetime
# Patterns indicating cryptographic usage
CRYPTO_PATTERNS = {
"python": {
"libraries": [
r"from\s+cryptography[\.\s]",
r"import\s+cryptography",
r"from\s+Crypto[\.\s]",
r"import\s+Crypto",
r"import\s+hashlib",
r"import\s+hmac",
r"import\s+ssl",
r"from\s+OpenSSL",
r"import\s+OpenSSL",
r"from\s+nacl[\.\s]",
r"import\s+nacl",
r"from\s+jose[\.\s]",
r"import\s+jwt",
r"from\s+paramiko",
],
"algorithms": [
r"RSA",
r"ECDSA",
r"ECDH",
r"Ed25519",
r"X25519",
r"AES",
r"DES|3DES|TripleDES",
r"SHA-?1(?!\d)",
r"SHA-?256",
r"SHA-?384",
r"SHA-?512",
r"MD5",
r"Diffie.?Hellman|DH_",
r"HMAC",
r"PKCS",
],
},
"java": {
"libraries": [
r"javax\.crypto",
r"java\.security",
r"org\.bouncycastle",
r"KeyPairGenerator",
r"Cipher\.getInstance",
r"Signature\.getInstance",
r"KeyAgreement",
],
"algorithms": [
r"RSA",
r"ECDSA",
r"AES",
r"DES",
r"SHA",
r"MD5",
],
},
"go": {
"libraries": [
r"crypto/rsa",
r"crypto/ecdsa",
r"crypto/ed25519",
r"crypto/tls",
r"crypto/x509",
r"crypto/aes",
r"crypto/sha256",
r"golang\.org/x/crypto",
],
"algorithms": [
r"rsa\.GenerateKey",
r"ecdsa\.Sign",
r"ed25519\.Sign",
r"tls\.Config",
],
},
}
FILE_EXTENSIONS = {
".py": "python",
".java": "java",
".go": "go",
".js": "python", # Use general patterns
".ts": "python",
".rb": "python",
".rs": "python",
}
def scan_file(filepath, language):
"""Scan a single file for cryptographic patterns."""
findings = []
try:
with open(filepath, "r", encoding="utf-8", errors="ignore") as f:
lines = f.readlines()
except (PermissionError, OSError):
return findings
patterns = CRYPTO_PATTERNS.get(language, CRYPTO_PATTERNS["python"])
for line_num, line in enumerate(lines, 1):
for category in ["libraries", "algorithms"]:
for pattern in patterns[category]:
if re.search(pattern, line, re.IGNORECASE):
findings.append({
"file": str(filepath),
"line": line_num,
"category": category,
"pattern": pattern,
"content": line.strip(),
})
return findings
def scan_directory(root_dir, exclude_dirs=None):
"""Recursively scan a directory for cryptographic usage."""
if exclude_dirs is None:
exclude_dirs = {
"node_modules", ".git", "venv", "__pycache__",
".tox", "vendor", "dist", "build",
}
all_findings = []
files_scanned = 0
for dirpath, dirnames, filenames in os.walk(root_dir):
dirnames[:] = [d for d in dirnames if d not in exclude_dirs]
for filename in filenames:
ext = Path(filename).suffix
if ext in FILE_EXTENSIONS:
filepath = Path(dirpath) / filename
language = FILE_EXTENSIONS[ext]
findings = scan_file(filepath, language)
all_findings.extend(findings)
files_scanned += 1
return all_findings, files_scanned
def generate_report(findings, files_scanned, output_path=None):
"""Generate a structured inventory report."""
report = {
"scan_date": datetime.now().isoformat(),
"files_scanned": files_scanned,
"total_findings": len(findings),
"summary": defaultdict(int),
"vulnerable_algorithms": [],
"findings": findings,
}
quantum_vulnerable = {"RSA", "ECDSA", "ECDH", "Ed25519",
"X25519", "Diffie", "DH_"}
for finding in findings:
report["summary"][finding["category"]] += 1
for algo in quantum_vulnerable:
if algo.lower() in finding["content"].lower():
report["vulnerable_algorithms"].append({
"algorithm": algo,
"file": finding["file"],
"line": finding["line"],
})
report["summary"] = dict(report["summary"])
if output_path:
with open(output_path, "w") as f:
json.dump(report, f, indent=2)
print(f"Report written to {output_path}")
return report
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description="Scan codebase for cryptographic usage"
)
parser.add_argument("directory", help="Root directory to scan")
parser.add_argument(
"-o", "--output",
default="crypto_inventory.json",
help="Output report file path",
)
args = parser.parse_args()
findings, files_scanned = scan_directory(args.directory)
report = generate_report(findings, files_scanned, args.output)
print(f"\nCryptographic Inventory Summary")
print(f"{'=' * 40}")
print(f"Files scanned: {files_scanned}")
print(f"Total findings: {len(findings)}")
print(f"Quantum-vulnerable references: "
f"{len(report['vulnerable_algorithms'])}")

Scanning TLS Configurations
Use this bash script to scan your infrastructure for TLS configurations and certificate details:
#!/bin/bash
# tls_inventory.sh - Scan endpoints for TLS configuration details
ENDPOINTS_FILE="${1:-endpoints.txt}"
OUTPUT_DIR="tls_inventory_$(date +%Y%m%d)"
mkdir -p "$OUTPUT_DIR"
echo "TLS Cryptographic Inventory Scanner"
echo "===================================="
# Function to scan a single endpoint
scan_endpoint() {
local host="$1"
local port="${2:-443}"
local output_file="$OUTPUT_DIR/${host}_${port}.json"
echo "Scanning $host:$port..."
# Get certificate details
cert_info=$(echo | openssl s_client -connect "$host:$port" \
-servername "$host" 2>/dev/null)
# Extract key exchange and signature algorithms
protocol=$(echo "$cert_info" | grep "Protocol" | awk '{print $3}')
cipher=$(echo "$cert_info" | grep "Cipher" | head -1 | awk '{print $3}')
# Get certificate public key algorithm and size
cert_details=$(echo "$cert_info" | openssl x509 -noout -text 2>/dev/null)
pub_key_algo=$(echo "$cert_details" | grep "Public Key Algorithm" \
| awk -F: '{print $2}' | xargs)
pub_key_size=$(echo "$cert_details" | grep "Public-Key" \
| grep -o '[0-9]*')
sig_algo=$(echo "$cert_details" | grep "Signature Algorithm" \
| head -1 | awk -F: '{print $2}' | xargs)
expiry=$(echo "$cert_details" | grep "Not After" \
| awk -F: '{print $2":"$3":"$4}' | xargs)
# Determine quantum vulnerability
quantum_vulnerable="false"
case "$pub_key_algo" in
*rsa*|*RSA*|*ec*|*EC*|*ecdsa*|*ECDSA*)
quantum_vulnerable="true"
;;
esac
# Output JSON record
cat > "$output_file" <<EOF
{
"host": "$host",
"port": $port,
"protocol": "$protocol",
"cipher_suite": "$cipher",
"public_key_algorithm": "$pub_key_algo",
"public_key_size_bits": ${pub_key_size:-0},
"signature_algorithm": "$sig_algo",
"certificate_expiry": "$expiry",
"quantum_vulnerable": $quantum_vulnerable
}
EOF
if [ "$quantum_vulnerable" = "true" ]; then
echo " WARNING: $host uses quantum-vulnerable algorithm: $pub_key_algo"
fi
}
# Read endpoints from file (format: host:port or just host)
while IFS= read -r line; do
[[ "$line" =~ ^#.*$ || -z "$line" ]] && continue
host=$(echo "$line" | cut -d: -f1)
port=$(echo "$line" | cut -d: -f2 -s)
port="${port:-443}"
scan_endpoint "$host" "$port"
done < "$ENDPOINTS_FILE"
# Generate summary report
echo ""
echo "Generating summary report..."
python3 -c "
import json, glob, os
files = glob.glob('$OUTPUT_DIR/*.json')
results = []
for f in files:
with open(f) as fh:
results.append(json.load(fh))
vulnerable = [r for r in results if r.get('quantum_vulnerable')]
print(f'Total endpoints scanned: {len(results)}')
print(f'Quantum-vulnerable endpoints: {len(vulnerable)}')
for v in vulnerable:
print(f' - {v[\"host\"]}:{v[\"port\"]} '
f'({v[\"public_key_algorithm\"]}, {v[\"public_key_size_bits\"]}b)')
"
echo ""
echo "Detailed results saved to $OUTPUT_DIR/"

Scanning Key Stores and Certificates
#!/bin/bash
# scan_keystores.sh - Inventory all certificates and keys
echo "Certificate and Key Store Inventory"
echo "===================================="
# Scan Java keystores
echo ""
echo "--- Java Keystores ---"
find / -name "*.jks" -o -name "*.p12" -o -name "*.pfx" 2>/dev/null | \
while read -r keystore; do
echo "Found keystore: $keystore"
keytool -list -keystore "$keystore" -storepass changeit 2>/dev/null | \
grep -E "(Entry type|Certificate fingerprint|Owner)"
done
# Scan PEM certificates
echo ""
echo "--- PEM Certificates ---"
find /etc/ssl /etc/pki /opt -name "*.pem" -o -name "*.crt" \
-o -name "*.cert" 2>/dev/null | \
while read -r cert; do
echo "Certificate: $cert"
openssl x509 -in "$cert" -noout \
-subject -issuer -dates -pubkey 2>/dev/null | head -5
key_type=$(openssl x509 -in "$cert" -noout -text 2>/dev/null \
| grep "Public Key Algorithm" | awk -F: '{print $2}' | xargs)
echo " Key Algorithm: $key_type"
echo ""
done
# Scan SSH keys
echo ""
echo "--- SSH Keys ---"
find /home /root -name "id_rsa*" -o -name "id_ecdsa*" \
-o -name "id_ed25519*" 2>/dev/null | \
while read -r key; do
echo "SSH Key: $key"
ssh-keygen -l -f "$key" 2>/dev/null
done
# Check for AWS KMS keys (if AWS CLI is available)
if command -v aws &> /dev/null; then
echo ""
echo "--- AWS KMS Keys ---"
aws kms list-keys --query 'Keys[].KeyId' --output text 2>/dev/null | \
tr '\t' '\n' | while read -r key_id; do
key_info=$(aws kms describe-key --key-id "$key_id" \
--query 'KeyMetadata.{Algo:KeySpec,Usage:KeyUsage,State:KeyState}' \
--output json 2>/dev/null)
echo "KMS Key: $key_id"
echo " $key_info"
done
fi

Step 2: Risk Assessment and Prioritization
Not all cryptographic usage carries the same quantum risk. Prioritization prevents your migration from becoming an unfocused effort that tries to change everything at once.
Data Sensitivity and Lifespan Classification
The core question is: How long does this data need to remain confidential, and how long until a CRQC could break its encryption?
If data_confidentiality_requirement > time_to_CRQC, that data is at risk today.
#!/usr/bin/env python3
"""
pqc_risk_assessment.py
Framework for assessing post-quantum cryptographic risk
across systems and data classifications.
"""
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional
class Sensitivity(Enum):
PUBLIC = 1
INTERNAL = 2
CONFIDENTIAL = 3
RESTRICTED = 4
TOP_SECRET = 5
class MigrationPriority(Enum):
LOW = "low"
MEDIUM = "medium"
HIGH = "high"
CRITICAL = "critical"
@dataclass
class CryptoAsset:
name: str
system: str
algorithm: str
key_size: int
purpose: str # "encryption", "signing", "key_exchange", "hashing"
data_sensitivity: Sensitivity
data_lifespan_years: int
protocol: Optional[str] = None # "TLS", "SSH", "IPsec", etc.
notes: str = ""
dependencies: list = field(default_factory=list)
def assess_quantum_risk(asset: CryptoAsset,
crqc_timeline_years: int = 15) -> dict:
"""
Assess the quantum risk for a cryptographic asset.
Args:
asset: The cryptographic asset to assess
crqc_timeline_years: Estimated years until a CRQC is available
Returns:
Risk assessment dictionary with priority and reasoning.
"""
# Algorithms vulnerable to quantum attacks
quantum_vulnerable_algorithms = {
"RSA", "ECDSA", "ECDH", "Ed25519", "X25519",
"DSA", "DH", "Diffie-Hellman", "ElGamal",
"secp256r1", "secp384r1", "P-256", "P-384", "P-521",
}
# Check if the algorithm is quantum-vulnerable
is_vulnerable = any(
vuln.lower() in asset.algorithm.lower()
for vuln in quantum_vulnerable_algorithms
)
if not is_vulnerable:
return {
"asset": asset.name,
"priority": MigrationPriority.LOW,
"reason": f"{asset.algorithm} is not vulnerable to "
f"known quantum attacks",
"action": "Monitor for future developments",
}
# Calculate the risk window
# Data that must remain secret longer than the CRQC timeline
# is at risk from harvest-now-decrypt-later
risk_window = asset.data_lifespan_years - crqc_timeline_years
# Determine priority based on sensitivity and risk window
if risk_window > 5 and asset.data_sensitivity.value >= 4:
priority = MigrationPriority.CRITICAL
reason = (
f"Data lifespan ({asset.data_lifespan_years}y) far exceeds "
f"CRQC timeline ({crqc_timeline_years}y). "
f"Sensitivity: {asset.data_sensitivity.name}. "
f"Harvest-now-decrypt-later risk is immediate."
)
elif risk_window > 0 and asset.data_sensitivity.value >= 3:
priority = MigrationPriority.HIGH
reason = (
f"Data lifespan ({asset.data_lifespan_years}y) exceeds "
f"CRQC timeline ({crqc_timeline_years}y). "
f"Migration should begin within 12 months."
)
elif asset.data_sensitivity.value >= 3:
priority = MigrationPriority.MEDIUM
reason = (
f"Sensitive data with moderate lifespan. "
f"Plan migration within 24 months."
)
else:
priority = MigrationPriority.LOW
reason = (
f"Lower sensitivity or short data lifespan. "
f"Include in general migration timeline."
)
return {
"asset": asset.name,
"system": asset.system,
"current_algorithm": asset.algorithm,
"priority": priority,
"risk_window_years": risk_window,
"reason": reason,
"recommended_replacement": get_pqc_replacement(asset),
}
def get_pqc_replacement(asset: CryptoAsset) -> str:
"""Recommend a PQC replacement based on the asset's purpose."""
replacements = {
"key_exchange": "ML-KEM-768 (hybrid with X25519 initially)",
"encryption": "ML-KEM-768 for key wrapping + AES-256-GCM",
"signing": "ML-DSA-65 (hybrid with ECDSA initially)",
"hashing": "No change needed - SHA-256/SHA-3 are quantum-resistant "
"(use SHA-384+ for Grover resistance margin)",
}
return replacements.get(asset.purpose, "Consult PQC migration guide")
# Example usage: Assess a real inventory
if __name__ == "__main__":
inventory = [
CryptoAsset(
name="API Gateway TLS",
system="api-gateway-prod",
algorithm="ECDHE-ECDSA-AES256-GCM-SHA384",
key_size=256,
purpose="key_exchange",
data_sensitivity=Sensitivity.CONFIDENTIAL,
data_lifespan_years=7,
protocol="TLS 1.3",
),
CryptoAsset(
name="Patient Records Encryption",
system="healthcare-ehr",
algorithm="RSA-2048",
key_size=2048,
purpose="encryption",
data_sensitivity=Sensitivity.RESTRICTED,
data_lifespan_years=50,
protocol="TLS 1.2",
notes="HIPAA requirement: records retained indefinitely",
),
CryptoAsset(
name="Code Signing Certificate",
system="ci-cd-pipeline",
algorithm="ECDSA-P256",
key_size=256,
purpose="signing",
data_sensitivity=Sensitivity.CONFIDENTIAL,
data_lifespan_years=10,
),
CryptoAsset(
name="Internal Service Mesh mTLS",
system="kubernetes-cluster",
algorithm="ECDSA-P256",
key_size=256,
purpose="key_exchange",
data_sensitivity=Sensitivity.INTERNAL,
data_lifespan_years=1,
protocol="mTLS",
),
]
print("Post-Quantum Risk Assessment Report")
print("=" * 50)
for asset in inventory:
result = assess_quantum_risk(asset)
print(f"\nAsset: {result['asset']}")
print(f" System: {result.get('system', 'N/A')}")
print(f" Current: {result.get('current_algorithm', 'N/A')}")
print(f" Priority: {result['priority'].value}")
print(f" Risk Window: {result.get('risk_window_years', 'N/A')} years")
print(f" Reason: {result['reason']}")
if 'recommended_replacement' in result:
print(f" Replacement: {result['recommended_replacement']}")

Risk Matrix Quick Reference
| Data Type | Sensitivity | Lifespan | Quantum Risk | Migration Priority |
|---|---|---|---|---|
| Healthcare records (HIPAA) | Restricted | 50+ years | Critical | Immediate |
| Financial PII | Restricted | 20+ years | Critical | Immediate |
| Government classified | Top Secret | Indefinite | Critical | Immediate |
| Legal communications | Confidential | 30+ years | High | Within 12 months |
| Trade secrets | Restricted | 10-20 years | High | Within 12 months |
| Customer PII | Confidential | 7-10 years | Medium | Within 24 months |
| Session tokens | Internal | Hours | Low | Standard timeline |
| Public API responses | Public | None | Low | Standard timeline |
Step 3: Building Crypto-Agility
Crypto-agility is the ability to swap cryptographic algorithms without rewriting application code. This is not optional - it is the single most important architectural decision for PQC readiness. Even after you deploy ML-KEM and ML-DSA, algorithms will continue to evolve, and your systems must be able to adapt.
Configuration-Driven Crypto Selection in Python
"""
crypto_agile.py
A crypto-agility layer that allows algorithm selection via configuration.
"""
import json
from abc import ABC, abstractmethod
from pathlib import Path
class KeyEncapsulation(ABC):
"""Abstract interface for key encapsulation mechanisms."""
@abstractmethod
def generate_keypair(self) -> tuple:
pass
@abstractmethod
def encapsulate(self, public_key: bytes) -> tuple:
pass
@abstractmethod
def decapsulate(self, ciphertext: bytes,
private_key: bytes) -> bytes:
pass
class ClassicalECDH(KeyEncapsulation):
"""Classical ECDH key exchange using the cryptography library."""
    def generate_keypair(self) -> tuple:
        from cryptography.hazmat.primitives.asymmetric.x25519 import (
            X25519PrivateKey,
        )
        private_key = X25519PrivateKey.generate()
        # Return raw bytes so encapsulate/decapsulate stay bytes-based
        return (private_key.private_bytes_raw(),
                private_key.public_key().public_bytes_raw())
    def encapsulate(self, public_key: bytes) -> tuple:
        from cryptography.hazmat.primitives.asymmetric.x25519 import (
            X25519PrivateKey, X25519PublicKey,
        )
        ephemeral_private = X25519PrivateKey.generate()
        shared_secret = ephemeral_private.exchange(
            X25519PublicKey.from_public_bytes(public_key)
        )
        ephemeral_public = (
            ephemeral_private.public_key().public_bytes_raw()
        )
        return ephemeral_public, shared_secret
    def decapsulate(self, ciphertext: bytes,
                    private_key: bytes) -> bytes:
        from cryptography.hazmat.primitives.asymmetric.x25519 import (
            X25519PrivateKey, X25519PublicKey,
        )
        return X25519PrivateKey.from_private_bytes(private_key).exchange(
            X25519PublicKey.from_public_bytes(ciphertext)
        )
class PostQuantumKEM(KeyEncapsulation):
"""Post-quantum KEM using liboqs."""
def __init__(self, algorithm: str = "Kyber768"):
self.algorithm = algorithm
def generate_keypair(self) -> tuple:
import oqs
kem = oqs.KeyEncapsulation(self.algorithm)
public_key = kem.generate_keypair()
return kem, public_key
def encapsulate(self, public_key: bytes) -> tuple:
import oqs
kem = oqs.KeyEncapsulation(self.algorithm)
ciphertext, shared_secret = kem.encap_secret(public_key)
return ciphertext, shared_secret
def decapsulate(self, ciphertext: bytes,
private_key) -> bytes:
return private_key.decap_secret(ciphertext)
class HybridKEM(KeyEncapsulation):
"""
Hybrid KEM combining classical and post-quantum algorithms.
The shared secret is derived from both, so security holds
as long as at least one algorithm remains unbroken.
"""
def __init__(self, classical: KeyEncapsulation,
post_quantum: KeyEncapsulation):
self.classical = classical
self.post_quantum = post_quantum
def generate_keypair(self) -> tuple:
classical_keys = self.classical.generate_keypair()
pq_keys = self.post_quantum.generate_keypair()
return (classical_keys, pq_keys)
def encapsulate(self, public_keys: tuple) -> tuple:
import hashlib
classical_pub, pq_pub = public_keys
c_ct, c_ss = self.classical.encapsulate(classical_pub)
pq_ct, pq_ss = self.post_quantum.encapsulate(pq_pub)
# Combine shared secrets using a KDF
combined_secret = hashlib.sha384(c_ss + pq_ss).digest()
combined_ciphertext = (c_ct, pq_ct)
return combined_ciphertext, combined_secret
def decapsulate(self, ciphertext: tuple,
private_keys: tuple) -> bytes:
import hashlib
classical_priv, pq_priv = private_keys
c_ct, pq_ct = ciphertext
c_ss = self.classical.decapsulate(c_ct, classical_priv)
pq_ss = self.post_quantum.decapsulate(pq_ct, pq_priv)
return hashlib.sha384(c_ss + pq_ss).digest()
# Configuration-driven factory
ALGORITHM_REGISTRY = {
"x25519": ClassicalECDH,
"ml-kem-768": lambda: PostQuantumKEM("Kyber768"),
"ml-kem-1024": lambda: PostQuantumKEM("Kyber1024"),
"hybrid-x25519-ml-kem-768": lambda: HybridKEM(
ClassicalECDH(), PostQuantumKEM("Kyber768")
),
}
def load_crypto_config(config_path: str = "crypto_config.json"):
"""Load cryptographic algorithm configuration from file."""
with open(config_path) as f:
return json.load(f)
def get_kem(config: dict) -> KeyEncapsulation:
"""
Get a KEM instance based on configuration.
This is the key to crypto-agility: algorithm selection
happens at configuration time, not in application code.
"""
algo_name = config.get("kem_algorithm", "x25519")
factory = ALGORITHM_REGISTRY.get(algo_name)
if factory is None:
raise ValueError(f"Unknown KEM algorithm: {algo_name}")
    return factory()

The corresponding configuration file:
{
"kem_algorithm": "hybrid-x25519-ml-kem-768",
"signature_algorithm": "hybrid-ecdsa-ml-dsa-65",
"min_security_level": 3,
"allowed_algorithms": [
"x25519",
"ml-kem-768",
"ml-kem-1024",
"hybrid-x25519-ml-kem-768"
],
"migration_phase": "hybrid",
"fallback_algorithm": "x25519"
}

Crypto-Agility in Go
package crypto
import (
"crypto/ecdh"
"crypto/rand"
"crypto/sha512"
"encoding/json"
"fmt"
"os"
)
// KEMAlgorithm defines the interface for key encapsulation mechanisms.
type KEMAlgorithm interface {
GenerateKeyPair() (publicKey, privateKey []byte, err error)
Encapsulate(publicKey []byte) (ciphertext, sharedSecret []byte, err error)
Decapsulate(ciphertext, privateKey []byte) (sharedSecret []byte, err error)
Name() string
}
// CryptoConfig holds the runtime cryptographic configuration.
type CryptoConfig struct {
KEMAlgorithm string `json:"kem_algorithm"`
SignatureAlgorithm string `json:"signature_algorithm"`
MinSecurityLevel int `json:"min_security_level"`
AllowedAlgorithms []string `json:"allowed_algorithms"`
MigrationPhase string `json:"migration_phase"`
}
// X25519KEM implements classical ECDH key exchange.
type X25519KEM struct{}
func (k *X25519KEM) Name() string { return "x25519" }
func (k *X25519KEM) GenerateKeyPair() ([]byte, []byte, error) {
curve := ecdh.X25519()
privateKey, err := curve.GenerateKey(rand.Reader)
if err != nil {
return nil, nil, fmt.Errorf("key generation failed: %w", err)
}
return privateKey.PublicKey().Bytes(), privateKey.Bytes(), nil
}
func (k *X25519KEM) Encapsulate(publicKey []byte) ([]byte, []byte, error) {
curve := ecdh.X25519()
// Generate ephemeral keypair
ephemeralPrivate, err := curve.GenerateKey(rand.Reader)
if err != nil {
return nil, nil, err
}
// Parse peer public key
peerPublic, err := curve.NewPublicKey(publicKey)
if err != nil {
return nil, nil, err
}
// Perform ECDH
sharedSecret, err := ephemeralPrivate.ECDH(peerPublic)
if err != nil {
return nil, nil, err
}
return ephemeralPrivate.PublicKey().Bytes(), sharedSecret, nil
}
func (k *X25519KEM) Decapsulate(ciphertext, privateKeyBytes []byte) ([]byte, error) {
curve := ecdh.X25519()
privateKey, err := curve.NewPrivateKey(privateKeyBytes)
if err != nil {
return nil, err
}
peerPublic, err := curve.NewPublicKey(ciphertext)
if err != nil {
return nil, err
}
return privateKey.ECDH(peerPublic)
}
// HybridKEM combines two KEM algorithms for defense in depth.
type HybridKEM struct {
Classical KEMAlgorithm
PostQuantum KEMAlgorithm
}
func (h *HybridKEM) Name() string {
return fmt.Sprintf("hybrid-%s-%s",
h.Classical.Name(), h.PostQuantum.Name())
}
func (h *HybridKEM) GenerateKeyPair() ([]byte, []byte, error) {
// In production, serialize both keypairs into a structured format
cPub, cPriv, err := h.Classical.GenerateKeyPair()
if err != nil {
return nil, nil, err
}
pqPub, pqPriv, err := h.PostQuantum.GenerateKeyPair()
if err != nil {
return nil, nil, err
}
	// Simple concatenation; production code should length-prefix
	// each key so the halves can be split again.
	return append(cPub, pqPub...), append(cPriv, pqPriv...), nil
}
func (h *HybridKEM) Encapsulate(publicKey []byte) ([]byte, []byte, error) {
// Split public key and encapsulate with both algorithms
// Combine shared secrets with a KDF
// This is simplified - production code needs proper serialization
return nil, nil, fmt.Errorf("implement with proper key serialization")
}
func (h *HybridKEM) Decapsulate(ciphertext, privateKey []byte) ([]byte, error) {
return nil, fmt.Errorf("implement with proper key serialization")
}
// LoadConfig reads crypto configuration from a JSON file.
func LoadConfig(path string) (*CryptoConfig, error) {
data, err := os.ReadFile(path)
if err != nil {
return nil, fmt.Errorf("failed to read config: %w", err)
}
var config CryptoConfig
if err := json.Unmarshal(data, &config); err != nil {
return nil, fmt.Errorf("failed to parse config: %w", err)
}
return &config, nil
}
// NewKEM creates a KEM instance based on configuration.
// This factory function is the key to crypto-agility.
func NewKEM(config *CryptoConfig) (KEMAlgorithm, error) {
switch config.KEMAlgorithm {
case "x25519":
return &X25519KEM{}, nil
// Add PQC implementations as they become available in Go
// case "ml-kem-768":
// return &MLKEM768{}, nil
default:
return nil, fmt.Errorf("unsupported KEM algorithm: %s",
config.KEMAlgorithm)
}
}
// CombineSharedSecrets derives a combined secret from
// classical and post-quantum shared secrets.
func CombineSharedSecrets(classical, postQuantum []byte) []byte {
	// Copy into a fresh buffer so the caller's slice is never
	// mutated; production code should prefer HKDF with a
	// domain-separation label over a bare hash
	combined := make([]byte, 0, len(classical)+len(postQuantum))
	combined = append(combined, classical...)
	combined = append(combined, postQuantum...)
hash := sha512.Sum384(combined)
return hash[:]
}

Step 4: Hybrid Deployments
Hybrid mode runs a classical and a post-quantum algorithm simultaneously. This is the recommended transition strategy because it maintains backward compatibility while adding quantum resistance: the combined exchange remains secure as long as at least one of the two algorithms does.
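The "secure if either holds" property comes from feeding both shared secrets through a single KDF, so the session key depends on each contribution. A minimal stdlib sketch of that combiner (the names `hkdf_sha384` and `combine` are illustrative, not from any library):

```python
import hmac
import hashlib


def hkdf_sha384(ikm: bytes, info: bytes, length: int = 48) -> bytes:
    """Single-block HKDF-SHA384 (RFC 5869) with an empty salt."""
    prk = hmac.new(b"\x00" * 48, ikm, hashlib.sha384).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha384).digest()[:length]


def combine(classical: bytes, post_quantum: bytes) -> bytes:
    # Concatenation order and the info label must match on both peers
    return hkdf_sha384(classical + post_quantum, b"hybrid-kex-demo")


classical = b"\x01" * 32     # stand-in for an X25519 shared secret
post_quantum = b"\x02" * 32  # stand-in for an ML-KEM-768 shared secret

key = combine(classical, post_quantum)
# Changing either contribution yields a different session key, so an
# attacker must break both algorithms to recover it
assert key != combine(b"\x00" * 32, post_quantum)
assert key != combine(classical, b"\x00" * 32)
```

The same idea appears below at production scale: TLS hybrid groups and the application-level exchange both derive one key from two independent secrets.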
Configuring Hybrid TLS with OpenSSL 3.x and OQS Provider
The Open Quantum Safe project provides an OpenSSL provider that adds PQC algorithm support:
# Install the OQS provider for OpenSSL 3.x
# Build from source for production use
# Clone and build liboqs
git clone https://github.com/open-quantum-safe/liboqs.git
cd liboqs
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/opt/oqs \
-DOQS_USE_OPENSSL=ON \
-DBUILD_SHARED_LIBS=ON ..
make -j$(nproc)
sudo make install
# Clone and build the OQS provider for OpenSSL 3.x
cd ../..
git clone https://github.com/open-quantum-safe/oqs-provider.git
cd oqs-provider
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/opt/oqs \
-Dliboqs_DIR=/opt/oqs/lib/cmake/liboqs ..
make -j$(nproc)
sudo make install
# Copy the provider to OpenSSL's provider directory
sudo cp /opt/oqs/lib/ossl-modules/oqsprovider.so \
  $(openssl version -d | cut -d'"' -f2)/ossl-modules/

OpenSSL configuration for hybrid TLS:
# /etc/ssl/openssl_pqc.cnf
# OpenSSL configuration with PQC hybrid support
openssl_conf = openssl_init
[openssl_init]
providers = provider_sect
ssl_conf = ssl_sect
[provider_sect]
default = default_sect
oqsprovider = oqs_sect
[default_sect]
activate = 1
[oqs_sect]
activate = 1
module = /opt/oqs/lib/ossl-modules/oqsprovider.so
[ssl_sect]
system_default = system_default_sect
[system_default_sect]
# Hybrid key exchange: combines X25519 with ML-KEM-768
# The connection is secure if either algorithm holds
Groups = x25519_mlkem768:x25519_mlkem1024:x25519:secp384r1
# Signature algorithms for certificate verification
SignatureAlgorithms = mldsa65:ecdsa_secp384r1_sha384:rsa_pss_rsae_sha384
# Minimum TLS version
MinProtocol = TLSv1.3
# Disable older cipher suites
CipherSuites = TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256

Nginx configuration for hybrid TLS:
# /etc/nginx/conf.d/pqc_tls.conf
# Nginx configuration with hybrid post-quantum TLS
server {
listen 443 ssl;
server_name api.example.com;
# Use the PQC-enabled OpenSSL configuration
ssl_conf_command Options ServerPreference;
# Certificate signed with hybrid or classical algorithm
ssl_certificate /etc/ssl/certs/server_hybrid.pem;
ssl_certificate_key /etc/ssl/private/server_hybrid.key;
# TLS 1.3 only for PQC support
ssl_protocols TLSv1.3;
# Hybrid key exchange groups (requires OQS provider)
# x25519_mlkem768 = X25519 + ML-KEM-768 hybrid
ssl_conf_command Groups x25519_mlkem768:x25519_mlkem1024:x25519;
# Signature algorithms
ssl_conf_command SignatureAlgorithms mldsa65:ecdsa_secp384r1_sha384;
# HSTS header
add_header Strict-Transport-Security "max-age=63072000" always;
location / {
proxy_pass http://backend;
}
}

Generating Hybrid Certificates
#!/bin/bash
# generate_hybrid_certs.sh
# Generate a certificate chain using hybrid PQC algorithms
export OPENSSL_CONF=/etc/ssl/openssl_pqc.cnf
CERT_DIR="/etc/ssl/pqc_certs"
mkdir -p "$CERT_DIR"
echo "Generating PQC Hybrid Certificate Chain"
echo "========================================"
# Step 1: Generate Root CA with ML-DSA-65
echo "Creating Root CA with ML-DSA-65..."
openssl genpkey -algorithm mldsa65 \
-out "$CERT_DIR/root_ca.key"
openssl req -new -x509 \
-key "$CERT_DIR/root_ca.key" \
-out "$CERT_DIR/root_ca.crt" \
-days 3650 \
-subj "/C=US/O=Example Corp/CN=PQC Root CA" \
-addext "basicConstraints=critical,CA:TRUE" \
-addext "keyUsage=critical,keyCertSign,cRLSign"
echo "Root CA created."
# Step 2: Generate Intermediate CA
echo "Creating Intermediate CA with ML-DSA-65..."
openssl genpkey -algorithm mldsa65 \
-out "$CERT_DIR/intermediate_ca.key"
openssl req -new \
-key "$CERT_DIR/intermediate_ca.key" \
-out "$CERT_DIR/intermediate_ca.csr" \
-subj "/C=US/O=Example Corp/CN=PQC Intermediate CA"
openssl x509 -req \
-in "$CERT_DIR/intermediate_ca.csr" \
-CA "$CERT_DIR/root_ca.crt" \
-CAkey "$CERT_DIR/root_ca.key" \
-CAcreateserial \
-out "$CERT_DIR/intermediate_ca.crt" \
-days 1825 \
-extfile <(printf "basicConstraints=critical,CA:TRUE,pathlen:0\n\
keyUsage=critical,keyCertSign,cRLSign")
echo "Intermediate CA created."
# Step 3: Generate Server Certificate
echo "Creating server certificate with ML-DSA-65..."
openssl genpkey -algorithm mldsa65 \
-out "$CERT_DIR/server.key"
openssl req -new \
-key "$CERT_DIR/server.key" \
-out "$CERT_DIR/server.csr" \
-subj "/C=US/O=Example Corp/CN=api.example.com"
openssl x509 -req \
-in "$CERT_DIR/server.csr" \
-CA "$CERT_DIR/intermediate_ca.crt" \
-CAkey "$CERT_DIR/intermediate_ca.key" \
-CAcreateserial \
-out "$CERT_DIR/server.crt" \
-days 365 \
-extfile <(printf "subjectAltName=DNS:api.example.com,\
DNS:*.api.example.com\nkeyUsage=critical,digitalSignature\n\
extendedKeyUsage=serverAuth")
echo "Server certificate created."
# Step 4: Create the certificate chain file
cat "$CERT_DIR/server.crt" \
"$CERT_DIR/intermediate_ca.crt" \
> "$CERT_DIR/server_chain.pem"
echo ""
echo "Certificate chain generated in $CERT_DIR/"
echo "Verifying chain..."
openssl verify -CAfile "$CERT_DIR/root_ca.crt" \
-untrusted "$CERT_DIR/intermediate_ca.crt" \
"$CERT_DIR/server.crt"Hybrid Key Exchange in Application Code
"""
hybrid_key_exchange.py
Demonstrates a hybrid key exchange combining X25519 and ML-KEM-768.
"""
import hashlib
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import (
X25519PrivateKey,
)
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
try:
import oqs
PQC_AVAILABLE = True
except ImportError:
PQC_AVAILABLE = False
print("Warning: liboqs not available. PQC disabled.")
def hybrid_key_exchange_initiator(peer_classical_pub, peer_pqc_pub):
"""
Initiator side of a hybrid key exchange.
Combines X25519 ECDH with ML-KEM-768 encapsulation.
Returns:
classical_ephemeral_pub: Ephemeral X25519 public key
pqc_ciphertext: ML-KEM ciphertext
shared_secret: Combined 48-byte shared secret
"""
# Classical X25519 key exchange
ephemeral_private = X25519PrivateKey.generate()
classical_shared = ephemeral_private.exchange(peer_classical_pub)
classical_ephemeral_pub = ephemeral_private.public_key()
    if PQC_AVAILABLE:
        # Post-quantum ML-KEM-768 encapsulation ("Kyber768" is the
        # pre-standardization name used by liboqs)
        kem = oqs.KeyEncapsulation("Kyber768")
pqc_ciphertext, pqc_shared = kem.encap_secret(peer_pqc_pub)
    else:
        # Fallback: both peers must contribute identical bytes here,
        # or the derived keys will not match; an empty PQ share makes
        # the exchange classical-only instead of silently broken
        pqc_ciphertext = b""
        pqc_shared = b""
# Combine shared secrets using HKDF
combined_input = classical_shared + pqc_shared
shared_secret = HKDF(
algorithm=hashes.SHA384(),
length=48,
salt=None,
info=b"hybrid-x25519-mlkem768-key-exchange",
).derive(combined_input)
return classical_ephemeral_pub, pqc_ciphertext, shared_secret
def hybrid_key_exchange_responder(
ephemeral_classical_pub,
pqc_ciphertext,
own_classical_private,
own_pqc_kem,
):
"""
Responder side of a hybrid key exchange.
Returns:
shared_secret: Combined 48-byte shared secret
"""
# Classical X25519 key exchange
classical_shared = own_classical_private.exchange(
ephemeral_classical_pub
)
if PQC_AVAILABLE and own_pqc_kem is not None:
# Post-quantum ML-KEM-768 decapsulation
pqc_shared = own_pqc_kem.decap_secret(pqc_ciphertext)
    else:
        # Must mirror the initiator's fallback exactly
        pqc_shared = b""
# Combine with the same KDF parameters
combined_input = classical_shared + pqc_shared
shared_secret = HKDF(
algorithm=hashes.SHA384(),
length=48,
salt=None,
info=b"hybrid-x25519-mlkem768-key-exchange",
).derive(combined_input)
    return shared_secret

Step 5: Testing and Validation
PQC algorithms have fundamentally different performance characteristics from classical cryptography: keys, ciphertexts, and signatures are significantly larger, and computational costs differ. Thorough testing is essential before production deployment.
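Before benchmarking CPU cost, it helps to quantify the wire-size delta. A quick calculation from the published ML-KEM-768 sizes (FIPS 203) and the 32-byte X25519 share shows where the roughly 2 KB of extra handshake data comes from:

```python
# Back-of-the-envelope handshake overhead for the hybrid group
# x25519_mlkem768, using published parameter sizes
X25519_SHARE = 32        # bytes per X25519 key share
MLKEM768_PUBKEY = 1184   # ML-KEM-768 encapsulation key (client share)
MLKEM768_CT = 1088       # ML-KEM-768 ciphertext (server share)

client_share = X25519_SHARE + MLKEM768_PUBKEY
server_share = X25519_SHARE + MLKEM768_CT
added = MLKEM768_PUBKEY + MLKEM768_CT  # extra bytes vs. pure X25519

print(f"client key share: {client_share} B")
print(f"server key share: {server_share} B")
print(f"extra over pure X25519: {added} B (~{added / 1024:.1f} KiB)")
```

That 2,272-byte overhead is why the compatibility matrix below budgets handshake size as well as latency.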
Performance Benchmarking
#!/usr/bin/env python3
"""
pqc_benchmark.py
Benchmark PQC algorithms against classical counterparts.
Measures key generation, encapsulation/signing, and
decapsulation/verification times.
"""
import time
import statistics
import json
from dataclasses import dataclass, asdict
@dataclass
class BenchmarkResult:
algorithm: str
operation: str
iterations: int
mean_ms: float
median_ms: float
p95_ms: float
p99_ms: float
key_size_bytes: int
output_size_bytes: int
def benchmark_operation(func, iterations=1000):
"""Run a function multiple times and collect timing data."""
    times = []
    for _ in range(iterations):
        start = time.perf_counter()
        func()  # return value is discarded; only timing matters
        elapsed = (time.perf_counter() - start) * 1000  # ms
        times.append(elapsed)
    times.sort()
    return {
        "mean": statistics.mean(times),
        "median": statistics.median(times),
        "p95": times[int(0.95 * len(times))],
        "p99": times[int(0.99 * len(times))],
    }
def benchmark_classical_ecdh(iterations=1000):
"""Benchmark X25519 key exchange."""
from cryptography.hazmat.primitives.asymmetric.x25519 import (
X25519PrivateKey,
)
# Key generation
keygen_stats = benchmark_operation(
X25519PrivateKey.generate, iterations
)
# Key exchange
static_private = X25519PrivateKey.generate()
static_public = static_private.public_key()
def do_exchange():
ephemeral = X25519PrivateKey.generate()
return ephemeral.exchange(static_public)
exchange_stats = benchmark_operation(do_exchange, iterations)
pub_bytes = static_public.public_bytes_raw()
return [
BenchmarkResult(
algorithm="X25519",
operation="keygen",
iterations=iterations,
mean_ms=keygen_stats["mean"],
median_ms=keygen_stats["median"],
p95_ms=keygen_stats["p95"],
p99_ms=keygen_stats["p99"],
key_size_bytes=len(pub_bytes),
output_size_bytes=len(pub_bytes),
),
BenchmarkResult(
algorithm="X25519",
operation="key_exchange",
iterations=iterations,
mean_ms=exchange_stats["mean"],
median_ms=exchange_stats["median"],
p95_ms=exchange_stats["p95"],
p99_ms=exchange_stats["p99"],
key_size_bytes=len(pub_bytes),
output_size_bytes=32,
),
]
def benchmark_pqc_kem(algorithm="Kyber768", iterations=1000):
"""Benchmark a PQC KEM algorithm."""
import oqs
# Key generation
def do_keygen():
kem = oqs.KeyEncapsulation(algorithm)
return kem.generate_keypair()
keygen_stats = benchmark_operation(do_keygen, iterations)
# Encapsulation
kem = oqs.KeyEncapsulation(algorithm)
public_key = kem.generate_keypair()
def do_encap():
enc = oqs.KeyEncapsulation(algorithm)
return enc.encap_secret(public_key)
encap_stats = benchmark_operation(do_encap, iterations)
# Decapsulation
ciphertext, _ = oqs.KeyEncapsulation(algorithm).encap_secret(
public_key
)
def do_decap():
return kem.decap_secret(ciphertext)
decap_stats = benchmark_operation(do_decap, iterations)
return [
BenchmarkResult(
algorithm=algorithm,
operation="keygen",
iterations=iterations,
mean_ms=keygen_stats["mean"],
median_ms=keygen_stats["median"],
p95_ms=keygen_stats["p95"],
p99_ms=keygen_stats["p99"],
key_size_bytes=len(public_key),
output_size_bytes=len(public_key),
),
BenchmarkResult(
algorithm=algorithm,
operation="encapsulation",
iterations=iterations,
mean_ms=encap_stats["mean"],
median_ms=encap_stats["median"],
p95_ms=encap_stats["p95"],
p99_ms=encap_stats["p99"],
key_size_bytes=len(public_key),
output_size_bytes=len(ciphertext),
),
BenchmarkResult(
algorithm=algorithm,
operation="decapsulation",
iterations=iterations,
mean_ms=decap_stats["mean"],
median_ms=decap_stats["median"],
p95_ms=decap_stats["p95"],
p99_ms=decap_stats["p99"],
key_size_bytes=len(public_key),
output_size_bytes=32,
),
]
def benchmark_signature(algorithm="Dilithium3", iterations=1000):
"""Benchmark a PQC signature algorithm."""
import oqs
# Key generation
def do_keygen():
sig = oqs.Signature(algorithm)
return sig.generate_keypair()
keygen_stats = benchmark_operation(do_keygen, iterations)
# Signing
sig = oqs.Signature(algorithm)
public_key = sig.generate_keypair()
message = b"Benchmark test message for PQC signature verification"
def do_sign():
return sig.sign(message)
sign_stats = benchmark_operation(do_sign, iterations)
signature = sig.sign(message)
# Verification
def do_verify():
v = oqs.Signature(algorithm)
return v.verify(message, signature, public_key)
verify_stats = benchmark_operation(do_verify, iterations)
return [
BenchmarkResult(
algorithm=algorithm,
operation="keygen",
iterations=iterations,
mean_ms=keygen_stats["mean"],
median_ms=keygen_stats["median"],
p95_ms=keygen_stats["p95"],
p99_ms=keygen_stats["p99"],
key_size_bytes=len(public_key),
output_size_bytes=len(public_key),
),
BenchmarkResult(
algorithm=algorithm,
operation="sign",
iterations=iterations,
mean_ms=sign_stats["mean"],
median_ms=sign_stats["median"],
p95_ms=sign_stats["p95"],
p99_ms=sign_stats["p99"],
key_size_bytes=len(public_key),
output_size_bytes=len(signature),
),
BenchmarkResult(
algorithm=algorithm,
operation="verify",
iterations=iterations,
mean_ms=verify_stats["mean"],
median_ms=verify_stats["median"],
p95_ms=verify_stats["p95"],
p99_ms=verify_stats["p99"],
key_size_bytes=len(public_key),
output_size_bytes=0,
),
]
if __name__ == "__main__":
all_results = []
print("PQC Performance Benchmark")
print("=" * 60)
# Classical benchmarks
print("\nBenchmarking X25519...")
all_results.extend(benchmark_classical_ecdh())
# PQC KEM benchmarks
for algo in ["Kyber512", "Kyber768", "Kyber1024"]:
print(f"Benchmarking {algo}...")
all_results.extend(benchmark_pqc_kem(algo))
# PQC signature benchmarks
for algo in ["Dilithium2", "Dilithium3", "Dilithium5"]:
print(f"Benchmarking {algo}...")
all_results.extend(benchmark_signature(algo))
# Print results table
print(f"\n{'Algorithm':<20} {'Operation':<16} {'Mean(ms)':<10} "
f"{'P95(ms)':<10} {'Key(B)':<10} {'Output(B)':<10}")
print("-" * 76)
for r in all_results:
print(f"{r.algorithm:<20} {r.operation:<16} {r.mean_ms:<10.3f} "
f"{r.p95_ms:<10.3f} {r.key_size_bytes:<10} "
f"{r.output_size_bytes:<10}")
# Save full results to JSON
with open("pqc_benchmark_results.json", "w") as f:
json.dump([asdict(r) for r in all_results], f, indent=2)
    print("\nFull results saved to pqc_benchmark_results.json")

TLS Handshake Testing
#!/bin/bash
# test_pqc_tls.sh
# Test PQC TLS handshake and measure performance
SERVER="api.example.com"
PORT=443
ITERATIONS=100
export OPENSSL_CONF=/etc/ssl/openssl_pqc.cnf
echo "PQC TLS Handshake Test"
echo "======================"
echo "Server: $SERVER:$PORT"
echo "Iterations: $ITERATIONS"
echo ""
# Test each key exchange group
for GROUP in "x25519" "x25519_mlkem768" "x25519_mlkem1024"; do
echo "Testing group: $GROUP"
success=0
fail=0
total_time=0
for i in $(seq 1 $ITERATIONS); do
start_time=$(python3 -c "import time; print(time.time())")
result=$(echo | openssl s_client \
-connect "$SERVER:$PORT" \
-servername "$SERVER" \
-groups "$GROUP" \
-brief 2>&1)
end_time=$(python3 -c "import time; print(time.time())")
if echo "$result" | grep -q "CONNECTION ESTABLISHED"; then
success=$((success + 1))
elapsed=$(python3 -c \
"print(($end_time - $start_time) * 1000)")
total_time=$(python3 -c \
"print($total_time + $elapsed)")
else
fail=$((fail + 1))
fi
done
if [ "$success" -gt 0 ]; then
avg_time=$(python3 -c "print($total_time / $success)")
echo " Success: $success/$ITERATIONS"
echo " Average handshake time: ${avg_time}ms"
else
echo " All handshakes failed for $GROUP"
fi
echo ""
done
# Verify the negotiated parameters
echo "Negotiated Parameters (hybrid):"
echo | openssl s_client \
-connect "$SERVER:$PORT" \
-servername "$SERVER" \
-groups "x25519_mlkem768" 2>/dev/null | \
  grep -E "(Protocol|Cipher|Server Temp Key|Peer signing)"

Compatibility Test Matrix
# pqc_compatibility_tests.yaml
# Test matrix for PQC compatibility across clients and servers
test_matrix:
clients:
- name: "Chrome 124+"
pqc_support: true
hybrid_groups: ["x25519_mlkem768"]
notes: "Enabled by default since Chrome 124"
- name: "Firefox 128+"
pqc_support: true
hybrid_groups: ["x25519_mlkem768"]
notes: "Enabled by default"
- name: "curl with OQS"
pqc_support: true
hybrid_groups: ["x25519_mlkem768", "x25519_mlkem1024"]
notes: "Requires curl built with oqs-provider"
- name: "Python requests (urllib3)"
pqc_support: false
hybrid_groups: []
notes: "Depends on system OpenSSL version and provider"
- name: "Java 21+"
pqc_support: partial
hybrid_groups: []
notes: "Awaiting JEP for PQC KEM integration"
servers:
- name: "Nginx + OQS provider"
config_example: "See nginx pqc_tls.conf"
supported_groups: ["x25519_mlkem768", "x25519_mlkem1024"]
- name: "Cloudflare"
pqc_support: true
supported_groups: ["x25519_mlkem768"]
notes: "Enabled by default for all zones"
- name: "AWS ALB"
pqc_support: true
supported_groups: ["x25519_mlkem768"]
notes: "Available in select regions"
validation_tests:
- name: "Hybrid handshake succeeds"
command: |
openssl s_client -connect $HOST:443 \
-groups x25519_mlkem768 -brief
expected: "CONNECTION ESTABLISHED"
- name: "Fallback to classical works"
command: |
openssl s_client -connect $HOST:443 \
-groups x25519 -brief
expected: "CONNECTION ESTABLISHED"
- name: "Certificate chain validates"
command: |
openssl s_client -connect $HOST:443 \
-verify_return_error -brief
expected: "Verification: OK"
- name: "Key sizes within budget"
description: >
Verify that TLS handshake completes within
acceptable size and latency budgets.
ML-KEM-768 adds approximately 2KB to the
handshake compared to X25519 alone.
max_handshake_bytes: 8192
    max_latency_ms: 100

Step 6: Migration Execution
With your inventory complete, risks prioritized, crypto-agility built, and testing done, you can begin the actual migration. Execute in phases, starting with the highest-risk systems.
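A common way to decide which systems go first is Mosca's inequality: if the time the data must stay secret plus the time the migration takes exceeds the years until a cryptographically relevant quantum computer exists, that system is already late. A hypothetical scoring sketch (the system names and the 12-year horizon are illustrative assumptions, not recommendations):

```python
# Rank systems for migration order using Mosca's inequality:
# migrate urgently when (data shelf life + migration time)
# exceeds the assumed years until a CRQC exists
CRQC_HORIZON_YEARS = 12  # assumption; tune to your own threat model

systems = [
    # (name, data shelf life in years, estimated migration years)
    ("public-api-tls", 12, 1),
    ("internal-metrics", 1, 1),
    ("patient-records-db", 25, 3),
]


def urgency(shelf_life: float, migration_time: float) -> float:
    # Positive values mean data encrypted today outlives the horizon
    return (shelf_life + migration_time) - CRQC_HORIZON_YEARS


for name, shelf, mig in sorted(
    systems, key=lambda s: urgency(s[1], s[2]), reverse=True
):
    score = urgency(shelf, mig)
    flag = "MIGRATE NOW" if score > 0 else "can follow"
    print(f"{name:<20} urgency={score:+.0f}y  {flag}")
```

Scores like these are only as good as the horizon estimate, so revisit them whenever quantum-computing projections shift.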
Phase 1: TLS Termination Points
TLS is the highest priority because it protects data in transit, which is the primary target for harvest-now-decrypt-later attacks.
# ansible/playbooks/pqc_tls_migration.yaml
# Ansible playbook for rolling out PQC TLS configuration
---
- name: Deploy PQC TLS Configuration
hosts: tls_termination_points
become: yes
vars:
oqs_provider_version: "0.6.0"
openssl_modules_dir: "/usr/lib/x86_64-linux-gnu/ossl-modules"
pqc_config_dir: "/etc/ssl/pqc"
tasks:
- name: Install OQS provider
copy:
src: "files/oqsprovider.so"
dest: "{{ openssl_modules_dir }}/oqsprovider.so"
mode: "0644"
notify: restart_nginx
- name: Deploy PQC OpenSSL configuration
template:
src: "templates/openssl_pqc.cnf.j2"
dest: "/etc/ssl/openssl_pqc.cnf"
mode: "0644"
notify: restart_nginx
- name: Deploy PQC certificates
copy:
src: "files/certs/{{ item }}"
dest: "{{ pqc_config_dir }}/{{ item }}"
mode: "0600"
loop:
- server_chain.pem
- server.key
notify: restart_nginx
- name: Deploy Nginx PQC TLS configuration
template:
src: "templates/pqc_tls.conf.j2"
dest: "/etc/nginx/conf.d/pqc_tls.conf"
mode: "0644"
notify: restart_nginx
- name: Test Nginx configuration
command: nginx -t
register: nginx_test
changed_when: false
- name: Verify PQC handshake works
command: >
openssl s_client -connect localhost:443
-groups x25519_mlkem768 -brief
register: pqc_test
changed_when: false
failed_when: "'CONNECTION ESTABLISHED' not in pqc_test.stdout"
handlers:
- name: restart_nginx
service:
name: nginx
        state: restarted

Phase 2: API Authentication
Migrate API authentication tokens and signatures to use PQC algorithms:
"""
pqc_jwt.py
JWT-like token signing and verification using ML-DSA
for post-quantum secure API authentication.
"""
import json
import base64
import time
import oqs
class PQCTokenSigner:
"""
    Signs and verifies authentication tokens using ML-DSA-65
    ("Dilithium3" in liboqs naming).
Drop-in replacement for ECDSA/RSA JWT signing.
"""
def __init__(self, algorithm: str = "Dilithium3"):
self.algorithm = algorithm
self._signer = None
self._public_key = None
def generate_keys(self) -> tuple:
"""Generate a new signing keypair."""
self._signer = oqs.Signature(self.algorithm)
self._public_key = self._signer.generate_keypair()
return self._public_key, self._signer.export_secret_key()
def load_keys(self, public_key: bytes, secret_key: bytes):
"""Load existing signing keys."""
self._public_key = public_key
self._signer = oqs.Signature(self.algorithm, secret_key)
def create_token(self, payload: dict,
expiry_seconds: int = 3600) -> str:
"""
Create a signed token with the given payload.
Returns:
A base64url-encoded token string in the format:
header.payload.signature
"""
header = {
"alg": f"PQ-{self.algorithm}",
"typ": "PQT", # Post-Quantum Token
}
payload["iat"] = int(time.time())
payload["exp"] = int(time.time()) + expiry_seconds
header_b64 = base64.urlsafe_b64encode(
json.dumps(header).encode()
).decode().rstrip("=")
payload_b64 = base64.urlsafe_b64encode(
json.dumps(payload).encode()
).decode().rstrip("=")
signing_input = f"{header_b64}.{payload_b64}".encode()
signature = self._signer.sign(signing_input)
signature_b64 = base64.urlsafe_b64encode(
signature
).decode().rstrip("=")
return f"{header_b64}.{payload_b64}.{signature_b64}"
def verify_token(self, token: str,
public_key: bytes = None) -> dict:
"""
Verify a token's signature and return the payload.
Raises:
ValueError: If verification fails or token is expired.
"""
pub_key = public_key or self._public_key
parts = token.split(".")
if len(parts) != 3:
raise ValueError("Invalid token format")
header_b64, payload_b64, signature_b64 = parts
# Restore base64 padding
        def pad_b64(s):
            # -len(s) % 4 adds no padding when none is needed
            return s + "=" * (-len(s) % 4)
signing_input = f"{header_b64}.{payload_b64}".encode()
signature = base64.urlsafe_b64decode(pad_b64(signature_b64))
        # Production code should also parse the header and reject
        # tokens whose "alg" differs from the expected algorithm
        verifier = oqs.Signature(self.algorithm)
is_valid = verifier.verify(signing_input, signature, pub_key)
if not is_valid:
raise ValueError("Token signature verification failed")
payload = json.loads(
base64.urlsafe_b64decode(pad_b64(payload_b64))
)
if payload.get("exp", 0) < time.time():
raise ValueError("Token has expired")
return payload
# Usage example
if __name__ == "__main__":
signer = PQCTokenSigner("Dilithium3")
public_key, secret_key = signer.generate_keys()
# Create a token
token = signer.create_token({
"sub": "user-12345",
"role": "admin",
"service": "api-gateway",
})
print(f"Token length: {len(token)} characters")
print(f"Token (first 100 chars): {token[:100]}...")
# Verify the token
payload = signer.verify_token(token)
    print(f"\nVerified payload: {json.dumps(payload, indent=2)}")

Phase 3: Data-at-Rest Encryption
"""
pqc_data_encryption.py
Hybrid encryption for data at rest using ML-KEM + AES-256-GCM.
"""
import os
import struct
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import oqs
class HybridFileEncryptor:
"""
Encrypts files using a hybrid scheme:
1. ML-KEM-768 encapsulates a shared secret
2. The shared secret derives an AES-256-GCM key
3. AES-256-GCM encrypts the file data
File format:
[4 bytes: ciphertext length][ML-KEM ciphertext]
[12 bytes: nonce][encrypted data][16 bytes: GCM tag]
"""
KEM_ALGORITHM = "Kyber768"
def __init__(self):
self._kem = None
self._public_key = None
def generate_keys(self) -> tuple:
"""Generate a new ML-KEM keypair for file encryption."""
self._kem = oqs.KeyEncapsulation(self.KEM_ALGORITHM)
self._public_key = self._kem.generate_keypair()
return self._public_key, self._kem.export_secret_key()
def encrypt_file(self, input_path: str, output_path: str,
public_key: bytes):
"""Encrypt a file using hybrid ML-KEM + AES-256-GCM."""
# Step 1: Encapsulate a shared secret with ML-KEM
kem = oqs.KeyEncapsulation(self.KEM_ALGORITHM)
ciphertext, shared_secret = kem.encap_secret(public_key)
        # Step 2: Derive an AES-256 key from the shared secret.
        # The ML-KEM shared secret is already 32 uniform bytes;
        # production code may still prefer an explicit KDF step
        aes_key = shared_secret[:32]
nonce = os.urandom(12)
# Step 3: Read and encrypt the file data
with open(input_path, "rb") as f:
plaintext = f.read()
aesgcm = AESGCM(aes_key)
encrypted_data = aesgcm.encrypt(nonce, plaintext, None)
# Step 4: Write the encrypted file
with open(output_path, "wb") as f:
# Write KEM ciphertext length and ciphertext
f.write(struct.pack(">I", len(ciphertext)))
f.write(ciphertext)
# Write nonce and encrypted data
f.write(nonce)
f.write(encrypted_data)
return {
"input_size": len(plaintext),
"output_size": (4 + len(ciphertext) + 12
+ len(encrypted_data)),
"overhead_bytes": (4 + len(ciphertext) + 12 + 16),
}
def decrypt_file(self, input_path: str, output_path: str,
secret_key: bytes):
"""Decrypt a file encrypted with encrypt_file."""
kem = oqs.KeyEncapsulation(self.KEM_ALGORITHM, secret_key)
with open(input_path, "rb") as f:
# Read KEM ciphertext
ct_len = struct.unpack(">I", f.read(4))[0]
ciphertext = f.read(ct_len)
# Read nonce and encrypted data
nonce = f.read(12)
encrypted_data = f.read()
# Decapsulate to recover the shared secret
shared_secret = kem.decap_secret(ciphertext)
aes_key = shared_secret[:32]
# Decrypt the data
aesgcm = AESGCM(aes_key)
plaintext = aesgcm.decrypt(nonce, encrypted_data, None)
with open(output_path, "wb") as f:
f.write(plaintext)
return {"output_size": len(plaintext)}
# Usage
if __name__ == "__main__":
encryptor = HybridFileEncryptor()
public_key, secret_key = encryptor.generate_keys()
# Encrypt
stats = encryptor.encrypt_file(
"sensitive_data.db",
"sensitive_data.db.enc",
public_key,
)
print(f"Encrypted: {stats}")
# Decrypt
result = encryptor.decrypt_file(
"sensitive_data.db.enc",
"sensitive_data_decrypted.db",
secret_key,
)
    print(f"Decrypted: {result}")

Cloud Provider PQC Support
AWS KMS Post-Quantum Support
AWS KMS supports hybrid post-quantum TLS for API calls, protecting the key material exchanged with KMS.
"""
aws_pqc_kms.py
Using AWS KMS with post-quantum TLS enabled.
"""
import json

import boto3
from botocore.config import Config
# Hybrid post-quantum TLS for KMS API calls is negotiated by the
# SDK's underlying TLS stack when it supports the hybrid groups;
# boto3 itself has no dedicated PQC switch, so this Config only
# pins the region
pqc_config = Config(
    region_name="us-east-1",
)
# Create the KMS client; the key exchange with KMS uses hybrid PQ
# groups when both the client TLS stack and the endpoint support them
kms_client = boto3.client("kms", config=pqc_config)
def create_pqc_ready_key():
"""
Create a KMS key configured for eventual PQC migration.
Uses AES-256 for symmetric operations (quantum-resistant)
with PQ-TLS protecting the API communication channel.
"""
response = kms_client.create_key(
Description="PQC-ready encryption key",
KeyUsage="ENCRYPT_DECRYPT",
KeySpec="SYMMETRIC_DEFAULT", # AES-256-GCM
Tags=[
{"TagKey": "pqc-migration", "TagValue": "phase-1"},
{"TagKey": "crypto-agility", "TagValue": "enabled"},
],
Policy=json.dumps({
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AllowKeyManagement",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::123456789012:root"
},
"Action": "kms:*",
"Resource": "*",
},
],
}),
)
return response["KeyMetadata"]["KeyId"]
def encrypt_with_pqc_tls(key_id: str, plaintext: bytes) -> bytes:
"""
Encrypt data using KMS over a PQ-TLS channel.
The data encryption uses AES-256-GCM (quantum-safe).
The TLS channel to KMS uses hybrid PQ key exchange.
"""
response = kms_client.encrypt(
KeyId=key_id,
Plaintext=plaintext,
EncryptionAlgorithm="SYMMETRIC_DEFAULT",
)
    return response["CiphertextBlob"]

# Verify that KMS API calls negotiate PQ-TLS by inspecting the
# debug output; the negotiated key-exchange group appears in the
# TLS trace when the client stack supports hybrid PQ groups
aws kms list-keys --region us-east-1 --debug 2>&1 | \
  grep -i "post-quantum\|hybrid\|kyber\|kem"

Azure Key Vault
"""
azure_pqc_keyvault.py
Azure Key Vault with quantum-safe configuration.
"""
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient
from azure.keyvault.keys.crypto import (
CryptographyClient,
EncryptionAlgorithm,
)
# Azure Key Vault with managed HSM supports
# quantum-safe key protection at the infrastructure level
credential = DefaultAzureCredential()
key_client = KeyClient(
vault_url="https://your-vault.vault.azure.net/",
credential=credential,
)
# Create an RSA key (plan for migration to PQC when supported)
# Azure is tracking NIST PQC standards for future integration
key = key_client.create_rsa_key(
name="pqc-migration-ready",
size=4096, # Use maximum classical key size during transition
tags={
"pqc-migration-phase": "inventory",
"target-algorithm": "ML-KEM-768",
"migration-deadline": "2027-01-01",
},
)
print(f"Key created: {key.name}")
print(f"Key type: {key.key_type}")
print(f"Key ID: {key.id}")

GCP Cloud KMS
"""
gcp_pqc_kms.py
Google Cloud KMS quantum-safe configuration.
"""
from google.cloud import kms_v1
def create_quantum_resistant_key(
project_id: str,
location_id: str,
key_ring_id: str,
key_id: str,
):
"""
Create a Cloud KMS key optimized for PQC readiness.
Uses AES-256 symmetric encryption (quantum-resistant for
symmetric operations) while tracking PQC migration status.
"""
client = kms_v1.KeyManagementServiceClient()
key_ring_name = client.key_ring_path(
project_id, location_id, key_ring_id
)
purpose = kms_v1.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT
crypto_key = {
"purpose": purpose,
"version_template": {
# HSM-backed keys for maximum security
"protection_level": kms_v1.ProtectionLevel.HSM,
"algorithm": (
kms_v1.CryptoKeyVersion.CryptoKeyVersionAlgorithm
.GOOGLE_SYMMETRIC_ENCRYPTION
),
},
"labels": {
"pqc-migration": "phase-1",
"crypto-agility": "enabled",
},
}
created_key = client.create_crypto_key(
request={
"parent": key_ring_name,
"crypto_key_id": key_id,
"crypto_key": crypto_key,
}
)
print(f"Created key: {created_key.name}")
return created_key
# GCP also supports TLS 1.3 with hybrid PQ key exchange for
# connections to Cloud KMS endpoints in supported regions; the
# negotiation happens in the TLS layer, so it requires a client
# TLS stack with hybrid group support rather than any API flag.

Compliance Timeline
CNSA 2.0 Milestones
| Deadline | Requirement | Impact |
|---|---|---|
| 2025 | Software/firmware signing must use CNSA 2.0 | Code signing pipelines must support ML-DSA |
| 2026 | New network equipment must support PQC | Network infrastructure procurement changes |
| Jan 2027 | All new system acquisitions must be CNSA 2.0 compliant | Procurement contracts require PQC support |
| 2028 | Web servers must support PQC TLS | Public-facing services need hybrid TLS |
| 2030 | Exclusive use of CNSA 2.0 for web/cloud | No classical-only connections allowed |
| 2033 | Complete transition for all NSS | Legacy systems fully migrated |
Industry-Specific Requirements
Financial Services (PCI DSS, SOX):
- PCI DSS v4.0 requires "strong cryptography" - PQC will become the standard definition
- Financial regulators in the EU, UK, and Singapore have issued PQC readiness advisories
- High-value transaction data has long confidentiality requirements, making it a priority for harvest-now-decrypt-later protection
- Plan to begin hybrid TLS deployment by Q4 2026
Healthcare (HIPAA):
- HIPAA requires encryption of PHI in transit and at rest with no expiration on protection requirements
- Patient records have indefinite confidentiality requirements
- Healthcare data is a prime target for harvest-now-decrypt-later attacks
- Organizations should begin PQC inventory and pilot testing immediately
Government Contractors (FedRAMP, CMMC):
- FedRAMP will incorporate PQC requirements aligned with CNSA 2.0 timelines
- CMMC Level 2+ will require PQC for Controlled Unclassified Information (CUI)
- Contractors must demonstrate PQC readiness in system security plans
- Existing Authority to Operate (ATO) renewals will include PQC requirements starting 2027
Migration Timeline Template
# pqc_migration_timeline.yaml
# Template for planning your organization's PQC migration
phases:
  - name: "Phase 0: Discovery and Planning"
    duration: "3-6 months"
    start: "Immediately"
    tasks:
      - Complete cryptographic inventory of all systems
      - Classify data by sensitivity and lifespan
      - Identify quantum-vulnerable algorithms in production
      - Assess third-party and vendor PQC readiness
      - Establish a PQC migration working group
      - Budget for tooling, training, and infrastructure
    deliverables:
      - Cryptographic inventory report
      - Risk assessment and prioritization matrix
      - Migration budget proposal
  - name: "Phase 1: Foundation"
    duration: "6-12 months"
    start: "Q2 2026"
    tasks:
      - Implement crypto-agility in application architecture
      - Deploy PQC testing infrastructure (OQS, liboqs)
      - Begin hybrid TLS pilot on non-production systems
      - Update certificate management for PQC certificates
      - Train engineering teams on PQC concepts and tooling
    deliverables:
      - Crypto-agile application framework
      - Hybrid TLS pilot results and performance data
      - Team training completion records
  - name: "Phase 2: Hybrid Deployment"
    duration: "6-12 months"
    start: "Q1 2027"
    tasks:
      - Deploy hybrid TLS to production TLS termination points
      - Migrate API authentication to hybrid signatures
      - Enable PQ-TLS for cloud KMS API connections
      - Update VPN configurations for hybrid key exchange
      - Begin data-at-rest encryption migration for highest-risk data
    deliverables:
      - Production hybrid TLS deployment
      - API authentication migration complete
      - VPN PQC configuration deployed
  - name: "Phase 3: Full PQC"
    duration: "12-18 months"
    start: "Q1 2028"
    tasks:
      - Transition from hybrid to PQC-only where possible
      - Complete data-at-rest re-encryption with PQC
      - Migrate code signing to ML-DSA
      - Update all certificate chains to PQC
      - Remove classical-only fallbacks from critical paths
    deliverables:
      - PQC-only operation for critical systems
      - Complete certificate chain migration
      - Compliance documentation and audit evidence
  - name: "Phase 4: Validation and Compliance"
    duration: "Ongoing"
    start: "Q3 2029"
    tasks:
      - Third-party security audit of PQC deployment
      - Compliance certification updates
      - Continuous monitoring for PQC algorithm updates
      - Performance optimization and tuning
    deliverables:
      - Audit reports
      - Updated compliance certifications
      - Ongoing monitoring dashboards

Common Mistakes to Avoid
1. Waiting for the "Perfect" Time
The most common and most dangerous mistake is waiting. Organizations often delay PQC migration because quantum computers "are not here yet." But consider:
- Cryptographic migrations historically take 5 to 10 years to complete across large organizations
- The MD5-to-SHA-1 hash transition took over a decade in many enterprises
- The TLS 1.0/1.1 deprecation took years of effort even with clear deadlines
- Harvest-now-decrypt-later means waiting exposes data that is already in transit
Start your cryptographic inventory today. Even if you do not deploy PQC for another year, knowing what you need to migrate is the prerequisite for every other step.
2. Ignoring Crypto-Agility
Some teams jump straight to deploying ML-KEM and ML-DSA without building the abstraction layer to swap algorithms later. This creates the same problem you have now - hardcoded algorithm choices that require code changes to update.
PQC standards will evolve. New attacks may be discovered. NIST has additional algorithms under evaluation. Your architecture must support algorithm rotation without application rewrites.
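One way to get that agility is to resolve algorithms through a named registry driven by configuration, so rotating algorithms is a config change rather than a code change. The Python sketch below shows the shape of such a layer; the providers are stubs invented for illustration, and in production they would wrap real libraries such as liboqs bindings or your platform's crypto API.

```python
from typing import Callable, Dict, Tuple

# Crypto-agility sketch: code asks for a KEM by configurable name and
# never hardcodes an implementation. Providers here are stubs so the
# wiring is runnable without any PQC library installed.
KemKeygen = Callable[[], Tuple[bytes, bytes]]  # -> (public_key, secret_key)
_KEM_REGISTRY: Dict[str, KemKeygen] = {}

def register_kem(name: str, keygen: KemKeygen) -> None:
    _KEM_REGISTRY[name] = keygen

def get_kem(name: str) -> KemKeygen:
    try:
        return _KEM_REGISTRY[name]
    except KeyError:
        raise ValueError(f"KEM {name!r} not available; check crypto config")

# Stub providers with realistic key lengths (ML-KEM-768: 1184 B public,
# 2400 B secret per FIPS 203).
register_kem("stub-classical", lambda: (b"\x00" * 32, b"\x01" * 32))
register_kem("stub-ml-kem-768", lambda: (b"\x00" * 1184, b"\x01" * 2400))

# The active algorithm name would come from deployment config, so an
# algorithm rotation never touches application code.
ACTIVE_KEM = "stub-ml-kem-768"
public_key, secret_key = get_kem(ACTIVE_KEM)()
print(f"{ACTIVE_KEM}: public key {len(public_key)} bytes")
```

The same pattern applies to signatures and symmetric suites: the registry plus a config key is the seam that lets you retire an algorithm later without an application rewrite.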
3. Forgetting Third-Party Dependencies
Your own code is only part of the equation. Audit every dependency:
# Check Python dependencies for crypto library versions
pip list | grep -iE "cryptography|pyopenssl|pycrypto|pynacl|paramiko"
# Check Node.js dependencies
npm list | grep -iE "crypto|tls|jose|jsonwebtoken|node-forge"
# Check Go dependencies
go list -m all | grep -iE "crypto|tls|x509"
# Check for vendored OpenSSL versions
find /usr/lib /usr/local/lib /opt -name "libssl*" -o \
  -name "libcrypto*" 2>/dev/null | sort -u
Key questions for each dependency:
- Does this library support PQC algorithms?
- What is the library maintainer's PQC roadmap?
- Can you upgrade the library independently of your application?
- Does the library support algorithm configuration, or are algorithms hardcoded?
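The shell one-liners above can be folded into a repeatable check. The Python sketch below flags crypto-related packages in a requirements.txt-style list so each can be run through the questions above; the package pattern is a heuristic starting point invented for this example, not an exhaustive list.

```python
import re

# Heuristic: flag dependency lines that look like crypto libraries so
# they get a PQC-readiness review. Extend the pattern for your stack.
CRYPTO_PATTERN = re.compile(
    r"^(cryptography|pyopenssl|pycryptodome|pycrypto|pynacl|paramiko)\b",
    re.IGNORECASE,
)

def flag_crypto_deps(requirements_text: str) -> list:
    """Return requirement lines that look like crypto libraries."""
    flagged = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and CRYPTO_PATTERN.match(line):
            flagged.append(line)
    return flagged

# Sample input for illustration.
sample = """\
requests==2.32.3
cryptography==42.0.5
paramiko==3.4.0
numpy==1.26.4
"""
for dep in flag_crypto_deps(sample):
    print("review for PQC readiness:", dep)
```

Running a check like this in CI keeps the inventory current as dependencies change, instead of relying on a one-time audit.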
4. Underestimating Key and Signature Size Impact
PQC key sizes and signatures are significantly larger than their classical counterparts:
| Algorithm | Public Key | Signature/Ciphertext | Classical Equivalent |
|---|---|---|---|
| ML-KEM-768 | 1,184 B | 1,088 B | X25519: 32 B / 32 B |
| ML-DSA-65 | 1,952 B | 3,309 B | ECDSA-P256: 64 B / 64 B |
| SLH-DSA-256f | 64 B | 49,856 B | Ed25519: 32 B / 64 B |
This matters for:
- TLS handshake size: Hybrid handshakes add 2-3 KB. This can cause issues with networks that fragment or drop large UDP packets (QUIC/DTLS)
- Certificate chain size: PQC certificate chains can exceed typical MTU sizes, spreading the handshake across multiple TCP segments and adding round trips
- Database storage: Storing PQC public keys and signatures requires schema changes
- Bandwidth: High-throughput APIs signing every response will see measurable bandwidth increases
- IoT and embedded: Constrained devices may not have memory for large PQC keys
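To make the handshake growth concrete, the byte sizes in the table above turn into a quick back-of-envelope calculation. The sketch below compares a classical X25519 key exchange against an X25519 + ML-KEM-768 hybrid of the kind used in TLS 1.3, ignoring record framing overhead.

```python
# Back-of-envelope key-exchange growth for hybrid TLS, using the byte
# sizes from the table above (TLS record framing is ignored).
X25519_KEY = 32        # X25519 key share, each direction
MLKEM768_PK = 1184     # ML-KEM-768 encapsulation key (client -> server)
MLKEM768_CT = 1088     # ML-KEM-768 ciphertext (server -> client)

classical = X25519_KEY + X25519_KEY                        # both key shares
hybrid = (X25519_KEY + MLKEM768_PK) + (X25519_KEY + MLKEM768_CT)
overhead = hybrid - classical

print(f"classical key-exchange bytes: {classical}")
print(f"hybrid key-exchange bytes:    {hybrid}")
print(f"added per handshake: {overhead} bytes (~{overhead / 1024:.1f} KB)")
```

The roughly 2.2 KB of added key-exchange material is where the "hybrid handshakes add 2-3 KB" figure comes from, before counting larger PQC signatures in the certificate chain.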
5. Skipping Performance Testing in Production-Like Conditions
Lab benchmarks do not capture real-world performance. Test with:
- Realistic network latency (not just localhost)
- Production traffic volumes and patterns
- Client diversity (browsers, mobile apps, API clients, IoT devices)
- Connection reuse patterns (TLS session resumption behavior with PQC)
- Load balancer behavior with larger handshakes
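One production effect worth modeling before load testing is whether a larger server handshake flight still fits the TCP initial congestion window; if it does not, every fresh connection pays at least one extra round trip. The sketch below uses a deliberately simplified slow-start model with the common defaults of a 10-segment initcwnd (RFC 6928) and a 1,460-byte MSS; the chain sizes are illustrative assumptions, not measurements.

```python
import math

# Simplified model: how many extra round trips does the server's first
# flight (ServerHello + certificate chain) cost beyond the initial
# congestion window? Defaults: initcwnd of 10 segments (RFC 6928) and a
# 1460-byte MSS. Real stacks differ; treat this as a rough estimator.
MSS = 1460
INITCWND_SEGMENTS = 10

def extra_rtts(server_flight_bytes: int) -> int:
    """Extra RTTs beyond the first server flight (simplified slow start)."""
    segments = math.ceil(server_flight_bytes / MSS)
    rtts, sent, cwnd = 0, INITCWND_SEGMENTS, INITCWND_SEGMENTS
    while sent < segments:
        cwnd *= 2          # cwnd roughly doubles each RTT in slow start
        sent += cwnd
        rtts += 1
    return rtts

# Illustrative flight sizes (assumptions for the sketch, not measurements).
for label, size in [("classical chain", 4_000), ("ML-DSA-65 chain", 18_000)]:
    print(f"{label}: {size} B -> +{extra_rtts(size)} RTT")
```

Even one extra round trip is material on mobile networks with 100 ms+ RTTs, which is why handshake-size testing belongs in your production-like test matrix rather than only on localhost.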
6. Not Planning for Certificate Lifecycle
PQC certificates are larger and have different operational characteristics:
- Certificate Transparency logs must support PQC certificates
- OCSP responses with PQC signatures are larger
- Certificate revocation infrastructure must be updated
- Automated certificate management (ACME/Let's Encrypt) needs PQC support
Plan your certificate lifecycle management before deploying PQC certificates in production.