Post-quantum cryptography introduces new vulnerability classes unfamiliar to developers experienced only with classical cryptography. This guide provides a comprehensive security audit checklist for Kyber and SPHINCS+ implementations. The SynX quantum-resistant wallet uses these exact procedures for internal code review.
Pre-Audit Preparation
Documentation Review
Before examining code, gather essential documentation:
- Algorithm specification (NIST FIPS 203 for Kyber/ML-KEM, FIPS 205 for SPHINCS+/SLH-DSA)
- Implementation notes explaining any deviations from spec
- Threat model document defining adversary capabilities
- Previous audit reports and their remediation status
- Known Answer Tests (KAT) from NIST submission
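The KAT files can be exercised with a small harness. A minimal sketch follows; the vector format (dicts with `seed`, `pk`, `sk` keys) and the deterministic `toy_keygen` stand-in are assumptions for illustration, not the real parsing format of the NIST `.rsp` files:

```python
import hashlib
from typing import Callable, Dict, List, Tuple

def run_kats(
    keygen: Callable[[bytes], Tuple[bytes, bytes]],
    vectors: List[Dict[str, bytes]],
) -> List[int]:
    """Return indices of vectors whose keypair does not match the KAT."""
    failures = []
    for i, vec in enumerate(vectors):
        pk, sk = keygen(vec["seed"])
        if pk != vec["pk"] or sk != vec["sk"]:
            failures.append(i)
    return failures

# Toy deterministic "keygen" standing in for the real algorithm under test
def toy_keygen(seed: bytes) -> Tuple[bytes, bytes]:
    pk = hashlib.sha3_256(b"pk" + seed).digest()
    sk = hashlib.sha3_256(b"sk" + seed).digest()
    return pk, sk
```

An implementation that passes every KAT but fails one after a refactor is the single cheapest regression signal an audit can set up.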
Tool Setup
# Static analysis
pip install bandit semgrep flake8-security
# Constant-time / timing leak detection
git clone https://github.com/oreparaz/dudect
# Fuzzing and property-based testing
pip install atheris python-afl hypothesis
# Memory profiling
pip install memory_profiler
# Dependency auditing for Rust components
cargo install cargo-audit cargo-deny
Critical Vulnerability Categories
| Category | Severity | Example | Impact |
|---|---|---|---|
| Timing Side-Channels | CRITICAL | Secret-dependent branches | Key recovery |
| Insufficient Entropy | CRITICAL | Weak RNG seeding | Key prediction |
| Key Material Leakage | CRITICAL | Keys in swap/crash dumps | Key exposure |
| Signature Malleability | HIGH | Non-unique signatures | Transaction replay |
| Input Validation | HIGH | Invalid public keys accepted | Various attacks |
| Memory Safety | HIGH | Buffer overflows | RCE, key extraction |
Timing Side-Channel Analysis
Manual Code Review Patterns
Vulnerable Patterns to Search For
These code patterns may leak secret information through timing:
# Secret-dependent branch: execution time reveals secret_bit
if secret_bit:
    do_operation_a()
else:
    do_operation_b()

# Early-exit comparison: returns as soon as bytes differ,
# leaking the position of the first mismatch
def compare_secrets(a: bytes, b: bytes) -> bool:
    for i in range(len(a)):
        if a[i] != b[i]:
            return False
    return True

# Secret-dependent table index: cache timing reveals secret_index
result = lookup_table[secret_index]

# Division latency can vary with operand values on some platforms
result = value / secret_divisor
Secure Patterns
Note that pure Python cannot guarantee constant-time execution; these patterns illustrate the structure auditors should look for, but production code should rely on vetted native implementations.

import hmac
from typing import List

def constant_time_compare(a: bytes, b: bytes) -> bool:
    """Compare two byte strings in constant time"""
    return hmac.compare_digest(a, b)

def constant_time_eq(a: int, b: int) -> int:
    """Return 1 if a == b, else 0, without branching on the values"""
    diff = a ^ b
    # (diff | -diff) is negative iff diff != 0; the arithmetic shift
    # propagates the sign bit, so the low bit is 1 on any mismatch
    return 1 ^ (((diff | -diff) >> 63) & 1)

def constant_time_select(condition: int, a: int, b: int) -> int:
    """
    Select a if condition == 1, else b
    No branching on condition
    """
    mask = -condition  # 1 -> all bits set, 0 -> all bits clear
    return (a & mask) | (b & ~mask)

def constant_time_lookup(table: List[int], secret_index: int) -> int:
    """Access table element without a cache-timing leak"""
    result = 0
    for i in range(len(table)):
        is_match = constant_time_eq(i, secret_index)
        result = constant_time_select(is_match, table[i], result)
    return result
Automated Timing Analysis
import time
from typing import Tuple

from scipy import stats

def timing_leak_test(
    operation,
    input_class_a,
    input_class_b,
    samples: int = 10000
) -> Tuple[bool, float]:
    """
    Welch's t-test for timing differences between two input classes
    Returns: (leak_detected, t_statistic)
    """
    times_a = []
    times_b = []
    for _ in range(samples):
        inp = input_class_a()
        start = time.perf_counter_ns()
        operation(inp)
        times_a.append(time.perf_counter_ns() - start)
        inp = input_class_b()
        start = time.perf_counter_ns()
        operation(inp)
        times_b.append(time.perf_counter_ns() - start)
    t_stat, p_value = stats.ttest_ind(
        times_a, times_b, equal_var=False
    )
    # |t| > 4.5 is the conventional dudect-style threshold for
    # declaring a timing leak
    leak_detected = abs(t_stat) > 4.5
    return leak_detected, t_stat
import oqs  # liboqs-python

def test_verification_timing():
    sig = oqs.Signature("SPHINCS+-SHAKE-128s-simple")
    pk = sig.generate_keypair()
    message = b"test message"
    valid_sig = sig.sign(message)
    invalid_sig = bytes([x ^ 0xff for x in valid_sig])
    # Reuse one verifier so object construction does not add timing noise
    verifier = oqs.Signature("SPHINCS+-SHAKE-128s-simple")
    def verify_op(signature):
        try:
            verifier.verify(message, signature, pk)
        except Exception:
            pass
    leak, t = timing_leak_test(
        verify_op,
        lambda: valid_sig,
        lambda: invalid_sig,
        samples=5000
    )
    if leak:
        print(f"⚠️ TIMING LEAK DETECTED (t={t:.2f})")
    else:
        print(f"✓ No timing leak (t={t:.2f})")
Key Generation Entropy Audit
import math

class EntropyAuditor:
    """Validate entropy sources for key generation"""

    def check_entropy_source(self, source_func) -> dict:
        """Test entropy source quality"""
        samples = [source_func(32) for _ in range(1000)]
        all_bytes = b"".join(samples)
        freq = {}
        for byte in all_bytes:
            freq[byte] = freq.get(byte, 0) + 1
        expected = len(all_bytes) / 256
        chi_sq = sum((f - expected) ** 2 / expected for f in freq.values())
        # ~310 is the chi-square critical value for 255 degrees of
        # freedom at alpha ~= 0.01
        uniform = chi_sq < 310
        unique_samples = len(set(samples))
        no_collisions = unique_samples == len(samples)
        return {
            "chi_square": chi_sq,
            "uniform": uniform,
            "unique_samples": unique_samples,
            "total_samples": len(samples),
            "no_collisions": no_collisions,
            "passed": uniform and no_collisions
        }

    def audit_keygen(self, keygen_func, iterations: int = 100):
        """Audit key generation for entropy issues"""
        keys = []
        for _ in range(iterations):
            pk, sk = keygen_func()
            keys.append((pk, sk))
        pk_set = set(pk for pk, sk in keys)
        if len(pk_set) != iterations:
            return {
                "status": "CRITICAL",
                "message": "Duplicate keys generated!"
            }
        pk_bytes = b"".join(pk for pk, sk in keys)
        entropy_per_bit = self._estimate_entropy(pk_bytes)
        if entropy_per_bit < 0.99:
            return {
                "status": "WARNING",
                "message": f"Low key entropy: {entropy_per_bit:.4f} bits/bit"
            }
        return {"status": "PASS", "entropy": entropy_per_bit}

    def _estimate_entropy(self, data: bytes) -> float:
        """Estimate Shannon entropy per bit (byte entropy / 8)"""
        freq = {}
        for byte in data:
            freq[byte] = freq.get(byte, 0) + 1
        entropy = 0.0
        total = len(data)
        for count in freq.values():
            p = count / total
            entropy -= p * math.log2(p)
        return entropy / 8
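As a quick smoke test, the same chi-square logic can be pointed directly at `os.urandom`. This standalone variant (a sketch mirroring the auditor's check, not part of the class above) should report "uniform" for a healthy CSPRNG, keeping in mind the test is statistical, so a rare false alarm at the 1% level is expected:

```python
import os

def chi_square_byte_test(data: bytes) -> float:
    """Chi-square statistic of the byte-frequency distribution."""
    expected = len(data) / 256
    freq = [0] * 256
    for b in data:
        freq[b] += 1
    return sum((f - expected) ** 2 / expected for f in freq)

stat = chi_square_byte_test(os.urandom(32000))
# ~310 is the critical value for 255 degrees of freedom at alpha = 0.01
print("uniform" if stat < 310 else "suspicious", round(stat, 1))
```

Running it against an obviously broken source (e.g. all-zero output from an unseeded DRBG stub) produces a statistic orders of magnitude above the threshold, which is the failure mode the startup entropy check is meant to catch.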
Memory Security Audit
Key Material Handling Checklist
- CRITICAL: Secret keys are zeroed after use
- CRITICAL: Memory is locked (mlock) to prevent swapping
- HIGH: Keys are stored in secure memory regions
- HIGH: Core dumps are disabled or exclude key memory
- MEDIUM: No logging or debug output of key material
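The core-dump item above can be enforced at process start. On POSIX systems this sketch, using the standard `resource` module, zeroes the core-file size limit so a crash cannot write key material to disk (the function name is our own):

```python
import resource

def disable_core_dumps() -> bool:
    """Set the core-file size limit to 0 so a crashing process cannot
    dump memory (including key material) to disk.
    Returns True if the soft limit is now zero."""
    if not hasattr(resource, "RLIMIT_CORE"):
        return False  # non-POSIX platform; use OS-specific controls
    resource.setrlimit(resource.RLIMIT_CORE, (0, 0))
    soft, hard = resource.getrlimit(resource.RLIMIT_CORE)
    return soft == 0
```

Call this before any key is generated; setting the hard limit to 0 also prevents child processes from re-enabling dumps.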
import ctypes
import sys

class SecureKeyBuffer:
    """
    Secure memory buffer for cryptographic keys
    Used by SynX quantum-resistant wallet for key storage.
    """

    def __init__(self, size: int):
        self._buffer = bytearray(size)
        self._size = size
        self._locked = False
        if sys.platform == "linux":
            try:
                libc = ctypes.CDLL("libc.so.6")
                addr = ctypes.addressof(
                    (ctypes.c_char * size).from_buffer(self._buffer)
                )
                # mlock returns 0 on success; nonzero means the pages
                # may still be swapped out
                self._locked = libc.mlock(addr, size) == 0
            except OSError:
                self._locked = False

    def write(self, data: bytes, offset: int = 0):
        """Write data to secure buffer"""
        if offset + len(data) > self._size:
            raise ValueError("Buffer overflow")
        self._buffer[offset:offset + len(data)] = data

    def read(self) -> bytes:
        """Read from secure buffer (returns copy)"""
        return bytes(self._buffer)

    def clear(self):
        """Securely erase buffer contents with multiple overwrite passes"""
        for pattern in [0x00, 0xFF, 0x00]:
            for i in range(self._size):
                self._buffer[i] = pattern

    def __del__(self):
        """Ensure cleanup on garbage collection"""
        self.clear()
        if getattr(self, "_locked", False):
            try:
                libc = ctypes.CDLL("libc.so.6")
                addr = ctypes.addressof(
                    (ctypes.c_char * self._size).from_buffer(self._buffer)
                )
                libc.munlock(addr, self._size)
            except OSError:
                pass
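For ad-hoc buffers that do not go through SecureKeyBuffer, a minimal zeroization helper can be sketched with `ctypes.memset`. This is a best-effort measure: CPython may still hold stale copies from any immutable `bytes` objects the key passed through earlier.

```python
import ctypes

def zeroize(buf: bytearray) -> None:
    """Overwrite a mutable buffer in place via memset."""
    if len(buf) == 0:
        return
    ctypes.memset(
        (ctypes.c_char * len(buf)).from_buffer(buf), 0, len(buf)
    )

key = bytearray(b"super-secret-key")
zeroize(key)
assert bytes(key) == b"\x00" * 16
```

Using `memset` through `ctypes` (rather than a Python loop) avoids the risk of an optimizing layer eliding the writes, which is the classic dead-store pitfall in C zeroization.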
def audit_key_cleanup(keygen_func) -> dict:
    """Verify keys are properly cleaned up"""
    import gc
    pk, sk = keygen_func()
    # Retain only a 32-byte prefix so the scan below cannot
    # match our own reference copy (a false positive)
    sk_prefix = bytes(sk)[:32]
    del sk
    gc.collect()
    warnings = []
    for obj in gc.get_objects():
        if isinstance(obj, bytes) and len(obj) > 100:
            if sk_prefix in obj:
                warnings.append("Key material found in memory after deletion")
                break
    return {
        "passed": len(warnings) == 0,
        "warnings": warnings
    }
Input Validation Audit
Public Key Validation
def validate_kyber_public_key(public_key: bytes) -> bool:
    """
    Validate Kyber-768 (ML-KEM-768) public key format
    AUDIT CHECK: Ensure this is called before any encapsulation
    """
    if len(public_key) != 1184:
        return False
    # Layout per FIPS 203: byte-encoded vector t (1152 bytes)
    # followed by the 32-byte seed rho
    t_bytes, rho = public_key[:1152], public_key[1152:]
    # A complete check must also decode t and reject non-canonical
    # encodings (coefficients >= q = 3329)
    return True
def validate_sphincs_public_key(public_key: bytes) -> bool:
    """
    Validate SPHINCS+-128s public key format
    AUDIT CHECK: Ensure this is called before any verification
    """
    if len(public_key) != 32:
        return False
    return True

def validate_signature(signature: bytes, algorithm: str) -> bool:
    """Validate signature format before verification"""
    expected_sizes = {
        "SPHINCS+-128s": 7856,
        "SPHINCS+-128f": 17088,
        "SPHINCS+-192s": 16224,
        "SPHINCS+-256s": 29792,
    }
    if algorithm not in expected_sizes:
        raise ValueError(f"Unknown algorithm: {algorithm}")
    return len(signature) == expected_sizes[algorithm]
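The checklist below also calls for ciphertext validation before decapsulation. A size check in the same style can be sketched as follows (lengths per FIPS 203; the function name is our own):

```python
def validate_kyber_ciphertext(ciphertext: bytes, variant: str = "Kyber-768") -> bool:
    """Length check before decapsulation; sizes per FIPS 203 (ML-KEM)."""
    expected_sizes = {
        "Kyber-512": 768,
        "Kyber-768": 1088,
        "Kyber-1024": 1568,
    }
    if variant not in expected_sizes:
        raise ValueError(f"Unknown variant: {variant}")
    return len(ciphertext) == expected_sizes[variant]
```

As with public keys, a length check is necessary but not sufficient; implicit-rejection behavior inside decapsulation handles malformed-but-correctly-sized ciphertexts.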
Complete Audit Checklist
The SynX quantum-resistant wallet uses this exact checklist for all code reviews:
1. Cryptographic Operations
- Algorithm implementations match NIST specifications
- Known Answer Tests (KAT) pass for all operations
- Rejection sampling bounds are correct (Kyber)
- Tree traversal is correct (SPHINCS+)
- Hash function instantiations are correct (SHAKE, SHA3)
2. Side-Channel Resistance
- No secret-dependent branches
- No secret-dependent memory access patterns
- Constant-time comparison for all secrets
- No timing variation in error handling
- Automated timing analysis passes
3. Random Number Generation
- Uses cryptographically secure RNG (os.urandom, secrets)
- RNG is properly seeded
- No predictable seeds (timestamps, PIDs)
- Entropy source is validated at startup
4. Key Management
- Secret keys are zeroed after use
- Key derivation uses approved KDF
- Keys are not logged or printed
- Key serialization is correct
- Memory is locked where supported
5. Input Validation
- Public key format is validated
- Signature format is validated
- Message lengths are within bounds
- Ciphertext format is validated (Kyber)
Frequently Asked Questions
What are the most common PQC implementation vulnerabilities?
Common vulnerabilities include: 1) Timing side-channels in polynomial operations, 2) Insufficient entropy in key generation, 3) Improper secret key erasure, 4) Signature malleability, 5) Incorrect rejection sampling bounds, and 6) Unvalidated public key formats. The SynX quantum-resistant wallet security team discovered all six in third-party libraries during audits.
How do I detect timing side-channels in PQC code?
Use constant-time analysis tools like dudect, ctgrind, or timecop. Manually review all conditional branches that depend on secret values. Ensure comparison operations use constant-time routines. Test with statistical timing measurements across different inputs.
Professional Audit Recommendation
For production deployments, supplement internal review with third-party security audits from firms specializing in cryptographic implementations.