Fuzz Testing Strategy for 6 Module Variants
Run 10 separate fuzz sessions, each lasting 24 to 48 hours.
Total fuzzing time: 240 to 480 hours (10 to 20 days) of continuous CPU time.
-------
Running sessions in parallel, a more realistic estimate for the total fuzzing effort is 5 to 8 days of dedicated, continuous wall-clock time.
Recommended Strategy for Parallel Fuzzing:
- **Prioritize:** Identify the 4-6 most critical sessions (e.g., all four KEM/DEM targets and the highest-priority sign/verify runs) and run those first.
- **Limit concurrency:** Run 4 to 6 sessions simultaneously to keep per-session throughput high without saturating or overheating your CPU.
- **Run in batches:** Once the first batch of critical sessions hits the 48-hour mark or its coverage plateaus, stop it, review the findings, and start the next batch.
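The prioritize-then-batch flow above can be sketched as a small scheduler. This is a hypothetical planning helper, not part of the project: the session list mirrors the targets and feature profiles described in this document, and the priority values are illustrative.

```python
# Sketch: group the 10 fuzz sessions into prioritized batches of at most
# MAX_PARALLEL concurrent runs, as recommended above.
MAX_PARALLEL = 5  # stay within the recommended 4-6 concurrent sessions

SESSIONS = [
    # (target, feature profile, priority: lower runs first)
    ("fuzz_encapsulate", "test_std_aes_gcm", 0),
    ("fuzz_decapsulate", "test_std_aes_gcm", 0),
    ("fuzz_encapsulate", "test_std_shake", 0),
    ("fuzz_decapsulate", "test_std_shake", 0),
    ("fuzz_sign", "default", 1),
    ("fuzz_verify", "default", 1),
    ("fuzz_keypair_gen", "default", 2),
    ("fuzz_serialization", "default", 2),
    ("fuzz_encapsulate", "test_no_std_no_alloc_shake", 2),
    ("fuzz_decapsulate", "test_no_std_no_alloc_shake", 2),
]

def plan_batches(sessions, max_parallel):
    """Sort sessions by priority, then chunk into batches run back-to-back."""
    ordered = sorted(sessions, key=lambda s: s[2])  # stable sort keeps order
    return [ordered[i:i + max_parallel]
            for i in range(0, len(ordered), max_parallel)]

batches = plan_batches(SESSIONS, MAX_PARALLEL)
for i, batch in enumerate(batches, 1):
    print(f"Batch {i}: " + ", ".join(f"{t}[{p}]" for t, p, _ in batch))
```

With these priorities, the first batch covers all four KEM/DEM sessions plus one signing session; the second batch picks up the remaining common-code and `no_std` runs once the first batch plateaus.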
----------------------
| Fuzz Target | Crypto Logic Tested | Runs Needed | Rationale |
| :--- | :--- | :--- | :--- |
| 1. fuzz_keypair_gen | Kyber/Dilithium (Common) | 1 Run | Key-generation logic is identical across all DEM/KDF and environment variants. |
| 2. fuzz_verify | Dilithium DSA (Common) | 1 Run | Verification logic is the same in all variants. |
| 3. fuzz_sign | Dilithium DSA (Common) | 1 Run | Signing logic is the same in all variants. |
| 4. fuzz_serialization | All Cryptographic Primitives (Common) | 1 Run | Serialization format is constant. |
| 5. fuzz_encapsulate | Kyber KEM + DEM/KDF | 3 Runs | Must be run for AES-GCM and SHAKE variants, testing the different std, alloc, and no_std environments. |
| 6. fuzz_decapsulate | Kyber KEM + DEM/KDF | 3 Runs | Must be run for AES-GCM and SHAKE variants, testing the different std, alloc, and no_std environments. |
How to Achieve Full Coverage (10 Sessions)
You don't need to run every target against every environment. Focus the effort on the features that actually change the cryptographic implementation:
| Test Group | Target | Feature Flags to Run | Total |
| :--- | :--- | :--- | :--- |
| Common Code Fuzzing | fuzz_keypair_gen, fuzz_verify, fuzz_sign, fuzz_serialization | Run once each, using the standard std environment (default). | 4 Sessions |
| AES-GCM Fuzzing | fuzz_encapsulate, fuzz_decapsulate | Run against one of the AES-GCM profiles (e.g., test_std_aes_gcm). | 2 Sessions |
| SHAKE Fuzzing | fuzz_encapsulate, fuzz_decapsulate | Run against one of the SHAKE profiles (e.g., test_std_shake). | 2 Sessions |
| Environment Boundary Fuzzing | fuzz_encapsulate, fuzz_decapsulate | Run against a no_std profile (e.g., test_no_std_no_alloc_shake). Essential for checking that memory handling holds up without std features. | 2 Sessions |
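Assuming the targets live in a cargo-fuzz project and the profile names above are exposed as Cargo features in the fuzz crate (an assumption about this repo's layout), individual sessions from the plan might be launched like this:

```shell
# Hypothetical cargo-fuzz invocations; the feature names assume the
# profiles above map directly to Cargo features.
# 48-hour cap per session (libFuzzer's -max_total_time is in seconds).
cargo fuzz run fuzz_encapsulate \
    --no-default-features --features test_std_aes_gcm \
    -- -max_total_time=172800

# Environment-boundary run for the decapsulation target:
cargo fuzz run fuzz_decapsulate \
    --no-default-features --features test_no_std_no_alloc_shake \
    -- -max_total_time=172800
```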
By running the KEM targets with different feature flags, you ensure that the code for both AES-GCM and SHAKE256 is robustly tested against bad inputs.