Executive Summary
The digital infrastructure of the modern world stands on the precipice of a fundamental transformation. For decades, the security of global communications, financial transactions, national secrets, and personal privacy has rested upon the mathematical difficulty of specific problems—namely, integer factorization and discrete logarithms—that underpin Public Key Infrastructure (PKI). The emergence of quantum computing threatens to shatter this foundation. The year 2025 has become the definitive inflection point where theoretical risk has converted into operational necessity. With the formal publication of Federal Information Processing Standards (FIPS) 203, 204, and 205 by the U.S. National Institute of Standards and Technology (NIST) in August 2024, and the subsequent enforcement mandates rippling through global regulatory frameworks, Post-Quantum Cryptography (PQC) is no longer an optional future upgrade. It is an immediate compliance, legal, and security mandate.
This report provides an exhaustive analysis of the PQC landscape as of late 2025. It argues that the urgency for migration is not dictated by the arrival of a fault-tolerant quantum computer, but by the “Harvest Now, Decrypt Later” (HNDL) threat vector which effectively weaponizes current encrypted data against future capabilities. We explore the technical nuances of the newly standardized lattice-based algorithms, the divergence in global regulatory timelines between Western allies and nations like China, and the profound engineering challenges organizations face in migrating legacy systems to quantum-safe architectures. The analysis demonstrates that failing to act now exposes organizations not only to catastrophic future breaches but to immediate regulatory penalties and negligence liabilities.
1. The Anatomy of the Quantum Threat
To understand why PQC is mandatory today, one must first deconstruct the nature of the threat. It is a common misconception that quantum risk is solely a future problem. The reality is that the threat timeline has already intersected with the data retention timeline for most critical sectors.
1.1 The Mathematical Collapse: Shor’s Algorithm and Cryptographic Vulnerability
Current asymmetric cryptography—specifically RSA (Rivest–Shamir–Adleman), ECDH (Elliptic Curve Diffie-Hellman), and ECDSA (Elliptic Curve Digital Signature Algorithm)—relies on the computational infeasibility of factoring large composite integers or solving the discrete logarithm problem on elliptic curves. Classical computers, operating on binary bits, would require astronomical timescales to reverse these functions for sufficiently large keys.
However, the quantum computing paradigm shifts this calculation entirely. As outlined in the foundational research, Shor’s algorithm utilizes the principles of quantum superposition and entanglement to solve these specific problems exponentially faster than any known classical algorithm.1 The algorithm operates by reducing the factorization problem to a period-finding problem for a modular function $f(x) = a^x \mod N$.
- The Classical Reduction: The algorithm first performs a classical reduction, transforming the problem of finding factors of a number $N$ into the problem of finding the period $r$ of the function.
- The Quantum Fourier Transform (QFT): The core quantum step involves using the QFT to determine the period $r$ with high probability.
- Factor Extraction: Once $r$ is known, classical arithmetic can easily extract the factors of $N$.3
While a classical supercomputer might take trillions of years to break RSA-2048, a Cryptographically Relevant Quantum Computer (CRQC) running Shor’s algorithm could theoretically accomplish this task in a matter of hours or days.4 This capability renders the vast majority of the world’s encrypted traffic transparent to the attacker. It is critical to note that symmetric encryption (like AES) is less severely impacted; Grover’s algorithm only provides a quadratic speedup against symmetric ciphers, meaning that doubling the key size (e.g., from AES-128 to AES-256) effectively mitigates the threat.1 The crisis is specifically existential for public-key cryptography, which is the mechanism used to establish secure connections (key exchange) and verify identities (digital signatures).
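To make the classical reduction and factor-extraction steps concrete, the sketch below runs the number-theoretic shell of Shor's recipe in plain Python and simply brute-forces the period r, which is exactly the step a CRQC would instead perform with the QFT. It is a minimal illustration on a toy modulus; the helper names are ours.

```python
import math
import random

def find_period_classically(a, N):
    """Brute-force the period r of f(x) = a^x mod N. This is the step a
    quantum computer performs with the QFT; classically it takes
    exponential time, which is the entire problem."""
    r, value = 1, a % N
    while value != 1:
        r += 1
        value = (value * a) % N
    return r

def shor_factor(N):
    """Classical shell of Shor's recipe: reduce factoring to period-finding,
    then extract factors with gcd arithmetic."""
    a = random.randrange(2, N)
    g = math.gcd(a, N)
    if g > 1:                                  # lucky guess already shares a factor
        return g, N // g
    r = find_period_classically(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                            # unusable choice of a; caller retries
    p = math.gcd(pow(a, r // 2, N) - 1, N)
    q = math.gcd(pow(a, r // 2, N) + 1, N)
    return (p, q) if 1 < p < N and 1 < q < N else None

# Toy example: 3233 = 53 * 61.
factors = None
while factors is None:
    factors = shor_factor(3233)
print(factors)
```

Because the brute-force period search grows exponentially with the size of N, this program is hopeless against RSA-2048; replacing that one step with the QFT is what collapses the problem to hours or days on a CRQC.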
1.2 The “Harvest Now, Decrypt Later” (HNDL) Vector
The primary driver for immediate PQC adoption is the strategic behavior of adversaries known as “Harvest Now, Decrypt Later” (HNDL) or “Store Now, Decrypt Later” (SNDL). Intelligence agencies and cybercriminal syndicates are currently intercepting and archiving encrypted network traffic.6
- The Mechanism: An attacker sits on a network tap or a compromised node and records the encrypted ciphertext of sensitive sessions. They cannot read this data today. They simply store it in vast data centers.
- The Weaponization: The attacker waits. Once a CRQC becomes available—whether in 2030, 2035, or beyond—they will feed the harvested ciphertext into the quantum computer to recover the session keys and decrypt the historical data.8
This threat profile fundamentally alters the concept of “security.” The vulnerability of data is not determined by the date of the attack, but by the date of data creation relative to the shelf-life of its confidentiality.
- Long-Lived Assets: Trade secrets, pharmaceutical formulas, genetic data, intelligence sources, and weapons system designs often retain their sensitivity for 25 to 50 years.
- The Compliance Trap: Regulations such as HIPAA, GDPR, and financial retention laws mandate that data be stored for years or decades. Ironically, this compliance-driven retention expands the attack surface for HNDL, creating a “hidden risk” where adhering to current storage laws ensures a treasure trove for future quantum attackers.7
Distributed ledger technologies and cryptocurrencies face a unique variation of this threat. While a blockchain could theoretically upgrade its signing algorithms to PQC for future transactions, the history of the ledger is immutable. A quantum attacker could analyze the entire history of the Bitcoin blockchain, for example, and use Shor’s algorithm to derive the private keys associated with public keys exposed in early transactions. This would not only allow the theft of “zombie” coins but could retroactively de-anonymize the entire transaction graph, shattering the privacy of the network’s history.10
1.3 The Risk Calculus: Mosca’s Theorem
To quantify this urgency, the cryptographic community relies on Mosca’s Theorem (or Mosca’s Inequality), a risk management framework proposed by Dr. Michele Mosca. It provides a formal logic for determining when an organization must begin its migration.
The theorem posits that an organization is in danger if:
$$X + Y > Z$$
Table 1: Mosca’s Theorem Parameters
| Parameter | Definition | Implications |
| X (Shelf-Life) | The number of years data must remain confidential. | For national security, X=50+. For mortgages, X=30. For health data, X=Lifetime. |
| Y (Migration Time) | The years required to re-tool infrastructure to be quantum-safe. | Historical migrations (e.g., SHA-1 to SHA-2) took nearly a decade. PQC is far more complex. |
| Z (Collapse Time) | The years until a CRQC is available. | Estimates vary (2030–2040), but the exact date is irrelevant if $X+Y$ is large enough. |
11
If the time required to keep data safe plus the time required to update systems exceeds the time until the threat arrives, the organization has already failed. For example, if a bank needs to protect mortgage contracts for 30 years ($X=30$) and needs 5 years to migrate its mainframes ($Y=5$), but a quantum computer arrives in 15 years ($Z=15$), the inequality holds ($35 > 15$). The bank is already 20 years late. This mathematical reality creates the “retroactive” operational requirement that drives today’s policy mandates.14
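A minimal sketch of this check, using the values from the bank example above; in practice X, Y, and Z are organization-specific estimates rather than fixed constants.

```python
def mosca_gap(shelf_life_years, migration_years, years_to_crqc):
    """Return the number of years by which X + Y exceeds Z. A positive
    result means data encrypted today will still need confidentiality
    after a CRQC is assumed to exist."""
    return (shelf_life_years + migration_years) - years_to_crqc

# The worked example from the text: X = 30, Y = 5, Z = 15.
gap = mosca_gap(30, 5, 15)
print(f"Exposure window: {gap} years" if gap > 0 else "Within the safety margin")
# -> Exposure window: 20 years
```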
2. The New Global Standards: NIST FIPS 203, 204, and 205
The theoretical phase of PQC ended in August 2024. After a rigorous eight-year standardization process that began in 2016 and drew 69 complete first-round submissions, NIST published the finalized standards. These documents, FIPS 203, 204, and 205, provide the concrete technical specifications that global industry must now implement.15
2.1 The Selection Philosophy: Lattice-Based Dominance
NIST’s selection process heavily favored lattice-based cryptography. Lattices are geometric structures that extend into multidimensional space. The security of these algorithms relies on the hardness of problems like “Learning With Errors” (LWE) over module lattices. These problems involve finding the closest vector to a target point in a high-dimensional lattice, a task that is believed to be exponentially hard even for quantum computers.18
However, ensuring diversity was a key goal. To prevent a single mathematical breakthrough from compromising all standards, NIST also selected a hash-based signature scheme (SPHINCS+) as a backup. This strategy of “cryptographic diversity” ensures resilience.17
2.2 FIPS 203: ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism)
ML-KEM (formerly CRYSTALS-Kyber) is the primary standard for general encryption and key establishment. It is designed to replace Diffie-Hellman (DH) and RSA key exchange protocols in applications ranging from TLS handshakes to secure messaging.18
- Mechanism: It uses a Key Encapsulation Mechanism (KEM) rather than traditional key exchange. In this model, the sender generates a random symmetric key, encrypts (encapsulates) it using the receiver’s public key, and sends the ciphertext. The receiver then decrypts (decapsulates) it to recover the symmetric key (a minimal sketch of this flow appears after this list).
- Parameters: FIPS 203 defines three parameter sets: ML-KEM-512, ML-KEM-768, and ML-KEM-1024. These correspond roughly to the security levels of AES-128, AES-192, and AES-256.18
- Changes from Draft to Final: The final release in August 2024 included critical technical adjustments. Most notably, the “matrix indexing” order was reverted to match the original CRYSTALS-Kyber submission, a change driven by community feedback to simplify compliance. Additionally, “domain separation” was added to the key generation and encryption routines. Domain separation acts as a cryptographic namespace, ensuring that keys generated for one security level or purpose cannot be confused with or attacked via keys from another level, significantly hardening the algorithm against misuse.19
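The listing below sketches the encapsulate/decapsulate flow described in this subsection. It assumes the open-source liboqs-python bindings; the mechanism name "ML-KEM-768" depends on the installed liboqs version (older builds expose "Kyber768" instead), so treat the identifiers as illustrative rather than authoritative.

```python
# Requires the liboqs-python bindings ("pip install liboqs-python").
import oqs

ALG = "ML-KEM-768"  # assumption: liboqs build new enough to expose FIPS 203 names

# Receiver: generate a key pair and publish the public key.
receiver = oqs.KeyEncapsulation(ALG)
public_key = receiver.generate_keypair()

# Sender: encapsulate a fresh shared secret against that public key.
sender = oqs.KeyEncapsulation(ALG)
ciphertext, shared_secret_sender = sender.encap_secret(public_key)

# Receiver: decapsulate the ciphertext to recover the same secret.
shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver
print(f"public key: {len(public_key)} B, ciphertext: {len(ciphertext)} B")
# Expected for ML-KEM-768: a 1,184-byte public key and a 1,088-byte ciphertext.
```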
2.3 FIPS 204: ML-DSA (Module-Lattice-Based Digital Signature Algorithm)
ML-DSA (formerly CRYSTALS-Dilithium) is the primary standard for digital signatures. It is intended to verify the identity of senders and the integrity of data.18
- Performance: ML-DSA offers strong security and balanced performance. It features relatively fast signing and verification times, making it suitable for high-frequency applications like authentication handshakes.
- Technical Refinements: In the final FIPS 204 standard, NIST restored a “malformed input check” in the “hint unpacking” algorithm. This check had been removed in the draft but was deemed essential to prevent attackers from exploiting improperly formatted signatures. As with ML-KEM, domain separation was introduced to distinguish between “pure” message signing and “pre-hashed” message signing. This prevents cross-mode substitution attacks, in which a signature produced over a pre-hashed message could otherwise be passed off as a signature over the raw message, or vice versa.19
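A corresponding sign/verify sketch for ML-DSA, again assuming the liboqs-python bindings; the algorithm identifier "ML-DSA-65" (formerly "Dilithium3") is version-dependent and the message is a placeholder.

```python
# Same liboqs-python bindings as the ML-KEM sketch above.
import oqs

ALG = "ML-DSA-65"  # assumption: version-dependent name ("Dilithium3" on older builds)
message = b"firmware-image-v2.4.1"  # placeholder payload

signer = oqs.Signature(ALG)
public_key = signer.generate_keypair()
signature = signer.sign(message)

verifier = oqs.Signature(ALG)
assert verifier.verify(message, signature, public_key)
print(f"public key: {len(public_key)} B, signature: {len(signature)} B")
# Expected for ML-DSA-65: a 1,952-byte public key and a 3,309-byte signature.
```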
2.4 FIPS 205: SLH-DSA (Stateless Hash-Based Digital Signature Algorithm)
SLH-DSA (formerly SPHINCS+) is the standardized alternative for digital signatures. Unlike the lattice-based ML-DSA, SLH-DSA is based on hash functions.
- The “Backup” Role: Because its security proofs rely only on the security of the underlying hash function (e.g., SHA-2 or SHAKE), it is considered extremely conservative and secure. If a breakthrough in mathematics renders lattice problems easy to solve, SLH-DSA would remain secure.17
- Trade-offs: The cost of this security is performance and size. SLH-DSA produces significantly larger signatures and is slower to verify than ML-DSA. This makes it less ideal for real-time communications but excellent for code signing, software updates, or document archival where verification speed is less critical than long-term durability.17
Table 2: Comparative Overview of NIST PQC Standards
| Standard | Algorithm | Mathematical Basis | Primary Use Case | Key Attribute |
| FIPS 203 | ML-KEM | Module Lattices (LWE) | Key Establishment (Encryption) | Balanced speed/size; Primary KEM |
| FIPS 204 | ML-DSA | Module Lattices (LWE) | Digital Signatures | Fast verification; Primary Signature |
| FIPS 205 | SLH-DSA | Hash Functions | Digital Signatures | Conservative backup; Larger signatures |
3. Global Regulatory Fragmentation and Geopolitics
While the mathematical threat is universal, the regulatory response is fragmented. Governments are mandating PQC adoption on diverging timelines, creating a complex compliance matrix for multinational organizations.
3.1 United States: Aggressive National Security Mandates
The U.S. government has taken the most aggressive stance, driven by the National Security Agency (NSA). The Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) directive sets strict deadlines for National Security Systems (NSS).
- The “Exclusive Use” Mandate: The directive does not just call for “support” of PQC; it calls for the exclusive use of PQC, meaning RSA and ECC must be turned off completely by certain dates.
- Timeline:
- Immediately: Software and firmware signing transitions must begin.
- 2025: New software and firmware must “prefer” CNSA 2.0 algorithms. Web browsers and cloud services must support them.
- 2027: New hardware and software acquisitions for NSS must be compliant.
- 2030: Exclusive use of CNSA 2.0 for software/firmware signing and traditional networking equipment.
- 2033: Exclusive use for all browsers, servers, and cloud services.21
This timeline effectively gives defense contractors and federal suppliers a hard deadline of 2026-2027 to have products ready, as procurement cycles will block non-compliant vendors.23
3.2 United Kingdom: A Managed Transition
The UK’s National Cyber Security Centre (NCSC) has adopted a slightly more pragmatic approach, emphasizing a “managed transition” to avoid the risk of rushing out buggy cryptographic implementations.
- The Roadmap: The NCSC published a detailed timeline in 2025.
- 2028: Completion of discovery exercises and inventory planning.
- 2031: Migration of high-priority and critical systems.
- 2035: Full migration of all systems.25
- Pilot Schemes: To support this, the NCSC re-opened its PQC pilot scheme in late 2025. This initiative certifies consultancies to provide assured advice on PQC migration, ensuring that the private sector has access to vetted expertise.27
3.3 European Union: Sovereignty and Coordination
The EU faces the challenge of coordinating 27 member states. The European Commission and ENISA have issued a “Coordinated Implementation Roadmap.”
- Harmonization: The focus is on ensuring that member states do not adopt incompatible standards. The roadmap encourages national strategies to be defined by 2026.
- Hybrid Approach: ENISA explicitly recommends hybrid implementations (pre-quantum + post-quantum) as a near-term mitigation, recognizing that PQC standards are new and could conceivably harbor undiscovered vulnerabilities.28
3.4 Asia-Pacific: Divergence and Independence
The geopolitical fracture in cryptography is most visible in Asia.
- Japan: The National Cyber Command Office (NCO) released an interim report in late 2025, aligning its target with the West: full migration for government agencies by 2035. Japan emphasizes international collaboration, particularly with the US and UK.30
- South Korea: South Korea has taken a hybrid path, developing its own algorithms (KpqC) while monitoring NIST. In January 2025, it announced the winners of its KpqC competition, selecting local algorithms like “KpqC-DSA” to standardize alongside international options. They aim for a pilot transition between 2025 and 2028.31
- China: China is pursuing a distinct strategy of technological sovereignty. The Institute of Commercial Cryptography Standards (ICCS) launched a global call for PQC algorithms in 2025, signaling an intent to develop and standardize its own cryptographic suite rather than adopting the US-led NIST standards. This is driven by fears of foreign “backdoors” in NIST algorithms and a desire to control its own security stack. For global companies, this implies a future where they may need to support different encryption standards for their Chinese operations than for the rest of the world, complicating product and supply-chain strategies.32
4. Engineering the Post-Quantum Stack: Technical Challenges
Moving from policy to practice reveals the immense engineering challenges of PQC. These new algorithms are not simple drop-in replacements; they demand more memory, more bandwidth, and different handling.
4.1 The Cost of Security: Key and Signature Sizes
The most immediate impact of PQC is the significant increase in data sizes.
- RSA vs. ML-KEM: An RSA-2048 public key is roughly 256 bytes. An ML-KEM-768 public key is 1,184 bytes—over four times larger.
- ECC vs. ML-DSA: The contrast for signatures is even starker. An ECDSA (P-256) signature is a compact 64 bytes. An ML-DSA-65 signature is 3,309 bytes—a 50x increase.
- SLH-DSA: The hash-based backup, SLH-DSA, has extremely large signatures (roughly 8 KB to 50 KB depending on the parameter set, and about 17 KB for SLH-DSA-128f), making it impractical for many network protocols.34
Table 3: Performance and Size Comparison
| Metric | RSA-2048 | ECC (P-256) | ML-KEM-768 | ML-DSA-65 | SLH-DSA-128f |
| Public Key Size | ~256 Bytes | ~32 Bytes | 1,184 Bytes | 1,952 Bytes | ~32 Bytes |
| Signature/Ciphertext | ~256 Bytes | ~64 Bytes | 1,088 Bytes | 3,309 Bytes | ~17,000 Bytes |
| Key Gen Speed | Slow | Fast | Very Fast | Fast | Fast |
| Verification Speed | Fast | Medium | Fast | Medium | Slow |
34
4.2 Network Protocol Impact: Latency and Fragmentation
These larger sizes have tangible effects on network protocols.
- TLS Handshakes: In a TLS 1.3 handshake, the server must send its certificate chain and a signature. With ML-DSA, this payload grows significantly. If the total size of the handshake messages exceeds the initial TCP Congestion Window (usually ~14KB, or 10 packets), the server must wait for an ACK from the client before sending the rest. This introduces an extra Round Trip Time (RTT), increasing latency for establishing connections.37
- UDP Fragmentation: Protocols running over UDP (like DNSSEC or QUIC) face a harder limit: the Maximum Transmission Unit (MTU), typically 1,500 bytes. An ML-DSA signature (3.3KB) is larger than a single Ethernet frame. This forces IP fragmentation, where the packet is split into pieces. Fragmentation is fragile; if one fragment is lost, the whole message fails. Furthermore, fragmentation is often blocked by firewalls and can be exploited for Denial of Service (DoS) attacks.34 This makes ML-DSA challenging for DNSSEC, where compact signatures are crucial. A rough byte-count comparison of both effects follows this list.
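The back-of-the-envelope calculation below illustrates both effects. The certificate-chain layout (three certificates, each carrying a public key and an issuer signature, plus one handshake signature and fixed framing overhead) is a rough assumption for illustration; only the key and signature sizes come from Table 3.

```python
# Rough sizing model only; certificate-chain layout and overhead are assumptions.
INITCWND_BYTES = 10 * 1460   # ~14.6 KB: ten segments of ~1,460 B payload each
MTU_BYTES = 1500             # typical Ethernet MTU for a single UDP datagram

def first_flight_estimate(pubkey_bytes, sig_bytes, n_certs=3, overhead=2000):
    """Server's first flight: each certificate carries a public key and an
    issuer signature, plus one CertificateVerify signature and fixed framing."""
    return n_certs * (pubkey_bytes + sig_bytes) + sig_bytes + overhead

classical = first_flight_estimate(256, 256)        # RSA-2048 keys and signatures
post_quantum = first_flight_estimate(1952, 3309)   # ML-DSA-65 keys and signatures

for label, size in (("RSA-2048 chain", classical), ("ML-DSA-65 chain", post_quantum)):
    print(f"{label}: ~{size:,} B, exceeds initcwnd (extra RTT likely): {size > INITCWND_BYTES}")

# A single ML-DSA-65 signature already exceeds one unfragmented UDP datagram:
print(f"3,309 B signature fits in a {MTU_BYTES} B MTU: {3309 <= MTU_BYTES}")
```

Under these assumptions the ML-DSA chain overshoots the initial congestion window while the classical chain does not, and a single ML-DSA signature cannot travel in one unfragmented datagram.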
4.3 The “Hybrid” Bridge: X25519MLKEM768
To navigate the transition period, the industry has standardized “Hybrid Key Exchange.” This approach combines a battle-tested classical algorithm (X25519) with a post-quantum one (ML-KEM-768).
- The Combiner: The two algorithms are run in parallel. The final shared secret is derived by combining the output of both (a minimal combiner sketch follows this list).
- Security Logic: This creates a “defense in depth.” If ML-KEM turns out to have a hidden flaw, the connection remains as secure as X25519 (secure against classical computers). If a quantum computer attacks, X25519 breaks, but ML-KEM protects the connection.
- Adoption Status: This standard (identified as X25519MLKEM768, TLS named-group codepoint 0x11EC) is already deployed in Chrome, Firefox, and Edge. It is the default for high-security connections in Google Cloud and AWS.39
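A minimal sketch of the combiner idea, assuming the pyca/cryptography package for the classical half. In the deployed X25519MLKEM768 group, the ML-KEM-768 shared secret is concatenated with the X25519 shared secret and fed into the TLS 1.3 key schedule; a plain HKDF stands in for that schedule here, and random bytes stand in for the ML-KEM secret (see the ML-KEM sketch in Section 2.2).

```python
# Assumes the pyca/cryptography package; random bytes stand in for the ML-KEM secret.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import x25519
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical component: an ordinary X25519 exchange.
client_priv = x25519.X25519PrivateKey.generate()
server_priv = x25519.X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum component: placeholder for the ML-KEM-768 shared secret.
pq_secret = os.urandom(32)

# Combiner: an attacker must recover BOTH inputs to learn the derived key.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(pq_secret + classical_secret)
print(session_key.hex())
```

The point of the sketch is only the defense-in-depth property: compromising either input alone does not reveal the derived key.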
4.4 Hardware Roots of Trust
Cryptographic security ultimately rests on hardware. Hardware Security Modules (HSMs) are the vaults that store master keys.
- Thales: Thales has released firmware (v7.9+) for its Luna Network HSMs that natively supports ML-KEM and ML-DSA within the FIPS boundary. They utilize “Functionality Modules” (FMs) to allow agility, enabling firmware updates to add new algorithms without replacing the hardware.42
- Entrust: Entrust’s nShield HSMs (Firmware v13.8.0) also support the full NIST suite and have been submitted for FIPS 140-3 validation. This hardware readiness is a critical prerequisite for the financial and government sectors to begin migration.44
5. Case Studies in Adoption: Leaders vs. Laggards
The adoption of PQC is not uniform. A sharp divide exists between the “hyperscalers” and legacy industries.
5.1 The Tech Giants: Protecting Data in Transit
Technology providers have moved fastest, driven by the HNDL threat to their cloud customers.
- Apple (PQ3): Apple’s iMessage protocol, PQ3, is perhaps the most sophisticated mass deployment of PQC. It employs post-quantum cryptography not just for the initial handshake, but for re-keying. This provides “Post-Compromise Security.” Even if an attacker steals a user’s key today, the protocol automatically heals itself with new quantum-safe keys, preventing the attacker from reading future messages. This sets a new global standard for consumer privacy.45
- Signal (PQXDH): Signal upgraded its X3DH protocol to PQXDH, adding a layer of CRYSTALS-Kyber (ML-KEM) encapsulation. This ensures that the initial exchange of secrets is immune to quantum harvesting.48
- Cloudflare: By late 2025, Cloudflare reported that the majority of human-initiated traffic crossing its network was protected by hybrid PQC. This widespread deployment proves that the performance overhead of PQC is manageable at internet scale.39
5.2 The Financial Sector: A Dangerous Lag
In contrast, the financial sector is dangerously behind. Research indicates that as of 2025, only 3% of banking websites supported PQC key exchange. Even more concerning, a quarter of top websites still lacked support for TLS 1.3, a prerequisite for modern PQC integration.49
- Mainframe Dependency: Banks rely on decades-old mainframe infrastructure (e.g., IBM z/OS). Upgrading cryptography in these environments is not a software patch; it is a major re-platforming exercise.
- Risk Exposure: This lag is critical because financial data has extreme shelf-life requirements. A mortgage or insurance contract signed today must remain confidential for 30+ years. The failure of banks to protect this data against HNDL represents a massive systemic risk.49
6. Liability, Governance, and Business Risk
The conversation around PQC has shifted from the server room to the boardroom. The failure to migrate is now a legal and fiduciary liability.
6.1 GDPR and “State of the Art”
The General Data Protection Regulation (GDPR) mandates that data controllers implement “state of the art” security measures (Article 32). Legal scholars argue that now that NIST standards are final, PQC is the state of the art.
- Retrospective Liability: If an organization continues to use RSA to protect long-lived genetic or health data, and that data is harvested today and decrypted in 2030, regulators could rule that the organization was negligent today. The HNDL threat is “foreseeable,” and failing to mitigate a foreseeable risk is a cornerstone of liability.51
6.2 The “Unreasonable” Practice Doctrine
In the United States, the Federal Trade Commission (FTC) takes action against companies with “unreasonable” security practices. Once the federal government (via CNSA 2.0) mandates PQC for its own systems, it establishes a benchmark for what is “reasonable.” Companies that experience a breach due to quantum decryption in the future may find themselves defenseless in court if they cannot prove they had a PQC migration plan in place when the standards became available.53
7. Strategic Recommendations: A Roadmap for Resilience
For C-suite executives and security leaders, the path forward requires a structured, multi-year program.
7.1 Phase 1: Cryptographic Discovery (Immediate)
Organizations cannot replace what they cannot find. The first step is deploying “Cryptographic Discovery” tools to scan the network, code repositories, and third-party libraries.
- Goal: Create a dynamic inventory of where RSA/ECC is used (a toy discovery scan is sketched after this list).
- Standard: Use NIST SP 800-227 as a guide for identifying key encapsulation mechanisms and assessing their usage contexts.54
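As a starting point, even a crude source-tree scan surfaces obvious call sites. The sketch below is a toy discovery pass; the pattern list and file extensions are illustrative assumptions, and real discovery tooling also inspects binaries, certificates, key stores, and network captures.

```python
# Toy discovery pass over a source tree; pattern list and extensions are illustrative.
import pathlib
import re

VULNERABLE = re.compile(r"\b(RSA|DSA|ECDSA|ECDH|ECDHE)\b|secp256r1|prime256v1", re.IGNORECASE)
SOURCE_EXTENSIONS = {".py", ".java", ".go", ".c", ".cpp", ".conf", ".yaml"}

def scan_repo(root):
    """Yield (file, line number, line) for every match under `root`."""
    for path in pathlib.Path(root).rglob("*.*"):
        if path.suffix not in SOURCE_EXTENSIONS:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), 1):
            if VULNERABLE.search(line):
                yield path, lineno, line.strip()

for hit in scan_repo("."):
    print(*hit, sep=" | ")
```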
7.2 Phase 2: Crypto-Agility and Hybrid Pilots (2025-2026)
- Hybrid by Default: Configure all web servers, load balancers, and gateways to prefer Hybrid TLS (X25519+ML-KEM). This provides immediate HNDL protection with near-zero risk of breakage.39
- Vendor Accountability: Audit the PQC roadmaps of all critical vendors. If a SaaS provider or firewall vendor does not have a confirmed date for FIPS 203 support, they are a supply chain vulnerability.
- Internal PKI Pilot: Establish a parallel internal PKI using ML-DSA. Issue quantum-safe certificates for non-critical internal services to test the impact of larger certificate chains on network latency.58
7.3 Phase 3: Strategic Migration (2027-2030)
- Prioritize by Shelf-Life: Use the Mosca inequality to prioritize migration. Systems processing data with a shelf-life > 10 years must be migrated first (a simple ranking sketch follows this list).
- Hardware Refresh: Budget for the replacement of legacy HSMs, smart cards, and IoT devices that lack the memory or processing power to handle PQC keys.42
- Policy Enforcement: Set a hard deadline (aligned with CNSA 2.0’s 2030 date) to deprecate RSA/ECC for all internal authentication.
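Building on the Mosca sketch from Section 1.3, the ranking below orders an inventory by how far X + Y overshoots Z, so that the worst-exposed systems migrate first. The inventory entries and the 15-year CRQC planning assumption are invented placeholders.

```python
# Inventory entries and the 15-year CRQC assumption are invented placeholders.
YEARS_TO_CRQC = 15   # Z: a planning assumption, not a prediction

inventory = [
    # (system, X: data shelf-life in years, Y: estimated migration years)
    ("mortgage-archive", 30, 5),
    ("patient-records",  70, 7),
    ("payment-gateway",  10, 3),
    ("internal-wiki",     2, 1),
]

def exposure(entry):
    """Years by which X + Y overshoots Z; positive means already late."""
    _, shelf_life, migration = entry
    return (shelf_life + migration) - YEARS_TO_CRQC

for name, x, y in sorted(inventory, key=exposure, reverse=True):
    gap = (x + y) - YEARS_TO_CRQC
    status = f"exposed by {gap} years" if gap > 0 else "within margin"
    print(f"{name:18} X={x:>3}  Y={y}  -> {status}")
```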
Conclusion
The transition to Post-Quantum Cryptography is a singularity in the history of information security. It is rare for the industry to know in advance that a current security model will fail; with quantum computing, the failure is foreseeable even if its exact date is not. The convergence of the “Harvest Now, Decrypt Later” threat, the finalization of NIST standards, and the aggressive regulatory mandates from the US, UK, and EU creates an environment where inaction is no longer a calculated risk—it is a compliance failure.
The finalized FIPS 203, 204, and 205 standards provide the tools. The “Hybrid” protocols provide the bridge. The regulatory timelines provide the schedule. The only variable remaining is organizational will. Organizations that begin their migration today will secure their digital trust for the next era of computing. Those that wait will find that by the time the quantum threat is visible, their data has already been lost.
List of Tables
Table 1: Mosca’s Theorem Parameters
Table 2: Comparative Overview of NIST PQC Standards
Table 3: Performance and Size Comparison
