What the EU’s PQC roadmap means on the ground
In this Help Net Security interview, David Warburton, Director at F5 Labs, discusses how the EU’s Post-Quantum Cryptography (PQC) roadmap aligns with global efforts and addresses both the technical and regulatory challenges of migrating to PQC. Warburton also outlines practical steps organizations must take to ensure cryptographic agility and long-term data protection.
How does the EU’s PQC roadmap align with global efforts, such as those from NIST and ETSI? Are there any key differences or unique priorities?
The EU’s PQC roadmap is broadly aligned with NIST’s; both advise a phased migration to PQC using hybrid-PQC ciphers and hybrid digital certificates. These hybrid solutions provide the security promises of brand-new PQC algorithms, whilst allowing legacy devices that do not support them to continue using what’s now being called ‘classical cryptography’. In the first instance, both the EU and NIST recommend that non-PQC encryption be removed from critical systems by 2030, with all others following suit by 2035.
While both acknowledge the ‘harvest now, decrypt later’ threat, neither emphasises the importance of understanding the cover time of data, nor references the very recent advancements in quantum computing. With many now predicting the arrival of cryptographically relevant quantum computers (CRQC) by 2030, any organization or government holding information with a cover time of five years or more may already be too late to complete a move to PQC in time.
Perhaps the most significant difference that EU organizations will face compared to their American counterparts is that the European roadmap is more than just advice; in time it will be enforced through various directives and regulations. PQC is not explicitly named in EU regulation, although that is not surprising. GDPR and many other technology-focussed EU laws, like NIS2 (2022) and the Cyber Resilience Act (2024), take a risk-based approach to security. Rather than defining specific standards and protocols, they require organizations to maintain up-to-date policies and adhere to industry best practice.
NIST guidance is more explicit about specific protocols, but it doesn’t enforce action. While NIST has codified the new PQC algorithms into formal standards, it will be up to industry regulators to adopt them and insist upon their implementation.
What role do EU member states and institutions play in harmonizing the rollout of PQC across sectors and borders? How challenging is that coordination?
When we take a moment to appreciate how much we depend upon cryptography, it becomes clear that the challenge of migrating to PQC is enormous. Encryption is used in cellular communications, email and document signing, messaging apps, biometric systems, mobile banking, smart meters, Bluetooth and Wi-Fi, remote access VPNs, ePassports, some electronic voting systems, healthcare platforms, and web and API security. So the technical aspect of the PQC rollout is immense; yet it pales in comparison to the policy and logistical aspects of harmonising PQC deployments, particularly across so many member states in the EU.
Consider all of the use cases of cryptography given above. One of the biggest challenges lies in ensuring that all 27 member states, and the dozens of impacted sectors, implement PQC consistently, on similar timelines, and with compatible infrastructure. The EU roadmap provides shared milestones, but execution varies widely depending on each state’s cybersecurity maturity, regulatory interpretation, and access to technical resources.
The web has seen the most rapid progress towards PQC adoption, with hybrid-PQC ciphers already adopted in 8.6% of the world’s top 1 million websites, according to our latest research. But hybrid-PQC certificates, which are essential for authenticating TLS connections and smartcards, are still in the early stages of integration across tools and platforms.
Ultimately, harmonisation is about more than making cryptography work; it’s about aligning policy, procurement, testing and trust infrastructure across borders. That’s a far more complex task than choosing the right algorithm. The EU is attempting to address this through coordinated roadmaps, funding for pilot projects and ENISA-led collaboration, but ensuring synchronised, secure deployment across such a diverse landscape remains one of the biggest challenges in the post-quantum transition.
What does the migration path from classical to post-quantum algorithms look like in practice for enterprise environments or government systems?
The migration to post-quantum cryptography will be complex, but much of that complexity lies in planning rather than in the technical implementation itself. One of the most critical challenges is ensuring that both ends of a communication, such as a web browser and server, an IoT device and its cloud management platform, or a mobile app and the APIs it connects to, can support the same cryptographic algorithms.
This requires a comprehensive inventory of all systems performing cryptographic operations. Some PQC algorithms will introduce performance overhead, particularly on constrained devices, and many legacy or low-cost IoT devices may not support PQC at all, necessitating hardware replacements in some cases.
While the public internet is making slow but steady progress toward a PQC-ready web, internal systems and non-production environments are often overlooked. It’s common to see encryption standards deployed quickly on customer-facing websites, while internal APIs, test, or development platforms remain stuck on outdated protocols. These neglected systems are frequently targeted by threat actors precisely because they lag behind in security.
This is why maintaining a detailed cryptographic asset register is essential. It enables organizations to assess the risk at each endpoint and ensures that all externally accessible services, whether production or not, receive the same level of cryptographic protection.
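A cryptographic asset register can start out very simply. The sketch below is illustrative only; the field names and the hybrid-group naming are assumptions, not a standard schema. It flags externally reachable endpoints, production or not, that still lack a hybrid PQC key exchange:

```python
from dataclasses import dataclass

@dataclass
class CryptoAsset:
    """One entry in a cryptographic asset register (illustrative fields)."""
    name: str                   # system or endpoint identifier
    protocol: str               # e.g. "TLS 1.2", "TLS 1.3"
    key_exchange: str           # e.g. "ECDHE", "X25519MLKEM768"
    externally_reachable: bool  # accessible from outside the organization
    environment: str            # "production", "test", "dev", ...

def classical_only_external(assets):
    """External endpoints still negotiating classical-only key exchange."""
    return [a for a in assets
            if a.externally_reachable and "MLKEM" not in a.key_exchange]

register = [
    CryptoAsset("www", "TLS 1.3", "X25519MLKEM768", True, "production"),
    CryptoAsset("api-test", "TLS 1.2", "ECDHE", True, "test"),
    CryptoAsset("internal-ci", "TLS 1.2", "ECDHE", False, "dev"),
]

for asset in classical_only_external(register):
    print(f"needs attention: {asset.name} ({asset.environment})")
```

Note that the test API in this register is flagged even though it is not production, which is exactly the point: anything externally reachable deserves the same scrutiny.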
Our recent analysis of the top one million websites on the internet revealed that a significant portion still do not support TLS 1.3. From a migration standpoint, adopting TLS 1.3 should be the first step, as it provides the necessary foundation for future PQC support.
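As a minimal sketch of that first step, Python’s standard-library `ssl` module can build a client context that refuses to negotiate anything below TLS 1.3; wrapping a socket with it against a server stuck on TLS 1.2 will simply fail the handshake, which makes it a quick way to audit your own endpoints:

```python
import ssl

def tls13_only_context() -> ssl.SSLContext:
    """Client context that refuses to negotiate anything below TLS 1.3."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    return ctx

ctx = tls13_only_context()
print(ctx.minimum_version == ssl.TLSVersion.TLSv1_3)  # True
```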
Interoperability is a big concern in crypto migration. How is the EU addressing compatibility between legacy systems and PQC solutions during the transition phase?
The EU places an emphasis on first mapping and creating an inventory of cryptographic systems within an organization. I would extend this advice and recommend that organizations consider where cryptographic operations can be consolidated. The more systems that encrypt and decrypt communications, the more complex it becomes to safeguard cryptographic key management and to ensure that each system supports the same ciphers and algorithms.
Support for cryptographic agility and a quantum-safe upgrade path is also called out by the EU. Simply put, this means ensuring that the platforms you choose to perform cryptographic operations can easily swap certificates, protocols and ciphers, all without disrupting the underlying application. Capable cryptographic solutions will even allow for the use of different algorithms based on the device making a request. Sensitive and mission-critical services can be forced to make use of hybrid-PQC cryptography, whilst legacy and less sensitive communications can be allowed to maintain their use of classical cryptography. This dynamic crypto agility is made possible by decoupling encryption from the underlying application.
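That per-device selection can be sketched as a simple policy function. This is purely illustrative: the group names follow IETF TLS naming, but the policy itself, and the idea of a `mission_critical` flag, are assumptions for the sake of the example:

```python
# Illustrative crypto-agility policy: prefer a hybrid classical+PQC key
# exchange when the client offers one; fall back to classical groups for
# legacy clients, unless the service is mission-critical.
HYBRID_GROUPS = ("X25519MLKEM768",)          # hybrid classical+PQC
CLASSICAL_GROUPS = ("x25519", "secp256r1")   # classical-only fallback

def select_group(client_offered, mission_critical=False):
    """Pick a key-exchange group from what the client offers."""
    for group in HYBRID_GROUPS:
        if group in client_offered:
            return group
    if mission_critical:
        raise ValueError("hybrid PQC required but client offers none")
    for group in CLASSICAL_GROUPS:
        if group in client_offered:
            return group
    raise ValueError("no mutually supported group")

print(select_group(["X25519MLKEM768", "x25519"]))  # X25519MLKEM768
print(select_group(["x25519"]))                    # x25519
```

In a real deployment this decision lives inside the TLS termination layer, not application code, which is precisely what decoupling encryption from the application buys you.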
Cryptographic solutions are already implementing support for PQC ciphers. However, from our analysis of global web traffic, we see a huge gap in device and web browser support. These hybrid dual-algorithm handshakes ensure that even if one party doesn’t support PQC, the session can still be secured using a traditional cipher, while gaining protection from the post-quantum side if both parties are capable. This means organizations can maintain interoperability with legacy systems while gradually introducing PQC.
SaaS providers and platforms already support these hybrid schemes, often requiring only configuration changes rather than full re-architecting. That’s the pathway we see gaining traction, not just in the EU, but globally.
What’s the EU’s guidance for handling data that must remain confidential for decades, the so-called “harvest now, decrypt later” scenario?
The ‘harvest now, decrypt later’ threat model is perhaps the most urgent and underappreciated driver for the adoption of PQC. It represents a critical risk to organizations handling data with long ‘cover time’, meaning information that must remain confidential for years or even decades. EU guidance, including ENISA and the Post-Quantum Roadmap, emphasises early identification of such data and the need for quantum-safe protections, particularly in critical sectors like healthcare, government and finance.
For example, medical records or foreign intelligence retain value over time and may be harvested today in the knowledge that quantum computers will eventually decrypt them. If your systems still rely on non-PQC algorithms (RSA, ECDHE, etc.), data compromised in transit or storage could be exposed retroactively.
Our guidance is straightforward: apply the formula:
PQC Deployment Date = Q-Day – Cover Time
If we assume Q-Day to be 2030 and a cover time of five years, then organizations should be deploying PQC now, in 2025. Waiting even a year longer puts today’s sensitive data at risk of future compromise. This isn’t a problem that can be deferred until quantum computers are a reality; it demands immediate action.
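In code, the rule of thumb above reads as follows (the 2030 Q-Day estimate is the assumption stated earlier, not a certainty):

```python
def pqc_deployment_deadline(q_day: int, cover_time_years: int) -> int:
    """Latest year to deploy PQC so data stays confidential through its cover time."""
    return q_day - cover_time_years

# Assuming Q-Day in 2030 and data with a five-year cover time:
print(pqc_deployment_deadline(2030, 5))  # 2025
```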