The paradox of post-quantum crypto preparedness

Preparing for post-quantum cryptography (PQC) is a paradox: on the one hand, we don’t know for sure when, or perhaps even if, a large quantum computer will become available that can break all current public-key cryptography. On the other hand, the consequences would be terrible – hijacked code updates, massive sensitive data exposure – and the migration process so complicated that we have no choice but to start preparing now. But what can we do, without wasting resources, to be ready and to reassure our customers that we’re ready?


Fortunately, there is a way to prepare for PQC that not only mitigates risk, but also gives us a number of immediate security and resilience benefits. Two recent reports, one from NIST and another from ENISA, hold the key. In this article, we’ll show how you can use the takeaways from these reports to build a PQC plan for your organization with an instant return on investment, even if a large quantum computer turns out to be decades away.

The right kind of inventory

The NIST report, Getting Ready for Post-Quantum Cryptography, covers the development of an inventory and a migration playbook. It is common sense to start your post-quantum planning with an inventory of the cryptography you use, but as the report makes clear, just listing the algorithms each application employs is a waste of time. For the inventory to have value, it must detail how the cryptography is used, and for what. NIST recommends that you identify automated tools to assist with this task, since retrieving this level of detail by hand would be far too time-consuming.
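
As a minimal illustration of what such tooling might do, the sketch below walks a source tree and flags calls into common cryptographic libraries, recording where each one appears. The library names and patterns are assumptions made for the example; real inventory tools cover many more languages and also inspect binaries, TLS configurations, key stores, and certificates, and the output is only the starting point for recording how and why each usage exists.

```python
import re
from pathlib import Path

# Illustrative patterns only: a real tool would cover far more libraries and
# languages, and record context (purpose, key location, data protected, ...).
CRYPTO_PATTERNS = {
    "RSA key generation": re.compile(r"rsa\.generate_private_key|RSA\.generate"),
    "Elliptic-curve use": re.compile(r"\bec\.(generate_private_key|ECDH|ECDSA)\b"),
    "AES cipher":         re.compile(r"algorithms\.AES|AES\.new"),
    "Hashing":            re.compile(r"hashlib\.(md5|sha1|sha256|sha512)"),
}

def scan_tree(root: str):
    """Yield (file, line number, category, matching line) for each hit."""
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            for category, pattern in CRYPTO_PATTERNS.items():
                if pattern.search(line):
                    yield path, lineno, category, line.strip()

if __name__ == "__main__":
    for path, lineno, category, line in scan_tree("."):
        print(f"{path}:{lineno}: {category}: {line}")
```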

The reason the inventory needs this rich information becomes clear when you come to build the migration playbook. PQC won’t be a simple drop-in replacement for existing cryptography. There will most likely be a number of candidate algorithms with different trade-offs in terms of execution time, key size, output size, and other characteristics.

For each current use of public-key cryptography, you will have to choose the replacement that best suits the constraints of that usage. NIST gives a 15-point bulleted list of considerations about each usage that you will need to take into account. These include technical considerations, such as performance, but also business considerations, such as product lifetime and compliance requirements.

The value of NIST’s 15-point list is that it can guide the cryptography inventory work. By making sure the inventory contains enough information to answer these 15 questions, we ensure that we’re preparing in the most efficient way possible to build the migration playbook when the time comes.
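
To make that concrete, one way to structure an inventory record is sketched below. The field names are illustrative rather than NIST’s exact wording, but they map onto the kinds of questions the 15-point list asks: what the cryptography protects, its performance and size constraints, how long it must remain secure, and which compliance regimes apply. The example record at the end is entirely hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CryptoUsage:
    """One entry in the cryptography inventory (illustrative fields only)."""
    application: str                       # system or product using the cryptography
    purpose: str                           # e.g. "firmware signature verification"
    algorithm: str                         # e.g. "RSA-2048", "ECDSA P-256"
    key_location: str                      # HSM, software keystore, hard-coded, ...
    protected_data_lifetime_years: int     # how long the protected data must stay secure
    latency_budget_ms: float               # performance constraint on the operation
    max_key_size_bytes: int                # protocol or hardware limit on key size
    max_signature_size_bytes: int          # limit on signature or ciphertext size
    product_end_of_life: date              # how long the product must keep working
    compliance_requirements: list[str] = field(default_factory=list)

# Hypothetical record for an embedded product that verifies signed firmware.
usage = CryptoUsage(
    application="smart-meter-firmware",
    purpose="firmware signature verification",
    algorithm="ECDSA P-256",
    key_location="ROM, hard-coded public key",
    protected_data_lifetime_years=0,
    latency_budget_ms=200.0,
    max_key_size_bytes=64,
    max_signature_size_bytes=96,
    product_end_of_life=date(2045, 1, 1),
    compliance_requirements=["internal crypto policy"],
)
```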

The good news about this kind of rich cryptography inventory is that it delivers immediate business benefits, long before the arrival of a large quantum computer, as part of good cryptography management practice. First, it will allow you to eliminate weak or non-compliant cryptography that may be in use. Mistakes with cryptography are among the most common application security flaws, and errors such as hard-coded keys or weak block cipher modes are easily exploited. Second, it can be used to demonstrate compliance to auditors, who are paying increasingly close attention to the use of cryptographic controls. Finally, it has benefits for resilience: an inventory that includes cryptographic keys and certificates allows for rapid response to compromises. For these business benefits to be realized, the inventory needs to be kept up to date at all times – another reason why automation is key.
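
As a small example of the resilience angle, the sketch below uses the pyca/cryptography package (an assumption about your toolchain) to read PEM certificates from a directory and report the key type, signature hash, and expiry date. This is the kind of data an up-to-date inventory would hold so that weak, expiring, or compromised certificates can be found and replaced quickly.

```python
from datetime import datetime, timezone
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def report_certificates(cert_dir: str):
    """Print key type, signature hash, and expiry for every PEM certificate found."""
    now = datetime.now(timezone.utc)
    for path in sorted(Path(cert_dir).glob("*.pem")):
        cert = x509.load_pem_x509_certificate(path.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            key_desc = f"RSA-{key.key_size}"
        elif isinstance(key, ec.EllipticCurvePublicKey):
            key_desc = f"EC-{key.curve.name}"
        else:
            key_desc = type(key).__name__
        sig_hash = cert.signature_hash_algorithm.name if cert.signature_hash_algorithm else "n/a"
        expires = cert.not_valid_after_utc   # requires a recent pyca/cryptography release
        status = "EXPIRED" if expires < now else "valid"
        print(f"{path.name}: {key_desc}, signed with {sig_hash}, "
              f"expires {expires:%Y-%m-%d} ({status})")

report_certificates("certs")
```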

Mitigating today’s quantum risk

Setting up the inventory and the migration playbook is all we need if we have time to wait for the NIST standardization process to conclude. But what if we need PQC now? This might be the case if we have data that needs to be kept confidential for the long term, or if we’re shipping an embedded product now that will need to verify code updates for the next 20 years. Or we might just need to reassure our customers that our critical applications will continue to function securely if a large quantum computer becomes available sooner than expected.

The ENISA report Post Quantum Cryptography: Current State and Quantum Mitigation gives a roadmap for this situation. There are essentially two options if you need to mitigate quantum risk now: use a hybrid post-quantum and pre-quantum scheme, or use pre-shared keys.

Under a hybrid scheme, you select one of the most promising candidates from the NIST standardization process and use it in combination with a classical scheme such as RSA. You need to combine them in such a way that the security of either algorithm on its own is enough to guarantee the security of the result, whether that result is a signature on a document or a key to be used for encrypting data. Researchers have been experimenting with these ideas for some time, and the ENISA report gives a succinct description of how to do this.
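
A minimal sketch of the key-establishment case, under stated assumptions, is below. It derives a session key from the concatenation of a classical X25519 shared secret and a post-quantum KEM shared secret, so that an attacker must break both schemes to recover the key. The post-quantum secret is stubbed with random bytes here because library choices vary; in practice it would come from whichever KEM implementation you select (for example, one of the open-source Kyber/ML-KEM packages). The combiner uses HKDF from the pyca/cryptography package, and the context label is illustrative.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def combine_secrets(classical_secret: bytes, pq_secret: bytes) -> bytes:
    """Derive one session key from both secrets: breaking either scheme
    alone is not enough to recover the key."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-kex-demo",           # illustrative context label
    ).derive(classical_secret + pq_secret)

# Classical part: an ordinary X25519 key agreement (only one side shown here).
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
classical_secret = alice.exchange(bob.public_key())

# Post-quantum part: placeholder bytes standing in for the shared secret a
# post-quantum KEM (e.g. an ML-KEM/Kyber implementation) would produce.
pq_secret = os.urandom(32)

session_key = combine_secrets(classical_secret, pq_secret)
print(session_key.hex())
```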

In practice, it is likely that PQC will be phased in this way even after the standardization process concludes, while confidence builds in the security of the selected candidates, so knowledge gained from these experiments will be invaluable. Open-source implementations of all the remaining candidates in the process are already available.

Using pre-shared keys involves injecting extra key material into an exchange so that the final result stays secure even if the public-key scheme is later broken. This is only feasible if you can maintain state for these exchanges. Again, this practice has been known in the research world for some time, but the ENISA report explains how you can experiment with it yourself.
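
The sketch below shows the same idea in miniature: a pre-shared key held by both parties is mixed into the key derivation alongside an ephemeral X25519 secret, so the derived key stays secret even if the public-key exchange is later broken. The parameter choices and labels are illustrative assumptions; TLS 1.3’s PSK-with-(EC)DHE modes follow a similar pattern with a more elaborate key schedule.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_with_psk(ecdh_secret: bytes, pre_shared_key: bytes) -> bytes:
    """Mix a pre-shared key into the key derivation so the result stays
    secret even if the public-key exchange is broken later."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=pre_shared_key,               # PSK as the HKDF salt (illustrative choice)
        info=b"psk-mixed-session-key",
    ).derive(ecdh_secret)

# Ephemeral public-key exchange plus a key both parties already share out of band.
client = X25519PrivateKey.generate()
server = X25519PrivateKey.generate()
ecdh_secret = client.exchange(server.public_key())
pre_shared_key = b"\x01" * 32              # placeholder for a real provisioned PSK

print(derive_with_psk(ecdh_secret, pre_shared_key).hex())
```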

Equipped with the right kind of cryptographic inventory, a migration playbook, and knowledge gained from hybrid-scheme experiments, we can take a rational approach to the post-quantum paradox, with no need to panic or play down the threat. Adopting good cryptography management practices now gives us immediate benefits – and peace of mind for us and our customers for the post-quantum future.
