[cabfpub] Contingency planning for Quantum Cryptanalysis
Phillip Hallam-Baker
philliph at comodo.com
Tue Apr 19 17:41:54 UTC 2016
One of the statements made at IETF 95 was that various parties believe they can build a Quantum Computer with ‘significant’ capabilities as early as 2020.
It is thus important that we start looking at contingency planning as an industry. Even if the claims are wildly optimistic and it will be decades before such a machine can break current public key algorithms, we will have to deal with press and, increasingly, government concerns. If we don’t have a contingency plan, we risk having to create one in a hurry.
As a general rule, I have always wanted to keep the WebPKI at least ten years ahead of the state of the art in cryptanalysis. We have a long tail of legacy browsers that can be attacked. As it says in Bill 14:7, “Windows XP will always be with you”.
We do not know at this stage what restrictions there will be on the purported QC machines, or whether it will prove possible to maintain coherence for enough qubits to break public key algorithms. What we do know is that RSA, DH, and ECDH are all affected. We also know that our symmetric algorithms are not: AES and SHA-2/SHA-3 remain strong.
Another bit of gut instinct I have is that any new system is likely to be a variant of Diffie-Hellman. RSA is vulnerable because it is a trapdoor system: the encryption and decryption operations are symmetrical. We can imagine QC approaches as the wave-mechanics equivalent of filling up a swimming pool and letting it find its own level. Diffie-Hellman does not have that symmetry. The scheme works by taking two different paths through a forest to reach the same destination: Alice and Bob take different paths to arrive at the same key, but neither path needs to be reversible for the system to work.
So what is to be done?
One option we do not have is to deploy a new Quantum-resistant algorithm today. There are proposals out there, but it will take us considerable time to get comfortable with any of them. And that is not the only criterion: we would also have to have clear intellectual property rights before we could use them. And even if such algorithms exist, they may not be anywhere near as fast as RSA, let alone match the Elliptic Curve systems we are currently getting excited about.
Breaking down the problem into stages, I see two separate problems: one is developing a new Quantum-Cryptanalysis-Resistant (QCR) PKI, and the second is how to deploy it using the assets we already have in place.
So right now, for purposes of contingency planning, I would like to focus on the second. In particular I would like to ask what we can do now to deploy QCR roots of trust that we can use as anchors for the eventual QCR-PKI.
It seems to me that the obvious approach is to make use of Lamport’s hash signatures in combination with the various efficiency improvements proposed by Diffie, Merkle, and Winternitz. CFRG is already working on a draft:
https://datatracker.ietf.org/doc/draft-mcgrew-hash-sigs/
The principle of hash signatures is fairly straightforward. I start with two random numbers, X and NX, take a hash of each to produce H(X) and H(NX), and then calculate V = H(H(X) + H(NX)), where + denotes concatenation. The value V is my ‘public key’. To assert the value 1, I release X; to assert the value 0, I release NX.
This approach can be quickly shown to be secure provided that (1) the hash function is secure and (2) only one of the two values is ever released.
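For concreteness, the one-bit scheme just described can be sketched in a few lines of Python (SHA-256 standing in for the hash H; the function names are mine, not from the draft). The signature for a bit reveals one preimage plus the commitment to the other, so the verifier can recompute V:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # X commits to the bit 1, NX commits to the bit 0.
    x, nx = secrets.token_bytes(32), secrets.token_bytes(32)
    v = H(H(x) + H(nx))          # public key V = H(H(X) + H(NX))
    return (x, nx), v

def sign_bit(sk, bit):
    x, nx = sk
    # Release exactly one preimage, plus the hash of the other so the
    # verifier can reconstruct V. A key pair must never be reused.
    return (x, H(nx)) if bit == 1 else (H(x), nx)

def verify_bit(v, bit, sig):
    a, b = sig
    return H(H(a) + b) == v if bit == 1 else H(a + H(b)) == v
```

Signing an n-bit message hash takes n such key pairs, which is why the raw scheme produces such large signatures.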
The efficiency improvements allow the work factor of the signature to be increased to a reasonable strength (e.g., 128 or 256 bits) without creating unreasonably long signatures. Even so, using hash signatures has two important disadvantages relative to RSA or ECDH:
1) The signatures are relatively large (1 KB to 8 KB, depending on strength and precomputation).
2) Each public key can only be used to sign a single document.
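To illustrate the Winternitz idea behind those efficiency improvements: one hash chain can sign several bits at once, trading signature size for extra hash invocations. The sketch below is my own simplification, signing a 4-bit value with a single chain; the full scheme adds checksum chains, without which a forger could ‘advance’ a released signature to cover a larger value.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def chain(start: bytes, steps: int) -> bytes:
    # Iterate the hash 'steps' times.
    node = start
    for _ in range(steps):
        node = H(node)
    return node

W = 4                # bits signed per chain
TOP = 2**W - 1       # length of the hash chain

sk = secrets.token_bytes(32)
pk = chain(sk, TOP)  # public key is the end of the chain

def sign(value: int) -> bytes:
    # Walk 'value' steps up the chain from the secret key.
    return chain(sk, value)

def verify(value: int, sig: bytes) -> bool:
    # Walking the remaining steps must land exactly on the public key.
    return chain(sig, TOP - value) == pk
```

One chain thus replaces four one-bit key pairs, at the cost of up to 2^W - 1 hash computations per chain.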
There are in fact ways to construct a WebPKI-type infrastructure using hash signatures, and we may even end up having to resort to some of them, particularly for low-power devices. In particular:
* Distribute Merkle trees of public key values.
* Adopt a ‘use one, make one’ approach to distribution.
* Engage hash chain logs to provide reference truth.
* Use GPU farms and/or Bitcoin-mining equipment to construct large Merkle trees; the hardware that uses the trees can be more modest.
Note that even if some smart person comes up with a perfect QCR algorithm tomorrow that is obviously secure and a drop-in replacement for RSA, I still want a hash signature alternative so that they don’t have unlimited bargaining power.
So what I think should be done now is for CAs, assisted by the technical community, to agree on a mechanism for deploying QCR contingency trust roots that can be validated under the current PKI. These would consist of the head of a Merkle tree containing ‘some number’ (hundreds? thousands? millions?) of public key commitments for ‘just in case’ use.
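As a sketch of what such a contingency root could look like mechanically (my own illustration, not a concrete proposal): the CA publishes only the Merkle root over its batch of one-time public keys, and each key is later validated with an authentication path of sibling hashes, logarithmic in the batch size.

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(leaves):
    # Bottom-up list of tree levels; an odd level duplicates its last node.
    levels = [[H(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        level = levels[-1][:]
        if len(level) % 2:
            level.append(level[-1])
        levels.append([H(level[i] + level[i + 1])
                       for i in range(0, len(level), 2)])
    return levels

def merkle_root(leaves):
    return build_levels(leaves)[-1][0]

def auth_path(leaves, index):
    # Sibling hashes needed to recompute the root from leaf 'index'.
    path = []
    for level in build_levels(leaves)[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append(level[index ^ 1])
        index //= 2
    return path

def verify_path(leaf, index, path, root):
    node = H(leaf)
    for sibling in path:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node == root
```

With a tree of, say, 2^20 key commitments, a relying party needs only the root plus a 20-hash path, which is what makes embedding the root in a certificate plausible.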
As far as distribution goes, the simplest approach would probably be to add them to traditional certs as an extension. This might be in the root itself or in an intermediate that has been checked into CT.