
NIST Post-Quantum Cryptography Round 1 Submissions

130 points | sohkamyung | 8 years ago | csrc.nist.gov

48 comments

[+] Nokinside|8 years ago|reply
If you can tolerate ~1MB public keys, Classic McEliece is probably the most conservative and secure post-quantum option right now, before the others have been analyzed more thoroughly. It's old and well studied.

>9 Advantages and limitations (2.B.6)

>The central advantage of this submission is security. See the design rationale.

>Regarding efficiency, the use of random-looking linear codes with no visible structure forces public-key sizes to be on the scale of a megabyte for quantitatively high security: the public key is a full (generator/parity-check) matrix. Key-generation software is also not very fast. Applications must continue using each public key for long enough to handle the costs of generating and distributing the key.

>There are, however, some compensating efficiency advantages. Encapsulation and decapsulation are reasonably fast in software, and impressively fast in hardware, due to the simple nature of the objects (binary vectors) and operations (such as binary matrix-vector multiplications). Key generation is also quite fast in hardware. The hardware speeds of key generation and decoding are already demonstrated by our FPGA implementation. Encapsulation takes only a single pass over a public key, allowing large public keys to be streamed through small coprocessors and small devices.

>Furthermore, the ciphertexts are unusually small for post-quantum cryptography: under 256 bytes for our suggested high-security parameter sets. This allows ciphertexts to fit comfortably inside single network packets. The small ciphertext size can be much more important for total traffic than the large key size, depending on the ratio between how often keys are sent and how often ciphertexts are sent. System parameters can be adjusted for even smaller ciphertexts.
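To put the key-vs-ciphertext tradeoff in concrete terms, here's a rough back-of-the-envelope sketch (illustrative numbers taken from the sizes quoted above; `total_traffic` is a made-up helper):

```python
KEY_SIZE = 1_000_000  # ~1 MB Classic McEliece public key (rough figure)
CT_SIZE = 256         # ciphertext size for the high-security parameter sets

def total_traffic(keys_sent: int, cts_sent: int) -> int:
    """Total bytes on the wire for a given mix of keys and ciphertexts."""
    return keys_sent * KEY_SIZE + cts_sent * CT_SIZE

# One long-lived key amortized over a million encapsulations:
# the 1 MB key adds only one byte of overhead per ciphertext.
print(total_traffic(1, 1_000_000))  # 257_000_000 bytes total
```

So if keys are long-lived and ciphertexts dominate the traffic, the large key nearly disappears in the totals; if keys are rotated frequently, it dominates.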

[+] hackcasual|8 years ago|reply
Can't really see standardization settling on something that requires ~1,000-10,000x the overhead of its competitors. 1 MB public keys would rule out things like IoT devices and blockchain-type usage, not to mention local caching possibly taking gigabytes.

Hybrid secret-sharing schemes do a lot to mitigate both future quantum risk and present broken cipher system risk.

[+] bascule|8 years ago|reply
One of them has already fallen victim to a key recovery attack:

https://twitter.com/yx7__/status/945283780851400704

Round2 (LWE-based) and SIKE (isogeny-based) are the ones I find particularly interesting. Both support comparatively small keys (~1 kB), with Round2 seemingly winning on performance, though it is also patented.

[+] mort96|8 years ago|reply
I never understood how or why patenting math is a thing.
[+] exabrial|8 years ago|reply
I like to ask dumb questions... So here is one: Are forward secrecy guarantees made by TLS valid in a post-quantum world?

Specifically, say I connect via TLS to a server that uses FS and exchange some information that will be valuable in 150 years. If the entire conversation is recorded at the wire, does quantum computing break this promise? (or have I misunderstood FS completely?)

[+] CiPHPerCoder|8 years ago|reply
https://www.fredericjacobs.com/blog/2016/04/07/qc-axolotl/ might be a good place to start answering this question.

If you can break the ECDH, you can recover the key that was calculated. However, if other data unknown to the adversary (e.g. past ECDH outputs) is hashed together to derive the key (as Signal does, roughly speaking), it stays post-quantum secure only if the adversary misses at least one message.

Other than weird saved-by-the-bell corner cases, the answer is "No".
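A minimal sketch of the idea (a hypothetical hash chain, not Signal's actual KDF schedule): each new key mixes the previous chain key with a fresh DH output, so an adversary who misses even one input can no longer reconstruct later keys.

```python
import hashlib

def ratchet(chain_key: bytes, dh_output: bytes) -> bytes:
    # Mix the previous chain key with a fresh (EC)DH output.
    return hashlib.sha256(chain_key + dh_output).digest()

# Stand-ins for per-message DH secrets. A quantum attacker who
# recovers every dh value can replay this chain; missing even one
# leaves all subsequent chain keys unpredictable to them.
chain = b"\x00" * 32
for dh in (b"dh1", b"dh2", b"dh3"):
    chain = ratchet(chain, dh)
```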

[+] nokcha|8 years ago|reply
>Are forward secrecy guarantees made by TLS valid in a post-quantum world?

No, forward secrecy doesn't apply if the underlying public-key cryptography gets broken (such as by a quantum computer).

[+] late2part|8 years ago|reply
I'm seriously considering moving to pre-shared keys. Vernor Vinge's books (_A Deepness In The Sky_, others) talk about how much of interstellar shipping is the secure transportation of random pre-shared keys.

Although I'm unfamiliar w/ anything more sophisticated than a password manager like 1Password and S/Key, I'd like to see if I could have a go at this for my personal kit.

Can anyone recommend keywords/papers that talk about pre-shared key authentication at automated scale?

[+] CiPHPerCoder|8 years ago|reply
PSK isn't really a good solution, mostly because of scaling problems.

Today, all you need is a 32-byte ECDH private key, and a corresponding 32-byte public key. To talk to any of the billions of devices on the Internet securely, all you need is their 32-byte public key, and some means to verify that it is authentic (i.e. certificate authorities + certificate transparency). With this information, you can calculate a shared secret (or use a sealing API given only their public key, if you really want).
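A toy illustration of that shared-secret calculation (classic finite-field Diffie-Hellman with insecure demo parameters, not real X25519):

```python
# Toy finite-field Diffie-Hellman; these parameters are far too
# small for real use and only illustrate the arithmetic.
p = 2**61 - 1  # a Mersenne prime
g = 2

alice_priv = 123456789
bob_priv = 987654321

alice_pub = pow(g, alice_priv, p)
bob_pub = pow(g, bob_priv, p)

# Each side combines its own private key with the other's public
# key and arrives at the same shared secret.
shared = pow(bob_pub, alice_priv, p)
assert shared == pow(alice_pub, bob_priv, p)
```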

Total local storage: PKI overhead plus your own keys, under a few megabytes.

Doing PSK at scale means storing billions of 32-byte binary strings. One for each participant you intend to communicate with. They must also do this for every possible device they wish to communicate with, otherwise others can just eavesdrop on their conversations if keys are reused.

You've turned an O(n) space problem into an O(n^2) space problem.

If your n is small, you might still be tempted to make the move, but it will quickly become unmanageable as n grows.
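The scaling argument in numbers (a rough sketch; `psk_storage` and `public_key_storage` are made-up helpers counting total bytes across all participants):

```python
def public_key_storage(n: int, key_size: int = 32) -> int:
    # Each device stores only its own keypair; peers' public keys
    # are fetched and verified on demand. O(n) bytes network-wide.
    return n * 2 * key_size

def psk_storage(n: int, key_size: int = 32) -> int:
    # Every unordered pair of devices needs its own pre-shared key.
    # O(n^2) bytes network-wide.
    return n * (n - 1) // 2 * key_size

# At a million devices, the pairwise-PSK total is ~16 TB versus
# ~64 MB of keypairs in the public-key world.
```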

Furthermore, if a device loses its key, you must:

1. Generate/share new keys with n devices (which at Internet scale is billions), and

2. Do so in a way that doesn't open the door to backdoors, which probably requires digital signatures, so you're not actually getting rid of public-key cryptography

I think at Internet scale, we're still years away from having reliable post-quantum cryptography to even recommend, let alone adopt, but that's going to be the way to go. Something something baby and bathwater.

[+] dsr_|8 years ago|reply
Actually, he ships around giant blocks of random numbers in a k of n format to be reassembled as a one-time pad.
[+] Klathmon|8 years ago|reply
Does NIST require submissions to be free (as in open software)?

Or is this going to be a bunch of advertising for proprietary patented algorithms?

[+] IncRnd|8 years ago|reply
This is an effort to standardize industry-driven PQC (post-quantum crypto) algorithms. In practice, as long as quantum computers don't both work and become scalable in the interim, you have another 4 years to worry about recommendations.
[+] yalogin|8 years ago|reply
Are there any blogs that talk about what’s required in a crypto algorithm to withstand quantum computing? Is a straightforward increasing the key size enough? Or are there new paradigms explored? Are these algorithms symmetric or asymmetric?
[+] betterunix2|8 years ago|reply
PQ crypto has to be secure against a scalable quantum computer, i.e. one that can be scaled up to attack arbitrarily large key sizes. Building a quantum computer with enough qubits to attack RSA-2048 is not a problem if there is no known way to increase the number of qubits to attack RSA-3072 -- but if you can keep adding more qubits, then we need a post-quantum algorithm.

The interest in post-quantum crypto is mainly in public-key systems, because for symmetric-key systems you can essentially double the key size to achieve the same security level. In other words, AES-256 is as secure against a quantum computer as AES-128 is against a classical computer. On the other hand, quantum algorithms break RSA and DLOG (including ECDLOG), so the research has been on new approaches to public-key crypto that seem hard for quantum computers. For example, the Learning With Errors (LWE) problem:

https://en.wikipedia.org/wiki/Learning_with_errors
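A tiny, insecure LWE instance to make the problem concrete (toy parameters chosen purely for illustration): given A and b = A·s + e (mod q) with a small error vector e, the problem is to recover s.

```python
import random

random.seed(0)
q, n_dim, m = 97, 4, 8  # toy modulus and dimensions (insecure)

s = [random.randrange(q) for _ in range(n_dim)]             # secret vector
A = [[random.randrange(q) for _ in range(n_dim)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]           # small noise

# b = A*s + e (mod q). Without e this is plain linear algebra;
# the small noise is what makes recovering s from (A, b) hard.
b = [(sum(a * x for a, x in zip(row, s)) + noise) % q
     for row, noise in zip(A, e)]
```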

[+] westurner|8 years ago|reply
This paper lists a few of the practical concerns for quantum-resistant algos (and proposes an algo that wasn't submitted to NIST Post-Quantum Cryptography Round 1):

"Quantum attacks on Bitcoin, and how to protect against them" https://arxiv.org/abs/1710.10377 (~2027?)

A few Quantum Computing and Quantum Algorithm resources: https://news.ycombinator.com/item?id=16052193

Responsive HTML (arxiv-vanity/engrafo, PLoS) or Markdown in a Jupyter notebook (stored in a Git repo with a tag, and maybe a DOI from figshare or Zenodo) really would be far more useful than comparing LaTeX equations rendered into PDFs.

[+] algorithmsRcool|8 years ago|reply
This seems to have generated a lot of collaboration.

DJB is listed on 5 separate submissions.

Philippe Gaborit is on 8!

[+] maxharris|8 years ago|reply
If I were you, I wouldn't be so quick to trust any cryptography standard set by a government agency. They have an obvious incentive to ensure that any standard that is eventually set be open to attack by whatever means the NSA has (or will have in the near future).

So pay careful attention to things they reject, or things they modify!

I have no insider knowledge here. I'm just applying common sense...

[+] AgentME|8 years ago|reply
I sure hope we can get a post-quantum version of Dual_EC_DRBG from the trustworthy people at NIST. /s

Not that I especially think any of these algorithms are going to be backdoored (though I certainly hope they get extra scrutiny just in case). I'm mostly just disappointed that there aren't any real consequences for a group like NIST for rubber-stamping backdoored things.