I work on homomorphic encryption, and there are some rumors circulating that, if this checks out, it will break some of the leading FHE schemes like BFV, where the moduli used are quite large (in the hundreds of bits or even over a thousand bits).
Headline should be "polynomial-time quantum algorithms for solving lattice problems" or some such. The polynomial-time aspect is the main contribution here, and also why this is attracting attention.
How does this affect these statements on Wikipedia [1]
> some lattice-based constructions appear to be resistant to attack by both classical and quantum computers. Furthermore, many lattice-based constructions are considered to be secure under the assumption that certain well-studied computational lattice problems cannot be solved efficiently.
and [2] ?
> One class of quantum resistant cryptographic algorithms is based on a concept called "learning with errors" introduced by Oded Regev in 2005.
The idea of "appear to be resistant to attack" is an empirical one. When someone says that, they are saying that we simply have not found a good attack against this problem. That can change any day, in principle. Unfortunately, "we don't know of an attack" is about as strong a statement as you can make in cryptography when talking about a fundamental hardness assumption. More verbosely, you'd say "the best known attacks take 2^whatever operations on a computer (classical or quantum), and that's expensive, so we're probably fine unless someone makes a significant leap tomorrow".
The NTRU article mentions PQ resistance to Shor's algorithm only (among other evaluations), and that IEEE Std 1363.1 (2008) and the X9 financial-industry spec already specify NTRU, which is a Round 3 finalist lattice-based method.
In [1], under "Selected Algorithms 2022", the article lists "Lattice: CRYSTALS-Kyber, CRYSTALS-Dilithium, FALCON; Hash-based: SPHINCS+".
Round 4 includes code-based and supersingular elliptic curve isogeny algos.
FWIU there's not yet a TLS 1.4/2.0 that specifies which [lattice-based] PQ algos webservers would need to implement to support a new PQ TLS spec.
People seemed to be focusing on the fact that this wouldn't break the leading NIST PQC public-key cryptosystems, but I think that misses the point. This takes a problem at the core of their security, which previously only had an exponential approximation, and finds a polynomial approximation. Sure, that polynomial, O(n^4.5), is too high to break the leading proposed systems, but are you really feeling safe when an exponential just changed to a polynomial?
An analogy would be something like this. Factoring is hard. We base RSA on the hardness of this problem, using numbers that are the product of two primes. Someone just found an algorithm that can't factor a product of two primes, but can take a product of four primes and split it into two products of two primes each. Do you feel safe with RSA?
Anyway, the paper could be wrong or it could be right; it will take a while for those in the field to dig through it. As a cautionary tale, a few extremely good quantum researchers have previously proposed quantum attacks on lattice problems that were later shown to have bugs.
The running time of the attack hasn't suddenly become O(n^4.5). That figure describes the modulus-noise ratio at which the LWE assumption becomes broken in quantum polynomial time.
The proposed post-quantum encryption schemes use a much smaller noise ratio which (at the moment) is not affected by these attacks.
Not a lattice expert, so add salt to taste, but it looks like LWE in general (including RLWE) is affected.
But the current attack essentially wants q > n^2, so even if it is confirmed, not all LWE schemes are dead. There will certainly be people who tweak the params in response and carry on.
However, attacks only get better. And for people in FHE who are squeezed between performance problems and dangerously thin security parameters, it is a bad day if confirmed. There's no credible practical alternative to LWE for FHE...
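To make the squeeze concrete, the rough condition from the attack (q on the order of n^2, ignoring polylog factors) can be checked against typical parameter scales. The Kyber numbers below (q = 3329, n = 256·k) come from the paper's own discussion; the FHE-style rows are illustrative values I've picked to match the "hundreds of bits of modulus" regime mentioned above, not any specific deployed parameter set:

```python
# Rough check of which (n, q) regimes the claimed attack would touch.
# Condition used: q > n^2, a crude proxy for q in Omega~(n^2).
params = [
    ("Kyber-768 (n=256*3, q=3329)",   768,   3329),     # from the paper
    ("FHE-style (n=2^14, q ~ 2^400)", 2**14, 2**400),   # illustrative
    ("FHE-style (n=2^15, q ~ 2^800)", 2**15, 2**800),   # illustrative
]

for name, n, q in params:
    in_regime = q > n * n
    print(f"{name}: q > n^2? {in_regime}")
```

Under this crude check, the Kyber-scale modulus sits far below n^2, while FHE-scale moduli are far above it, which is exactly why FHE practitioners are the nervous ones here.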
Hello everyone. I am a college student and new to this field. If possible, can someone explain in simple terms what real future impact this paper could have?
Some post-quantum signatures like CRYSTALS-Dilithium are based on lattices. Makes me think that quantum key distribution (what I've been working on for the past 6 months) has a chance to actually become useful instead of being only of interest to academics and to a few companies that sell overpriced solutions to paranoids.
QKD does not solve the problem that quantum computers create, and cannot replace public-key cryptography. That's a common misconception that the marketing departments behind QKD research try to keep alive.
Even under ideal conditions (whether these can exist is debatable), the best QKD gives you is a securely encrypted channel only when you already have a securely authenticated channel. The latter is extremely important, makes the whole thing mostly useless, and is often omitted by QKD advocates.
Code-based systems are still in, and Classic McEliece could be extended to ~50 MiB for a keypair and still be far more practical than QKD. Just run the largest current Classic McEliece parameter set in a post-quantum hybrid with X448.
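A hybrid of the kind described usually amounts to "run both key exchanges and hash the two shared secrets together", so the session key stays secure as long as either component holds. A minimal sketch of that combiner, with the McEliece and X448 shared secrets stubbed out as placeholder bytes (no real KEM library or specific KDF standard is assumed here):

```python
import hashlib

def combine_shared_secrets(ss_pq: bytes, ss_classical: bytes) -> bytes:
    """Derive one session key from a post-quantum and a classical shared
    secret; an attacker must break BOTH to recover the key."""
    # Domain-separated hash of both secrets; each scheme's secret has a
    # fixed length, so plain concatenation is unambiguous here.
    return hashlib.sha256(b"hybrid-kem-v1" + ss_pq + ss_classical).digest()

# Placeholder stand-ins for the Classic McEliece and X448 outputs:
ss_mceliece = b"\x01" * 32
ss_x448 = b"\x02" * 56
key = combine_shared_secrets(ss_mceliece, ss_x448)
print(len(key))  # 32-byte session key
```

Real deployments use a standardized KDF rather than a bare hash, but the "concatenate and derive" shape is the same.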
> Let us remark that the modulus-noise ratio achieved by our quantum algorithm is still too large to break the public-key encryption schemes based on (Ring)LWE used in practice. In particular, we have not broken the NIST PQC standardization candidates. For example, for CRYSTALS-Kyber [BDK+18], the error term is chosen from a small constant range, the modulus is q = 3329, the dimension is n = 256 · k where k ∈ {3, 4, 5}, so we can think of q as being almost linear in n. For our algorithm, if we set αq ∈ O(1), then our algorithm applies when q ∈ Ω̃(n^2), so we are not able to break CRYSTALS-Kyber yet. We leave the task of improving the approximation factor of our quantum algorithm to future work.
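Plugging in the numbers from the quoted passage shows just how far Kyber's modulus sits below the n^2 scale the algorithm needs. The parameters are from the quote itself; the q < n^2 comparison is my paraphrase of the Ω̃(n^2) condition, ignoring polylog factors:

```python
# Kyber parameters as quoted: q = 3329, n = 256 * k for k in {3, 4, 5}.
q = 3329
for k in (3, 4, 5):
    n = 256 * k
    # The attack applies when q is on the order of n^2; for Kyber,
    # q is several orders of magnitude smaller than n^2.
    print(f"k={k}: n={n}, n^2={n*n}, q/n^2 = {q/(n*n):.6f}")
```

That gap is why "we have not broken the NIST PQC standardization candidates" holds even if the result itself stands.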
Factorization and discrete log are also polynomial-time on a quantum computer, and we are very good at just increasing bit widths. If breaking CRYSTALS is also polynomial-time in BQP, there is very little reason to invest so much into it.
I am still of the (very controversial) opinion that the only PQC algorithm worth investing in at the expense of classical algorithms is Classic McEliece. This is a code that has stood up to classical and quantum cracking attempts for a very long time - cracking these codes is equivalent to creating a very valuable algorithm in error correcting codes.
NIST is also dead set on people using only PQC or only classical crypto, not a wrapper with both. That is stupid, IMO.
"Note: Update on April 18: Step 9 of the algorithm contains a bug, which I don’t know how to fix. See Section 3.5.9 (Page 37) for details. I sincerely thank Hongxun Wu and (independently) Thomas Vidick for finding the bug today. Now the claim of showing a polynomial time quantum algorithm for solving LWE with polynomial modulus-noise ratios does not hold. I leave the rest of the paper as it is (added a clarification of an operation in Step 8) as a hope that ideas like Complex Gaussian and windowed QFT may find other applications in quantum computation, or tackle LWE in other ways."
If the findings of this paper hold up, I believe it could pretty much undo a decade of NIST's efforts in post-quantum cryptography: a seismic shift in the world of cryptography.
Not entirely true; there are other PKE and signature algorithms that were/are part of the competition and use problems not related to lattices. However, the lattice-based options were often among the fastest and smallest.
No? One of the side effects of running an open competition is that it focused attention on a variety of competing options for this, all of which were formalized, recorded, and publicly evaluated by the world's academic cryptography experts. We're strictly better off as a result, and much of NIST's own work would still be valuable even in a hypothetical scenario in which none of LWE was quantum-safe.
This is the reason why NIST did the decade of work: to focus effort on figuring out which options are secure. Finding out an option is not secure is a good thing. It's why we are putting effort into PQC now, before quantum computers are a real threat.
[1] https://en.wikipedia.org/wiki/Lattice-based_cryptography
[2] https://en.wikipedia.org/wiki/Ring_learning_with_errors_key_...
[1] NIST Post-Quantum Cryptography Standardization: https://en.wikipedia.org/wiki/NIST_Post-Quantum_Cryptography...
If so, it's a big blow to systems like FrodoKEM that banked on unstructured lattices providing higher security.
Since this is about quantum computing, real world effects are very likely to be none except an exorbitant amount of grant money.