fitblipper | 2 years ago
Why do you say this? The NSA has done this exact thing in the past[1], so why give them the benefit of the doubt this time?
BoppreH | 2 years ago
Meanwhile, both NTRU and Kyber are lattice-based, and their designs came from honest attempts. To be an NSA effort, there would need to exist an exploitable flaw in Kyber, but not NTRU, known only to the NSA. And it's not like NTRU as a whole got disqualified; only the fastest variant did.
That's the problem with spy agencies: you never know what they are capable of. But if this were an NSA effort, it would be, by far, the most subtle one uncovered so far.
vlovich123 | 2 years ago
Changing rules on the fly and applying those rules inconsistently could be a way to select a weak option you can break, with stronger plausible deniability than what happened with Dual_EC_DRBG (which, by the way, wasn't actually confirmed as a backdoor until the Snowden leaks). So here's someone claiming NIST behaved suspiciously in how the algorithm selection happened. The rules really need to be set in stone at the start of the competition, or at least before each phase, and you can't pick diametrically opposed rule sets between phases (as happened, if you read Bernstein's letter), only tweaks.