
NSA and IETF, part 3: Dodging the issues at hand

316 points | upofadown | 3 months ago | blog.cr.yp.to

229 comments

[+] seethishat|3 months ago|reply
For context, djb has been doing and saying these things since he was a college student:

    While a graduate student at the University of California at Berkeley, Bernstein completed the development of an encryption equation (an "algorithm") he calls "Snuffle." Bernstein wishes to publish (a) the algorithm, (b) a mathematical paper describing and explaining the algorithm and (c) the "source code" for a computer program that incorporates the algorithm. Bernstein also wishes to discuss these items at mathematical conferences, college classrooms and other open public meetings. The Arms Export Control Act and the International Traffic in Arms Regulations (the ITAR regulatory scheme) required Bernstein to submit his ideas about cryptography to the government for review, to register as an arms dealer, and to apply for and obtain from the government a license to publish his ideas. Failure to do so would result in severe civil and criminal penalties. Bernstein believes this is a violation of his First Amendment rights and has sued the government. 

    After four years and one regulatory change, the Ninth Circuit Court of Appeals ruled that software source code was speech protected by the First Amendment and that the government's regulations preventing its publication were unconstitutional. 
Source: https://www.eff.org/cases/bernstein-v-us-dept-justice
[+] basilgohar|3 months ago|reply
djb has earned my massive respect for how consistent he's been in this regard. I love his belligerence towards authoritarian overreach. He, Phil Zimmermann, Richard Stallman, and others are owed great respect for their insistence on their principles, which have paid massive dividends to all of us through the freedom and software they preserved and made possible. I appreciate them immensely, and I think we all owe them a debt of gratitude, because they each paid a heavy price for their advocacy over time.
[+] ants_everywhere|3 months ago|reply
That was when he had the legal expertise of the EFF to help him make his case. Later he decided to represent himself in court and failed.

> This time, he chose to represent himself, although he had no formal legal training. On October 15, 2003, almost nine years after Bernstein first brought the case, the judge dismissed it....

https://en.wikipedia.org/wiki/Bernstein_v._United_States

[+] dhx|3 months ago|reply
Amongst the numerous reasons why you _don't_ want to rush into implementing new algorithms: even the _reference implementation_ (and most other early implementations) of Kyber/ML-KEM included multiple timing side-channel vulnerabilities that allowed for key recovery.[1][2]

djb has been consistent for decades in his view that cryptography standards need to consider the foolproofness of implementation, so that a minor implementation mistake, specific to the timing of certain instructions on certain CPU architectures or to particular compiler optimisations, doesn't break the security of the result. See for example the many problems of the NIST P-224/P-256/P-384 ECC curves, which djb has been instrumental in fixing through widespread deployment of X25519.[3][4][5]

[1] https://cryspen.com/post/ml-kem-implementation/

[2] https://kyberslash.cr.yp.to/faq.html / https://kyberslash.cr.yp.to/libraries.html

[3] https://en.wikipedia.org/wiki/Elliptic_curve_point_multiplic...

[4] https://safecurves.cr.yp.to/ladder.html

[5] https://cr.yp.to/newelliptic/nistecc-20160106.pdf

[+] mpyne|3 months ago|reply
Given the emphasis on reliability of implementations of an algorithm, it's ironic that the Curve25519-based Ed25519 digital signature standard was itself specified and originally implemented in such a way as to lead to implementation divergence on what a valid and invalid signature actually was. See https://hdevalence.ca/blog/2020-10-04-its-25519am/

Not a criticism; if anything it reinforces DJB's point. But it makes clear that ease of (proper) implementation also needs to cover things like proper canonicalization of relevant security variables, and ensuring that multiple supported modes of operation don't give different answers to security questions that are meant to give the same answer.
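One concrete example of the canonicalization issue: RFC 8032 requires verifiers to reject signatures whose scalar s is not reduced modulo the group order L, but some early implementations skipped that check, so the same signature bytes could verify in one library and fail in another. A minimal sketch of just that check (hypothetical helper, not any particular library's API):

```python
# Ed25519 group order L, as given in RFC 8032.
L = 2**252 + 27742317777372353535851937790883648493

def s_is_canonical(sig: bytes) -> bool:
    """Return True iff the scalar half of a 64-byte Ed25519 signature
    is fully reduced (s < L). Without this check, s and s + L are two
    distinct encodings that verify against the same message and key,
    one source of the verifier divergence described above."""
    if len(sig) != 64:
        return False
    s = int.from_bytes(sig[32:], "little")  # sig = R (32 bytes) || s (32 bytes)
    return s < L
```

A verifier that enforces this up front agrees with strict implementations; one that skips it also accepts the malleable s + L encoding.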

[+] glitchc|3 months ago|reply
This logic does not follow. Your argument seems to be "the implementation has security bugs, so let's not ratify the standard." That's not how standards work though. Ensuring an implementation is secure is part of the certification process. As long as the scheme itself is shown to be provably secure, that is sufficient to ratify a standard.

If anything, standardization encourages more investment, which means more eyeballs to identify and plug those holes.

[+] Foxboron|3 months ago|reply
> See for example the many problems of NIST P-224/P-256/P-384 ECC curves

What are those problems exactly? The whitepaper from djb only makes vague claims about the NSA being a malicious actor, but after ~20 years no known backdoors nor intentional weaknesses have been reliably proven?

[+] zahllos|3 months ago|reply
In context, this particular issue is that DJB disagrees with the IETF publishing an ML-KEM-only standard for key exchange.

Here's the thing. The existence of a standard does not mean we need to use it for most of the internet. There will also be hybrid standards, and most of the rest of us can simply ignore the existence of ML-KEM-only. However, NSA's CNSA 2.0 (commercial cryptography you can sell to the US Federal Government) does not envisage using hybrid schemes, so there's some sense in having a standard for that purpose. Better developed through the IETF than forced on browser vendors directly by the US, I think. There was rough consensus to do this. Should we have a single-cipher kex standard for HQC too? I'd argue yes, and no, the NSA doesn't propose to use it (unless they updated CNSA).

The requirement of the NIST competition is that all standardized algorithms resist both classical and quantum attacks. Some have said in this thread that lattice crypto is relatively new, but it actually has quite some history, going back to Ajtai in '97. If you want paranoia, there are always code-based schemes going back to McEliece in '78. We don't know what we don't know, which is why there's HQC (code-based) waiting on standardisation and an additional on-ramp for signatures, plus the expense (size, and sometimes statefulness) of hash-based options. So there's some argument that single-cipher is fine, and we have a whole set of alternative options.

This particular overreaction appears to be yet another in a long-running series of... disagreements with the entire NIST process, including "claims" around the security level of what we then called Kyber, insults to the NIST team's security-level estimation in the form of suggesting they can't do basic arithmetic (given that we can't factor anything bigger than 15 on a real quantum computer and we simply don't have hardware anywhere near breaking RSA, estimates are exactly what these are), and so on.

[+] HelloNurse|3 months ago|reply
The metaphor near the beginning of the article is a good summary: standardizing cars with seatbelts, but also cars without seatbelts.

Since ML-KEM is supported by the NSA, it should be assumed to have an NSA-known backdoor that they want to be used as much as possible: IETF standardization is a great opportunity for a long-term social engineering operation, much like DES, Clipper, the more recent funny elliptic curve, etc.

[+] adgjlsfhk1|3 months ago|reply
The problem with standardizing bad crypto options is that you are then exposed to all sorts of downgrade attack possibilities. There's a reason TLS1.3 removed all of the bad crypto algorithms that it had supported.
[+] vessenes|3 months ago|reply
My professors at Brown were working on QR lattice cryptography well before 1997, although they may not have been publishing much - NTRU was in active development throughout the mid 1990s when I was there. Heating up by 1997 though, for sure.
[+] crote|3 months ago|reply
> In context, this particular issue is that DJB disagrees with the IETF publishing an ML-KEM only standard for key exchange.

No, that's background dressing by now. The bigger issue is how IETF is trying to railroad a standard by violating its own procedures, ignoring all objections, and banning people who oppose it.

They are literally doing the kind of thing we always accuse China of doing. ML-KEM-only is obviously being pushed for political reasons. If you're not willing to let a standard be discussed on its technical merits, why even pretend to have a technology-first industry working group?

Seeing standards being corrupted like this is sickening. At least have the gall to openly claim it should be standardized because it makes things easier for the NSA - and thereby (arguably) increases national security!

[+] cryptonector|3 months ago|reply
You're not accurately representing DJB's concern.

His concern is that NSA will get vendors to ship code that will prefer ML-KEM, which, not being a hybrid of ECC and PQC, will be highly vulnerable should ML-KEM turn out to be weak, and then there's the concern that it might be backdoored -- that this is a Dual_EC redux.

[+] vorpalhex|3 months ago|reply
The standard will be used, as it was the previous time the IETF allowed the NSA to standardize a known weak algorithm.

Sorry that someone calling out a math error makes the NIST team feel stupid. Instead of dogpiling the person for not stroking their ego, maybe they should correct the error. Last I checked, a quantum computer wasn't needed to handle exponents, a whiteboard will do.

[+] aaomidi|3 months ago|reply
Except when the government starts then mandating a specific algorithm.

And yes. This has happened. There’s a reason there’s only the NIST P Curves in the WebPKI world.

[+] ants_everywhere|3 months ago|reply
D. J. Bernstein is very well respected, and for very good reason. I don't have firsthand knowledge of the background here, but the blog posts about the incident are written in a kind of weird voice that makes me feel like I'm reading about the US Government suppressing evidence of Bigfoot or something.

Stuff like this

> Wow, look at that: "due process".... Could it possibly be that the people writing the law were thinking through how standardization processes could be abused?"

is both accusing the other party of bad faith and also heavily using sarcasm, which is a sort of performative bad faith.

Sarcasm can be really effective when used well. But when a post is dripping with sarcasm and accusing others of bad faith it comes off as hiding a weak position behind contempt. I don't know if this is just how DJB writes, or if he's adopting this voice because he thinks it's what the internet wants to see right now.

Personally, I would prefer a style where he says only what he means without irony and expresses his feelings directly. If showing contempt is essential to the piece, then the Linus Torvalds style of explicit theatrical contempt is probably preferable, at least to me.

I understand others may feel differently. The style just gives me crackpot vibes, and that may color reception of the blog posts among people who don't know DJB's reputation.

[+] pverheggen|3 months ago|reply
While it's true that six others unequivocally opposed adoption, we don't know how many of them opposed the chairs' claim that they have consensus. This may be a normal ratio to move forward with adoption; you'd have to look at past IETF proceedings to get a sense of that.

One other factor comes into play: some people can't stand his communication style. When disagreed with, he tends to dig in his heels and write lengthy responses that question people's motives, as in this blog post and others. Accusing the chairs of corruption may have influenced how seriously his complaint was taken.

[+] dataflow|3 months ago|reply
> One other factor comes into play: some people can't stand his communication style. When disagreed with, he tends to dig in his heels and write lengthy responses that question people's motives, as in this blog post and others.

I don't have context on this other than the linked page, but if what he's saying is accurate, it does seem pretty damning and corrupt, no? Why all the lies and distortions otherwise - how does one assume a generous explanation for lies and distortions?

[+] ImPostingOnHN|3 months ago|reply
> Accusing the chairs of corruption may have influenced how seriously his complaint was taken.

If you alter your official treatment of somebody because they suggested you might be corrupt (in other words, because of personal animus), then you have just confirmed their suggestion.

[+] cryptonector|3 months ago|reply
> One other factor comes into play: some people can't stand his communication style. When disagreed with, he tends to dig in his heels and write lengthy responses that question people's motives, as in this blog post and others. Accusing the chairs of corruption may have influenced how seriously his complaint was taken.

The IESG though is completely mishandling it. They could discipline him if need be (posting bans for some amount of time) and still hear the appeal. Instead they're sticking their fingers in their ears. DJB might be childish and annoying, but how are they that much better?

[+] abhv|3 months ago|reply
20+2 (conditional support) versus 7.

22/29 = 76% in some form of "yea"

That feels like "rough consensus"

[+] stavros|3 months ago|reply
> That OMB rule, in turn, defines "consensus" as follows: "general agreement, but not necessarily unanimity, and includes a process for attempting to resolve objections by interested parties, as long as all comments have been fairly considered, each objector is advised of the disposition of his or her objection(s) and the reasons why, and the consensus body members are given an opportunity to change their votes after reviewing the comments".

From https://blog.cr.yp.to/20251004-weakened.html#standards, linked in TFA.

[+] jcranmer|3 months ago|reply
The standard used in the C and C++ committees is essentially a 2-to-1 majority in favor. I'm not aware of any committee where a 3-to-1 majority is insufficient to get an item to pass.

DJB's argument that this isn't good enough would, by itself, be enough for me to route his objections to /dev/null; it's so tedious and snipey that it sours the quality of his other arguments by mere association. And overall, it gives the impression of someone who is more interested in derailing the entire process than in actually trying to craft a good standard.

[+] f33d5173|3 months ago|reply
A consensus is 100%. A rough consensus should be near 100%. 2/3 is a super majority. That's a very different standard.
[+] ImPostingOnHN|3 months ago|reply
consensus is not a synonym for majority, supermajority, or for any fraction of the whole, unless the fraction is 100%
[+] o11c|3 months ago|reply
It's always a mistake to look at numbers for consensus, without also considering how strongly the positions are held.
[+] thomasdeleeuw|3 months ago|reply
France and Germany propose hybrid schemes as well:

The german position:

https://www.bsi.bund.de/SharedDocs/Downloads/EN/BSI/Publicat...

"The quantum-safe mechanisms recommended in this Technical Guideline are generally not yet trusted to the same extent as the established classical mechanisms, since they have not been as well studied with regard to side-channel resistance and implementation security. To ensure the long-term security of a key agreement, this Technical Guideline therefore recommends the use of a hybrid key agreement mechanism that combines a quantum-safe and a classical mechanism."

The french position, also quoting the German position:

https://cyber.gouv.fr/sites/default/files/document/follow_up...

"As outlined in the previous position paper [1], ANSSI still strongly emphasizes the necessity of hybridation1 wherever post-quantum mitigation is needed both in the short and medium term. Indeed, even if the post-quantum algorithms have gained a lot of attention, they are still not mature enough to solely ensure the security"

[+] blintz|3 months ago|reply
Standardizing a codepoint for a pure ML-KEM version of TLS is fine. TLS clients always get to choose what ciphersuites they support, and nothing forces you to use it.

He has essentially accused anyone who shares this view of secretly working for the NSA. This is ridiculous.

You can see him do this on the mailing list: https://mailarchive.ietf.org/arch/browse/tls/?q=djb

[+] dataflow|3 months ago|reply
> standardizing a code point (literally a number) for a pure ML-KEM version of TLS is fine. TLS clients always get to choose what ciphersuites they support, and nothing forces you to use it.

I think the whole point is that some people would be forced to use it due to other standards picking previously-standardized ciphers. He explains and cites examples of this in the past.

> He has essentially accused anyone who shares this view of secretly working for the NSA. This is ridiculous.

He comes with historical and procedural evidence of bad faith. Why is this ridiculous? If you see half the submitted ciphers being broken, and lies and distortions being used to shove the others through, and historical evidence of the NSA using standards as a means to weaken ciphers, why wouldn't you equate that to working for the NSA (or something equally bad)?

[+] ImPostingOnHN|3 months ago|reply
Sunlight is the best disinfectant. I see one group of people shining it and another shading the first group.

Someone who wants to be seen as acting in good faith (and cryptography standards folks should want this), should be addressing the substance of what he said.

Consensus doesn't mean "majority rule", it requires good-faith resolutions (read: not merely responses like 'nuh-uh') to the voiced concerns.

[+] qi_reaper|3 months ago|reply
I understand you are smart and are talking about things above my paygrade, but dang can you format the text on your site so it is easier to read please
[+] pjz|3 months ago|reply
uhhh... that's mostly on your browser. The CSS is at the top and pretty skimpy. If it really bothers you, find a styler extension that will override the CSS to render it more pleasingly.
[+] GauntletWizard|3 months ago|reply
The NSA has railroaded bad crypto before [1]. The correct answer is to just ignore it, to say "okay, this is the NSA's preferred backdoored crypto standard, and none of our actual implementations will support it."

It is not acceptable for the government to be forcing bad crypto down our throats, it is not acceptable for the NSA to be poisoning the well this way, but for all I respect DJB, they are "playing the game" and 20 to 7 is consensus.

[1] https://en.wikipedia.org/wiki/Dual_EC_DRBG

[+] g-mork|3 months ago|reply
Handforth Parish Council, Internet edition. You have no authority here, djb! No authority at all.
[+] jancsika|3 months ago|reply
Dear some seasoned cryptographer,

Please ELI5: what is the argument for including the option for the non-hybrid option in this standard? Is it a good argument in your expert opinion?

My pea brain: implementers plus options equals bad, newfangled minus entrenched equals bad, alice only trust option 1 but bob only have option 2 = my pea brain hurt!

[+] dwaite|3 months ago|reply
More of a person with IETF participation experience than as a cryptographer (I enjoy watching numbers dance but am not a choreographer):

This ( https://datatracker.ietf.org/doc/draft-ietf-tls-mlkem/ ) is a document describing how to use the ML-KEM algorithm with TLS 1.3 in an interoperable manner.

It does not preclude other post-quantum algorithms from being described for use with TLS 1.3. It also does not preclude hybrid approaches from being used with TLS 1.3.

It is however a document scoped so it cannot be expanded to include either of those things. Work to define interoperable use of other algorithms, including hybrid algorithms, would be in other documents.

There is no MTI (mandatory-to-implement) requirement once these are documented by the IETF, but there could be market and regulatory pressures.

My suspicion is that this is bleed-out from a larger (and uglier) fight in the sister organization, the IRTF. There, the crypto forum research group (CFRG) has been having discussions on KEMs which have gotten significantly more heated.

A person with concern that there may be weaknesses in a post quantum technique may want a hybrid option to provide additional security. They may then be concerned that standardization of non-hybrid options would discourage hybrid usage, where hybrid is not yet standardized and would likely be standardized later (or not at all).

The pressure now with post-quantum is to create key negotiation algorithms that are not vulnerable to a theoretical post-quantum computer attack. This is because of the risk of potentially valuable encrypted traffic being logged now in the hope that it could later be targeted by a post-quantum computer.

Non-negotiated encryption (e.g. just using a static AES key) is already safe, and signature algorithms can be updated much closer to viable attacks to protect transactional data.
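For readers wondering what "hybrid" means mechanically: the session key is derived from the concatenation of both shared secrets, so an attacker must break both the classical and the post-quantum component. A minimal sketch using an HKDF-style derivation (RFC 5869) over placeholder secrets; the function names and labels here are illustrative, not taken from any TLS draft:

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract step (RFC 5869) with SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand step (RFC 5869) with SHA-256."""
    okm, block = b"", b""
    for counter in range(1, -(-length // 32) + 1):
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
    return okm[:length]

def hybrid_session_key(ss_ecdh: bytes, ss_mlkem: bytes) -> bytes:
    """Feed both shared secrets into one KDF: the output stays secret
    as long as at least one of the two component schemes is unbroken."""
    prk = hkdf_extract(b"hybrid-kex-sketch", ss_ecdh + ss_mlkem)
    return hkdf_expand(prk, b"session key", 32)
```

This mirrors the concatenation approach of the hybrid TLS drafts in spirit; the real documents specify exact labels and bind the derivation to the handshake transcript.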

[+] vayup|3 months ago|reply
The strongest argument made is that hybrid is more complex and more work, and therefore more risky.

As someone who has been implementing such systems for 20 years, I don't buy this. In my mind, it's equivalent to saying "Seatbelts add complexity to the safety system, and it's more work. So let's get rid of it."

In this argument, the benefits of hybrid/seatbelts are not factored in adequately.

[+] 0xbadcafebee|3 months ago|reply
tl;dr DJB is trying to stop the NSA railroading bad crypto into TLS standards, the objections deadline is in two days, and they're stonewalling him

This /. story fills in the backstory: https://it.slashdot.org/story/25/11/23/226258/cryptologist-d...

  Normal practice in deploying post-quantum cryptography is to deploy ECC+PQ. IETF's TLS working group is standardizing ECC+PQ. But IETF management is also non-consensually ramming a particular NSA-driven document through the IETF process, a "non-hybrid" document that adds just PQ as another TLS option.
[+] philipwhiuk|3 months ago|reply

[deleted]

[+] amszmidt|3 months ago|reply
”No association” and “I am not a representative” are quite different things to say.
[+] 6581|3 months ago|reply
That's not what the message you linked claims at all. Maybe you missed the "in this message" at the end of the sentence?
[+] kiray|3 months ago|reply

[deleted]

[+] Foxboron|3 months ago|reply
> This is why djb is in the Cypherpunks Hall of Fame! [1]

This is a list made by you 2 weeks ago?

EDIT: Okay lol. I actually browsed the list and found multiple dubious entries, along with Trump!

Hilarious list. 10/10.

[+] anonym29|3 months ago|reply
Name calling, bullying (forms of systematic harassment) and attempting to instill feelings of social isolation in a target are documented techniques employed by intelligence agencies in both online and offline discourse manipulation / information warfare.

You can read up more here if you are curious: https://www.statewatch.org/media/documents/news/2015/jun/beh...

Many of the attacks against djb line up quite nicely with "discredit" operational objectives.

[+] bigyabai|3 months ago|reply
Can you please stop spam-submitting this AI-generated Hall of Fame website? It's against HN guidelines to use the website primarily for promotion and it's clearly what you're doing here.