
CSRF, CORS, and HTTP Security Headers Demystified

358 points | tonyjstark | 4 years ago | blog.vnaik.com

43 comments

[+] shartte|4 years ago|reply
> It is good practice to always use the SameSite directive with cookies as this provides protection against CSRF attacks.

Be careful about assuming SameSite fully protects against CSRF attacks. I thought it did, but then I read what "site" actually refers to in the context of same-site (eTLD+1).

If the eTLD+1 (i.e. company.com) is not listed on the Public Suffix List, even SameSite=strict cookies for a.company.com will still be sent on requests initiated from b.company.com.
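To make the gotcha concrete, here's a toy sketch of how "same site" falls out of eTLD+1. The two-entry suffix set stands in for the real Public Suffix List and is purely illustrative:

```python
# Toy illustration of same-site as eTLD+1. Real browsers consult the
# full Public Suffix List (publicsuffix.org); this tiny set is a
# stand-in for the example only.
PUBLIC_SUFFIXES = {"com", "co.uk"}

def etld_plus_one(host: str) -> str:
    """Return the registrable domain (eTLD+1) for a hostname."""
    labels = host.lower().split(".")
    # Scan from the longest candidate suffix down; on a match, keep
    # exactly one more label to the left of the public suffix.
    for i in range(len(labels)):
        candidate = ".".join(labels[i:])
        if candidate in PUBLIC_SUFFIXES:
            return ".".join(labels[max(i - 1, 0):])
    return host

def same_site(host_a: str, host_b: str) -> bool:
    return etld_plus_one(host_a) == etld_plus_one(host_b)

# a.company.com and b.company.com are the *same site*, so even
# SameSite=strict cookies flow between them:
print(same_site("a.company.com", "b.company.com"))  # True
print(same_site("company.com", "othersite.com"))    # False
```

If company.com were itself on the Public Suffix List, the two subdomains would become different sites and the cookies would stop flowing.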

[+] lol768|4 years ago|reply
I believe originally (back in the early drafts of the spec) the concept of a "site" was significantly stricter (based on the origins matching), but it got watered down which was a real shame. I'm not sure why.

c.f. https://tools.ietf.org/html/draft-west-first-party-cookies-0... and https://tools.ietf.org/html/draft-west-first-party-cookies-0...

Excerpts (draft 2):

> If "document" is a first-party context, and "request"'s URI's origin is the same as the origin of the URI of the active document in the top-level browsing context of "document", then return "First-Party".

vs. (draft 3)

> A document is considered a "first-party context" if and only if the registerable domain of the origin of its URI is the same as the registerable domain of the first-party origin, and if each of the active documents in its ancestors' browsing contexts' is a first-party context.

[+] capableweb|4 years ago|reply
What reason is there even to use cookies anymore? Use LocalStorage instead and get better protection, since it isn't sent along with requests by default.
[+] lemarchr|4 years ago|reply
Great overview with a minor nit-pick; it's Cross-Origin Resource Sharing rather than Request Sharing. It describes a server's willingness to share its resources across origins. The client's request isn't the thing being shared.
[+] 1vuio0pswjnm7|4 years ago|reply
"It is good practice to always use the SameSite directive with cookies as this provides protection against CSRF attacks."

"As an added bonus, many of the mitigations on this page can be applied at the proxy server (CSP, HSTS, HPKP) or network level (better server proxying to remove the need for CORS), and only the CSRF and XSS protections really need to be added to the application."

If I add a line to the localhost-bound forward proxy that the application uses so that "SameSite" is added to every cookie, then the second statement appears misleading.

As a user, I rely on a (forward) proxy. Much easier for me to focus on the proxy than trying to make sure every application^1 is doing the right things.

Both parties to an HTTP transaction can use proxies to execute mitigations. And as the author states, the ones he is mentioning are only some of the possibilities.

1. Especially ones that we do not compile from source and are distributed by "tech" companies that rely on online advertising as their main source of revenue. We users are not their customers, we are the guinea pigs.

[+] evanspa|4 years ago|reply
> Note that CORS preflight requests are not made for GET HEAD POST requests with default headers.

I really wish the author included an explanation for this. What are "default headers"? What special header(s) needs to be on the request in order for a preflight request to be made?

[+] horsawlarway|4 years ago|reply
If you're genuinely interested, MDN has some pretty great documentation on the subject.

https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS

For your specific question, this is the relevant section of the above link

----

Apart from the headers automatically set by the user agent (for example, Connection, User-Agent, or the other headers defined in the Fetch spec as a “forbidden header name”), the only headers which are allowed to be manually set are those which the Fetch spec defines as a “CORS-safelisted request-header”, which are:

Accept

Accept-Language

Content-Language

Content-Type (but note the additional requirements below)
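The rules above can be boiled down to a small decision function. This is a simplified sketch of the browser's "does this request need a preflight?" check (the Fetch spec additionally constrains header value lengths and a few other details omitted here):

```python
# Simplified sketch of the CORS "simple request" check per the Fetch
# spec: safe method + only safelisted headers + a safelisted
# Content-Type means no preflight OPTIONS request is made.
SAFE_METHODS = {"GET", "HEAD", "POST"}
SAFELISTED_HEADERS = {"accept", "accept-language", "content-language", "content-type"}
SAFELISTED_CONTENT_TYPES = {
    "application/x-www-form-urlencoded",
    "multipart/form-data",
    "text/plain",
}

def needs_preflight(method: str, headers: dict) -> bool:
    if method.upper() not in SAFE_METHODS:
        return True
    for name, value in headers.items():
        if name.lower() not in SAFELISTED_HEADERS:
            return True
        # Even Content-Type is only safelisted for three media types.
        if name.lower() == "content-type" and \
                value.split(";")[0].strip().lower() not in SAFELISTED_CONTENT_TYPES:
            return True
    return False

print(needs_preflight("POST", {"Content-Type": "text/plain"}))        # False
print(needs_preflight("POST", {"Content-Type": "application/json"}))  # True
print(needs_preflight("GET", {"X-Requested-With": "xhr"}))            # True
print(needs_preflight("PUT", {}))                                     # True
```

This is why a plain form POST or JSON-free GET never triggers a preflight, while adding a custom header or an `application/json` body does.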

[+] LegionMammal978|4 years ago|reply
As I understand it, the main purpose of CORS is to prevent information from being leaked by domains other than the current one using JavaScript, since the browser will always send those domains' cookies in all requests. In that case, why doesn't JavaScript have a method of sending a request without any cookies? Would it still somehow be vulnerable to CSRF attacks? Is there simply no demand for the feature? Are there other issues with the concept that I don't know about? (The main context of this is from an attempt to create a client-side JavaScript application which calls a certain public API, which turned out to be impossible since it did not implement CORS headers.)
[+] saurik|4 years ago|reply
I totally agree with you. That said, there are people who disagree with both of us and believe that it is reasonable for people to use IP-address-based authorization schemes--which, for avoidance of doubt, might simply be "I am behind a firewall (but all the IP addresses behind my firewall are public addresses, and so cannot be disallowed for this purpose by IETF CIDR)"--and so keep insisting that you should not be able to use a script on a website to "port scan" behind someone's firewall and attack their other half-protected file servers, computers, and printers. This is then why a mechanism actually does exist to say "send a request without cookies"... but it is only for a GET and, this being the key limitation, the script isn't allowed to see the value of what was returned or even whether it succeeded or failed (although I can't for the life of me find any documentation on this right now, despite swearing I was just trying to use this last week before realizing the response body limitation). Otherwise, this whole thing always feels like some half-assed attempt at DRM :/.
[+] izolate|4 years ago|reply
> The reason access-control-allow-origin cannot be '*' when access-control-allow-credentials is set is to prevent developers taking the shortcut of adding a * and then forgetting about it altogether - this behaviour forces developers to think about how their API is going to be consumed.

Instead developers take the shortcut of creating middleware that captures the Origin header in the request and mirrors it into the response, effectively creating the same insecure ruleset.
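For anyone who hasn't seen this anti-pattern, here's a hypothetical middleware sketch of the shortcut described above. Blindly echoing the request's Origin (with credentials allowed) is effectively `*` plus cookies:

```python
# The insecure "mirror the Origin" shortcut, sketched as a
# hypothetical middleware. Any origin -- including an attacker's --
# gets permission to read credentialed responses.
def insecure_cors_middleware(request_headers: dict, response_headers: dict) -> dict:
    origin = request_headers.get("Origin")
    if origin:
        # BAD: the origin is never checked against an allowlist.
        response_headers["Access-Control-Allow-Origin"] = origin
        response_headers["Access-Control-Allow-Credentials"] = "true"
    return response_headers

resp = insecure_cors_middleware({"Origin": "https://evil.example"}, {})
print(resp["Access-Control-Allow-Origin"])  # https://evil.example
```

The fix is the same one the wildcard restriction was trying to force: compare the Origin against an explicit allowlist before reflecting it.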

[+] technojunkie|4 years ago|reply
This is good information, and I'd love to see a write-up on how Firefox, Chrome, Brave and other browsers can be set up to prevent some of this.

For example, Firefox has both first-party isolation mode and now Total Cookie Protection, which isolates cookies and would thus likely prevent CSRF. However, I think first-party isolation causes CORS issues like when trying to pay with Paypal on another retail site.

[+] cmeacham98|4 years ago|reply
> which isolates cookies and would thus likely prevent CSRF

CSRF is often done via redirecting you or submitting a form, both of which obviously completely bypass FPI and dFPI (i.e. the cookie part of Total Cookie Protection).

> I'd love to see a write up how Firefox, Chrome, Brave and other browsers can be set up to prevent some of this.

I only use firefox, you'll have to find information elsewhere for other browsers.

CSRF, XSS, Set-Cookie

Need to be fixed server-side, there is little to nothing you can do as the client. CSRF and XSS represent straight-up vulnerabilities in the website. Report to the developer and/or stop using the vulnerable website.

CORS

No additional work needed for the security benefits; to reduce its ability to track you: https://addons.mozilla.org/en-US/firefox/addon/privacy-orien...

CSP, X-Frame-Options

You can achieve the same effect of whitelisting 3rd parties by using an extension such as uBlock Origin or uMatrix (warning: no longer in development) in default-deny mode.

HSTS

https://support.mozilla.org/en-US/kb/https-only-prefs

HPKP

Nobody uses this nowadays. Only semi-related, but you can turn on mandatory revocation checking (security.OCSP.require).

Referrer-Policy

    network.http.referer.XOriginPolicy
0=always (default), 1=only if base domains match, 2=only if hosts match

    network.http.referer.XOriginTrimmingPolicy
0=send full URI (default), 1=scheme+host+port+path, 2=scheme+host+port

These apply only to cross-origin requests but that's probably where you care about the referer. Note that the website's Referrer-Policy might override these, I haven't tested that.

[+] kevin_thibedeau|4 years ago|reply
> However, I think first-party isolation causes CORS issues like when trying to pay with Paypal on another retail site.

Generate URLs with a time limited token as query param. No cookies needed.
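One way to implement this suggestion is an HMAC-signed, time-limited token carried as a query param, so no cookie is involved at all. The secret key and 300-second lifetime below are illustrative choices, not anything from the article:

```python
# Sketch: time-limited, HMAC-signed URL tokens instead of cookies.
# SECRET_KEY and the 300-second lifetime are illustrative only.
import hashlib
import hmac
import time

SECRET_KEY = b"server-side-secret"

def make_token_url(resource: str, lifetime: int = 300) -> str:
    expires = str(int(time.time()) + lifetime)
    sig = hmac.new(SECRET_KEY, f"{resource}|{expires}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{resource}?expires={expires}&sig={sig}"

def verify(resource: str, expires: str, sig: str) -> bool:
    if int(expires) < time.time():
        return False  # token has expired
    expected = hmac.new(SECRET_KEY, f"{resource}|{expires}".encode(),
                        hashlib.sha256).hexdigest()
    # Constant-time compare to avoid timing side channels.
    return hmac.compare_digest(expected, sig)

url = make_token_url("/checkout/12345")
print(url)
```

Since the token is bound to the resource and an expiry, a leaked URL is only useful briefly, and nothing is sent automatically by the browser the way a cookie would be.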

[+] IncludeSecurity|4 years ago|reply
Trying to demystify CORS in a couple of paragraphs... good luck with that! I think a 200-page book would still be too short to demystify it. It's a crazy topic.
[+] arcbyte|4 years ago|reply
I never understood the difficulty with CORS. It's dirt simple: don't send requests across domain names. And if you do, make sure you return header(s) from the target resource to specifically allow the origin to request it.

All the difficulty seems to be people trying to do crazy, esoteric things there's no good reason to be doing in the first place.

[+] myfonj|4 years ago|reply
> access-control-allow-origin: The list of origins allowed to make requests.

Is it really a list? AFAIK, and according to MDN: "Only a single origin can be specified."

https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Ac...

[+] rndgermandude|4 years ago|reply
You're correct, it's not a list. A browser will send an Origin header during the OPTIONS preflight that the server can check and then return back that value in an Access-Control-Allow-Origin response header, or it can return * without any checks if e.g. it's a public API endpoint anyway and expected to be hit by fetch/XHR traffic from all kinds of places.

Non-browser clients (and browsers, for non-CORS and/or "simple" requests) will not usually send any CORS headers or preflight requests, so you should account for that when building a web API. Non-browser clients can of course just fake any browser header and request they want, so the Access-Control headers are NOT a substitute for real access control/authentication.

[+] stephenmcirl77|4 years ago|reply
Yep, that's correct. Only a single origin is supported. The implementation on the backend server/proxy may use a lookup list, and return the specified origin if that exists in the list. As called out in the post too, * is a valid one (as is null) but is not recommended.
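The lookup-list approach described above can be sketched in a few lines; the origins here are placeholders:

```python
# Allowlist lookup for Access-Control-Allow-Origin: echo the request's
# Origin back only if it is on a fixed list. Origins are placeholders.
from typing import Optional

ALLOWED_ORIGINS = {"https://app.example.com", "https://admin.example.com"}

def cors_headers(origin: Optional[str]) -> dict:
    if origin in ALLOWED_ORIGINS:
        return {
            "Access-Control-Allow-Origin": origin,
            # Vary: Origin keeps shared caches from serving one
            # origin's CORS headers to a different origin.
            "Vary": "Origin",
        }
    # No CORS headers at all: the browser blocks the cross-origin read.
    return {}

print(cors_headers("https://app.example.com"))
print(cors_headers("https://evil.example"))  # {}
```

Unlike the origin-mirroring shortcut, an unknown origin gets no CORS headers at all, which is the safe default.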
[+] 1vuio0pswjnm7|4 years ago|reply
"Note that CORS preflight requests are not made for GET HEAD POST requests with default headers."

What are these "default" headers? I have seen access-control-allow-* response headers when making HTTP requests. I do not send unnecessary headers. Perhaps some of the ones I do not send are considered "default".

"Thus CORS is a way of selectively loosening security not of tightening it."

The proxy config I use scrubs all CORS headers. As the author states, CORS is irrelevant outside the browser. I make most HTTP requests outside the ("modern") browser anyway.

"Overall, as the web grows in terms of features and complexity, the attack surface also grows correspondingly large."

Job security for some people, I guess.

Apparently there is no sufficient incentive to simplify things (by subtraction not addition).