
Do You Really Know CORS?

603 points | grzegorz_mirek | 7 years ago | performantcode.com

124 comments

[+] Klathmon|7 years ago|reply
This is an awesome overview! But don't take it as all encompassing, it doesn't go into some of the more esoteric edge cases with CORS, like:

* either an unreleased Safari version or the most recent one will send preflight requests even when the request meets the spec's "simple request" criteria (e.g. if Accept-Language is set to a value it doesn't like).

* If you use the ReadableStream API with fetch in the browser, a preflight will be sent.

* If there are any event listeners attached to XMLHttpRequest.upload, it will cause a preflight

* cross-domain @font-face URLs, images drawn to a canvas via drawImage, and some WebGL operations will also obey CORS

* the crossorigin attribute is required for cross-origin linked images or CSS; otherwise the response will be opaque and JS won't have access to anything about it.

* if you mess up CORS stuff, you get opaque responses, and opaque responses are "viral", so they can cause entire canvas elements to become "blacklisted" and extremely restricted.

I feel like I've cut my teeth on this API more than most, and I still feel like I'm only scratching the surface!

[+] laumars|7 years ago|reply
The entire webstack is such a broken mess of inconsistencies and thousands of hidden traps that can render the entire thing insecure.

People moan about C, yet I find the web stack far more painful to write for because you don't even have control over whether the "compiler" follows the standards strictly (where things have even been standardised).

I really do wish we worked together to create a new standard for building and deploying documents and applications over the internet, because HTML (and all its supporting technologies) is an experiment that has gone bad. I'd preferably want something that doesn't allow each browser to interpret the specifications differently, and absolutely something that isn't controlled by Google (they would obviously need input, but the last thing we need is another AMP).

Of course it will never happen, but one can dream / rant nonetheless.

[+] saganus|7 years ago|reply
Genuine question: why do you feel you've cut your teeth more than most?

I.e. what kind of dev work do you do that makes you have to deal with this more than the average developer?

[+] rbirkby|7 years ago|reply
Good points. Also, the strange case of Access-Control-Allow-Origin containing a list of origins. Or rather, the lack of support for that part of the standard.
[+] 3pt14159|7 years ago|reply
I'm ideologically against third party on the web because it is a privacy nightmare. But I'm in the system that I'm in, and I don't take on fights that aren't possible to win, so barring my becoming a billionaire I've kinda just accepted that third party is here for at least a little while and I'm not going to refuse to use ads and analytics. Except on my personal website, that gets to stay cool.

That said, CORS is the only thing about third party that I actually like. It's secure by default. That the ensure header and accept header are different things is amazing. I know all about the performance issues[0], but I'm ok with it in the right context.

[0] It's kinda funny how against the herd I am here. I think it is shitty that the preflight sometimes doesn't happen. It's so weird to me that we carved out exceptions for this and seems like an otherwise secure-by-default system should have come with the guaranteed preflight.

[+] paulddraper|7 years ago|reply
CORS is not necessarily about third parties.

It's common to have app.example.org point to a CDN and api.example.org point to an API.

And the CORS implementation is terrible. The server has to transmit validation rules for the browser to enforce (with vendor-specific caching differences), rather than just enforcing access itself.

The reason it's implemented this way is because of the organic evolution of web security.

[+] matchagaucho|7 years ago|reply
Our microservices stack is pretty dependent upon clients making cross-origin requests. I don't necessarily consider these "3rd party".
[+] Thu27Sep|7 years ago|reply
One idea that the article doesn't convey well, in my opinion, is that the Same-Origin Policy only prevents the browser from reading the response from an HTTP server on a third-party host, but it doesn't prevent the request from being issued in the first place. The CORS headers are merely a way for the server to indicate to the browser whether it is allowed to read the response or not, but they don't protect the server from anything.

In particular, setting the "Access-Control-Allow-Credentials" header to true means that a client which sent a request with a cookie is allowed to read the result; but whether the request is sent with a cookie or not, and will be treated as such by the server, is entirely up to the client.

So although malicious.com cannot read the details of bank.com using AJAX, it can definitely send a POST request to trigger a transfer from the user's account to a malicious account using the user's cookie (blindly so).

This is the reason proper CSRF protection must be implemented by the server, independently of whether CORS is enabled or not.

[+] sbergot|7 years ago|reply
This is not entirely true. The preflight's role is exactly to prevent such a POST request from being sent to the server. It's only in particular cases that no preflight happens.
[+] waffle_ss|7 years ago|reply
I recently implemented a feature that depends on CORS and I don't see anything in this article that adds any value over Mozilla's thorough CORS docs.[1]

If you're writing an article on CORS today I also think you should mention recent CORS developments such as Cross-Origin Read Blocking (CORB)[2] and features on the horizon such as Cross-Origin-Resource-Policy, Cross-Origin-Window-Policy, etc. that in light of Spectre, Meltdown etc. are meant to help plug speculative execution holes.[3]

[1]: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS

[2]: https://www.chromium.org/Home/chromium-security/corb-for-dev...

[3]: https://www.arturjanc.com/cross-origin-infoleaks.pdf

[+] emmelaich|7 years ago|reply
> performantcode.com took too long to respond. ERR_CONNECTION_TIMED_OUT

Too ironic not to mention.

[+] lwansbrough|7 years ago|reply
I love CORS. The level of security I provide with my API is only possible because of it. Without CORS, I don't believe my API design would be practical, due to the security risks.

If I have access to the Fetch API, I can tell the browser to send the user's cookie cross-origin and I can validate the request based on the origin. This allows for interesting authentication scenarios without the need for explicit client-user consent pages.
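For what it's worth, the client side of that looks roughly like the sketch below. It only constructs the request options, since the actual call needs a browser; the endpoint URL is illustrative.

```javascript
// Options for a credentialed cross-origin fetch: 'include' asks the browser
// to attach the user's cookies even though the request crosses origins.
// The server must answer with Access-Control-Allow-Credentials: true
// and echo the specific origin for the response to be readable.
function credentialedRequestInit() {
  return {
    method: 'GET',
    credentials: 'include',
  };
}

// In the browser: fetch('https://api.example.org/me', credentialedRequestInit())
console.log(credentialedRequestInit().credentials); // 'include'
```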

[+] acoard|7 years ago|reply
>and I can validate the request based on the origin

Are you talking about the HTTP referer? That's easily spoofable and can't be relied on server-side. The same-origin policy and all the CORS security is implemented in the browser itself, not in HTTP.

If you need to be certain that a request originated from your own page and not another domain you need to use a CSRF token.

[+] JepZ|7 years ago|reply
I never really understood why we have CORS. I mean, the problem with CSRF is that some random page can trick your browser into adding its authentication token to a request which does not originate from the authenticated page. So why then do we need the server to tell the browser that it should not send requests from other origins?

In my opinion, it would have been much better to improve browsers to not include cookies in 3rd-party requests automatically (only when they are explicitly specified via JS, for example). That should have solved the issue equally well, without introducing a bulky server-side security feature to remote-control browsers.

[+] jtokoph|7 years ago|reply
CORS is really for the opposite problem. Browsers do block requests from other origins by default (mostly). CORS is used to let the server decide which origins are allowed to request data and how it can be requested. If the client was allowed to decide via javascript, then attacker.com could make a request via javascript to facebook.com telling the browser to send cookies and return the user's data. This is actually what the client JS has to do anyway with CORS (using credentials: true), but the server side needs to be able to allow/deny it.
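A minimal sketch of the server-side decision this describes (the allowlist and helper name are illustrative):

```javascript
// Origins the server is willing to share responses with.
const ALLOWED_ORIGINS = new Set([
  'https://app.example.org',
  'https://admin.example.org',
]);

// Decide which CORS headers to attach, based on the request's Origin header.
function corsHeadersFor(origin) {
  if (!ALLOWED_ORIGINS.has(origin)) {
    return {}; // no CORS headers -> the browser blocks the read
  }
  return {
    // Echo the specific origin rather than '*': a wildcard is not
    // permitted when Access-Control-Allow-Credentials is true.
    'Access-Control-Allow-Origin': origin,
    'Access-Control-Allow-Credentials': 'true',
    'Vary': 'Origin', // caches must key on Origin when it is echoed back
  };
}

console.log(corsHeadersFor('https://app.example.org'));
console.log(corsHeadersFor('https://attacker.example')); // {}
```

So attacker.com's credentialed request may still reach facebook.com, but without the allow headers in the response, the browser never hands the data to attacker.com's script.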
[+] zackmorris|7 years ago|reply
Ya I generally think CORS is a waste of time. It would have been better to provide a hash of the file we're linking to and trust that rather than where it came from. Which is precisely what Subresource Integrity (SRI) does:

https://en.wikipedia.org/wiki/Subresource_Integrity

Sadly even though this is an obvious concept and trivial to implement, it took them over 20 years since the web came out to get it in most browsers. The cost to society of having thousands of copies of the same commonly used files (like jQuery) hosted locally on countless servers rather than having a centrally hosted version already cached from previously visited sites is staggering to contemplate. I'd really like to know who was behind the holdup on deploying SRI.

[+] guscost|7 years ago|reply
CORS is a technical subsidy granted to (sloppy) users of cookie authentication. I’ve never worked on a project where it was anything other than an annoying hoop to jump through.
[+] matt4077|7 years ago|reply
When has "subsidy" become the go-to narrative to argue against any sort of deference to real-world usage?

You could just as easily frame CORS as "antibiotics for the people who dared to leave their house".

(There's also a no-true-scotsman fallacy going on in your argument)

[+] alexnewman|7 years ago|reply
I'd also argue it encourages worst practices
[+] parhamn|7 years ago|reply
I'm curious how many newer JS+API applications still use browser cookies as a means of authenticating API requests, and how prevalent cookie usage still is for these types of applications?

JWT/tokens + Local/session storage + adding fetch headers seems like the best way as long as you don't run untrusted JS.

[+] anthuswilliams|7 years ago|reply
"as long as you don't run untrusted JS" is a surprisingly tough bar to clear, even for very experienced developers.

What is your reason for preferring JWT + localStorage for authentication and session handling? I'm genuinely curious, as httpOnly cookies strike me as better in every meaningful way.

[+] philcockfield|7 years ago|reply
LocalStorage is so much more preferable than cookies, I agree, however as SSR ("server-side-rendering") of heavy client-side JS apps becomes more prevalent, suddenly cookies are back in business.

If the initial SSR needs some initial client-state to complete its work before sending the HTML payload, it can see the cookie, but not localStorage.

[+] stesch|7 years ago|reply
Had to deal with a firewall that filtered all unknown/"new" HTTP headers. This included the CORS headers.

A PITA to find the reason why Firefox wouldn't load the Google Fonts.

[+] crooked-v|7 years ago|reply
That kind of crapware is why I'm increasingly glad that the http specs are moving towards being completely illegible to middleware boxes.
[+] lysium|7 years ago|reply
This is a really informative article. We recently stumbled across this issue, and none of the other pages I could google explained it as clearly as this one.
[+] ottomanage|7 years ago|reply
My experience was the same as yours, resources explaining things assumed technical knowledge far beyond my level or were not very clear. This was a very well put-together article.

My only contribution to the discussion is that if you get a CORS error where you wouldn't expect it, the problem might not be a CORS issue. I spent the better part of a weekend trying to debug why a request to a Google API wasn't working and why I was seeing a CORS error (same thing worked fine on another system). Turns out, it wasn't the same thing, my url had a typo...

[+] denormalfloat|7 years ago|reply
Why not just include the Origin on all cross-origin requests? Then the server could deny/allow it without the need for preflight.
[+] jraph|7 years ago|reply
I would be concerned about the privacy implication of it. Imagine if the browser sent the origin to widely used CDNs, or to Google Fonts, and that people didn't actually block Google domains on their browsers.

Also, this would not be secure by default, because you would have to change the default behavior of the server to block cross origin requests.

[+] dvdcxn|7 years ago|reply
How do you prevent people proxying your API via a node service?

This is something I could never get my head around with CORS - what's the point of whitelisting origins if getting around the whitelist is nothing more than an inconvenience?

[+] Liquidor|7 years ago|reply
CORS is mostly used to prevent attacks from a browser script on a non-whitelisted website (CSRF etc.).

To prevent someone abusing your API otherwise, use an authentication method.

[+] sciurus|7 years ago|reply
The user is still protected in that case.

If you create a proxy for foo.com, your javascript can't get the browser to send the user's cookies for foo.com to your proxy.

[+] chii|7 years ago|reply
It's not free to run a proxy like that.
[+] bluepnume|7 years ago|reply
I built out https://github.com/krakenjs/fetch-robot to avoid some of the esoteric issues around CORS endpoints -- and to avoid the performance hit of that preflight request.

It acts as a `fetch` implementation that allows you to declare cross-origin policies in advance, then channel the requests through an iframe which enforces those policies.

[+] afs35mm|7 years ago|reply
Here's one question that's always bugged me: what's stopping a malicious user from sending an HTTP request from any API client like Postman, or even curl from the command line? Something like a POST with: {transferTo: myAccountId, amount: 1000000000}?

Obviously in any nontrivial web app it would fail because of authentication issues, but if a server doesn't do ANY sort of security checking, that should work, no? Does that mean that the onus is on the server developer of mybank.com? And if so, what would stop the malicious request from working on any server developed before the existence of CORS?

[+] joevandyk|7 years ago|reply
Server is supposed to check authentication/authorization through some method.

If HTTP, that’s done via setting some information in the request headers, be it a cookie, or basic auth, or token auth, or similar.

CORS is done by the browser - to not allow certain requests to be made (In case you are accidentally executing malicious javascript code). The server tells the browser via the CORS headers which requests are ok to make.
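To sketch the point about server-side checks: what actually stops a curl/Postman request is authentication, not CORS. The token store and names below are illustrative.

```javascript
// Minimal bearer-token check a server might run on every request.
// CORS never enters into it -- curl ignores CORS entirely.
const VALID_TOKENS = new Set(['secret-token-for-alice']);

function isAuthorized(headers) {
  const auth = headers['authorization'] || '';
  const [scheme, token] = auth.split(' ');
  return scheme === 'Bearer' && VALID_TOKENS.has(token);
}

console.log(isAuthorized({ authorization: 'Bearer secret-token-for-alice' })); // true
console.log(isAuthorized({}));                                                 // false
```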

[+] jedberg|7 years ago|reply
My only experience with CORS has been when trying to access api.foo.com from a web page on foo.com, and then getting denied. Then I messed with the settings on api.foo.com trying to get it to allow access from foo.com, and then I gave up and just configured the load balancer on foo.com to proxy requests to foo.com/api to api.foo.com.

So far it's only gotten in my way as a developer. But it's there to protect users, not me. So at the end of the day, I'm glad it's there as a way to somewhat prevent people from tricking my users into hitting my api with malicious requests.

[+] dasil003|7 years ago|reply
You have it backwards. This type of request was not possible at all before CORS, CORS is what allows you to make it possible.
[+] stockkid|7 years ago|reply
very nice article. I thought I understood CORS but learned some new things:

* not all cross-origin requests need to be preflighted

* to use credentials, the server needs to explicitly allow credentials to be sent from the client
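The first point can be sketched as a rough predicate. This mirrors the CORS-safelisted method/header rules, but is simplified; the real Fetch spec has more conditions (value lengths, byte ranges, etc.).

```javascript
// Methods and headers that do NOT trigger a preflight by themselves.
const SIMPLE_METHODS = new Set(['GET', 'HEAD', 'POST']);
const SIMPLE_HEADERS = new Set([
  'accept', 'accept-language', 'content-language', 'content-type',
]);
const SIMPLE_CONTENT_TYPES = new Set([
  'application/x-www-form-urlencoded', 'multipart/form-data', 'text/plain',
]);

// Roughly: does this cross-origin request require an OPTIONS preflight?
function needsPreflight(method, headers) {
  if (!SIMPLE_METHODS.has(method.toUpperCase())) return true;
  for (const [name, value] of Object.entries(headers)) {
    const lower = name.toLowerCase();
    if (!SIMPLE_HEADERS.has(lower)) return true;
    if (lower === 'content-type' &&
        !SIMPLE_CONTENT_TYPES.has(value.split(';')[0].trim())) {
      return true;
    }
  }
  return false;
}

console.log(needsPreflight('POST', { 'Content-Type': 'text/plain' }));       // false
console.log(needsPreflight('PUT', {}));                                      // true
console.log(needsPreflight('POST', { 'Content-Type': 'application/json' })); // true
```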

[+] exabrial|7 years ago|reply
No... I don't, despite my best efforts to be diligent about learning it. :( It's a workaround for a bunch of compromises we made, where the cure is almost not worth the pain.
[+] otabdeveloper2|7 years ago|reply
> Do You Really Know CORS?

Of course I don't, nobody does.