"In the case of Firefox’s image cache, a tracker can create a supercookie by “encoding” an identifier for the user in a cached image on one website, and then “retrieving” that identifier on a different website by embedding the same image."
Clever. And so frustrating that optimisations need to be turned off due to bad actors.
Per-site caching negates the principal selling point of centrally-hosted JS and resources, including fonts. The convenience remains, but all speed-related perks (due to the resources being reused from earlier visits to unrelated sites) are no more... which is actually great, because it reduces the value that unscrupulous free CDN providers can derive from their "properties".
It also means that I can remove fonts.google.com from the uBlock blacklist. Yay.
Use uBlock Origin, Multi Account Containers, Privacy Badger, Decentraleyes and CookieAutoDelete with Firefox. Make sure you aggressively clear cache, cookies, etc., periodically (with CookieAutoDelete). You’ll probably load the web servers more and also add more traffic on your network, but it will help protect your privacy since most websites don’t care about that. When websites are user hostile, you have to take protective measures yourself.
Twitter uses this type of cookie. They even use cookies that contain no reference to the twitter domain. It is how they track people who have been suspended from the platform.
That was the first thing that came to mind when I read this article. It looks very similar, though Firefox seems to be addressing more than just resource caching, like addressing the HSTS tracking scheme. Also, I would not be surprised if Chrome eventually did partitioning for anything but Google resources; they surely won't do anything that hurts their surveillance schemes?
From a purely web browsing experience the first iPad 'should be' powerful enough to browse ANYTHING out there these days. But it can't. The last few models will increasingly have the same issues as the sheer volume of muck and cruft that's included with the advertising gack just continues to explode.
I'm definitely of the opinion that our web browsing devices are marketing tools that we are allowed to use for media consumption.
"a tracker can create a supercookie by “encoding” an identifier for the user in a cached image on one website, and then “retrieving” that identifier on a different website by embedding the same image. To prevent this possibility, Firefox 85 uses a different image cache for every website a user visits. That means we still load cached images when a user revisits the same site, but we don’t share those caches across sites.
In fact, there are many different caches trackers can abuse to build supercookies. Firefox 85 partitions all of the following caches by the top-level site being visited: HTTP cache, image cache, favicon cache, HSTS cache, OCSP cache, style sheet cache, font cache, DNS cache, HTTP Authentication cache, Alt-Svc cache, and TLS certificate cache."
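To make the attack concrete, here's a toy Python model of ETag-based cache tracking (one well-known supercookie variant) and of how keying the cache by top-level site defeats it. All names are hypothetical; this is a sketch, not Firefox's actual implementation.

```python
import secrets

class Server:
    """Tracker server: tags each first-time client with an ID via the ETag header."""
    def __init__(self):
        self.seen = set()

    def fetch(self, etag=None):
        # Conditional request: a client echoing a known ETag identifies itself.
        if etag is not None:
            return ("304 Not Modified", etag)
        new_id = secrets.token_hex(8)
        self.seen.add(new_id)
        return ("200 OK", new_id)

class Browser:
    def __init__(self, partitioned):
        self.partitioned = partitioned
        self.cache = {}  # cache key -> stored ETag

    def visit(self, top_site, resource_url, server):
        # A shared cache keys on the URL alone; a partitioned cache also keys
        # on the top-level site, as in Firefox 85's network partitioning.
        key = (top_site, resource_url) if self.partitioned else resource_url
        status, etag = server.fetch(self.cache.get(key))
        self.cache[key] = etag
        return etag  # the identifier the tracker sees for this visit

tracker = Server()
shared = Browser(partitioned=False)
id_a = shared.visit("site-a.example", "https://tracker.example/pixel.gif", tracker)
id_b = shared.visit("site-b.example", "https://tracker.example/pixel.gif", tracker)
print(id_a == id_b)   # True: the same ID follows the user across sites

tracker2 = Server()
part = Browser(partitioned=True)
id_a = part.visit("site-a.example", "https://tracker.example/pixel.gif", tracker2)
id_b = part.visit("site-b.example", "https://tracker.example/pixel.gif", tracker2)
print(id_a == id_b)   # False: each top-level site gets its own cache entry
```

The same shape works for any shared cache (images, favicons, fonts): anything a site can write on one visit and read back on another is an identifier.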
> In fact, there are many different caches trackers can abuse to build supercookies. Firefox 85 partitions all of the following caches by the top-level site being visited: HTTP cache, image cache, favicon cache, __HSTS cache__, OCSP cache, style sheet cache, font cache, DNS cache, HTTP Authentication cache, Alt-Svc cache, and TLS certificate cache.

(emphasis mine)

This has negative effects on security, as has been pointed out previously by others: https://nakedsecurity.sophos.com/2015/02/02/anatomy-of-a-bro...
Imagine a widely used legitimate non-tracking resource, say, a shared JS or CSS library on a CDN. Currently, if that CDN uses HSTS, no matter how many independent websites incorrectly include said resource using an http:// URL, only the first request is MITMable, as every subsequent request will use the HSTS cache.
However, now, every single site will have its own separate HSTS cache, so every single first request from each site will independently be MITMable. Not good. This makes HSTS preloading even more important: https://hstspreload.org/
(Good news, if you have a .app or .dev domain, you're already preloaded.)
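A toy model of the trade-off described above (not real HSTS state handling; hostnames are hypothetical): with a shared HSTS cache only the very first http:// request is exposed, while a partitioned cache exposes one request per embedding site. A preloaded host would start out pinned and expose none.

```python
class HstsBrowser:
    """Counts http:// requests that go out before an HSTS pin is learned."""
    def __init__(self, partitioned):
        self.partitioned = partitioned
        self.hsts = set()        # pinned hosts, or (site, host) pairs if partitioned
        self.mitmable = 0        # plaintext requests an attacker could intercept

    def load(self, top_site, host):
        key = (top_site, host) if self.partitioned else host
        if key not in self.hsts:
            self.mitmable += 1   # first contact goes out over plain http://
            self.hsts.add(key)   # response carries Strict-Transport-Security

sites = [f"site-{i}.example" for i in range(10)]

shared = HstsBrowser(partitioned=False)
for site in sites:
    shared.load(site, "cdn.example")      # e.g. an http:// <script> include

partitioned = HstsBrowser(partitioned=True)
for site in sites:
    partitioned.load(site, "cdn.example")

print(shared.mitmable)       # 1: only the very first request is exposed
print(partitioned.mitmable)  # 10: one exposed first request per embedding site
```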
That's probably Firefox's own Multi-Account Containers (https://addons.mozilla.org/en-US/firefox/addon/multi-account...). It groups caches/cookies into designated containers for each tab (personal, work, shopping, etc.), with smart recognition of assigned sites.
I've had first party isolation turned on for possibly a couple of years now (certainly since before the pandemic) and it does break a small number of sites but nothing I particularly care about. Except that one internal tool that I've taken to loading in Chrome :P.
I don't recall the last time I had to temporarily disable it to allow something to work.
I believe the 'First-party isolation' feature does this, but you need to enable it from about:config, and even then, I'm not sure if it is complete or bug-free.
I'd like to see something like the Firefox container extension automatically open a new container for every unique domain name. It could get tricky for, e.g., federated logins, so I'm not 100% sure what the implementation would look like. But it'd be nice to have the option.
This is definitely a step in the right direction. The problem isn't the browser's ability to run code such as JS, which makes things like supercookies possible. The problem is that web pages are effectively semi-local applications (they can bind to local resources such as cookies, caches, and storage, and connect to local peripherals), and the security model for running untrusted apps is very different from the one for viewing HTML documents.
No good comes from downloading random, untrusted native applications from the net, installing them locally, and trusting them not to infect your system with malware. Likewise, no good comes from loading random, untrusted applications into your browser and trusting them not to infect it.
Basically everything on the web should not only be sandboxed with regard to local system (this is mostly established) but also be sandboxed per site by default. Hidden data, or anything that's not encoded in the URI, should only move between sites at the user's discretion and approval.
I've had personal plans to ditch my complex adblocking, multi-account/temporary container and privacy setup in favour of a clean-slate browser that I launch in a file-system sandbox such as Firejail for each logical browsing session (going shopping for X, online banking, general browsing). Basically an incognito session, but with full capabilities, since sites can refuse to serve incognito browsers. A normal browser setup, with everything wiped when the browser exits.
I have some dedicated browsers for things like Facebook where I whitelist a handful of necessary cookies and forget the rest on exit. This, however, won't clear data other than cookies. I think that per-site sandboxing would mostly solve this problem. I don't particularly care what data each site wants to store on my browser as long as it won't be shared with any other site.
Can anyone explain the fingerprinting issue, unrelated to cookies? Visit any one of the many sites that show you what your browser knows about you: even using Firefox with fingerprint blocking enabled, the site reveals a tremendous amount of information in your fingerprint. Firefox doesn't stop any of that, despite its settings that purport to do so. It's always the same information, not scrambled or randomized, from site to site.
Firefox’s default anti-fingerprinting is just a blacklist of common fingerprinting scripts.
It is incredibly difficult to make a browser fingerprint non-unique. Only the Tor browser has strict enough settings with a large enough user base to overcome fingerprinting.
If you don’t want to use Tor, try these:
- uBlock Origin (which has a larger blacklist of fingerprinting scripts)
- Enable the privacy.resistFingerprinting setting in about:config to make your browser more similar to other users with that setting enabled (but not entirely non-unique)
- The nuclear option: arkenfox user.js (https://github.com/arkenfox/user.js). Its GitHub repo also contains a lot of further information about fingerprinting.
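To illustrate why non-uniqueness is so hard: a fingerprint is typically just a hash over many browser-exposed attributes, so a single rare value makes the whole fingerprint unique. A rough sketch with made-up attribute values (real scripts sample dozens more signals, e.g. canvas rendering and font lists):

```python
import hashlib
import json

def fingerprint(attrs: dict) -> str:
    """Hash the browser-exposed attributes into a single identifier."""
    canonical = json.dumps(attrs, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

common = {
    "userAgent": "Mozilla/5.0 ... Firefox/85.0",
    "screen": "1920x1080",
    "timezone": "UTC",
    "language": "en-US",
}

# One rare value (an unusual screen size) is enough to stand out,
# even if every other attribute matches the crowd.
rare = dict(common, screen="3440x1440")

print(fingerprint(common) == fingerprint(common))  # True: stable across visits
print(fingerprint(common) == fingerprint(rare))    # False: a unique fingerprint
```

This is also why privacy.resistFingerprinting forces attributes to shared values (e.g. spoofing the timezone) rather than randomizing them: the goal is to shrink the hash's entropy, making many users hash alike.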
Which actually makes sense. If you had a "zero-fingerprint" browser, it would be useless, because you couldn't use any advanced features beyond displaying HTML.
It's a continual source of amazement for me that a majority of HNers are using a browser made by the largest data gobbler in the world, instead of one that actually tries to prevent spying on users.
It's probably one of the most obscure reasons, but I keep Chromium around because it's the only web browser with a JIT-backed JavaScript engine on ppc64le. Firefox has to run everything interpreted, which is actually fine for most sites, but bogs down on JS-heavy web-app type things.
On a much less niche side of things, a lot of web apps like Teams, Zoom, and probably many others are only fully functional on Chromium, thanks to WebRTC specifics and some video encoding stuff that's only on Chromium. Don't know the details, but things like video and desktop streaming are limited to Chromium.
That could very well be an artificially enforced restriction, but I don't think it is. I think Firefox is moving towards feature parity with Chrome on this one; I hope so, anyway.
Eh, I agree in general, but in this case, Chrome implemented network partitioning in Chrome 86, which became stable in October 2020, earlier than Firefox.
Google websites work better on Chrome. Not sure if it's because Google is doing something nefarious or if Firefox is just not keeping up with Google's website technologies.
So, I’ve trained my brain to use chrome as an app only for google websites. When I need to check gmail or YouTube or google calendar, I use chrome. Otherwise I’m on Firefox or safari.
It’s worked pretty well. I found I was only really unhappy with Firefox when using google websites. No longer a problem.
I switched to Firefox for private use a year ago, but overall I find it lacking: weird bugs, usability issues, dev tools that aren't as good, etc. Privacy-wise, the defaults don't seem great either. There was something about containers that are supposed to prevent tracking between different domains, but if you have to create containers manually rather than having them applied automatically per domain, that's not much use, since it makes things cumbersome.
How do you know user-agent strings of HNers? My guess would be that FF has above-average usage here, with FF topics getting upvotes regularly.
Hmm, come to think of it, does anybody know an easy Chrome-blocking trick for displaying "this page is best viewed using FF"? Might be an effective deterrent for non-"hackers" and the start of forking the web for good.
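For what it's worth, the usual trick is plain User-Agent sniffing, e.g. server-side. A minimal sketch; it's easily spoofed, it also matches most Chromium forks, and the UA strings below are just illustrative examples:

```python
def is_chrome(user_agent: str) -> bool:
    """Crude User-Agent sniff: Chrome (and most Chromium forks) send "Chrome/",
    while Firefox sends "Firefox/" and no "Chrome/" token."""
    return "Chrome/" in user_agent and "Firefox/" not in user_agent

chrome_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
             "(KHTML, like Gecko) Chrome/88.0.4324.150 Safari/537.36")
firefox_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:85.0) "
              "Gecko/20100101 Firefox/85.0")

print(is_chrome(chrome_ua))   # True
print(is_chrome(firefox_ua))  # False
```

A server could use this to serve the "best viewed using FF" page to Chrome visitors, though any user can change their UA string to get around it.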
I used chrome from 2008 to about 2013. At the time Chrome was fast and their macOS experience was amazing. But you could tell that Google was focusing more and more on integrations and services and less on the browsing experience.
Speed, especially with a large number of tabs opened, and the Dev tools. Chrome's are the most polished by far, and it's trivial to do remote debugging on Android devices.
How much money is a user actually worth per year on average? And why can I not pay that amount of money and be left alone, not seeing any ads, not being tracked, not being sold?
I'm still trying to imagine the way one exploits a lack of partitioning in the DNS cache...
1. It seems like client web pages cannot directly view the DNS information for a given domain name. So I would think embedding identifying information in something like a CNAME or TXT record directly wouldn't work.
2. I suppose a tracker could try to create unique records for a given domain name and then use request/responses to/from that domain to get identifying information. But this seems highly dependent on being able to control the DNS propagation. Short of my ISP trying this trick on me, I'm not really sure who else could manage.
I'm sure I am missing things in this brief analysis. I'd love to hear what others think about this cache.
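One answer to point 2: the tracker doesn't need to control propagation at all. It can encode an identifier in *which* of its own subdomains the browser has already resolved, and read it back by timing lookups (cached lookups return faster). A toy model that replaces timing with an explicit cache-hit flag (hypothetical domains throughout):

```python
import secrets

class DnsCache:
    def __init__(self, partitioned):
        self.partitioned = partitioned
        self.entries = set()

    def resolve(self, top_site, hostname):
        """Returns True on a cache hit (a 'fast' lookup an attacker can time)."""
        key = (top_site, hostname) if self.partitioned else hostname
        hit = key in self.entries
        self.entries.add(key)
        return hit

BITS = 8

def write_id(cache, top_site, user_id):
    # On site A: resolve only the subdomains for the ID's 1-bits.
    for bit in range(BITS):
        if (user_id >> bit) & 1:
            cache.resolve(top_site, f"b{bit}.tracker.example")

def read_id(cache, top_site):
    # On site B: cached (fast) lookups reveal which bits were set.
    return sum(
        cache.resolve(top_site, f"b{bit}.tracker.example") << bit
        for bit in range(BITS)
    )

user_id = secrets.randbelow(2 ** BITS)
shared = DnsCache(partitioned=False)
write_id(shared, "site-a.example", user_id)
print(read_id(shared, "site-b.example") == user_id)   # True: the ID crosses sites

part = DnsCache(partitioned=True)
write_id(part, "site-a.example", user_id)
print(read_id(part, "site-b.example"))                # 0: nothing leaks
```

Real-world versions have to infer hits from noisy timing and work around TTLs, but the information channel is the same, which is presumably why the DNS cache made Firefox's partitioning list.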
Unfortunately, some trackers have found ways to abuse these shared resources to follow users around the web. In the case of Firefox’s image cache, a tracker can create a supercookie by “encoding” an identifier for the user in a cached image on one website, and then “retrieving” that identifier on a different website by embedding the same image. To prevent this possibility, Firefox 85 uses a different image cache for every website a user visits. That means we still load cached images when a user revisits the same site, but we don’t share those caches across sites.
Wait, so one form of "supercookie" is basically the same as the transparent gif in an email? https://help.campaignmonitor.com/email-open-rates#accuracy
We also need to acknowledge that recognising the user as they move across pages and domains is sometimes needed to provide valuable services to the user.
Therefore, I believe, browsers should provide a voluntary "tracking" feature: when a web page requests 3rd-party cookies, a popup is shown to the user with the cookie values, a description (as set by the owning domain), the list of domains already permitted to access the cookies along with their privacy-policy links, and the options Allow Once, Allow, Deny Once, Deny.
That way, instead of fighting each other, the service and the user would have a chance to cooperate. The service only needs to describe the need clearly enough.
The "problem" with that solution is that users are very willing to click any button necessary to achieve their goal, and in any dialog that prompts to allow tracking in order to achieve something else, most people will click allow.
Personally I don't think this is a problem, and people should be allowed to make that choice. But most of HN seems to disagree with me there, and feels that users need to be protected from making choices that could allow them to be tracked
https://developer.mozilla.org/en-US/docs/Web/API/Document/re...
https://developer.mozilla.org/en-US/docs/Web/API/Storage_Acc...

On Safari, it's basically the only way to get access to third-party cookies in an iframe since Safari 13. I wish other browsers (Chrome) would also enable this when third-party cookies are disabled. On FF, I think the rule is that you have to interact with the site beforehand and you get access automatically; failing that, you can use this API. No idea how it works in Edge.
Which valuable services? I’ve had 3rd party cookies entirely disabled for a while now, and I haven’t noticed any services break, not even cross domain logins.
https://developers.google.com/web/updates/2020/10/http-cache...
Does anyone know if these protections go further or differ significantly?
Crazy and sad to see where we've come :\
I'd like to get to a point where browsing on two different websites is treated as browsing by two completely different users. Embeds, cookies, cookies in embeds, etc.
I use the Temporary Containers add-on (https://addons.mozilla.org/en-US/firefox/addon/temporary-con...) to automatically open every new tab in its own temporary container.
Is there a way to disable it? Or should I instead look into installing a caching proxy to avoid the redundant traffic?
How do you know what browser the majority of HNers are using?