It seems both NoScript and AdBlock Plus have become really permissive as of late regarding their whitelists. While ABP is a bit shady with their 'acceptable ads' deals, I believe in NoScript's case it's probably due to not wanting to break things too badly for less technically minded users.
Regardless, I've replaced both extensions with uBlock Origin. While uBlock in default-deny mode is not as fine-grained as NoScript, it does the job, and it refuses to compromise with default whitelists even at the cost of a little breakage (gorhill is very adamant on this point).
I did the same, but with uBlock Origin in advanced mode plus uMatrix, which gives me even more control over which request goes where.
This is probably overkill for the average person, but after so many years of using NoScript + RequestPolicy I am used to sites being broken by default and having to fix them as needed. To me this is an acceptable tradeoff for the increased security.
The only exception to this rule for me is when I order something on a website. In that case I find it too risky to run with tight blocking (because of redirections to the payment site and so on), so I just run a completely default Firefox in its own VM, which I snapshot beforehand and revert afterwards.
Why was this downvoted? I'm always interested in current perspectives on best practices in browser security. I don't know whether onosendai's description https://news.ycombinator.com/item?id=9795193 is correct, but, even if it isn't, a rebuttal would be more helpful (to me and to onosendai both) than a silent downvote.
My approach was to start by combining SomeoneWhoCares' and MVPS' hosts files with `uniq`, which rendered such browser addons largely, though not entirely, unnecessary. The downside is that you can only do that on machines where you have root access.
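A minimal sketch of that merge (the file names are placeholders for the downloaded lists; note that plain `uniq` only collapses *adjacent* duplicate lines, so the combined list has to be sorted first, which `sort -u` does in one step):

```shell
# Combine two hosts-file blocklists into one deduplicated list.
# somewhocares.txt and mvps.txt stand in for the downloaded files.
# `sort -u` sorts and removes duplicates in one pass; plain `uniq`
# would only drop duplicates that happen to be adjacent.
sort -u somewhocares.txt mvps.txt > merged_hosts.txt
```

The merged output can then be appended to `/etc/hosts`, which is why root access is needed.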
Buying a stale entry on the NoScript whitelist for $10 is a cute trick, but the important point this post makes is that you basically can't trust NoScript to protect you from browser vulnerabilities. Many of the zillion scripts it effectively whitelists will themselves have DOM corruption flaws. Compared to the effort it takes to build a reliable drive-by browser exploit, evading NoScript is not a meaningful challenge.
> "you basically can't trust NoScript to protect you from browser vulnerabilities"
You make it sound like NoScript should not be trusted at all. In reality, nothing is fully secure, but NoScript is one of the better security options (perhaps the best?) for helping to prevent a specific set of attacks that rely on JavaScript.
A flaw was found and promptly fixed. You are (inadvertently, I believe) leading people to drop NoScript and possibly go with something else. A less mature security tool will have its own share of flaws, and it will likely take months or years for it to reach ground relatively similar to NoScript's.
Actually I just remove all of the default whitelist stuff when I first install it. Problem solved.
The whitelist did surprise me last time, though. I was baffled why gmail was working without me having to permanently allow it. Then I discovered the whitelist. Woah. What a dumb idea.
NoScript doesn't have a public source control repository, which makes it hard to follow what changed between releases. A while ago I made an automatically updated GitHub repository that contains all public releases.
I think NoScript works best as an additional defence line and not as a primary protection mechanism. As such, I would of course still install all regular security updates. It does reduce my attack surface, though, even if it won't stand against dedicated attacks. At the moment it benefits from its rare use (compared to browser users in general) and if it became more widespread and targeted by mainstream attacks, I'm optimistic that the whitelist issue would sort itself out over time (by iteratively removing vulnerable or untrustworthy sites).
Btw, as a few others have noted for their own installations, my whitelist does not contain the entries in question. It might be because my installation is relatively old and they weren't pushed retroactively.
If you are on Windows, an additional layer of security might be to install Sandboxie (http://sandboxie.com) and run Firefox from within a dedicated sandbox. Additional brownie points for editing the file/folder access. (making confidential folders inaccessible/write-only)
I'm not sure I agree that trusting subdomains "greatly expands the default trust surface". That's part of the premise of the domain hierarchy: the owner of a domain owns all the subdomains. If you don't trust the policy by which they grant subdomain control to others, then you don't trust the domain. This is the same policy that all browsers use; it might be surprising to folks unfamiliar with URLs, but the article's tone suggests that it's crazy and weird, which definitely isn't the case.
Wow, a domain for sale that's in the whitelist of an addon as popular as NoScript is pretty surprising to me. I immediately assumed it was a CDN by Zend, which seemed like a reasonably trustworthy domain.
Strange considering the general opinion about whitelisting in the original forum thread.
>Giorgio doesn't generally add CDNs to the default whitelist. I'm not sure why he added googleapis.com, except that google.com is already on the default whitelist (so people can use GMail to get support), and googleapis.com is controlled by Google anyway.
My whitelist did not contain this domain, possibly because updates don't change the whitelist retroactively. Which is a good thing if it is part of a policy to never update a user's whitelist without them knowing.
Until I see a real audit that looks at bypassing NoScript's code, I will continue using and promoting NoScript as a great tool for safe browsing. No one can deny that a large majority of web exploits use JavaScript to launch, even when the exploit is in another medium or protocol, like MS Office or Adobe Flash.
NoScript is powerful, but it was also never aimed at the general public. In my opinion the general public can benefit from it, but only as a shield against unwanted loading from unknown domains, because anyone who is not experienced enough with the web to tie domains to website features will simply use the "allow this page temporarily" feature.
Which in my opinion is fine; it's better protection than not having NoScript. But it's not the way NoScript was designed to be used.
Could you please elaborate on what is the preferable alternative to temporarily allowing a page? You often do not know which one of the blocked scripts provides the functionality you are looking for. So you unblock them one by one, I guess? But then, when you unblock the offending script the damage is done.
Sounds like a legitimate concern but my install of NoScript on Firefox (Ubuntu) which hasn't been customized in any way, shows only a bunch of local ('about:') pages in the "Whitelist" section and nothing else:
I actually like it better than uMatrix: Policeman shows you the full URL of the blocked resource, which you can inspect before allowing, and cross-domain requests are very easy to follow.
The only thing I wish for is per-domain control of browser features, like, for example: disable CSS animations everywhere except where I allow them. Likewise for <audio>, <video>, WebGL, and whatever useless feature I don't need 99.99% of the time.
uMatrix already has per-domain boolean control of "agent spoofing" and related settings; it would be awesome if the above were included there.
NoScript was a bit more forward-thinking in that regard: you can disable media/WebGL globally except for whitelisted sites, but then again, without cross-domain control you end up whitelisting everything.
To my understanding the whitelist had to be enabled, though? I haven't used NoScript in ages, but I thought that was what the whitelist was for?
As for the claim about any subdomain of any website: it depends on your settings. Again, I haven't used it in a while, but I do know that it was definitely highly configurable.
You're right, it's right under Settings -> General, the very first tab. If that's not obvious then I don't know what is; obviously a tool like NoScript has to be configured and this is the first place anyone would normally go after the FAQ.
Treating NoScript (or AdBlock, or uBlock) as security seems misguided. None of them is audited in any real way, or even designed to be a security tool. They're nuisance blockers. If they make your web browsing experience slightly less annoying, they're doing their job. Expecting security from them will only lower your guard against actual threats.
It seems that there should be some kind of automatic sanity check for whitelisted domains. That is, the domain should be registered and should resolve to something. It would also probably make sense to throw up a warning to the NoScript developers if the domain registration ever changes from what was previously approved.
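A rough sketch of the resolution half of that check, assuming nothing about NoScript's internals (the domain list is illustrative, not NoScript's actual whitelist): it simply flags whitelisted domains that no longer resolve.

```python
import socket

def find_dead_domains(whitelist):
    """Return whitelisted domains that fail DNS resolution entirely."""
    dead = []
    for domain in whitelist:
        try:
            # Any resolvable address (IPv4 or IPv6) counts as alive.
            socket.getaddrinfo(domain, None)
        except socket.gaierror:
            dead.append(domain)
    return dead

# Illustrative whitelist; "stale-cdn.invalid" can never resolve
# (the .invalid TLD is reserved), so it would be flagged.
print(find_dead_domains(["localhost", "stale-cdn.invalid"]))
```

Catching a change of registrant, as suggested above, would additionally require comparing whois records (registrar, creation date) against a stored snapshot; DNS alone cannot detect a quietly re-registered domain.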
This article was a bit startling to me - I never really thought of NoScript as a security measure for myself. I use it to make the web 'quieter': no more pop-ups, pop-acrosses, popovers, pop-from-wherevers, plus a few other obnoxious behaviours are beaten into submission as well.
Stopping javascript isn't going to stop tracking anyway - the big players still track who's loading their button images, for example.
No. NoScript makes browsing faster, disables lots of annoyances and (to some extent) increases security - not just by blocking JavaScript, but also by providing great tools against stuff like clickjacking.
Maybe it fights some tracking as a side effect, but that's not the reason you install it, because it's not made for that and therefore is not very effective.
I don't think NoScript should have any whitelist entries by default, since the whole idea behind it is to let the user determine what to allow.
Also, CDNs are quite problematic, since they often host many different scripts of which you only want to allow some. Finer-grained path/subdomain matching would be ideal here - you could allow *.example.com, example.com/*, or example.com/script.js.
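As a sketch of what that finer-grained matching could look like (this is not NoScript's actual rule syntax, just shell-style wildcards applied to URLs):

```python
from fnmatch import fnmatch
from urllib.parse import urlparse

def url_matches(url, pattern):
    """Match a URL against a whitelist pattern of the three forms above."""
    parsed = urlparse(url)
    host_path = parsed.netloc + parsed.path
    # "*.example.com": any subdomain's host, any path.
    if pattern.startswith("*."):
        return fnmatch(parsed.netloc, pattern)
    # Patterns containing "/" match host + path,
    # e.g. "example.com/*" or "example.com/script.js".
    if "/" in pattern:
        return fnmatch(host_path, pattern)
    # Bare domain: exact host match only.
    return parsed.netloc == pattern
```

For example, `url_matches("https://cdn.example.com/app.js", "*.example.com")` is true, while a lookalike host such as `example.com.evil.org` is rejected because the wildcard is anchored to the end of the hostname.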
It has to have a default whitelist to expand beyond the userbase of hardcore tech people, so that when Kim Komando endorses it, people don't find that their Facebook, Google, and Gmail are suddenly broken.
Yeah, and not only that: I allow scripts from various CDNs all the time. I call bullshit on anyone who claims they don't do the same. It's still worth running, if for no other reason than reducing the overall resources my browser uses and extending the time required between restarts.
onosendai | 10 years ago
tetraodonpuffer | 10 years ago
JadeNB | 10 years ago
avinassh | 10 years ago
What is the difference between uBlock and uBlock Origin?
EDIT: From Google searches I have learned that uBlock Origin started as a fork of uBlock, but is maintained by the developer who originally created uBlock.
dbbolton | 10 years ago
tptacek | 10 years ago
jb613 | 10 years ago
coldpie | 10 years ago
avian | 10 years ago
https://github.com/avian2/noscript
For example, this seems to be the change that was pushed as a response to this discovery:
https://github.com/avian2/noscript/commit/398ae6eadd2f40c8b7...
dreyfiz | 10 years ago
giancarlostoro | 10 years ago
https://forums.informaction.com/viewtopic.php?f=10&t=17066
Perseids | 10 years ago
giancarlostoro | 10 years ago
_nedR | 10 years ago
matchu | 10 years ago
vcarl | 10 years ago
INTPenis | 10 years ago
Schiphol | 10 years ago
nacs | 10 years ago
http://i.imgur.com/10bBvEq.png
tenfingers | 10 years ago
I personally switched to Policeman (https://addons.mozilla.org/en-US/firefox/addon/policeman/) a while ago, and there it's pretty clear that you can remove the built-in rule set.
giancarlostoro | 10 years ago
kissickas | 10 years ago
notatoad | 10 years ago
michaelmior | 10 years ago
nly | 10 years ago
jay_kyburz | 10 years ago
I thought that was the reason you installed NoScript?
vacri | 10 years ago
seba_dos1 | 10 years ago
userbinator | 10 years ago
ams6110 | 10 years ago
unknown | 10 years ago
[deleted]
Canada | 10 years ago
NamPNQ | 10 years ago
http://who.is/whois/zendcdn.net
The domain was registered on June 12, 2015.
hackuser | 10 years ago
wtallis | 10 years ago
chappi42 | 10 years ago