This piece is charged with the personal bias of the author (https://twitter.com/movrcx), who launched a hostile fork of the Tor Browser Bundle because of "untrustworthyness".
I recommend you read this instead, which provides a more level-headed and technically correct analysis of the vulnerability (which was real, even if not exactly in the terms described by the OP): https://hackernoon.com/postmortem-of-the-firefox-and-tor-cer...
Oh, but not just any kind of "untrustworthyness". From what I gathered from Twitter at the time, he did it after the Appelbaum affair broke out, to provide a "non-SJW" fork of Tor Browser. And if I'm not mistaken, he initially forked the wrong repo; it was a running joke for a couple of days... Nice to see the farce still going strong.
Thanks for the background, I got the sense that the author had an axe to grind.
Still, I seem to recall that when the Tor Browser auto-update mechanism was deployed, the idea was that HTTPS with certificate pinning was only the first step, and that going forward the updater would also check PGP signatures. It's a bit disappointing to see that this hasn't happened yet.
Especially with reproducible builds and several trusted signers independently verifying the built binaries and signing the resulting package, this would add considerable security to the update process.
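The multi-signer scheme described above boils down to a threshold check. Here is a minimal sketch, assuming the per-signer PGP signatures have already been verified and reduced to the digests they attest to (the names and structure here are hypothetical, not Tor's actual updater):

```python
import hashlib

def verify_update(package: bytes, signer_digests: dict, threshold: int) -> bool:
    """Accept an update only if at least `threshold` independent signers
    published a digest matching the package we downloaded.

    Hypothetical helper: in a real deployment the signers would publish
    PGP-signed digests, and those signatures would be checked before
    this step ever runs."""
    local = hashlib.sha256(package).hexdigest()
    matching = sum(1 for d in signer_digests.values() if d == local)
    return matching >= threshold

# Example: three builders reproduce the same binary; one disagrees.
pkg = b"tor-browser-6.0.5.tar.xz contents"
digest = hashlib.sha256(pkg).hexdigest()
signers = {"builder-a": digest, "builder-b": digest,
           "builder-c": digest, "builder-d": "deadbeef"}
print(verify_update(pkg, signers, threshold=3))  # True: 3 of 4 agree
```

The point of combining this with reproducible builds is that a single compromised build machine (or a single stolen signing key) can no longer push a malicious update on its own.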
Tor is not, nor has it ever been, trustworthy. Hell, you can still try active deanonymization for yourself: https://github.com/Miserlou/Detour
This didn't use to be a problem, as it was essentially run as a sandbox project for the academic anonymity community. It was very up front about its capabilities and limitations.
Unfortunately, in recent years, the US government has been bankrolling more "privacy" software development through its propaganda arms (OTF, RFA, etc.), and the Snowden revelations have led private foundations to follow suit.
As such, the organization doubled down on rebranding to be a "human rights" _tool_, as this is what grant-giving organizations love to promote (free speech in Iran, activist publishing, etc.). This, combined with overly enthusiastic do-gooders gaining more and more prominence in the Tor organization, has led to the dangerous situation of promoting inherently insecure software as a security solution to vulnerable people. This is a general problem in the scene (remember when those activists in South America got vanned for using CryptoCat?) - and one that I've been guilty of myself in the past.
I really hope the new board steers them back to the academic realm and slaps a big red USE AT YOUR OWN RISK warning on the tin. Unfortunately, I think the opposite will happen.
I'm not sure where this narrative is coming from. Tor was developed as an anonymity network by the US Naval Research Lab. It was designed for use by military and intelligence. Tor was never just some academic experiment.
Tor is still a valid tool. No, it wasn't designed to foil NSA-level surveillance, because it was built by the US. But this vulnerability isn't even related to Tor itself; it has to do with Tor Browser.
The Snowden leaks contain slides where the NSA clearly laments the use of Tor, so saying that it has never been trustworthy is simply not true.
A more dangerous side effect of branding Tor as a "human rights" tool: if it's a "privacy" tool, people in oppressive regimes can legitimately use it (e.g. people working for such regimes), whereas if it's a "human rights" tool, its mere presence on a computer is evidence of guilt.
> This, combined with overly enthusiastic do-gooders gaining more and more prominence in the Tor organization, has led to the dangerous situation of promoting inherently insecure software as a security solution to vulnerable people.
What is the inherently secure alternative available to these vulnerable people?
> I really hope the new board steers them back to the academic realm and slaps a big red USE AT YOUR OWN RISK warning on the tin.
What causes you to believe that activists and vulnerable people would stop using Tor if this warning were in place?
Most of the issues seem to revolve around accessing clearnet resources through Tor. Hidden services within Tor still seem secure (whatever that word means these days).
The fact that Mozilla is so slow in implementing real per-process sandboxing in Firefox and that it doesn't even plan to rewrite most of the browser in Rust over the next few years, makes me think that maybe Tor should just bite the bullet and rebuild on top of Chromium, while vigilantly watching out for anti-privacy features in it that they can remove.
Yes, but being the man in the middle is much easier when the target willingly places you in the middle of his connection...
The whole situation can be worked around by using a custom prefs.js that disables auto-updating add-ons (various other attacks can be prevented by tweaking settings in about:config, such as the WebRTC-related ones), and there are various websites providing privacy-oriented prefs.js files. A better workaround would be for the Tor Browser maintainers to ship such a file with it, and a real solution would of course be for Mozilla to fix things on their side.
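As a sketch of what such a file might contain (these are standard Firefox preference names of that era; treat the exact set as illustrative, not a vetted hardening list):

```javascript
// user.js - dropped into the profile directory; Firefox reads it at startup
// and copies the values into prefs.js. Verify pref names against your version.

// Disable automatic add-on update checks (the attack vector discussed here).
user_pref("extensions.update.enabled", false);
user_pref("extensions.update.autoUpdateDefault", false);

// Disable WebRTC, which can leak the real IP address past the proxy.
user_pref("media.peerconnection.enabled", false);
```

Shipping it as user.js rather than editing prefs.js directly has the advantage that Firefox won't overwrite it on exit.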
Using Tor may actually be less secure than using a normal browser.
At least when I connect to Microsoft, Google, Facebook, etc. I don't expect to get hit by a drive-by JS exploit, and Google does help with "safe browsing".
With Tor, you're one HTTP (or non-HSTS) website away from a drive-by virus, with no way to tell that you're connecting through a dangerous exit node.
Tor Project runs several scanners for this behavior. Arguably, unless your ISP, ISP's ISP, coffee shop, etc., are all 100% on top of their game, this could happen in any one of those environments too.
When you connect to Microsoft, Google, Facebook, etc., you're connecting via HTTPS, so there's nothing malicious a Tor exit node could do in these cases.
It's the certificate used for TLS on addons.mozilla.org. Since "Tor Browser" is a lightly modified Firefox that hasn't had its automatic add-on update checking disabled, and Mozilla's add-on signing process is an automated rubber stamp, that's a problem.
To be clear, I don't think it's so much a problem on Mozilla's part; perhaps manual review would be a good idea, but I doubt they have the resources. The problem here is that Tor Browser has claims made for it that aren't supported by the amount of work that's actually gone into making it secure. That would appear to be entirely on the people who run the Tor foundation, or whatever nonprofit structure it is that they use.
Is it really so easy to control a significant portion of Tor exit nodes? I seem to remember there are automatic systems and members of the project checking for suspicious nodes.
jwcrux|9 years ago|reply
[0] https://github.com/IndependentOnion/rotor-browser/commit/916...
[1] https://github.com/IndependentOnion/rotor-browser/issues/7
nzp|9 years ago|reply
EDIT: s/Tor/Tor Browser/
lucastx|9 years ago|reply
"Old news. This was fixed in 6.0.5.
https://blog.torproject.org/blog/tor-browser-605-released
Interesting note: The author is part of the rotor browser fork that is going no where so far. Doesn't look like the reported issue has been fixed there. In fact, no commits since before this blog post."
https://www.reddit.com/r/TOR/comments/53u1cd/tor_browser_exp...
4ad|9 years ago|reply
6.0.5 was released five days ago. There is no universe where this qualifies as old news.
throwanem|9 years ago|reply
> I'd honestly suspect the author was COINTELPRO were it not for the fact that so many of his statements and code are literally laughable.
So there's that!
openasocket|9 years ago|reply
Is it even possible to protect against end-to-end time correlation attacks without massively increasing latency?
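To make the question concrete, here is a toy sketch (not any real attack tooling) of why low latency makes end-to-end correlation cheap: an observer comparing inter-packet gaps at entry and exit can match a flow even through added constant latency.

```python
def inter_arrival(times):
    """Gaps between consecutive packet timestamps."""
    return [b - a for a, b in zip(times, times[1:])]

def timing_distance(entry_times, exit_times):
    """Mean absolute difference between inter-arrival gaps; lower = likelier match."""
    a, b = inter_arrival(entry_times), inter_arrival(exit_times)
    n = min(len(a), len(b))
    return sum(abs(x - y) for x, y in zip(a[:n], b[:n])) / n

# A target flow observed entering the network, the same flow leaving it
# 80 ms later, and an unrelated flow with a different timing pattern.
target    = [0.0, 0.4, 1.1, 1.2, 2.0, 2.3, 3.1]
exit_copy = [t + 0.08 for t in target]   # constant latency preserves the gaps
unrelated = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]

print(timing_distance(target, exit_copy) < timing_distance(target, unrelated))  # True
```

High-latency mix networks defeat this kind of matching by batching and delaying messages, which is exactly the latency cost the question asks about.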
MBCook|9 years ago|reply
1. You'd need to be able to MitM all of them (where controlling a ton of Tor exit nodes comes in handy)
2. You'd need to know an extension lots of people have (I'm guessing NoScript is default in Tor Browser)
mdadm|9 years ago|reply
Seriously? That seems like a really weird - to say the least - decision to make about something this important...
unknown|9 years ago|reply
[deleted]
nijiko|9 years ago|reply
This is the joke right?