
This is why sites choose to stay vulnerable to Firesheep

354 points | ericflo | 15 years ago | google.com

139 comments

[+] kneath|15 years ago|reply
This is a problem we (GitHub) are facing in a big way right now. Google Charts doesn't offer https alternatives, so almost all our users get a big "this site is going to steal all your private information" (mixed content warning). We chose to roll out SSL first, then deal with the hard problem of mixed content warnings (building ridiculous image proxies) later.

I think a lot of developers underestimate how big an impact this warning has on users, especially on browsers like IE that throw up a dialog on every page with the warning. Developers understand that it's not that big of a deal, but to a user it looks like the site is full of viruses and malware and is going to steal all your bank account information.

[+] patio11|15 years ago|reply
> This is a problem we (GitHub) are facing in a big way right now

This also broke Bingo Card Creator something fierce when I rolled out SSL support. It was the reason I hadn't had it previously, and I knew it was going to be a problem going in, and I tested for it, and I still managed to hose two pages which were critical to my business for most of a week.

Figure on a 40~50% drop in conversion from a non-technical audience on IE if they get one of those popups, by the way. It is the worst possible place to be: not enough to trigger an automated "Oh cripes!" from the website, but big enough to murder business results.

[+] thwarted|15 years ago|reply
During the last Velocity conference, one of the last sessions on the last day was a talk from Google guys about how to make SSL faster, because they had recently turned SSL on for all gmail accounts.

I asked how they deal with the unlocked icon and warning dialogs for mixed protocol content on the page and the response was that people are so used to the popups and the lock being unlocked, that they (Google) don't consider it to be a problem. The response was really short and curt and I felt it was kind of a cop-out.

[+] swolchok|15 years ago|reply
The warning isn't spurious, by the way. A man in the middle could inject evil JS into urchin.js (or whatever the equivalent is now) just as easily as he could inject it into your site's JS; the page is not secure.
[+] NiekvdMaas|15 years ago|reply
For Google Charts, there is a workaround. Simply change the hostname to www.google.com, example:

  http://chart.apis.google.com/chart?chs=200x200&cht=qr&chl=http://www.adperium.com/ (normal)
  https://www.google.com/chart?chs=200x200&cht=qr&chl=http://www.adperium.com/ (HTTPS)
It's probably not what Google prefers, but this works for us.
[+] Udo|15 years ago|reply
Exactly. Browser makers (including Mozilla/Firefox to a large degree) are responsible for the fact that HTTPS hasn't become the standard protocol, as it should have years ago. It's not only the unproductive mixed content warning but also the insistence of all browsers on accepting only expensively bought certificates, throwing up a very scary and hard-to-overcome error dialog if a site uses any other kind of cert. While that isn't a problem for big(gish) commercial sites like GitHub, it presents an insurmountable hurdle for private sites and small-time projects, for no good reason.

For most sites I don't need "secure" origin verification as badly as encryption. The lack of a verifiable server address shouldn't mean I get bullied out of using an encrypted connection. But even if the verdict is that you absolutely can't have one without the other, browser makers should AT LEAST include trusted root certs of authorities who offer free SSL certificates, too.
[+] ryanto|15 years ago|reply
Yup, I agree with you. This is a pretty big problem with a lot of other google services as well.

The Google Maps API, for example, will not work behind HTTPS. Google has publicly said this is because they want their maps free and open, not behind some page where the user needs to be logged in. This creates a huge problem for any site that uses Google Maps. They do offer a solution, though: for $10,000 a year they will let you use the Maps API behind HTTPS.

[+] ericflo|15 years ago|reply
Not only that, but in IE it's a modal dialog. You can't do anything (even switch to another tab) until you've acknowledged the scary warning.
[+] tomjen3|15 years ago|reply
Couldn't you just cache the charts locally, and then serve them directly to the user?
[+] ansonparker|15 years ago|reply
Absolutely. I have run an SSL-only site (https://domize.com) for a few years now.

Other than Google Analytics, I can't think of a single widget/embed/analytics app that has supported SSL out of the box. It's a real shame, but on the other hand I'd bet good money that the web will be 99% SSL within the next 24 months.

[+] aschobel|15 years ago|reply
We had to disable support for Google Maps on Catch.com because the cost of their "enterprise" SSL-enabled version was prohibitive.

It's a damn shame because it was a really cool integration.

[+] notphilatall|15 years ago|reply
How about: Give every user a monotonically incrementing value that's initialized at the start of the session using HTTPS. For every request, the client will provide the next value in the expected sequence. Listeners won't have the secret key that was exchanged during the HTTPS authentication, and can't issue requests on the legitimate client's behalf.

Forcing the requests to be serial sucks, but if you only do it for privileged actions (as opposed to public page GETs) it should be manageable.
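(A minimal sketch of the scheme above, assuming the secret key was exchanged over HTTPS at login and each privileged request is signed with it; all names are illustrative, not a real implementation.)

```python
import hashlib
import hmac


class SessionState:
    """Per-session state established over HTTPS at login (hypothetical)."""

    def __init__(self, secret: bytes):
        self.secret = secret  # never travels over plain HTTP
        self.expected = 0     # next counter value the server will accept


def sign(secret: bytes, counter: int) -> str:
    """Client-side: sign the next counter value with the session secret."""
    return hmac.new(secret, str(counter).encode(), hashlib.sha256).hexdigest()


def verify_request(session: SessionState, counter: int, signature: str) -> bool:
    """Server-side: accept a privileged request only if it carries the next
    counter value, signed with the secret a passive sniffer never saw."""
    if counter != session.expected:
        return False  # out of order, or a replayed request
    if not hmac.compare_digest(signature, sign(session.secret, counter)):
        return False  # wrong key: forged by an eavesdropper
    session.expected += 1  # each value is single-use
    return True
```

Replaying a sniffed request fails because the counter has already been consumed, and forging the next one fails without the secret.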

[+] mickeyben|15 years ago|reply
We discovered the exact same issue and rolled back a few weeks ago.

The worst is that the default selected choice in the modal box is to not load anything.

[+] ntoshev|15 years ago|reply
The image proxy won't work: Google's js APIs are throttled per IP to prevent abuse.
[+] tommorris|15 years ago|reply
Is there a lot of GitHub users using IE? ;-)
[+] drivebyacct2|15 years ago|reply
Could you proxy through your own servers?
[+] jasonkester|15 years ago|reply
To be accurate, this is not the reason many sites choose not to go with SSL for everything. The real reason is that most sites don't need to be SSL for everything.

I run a travel blogging site, where 99% of all pageviews are from random people off the internet reading people's trip reports and looking at photos. Encrypting all that traffic would do nothing except bog down the site for everybody.

Every once in a great while (in terms of total traffic), somebody will log in and post something. That tiny moment could benefit from SSL, since chances are it's happening from a public internet cafe or wifi hotspot. That's the only time a user is actually vulnerable to this sort of attack, so that's when they need to be protected.

But when you look at the internet as a whole, the fraction of traffic that needs protecting looks pretty much the same. When you're showing me pictures of cats with funny captions, please don't encrypt them before sending them to me just because you read something about security on Hacker News.

[+] tghw|15 years ago|reply
The thing that Firesheep brought to people's attention is that the login is not the only thing that needs to be SSL protected. The cookies you get after signing in are often sent in the clear, and that cookie is just as good as your login for gaining access.
[+] drivingmenuts|15 years ago|reply
Honestly, how many sites are aware that they are vulnerable?

It seems like you assume that because the security-oriented 0.5% of the web knows about it, the rest of the web should, too.

For most people, just making sure that their site runs at all is quite enough for them to handle, and keeping current on the latest vulnerabilities is way down on the list.

Additionally, fixing a site takes time. How long has Firesheep been out? A week? Two? You should realize that for many sites, even those staffed by very competent tech people, a month is the minimum amount of time for immediate action.

[+] harshpotatoes|15 years ago|reply
I agree that most of the web is ignorant at best of most security vulnerabilities. But keep in mind that Firesheep is not exploiting a new vulnerability, but an old one that has been known since probably the early 2000s. Firesheep is new in that it automates the work of other programs (which were admittedly a little less user-friendly).
[+] drivebyacct2|15 years ago|reply
How many sites (that any of us are legitimately worried about) employ webmasters, developers, sysadmins or others who DON'T know why SSL/HTTPS is important? You can't honestly be giving Facebook, Twitter, etc. a pass on understanding very basic concepts (sniffing, HTTP cookies)?

Firesheep has been around for 2+ weeks now, but come on, we've all known this has been possible for forever. I'm 20, and I knew how to do this (and did) /years/ ago. I think Firesheep is just what everyone needed.

There are really good reasons why this is taking a long time and it is NOT lack of knowing that this problem exists.

That having been said, my laptop is now running a LiveCD of x2go's LTSP client and my desktop computer is running the x2go server. Very near-native performance and total security. (I trust my desktop as an endpoint.)

[+] CaptainMcCrank|15 years ago|reply
I read the overclocking SSL post (http://www.imperialviolet.org/2010/06/25/overclocking-ssl.ht...) and I've been seeing plenty of follow-up about how SSL is cheap and easy to scale, but I have yet to see one tutorial that describes actually implementing it cheaply.

So, to the HN community: is this whole "SSL is cheap" thing a false meme, or does someone have actual instructions on how to deploy scalable SSL?

[+] al_james|15 years ago|reply
There has got to be a sensible way around this. It seems overkill to require every pageview to be over HTTPS, even for otherwise public sites. For example, should these public discussion pages be over HTTPS on hacker news?

On my site I am planning the following: operate the login page over HTTPS and issue two cookies. One is HTTPS-only and the other is for all pages. The public (non-HTTPS) cookie is only used for identification (e.g. welcome messages and personalisation). However, all requests that change the database in any way are handled over HTTPS, and we check to make sure the user has the secret HTTPS cookie as well. Often forms submit to an HTTPS backend and then redirect back to the public page over HTTP. Also, all account information pages (sensitive pages) will be over HTTPS.

This way, the worst that can happen via cookie sniffing is that someone can see pages as though they were someone else. In your case, this is not much of a risk.
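(A minimal sketch of the dual-cookie scheme described above; the cookie names, values, and helper functions are illustrative, not any site's real implementation.)

```python
# Issued from the HTTPS login response: one sniffable cookie for
# personalisation, one Secure cookie that authorises writes.
def login_cookies(ident: str, session_id: str) -> list:
    return [
        # identification only: visible on plain HTTP, but can't authorise writes
        f"ident={ident}; Path=/",
        # authorisation: Secure means the browser never sends it over HTTP
        f"auth={session_id}; Path=/; Secure; HttpOnly",
    ]


def allow_db_write(is_https: bool, cookies: dict, valid_session: str) -> bool:
    """Privileged (database-changing) requests must arrive over HTTPS and
    carry the secret HTTPS-only cookie; sniffing the ident cookie alone
    only lets an attacker view personalised pages."""
    return is_https and cookies.get("auth") == valid_session
```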

[+] mike-cardwell|15 years ago|reply
This is just dangerous. Example: if news.ycombinator.com implemented this dual-cookie method, a man in the middle could intercept the page I'm looking at now, where I'm entering this comment in a textarea. They could modify the underlying form to post to the same page as the update form on the profile page, and set a hidden email field. Then when I hit the "reply" button, even though I'm posting to an HTTPS page, I'm not posting to the one I think I am, because the page containing the form itself wasn't served over HTTPS.

I hope I explained that well enough. Mixed content is hard to do right. Forcing every page over SSL prevents anyone making any modifications to any page, and is just inherently safer.

[+] brown9-2|15 years ago|reply
Is all this extra effort really worth it compared to the alternative of just using HTTPS for everything?

Have you really examined the extra cost of 100% HTTPS versus the scheme you've outlined? It sounds like this idea would require a decent amount of effort to identify where to use HTTPS, to ensure that each privileged request uses HTTPS, etc.

I can see that for some cases it is advantageous to stick to regular http for unimportant requests and use https for the important stuff, but I have a strong feeling that this is only applicable for the minority of use cases and websites.

[+] piotrSikora|15 years ago|reply
You're thinking about the right solution, but you're overdoing it and, in the end, unnecessarily complicating a very simple thing.

The simplest (and IMHO the best) solution is to serve everything over HTTP for unauthenticated users and everything over HTTPS for authenticated users (this requires the authentication cookie to be marked "secure", plus a regular cookie "authenticated=true" that redirects an authenticated user to the HTTPS version of the site in case he/she goes to the HTTP site).
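(A minimal sketch of that split, assuming the real session cookie is marked "secure" and therefore invisible on HTTP; cookie names and return values are illustrative.)

```python
def route(scheme: str, cookies: dict) -> str:
    """Decide what the server should do for an incoming request.

    The real session cookie is Secure-only, so it never appears on plain
    HTTP. A non-secure "authenticated=true" flag cookie tells the HTTP
    side that this browser has a session and should be bounced to HTTPS.
    """
    if scheme == "http" and cookies.get("authenticated") == "true":
        return "redirect-to-https"
    return "serve"
```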

[+] iwr|15 years ago|reply
Browsers should use two kinds of notifications: "encryption on" (green or red) and "certificate is present" (green or red). Websites that do banking or handle sensitive information should be green/green (SSL-on with verified cert), while ordinary websites could be green/red (SSL-on with self-signed cert).
[+] jules|15 years ago|reply
Is the solution to Firesheep to have every logged in page in https? Or is this not necessary?
[+] DrStalker|15 years ago|reply
Any HTTP request that includes the session cookie needs to be secured, otherwise the firesheep user will be able to grab the session cookie and use it in their own requests.
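(A sketch of issuing the session cookie so the browser refuses to attach it to plain-HTTP requests; the cookie name is illustrative.)

```python
from http.cookies import SimpleCookie


def session_cookie_header(session_id: str) -> str:
    """Build a Set-Cookie value whose Secure flag keeps the session
    cookie off plain-HTTP requests, out of Firesheep's reach."""
    cookie = SimpleCookie()
    cookie["session"] = session_id
    cookie["session"]["secure"] = True    # HTTPS-only transmission
    cookie["session"]["httponly"] = True  # also hide it from page JS
    return cookie["session"].OutputString()
```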
[+] boyter|15 years ago|reply
You can also SSH tunnel out to a secure server, which is what you should do on any public network that isn't under your control.

It won't protect you from man in the middle attacks on the general internet, or fix the underlying issue with most websites, but it will stop firesheep.

[+] EricButler|15 years ago|reply
While sites wait for services such as AdSense to support SSL, adding a second Secure cookie and requiring it on sensitive pages and for destructive actions can help reduce risk to users. Depending on the site, it may be OK to skip showing ads on a few authenticated pages. Wordpress implemented this in 2008: http://ryan.boren.me/2008/07/14/ssl-and-cookies-in-wordpress...

This won't protect against active attackers, but is definitely a step forward and will make a full transition easier in the future, when possible.

[+] technoweenie|15 years ago|reply
We spent about a week trying this on GitHub. It works pretty well as long as you have no ajax requests. We were basically left with these options:

1) Lose the ajax (and spend significant time redoing bits of the site)

2) Scary iframe hacks.

3) SSL everywhere.

I feel like we made the best choice (I certainly don't mind removing any chance we'll have adsense any time soon :). It cleaned up a lot of logic based around determining which pages were important enough to require SSL (admin functions, private repos, etc).

It's brought on some other issues though. Safari and Chrome don't seem to properly cache HTTPS assets to disk, for one. This is an old problem: http://37signals.com/svn/posts/1431-mixed-content-warning-ho... . I'm not too worried about increased bandwidth bills on our end, I'm worried about having a slower site experience. We're also seeing users complain about having to log in every day. Are browsers not keeping secure cookies around either?

[+] Groxx|15 years ago|reply
An alternative? : http://www.tcpcrypt.org/
[+] erikano|15 years ago|reply
> Tcpcrypt is opportunistic encryption. If the other end speaks Tcpcrypt, then your traffic will be encrypted; otherwise it will be in clear text.

I like that, as opposed to requiring users to have to install some plugin before they can even talk to the server.

[+] tapz|15 years ago|reply
Is there an alternative to HTTPS?
[+] alexyoung|15 years ago|reply
It's kind of obvious that a major reason is IPv4. Getting another IP so you can run both SSL and plain HTTP is only going to get harder.
[+] hapless|15 years ago|reply
Is there any reason the web server can't set up an SSL reverse proxy to fetch the AdSense ads? (Obviously cookies would have to be passed through...)

It would cost you more bandwidth, but then there's no annoying warning message about mixed content.
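(A sketch of that idea as an nginx config fragment; the location path is illustrative and the upstream hostname is an assumption, and real ad serving would also need cookie and URL rewriting.)

```nginx
# Hypothetical: the HTTPS site fetches third-party ad markup through its
# own origin, so the browser only ever sees HTTPS URLs and no mixed
# content warning fires.
location /ad-proxy/ {
    proxy_pass        http://pagead2.googlesyndication.com/;
    proxy_set_header  Host pagead2.googlesyndication.com;
    # cookie pass-through/rewriting would have to happen here
}
```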

[+] trumbo|15 years ago|reply
What's the deal with that page not having any stylesheets? It looks completely broken. Am I the only one seeing this?
[+] ohwaitnvm|15 years ago|reply
Turn off adblock and reload the page.
[+] joshfraser|15 years ago|reply
it's one of the reasons. probably not the only one.
[+] yanw|15 years ago|reply
Facebook and Twitter don't run AdSense; it's mostly run on content sites that don't require you to log into them.
[+] ericflo|15 years ago|reply
I don't understand your argument here--are you saying we shouldn't mind if the vulnerable sites aren't Facebook or Twitter?
[+] benblack|15 years ago|reply
Irrelevant contrarian opinion that adds nothing to the debate, but indicates with certainty I am more interested in being pedantic and scoring points than having a useful discussion.