top | item 4619541

Never Give Your Information To 10 Minute Old Startups

501 points | RKearney | 13 years ago | blog.ryankearney.com | reply

169 comments

[+] JoeCortopassi|13 years ago|reply
Everyone here who is thinking of giving this company the benefit of the doubt needs to go read their (smeagle) responses to RKearney in the original thread. Here are some samples of the careless attitude on display:

----

"if anyone's concerned about your AWS key, just destroy your IAM user and create a new one. that's what it was designed for."

----

In response to advice saying they should notify users by email:

"good idea. actually, we'll just wipe them and force new ones."

----

In response to RKearney warning people about just what exactly is exposed:

"in case you have issues with your AWS keys. RKearny's email: [email protected] https://secure.gravatar.com/avatar/f7d7b021fb488fe6a67ddb286....

[+] enoch_r|13 years ago|reply
I was slightly sympathetic to them right up until smeagle posted RKearney's email. While not noticing an incredibly obvious security hole is serious, it's somewhat understandable in the context of a site that unintentionally goes public before it's ready. What's far, far worse is the mindset in which someone who points out a security hole is the problem, and should be personally attacked.

They should have thanked him, notified their users, done a thorough review of their own security, and warned new signups to only use IAM keys. Instead they got defensive, made excuses, and attacked the messenger.

[+] danielweber|13 years ago|reply
I'm a big fan of responsible disclosure, and I think that the HN crowd has far too much of a one-track mind about "always disclose security problems everywhere!", especially with something that may not have been ready for release.

But that "contact Ryan if anything goes wrong" is a grade-A asshole move.

[+] ainsleyb|13 years ago|reply
This is a severe lack of customer service. The least that can be done is a quick shutdown of the site until there's a good fix, an email to all customers (since legally they have to disclose the breach: http://en.wikipedia.org/wiki/Security_breach_notification_la...), and thanks to whoever reported the issue.

If you make a mistake, own up to it. Honesty is key to building a business, and I'm sure they've at least lost HN's trust for any future product.

[+] patio11|13 years ago|reply
Many folks in the security community might suggest a) An oblique warning publicly like "There exists a security problem with this; I have mailed the devs" b) actually mailing the devs c) waiting for confirmation of fix or a reasonable time and only then d) tar-and-feather. The term-of-art for this is "responsible disclosure."

This incentivizes people to fix things quickly and preserves the reputational value of breaking into things without researcher-vendor relations getting adversarial when you announce something like "I harvested a couple dozen of your customers' API keys" or "Here's an exploitation roadmap you can follow in your browser" in a public forum.

[+] Silhouette|13 years ago|reply
10 minutes? Never give your information to a business that made a mistake like this, ever.

That wasn't merely a "security vulnerability". It was also a demonstration that the people running the business have absolutely no idea what they are doing when it comes to security, privacy, or testing and release processes. (Actually, there is an alternative explanation, which is even worse: they knew and didn't care. I prefer to assume naivety rather than malice.)

Unfortunately, the only sensible action when faced with a business like this is to run away and not look back for a very long time, except perhaps to check who the people responsible were so you can avoid anything else they work on in the near future as well.

[+] larrys|13 years ago|reply
"Never give your information to a business that made a mistake like this, ever."

Fwiw, back in 1996 or '97 the UPS website did the same thing. By altering the tracking number you could see fairly complete information on someone else's shipment. Since tracking numbers ran in sequence from the shipper's log books, one tracking number from a competitor let you see all of their customers. (To get that, all you had to do was place a single order so they shipped to you. Although I guess it would have been even easier to social-engineer someone into simply giving you any tracking number and save that step.)

[+] rlpb|13 years ago|reply
> a business that made a mistake like this

I'd go one step further: I refuse to give a business information that provides more access than it needs, or that can affect any other service I receive from anywhere else.

Here, I'd like to give them a key that works only with Glacier vaults that they have created, and nothing else. If that isn't possible, then I'll go without.
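For what it's worth, AWS's IAM policy language can express exactly this kind of scoping. A hedged sketch as a Ruby hash (the action names follow AWS's documented glacier:* scheme; the region, account id, and vault name are placeholders invented here):

```ruby
require "json"

# Least-privilege policy sketch: a key that can touch only one named
# Glacier vault. Region, account id, and vault name are placeholders.
policy = {
  "Version" => "2012-10-17",
  "Statement" => [{
    "Effect" => "Allow",
    "Action" => [
      "glacier:UploadArchive",
      "glacier:InitiateJob",
      "glacier:GetJobOutput",
      "glacier:DescribeVault"
    ],
    "Resource" => "arn:aws:glacier:us-east-1:123456789012:vaults/backup-vault"
  }]
}

puts JSON.pretty_generate(policy)
```

Anything outside that vault (other vaults, other AWS services, IAM itself) is denied by default, which is exactly the "closed by default" property people are asking for in this thread.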

[+] olalonde|13 years ago|reply
> It was also a demonstration that the people running the business have absolutely no idea what they are doing when it comes to security, privacy, or testing and release processes.

I'm pretty sure a lot of successful startups were started by people who had "absolutely no idea what they were doing". Give them a break...

[+] seldo|13 years ago|reply
Those who know me will laugh to see me continuing to beat this dead horse, but this is a really great example of why ORM+scaffolding is an anti-pattern, by which I mean it seems like a good idea at first, but the costs outweigh the benefits.

It's absolutely true that you can use ORM and scaffolding patterns in a totally secure way. But the problem is that the defaults are insecure -- every table can be accessed, every record is available, every field can be edited, and the URLs for doing so are (deliberately) easily guessable.

One of the simplest and most fundamental rules of effective security is to close everything down by default and only open things up as required, after careful consideration. Scaffolding breaks that rule.

[+] grey-area|13 years ago|reply
One of the simplest and most fundamental rules of effective security is to close everything down by default and only open things up as required, after careful consideration. Scaffolding breaks that rule.

This is really an argument for building authentication and authorization into every app, rather than against scaffolding/ORMs.

As Rails doesn't have auth (of both kinds) built in, it doesn't really matter whether it offers scaffolding or not: any editing URL you create is going to be completely unprotected unless you add protection yourself. The only thing you'd gain by making URLs unguessable without authentication/authorization is security through obscurity.

So IMHO the lack of auth is really the issue here (and the thing that breaks the rule in your final sentence), rather than the guessable urls.

[+] angryasian|13 years ago|reply
I think it has less to do with the ORM and more to do with laziness. It's really one line in the controller: if the logged-in user is not the user being edited, redirect.
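That one-line guard can be sketched in plain Ruby (User, USERS, and authorized_to_edit? are stand-ins invented for this sketch; in Rails the check would live in a before_action on the controller):

```ruby
# Plain-Ruby stand-in for the one-line controller guard described above.
User = Struct.new(:id, :email)

# Simulated user table
USERS = {
  1 => User.new(1, "alice@example.com"),
  2 => User.new(2, "bob@example.com")
}

# Only allow a user to edit their own record; on failure the controller
# would redirect (or render a 404).
def authorized_to_edit?(current_user, requested_id)
  current_user.id == requested_id.to_i
end

alice = USERS[1]
authorized_to_edit?(alice, "1")  # => true  (editing her own record)
authorized_to_edit?(alice, "2")  # => false (someone else's record)
```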
[+] nathan_long|13 years ago|reply
>> One of the simplest and most fundamental rules of effective security is to close everything down by default and only open things up as required, after careful consideration.

Which is why my Rails authorization library takes a whitelisting approach.

https://github.com/nathanl/authority#default_methods

[+] oelmekki|13 years ago|reply
disclaimer : I don't use scaffolding either

Well, scaffolds are not supposed to eliminate coding entirely. They try to provide what you would otherwise write again and again, but you're supposed to take that as a basis, not as a final product.

[+] Andrex|13 years ago|reply
At first, I thought I was supposed to notice the complete lack of "HTTPS" in the URL bar.

The data leakage obviously overshadows it, but aside from banking/government sites, I can't think of a better fit for SSL encryption than an app like this.

SSL might have been a "nice-to-have" back in the day when there were real arguments to be made against it (mostly performance-related), but even those don't really apply to a "pet project" made by "two nerds" (smeagol's words, not mine.) And for an app like this, I think it's critical.

Just my two cents.

[+] zacharyvoase|13 years ago|reply
I took a look at the team. Is it considered apropos to state how surprised you are that developers who come across as relatively senior are capable of making an incredibly fundamental security mistake such as this?

It looks like this was just the default Rails resource scaffolding.

[+] adgar2|13 years ago|reply
> Is it considered apropos to state how surprised you are that developers who come across as relatively senior are capable of making an incredibly fundamental security mistake such as this?

Only if you're concerned that you can't tell who is "relatively senior." In this case, your judgement was unfortunately wrong.

[+] ddellacosta|13 years ago|reply
I'm curious, did you let them know that this vulnerability exists before you wrote an article and posted it to HN?

If you let them know and they ignored you, then I understand that you'd want to write an article and spread it around. It's important that customers know when a company doesn't value their security. At that point, the proper way for them to handle it is to quietly fix it, and then let all their affected customers know so they have a chance to change their security settings.

However, if you didn't give them a bit of time first, then you are doing more damage than good to them--and their customers.

[+] RKearney|13 years ago|reply
They already knew and it was fixed before I posted this.
[+] adgar2|13 years ago|reply
Since you presented two possibilities here, neither of which is accurate, what is your response to the reality of the situation?
[+] sergiotapia|13 years ago|reply
Holy shit! I consider myself a mediocre programmer at best and even I wouldn't make such a dumb mistake. This is literally something only an amateur would do. I'm just awestruck that this would even happen. How?
[+] mattdeboard|13 years ago|reply
Beware the mistake you think you'll never make :)
[+] ainsleyb|13 years ago|reply
What we've found is that there are 2 mindsets: building and breaking. When you're building a product it's super hard to switch to the breaking mindset of security, simply because mental context switching is expensive and mentally exhausting. The most important thing is to force yourself into that mode before posting anything publicly. If you don't have the security experience, have a friend or service (like ours) look it over. Data is one of the most important assets to your company (or project), and any sort of disclosure can shut you down permanently.
[+] thisone|13 years ago|reply
it's something someone would do who's never worked with authentication and authorization before and doesn't have the fallback of a professional tester (aka breaker).

As people have mentioned, Rails doesn't have it built in. I've used gems to provide it, since I don't trust myself to write good enough security code (and really, why reinvent the wheel if I don't have to?).

In .NET we can use the ASP.NET membership provider. But you've always got to have that authorization part, which I think can get forgotten unless you've got a system under your belt or something/someone to crib from.

Sometimes you just don't think, and sometimes it becomes very public.

[+] zhodge|13 years ago|reply
Same here haha. I'd be pretty upset if that was my personal information up and available for all to see, but thankfully that wasn't the case.

How this happened is what I want to know too.

[+] paulsutter|13 years ago|reply
I'd love to see some concrete suggestions on the right way to do security for a site like this. This would take far more than protecting a few web pages from unauthorized access. What else should they do? How should they store sensitive data like AWS keys? Should they include a feature to force the creation of a new temporary key to prevent users from naively storing their master key? Is there a better mechanism than storing the key?
[+] gfodor|13 years ago|reply
One appropriate way to do this is to not build this type of application as a web service, but instead as a standalone application. (Disclaimer: I haven't really tried this product, for obvious reasons.)
[+] rdl|13 years ago|reply
I've been working on a similar kind of problem (linking to AWS); the right thing is to walk users through IAM credential creation (if you must have the keys), explaining exactly what rights are needed, why, and how to verify, and to not take anything more than you need.

I'd also encapsulate use of any user-provided sensitive data in an API then called by your service, and put some logic within the API (because I don't want a random web UI screwup to dump everything for everyone) -- rate limits, etc.
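The rate-limit part of that can be sketched in a few lines of plain Ruby (RateLimiter is a hypothetical in-memory counter invented for this sketch; production code would use a time window and shared storage such as Redis rather than a bare Hash):

```ruby
# Minimal sketch of "rate limits inside the API": an in-memory counter
# that refuses more than N sensitive reads per caller.
class RateLimiter
  def initialize(limit)
    @limit = limit
    @counts = Hash.new(0)
  end

  # Count this request and report whether the caller is still under limit.
  def allow?(key)
    (@counts[key] += 1) <= @limit
  end
end

limiter = RateLimiter.new(3)
4.times.map { limiter.allow?("user-12") }  # => [true, true, true, false]
```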

[+] jeromeparadis|13 years ago|reply
I'm puzzled. For the kind of service they want to offer, having such newbie security vulnerabilities makes you wonder if it even works. It's the kind of flaw that any moderately experienced developer never leaves open.
[+] citricsquid|13 years ago|reply
Security stuff aside, I'm curious why a system would be designed this way. Surely (in most systems) all users have the same page for managing their account (eg: /account) or is this system designed so that the management portion (eg: what a support person would use) is the same as what the users use? I don't think I've encountered a site that had accounts edited this way before.
[+] natrius|13 years ago|reply
The last time I used Rails (2009), the way RESTful URLs are set up encouraged this pattern. It's simple enough to restrict access to the user in question, but it is (or was) easy to overlook.
[+] notatoad|13 years ago|reply
don't confuse the URL structure with the application design. just because the url is /user/87/edit doesn't mean that there is a file called edit inside a folder called 87. almost any modern web development framework lets you create whatever URLs you want. i'm sure, internally, that every user's edit panel is powered by the same code.
[+] adgar2|13 years ago|reply
> I don't think I've encountered a site that had accounts edited this way before.

Any site that employs the pattern of specifying user account routes using the user's primary key in the URL needs to implement authorization. This site clearly skipped that step.

To me, this looks like the stereotypical bare-bones rails deployment by a newbie.

[+] alyx|13 years ago|reply
To be fair, the guy who owns that site did mention that it wasn't meant to be picked up by HN and was still in the early stages of development.
[+] philwelch|13 years ago|reply
If it's accessible on the public internet and asks for something as secure as API keys, that is when you should worry about security, not when it's "meant to be picked up by HN".
[+] kooshball|13 years ago|reply
I have mixed feeling about this.

On one hand we all want to move quickly, get users, add new features, etc etc.

On the other, security issues like this are so vital that nothing else really matters if your data is not secure. That's especially true for a BACKUP SERVICE that promises ridiculous stuff like "99.999999999%" uptime on the front page.

[+] notatoad|13 years ago|reply
still... developing an application and then bolting on some security over top of it later seems like a recipe for disaster. And pushing it to a public server before any security has been implemented is a very stupid thing to do.
[+] hellosmithy|13 years ago|reply
Wow. That's taking the idea of an MVP a little too far. Just goes to show it's worth doing a little research before handing over any real information to a new service like this.
[+] ricksta|13 years ago|reply
I would recommend using CanCan for authorization, if they haven't already, so you can't just type another user's user_id into the URL to view or edit their record: https://github.com/ryanb/cancan

CanCan is a great way to make sure you can only read or edit your own records in the database with Rails.

[+] catch23|13 years ago|reply
Cancan probably wouldn't have prevented this. It's not because someone didn't use some library. The developer probably just did a User.find(params[:id]) instead of doing something like current_user from whatever authentication system they were using. He probably used the scaffolding generator to make everything and forgot to go back and ensure things are secure.

It's also interesting that the aws key/secret are "masked" on the page, but you can just visit http://www.iceboxpro.com/users/12.json and get the formatted json representation with no masking.
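That .json leak is worth a sketch: default serialization dumps every attribute, so the fix is to whitelist what goes out. A plain-Ruby stand-in (UserRecord is invented here; in Rails you'd override as_json, e.g. with only: [:id, :email]):

```ruby
require "json"

# Stand-in showing why /users/12.json leaked: default serialization dumps
# every attribute, so the masking on the HTML page meant nothing.
class UserRecord
  def initialize(id, email, aws_key, aws_secret)
    @id, @email, @aws_key, @aws_secret = id, email, aws_key, aws_secret
  end

  # Expose only non-sensitive fields to the API.
  def as_json
    { "id" => @id, "email" => @email }
  end
end

user = UserRecord.new(12, "x@example.com", "AKIA-REDACTED", "secret")
JSON.generate(user.as_json)  # => {"id":12,"email":"x@example.com"}
```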

[+] nathan_long|13 years ago|reply
This kind of thing can be handled by doing something like `current_user.accounts.find(params[:id])` instead of `Account.find(params[:id])`.

If it's not your resource, it's like it doesn't exist for you.
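That "doesn't exist for you" behavior can be sketched in plain Ruby (UserScope and ALL_ACCOUNTS are stand-ins for the Rails association; in Rails the miss would raise ActiveRecord::RecordNotFound, which renders as a 404):

```ruby
# Stand-in for scoping lookups through the owner, as described above.
Account = Struct.new(:id, :owner_id)

ALL_ACCOUNTS = [Account.new(10, 1), Account.new(11, 2)]

class UserScope
  def initialize(user_id)
    @user_id = user_id
  end

  # Search only the records this user owns; anything else "doesn't exist".
  def find(id)
    ALL_ACCOUNTS.find { |a| a.owner_id == @user_id && a.id == id } or
      raise "RecordNotFound"
  end
end

UserScope.new(1).find(10)    # found: account 10 belongs to user 1
# UserScope.new(1).find(11)  # raises: account 11 belongs to someone else
```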

[+] louischatriot|13 years ago|reply
I really don't think it has anything to do with the startup being 10 minutes old. That's security 101 we're talking about ...
[+] alinajaf|13 years ago|reply
The optimistic and humble side of me wants to believe that this is a rare occurrence.

The truth is that I don't remember working on a single codebase that didn't have some eventually discovered vulnerability in auth(entication|orization). When I comb through controllers and find easily exploited access-control violations, I'm often met with responses similar to the behaviour of the developers at Icebox.

Rails does and will continue to protect you from a lot of mistakes, but nothing is going to help long term unless you know what words like authentication, access control and session management mean.

If you're a professional web developer and you care about your users then please buy and read through The Web Application Hacker's Handbook[1]. Every page is dripping with easily exploitable attacks you didn't think of. That last app you built is almost certainly vulnerable to a handful of them.

[1] http://www.amazon.com/gp/product/1118026470?ie=UTF8&tag=...

[+] metalruler|13 years ago|reply
Another hole that has been exploited in the past (speaking generally here, not about this specific startup) is a password reset function that confirms the email address it is sending the password/recovery link to. If the accounts are sequentially numbered, it's a trivial exercise to fetch a reset link for each member, and scrape the email address returned.
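The usual fix for that enumeration hole is to make the reset endpoint's response independent of whether the account exists. A plain-Ruby sketch (KNOWN_EMAILS and deliver_reset_link are stand-ins invented for this sketch):

```ruby
# The reset endpoint answers identically whether or not the account
# exists, so scraping sequential ids or guessed emails learns nothing.
KNOWN_EMAILS = ["alice@example.com"]

def deliver_reset_link(email)
  # mailer stub; a real app would email a single-use, expiring token
end

def request_reset(email)
  deliver_reset_link(email) if KNOWN_EMAILS.include?(email)
  "If that address is registered, a reset link has been sent."
end

request_reset("alice@example.com") == request_reset("nobody@example.com")  # => true
```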
[+] driverdan|13 years ago|reply
Major data leaks / security issues like this are not confined to sites that are 10 minutes old. Yesterday I discovered a recently funded startup is exposing all personal user data and activity to the world via their public, unprotected APIs. I'm hoping they fix it quickly before someone interested in harvesting that data finds it.
[+] andrewcooke|13 years ago|reply
if anyone else is having problems viewing this: it's a serious hole in icebox, the service featured here http://news.ycombinator.com/item?id=4619132

if you've used that service, the information you entered was publicly visible (key to access aws, etc) (the thread linked above says it has now been patched).

[i don't understand why, but when i access the link for this thread i get the gzipped page as a download; linux + chrome 22; firefox displays what appears to be gzipped data; wget saves the gzipped data as index.html; same behaviour for chrome on opensuse and ubuntu; windows 7 + ie9 (in a vm) shows the gzipped data in notebook; is no-one else seeing this?!]

[update: fixed now - it looks like it wasn't changing the content type]

[+] RKearney|13 years ago|reply
What OS/Browser are you using?
[+] cinbun8|13 years ago|reply
I would rather say 'Never give your information to a startup that does not handle it responsibly'. Not every startup is irresponsible with data.

In this case you have a strong point. Good work finding the security issue and reporting it.