Heroku offered me a paid penetration test contract, but required that I sign a retroactive non-disclosure agreement which would have precluded publishing this article.
Worth pointing out: virtually all paid pentesting engagements are delivered under NDA. In fact, more often than not, they're done under the far-stricter terms of a master agreement with detailed IP clauses.
If you're talking to any firm about having your app tested, get an NDA in place, and don't feel bad about asking. Nobody should put off getting their app checked out out of fear that the firm they're going to work with will try to make news out of the findings.
Obviously, if your customers find vulnerabilities themselves, all bets are off!
I can't blame Titanous under these circumstances, though.
It's a whole different ballgame when you're testing applications under contract (as I know you know), but for a budding researcher/security guy like him, it might be better to publish via responsible disclosure (as he just did) than to accept a temporary (presumably secret) contract.
I know that ethical disclosure is a whole can of worms that isn't completely relevant to this conversation, but I'm a firm believer that Titanous went about this the right way. First off, he is the customer (as a Heroku user), and secondly he notified Heroku and waited until the vulnerability was fixed before publishing his findings. If all researchers behaved in such a responsible way, we'd probably be in a better place as an industry.
Anyway, the point I'm making is that, sure, having a penetration testing contract with Heroku might have been nice for his career, but I think being able to point to this research that he's conducted on his own and responsibly disclosed is far better. Hell, I'd offer him a job right now if he seemed interested.
Of course there was never any question that any new penetration testing I did with them would be under NDA. What worried me is that the disclosure might never be made public if I was under an NDA that prevented me from publishing it. I didn't want to have any professional or ethical conflicts.
Very glad to see this not only documented, but patched extremely quickly. Heroku continues to impress, and Titanous is a credit to the security profession.
This is a really interesting insight into the way someone found access to a system, but could someone explain to me why he needed PGP keys from Heroku? I'm sure there is a good reason; if someone could tell me, that would be great.
It's standard practice to encrypt your emails when sending potential security notifications to the party that you're notifying. Encrypting to the vendor's key keeps out prying eyes, and signing the message assures the recipient that the sender is who they claim to be.
This is a minor point, but one of my favorite details: the use of a simple table for the "Disclosure Timeline", which is really the clearest way to illustrate the step-by-step sequence of events. I would love to see this become standard practice for any narrative in which order or visual comparison is vital. HTML tables are so easy to include (as are even bulleted lists).
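As a sketch, a disclosure-timeline table of the kind described takes only a few lines of HTML (the dates and events below are placeholders for illustration, not the actual timeline):

```html
<table>
  <tr><th>Date</th><th>Event</th></tr>
  <tr><td>Day 0</td><td>Vulnerability reported to vendor</td></tr>
  <tr><td>Day 1</td><td>Vendor acknowledges report</td></tr>
  <tr><td>Day 2</td><td>Fix deployed</td></tr>
  <tr><td>Day 30</td><td>Public disclosure</td></tr>
</table>
```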
A masterful explanation, on top of being an altruistic deed.
Perhaps a fantastic example of why keeping detailed configuration in environment variables is probably a bad idea. Your private SSH key isn't part of the environment. It's configuration. Treat it as such.
Not sure that would have helped much in this case: dumping the environment was the particular avenue he used to get at the "private" data, but he mentioned he also had source code access and the ability to run untrusted code. If important credentials were in, or available to, the code, it sounds like they would have been vulnerable anyway.
It's a hard problem to secure credentials that code needs in order to work from other code running as the same user, especially when someone has source code access to the "authorised" code.
Well, someone could have easily discovered the environment-variables vuln without getting the source code, although it is unclear why so much was disclosed in env vars. It suggests that everything runs as the same user; otherwise you would not get anything.
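To make the environment-variable point concrete, here is a minimal sketch (the variable names are hypothetical, purely for illustration) of why secrets in the environment are visible to any code running as the same user: the environment is process-wide and is inherited by child processes, so untrusted code need only enumerate it.

```python
import os

# Simulate credentials injected into the environment,
# as a PaaS might do (hypothetical names, fake values).
os.environ["DATABASE_URL"] = "postgres://user:s3cret@host/db"
os.environ["AWS_SECRET_KEY"] = "not-a-real-key"

# Any untrusted code running in the same process (or in a child
# process, which inherits the environment) can enumerate it all:
leaked = {k: v for k, v in os.environ.items()
          if any(hint in k for hint in ("SECRET", "KEY", "TOKEN", "URL"))}

for name, value in sorted(leaked.items()):
    print(f"{name}={value}")
```

A credentials file readable only by a more privileged user, or a broker that hands out short-lived tokens, narrows this exposure, though as noted above nothing fully protects secrets from other code running as the same user.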