Good thing Satoshi wasn't scared of cryptography or we wouldn't have bitcoin. Also good thing the Truecrypt devs weren't scared of it, or the Debian devs, or the OpenBSD devs and others who gave us OpenSSH.
Moxie Marlinspike doesn't seem to be scared of it either. According to him it's not impossible to learn simple, tried-and-true methods for crypto engineering that are all spelled out in Schneier's books and various white papers. These standards were completely ignored by the Cryptocat guys (who also ignored repeated warnings from auditors). So if you can read, comprehend, and do bare-minimum research on NIST standards, you're pretty safe rolling your own crypto software, provided you pay attention to detail, understand the Doom Principle, know about PRNGs and their flaws, and know not to attempt to make your own primitives. If you guys can push out incredibly complex and relatively secure financial trading software, there's no reason you can't roll your own SHA-256 on your Android phone to replace SHA-1 FDE, or even design RedPhone like Moxie did. It would also help to hang around the hashcat forums, read Schneier's Crypto-Gram newsletter, and watch some lectures about crypto software engineering.
No, some "tried and true" methods of using crypto are in one of Schneier's books, _Practical Cryptography_. _Practical_ is great, but it's not complete, a fact mitigated by the effort Schneier and Ferguson go to in that book to convince people not to write casual crypto.
The methods described in _Applied Cryptography_ are unfortunately well-tried, but few of them are true. _Applied_ is an almanac of cryptographic concepts. Where _Practical_ tries hard to present best practices at every point in the book, _Applied_ instead strives for the broadest coverage; it's a survey, not an instruction guide. Unfortunately for all of us, ~20 years of _Applied_ readers have tried to put directly into practice the material in that book, much of which is (relative to modern standards) half-baked.
It's also important to know that _Practical_ is showing its age. There are very common, very serious vulnerabilities that _Practical_ does very little to prevent. For instance, "Design Rule 4" in _Practical_, "The Horton Principle", implies that message authentication should precede encryption. This construction is now disfavored; protocols that use MAC-then-encrypt have been broken with side channels abetted by attacker-chosen ciphertext.
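For readers who haven't seen the two orderings side by side, here is a minimal sketch of the now-preferred encrypt-then-MAC shape, using Python's stdlib `hmac` and a hypothetical XOR-keystream stand-in for a real cipher (an illustration of the ordering only, not a vetted construction):

```python
import hmac, hashlib, os

def _keystream(enc_key: bytes, nonce: bytes, length: int) -> bytes:
    # Hypothetical stand-in cipher: a SHA-256 counter-mode keystream.
    # Illustrative only; a real system would use a vetted AEAD mode.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(enc_key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    body = bytes(p ^ k for p, k in zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    ciphertext = nonce + body
    # The MAC covers the *ciphertext* (encrypt-then-MAC), so a tampered
    # message is rejected before any decryption is attempted.
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def decrypt(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    ciphertext, tag = blob[:-32], blob[-32:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):   # constant-time compare
        raise ValueError("authentication failed")
    nonce, body = ciphertext[:16], ciphertext[16:]
    return bytes(c ^ k for c, k in zip(body, _keystream(enc_key, nonce, len(body))))
```

The point is purely the ordering: because the tag is verified over the ciphertext first, no decryption logic ever runs on attacker-modified input, which is what shuts down the error-oracle class of attacks.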
Other concepts missing from _Practical_: elliptic curves (particularly given the endangered status of RSA, which would be a nit if not for the extremely detailed coverage Schneier gives to more theoretical threats to AES), (EC)DSA parameter tampering, the notion of minimizing randomness in cryptosystems, the modern AE modes, an in-depth treatment of side channels, particularly error oracles (where "in-depth" might mean "at least as much coverage as is given to the question of what block cipher mode to use"), applications of hash collisions, RSA message formatting (it doesn't even cover OAEP), and secure key derivation.
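To make one of those missing topics concrete, here is a sketch of secure key derivation in the style of HKDF's extract-and-expand construction (RFC 5869), built from stdlib HMAC-SHA256; it illustrates the idea, and is not a substitute for a vetted library:

```python
import hmac, hashlib

def hkdf(ikm: bytes, salt: bytes, info: bytes, length: int) -> bytes:
    # Extract: concentrate the input keying material into a
    # fixed-length pseudorandom key.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    # Expand: stretch the PRK into `length` output bytes, binding
    # each block to the context string `info` and a block counter.
    num_blocks = (length + 31) // 32
    okm, block = b"", b""
    for counter in range(1, num_blocks + 1):
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
    return okm[:length]
```

Distinct `info` strings let one master secret safely yield independent sub-keys (one for encryption, one for authentication, and so on).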
I write this from a place of love; _Practical Cryptography_ is one of my all-time favorite software security books. It's one of those books you can "read backwards" to learn how to break systems in addition to learning how to build them. With age, though, _Practical_ is becoming more useful as a breaker's guide and less as a builder's guide.
There is no book anywhere that a generalist developer can read to learn how to build a secure cryptosystem from scratch, and generalist developers should be relying instead on high-level libraries like Keyczar and NaCl.
I don't know a thing about Satoshi's competence (I couldn't, because nobody knows who he is), but Moxie Marlinspike has spent over 15 years building his competence. If you're Moxie, build whatever you want.
You're right, but there's still a valid point I think.
I'm a reasonably competent home mechanic. I happily rebuild engines and gearboxes for cars and motorcycles at home. I can modify and tune engines. I easily do all my household's vehicle maintenance. And I happily help friends out with their repairs and maintenance. One thing I won't do, though, is fix a friend's brakes. I'm happy working on my own brakes, and happy to accept the responsibility for my actions. I won't take that responsibility on for friends, though. If you're not happy/capable of fixing your own brakes, take it to someone qualified and insured.
I think crypto is the same. There's nothing wrong with learning about and writing your own crypto as a curious amateur. But deploying that crypto where it can get other people killed is a bigger deal. Don't be "the Cryptocat guy": "Oh yeah, I just found out you can't use water for brake fluid, turns out it boils when it gets hot! Who knew? Anyway, I just put brake fluid in my brakes, you should probably upgrade yours too sometime…"
I think if you want to _deploy_ crypto to other people, you owe it to yourself to have at least as much "professional training" as an auto mechanic gets before they're allowed to repair your brakes. And to further stretch the analogy, I'll point out that most mechanics these days don't "repair" much in brake systems; they'll just replace large components. It's becoming more common to not be able to buy seals and pistons any more; you're expected to just replace entire calipers or master cylinders. In my head that's the same approach as saying "I'll choose to use GPG or OpenSSL, but I won't 'do my own crypto'". For someone who's _not_ prepared to put in a few hundred hours of proper crypto training, I think "rolling your own crypto software" is something best left to personal/educational projects, and never deployed in a way that other people might rely upon it.
"Good thing Satoshi wasn't scared of cryptography or we wouldn't have bitcoin."
...a system for which there is no security definition, no security proof, a known polynomial time attack (for some vague notion of "attack"), etc. I do not think this is the best example you can come up with.
One of the problems here is that cryptography cannot just be cobbled together from component pieces. Things do not always compose securely. Bitcoin is not a signature system, so even if the ECDSA implementation is perfect it does not guarantee that Bitcoin is secure. One of the things that makes cryptography, especially in the public-key setting, difficult is the very precise assumptions and very precise definitions of security; it is easy to make an assumption that is not implied by a security definition, and very easy to violate a key assumption without even realizing it. Worse, cryptosystems must be composed with other software, and such compositions can be insecure (which is basically what happened with Skype).
It's not that people should be scared away. Rather, we need to develop better tools that help programmers catch or avoid these kinds of bugs.
What I've been doing is just reading the code of already-audited implementations of cryptography to understand it better, for example Skein/Threefish. As a test to see what kind of speed SHA-3 would have on a mobile device, I implemented my own Threefish-based SHA-3 candidate on Android for full disk encryption. First I looked at PySkein, the Java Skein implementation, and all the other implementations released by pros. Then I looked at everything the authors who submitted the other NIST SHA-3 finalists wrote, blogged, and argued about on mailing lists. Then I read about the Android SHA-1 design decisions. Then I watched lectures on crypto software engineering. Then I looked up everything about accelerometer side channels, PRNG problems and gotchas, Moxie's blog posts, and the YAFFS source he used for WhisperCore, and went through tons of dos-and-don'ts comments both here and on Schneier's blog about writing secure software. Then I watched the BSDCon talk on coding the secure OpenBSD way, and talked with maintainers who implement crypto on the OpenBSD mailing list.
The result is SHA-3 full disk encryption for Android. It's certainly not foolproof, nor do I claim it's secure, but a lot of people are digging through the source and forensics hackers are interested in seeing what they can do with it. I realize I broke the cardinal rule of using only something that has been proven for decades (like AES-256), but it's been a good learning experience, that's for sure.
Thanks for sharing your perspective and knowledge here.
Really important to know how to be safe and what to look out for. I wasn't getting the feeling that many infosec professionals on HN were able/willing to outline reasonable guidelines for experienced devs/admins to follow.
For Bitcoin, what would the consequences have been of getting it wrong at the beginning? Essentially no money would have been lost, no lives lost, etc. Also, it appears that the person who created it had a good understanding of crypto and the recent literature (and may even have written some of it).
It is possible to learn it, but learn it first and ship it to the public later, or, if you must ship, plaster great big experimental/insecure warnings all over it.
Satoshi is more than likely one or more individuals with significant crypto experience. Open{SSH/BSD/SSL} developers have significant crypto experience.
Don't get me wrong I am a huge Debian fanboy, but how much crypto code do Debian developers write?
This is not really the most insightful comment, but anyway.
I found the article to be needlessly defeatist.
I am not a crypto expert. I've read Practical Cryptography and have a lot of experience with software engineering in general.
This article (yet again) loosely says "crypto" without specifying whether he's talking about "crypto protocols" or "crypto primitives" (I use "protocol" in a theoretical sense: saving a file piped through gpg with the passphrase remembered in memory constitutes a protocol).
It's well understood that mere mortals shouldn't create "crypto primitives". But I would argue that we're soon going to reach the point where many software engineers will have to understand crypto protocol creation.
Just like many software engineers in the past 10 years or so have had to become aware of multicore (NoSQL, horizontal scaling, Go/Rust concurrency, probably even async callbacks in JavaScript, etc. are all different aspects of multicore, imho).
I don't think we as a profession should abdicate our responsibility to store/transport data securely by just saying "crypto is hard; so don't do it".
I also take issue with the claim that crypto code is either 100% working, or 0% working. From a crypto theory point-of-view, yes, that is how cryptographers think.
But in practice, there is a VAST difference between an attack that requires 2^32 INTERACTIONS with a remote server and one that requires 2^32 COMPUTATIONS on the attacker's machine. A cryptographer would say both attacks are equally easy (kinda like O(n) notation).
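A back-of-envelope comparison makes that gap vivid. The rates below are assumptions picked purely for illustration (hash throughput and feasible request rates vary enormously):

```python
# Both attacks need 2^32 attempts; only the medium differs.
attempts = 2 ** 32

# Local computation: assume ~10 million hash evaluations per
# second on one commodity core (a rough, illustrative figure).
local_seconds = attempts / 10_000_000

# Remote interaction: assume a generous 1,000 requests per second
# against the target server, ignoring detection and rate limiting.
remote_seconds = attempts / 1_000

local_hours = local_seconds / 3600   # well under an hour of CPU time
remote_days = remote_seconds / 86400  # weeks of hammering the server
```

Same exponent, wildly different attacks in practice: minutes of quiet local work versus weeks of loud, loggable traffic aimed at someone else's machine.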
Just my two cents.
And finally, re: the RNG vulnerability in Cryptocat, that is very bad and just sloppy coding. But even that vulnerability required that the attacker compromise the private SSL key of cryptocat's server. Defense in depth FTW.
No, it's not just crypto primitives, it's crypto protocols, too. It's even designing other protocols that run on top of crypto protocols (see CRIME). Even implementing existing crypto primitives is hazardous: D.J. Bernstein's work on cache timing attacks against AES is proof of that.
The problem with crypto is that it's a specialist profession that generalists think they can do. In reality, crypto is more like law than software engineering: You can't just reason it from first principles, and you can't write any tests that will tell you that it's working. You have to know the specific attacks that people are capable of, and come up with a strategy that will avoid not only the current attacks that you know about, but future attacks that haven't been discovered yet. You're tasked with building a system of obstacles that nobody will be able to find any clever workarounds for, and your adversaries are smarter than you, more numerous than you, more well-funded than you, they're experts in breaking crypto, and they're all from the future.
That's not something a handful of engineers working in isolation can do. It takes the whole community years to even come close.
Saying that software engineers can design crypto protocols is like saying that software engineers can be their own lawyers. A few can, but almost all engineers who think they know the law are wrong. Even lawyers routinely lose cases. The best we can do is to try to give them better tools to handle the common cases (e.g. Creative Commons, SSH, better APIs), and remind them to talk to the specialists before getting too creative.
The whole purpose of Cryptocat is that people are supposed to be able to use it without trusting the integrity of the server. That's practically the project thesis of Cryptocat.
I don't think practical cryptographers ignore the difference between computations and interactions—there are different threat models and they are carefully studied. Part of the problem might be that a system designed to be safe against 2^32 interactions is deployed in a place where the system is vulnerable to attacks on the order of 2^32 computations.
The problem I think the author is highlighting is that the mistake risk distribution is hard to understand. Some kinds of off by one errors may cause relatively small reductions in security margins. Some kinds may cause complete loss of system integrity. It's hard to distinguish between the two without extensive testing and expertise. Furthermore, the kind of user who would never tell you about your error is exactly the kind who will find it.
Finally, loss of your security infrastructure is unlikely to cause just small visual discomfort to your users—it's likely to hurt them materially.
The thing about crypto is that nobody can get it right the first time. It requires two things that most programmers have in relatively short supply: radical transparency and radical humility.
I mentioned on another crypto thread the frustrating fact that more than one professional crypto friend of mine is working on designs that could replace PGP, not to mention a lot of terrible ad hoc app crypto --- but because they're pros, they're uncomfortable sharing designs until they're confident that they've fully validated them. That's what humility looks like: knowing you're an industry-leading expert at cryptographic design and still waiting months or even years to publish so you can make sure you got things right.
That attitude to crypto is pervasive, annoying, and wrong. We don't tolerate the "you're too stupid to use that" attitude in any other part of software development, and we shouldn't tolerate it in cryptography.
Every developer needs to touch crypto. Encrypted communications needs to be our default. And yes, of course, we should prefer verified, standard algorithms (NSA Suite B, for example).
It's OK to get it wrong, it's OK to fail forward, even with cryptography. ROT13 will protect you very well, if your attack vector is someone glancing over your shoulder for 1 second. As long as the code is open, and you're honest about what it does, you've made people a little bit safer.
There's a fair amount of gloating around Cryptocat, but it protected people's communications from me, because I didn't know how to break it. So that's better than nothing.
>We don't tolerate the "you're too stupid to use that" attitude in any other part of software development, and we shouldn't tolerate it in cryptography.
We shouldn't, but we should provide tools that allow software engineers to securely design applications without having to be crypto experts, in much the same way I can write Python code without being a kernel hacker. Two examples spring to mind: authenticated HTTPS API calls and bcrypt. These both work securely without requiring deep knowledge, and they are so easy to set up that it is unlikely someone will roll their own.
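As a concrete instance of that kind of tool, Python's standard library ships `hashlib.scrypt`, a memory-hard password KDF in the same spirit as bcrypt. A sketch of the usual salt-and-verify pattern (the cost parameters here are common interactive-login choices, shown for illustration rather than as an official recommendation):

```python
import hashlib, hmac, os

def hash_password(password, salt=None):
    # scrypt's cost parameters (n, r, p) encode the expert knowledge:
    # they make brute force expensive in both CPU and memory without
    # the caller needing to understand why.
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def check_password(password, salt, digest):
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)  # constant-time compare
```

The caller never touches a primitive directly; the only decisions left are "store the salt next to the digest" and "compare in constant time", both handled above.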
You can PLAY with crypto: discuss it, try things out, and have fun with it. After you have been doing that and hanging out in the right circles for a few years, reading a lot, and probably breaking other people's ideas and implementations, not just your own, then you will be in some sort of position to judge whether your work may be safe to unleash on the public as anything more than a low-security experiment.
One of the problems with security and crypto is that the people who really understand it make fairly weak promises, such as that it is "Pretty Good Privacy", while the incompetent, greedy, or malicious make strong marketing claims about the security they are offering. Emphasis on incompetent in the Cryptocat case.
Crypto is an area where the Dunning Kruger effect[1] seems both especially strong and especially dangerous.
Everybody should be able to fly a plane, too. Think of how awesome that would be! A rebirth of the American general aviation industry; new airplane designs; solutions to congested airports.
It does not follow from that sentiment that anyone should be able to jump into the cockpit of a Cessna and just figure things out for themselves.
There's a fair amount of gloating around Cryptocat, but it protected people's communications from me, because I didn't know how to break it. So that's better than nothing.
Not if 'nothing' is "don't send the message", rather than the "send the message in the clear" that you're assuming.
Bad crypto gives end users false confidence in the security of their messages. They then send messages they normally wouldn't, and suffer the consequences when those messages end up being read by others.
Amateurs can play with crypto all they like for fun, but they have no business releasing a product to end users.
The math behind OTP is pretty simple, but I may have made a mistake. I've posted the source code to crypto forums, HN, wilders security and emailed it to several prominent crypto developers/experts. No one seems to care nor want to look at it in-depth.
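For context, the math in question really is small: a one-time pad is just XOR against a truly random pad that is as long as the message and never reused. A sketch of the textbook scheme, with all of the hard key-management questions deliberately left out:

```python
import os

def otp_encrypt(message: bytes):
    # The pad must be truly random, exactly as long as the message,
    # and used exactly once -- any reuse breaks the scheme completely.
    pad = os.urandom(len(message))
    ciphertext = bytes(m ^ p for m, p in zip(message, pad))
    return pad, ciphertext

def otp_decrypt(pad: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ p for c, p in zip(ciphertext, pad))
```

The cipher itself is information-theoretically secure; everything difficult (generating, distributing, storing, and destroying pads) lives outside these ten lines, which is why reviewers tend to focus there.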
I've worked as a dev where we encrypted research data using standard, industry-accepted crypto (RSA and symmetric AES, etc) as well.
Having said that, I'm not an expert. And the real experts (Bruce Schneier) won't verify anything. They just say, "It's not been broken yet, which is a good indication."
Dive in and write some crypto code. Do it and make mistakes. That's how you learn. Not every program is life or death/mission critical. And if you never do it, you won't learn how. We learn by making mistakes.
Tarsnap is nice crypto code if you like C. It's easy to read too. I have no relation to Colin Percival or tarsnap. He's an expert, and even he makes mistakes:
No one seems to care nor want to look at it in-depth.
I'm not an expert, but I wouldn't look at an OTP implementation either. Key management is already hard enough. It's hard to imagine a useful system you could build on a one-time pad.
All these "crypto.... scary!" things are right of course. But somehow tiresome. They're ripe for satire... something involving Chuck Norris being the only one perfect enough to encrypt stuff, just as he's able to compress random data. 100%. With his biceps.
I would agree, but for something as important as cryptography, I would prefer people err on the side of being too cautious or repetitive if it means fewer people making the same mistake twice. It's the same reason why I don't mind repeated HN submissions if there's a long enough delay between them: the discussion was clearly relevant the first time, and new people have joined the community since then and deserve an equal chance to talk about the subject with a new perspective.
This point bears repeating every so often: cryptography is difficult, and those of us who haven't spent years in its mathematical bowels are not qualified to create usable algorithms. Meaning: the encryption code you wrote yourself is probably hackable. Easily hackable.
Great link! The takeaway for me is that in an ideal world we would be able to use only strong peer-reviewed crypto written by programmers experienced in codebreaking and cryptanalysis. Phil Zimmermann admits in that article that this is his ideal standard, and yet his own PGP software doesn't meet it.
So we're in a situation where we have to make imperfect choices. Is there any reliable guide that discusses the current best practices in securing things like ssh, VPN, etc?
I get the feeling that the preference of many of the infosec professionals on HN is that a properly configured SSH tunnel is the only secure way to communicate between machines on the Internet. ( IPSec VPN connections I guess are weaker?)
So just use stunnel or autotunnel? Tunneling traffic imo seems like an inelegant hack, but are there any crypto libraries that are reliable?
Crypto is hard, but to be fair it's also not easy to figure out what crypto to use if you want to rely on external libraries or systems like SSH.
I feel kind of divided about using Cryptocat as an example. On the one hand, yes, off-by-one errors (and similar simple mistakes) are really easy to make if you're writing something with low feedback.
But the Cryptocat example is frustrating because the code is so bad. Not in a "poorly written" sense, but in a "going about things in a completely insane way" sense. The code generates a random number between 0 and 1 not by dividing a 53-bit random number by 2^53, but by generating 16 decimal digits, with the rationale that 2^53 ≈ 10^16. If the code hadn't been trying to do the wrong thing in the first place, there wouldn't have been an opportunity for that off-by-one error.
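For contrast, the standard construction described above fits in a few lines (sketched here with `os.urandom` standing in for whatever CSPRNG the platform provides):

```python
import os

def random_unit_float() -> float:
    # Take 64 random bits, keep the top 53 (the width of a double's
    # mantissa), and scale into [0, 1). Every representable step is
    # equally likely -- no decimal-digit detour, no bias to get wrong.
    bits = int.from_bytes(os.urandom(8), "big") >> 11   # 53 bits remain
    return bits / (1 << 53)
```

Because the construction maps bits to the output range directly, there is simply no place for a digit-rejection loop or its off-by-one to live.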
My point isn't that you'll be protected from simple classes of errors as long as you're coding sanely, but rather that that is the lesson you teach if you point to Cryptocat as your example. Crypto is hard, but you can't make that point very well by pointing to code written by someone who doesn't know the correct (and incredibly standard) solutions to common non-crypto problems.
"Crypto can't be a «bit broken». It can't be «mostly working». Either it's 100% correct, or you shouldn't have bothered doing it at all. The weakest link breaks the whole chain."
I completely agree. Ever seen a password hash mega-post on Pastebin? Brought to you by bad crypto. Saying "I don't know" is one of the hardest things to do, and also one of the most powerful. I applaud you for knowing your limits in this regard. I like to say crypto is like surgery: best left to the experts. You never want to say "I kinda messed up that triple bypass," because there are major consequences. The same is true for virtually all aspects of cryptography. Thank you for this post!
I don't like this attitude. It's really anti-hacker.
Of course, the world is littered with cryptosystems that some fool thought were secure and weren't. XOR isn't encryption. ;)
I would suggest that hackers interested in crypto take a sober and unafraid look at the history of crypto and then spend time breaking ciphers, "hacking" in all senses of the word. Read the crypto papers and work on the advanced math until you really and truly grasp "all the things". Pour those 10K hours in....
Crypto is hard. Doing it mostly right requires humility and hard study, and then you have a good chance of being wrong. Nothing to be afraid of if you understand the situation and understand the years and years it takes to really become good at it...
I have said this before, but what we really need is the equivalent of SQL for cryptography. We need a language that can be used to describe what kind of security is needed from a cryptosystem and a compiler to turn that into code. It is not just that cryptography is hard to implement; it is also easy to use it incorrectly, compose it poorly with other systems (see the Skype attack), etc.
I recently skimmed a giant manuscript about the creation of a language for defining cryptographic requirements and primitives (120ish pages). I seem to recall that the author was either from Stanford and/or had a name that begins with a "P". I wish I had saved the citation. Hopefully someone else will be able to recognize what I am referring to with my shoddy reference.
It's not just a matter of perfection in the field of programming. It's an entirely different field that happens to involve some of the same things and ideas. Just because you're a driver, it doesn't mean you're a qualified mechanic, and if you're a mechanic, it doesn't mean you're a forensic crash investigator. They are related in terms of subject matter, but they are not equivalent skill sets.
As programmers, we are dealing with interfaces and movement/copying of data. Security deals with epiphenomenal leakage of access and data. It is not programming, though it involves programming.
These also seem like good reasons not to use proprietary/closed source solutions.
At least with Cryptocat, everyone knows about the security issues now and can take precautions as if their entire encrypted communications had been compromised. With proprietary solutions you have to rely on an honor system where you hope the vendor tells you there is an issue.
The article implies that the off-by-one error in the Cryptocat random number generator is catastrophic. Correct me if I am wrong, but surely that is hardly the case. The random numbers generated are a tiny bit less uniformly distributed than they would be without the error. That's an imperfection, but does it have any practical consequences at all? I know that flaws like this can introduce weaknesses that are vulnerable to attacks, but my intuition tells me this flaw is very close to the "doesn't matter" end of the catastrophic-to-doesn't-matter spectrum. I am more than willing to be educated by someone with specialised domain knowledge.
There ought to be a good set of FOSS unit tests for those who dare implement their own crypto. For instance, you let it hook in to your PRNG, and it'll tell you if the output is random-looking enough.
It wouldn't be a panacea for bad crypto, and it does create a risk of people thinking "oh, it passed all of the tests, it must be secure," while still implementing it overall incorrectly. But I still think it would mitigate these "foolish/easy" errors and allows devs to focus on proper overall implementation.
When you can't have bugs, you can have proofs. While unknown angles of attack can exist, it is still possible to prove that the known ones are closed.
But if few people bother themselves with encryption code then there are not many eyeballs on it and mistakes or backdoors become more likely. Probably encryption code should not be left entirely to the "experts" at NSA, Microsoft, etc.
Assuming you're talking about implementing, and not using preexisting, vetted libraries:
Spend a long time in the community. Listen closely to the people who have a good reputation. Have your code publicly vetted before it's ever used. Expect your code to be torn apart.
"Writing a testcase for this would have required complicated thinking and coding which would be as likely to contain an error as it was likely for the code to be tested to contain an error."
I don't know the details, but why couldn't you just collect a million bits of supposedly random data from the generator and run it through the NIST test suite? It's true that testing for randomness does require complicated thinking, but fortunately it's been done. Standing on the shoulders of giants and whatnot.
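The simplest check in that NIST suite (SP 800-22), the frequency or "monobit" test, is small enough to sketch here. Passing it is only a sanity check, and says nothing about cryptographic strength:

```python
import math, os

def monobit_pvalue(data: bytes) -> float:
    # NIST SP 800-22 frequency (monobit) test: in a random bit
    # stream, ones and zeros should appear about equally often.
    n = len(data) * 8
    ones = sum(bin(byte).count("1") for byte in data)
    s = abs(2 * ones - n) / math.sqrt(n)
    # Conventionally, p >= 0.01 means "consistent with random".
    return math.erfc(s / math.sqrt(2))

p = monobit_pvalue(os.urandom(125_000))  # one million bits
```

A pad of all-ones bytes fails with a p-value of essentially zero, while decent CSPRNG output should land well above the 0.01 threshold almost every run.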
[+] [-] dobbsbob|12 years ago|reply
Moxie Marlinspike doesn't seem to be scared of it either. According to him it's not impossible to learn simple, tried and true methods for crypto engineering that are all spelled out in Schneier's books and various white papers. These standards were completely ignored by the cryptocat guys (and ignored repeated warnings from auditors). So if you can read, comprehend and do bare minimum research of NIST standards you're pretty safe rolling your own crypto software if you know about attention to detail, the doom principal, PRNGs and their flaws, and know not to attempt to make your own primitives. If you guy's can push out incredibly complex and relatively secure financial trading software there's no reason you can't roll your own SHA-256 on your Android phone replacing SHA-1 FDE or even design Redphone like Moxie did. Would also help if you hung around hashcat forums, read Schneier's cryptogram newsletter, and watched some lectures about software crypto engineering.
[+] [-] tptacek|12 years ago|reply
The methods described in _Applied Cryptography_ are unfortunately well-tried, but few of them are true. _Applied_ is an almanac of cryptographic concepts. Where _Practical_ tries hard to present best practices at every point in the book, _Applied_ instead strives for the broadest coverage; it's a survey, not an instruction guide. Unfortunately for all of us, ~20 years of _Applied_ readers have tried to put directly into practice the material in that book, much of which is (relevant to modern standards) half-baked.
It's also important to know that _Practical_ is also showing its age. There are very common, very serious vulnerabilities that _Practical_ does very little to prevent. For instance, "Design Rule 4" in _Practical_, "The Horton Principle", implies that message authentication should precede encryption. This construction is now disfavored; protocols that use mac-then-encrypt have been broken with side channels abetted by attacker chosen ciphertext.
Other concepts missing from _Practical_: elliptic curve --- particularly given endangered status of RSA, which would be a nit if not for the extremely detailed coverage Schneier gives to more theoretical threats to AES, (EC)DSA parameter tampering, the notion of minimizing randomness from cryptosystems, the modern AE modes, an in-depth treatment of side channels (where "in-depth" might mean "at least as much coverage as is given to the question of what block cipher mode to use") --- particularly error oracles, applications of hash collisions, RSA message formatting (it doesn't even cover OAEP), and secure key derivation.
I write this from a place of love; _Practical Cryptography_ is one of my all-time favorite software security books. It's one of those books you can "read backwards" to learn how to break systems in addition to learning how to build them. With age, though, _Practical_ is becoming more useful as a breaker's guide and less and a builder's guide.
There is no book anywhere that a generalist developer can read to learn how to build a secure cryptosystem from scratch, and generalist developers should be relying instead on high-level libraries like Keyczar and Nacl.
I don't know a think about Satoshi's competence (I couldn't, because nobody knows who he is), but Moxie Marlinspike has spent over 15 years building his competence. If you're Moxie, build whatever you want.
[+] [-] bigiain|12 years ago|reply
I'm a reasonably competent home mechanic. I happily rebuild engines and gearboxes for cars and motorcycles at home. I can modify and tune engines. I easily do all my households vehicle maintenance. And I happily help friends out with their repairs and maintenance. One thing I won't do though, is fix a friends brakes. I'm happy working on my own brakes, and happy to accept the responsibility for my actions. I won't take hat responsibility on for friends though. If you're not happy/capable of fixing your own brakes, take it to someone qualified and insured.
I think crypto is the same. There's nothing wrong with learning about and writing your own crypto as a curious amateur. But deploying that crypto where it can get other people killed is a bigger deal. Don't be "the cryptocat guy", "Oh yeah, I just found out you can't use water for brake fluid, turns out it boils when it gets hot! Who knew? Anyway, I just put brake fluid in my brakes, you should probably upgrade yours too sometime…"
I think if you want to _deploy_ crypto to other people, you owe it to yourself to have at least as much "professional training" as an auto mechanic gets before they're allowed to repair your brakes. And to further stretch the analogy - I'll point out tha most mechanics these days don't "repair" much in brake systems, they'll just replace large components - its becoming more common to not be able to buy seals and pistons any more - you're expected to just replace entire calipers or master cylinders. In my head that's the same approach as saying "I'll choose to use GPG or OpenSSL, but I won't 'do my own crypto'" - for someone who's _not_ prepared to put in a few hundred hours of proper crypto training, I think "rolling your own crypto software" is something best left to personal/educational projects, and never deployed I a way that other people might rely upon it.
[+] [-] betterunix|12 years ago|reply
...a system for which there is no security definition, no security proof, a known polynomial time attack (for some vague notion of "attack"), etc. I do not think this is the best example you can come up with.
One of the problems here is that cryptography cannot just be cobbled together from component pieces. Things do not always compose securely. Bitcoin is not a signature system, so even if the ECDSA implementation is perfect it does not guarantee that Bitcoin is secure. One of the things that makes cryptography, especially in the public-key setting, difficult is the very precise assumptions and very precise definitions of security; it is easy to make an assumption that is not implied by a security definition, and very easy to violate a key assumption without even realizing it. Worse, cryptosystems must be composed with other software, and such compositions can be insecure (which is basically what happened with Skype).
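A toy illustration (not drawn from Bitcoin or any real system) of how sound pieces compose insecurely: SHA-256 is a fine primitive, but using it to derive a single pad that's reused for every block makes equal plaintext blocks visibly equal in the ciphertext, leaking structure to anyone who can see it.

```python
import hashlib

def toy_ecb_encrypt(key: bytes, plaintext: bytes, block: int = 16) -> bytes:
    # The primitive (SHA-256) is secure; the composition is not:
    # every block is XORed with the same key-derived pad.
    pad = hashlib.sha256(key).digest()[:block]
    out = b""
    for i in range(0, len(plaintext), block):
        chunk = plaintext[i:i + block].ljust(block, b"\0")
        out += bytes(a ^ b for a, b in zip(chunk, pad))
    return out

ct = toy_ecb_encrypt(b"secret key", b"ATTACK AT DAWN!!" * 2)
# Identical plaintext blocks produce identical ciphertext blocks,
# so an eavesdropper learns structure without knowing the key.
assert ct[:16] == ct[16:32]
```

Real modes avoid this by mixing in a per-message nonce or chaining value, which is exactly the kind of assumption that's easy to violate without realizing it.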
It's not that people should be scared away. Rather, we need to develop better tools that help programmers catch or avoid these kinds of bugs.
[+] [-] dobbsbob|12 years ago|reply
Result is SHA-3 full disk encryption for Android. It's certainly not foolproof, nor do I claim it's secure, but a lot of people are digging through the source and forensics hackers are interested in seeing what they can do with it. I realize I broke the cardinal rule of using something that hasn't been proven for decades first (like AES-256), but it's been a good learning experience, that's for sure.
[+] [-] 23david|12 years ago|reply
Really important to know how to be safe and what to look out for. I wasn't getting the feeling that many infosec professionals on HN were able/willing to outline reasonable guidelines for experienced devs/admins to follow.
[+] [-] josephlord|12 years ago|reply
It is possible to learn it, but learn it first and ship it to the public later - or, if you must, ship it with great big experiment/insecure warnings all over it.
[+] [-] dfc|12 years ago|reply
Don't get me wrong I am a huge Debian fanboy, but how much crypto code do Debian developers write?
[+] [-] lightyrs|12 years ago|reply
[+] [-] taway2012|12 years ago|reply
I found the article to be needlessly defeatist.
I am not a crypto expert. I've read Practical Cryptography and have a lots of experience with software engineering in general.
This article (yet again) loosely says "crypto" without specifying whether he's talking about "crypto protocols" or "crypto primitives" (I use "protocol" in a theoretical sense: saving a file piped through gpg with the passphrase remembered in memory constitutes a protocol).
It's well understood that mere mortals shouldn't create "crypto primitives". But I would argue that we're soon going to reach the point where many software engineers will have to understand crypto protocol creation.
Just like many software engineers in the past 10 years or so have had to become aware of multicore (NoSQL, horizontal scaling, Go/Rust concurrency, probably even async callbacks in Javascript etc. are all different aspects of multicore, imho).
I don't think we as a profession should abdicate our responsibility to store/transport data securely by just saying "crypto is hard; so don't do it".
I also take issue with the claim that crypto code is either 100% working, or 0% working. From a crypto theory point-of-view, yes, that is how cryptographers think.
But in practice, there is a VAST difference between an attack that requires 2^32 INTERACTIONS with a remote server and one that requires 2^32 COMPUTATIONS on the attacker's machine. A cryptographer would say both attacks are equally easy (kinda like O(n) notation).
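Back-of-the-envelope numbers make the gap concrete (the rates below are assumptions for illustration, not measurements):

```python
attempts = 2 ** 32

local_ops_per_sec = 1e9   # assumed: one modern core computing hashes locally
requests_per_sec = 100    # assumed: round-trips a remote server will tolerate

local_seconds = attempts / local_ops_per_sec
remote_days = attempts / requests_per_sec / 86400  # 86400 seconds per day

print(f"local:  ~{local_seconds:.0f} seconds")  # a coffee break
print(f"remote: ~{remote_days:.0f} days")       # over a year of traffic
```

Same exponent, wildly different practical cost - which is the commenter's point.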
Just my two cents.
And finally, re: the RNG vulnerability in Cryptocat, that is very bad and just sloppy coding. But even that vulnerability required that the attacker compromise the private SSL key of cryptocat's server. Defense in depth FTW.
[+] [-] dlitz|12 years ago|reply
The problem with crypto is that it's a specialist profession that generalists think they can do. In reality, crypto is more like law than software engineering: You can't just reason it from first principles, and you can't write any tests that will tell you that it's working. You have to know the specific attacks that people are capable of, and come up with a strategy that will avoid not only the current attacks that you know about, but future attacks that haven't been discovered yet. You're tasked with building a system of obstacles that nobody will be able to find any clever workarounds for, and your adversaries are smarter than you, more numerous than you, more well-funded than you, they're experts in breaking crypto, and they're all from the future.
That's not something a handful of engineers working in isolation can do. It takes the whole community years to even come close.
Saying that software engineers can design crypto protocols is like saying that software engineers can be their own lawyers. A few can, but almost all engineers who think they know the law are wrong. Even lawyers routinely lose cases. The best we can do is to try to give them better tools to handle the common cases (e.g. Creative Commons, SSH, better APIs), and remind them to talk to the specialists before getting too creative.
[+] [-] tptacek|12 years ago|reply
[+] [-] tel|12 years ago|reply
The problem I think the author is highlighting is that the mistake risk distribution is hard to understand. Some kinds of off by one errors may cause relatively small reductions in security margins. Some kinds may cause complete loss of system integrity. It's hard to distinguish between the two without extensive testing and expertise. Furthermore, the kind of user who would never tell you about your error is exactly the kind who will find it.
Finally, loss of your security infrastructure is unlikely to cause just small visual discomfort to your users—it's likely to hurt them materially.
[+] [-] noonespecial|12 years ago|reply
[+] [-] tptacek|12 years ago|reply
Compare and contrast.
[+] [-] ww520|12 years ago|reply
[+] [-] graham_king_3|12 years ago|reply
Every developer needs to touch crypto. Encrypted communications needs to be our default. And yes, of course, we should prefer verified, standard algorithms (NSA Suite B, for example).
It's OK to get it wrong, it's OK to fail forward, even with cryptography. ROT13 will protect you very well, if your attack vector is someone glancing over your shoulder for 1 second. As long as the code is open, and you're honest about what it does, you've made people a little bit safer.
There's a fair amount of gloating around Cryptocat, but it protected people's communications from me, because I didn't know how to break it. So that's better than nothing.
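For what it's worth, the ROT13 "shoulder-surfer" threat model mentioned above really is a one-liner in Python:

```python
import codecs

msg = "meet at noon"
obscured = codecs.encode(msg, "rot13")
print(obscured)  # "zrrg ng abba" - defeats a one-second glance, nothing more
assert codecs.encode(obscured, "rot13") == msg  # ROT13 is its own inverse
```

Being honest about what it does is the whole game: this is obfuscation, not encryption.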
[+] [-] EthanHeilman|12 years ago|reply
We shouldn't, but we should provide tools that allow software engineers to securely design applications without having to be crypto experts, in much the same way I can write python code without being a kernel hacker. Two examples spring to mind: Authenticated https api calls and bcrypt. These both work securely without requiring deep knowledge and they are so easy to setup it is unlikely someone will roll their own.
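A sketch of the kind of hard-to-misuse interface being described. This uses Python's standard-library scrypt as a stand-in for bcrypt (the real bcrypt lives in a third-party package; both are deliberately slow, salted password-hashing functions):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> bytes:
    # A random salt per password means identical passwords store differently.
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt + digest

def check_password(password: str, stored: bytes) -> bool:
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    # Constant-time comparison, so timing doesn't leak how close a guess is.
    return hmac.compare_digest(candidate, digest)

stored = hash_password("hunter2")
assert check_password("hunter2", stored)
assert not check_password("hunter3", stored)
```

The caller never touches salts, work factors, or comparisons directly - which is exactly why it's unlikely anyone rolls their own.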
[+] [-] josephlord|12 years ago|reply
One of the problems with security and crypto is that the people who really understand it make fairly weak promises, such as that it is "Pretty Good Privacy", but the incompetent, greedy or malicious make strong marketing claims about the security that they are offering. Emphasis on incompetent in the Cryptocat case.
Crypto is an area where the Dunning Kruger effect[1] seems both especially strong and especially dangerous.
[1] http://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect
[+] [-] tptacek|12 years ago|reply
It does not follow from that sentiment that anyone should be able to jump into the cockpit of a Cessna and just figure things out for themselves.
[+] [-] gyardley|12 years ago|reply
Not if 'nothing' is "don't send the message", rather than the "send the message in the clear" that you're assuming.
Bad crypto gives end users false confidence in the security of their messages. They then send messages they normally wouldn't, and suffer the consequences when those messages end up being read by others.
Amateurs can play with crypto all they like for fun, but they have no business releasing a product to end users.
[+] [-] 16s|12 years ago|reply
http://16s.us/FreeOTP/nsa/
The math behind OTP is pretty simple, but I may have made a mistake. I've posted the source code to crypto forums, HN, wilders security and emailed it to several prominent crypto developers/experts. No one seems to care or want to look at it in-depth.
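The "pretty simple math" of a one-time pad is just XOR against a truly random key as long as the message, used once (a sketch, not this commenter's code; as the reply below notes, key management, not the math, is the hard part):

```python
import os

def otp_encrypt(plaintext: bytes):
    # The key must be truly random, exactly as long as the message,
    # and never, ever reused - that's where OTP systems fall down.
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is self-inverting: applying the same key recovers the plaintext.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct) == b"attack at dawn"
```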
I've worked as a dev where we encrypted research data using standard, industry-accepted crypto (RSA and symmetric AES, etc) as well.
Having said that, I'm not an expert. And the real experts (Bruce Schneier) won't verify anything. They just say, "It's not been broken yet, which is a good indication."
Dive in and write some crypto code. Do it and make mistakes. That's how you learn. Not every program is life or death/mission critical. And if you never do it, you won't learn how. We learn by making mistakes.
Tarsnap is nice crypto code if you like C. It's easy to read too. I have no relation to Colin Percival or tarsnap. He's an expert, and even he makes mistakes:
http://www.daemonology.net/blog/2011-01-18-tarsnap-critical-...
[+] [-] jessaustin|12 years ago|reply
I'm not an expert, but I wouldn't look at an OTP implementation either. Key management is already hard enough. It's hard to imagine a useful system you could build on a one-time pad.
[+] [-] davidw|12 years ago|reply
[+] [-] obituary_latte|12 years ago|reply
[+] [-] tptacek|12 years ago|reply
If we were talking instead about people operating their own pharmacies, or doing their own home electrical work, nobody would bat an eyelash.
[+] [-] Shank|12 years ago|reply
[+] [-] unknown|12 years ago|reply
[deleted]
[+] [-] area51org|12 years ago|reply
I first encountered this in a classic essay by PGP's Phil Zimmerman: http://www.philzimmermann.com/EN/essays/SnakeOil.html
edit: fix incorrect link
[+] [-] danielweber|12 years ago|reply
I'm reminded of Yeats:
Anyone confident in their crypto is dangerous.
[+] [-] stcredzero|12 years ago|reply
Actually, most problems are at the security protocol level. Implementing those is the hardest.
[+] [-] 23david|12 years ago|reply
So we're in a situation where we have to make imperfect choices. Is there any reliable guide that discusses the current best practices in securing things like ssh, VPN, etc?
I get the feeling that the preference of many of the infosec professionals on HN is that a properly configured SSH tunnel is the only secure way to communicate between machines on the Internet. ( IPSec VPN connections I guess are weaker?) So just use stunnel or autotunnel? Tunneling traffic imo seems like an inelegant hack, but are there any crypto libraries that are reliable?
Crypto is hard, but to be fair it's also not easy to figure out what crypto to use if you want to rely on external libraries or systems like SSH.
[+] [-] mistercow|12 years ago|reply
But the Cryptocat example is frustrating because the code is so bad. Not in a "poorly written" sense, but in a "going about things in a completely insane way" sense. The code generates a random number between 0 and 1 not by dividing a 53 bit random number by 2^53, but by generating 16 decimal digits, with the rationale that 2^53 ≈ 10^16. If the code hadn't been trying to do the wrong thing in the first place, there wouldn't have been an opportunity for that off-by-one error.
My point isn't that you'll be protected from simple classes of errors as long as you're coding sanely, but rather that that is the lesson you teach if you point to Cryptocat as your example. Crypto is hard, but you can't make that point very well by pointing to code written by someone who doesn't know the correct (and incredibly standard) solutions to common non-crypto problems.
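For contrast, the "incredibly standard" construction the parent describes takes only a few lines (a sketch of the textbook approach, not the actual Cryptocat code):

```python
import secrets

def uniform01() -> float:
    # 53 bits is the precision of an IEEE-754 double, so every value
    # of the form k / 2**53 in [0, 1) is representable and equally likely.
    return secrets.randbits(53) / 2**53

x = uniform01()
assert 0.0 <= x < 1.0
```

No decimal digits, no 2^53 ≈ 10^16 approximation, and no room for the off-by-one that bit Cryptocat.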
[+] [-] schrodingersCat|12 years ago|reply
I completely agree. Ever seen a password hash mega-post on pastebin? Brought to you by bad crypto. Saying "I don't know" is one of the hardest things to do, and also one of the most powerful. I applaud you for knowing your limits in this regard. I like to say crypto is like surgery, best left for the experts. You never want to say "I kinda messed up that triple bypass," because there are major consequences for it. The same is true for virtually all aspects of cryptography. Thank you for this post!
Edit: Here's a link to a blog post that was discussed recently on HN that is pertinent: http://www.daemonology.net/blog/2013-06-17-crypto-science-no...
[+] [-] bhitov|12 years ago|reply
[+] [-] pnathan|12 years ago|reply
Of course, the world is littered with cryptosystems that some fool thought would be secure and wasn't. XOR isn't encryption. ;)
I would suggest that hackers interested in crypto take a sober and unafraid look at the history of crypto and then spend time breaking ciphers, "hacking" in all senses of the word. Read the crypto papers and work on the advanced math until you really and truly grasp "all the things". Pour those 10K hours in....
Crypto is hard. Doing it mostly right requires humility and hard study, and then you have a good chance of being wrong. Nothing to be afraid of if you understand the situation and understand the years and years it takes to really become good at it...
[+] [-] betterunix|12 years ago|reply
[+] [-] dfc|12 years ago|reply
[+] [-] tptacek|12 years ago|reply
[+] [-] tstactplsignore|12 years ago|reply
[+] [-] stcredzero|12 years ago|reply
As programmers, we are dealing with interfaces and movement/copying of data. Security deals with epiphenomenal leakage of access and data. It is not programming, though it involves programming.
[+] [-] santosha|12 years ago|reply
[+] [-] dclusin|12 years ago|reply
At least with cryptocat, everyone now knows about the security issues and can take precautions as if their entire encrypted communications have been compromised. With proprietary solutions you have to rely on an honor system where you hope the vendor tells you there is an issue.
[+] [-] billforsternz|12 years ago|reply
[+] [-] kGrange|12 years ago|reply
It wouldn't be a panacea for bad crypto, and it does create a risk of people thinking "oh, it passed all of the tests, it must be secure" while still getting the overall implementation wrong. But I still think it would mitigate these "foolish/easy" errors and allow devs to focus on proper overall implementation.
Or does something like this already exist?
[+] [-] loup-vaillant|12 years ago|reply
This is going to be costly…
[+] [-] motters|12 years ago|reply
[+] [-] thewarrior|12 years ago|reply
[+] [-] NegativeK|12 years ago|reply
Spend a long time in the community. Listen closely to the people who have a good reputation. Have your code publicly vetted before it's ever used. Expect your code to be torn apart.
[+] [-] pasquinelli|12 years ago|reply
i don't know the details, but why couldn't you just collect a million bits of supposedly random data from the generator and run it through the nist test suite? it's true that testing for randomness does require complicated thinking, but fortunately it's been done. standing on the shoulders of giants and whatnot.
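As a taste of what that suite checks, here is its simplest check, the frequency (monobit) test from NIST SP 800-22, in a few lines (one test of over a dozen in the full suite; passing it says very little on its own):

```python
import math
import secrets

def monobit_pvalue(bits) -> float:
    # NIST SP 800-22 frequency (monobit) test: a random stream should
    # have roughly as many 1s as 0s. Convert to +1/-1, sum, normalize.
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))

bits = [secrets.randbits(1) for _ in range(1_000_000)]
p = monobit_pvalue(bits)
# NIST's convention: the stream passes this test if p >= 0.01.
print(f"monobit p-value: {p:.4f}")
```

A heavily biased stream (say, all 1s) fails immediately, but a backdoored generator can easily pass every statistical test, which is why these suites catch gross implementation bugs rather than certify security.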