
Just how bad is OpenSSL? (2012)

51 points | francium_ | 9 years ago | lists.randombit.net

49 comments

[+] aerovistae|9 years ago|reply
Frankly I've never liked man pages. To me they always screamed "This is how documentation was done in the 90s." The examples are often very unclear or incomplete, and the explanations often assume prior knowledge without providing links in case such knowledge is absent.

Modern documentation has gotten way better, as seen in the Stripe docs and many others, and I wish the man pages could be updated accordingly.

[+] dingaling|9 years ago|reply
Yes, man pages are usually upside-down; the examples should be right at the start and then lead to a drill-down into options. 9/10 times I end up having to search the web for a basic introductory example.

But even in big corps with ISO9000 accreditation there is seldom self-questioning as to whether documentation is useful, rather than just ticking the box for process-completeness.

[+] microcolonel|9 years ago|reply
The quality of manpages varies significantly, but many of them are excellent. The Git manpages stand out; most of the system manpages for OpenBSD are excellent as well.

One other nice thing is integration with an editor. I view manpages in emacs, and can yank the snippets directly into my other buffers for extra convenience.

[+] gkya|9 years ago|reply
The quality of the medium is not determined by the quality of its content. GNU manpages in particular are incredibly low-quality (they have info(1) for the extensive documentation), but not all man pages are GNU manpages. And if you use man with shell job control (Ctrl-z, fg, bg), you can easily read multiple man pages simultaneously.
[+] Nullabillity|9 years ago|reply
At least man pages don't break your scrolling.
[+] taeric|9 years ago|reply
And, for myself, I couldn't disagree more. Man pages are very well done, since they are more often used for reference than they are for discovery.

Similarly, documentation seems to have apexed with the TeXbook. :(

[+] geggam|9 years ago|reply
What is your better solution? Instead of being derogatory about a technology that works, how about creating your ideal and seeing if the Internet likes it?
[+] __b__|9 years ago|reply
"For instance it doesn't have everything you need to validate certificates..."

Yet it has all the CA crap thrown in, via the overloaded openssl binary. As "examples". And according to the documentation, not even "correct" illustrations of how libssl should be used.

Encryption and authentication are two separate problems.

Just because you figured out a way to encrypt a message does not mean you have also figured out a way to send it to only the correct recipient... over an insecure network. (Insecure not only in the sense of "plaintext" but in the sense that you are not in control of much of anything - routing, PKI infrastructure, etc.)

It seems to me that one would want to solve the authentication problem first, and then move on to encryption.
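The point that authentication and encryption are separate problems can be sketched with a toy example. This is an illustration only, not anything from OpenSSL, and the function names are made up: a shared-key HMAC challenge-response authenticates a peer without encrypting a single byte.

```python
import hashlib
import hmac
import secrets

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of shared_key without revealing it."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Constant-time check of the peer's response."""
    expected = respond(shared_key, challenge)
    return hmac.compare_digest(expected, response)

key = secrets.token_bytes(32)
challenge = secrets.token_bytes(16)
assert verify(key, challenge, respond(key, challenge))
assert not verify(key, challenge, b"\x00" * 32)
```

No ciphertext is exchanged at any point; the confidentiality problem is untouched, which is exactly the separation being argued for.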

This comment shows that for proponents of using SSL on the public web, it's been the other way around. Authentication was never sorted out.

When it comes to authentication, all due respect to the OpenSSL authors, SSH has provided a better attempt at a solution than any implementation of PKI using SSL/TLS.
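SSH's model here is trust-on-first-use: the client pins the host key's fingerprint on first contact (in ~/.ssh/known_hosts) and rejects any later mismatch. A minimal sketch of that idea, with a hypothetical in-memory store standing in for the known_hosts file:

```python
import hashlib

class HostKeyStore:
    """Hypothetical in-memory stand-in for ~/.ssh/known_hosts."""

    def __init__(self) -> None:
        self._pins: dict[str, str] = {}  # host -> SHA-256 fingerprint

    def check(self, host: str, public_key: bytes) -> bool:
        fp = hashlib.sha256(public_key).hexdigest()
        pinned = self._pins.get(host)
        if pinned is None:
            # First contact: trust and pin the key.
            self._pins[host] = fp
            return True
        # Every later contact must present the same key.
        return pinned == fp

store = HostKeyStore()
assert store.check("example.net", b"host-key-A")      # first use: pinned
assert store.check("example.net", b"host-key-A")      # same key: accepted
assert not store.check("example.net", b"host-key-B")  # changed key: rejected
```

No certificate authority is involved; trust is established per host, which is the contrast with PKI being drawn above.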

And one more thing: how many ciphers does a user really need? As we've heard time and again, many of them are not even "safe" to use. Some of the alternative SSL libraries have wisely removed them. But I guess OpenSSL is append-only?
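The cipher-pruning point can be seen from Python's ssl module, which wraps OpenSSL: the default build still exposes a long cipher list, but a caller can cut it down to a modern subset with set_ciphers. A sketch (the exact suites available depend on the underlying OpenSSL build):

```python
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
# Keep only forward-secret AEAD suites; everything else is dropped.
ctx.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")
names = [c["name"] for c in ctx.get_ciphers()]
# None of the notoriously weak algorithms survive the filter.
assert names
assert not any("RC4" in n or "DES" in n or "NULL" in n for n in names)
```

The library ships the long list; it is left to every application to know it should ask for the short one.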

[+] qwertyuiop924|9 years ago|reply
OpenSSL is pretty bad. After reading about some of the stuff that led to the LibreSSL fork, I wouldn't trust it with my lunch money. Sure, the algorithms are good, but as far as the code's concerned, Heartbleed was the tip of the iceberg.
[+] ktRolster|9 years ago|reply
There's a saying, "don't roll your own crypto," and it's good advice.

In the case of openssl, you might be better off rolling your own. At least the vulnerabilities you end up with are different than the ones that the rest of the world has.

[+] nickpsecurity|9 years ago|reply
The experts-writing-it-for-themselves part seemed inaccurate given what I read in LibreSSL commits. It was one atrocity after another. Still love Ted Unangst's observation about them making sure the endianness of the CPU doesn't change while the protocol is running. Just can't remember how often that check was performed.

"Experts"... lol...
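For reference, byte order is a fixed property of the running CPU, which is what makes re-checking it mid-protocol absurd. A minimal illustration of detecting it once, here in Python rather than OpenSSL's C:

```python
import struct
import sys

def endianness() -> str:
    """Detect native byte order by packing 1 as a native 32-bit int."""
    return "little" if struct.pack("=I", 1)[0] == 1 else "big"

# The answer cannot change while a process (or a protocol run) is alive.
assert endianness() == sys.byteorder
```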

[+] stefs|9 years ago|reply
i interpreted this as: this was written by security experts (cryptographers), not expert programmers. this means the algorithms are generally ok, but the implementation is wacky (and issue-prone).
[+] ori_b|9 years ago|reply
After the string of vulnerabilities, I know that OpenSSL got a wave of investment.

I'm curious how much of this still stands today.

[+] ctz|9 years ago|reply
I got a few thousand dollars of that money as a security bug bounty.

OpenSSL fixed the problem quickly, but one year on still hasn't accepted the regression test for the issue. It would be amusing if it weren't so horrifying.

[+] red_admiral|9 years ago|reply
If you think OpenSSL is bad, try MIRACL (the only documentation I could find is a Word file that's basically a list of function signatures). And OpenSSL at least generally builds fine on a vanilla Ubuntu machine.

In contrast, libsodium deserves praise for writing documentation like they want people to actually use their library.

[+] mSparks|9 years ago|reply
openSSL dates from a time when security was mostly of low importance (not that things have really changed that much - iot, i'm looking at you).

shock horror it shows.

i find it really quite painful that no one seems to be taking this as seriously as it deserves.

the cost of electronic crime must literally be hundreds of billions a year now, simply because we have been denied secure communications from day 1.