item 4954715

BLAKE2, an improved version of the SHA-3 finalist BLAKE optimized for speed

44 points | stalled | 13 years ago | blake2.net

30 comments

[+] drakeandrews|13 years ago|reply
Over the past year and a half I've had it drummed into me that fast hashing is bad and slow hashing is good. If this is so, why have they optimised this new hashing algorithm to be as fast as possible?
[+] tptacek|13 years ago|reply
Password hashes and general-purpose crypto hashes are not the same thing. If it helps, try using the technical term for the kinds of functions cryptography offers for password hashing: key derivation functions (KDFs).
[+] rmccue|13 years ago|reply
To expand on what has already been noted: a good place for fast hashing is when you're hashing millions of objects. For example, if I wanted to parse thousands of feeds and store items using a unique hash as their key, I'd want a hash that's collision resistant (for the uniqueness part) and fast (for the millions of objects). With password hashing (KDFs, as tptacek noted), the collision resistance is still useful, but we want the function to be fairly slow to make brute-force attacks expensive.
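A sketch of that pattern in Python's stdlib, using BLAKE2b as the fast, collision-resistant key for deduplicating items (the item contents here are made-up placeholders):

```python
import hashlib

def item_key(content: bytes) -> str:
    # BLAKE2b is fast and collision resistant; a 16-byte digest is
    # more than enough to keep millions of feed items unique.
    return hashlib.blake2b(content, digest_size=16).hexdigest()

# Duplicate items collapse onto the same key automatically.
items = [b"first post", b"second post", b"first post"]
store = {item_key(i): i for i in items}  # two distinct keys remain
```

Because the hash is deterministic, re-fetching the same item always maps to the same key, which is what makes it usable as a storage key in the first place.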
[+] quonn|13 years ago|reply
Slow hashing is usually bad, except for password-derivation functions.
[+] tsewlliw|13 years ago|reply
Fast hashing is bad for storing one-way functions of passwords because it lets an attacker cheaply make many guesses about the original value from the transformed value; hence all the noise about using slow "key derivation functions" instead. For all other common use cases, faster is better, all else being equal.
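The "deliberately slow" side of this can be sketched with the stdlib's PBKDF2 (the iteration count and salt size here are illustrative choices, not recommendations from this thread):

```python
import hashlib
import os

def hash_password(password: str, salt=None):
    # A KDF deliberately burns CPU: 200,000 PBKDF2-HMAC-SHA256 rounds
    # make each password guess roughly 200,000x more expensive than a
    # single fast hash, while normal logins pay the cost only once.
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

salt, stored = hash_password("hunter2")
# Verification recomputes with the same stored salt:
_, candidate = hash_password("hunter2", salt)
```

The salt ensures two users with the same password get different digests, so an attacker can't amortize one precomputed table across the whole database.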
[+] newman314|13 years ago|reply
Does this mean that this variant will be resubmitted as SHA-3?
[+] dchest|13 years ago|reply
The SHA-3 competition is over; Keccak won, so no.
[+] IheartApplesDix|13 years ago|reply
I'm not sure how reduced memory requirements are a benefit to encryption. Are there really any low end systems still in use today that actually have issues with memory usage? Even $150 notebooks come with 1 gig of ram. This seems more like it would help save ram on interception devices like the Narus Device or the huge datacenters owned by the NSA, which have a huge issue with storing all the data required to intercept and decrypt eavesdropped communications reliably.
[+] theatrus2|13 years ago|reply
The last system where I used SHA-256 had 16KiB of RAM (and 128KiB of direct-mapped flash). Yes, memory use for security algorithms is important.
[+] s353|13 years ago|reply
I think of reduced memory requirements as a benefit to every application, including encryption. That's because nearly every application I use competes for memory; I also use memory as general storage media for stuff for which I want fast I/O and/or don't need to save permanently (e.g. mfs or tmpfs mounts). I think of memory as a precious resource.
[+] dchest|13 years ago|reply
Think smart cards.
[+] Daniel_Newby|13 years ago|reply
> Even $150 notebooks come with 1 gig of ram.

The issue is with the processor's cache memory, which is small, typically 64 KiB or so. (The TLB is also important.) Even a few hundred bytes of savings can give an important improvement in speed and power consumption.

[+] rb2k_|13 years ago|reply
A little off topic, but since it's on the page: is it just me, or does "mebibyte" simply look wrong?

You can't just decide that people aren't supposed to say "megabyte" anymore

[+] dchest|13 years ago|reply
http://en.wikipedia.org/wiki/Mebibyte

It's becoming more common. By the way, if you use OS X, your utilities output megabytes (1,000,000 bytes), not mebibytes. GNU utilities switched to KiB, MiB, etc. (powers of 1024).

[+] idbfs|13 years ago|reply
I agree that it looks odd. However, especially in applications like cryptography, I think that removing the ambiguity of "megabyte" (i.e. do we mean 10^6 or 2^20 bytes?) is worth the introduction of a new term.
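The gap between the two readings is easy to quantify with plain arithmetic on the SI and IEC definitions:

```python
MB = 10**6    # SI megabyte: 1,000,000 bytes (what OS X utilities report)
MiB = 2**20   # IEC mebibyte: 1,048,576 bytes (the traditional "megabyte")

difference = MiB - MB   # 48,576 bytes
ratio = MiB / MB        # 1.048576, i.e. ~4.9% larger
```

A ~5% ambiguity per prefix compounds as you go up: at the tebibyte level the SI and binary readings differ by almost 10%.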
[+] Heliosmaster|13 years ago|reply
As other people pointed out, mebibyte is the correct form. You can't make the wrong version right just by means of usage.