This is a pretty bad list. To say that the fast inverse square root trick revolutionized computing is a joke. I doubt there's anyone on HN who was unaware of any of these algorithms, or who couldn't name more "revolutionary" ones.
gamegoblin | 12 years ago
No FFT? No simplex algorithm? Hashing? Strassen's?
That inverse square root algorithm is a neat trick, but it did not revolutionize anything.
rumbler | 12 years ago
In addition to the several algorithms already mentioned, I feel that suffix tree and suffix array algorithms should be there as well. They make all kinds of approximate searches feasible in bioinformatics.
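For anyone unfamiliar, the idea is easy to sketch. This is a naive toy in Python (real bioinformatics tools use linear-time constructions and compressed indexes such as the FM-index; this only shows the principle — sort the suffixes once, then any substring query is a binary search):

```python
def suffix_array(s):
    # Naive O(n^2 log n) construction: sort all suffix start positions
    # by the suffix text. Fine for a demo; real tools use faster builds.
    return sorted(range(len(s)), key=lambda i: s[i:])

def contains(s, sa, pattern):
    # Binary search for the first suffix that is >= pattern, then check
    # whether that suffix actually starts with pattern.
    lo, hi = 0, len(sa)
    while lo < hi:
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + len(pattern)] < pattern:
            lo = mid + 1
        else:
            hi = mid
    return lo < len(sa) and s[sa[lo]:sa[lo] + len(pattern)] == pattern
```

The same search structure extends to approximate matching by branching on mismatches, which is what makes it useful for genome-scale queries.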
I'm also not sure isqrt counts as a new algorithm. It's just Newton's method with a bit hack to exploit the floating point format and extract a good starting point.
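For reference, that is exactly what the trick does: reinterpret the float's bits as an integer, subtract from the magic constant 0x5f3759df to get a cheap first guess at x^(-1/2), reinterpret back, then run one Newton step. A Python sketch, with struct standing in for the C pointer cast:

```python
import struct

def fast_inv_sqrt(x):
    # Reinterpret the 32-bit float's bit pattern as an unsigned integer.
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # The magic constant manipulates the exponent bits so the result
    # approximates x**-0.5 (the famous Quake III constant).
    i = 0x5f3759df - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One iteration of Newton's method sharpens the estimate.
    return y * (1.5 - 0.5 * x * y * y)
```

After the single Newton step the result is within roughly 0.2% of the true value, which was good enough for per-vertex lighting math.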
The Fast Fourier Transform absolutely deserves to be on the list.
btilly | 12 years ago
All of signal processing went from "we'd like to do a Fourier transform but can't afford O(n^2), so here is a faster alternative that kind of does something useful" to "we start with an FFT."
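The speedup comes from a divide-and-conquer recurrence: an n-point transform splits into two n/2-point transforms on the even- and odd-indexed samples plus O(n) combining work, giving O(n log n) overall. A minimal radix-2 Cooley-Tukey sketch in Python:

```python
import cmath

def fft(a):
    # Radix-2 Cooley-Tukey: transform the even- and odd-indexed halves
    # recursively, then combine them with the "twiddle" factors.
    # len(a) must be a power of two.
    n = len(a)
    if n == 1:
        return [complex(a[0])]
    even = fft(a[0::2])
    odd = fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out
```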
michaelochurch | 12 years ago
It's an efficient way of applying the Lasso (L^1 penalization) to regression models, which has the benefit, in addition to reducing the risk of overfitting, of producing sparse models.
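(The parent comment isn't preserved in this copy of the thread, so the "it" is lost, but the sparsity mechanism itself is easy to see: coordinate-descent Lasso solvers repeatedly apply the one-dimensional soft-thresholding operator, which snaps small coefficients exactly to zero. A sketch of that operator:)

```python
def soft_threshold(z, lam):
    # Closed-form minimizer of 0.5*(b - z)**2 + lam*abs(b).
    # This is the inner step of coordinate-descent Lasso solvers:
    # any coefficient whose "signal" z is smaller in magnitude than
    # the penalty lam is set exactly to zero -- hence sparse models.
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```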
gamegoblin | 12 years ago
A cool 19th century algorithm is Radix Sort, though.
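For completeness, a least-significant-digit radix sort sketch in Python (byte-at-a-time buckets, non-negative integers only). Each pass is a stable bucket distribution, so after the final pass the whole list is sorted without ever comparing two elements:

```python
def radix_sort(nums):
    # LSD radix sort on non-negative integers, 8 bits per pass.
    if not nums:
        return []
    out = list(nums)
    shift = 0
    while max(out) >> shift:
        # Stable distribution into 256 buckets by the current byte.
        buckets = [[] for _ in range(256)]
        for n in out:
            buckets[(n >> shift) & 0xFF].append(n)
        out = [n for b in buckets for n in b]
        shift += 8
    return out
```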
tlarkworthy | 12 years ago
Neural nets opened a lot of doors.
The Bellman equation is in a lot of equipment.
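For anyone who hasn't seen it: the Bellman optimality equation V(s) = max_a [r(s,a) + gamma*V(s')] can be solved by just iterating the update until it stops changing (value iteration). A toy sketch; the two-state MDP at the bottom is made up purely for illustration:

```python
def value_iteration(n_states, actions, reward, transition, gamma=0.9, tol=1e-8):
    # Iterate the Bellman optimality update for a deterministic MDP:
    #   V(s) <- max_a [ reward(s, a) + gamma * V(transition(s, a)) ]
    # until successive value vectors agree to within tol.
    V = [0.0] * n_states
    while True:
        V_new = [max(reward(s, a) + gamma * V[transition(s, a)] for a in actions)
                 for s in range(n_states)]
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            return V_new
        V = V_new

# Made-up example: two states, action a jumps straight to state a,
# and only state 1 pays reward 1. Fixed point: V(1) = 1/(1-0.9) = 10,
# V(0) = 0.9 * 10 = 9.
V = value_iteration(2, [0, 1],
                    reward=lambda s, a: 1.0 if s == 1 else 0.0,
                    transition=lambda s, a: a)
```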
GFK_of_xmaspast | 12 years ago
SIAM put out a 'ten algorithms of the century' list (https://www.siam.org/pdf/news/637.pdf) a few years ago, and it's really tough to argue with six or seven or eight of them.
(MCMC, simplex, Krylov subspace methods, Householder decomposition, QR, Quicksort, FFT, Ferguson & Forcade's integer relation algorithm (which led to things like PSLQ), and the fast multipole method)
And FORTRAN.