A very good demonstration of successive improvements to a naive algorithm. For me it was somewhat diminished by the fact that you can actually calculate the n-th Fibonacci number using Binet's closed-form formula (http://en.wikipedia.org/wiki/Fibonacci_number#Closed-form_ex...). You will need arbitrary-precision arithmetic beyond a certain 'n', though, as IEEE 754 will not give you the correct result.
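For concreteness, a quick Python sketch (function names are mine) of where IEEE 754 doubles part ways with exact integer arithmetic:

```python
import math

def fib_exact(n):
    # Iterative Fibonacci using Python's arbitrary-precision integers.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_binet_float(n):
    # Binet's closed form evaluated in IEEE 754 double precision.
    sqrt5 = math.sqrt(5)
    phi = (1 + sqrt5) / 2
    return round(phi ** n / sqrt5)

# Doubles agree for small n but drift once F(n) approaches 2^53.
first_bad = next(n for n in range(200) if fib_binet_float(n) != fib_exact(n))
print(first_bad)
```

On a typical platform the two start disagreeing somewhere around n = 70, once F(n) outgrows the 53-bit mantissa.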
Actually, you just need the ring Z enriched (if this is the right word) with sqrt(5). So instead of one float, you use a pair of arbitrary-precision integers. So
Given the precision needed in the arithmetic, I've always wondered whether the matrix exponentiation method wouldn't be faster. I've never been motivated enough to actually research it, test it, or think hard about it, though...
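For reference, the matrix method is easy to sketch in Python (a minimal, unbenchmarked version — whether it beats exact Binet in practice is exactly the open question here):

```python
def mat_mul(x, y):
    # 2x2 integer matrix product.
    return [[x[0][0]*y[0][0] + x[0][1]*y[1][0], x[0][0]*y[0][1] + x[0][1]*y[1][1]],
            [x[1][0]*y[0][0] + x[1][1]*y[1][0], x[1][0]*y[0][1] + x[1][1]*y[1][1]]]

def fib_matrix(n):
    # [[1,1],[1,0]]^n = [[F(n+1),F(n)],[F(n),F(n-1)]]; repeated squaring
    # gives O(log n) big-integer matrix multiplications.
    result, base = [[1, 0], [0, 1]], [[1, 1], [1, 0]]
    while n:
        if n & 1:
            result = mat_mul(result, base)
        base = mat_mul(base, base)
        n >>= 1
    return result[0][1]
```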
I’d love to see an exact algorithm using the closed form formula. It’s obviously possible to do — though it seems fairly complicated — but would it really be faster? My instinct is that it would be slower: if anyone wants to prove me wrong, I’d be thrilled!
The title is hyperbole (I'm sure that I've written much, much worse algorithms, many times), but the breakdown of Fibonacci sequence algorithms is really enjoyable.
A typical way to gain views and make it look like you know what you're talking about (although the author does seem to understand algorithm development). Hell, comparing it to Bogosort is a stretch. Bogosort is not even a naive algorithm.
Bogosort terminates with probability 1. Is it really reasonable to say it doesn’t guarantee termination? People often do say that about randomised algorithms, which confuses me a little. Probability 1 is as guaranteed as anything probabilistic can reasonably hope to be, isn’t it?
I always find it incredibly difficult to concentrate on comparisons of Fibonacci sequence algorithms when I know for a fact that there is a closed-form expression[1] which gives F(n) in constant time.
In a somewhat similar genre, I enjoyed this paper on computing primes, and some of the very-inefficient algorithms that have been used for such purposes (often mislabeled as the "sieve of Eratosthenes"): http://www.cs.hmc.edu/~oneill/papers/Sieve-JFP.pdf
If you're writing in JavaScript, it's a nice candidate for self-memoizing functions. Obviously this only helps if you call the method numerous times; a single call still produces a series of recursive calls.
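Sketching the idea in Python rather than JavaScript (a closure-based memo standing in for the self-memoizing-function idiom):

```python
def make_fib():
    cache = {0: 0, 1: 1}
    def fib(n):
        # Each value is computed once; subsequent calls are dict lookups.
        if n not in cache:
            cache[n] = fib(n - 1) + fib(n - 2)
        return cache[n]
    return fib

fib = make_fib()
```

The first call for a given n still recurses all the way down; only repeated calls are cheap.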
My personal favorite has always been permutation sort, where you try all possible permutations of a sequence and check if it is sorted.
Yeah, I recalled the name as random sort, and was pleasantly surprised when the bogosort link directed to the same algorithm. I particularly enjoyed the "Quantum Bogosort" algorithm on the Wiki page.
Dunno. IIRC, every time I have seen the "bad" recursive Fibonacci algorithm, it has been followed by the "good" recursive one (bottom-up), which is O(1) in space if your language/implementation does tail-call elimination...
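The accumulator-passing version in question looks something like this (written in Python, which has no tail-call elimination, so the loop form is the honest equivalent):

```python
def fib(n, a=0, b=1):
    # Tail-recursive shape: fib(n, a, b) = fib(n - 1, b, a + b), answer a.
    # Expressed as a loop because CPython does not eliminate tail calls.
    while n:
        n, a, b = n - 1, b, a + b
    return a
```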
It seems like I had to as part of an interview once. Still, I have needed to write recursive functions many, many times and it's good to review some of what can go wrong.
[+] [-] mynegation|15 years ago|reply
[+] [-] stralep|15 years ago|reply
(a,b)+(c,d) = (a+c,b+d)
(a,b)/sqrt(5) = (b,a/5)
(a,b)(c,d) = (ac+5bd,ad+bc)
2phi = (1,1)
F(n) = (2phi^n - (2-2phi)^n)/(2^n*sqrt(5))
[edit] As ot said, the right word is "extended"
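Spelling the arithmetic above out as code (a sketch, using pairs (a, b) to represent a + b*sqrt(5)):

```python
def mul(x, y):
    # (a + b*sqrt5)(c + d*sqrt5) = (ac + 5bd) + (ad + bc)*sqrt5
    a, b = x
    c, d = y
    return (a*c + 5*b*d, a*d + b*c)

def power(x, n):
    # Square-and-multiply in Z[sqrt5].
    result = (1, 0)
    while n:
        if n & 1:
            result = mul(result, x)
        x = mul(x, x)
        n >>= 1
    return result

def fib(n):
    # (1 + sqrt5)^n = a + b*sqrt5 and (1 - sqrt5)^n = a - b*sqrt5 (conjugates),
    # so F(n) = ((2phi)^n - (2 - 2phi)^n) / (2^n * sqrt5) = 2b / 2^n.
    _, b = power((1, 1), n)
    return (2 * b) >> n
```

Everything stays in exact integer arithmetic; no floats ever appear.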
[+] [-] shasta|15 years ago|reply
[+] [-] robinhouston|15 years ago|reply
(The asymptotic complexity is surely the same, in any case.)
[+] [-] mccutchen|15 years ago|reply
[+] [-] sevenproxies|15 years ago|reply
[+] [-] perlgeek|15 years ago|reply
Nonono, Bogosort is way worse than naive recursive Fibonacci - the former doesn't even guarantee termination, while recursive Fibonacci still does.
If you want to calculate fibonacci numbers not as a misguided exercise in algorithms but actually efficiently, use an algebraic form: http://en.wikipedia.org/wiki/Fibonacci_number#Computation_by...
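One such form is the fast-doubling identity F(2k) = F(k)·(2F(k+1) − F(k)), F(2k+1) = F(k)² + F(k+1)², in the spirit of what that page describes (a Python sketch):

```python
def fib_pair(n):
    # Returns (F(n), F(n+1)) via the fast-doubling identities,
    # using O(log n) big-integer multiplications.
    if n == 0:
        return (0, 1)
    a, b = fib_pair(n >> 1)   # a = F(k), b = F(k+1), where k = n // 2
    c = a * (2 * b - a)       # F(2k)
    d = a * a + b * b         # F(2k+1)
    return (d, c + d) if n & 1 else (c, d)

def fib(n):
    return fib_pair(n)[0]
```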
[+] [-] robinhouston|15 years ago|reply
[Edited to add: thanks for the replies. I think I expressed myself poorly here. It’s not that I don’t understand the difference between “with probability 1” and “always”. What I mean is that I don’t understand why people sometimes make a big deal out of it, and say “that algorithm is not even guaranteed to terminate” as though that somehow means the algorithm is untrustworthy or useless. The trouble with the game “roll a die until you get 100 sixes in a row” is that it has an astronomical expected running time – not that it might _never_ end, but that it will almost certainly take a very long time.]
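A back-of-the-envelope check on just how astronomical: the expected number of rolls before the first run of k consecutive sixes satisfies E(k) = 6·E(k−1) + 6, i.e. E(k) = 6 + 6² + … + 6^k.

```python
def expected_rolls(k):
    # E(k) = sum of 6^i for i = 1..k = 6 * (6^k - 1) / 5 (exact division).
    return 6 * (6**k - 1) // 5

print(expected_rolls(1))    # 6 rolls on average for a single six
print(expected_rolls(100))  # roughly 8e77 rolls for a run of 100
```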
[+] [-] qntm|15 years ago|reply
[1] http://en.wikipedia.org/wiki/Fibonacci_number#Closed-form_ex...
[+] [-] Peaker|15 years ago|reply
[+] [-] _delirium|15 years ago|reply
[+] [-] ubasu|15 years ago|reply
http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-11.html...
[+] [-] TheBoff|15 years ago|reply
The key point is that he is trying to calculate arbitrarily large Fibonacci numbers, so the addition is itself an O(n) operation (F(n) has Θ(n) digits). This means that even after memoization the complexity is still O(n^2), and he uses a number of tricks to reduce that.
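That linear digit growth, roughly n·log10(φ) ≈ 0.209·n decimal digits, is easy to check directly:

```python
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# F(1000) has 209 decimal digits, matching 1000 * log10(phi) ~ 208.99.
print(len(str(fib(1000))))
```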
[+] [-] snissn|15 years ago|reply
[+] [-] TheBoff|15 years ago|reply
That's what I call a bad algorithm!
[+] [-] unknown|15 years ago|reply
[deleted]
[+] [-] mwbiz|15 years ago|reply
[+] [-] eru|15 years ago|reply
[+] [-] michaelcampbell|15 years ago|reply
[+] [-] ginkgo|15 years ago|reply
What's nice about it is that it is deterministic yet ridiculously slow.
It can also be really easily implemented in Prolog[1] where you simply define what a permutation and being sorted means. After that you just search for a sorted permutation.
[1] http://rosettacode.org/wiki/Sorting_algorithms/Permutation_s...
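In a language without Prolog's built-in search, the same declarative reading still fits on a few lines (a Python sketch):

```python
from itertools import permutations

def permutation_sort(xs):
    # Deterministically enumerate orderings until one is sorted: O(n! * n).
    return next(list(p) for p in permutations(xs)
                if all(a <= b for a, b in zip(p, p[1:])))
```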
[+] [-] cmaggard|15 years ago|reply
http://en.wikipedia.org/wiki/Bogosort#Quantum_bogosort
[+] [-] masterzora|15 years ago|reply
[+] [-] samuel|15 years ago|reply
[+] [-] georgieporgie|15 years ago|reply
Since the Fibonacci numbers are defined by a linear recurrence, we can express the recurrence as a matrix, and it’s easy to verify that...
It's been way too long since my math minor for me to understand that.
[+] [-] robinhouston|15 years ago|reply
I think it’s only the jargon that’s confusing you. As long as you can remember (or look up) the definition of matrix multiplication, then it genuinely is easy to verify.
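The verification really is mechanical — multiply it out and compare (a throwaway check; names are mine):

```python
def verify(n):
    # Check [[1,1],[1,0]]^n == [[F(n+1), F(n)], [F(n), F(n-1)]] for n >= 1.
    fibs = [0, 1]
    while len(fibs) < n + 2:
        fibs.append(fibs[-1] + fibs[-2])
    m = [[1, 0], [0, 1]]
    for _ in range(n):
        # Right-multiply m by [[1, 1], [1, 0]].
        m = [[m[0][0] + m[0][1], m[0][0]],
             [m[1][0] + m[1][1], m[1][0]]]
    return m == [[fibs[n + 1], fibs[n]], [fibs[n], fibs[n - 1]]]
```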
[+] [-] ignifero|15 years ago|reply
[+] [-] fhars|15 years ago|reply
[Edit:] But there are some interesting data structures based on them: http://en.wikipedia.org/wiki/Fibonacci_heap
[+] [-] te_platt|15 years ago|reply