top | item 33451296


metric10 | 3 years ago

FWIW, one thing I learned in practice that wasn't highlighted in my Algorithms course: overhead (the constant C) matters. You can feel good about yourself for choosing an algorithm that scales in O(lg n) time, but if you ignore the cost of each operation (C) you might still be slow.

For example:

1. When n is small, an array is almost always better. Arrays have very little overhead compared to even a hash map.

2. Algorithms with the same O() may still have significant runtime differences, and may balance insert and search costs differently. AVL trees take longer than Red-Black trees to insert, but might be one level better in height. That means one fewer access per lookup. Useful for a routing table, for example.
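The AVL-vs-Red-Black height point can be made concrete. A rough sketch (not from the thread, worst-case bounds only): an AVL tree of height h must contain at least a Fibonacci-like minimum number of nodes, which caps its height near 1.44·log2(n), while the Red-Black guarantee is only about 2·log2(n); in practice the gap is often around one level.

```python
# Sketch: worst-case height bounds for AVL vs Red-Black trees.
# An AVL tree of height h needs at least N(h) nodes, where
#   N(0) = 1, N(1) = 2, N(h) = 1 + N(h-1) + N(h-2)   (Fibonacci-like),
# giving height <= ~1.44*log2(n). A red-black tree only guarantees
# height <= 2*log2(n+1).
import math

def avl_min_nodes(h):
    """Minimum node count of an AVL tree of height h (single node = height 0)."""
    if h == 0:
        return 1
    a, b = 1, 2  # N(0), N(1)
    for _ in range(h - 1):
        a, b = b, 1 + a + b
    return b

def max_avl_height(n):
    """Tallest an AVL tree holding n nodes can possibly be."""
    h = 0
    while avl_min_nodes(h + 1) <= n:
        h += 1
    return h

n = 1_000_000
print("AVL worst-case height:      ", max_avl_height(n))
print("Red-Black worst-case bound: ", math.floor(2 * math.log2(n + 1)))
```

These are worst-case guarantees; a randomly built tree of either kind usually sits well below both bounds.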

So, in summary, if you're looking at other people's code and see lots of arrays, don't get too smug… n is usually small.
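The small-n point above can be sketched as a quick (machine-dependent) benchmark: for a handful of keys, a linear scan over a flat list competes with a dict lookup, because hashing adds a near-constant cost C per operation. The names and sizes here are illustrative, not from the thread.

```python
# Sketch: linear scan over a tiny array vs hash lookup.
# The asymptotics favor the dict, but for small n the per-operation
# overhead (the constant C) can dominate. Timings vary by machine;
# the point is the shape of the comparison, not any specific number.
import timeit

n = 8
keys = [f"key{i}" for i in range(n)]
arr = list(zip(keys, range(n)))          # [("key0", 0), ("key1", 1), ...]
table = dict(arr)

def array_lookup(k):
    for key, value in arr:               # O(n) scan, tiny constant factor
        if key == k:
            return value
    return None

def dict_lookup(k):
    return table.get(k)                  # O(1), but pays for hashing

assert array_lookup("key5") == dict_lookup("key5") == 5

t_arr = timeit.timeit(lambda: array_lookup("key5"), number=100_000)
t_map = timeit.timeit(lambda: dict_lookup("key5"), number=100_000)
print(f"linear scan: {t_arr:.4f}s  dict: {t_map:.4f}s")
```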


adhoc32 | 3 years ago

Also cache matters a lot. If n is small an array is always faster by a big margin.
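One way to feel the layout difference is to compare iterating a contiguous array with chasing pointers through heap-allocated nodes. This is a sketch, not a rigorous benchmark: in Python the gap mostly reflects interpreter and indirection overhead rather than raw cache misses, but the lesson is the same one that dominates in C or C++ (contiguous memory is prefetch-friendly, scattered nodes are not).

```python
# Sketch: summing a contiguous list vs traversing a linked list of nodes.
# The linked version pays one pointer dereference per element, and its
# nodes are scattered across the heap rather than laid out contiguously.
import timeit

N = 10_000
arr = list(range(N))

class Node:
    __slots__ = ("value", "next")
    def __init__(self, value, next):
        self.value = value
        self.next = next

# Build a singly linked list holding the same values.
head = None
for v in reversed(arr):
    head = Node(v, head)

def sum_array():
    return sum(arr)                      # tight loop over contiguous storage

def sum_list():
    total, node = 0, head
    while node is not None:              # pointer chase per element
        total += node.value
        node = node.next
    return total

assert sum_array() == sum_list() == N * (N - 1) // 2

print("array: ", timeit.timeit(sum_array, number=200))
print("linked:", timeit.timeit(sum_list, number=200))
```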

sterlind | 3 years ago

a good example of this is Fibonacci heaps. on paper they're great, but they result in egregious pointer chasing, while radix heaps are less flexible but can be backed by a contiguous array.
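For flavor, here is a toy bucket queue, a simplified cousin of the radix heap mentioned above (this is an illustrative sketch, not the full radix-heap structure). For small integer priorities that only move forward, as in Dijkstra with bounded edge weights, everything lives in one contiguous array of buckets, so extraction is a forward scan with no pointer chasing.

```python
# Sketch: a monotone bucket queue backed by a contiguous array.
# Priorities must be non-decreasing over time (as in Dijkstra's algorithm),
# which lets the extraction cursor only ever move forward.
class BucketQueue:
    def __init__(self, max_priority):
        self.buckets = [[] for _ in range(max_priority + 1)]
        self.cursor = 0                  # monotone: never moves backward

    def push(self, priority, item):
        assert priority >= self.cursor, "monotone queue: priority too low"
        self.buckets[priority].append(item)

    def pop(self):
        # Scan forward through the contiguous bucket array.
        while self.cursor < len(self.buckets) and not self.buckets[self.cursor]:
            self.cursor += 1
        return self.buckets[self.cursor].pop()

q = BucketQueue(max_priority=10)
q.push(3, "c")
q.push(1, "a")
q.push(1, "b")
print(q.pop(), q.pop(), q.pop())  # priority order; ties pop LIFO
```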

weirdly, in all my recent algorithm work, only the big-O has mattered (or it's all just NP-hard or even EXPTIME.)