top | item 35735216

Psychlist | 2 years ago

If you have a fast design/architecture, you may never need to optimise the code at all. But the flip side is that with a bad design or bad architecture, optimising the implementation won't save you. With a sufficiently bad architecture, starting again is the only reasonable choice.

I've seen code that does "fast" searches of a tree in a dumb way come out at O(n^10) or worse (at some point you just stop counting), and the solution was not to search most of the tree at all: find the relevant node and follow links from it.
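The commenter's actual code isn't shown, so this is a hypothetical sketch of the general pattern: an exhaustive recursive search re-walks the whole tree on every lookup (and nesting such searches inside each other is how the exponent piles up), while a one-time key-to-node index lets you jump straight to the relevant node and follow its links from there.

```python
class Node:
    """A tree node with parent/child links (illustrative, not the original code)."""
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []

def naive_find(root, key):
    # Exhaustive walk of the whole tree: O(n) per call. Calling this
    # inside loops or inside other searches is how complexity explodes.
    if root.key == key:
        return root
    for child in root.children:
        found = naive_find(child, key)
        if found is not None:
            return found
    return None

def build_index(root):
    # One O(n) pass builds a key -> node map; every later lookup is O(1),
    # and from the node you can follow .parent / .children links directly.
    index = {}
    stack = [root]
    while stack:
        node = stack.pop()
        index[node.key] = node
        stack.extend(node.children)
    return index
```

With the index in hand, `index[key].parent` or `index[key].children` replaces a full-tree search entirely.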

Meanwhile in my day job performance really doesn't matter. We need a cloud system for the distributed high bandwidth side, but the smallest instances we can buy with the necessary bandwidth have so much CPU and RAM that even quite bad memory leaks take days to bring an instance down. Admittedly this is C++ with a sensible design (if I do say so myself) so ... good design and architecture means you don't have to optimise.

jamesfinlayson | 2 years ago

> If you have a fast design/architecture, you may never need to optimise the code at all. But the flip side is that with a bad design or bad architecture optimising the implementation won't save you. With a sufficiently bad architecture starting again is the only reasonable choice.

Yep, completely agree. I worked at a company with a poorly architected high-throughput system that was written in Perl. It got to the point where no further optimisations could make it scale, so it was rewritten. Of course the rewrite in a "faster language" was touted as the reason for its success, but the truth was that the new architecture didn't pound the database anywhere near as much.
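The original system isn't described in detail, so here's a hypothetical illustration of the kind of architectural change that reduces database load regardless of language: issuing one round trip per item versus batching the keys into a single query. `FakeDB`, `query_one`, and `query_many` are stand-ins for a real database client, invented here just to count round trips.

```python
class FakeDB:
    """Stand-in for a real database client; counts round trips."""
    def __init__(self, data):
        self.data = data
        self.round_trips = 0

    def query_one(self, key):
        self.round_trips += 1          # one network round trip per call
        return self.data[key]

    def query_many(self, keys):
        self.round_trips += 1          # a single IN (...) style query
        return {k: self.data[k] for k in keys}

def fetch_each(db, keys):
    # N round trips: the "pound the database" pattern.
    return {k: db.query_one(k) for k in keys}

def fetch_batched(db, keys):
    # One round trip for the whole batch.
    return db.query_many(keys)
```

The per-item version is often what a quick implementation produces; the batched version is the same logic restructured, which is why the rewrite's speedup came from architecture rather than the language.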