item 42345953


theendisney4 | 1 year ago

To understand, you would have had to be there. Computers got faster (and then some). In the old days you would write something (or not even bother), then see that it took way longer than desirable. You would rewrite and iterate over possible ways to rewrite. Sometimes you would see the light; other times you would brute-force different approaches. The point where optimization was necessary was completely obvious, and 99% of modern code never needs that consideration.

If communism took over the world, then 30 years later half of these texts wouldn't make sense, since they talk about things tied to capitalism that would no longer exist.


gregjor | 1 year ago

I began my programming career on machines with performance, memory, and storage constraints no one today can imagine. Some of the necessary hacks and shortcuts from back then look like premature optimization and stupid coding today.

The Y2K “problem” gives the canonical example. In a world of vast and very cheap and fast storage, it makes no sense to save two characters in a date. But back in the ‘70s and early ‘80s when I implemented dates like that cutting those two characters over a few million records saved significant money. Disk space used to cost a lot, RAM (or core memory) used to cost a lot more.
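A minimal sketch of the tradeoff described above, assuming a hypothetical fixed-width record layout (the function names and record count are illustrative, not from the original systems): two-digit years save two bytes per record, but date comparisons silently break at the century rollover.

```python
def pack_date_yy(year, month, day):
    """Pack a date as a 6-byte 'YYMMDD' string, as many old systems did."""
    return f"{year % 100:02d}{month:02d}{day:02d}"

def pack_date_yyyy(year, month, day):
    """Pack a date with the full four-digit year ('YYYYMMDD')."""
    return f"{year:04d}{month:02d}{day:02d}"

# Two bytes saved per record; over a few million records that was real money
# when disk and core memory were priced per kilobyte.
records = 3_000_000
savings = records * (len(pack_date_yyyy(1979, 6, 1)) - len(pack_date_yy(1979, 6, 1)))
print(savings)  # 6,000,000 bytes saved

# The Y2K bug: lexicographic comparison of 'YYMMDD' fails across the century.
print(pack_date_yy(1999, 12, 31) < pack_date_yy(2000, 1, 1))
# False -- "991231" sorts after "000101", even though 1999 precedes 2000
```

The same comparison works fine with `pack_date_yyyy`, which is exactly the fix most Y2K remediation applied.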

vacuity | 1 year ago

Computer hardware is very fast now, so why does my computer lag noticeably on OS and browser operations? A facetious question, and perhaps not remotely a dealbreaker, but I expect better and would expect the same of my own software. I agree with GGP that too many people seem to take "premature optimization is the root of all evil" as "don't optimize until it's too late, then painstakingly chase diminishing returns". There is a comfortable middle ground in the optimize-versus-deliver tradeoff that I think is wildly missed.