Philadelphia | 2 months ago

We never had anything different, though. Computers always became so obsolete after a while that there was no longer any point in upgrading them. I think I got eight years out of my 1997 Power Mac G3, including a CPU upgrade to a G4, RAM upgrades, hard disk upgrades, a video card, and USB expansion, but by then the new machines coming out were just so much better that putting money into more upgrades was tossing it into a black hole.

WackyFighter | 2 months ago

Maybe in the late 90s and early 2000s. These days hardware from over a decade ago works fine. I am typing this comment on a 2011 Dell E6410. Install Debian or Arch Linux and the machine is surprisingly capable. Checking htop just now, I am using 2.5 GB of RAM (out of 8 GB) and the CPU is at 2%.

TBH, I have a Ryzen 5950X-based tower, and while it is faster than my previous desktop, which was an i7-4790K (or whatever it is), the previous machine would have been fine. I am not even sure why I upgraded.

trinsic2 | 2 months ago

I guess it's a byproduct of a faster-moving technology curve. 20 years ago you didn't need to replace the entire platform for at least 10 years.

vel0city | 2 months ago

20 years ago I was hopping from Intel to AMD and then back to Intel. After that, practically every decent jump in CPU performance on the Intel side meant a new socket (LGA775, 1156, 1155, 1150, 1151...). AMD typically kept sockets around longer but wasn't as competitive until Ryzen, and even Ryzen had a few breaks in chipset compatibility along the way.

In the last 20 or so years, if I wanted a CPU a few years newer for whatever reason, it usually meant I needed a whole new motherboard, and that often (but not always) also meant new RAM.