top | item 41498146

Silasdev | 1 year ago

Exactly.

It's just a natural maturity curve that every new piece of tech inevitably goes through.

Tech on the maturity curve:

ICE cars: settled

EV cars: rising

PCs: settled

Phones: settled

Smart TVs: still slowly rising

Wearables: rising

Zak | 1 year ago

It seems to me that it's based more on the use cases than the speed of improvements to the technology itself.

There were times when a CPU four times as fast changed what I could do with a PC. For a long time, with Moore's Law in full swing, we reliably got that sort of improvement every three years, and PCs older than that were widely seen as obsolete. Today that would only speed up batch jobs for me and have no impact on any of my workflows.
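The "four times as fast every three years" figure follows directly from the usual paraphrase of Moore's Law as a doubling roughly every 18 months. A quick sketch of that arithmetic (the 18-month doubling period is the commonly cited assumption, not a claim from the comment itself):

```python
def speedup(years, doubling_period_years=1.5):
    """Expected performance multiple after `years` of exponential scaling,
    assuming performance doubles every `doubling_period_years` (18 months
    by default, per the common paraphrase of Moore's Law)."""
    return 2 ** (years / doubling_period_years)

# Three years at an 18-month doubling period gives a 4x speedup,
# matching the comment's "a CPU four times as fast ... every three years".
print(speedup(3))  # 4.0
```

With a more conservative 24-month doubling period, `speedup(3)` drops to about 2.8x, which is why estimates of the obsolescence window varied.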

Some sort of on-device AI thing is probably the next threshold for PCs. I don't think there's anything production-ready and compelling right now, but I can imagine useful automation features when it gets good enough.

trinix912 | 1 year ago

It's also that software back in the day was much more carefully tuned to make the best use of very limited resources, so getting a better CPU would visibly speed things up.

Somewhere in the late 2000s, CPUs got powerful and cheap enough (in terms of cents per MHz) that the focus shifted from having to be creative to get your programs to run at acceptable speed, to not having to, and instead delivering marketable software faster.

The only thing nowadays I can imagine requiring a substantial amount of raw processing power is on-device AI, but that doesn't seem to be the case here, since large parts of the processing are still done in the cloud.