
weinzierl|2 days ago

OP has a good point, but I'd rather we'd skipped the 90s and picked up again much later. I like to think of the 8-bit era as an early bronze age of computing: lots of things went right and were done right.

16-bit, to me, is the dark ages. Lots of confusion, and not much good came out of it technologically or aesthetically. God, everything was ugly. Maybe all the trials and tribulations were necessary for what was about to come, but I like to believe they weren't.

32-bit to me is the golden age and 64-bit is platinum.

If you offered me a time machine to go back, I'd surely say, "No, thank you!" There hasn't been a better time than now, but if you forced me at gunpoint, I'd pick the 80s over the 90s any time.


saulpw|2 days ago

I agree that 32-bit is the golden age, but 64-bit is enterprise bloat. I'd personally take 1995 over 2005, but I think 2005 was a lot better than 2015 in terms of interfaces.

billev2k|2 days ago

I personally think that "peak computing" was around 2005. Maybe 2006.

leptons|2 days ago

There was plenty of amazing stuff going on with computing in the 90s. You just had to know where to look. Do you consider the 68000 CPU to be 16-bit or 32-bit, or both?

weinzierl|2 days ago

The 68000 was introduced in 1979; to me it is part of the 80s.

But you make a good point. Maybe the 8/16/32/64-bit distinction isn't really helpful. I loved the Amiga, but I loathed all the XT/AT segmented-memory, bitplaned-VGA 16-bit stuff. That, to me, is the deep dark ages.
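(For anyone who never had to deal with it: in real-mode 8086 a 20-bit physical address is built as segment * 16 + offset, so the same byte has thousands of segment:offset spellings and nothing bigger than 64 KiB can be touched without juggling segment registers. A minimal sketch of just the arithmetic; the helper name is made up, not from any real header:)

    #include <stdint.h>
    #include <stdio.h>

    /* Real-mode 8086 addressing: 16-bit segment shifted left by 4,
       plus a 16-bit offset, yields a 20-bit (1 MiB) physical address. */
    static uint32_t phys_addr(uint16_t segment, uint16_t offset) {
        return ((uint32_t)segment << 4) + (uint32_t)offset;
    }

    int main(void) {
        /* The same physical byte 0x12345 under two different spellings. */
        printf("%05lX\n", (unsigned long)phys_addr(0x1234, 0x0005));
        printf("%05lX\n", (unsigned long)phys_addr(0x1000, 0x2345));
        return 0;
    }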

mamcx|2 days ago

I instead think the mistake was the internet. If I were a Time Lord, I'd find a way to cripple internet tech (Three-Body Problem style) so it never does better than 56k.

saulpw|2 days ago

I think it's the "always on" nature as opposed to the bandwidth. But maybe one begets the other.