top | item 32223458


catchclose8919 | 3 years ago

It's... unpredictably far away! We're so bad at estimating these kinds of things that whether it's tomorrow or 500 years from now, I'm convinced we'll be totally surprised and unprepared, and even our sincere efforts to steer things down a good path (e.g. OpenAI and such...) will be either worthless or detrimental!

We should simply pursue "organic growth", e.g. grow as many people as we can care for and educate properly. Unfortunately our elites seem really hell-bent on keeping the social landscape hypercompetitive, so the resources we have don't really get distributed... and as a consequence, of course, (rational) people have fewer children! There's also an intellectual war being fought against the "traditional values" that enabled things like extended families to exist and made child rearing at least bearable by sharing the load of caring for the nasty little critters.


sim1collins | 3 years ago

"Unpredictably far away"—nice way of putting it.

Whether we "need" humans depends on where one's values lie; for those who would like humans in some form to continue to exist, the idea of AGI making humans obsolete isn't terribly comforting. Even those who might really like the idea of humans sort of evolving into AI (and eventually ceasing to biologically exist) might not be keen to rush toward human obsolescence if they're not thrilled with initial versions of AGI (diversity and optionality are nice).

catchclose8919 | 3 years ago

I only care about making sure that all essential and valuable human characteristics, perspectives, thought patterns and values get carried on to whatever the next thing is :) Extinction of bio-humans can be perfectly fine if it happens without information loss imo.

EDIT+: Counterintuitively, it might be more beneficial to have way more people in the (maybe short) era before the passing of bio-humans, to increase the probability that as many as possible of the valuable human ideas get passed on to our AGI descendants! If AGI is finally achieved by e.g. "three middle-aged white guys surviving some apocalypse in some bunker", then that AGI will only have and carry on the values and mindsets of those "three middle-aged white guys"! Ironically, "unification under one umbrella" of AGI research efforts, like e.g. the OpenAI folks try to do, increases the likelihood of the "three middle-aged white guys" fathering a narrow-minded AGI, instead of encouraging a diversified competitive landscape...

We seem to be handling the upcoming birth of AGI just as well as we "handled" this pandemic...