
sim1collins | 3 years ago

"Unpredictably far away"—nice way of putting it.

Whether we "need" humans depends on where one's values lie; for those who would like humans in some form to continue to exist, the idea of AGI making humans obsolete isn't terribly comforting. Even those who might really like the idea of humans sort of evolving into AI (and eventually ceasing to biologically exist) might not be keen to rush toward human obsolescence if they're not thrilled with initial versions of AGI (diversity and optionality are nice).

catchclose8919 | 3 years ago

I only care about making sure that all essential and valuable human characteristics, perspectives, thought patterns, and values get carried on to whatever the next thing is :) Extinction of bio-humans can be perfectly fine if it happens without information loss, imo.

EDIT+: Counterintuitively, it might be more beneficial to have way more people in the (maybe short) era before the passing of bio-humans, to increase the probability that as many as possible of the valuable human ideas get passed on to our AGI descendants! If AGI is finally achieved by, e.g., "three middle-aged white guys surviving some apocalypse in some bunker", then that AGI will only have and carry on the values and mindsets of those "three middle-aged white guys"!!! Ironically, "unification under one umbrella of AGI research efforts", like what the OpenAI folks are attempting, increases the likelihood of "three middle-aged white guys fathering a narrow-minded AGI", instead of encouraging a diversified competitive landscape...

We seem to be handling the upcoming birth of AGI just as well as we "handled" this pandemic...