Hayarotle | 2 years ago

An AGI being able to wipe out humanity doesn't necessarily mean it can take over the universe. The world's governments are already capable of causing extreme suffering through nuclear war, so catastrophic power doesn't require superintelligence. AGI risk scenarios aren't equivalent to an unbounded intelligence explosion: an AGI only needs to be more powerful than humanity to be a threat. It can be a threat even if it isn't especially intelligent, as long as it gives unprecedented power to a few individuals or governments.

Both humanity and a super-intelligent AGI are bound by the laws of physics. Super-intelligence does not imply omnipotence; it simply means the AGI is orders of magnitude more intelligent than humans. If humans can figure out how to colonize the Milky Way in 90 million years, then the question of why no AGI has already done so has the same answer as the question of why no extraterrestrial species has.

