top | item 36458582

dangond | 2 years ago

Sure, but if you're a creature that's useful to humans, you'll either get domesticated and lose your freedom or get hunted to near (or total) extinction. Any life on earth with some semblance of intelligence is dominated by us. Dolphins, smart as they are, have no way to use their intelligence to flip the script and become the dominant species; they depend on us not deciding they'd be useful (beyond the ones we take for aquariums).

The only exceptions I can think of are viruses and bacteria, which (in most cases) we couldn't exterminate entirely from the face of the earth even if we wanted to. But it seems to me that sufficient intelligence would allow a deep enough understanding of a given bacterial or viral structure to design a chemical that is very good at killing that specific thing.

Overall, the danger from a bootstrapping AI that becomes vastly more intelligent than humans (if that's possible) seems to me to be that we would lose our agency and become subject to its whims as it accumulates more and more power.

kaba0 | 2 years ago

I read a great comment on HN arguing that super-human intelligence is not that “OP” an advantage, and it really did convince me.

Life is a game with some elements where intelligence matters, plenty that are pure luck, and others that hinge on unknowns (data).

Would a super-intelligent AI have a significant advantage in a game of Monopoly, for example? I think many sci-fi scenarios fail to take this into account, especially the data aspect. Humans are quite intelligent (at the extremes, at least), and anything beyond that may well fall into the category of diminishing returns.