(no title)
dangond | 2 years ago
The only exceptions I can think of to the above rule are viruses and bacteria, which (in most cases) we couldn't exterminate from the face of the earth even if we wanted to. However, it seems to me that sufficient intelligence would bring a good enough understanding of a given bacterial or viral structure to design a chemical that kills that specific organism very effectively.
Overall, the danger from a bootstrapping AI that becomes vastly more intelligent than humans (if that is possible) seems to me to be that we would lose our agency to its whims as it accumulates more and more power.
kaba0 | 2 years ago
Life is a game with some elements where intelligence matters, plenty that are pure luck, and others whose outcome hinges on unknowns (missing data).
Would a super-intelligent AI have a significant advantage in a game of Monopoly, for example? I think many sci-fi scenarios fail to take this into account, especially the data aspect. Humans are already quite intelligent (at the extremes, at least), and any intelligence beyond that may well run into diminishing returns.
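[Editor's note: a minimal Monte Carlo sketch of this point, in Python. The game model, the skill-edge values, and the round count are all hypothetical assumptions for illustration, not anything from the comment. It models a dice-dominated game where one player's "intelligence" is a small fixed per-roll bonus, and estimates how often that edge actually wins.]

    import random

    def play_game(skill_edge: float, rounds: int = 50) -> bool:
        # Each round both players roll a fair die; the "skilled" player
        # also gets a small fixed bonus standing in for better decisions.
        # Returns True if the skilled player's total comes out ahead.
        skilled = naive = 0.0
        for _ in range(rounds):
            skilled += random.randint(1, 6) + skill_edge
            naive += random.randint(1, 6)
        return skilled > naive

    def win_rate(skill_edge: float, trials: int = 20_000) -> float:
        # Monte Carlo estimate of how often the skill edge wins out
        # over the dice noise (std dev ~2.4 per round between players).
        return sum(play_game(skill_edge) for _ in range(trials)) / trials

    for edge in (0.0, 0.05, 0.2):
        print(f"skill edge {edge:.2f} per roll -> win rate ~{win_rate(edge):.2f}")

Under these assumptions, even a sizable per-roll skill edge (0.2 on a die averaging 3.5) only lifts the win rate to roughly 70%, while a small edge barely moves it off 50%: when randomness dominates, extra intelligence buys progressively less.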