I think there's a theory out there that if something can't die, it's more of a "library" than "immortal," because being born and dying (and the fact that sharing resources with another living thing may mean sharing, or shortening, your one finite life for another) is essential to any social bonding. So a machine that has obtained all the knowledge of the universe and is able to act on that knowledge is still just a library with controllers attached, a concept no more sophisticated than a thermostat. In the end, if synthetic superintelligence brings about the end of mankind, it'll be because a human programmed it to do so: more of a computer virus than a malevolent synthetic alien entity. A digital nuclear bomb.