The funny thing about Asimov was how he came up with the laws of robotics and then wrote cases about how they don't work.
There are a few that I remember, one where a robot was lying because a bug in its brain gave it empathy and it didn't want to hurt humans.
I was always a bit surprised other sci-fi authors liked the "three laws" idea, as it seems like a technological variation on other stories about instructions or wishes going wrong.
Narratives build on top of each other, which is how complex narratives arise. This is also why Family Guy can speedrun through all the narrative arcs developed by culture in a 30-second clip.
>he came up with the laws of robotics and then cases on how they don't work. There are a few that I remember, one where a robot was lying because a bug in his brain gave him empathy and he didn't want to hurt humans.
IIRC, none of the robots broke the laws of robotics; rather, they ostensibly broke the laws but were later found to have been following them because of some quirk.
And one that was sacrificing a few for the good of the species: you can save more future humans by killing a few humans today who are causing trouble.
In the Foundation books, he revealed that robots were involved behind the scenes, and were operating outside of the strict 3 laws after developing the concept of the 0th law.
>A robot may not harm humanity, or, by inaction, allow humanity to come to harm
Therefore a robot could allow some humans to die, if the 0th law took precedence.
nitwit005|10 months ago
buzzy_hacker|10 months ago
nthingtohide|10 months ago
Family Guy Nasty Wolf Pack
https://youtu.be/5oW9mNbMbmY
The perfect wish to outsmart a genie | Chris & Jack
https://youtu.be/lM0teS7PFMo
pfisch|10 months ago
That, of course, isn't stopping us from marching forward in the name of progress.
nix-zarathustra|10 months ago
hinkley|10 months ago
pfisch|10 months ago
kagakuninja|10 months ago
bell-cot|10 months ago
creer|10 months ago
soulofmischief|10 months ago