
aszantu | 10 months ago

Funny thing about Asimov was how he came up with the laws of robotics and then wrote cases showing how they don't work. There are a few that I remember, one where a robot was lying because a bug in its brain gave it empathy and it didn't want to hurt humans.


nitwit005|10 months ago

I was always a bit surprised other sci-fi authors liked the "three laws" idea, as it seems like a technological variation on other stories about instructions or wishes going wrong.

buzzy_hacker|10 months ago

Same here. A main point of I, Robot was to show why the three laws don't work.

nthingtohide|10 months ago

Narratives build on top of each other, which is how complex narratives get constructed. It's also the reason Family Guy can speedrun through all the narrative arcs developed by culture in a 30-second clip.

Family Guy Nasty Wolf Pack

https://youtu.be/5oW9mNbMbmY

The perfect wish to outsmart a genie | Chris & Jack

https://youtu.be/lM0teS7PFMo

pfisch|10 months ago

I mean, now we call the three laws "alignment", but it honestly seems inevitable that it will go wrong eventually.

That of course isn't stopping us from marching forwards though in the name of progress.

nix-zarathustra|10 months ago

>he came up with the laws of robotics and then cases on how they don't work. There are a few that I remember, one where a robot was lying because a bug in his brain gave him empathy and he didn't want to hurt humans.

IIRC, none of the robots actually broke the laws of robotics; rather, they ostensibly broke the laws, but investigation later revealed they had been following them all along because of some quirk.

hinkley|10 months ago

And one that was sacrificing a few for the good of the species. You can save more future humans by killing a few humans today who are causing trouble.

pfisch|10 months ago

Isn't that the plot of Westworld season 3?

kagakuninja|10 months ago

In the Foundation books, he revealed that robots were involved behind the scenes, and were operating outside of the strict 3 laws after developing the concept of the 0th law.

>A robot may not harm humanity, or, by inaction, allow humanity to come to harm

Therefore a robot could allow some humans to die, if the 0th law took precedence.

creer|10 months ago

Good conceit or theme by an author - on which to base a series of books that will sell? Not everything is an engineering or math project.

soulofmischief|10 months ago

That is still one of my favorite stories of all time. It really sticks to you. It's part of the I, Robot anthology.