nonbirithm | 2 years ago
If LLMs are eventually regarded by a lot of people as an authoritative source, regardless of whether or not they are, I expect a lot of such cases of "morality laundering" to appear.
jstarfish | 2 years ago
Teslas too: a self-driving car mows down a crowd and nobody is held responsible?
Taking illegal action based on what GPT told you doesn't excuse you pulling the trigger. You pulled the trigger. You go to jail.
Don't normalize "just following orders," or this is going to end predictably.
skygazer | 2 years ago
Most don't think autonomous systems will become safe, or be accepted as safe, until manufacturers are willing to assume liability and indemnify users, as Mercedes, by contrast, has attempted to do.
https://insideevs.com/news/575160/mercedes-accepts-legal-res...
[Edited to note the attempt, so as not to assert success — I don’t think that’s settled]
ahartman00 | 2 years ago
"Federal Judge Kevin Castel is considering punishments for Schwartz and his associates. In an order on Friday, Castel scheduled a June 8 hearing at which Schwartz, fellow attorney Peter LoDuca, and the law firm must show cause for why they should not be sanctioned."
https://arstechnica.com/tech-policy/2023/05/lawyer-cited-6-f...