item 20056961

"Liability sponge": When algorithms mess up, the nearest human gets the blame

2 points | bnabholz | 6 years ago | technologyreview.com

3 comments


AnimalMuppet | 6 years ago

Do we want to let algorithms be the liability sponge? Do we want humans to say "It's not my fault, blame the algorithm (that I wrote)"?

The algorithms aren't responsible; humans are (because they wrote the algorithms).

bnabholz | 6 years ago

My apologies - when I posted, I shortened the title to fit HN's length restrictions; I've revised it now to be a bit clearer. The implication is that whichever human is closest to the accident is at fault. There is some sense in saying "the human in the car is ultimately responsible," but if that is the case, I think it will kill self-driving cars. Why would you accept the blame for something you didn't "do"?

In my very humble opinion, the classic "corporate IP" rules should extend here: if you write software and that software belongs to your company, the company should bear the liability for that software. Ownership and consequences should stay associated; otherwise you've misplaced the incentive for people and organizations to do the right thing.

It's hard to see a positive outcome. Blaming the driver will make people not want self-driving tech. Blaming the developer will just push developers toward less risky work. Blaming the company is probably the fairest option, but then companies will be less likely to develop the technology.