top | item 21959834

brycesbeard | 6 years ago

I agree in spirit, but a lot of human behaviors, derived from decisions, don’t have consequences. Did Nestle face consequences for their decisions to promote formula, etc?

denzil_correa | 6 years ago

Let's not confuse the accountability of the decision-making entity with the consequences that entity faces. I was referring to the former, not the latter. The former is a matter of law; the latter is a matter of law enforcement. In the case of Nestlé, you have a clearly assigned accountable entity. We don't have that with algorithmic decision making (yet), where organizations wash their hands of it, saying "Oops! It was an algorithm, we didn't do anything!" Weak law enforcement is not a reason to not have proper laws in place.

wongarsu | 6 years ago

Why would organizations not be responsible for decisions made by an algorithm? After all, they are still the entity that decided to use that algorithm and to execute its decisions.

Sure, they say "Oops! It was an algorithm, we didn't do anything!", but is that any different from management saying "Oops! It was just a rogue employee, we didn't do anything!"? For anything sufficiently consequential or systematic, the second excuse doesn't work, so why should the first?

jotm | 6 years ago

> We don't have that with algorithmic decision making (yet), where organizations wash their hands of it, saying "Oops! It was an algorithm, we didn't do anything!"

That's what individuals within companies do all the time.

You're onto something: form a limited liability company for your AIs and hire them as contractors. There, a clearly assigned accountable entity :)