anton_tarasenko | 7 years ago

"Algorithmic justice" reminded me of a study where researchers predicted the risk of a crime better than judges:[1]

> Millions of times each year, judges must decide where defendants will await trial—at home or in jail. By law, this decision hinges on the judge’s prediction of what the defendant would do if released. This is a promising machine learning application because it is a concrete prediction task for which there is a large volume of data available. Yet comparing the algorithm to the judge proves complicated. First, the data are themselves generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the single variable that the algorithm focuses on; for instance, judges may care about racial inequities or about specific crimes (such as violent crimes) rather than just overall crime risk. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: a policy simulation shows crime can be reduced by up to 24.8% with no change in jailing rates, or jail populations can be reduced by 42.0% with no increase in crime rates.

[1] https://www.cs.cornell.edu/home/kleinber/w23180.pdf
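
For the curious, here is a toy sketch of the two issues the abstract flags: the selective-labels problem (outcomes are only observed for released defendants) and the policy simulation (jail the same fraction of defendants, ranked by predicted risk). Everything below is synthetic data with made-up coefficients, not the paper's actual data or method.

    # Toy sketch only: synthetic features, made-up coefficients. The real paper
    # uses NYC pretrial records and quasi-random judge assignment, which this
    # toy does not model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 50_000

    # Synthetic case features and a latent "true" risk of pretrial crime.
    x = rng.normal(size=(n, 5))
    true_risk = 1 / (1 + np.exp(-(x @ np.array([0.8, 0.5, 0.3, -0.2, 0.1]) - 1.0)))

    # Judges detain roughly the 30% of cases they deem riskiest (noisy signal).
    judge_score = true_risk + rng.normal(scale=0.15, size=n)
    detained = judge_score > np.quantile(judge_score, 0.70)
    released = ~detained

    # Latent outcome. In real data it is observed only for released defendants
    # (the selective-labels problem); in this simulation we know it for everyone.
    crime = rng.random(n) < true_risk

    # Train the risk model on released defendants only, as the data would force us to.
    model = LogisticRegression().fit(x[released], crime[released])
    pred_risk = model.predict_proba(x)[:, 1]

    # Policy simulation: jail the same fraction of defendants, ranked by predicted risk.
    algo_detained = pred_risk > np.quantile(pred_risk, 0.70)

    def release_crime_rate(detain_mask):
        # Crime rate among defendants the policy releases. Knowable here only
        # because the data is synthetic; with real data this is exactly what
        # requires the paper's econometric strategies.
        return crime[~detain_mask].mean()

    print("judge policy crime rate:    ", release_crime_rate(detained))
    print("algorithm policy crime rate:", release_crime_rate(algo_detained))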

gbrown|7 years ago

The authors note that judges may care explicitly about racial bias, but based on a quick read they're making a really, really big mistake in the language they're using: they conflate arrests with crime. Arrests and convictions are just a measurement mechanism for crime, and that measurement is known to have severe biases.
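
To make the measurement point concrete, here is a toy simulation with entirely made-up numbers: two groups with identical offense rates, where one group's offenses lead to arrest twice as often. A model trained on arrest labels would see a "risk" gap that does not exist in behavior.

    # Entirely made-up numbers, just to illustrate the label problem: two groups
    # with identical offense rates, but one is arrested twice as often.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    group = rng.integers(0, 2, size=n)      # two groups, same true behavior

    offense_rate = 0.10                     # identical true offense rate
    offended = rng.random(n) < offense_rate

    # Hypothetical enforcement gap: offenses in group 1 lead to arrest twice as often.
    p_arrest = np.where(group == 1, 0.60, 0.30)
    arrested = offended & (rng.random(n) < p_arrest)

    for g in (0, 1):
        mask = group == g
        print(f"group {g}: true offense rate {offended[mask].mean():.3f}, "
              f"arrest rate {arrested[mask].mean():.3f}")

    # A model trained on 'arrested' as its target would learn roughly double
    # the "risk" for group 1 despite identical offending.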

lamy2000|7 years ago

Kleinberg uses arrests for violent crime because those are known to have substantially less bias, if not zero bias.

reader5000|7 years ago

There is no convincing evidence that arrest rates "severely" overestimate offense rates; if anything, it is just as likely that arrest rates underestimate them.

_v7gu|7 years ago

Until the populace learns how to improve their chances of getting released and starts to game the system, introducing endogeneity.

(Also noticed the nice coincidence of a professor with the user name kleinber having an NBER Working Paper.)

candiodari|7 years ago

Which is totally not possible with judges, right?

I feel like a lot of the arguments being made here fail the A-vs-B test. Any argument that purports to help with choosing between judges and algorithms needs to apply to judges differently than it applies to algorithms.

How about: with judges we simply won't know (for sure) what influences them. Are they racist? Who knows. Do they prefer to let people with jobs out? (Realistically: yes, but we don't know for sure.) Do they ...

With an algorithm we can literally test: present it with artificial cases, lots of them, and see how it decides. With a judge, you can't.
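
Concretely, something like this hypothetical audit. The model and the feature layout are stand-ins, not anything from the paper, but the idea is to vary one attribute while holding everything else fixed and watch the detention rate.

    # Hypothetical audit: present a model with artificial cases that differ in
    # exactly one attribute, hold everything else fixed, and see whether its
    # decision changes. The model and feature layout are stand-ins, not
    # anything from the paper.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    # Stand-in risk model trained on made-up data (5 features, binary outcome).
    X = rng.normal(size=(5_000, 5))
    y = rng.random(5_000) < 1 / (1 + np.exp(-X[:, 0]))
    risk_model = LogisticRegression().fit(X, y)

    def audit_attribute(model, base_cases, attr_index, values, threshold=0.5):
        # Detention rate for each value of one attribute, all else held fixed.
        rates = {}
        for v in values:
            cases = base_cases.copy()
            cases[:, attr_index] = v
            rates[v] = (model.predict_proba(cases)[:, 1] > threshold).mean()
        return rates

    base = rng.normal(size=(10_000, 5))
    print(audit_attribute(risk_model, base, attr_index=0, values=[-1.0, 0.0, 1.0]))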

dagw|7 years ago

> Until the populace learns how to improve their chances of getting released

As long as that correlates with behavior we want to see from the populace: https://xkcd.com/810/

will_brown|7 years ago

> By law, this decision hinges on the judge’s prediction of what the defendant would do if released.

That isn’t the legal standard for release pending trial.

Thus it may look like an algorithm “predicts” crime better than judges, but judges can’t withhold bail/bond because they “predict” that a certain defendant will commit another crime (because that’s not exactly what judges are doing when determining bond).