(no title)
anton_tarasenko | 7 years ago
> Millions of times each year, judges must decide where defendants will await trial—at home or in jail. By law, this decision hinges on the judge’s prediction of what the defendant would do if released. This is a promising machine learning application because it is a concrete prediction task for which there is a large volume of data available. Yet comparing the algorithm to the judge proves complicated. First, the data are themselves generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the single variable that the algorithm focuses on; for instance, judges may care about racial inequities or about specific crimes (such as violent crimes) rather than just overall crime risk. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: a policy simulation shows crime can be reduced by up to 24.8% with no change in jailing rates, or jail populations can be reduced by 42.0% with no increase in crime rates.
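The "we only observe crime outcomes for released defendants" problem the abstract raises (often called selective labels) is easy to see in a toy simulation. This is a hypothetical sketch, not the paper's code: a simulated "judge" releases only lower-risk defendants, so the crime rate measured in the observed data understates the rate for the full population.

```python
# Hypothetical sketch (not the paper's code) of the selective-labels
# problem: we simulate defendants, let a "judge" release only the
# lower-risk half, and compare the crime rate we can observe against
# the rate we would see if everyone were released.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

risk = rng.uniform(0, 1, n)            # true probability of reoffending
crime = rng.uniform(0, 1, n) < risk    # outcome, realized only if released

released = risk < 0.5                  # judge detains the riskier half

overall_rate = crime.mean()            # rate if everyone were released (~0.50)
observed_rate = crime[released].mean() # rate among released only (~0.25)

print(f"rate if all released: {overall_rate:.2f}")
print(f"rate among released:  {observed_rate:.2f}")
# The data generated by prior judge decisions understate risk for the
# full population, which is why counterfactual decision rules are hard
# to evaluate directly.
```

This is why the authors need strategies like quasi-random assignment of cases to judges rather than a naive comparison on the observed data.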
_v7gu|7 years ago
(Also noticed the nice coincidence of a professor with user name klienber having a NBER Working Paper)
candiodari|7 years ago
I feel like a lot of the arguments being made here fail the A vs B test. Any argument that purports to help with choosing Judges vs Algorithms needs to apply differently to judges than to algorithms.
How about this: with judges we simply won't know (for sure) what influences them. Are they racist? Who knows. Do they prefer to let people with jobs out (realistically: yes, but we don't know for sure)? Do they ...
With an algorithm we can literally test, by presenting it with artificial cases, lots of them, and seeing how it judges. With a judge, you can't.
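To illustrate the point above: because an algorithm is just a function, we can audit it with constructed cases. A minimal sketch, where `risk_score` is a made-up stand-in for any black-box bail model (not a real system), probed with pairs of synthetic defendants identical except for one attribute:

```python
# Hypothetical sketch: auditing a black-box risk model by probing it
# with artificial cases -- something we cannot do with a human judge.
import numpy as np

def risk_score(age, prior_arrests, employed):
    # stand-in black-box model (an assumption, not any real system)
    z = 0.04 * prior_arrests - 0.01 * age - 0.5 * employed
    return 1 / (1 + np.exp(-z))       # squash to a [0, 1] risk score

rng = np.random.default_rng(1)
ages = rng.integers(18, 70, 1000)
priors = rng.integers(0, 10, 1000)

# probe: flip only the `employed` flag, hold everything else fixed
gap = risk_score(ages, priors, employed=0) - risk_score(ages, priors, employed=1)
print(f"mean score penalty for being unemployed: {gap.mean():.3f}")
# A systematic positive gap tells us exactly how much this one input
# moves the prediction, across thousands of controlled cases.
```

With a judge, the best you can do is observe a limited number of real cases and try to infer influences statistically; with the algorithm, the counterfactual question is directly answerable.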
dagw|7 years ago
As long as that correlates with the behavior we want to see from the populace: https://xkcd.com/810/
will_brown|7 years ago
That isn’t the legal standard for release pending trial.
Thus it may look like an algorithm “predicts” crime better than judges, but judges can’t withhold bail/bond because they “predict” a certain defendant will commit another crime (because that’s not exactly what judges are doing when determining bond).