top | item 10874296


masterzora | 10 years ago

> Schneier is discussing an unpleasant fact; unbiased algorithms often discover that things we previously attributed to bias were actually unbiased predictors.

That's an interesting interpretation. To me it looks more like he's discussing how bias can (and probably will) be baked into algorithms, even unintentionally and often subtly. And that highlights a significant difficulty with the "unbiased predictors" claim: can you distinguish between an unbiased algorithm discovering that something that looks like bias isn't, and bias being subtly baked into the algorithm? I think that's where the "we need to understand what we expect out of the algorithms and ensure the expectations are met" part comes in, at least in part.
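One standard way to probe that question is to condition on the candidate predictor: if an apparent group disparity vanishes once you hold the legitimate variable fixed, the variable, not the group, is doing the predictive work. A toy sketch with synthetic data (all names, incomes, and thresholds here are illustrative assumptions, not real lending data):

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical setup: repayment depends ONLY on income, but group
# membership is correlated with income, so raw group rates differ.
def make_applicant():
    group = random.choice(["A", "B"])
    income = random.gauss(60 if group == "A" else 50, 10)
    repaid = random.random() < min(max(income / 100, 0.0), 1.0)
    return group, income, repaid

applicants = [make_applicant() for _ in range(20000)]

# Unconditional repayment rates differ by group...
def rate(g):
    return mean(r for grp, _, r in applicants if grp == g)

print(rate("A"), rate("B"))  # A is noticeably higher than B

# ...but within a narrow income band the gap largely disappears,
# which is what "looks like bias but is a predictor" looks like.
band = [(g, r) for g, inc, r in applicants if 50 <= inc < 60]

def band_rate(g):
    return mean(r for grp, r in band if grp == g)

print(band_rate("A"), band_rate("B"))
```

The converse case is the subtle one masterzora is pointing at: if the gap persists after conditioning on every legitimate variable you can measure, either the group label is leaking in through a proxy or a real predictor is missing, and the data alone cannot tell you which.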



yummyfajitas | 10 years ago

Definitely - one makes more money than the other. If a bank is biased against some group, it is turning away profitable customers. This is also purely a statistical problem: once some quant discovers they can make more money by fixing the bias, they'll do it.
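The "bias costs money" claim can be made concrete with a toy simulation: a biased rule that holds one group to a stricter cutoff rejects applicants who are profitable in expectation, so it earns less than the break-even rule. All payoffs and thresholds below are assumed for illustration:

```python
import random

random.seed(1)

INTEREST, LOSS = 0.2, 1.0  # assumed gain per repaid loan, loss per default

def profit(policy, applicants):
    """Total realized profit from loans the policy approves."""
    total = 0.0
    for group, p in applicants:
        if policy(group, p):
            total += INTEREST if random.random() < p else -LOSS
    return total

# Each applicant: (group, true repayment probability).
applicants = [(random.choice("AB"), random.uniform(0.5, 1.0))
              for _ in range(50000)]

# Break-even: approve when p*INTEREST - (1-p)*LOSS > 0.
CUTOFF = LOSS / (INTEREST + LOSS)  # = 0.8333...

def fair(group, p):
    return p > CUTOFF

def biased(group, p):
    # Same rule for A, but an arbitrarily stricter bar for B.
    return p > (CUTOFF if group == "A" else 0.95)

# The biased rule forgoes profitable group-B loans and earns less.
print(profit(fair, applicants), profit(biased, applicants))
```

The quant's fix is exactly the move from `biased` to `fair`: every group-B applicant with repayment probability between the break-even cutoff and 0.95 is money left on the table.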

Understanding what to expect out of the algorithms is absolutely the wrong way to determine this. The fact is that we simply don't know a priori the optimal way to allocate credit; that's why we need an algorithm in the first place.

pron | 10 years ago

Or, having realized that algorithms making predictions from past data are politically conservative (pretty much by definition), we should, if we really want to improve the economy, take political action to change the situation. This is how it has always been done, and it can work, sometimes rather well.