top | item 29087750

murph-almighty | 4 years ago

It's common in fintech for data/ML models to go through a similar review. If you happen to disenfranchise a set of people because your model said not to lend to them, you risk legal jeopardy.

To clarify, I think it's good that this is a practice.

charcircuit | 4 years ago

The whole point of the model is to determine whom not to lend to. By definition you are always going to exclude people.

myownpetard | 4 years ago

There are so many ways you can accidentally systematize racism in software like automated lending.

In the past there were explicitly racist policies like redlining, which produced a historical data set of loan denials concentrated in specific racial groups. If a group has other traits that correlate with race, e.g. the neighborhood they live in, then a model that never sees race as an explicit feature can still be trained on that historical data, pick up some subset of the racially correlated features, and as a result disproportionately exclude people of that race.
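The mechanism can be sketched with made-up synthetic data. Everything here is invented for illustration: the group labels, the neighborhood split, and the 90/10 correlation strength. The "model" is just the decision rule that training on redlining-era labels would converge to, since neighborhood perfectly predicts the historical denials.

```python
import random

random.seed(0)

# Hypothetical synthetic applicants. Race is never a model input,
# but neighborhood correlates strongly with race.
applicants = []
for _ in range(10_000):
    race = random.choice(["A", "B"])
    # Group B is concentrated in neighborhood 1 (the proxy).
    neighborhood = 1 if random.random() < (0.9 if race == "B" else 0.1) else 0
    # Historical label: redlining denied loans in neighborhood 1.
    historically_denied = (neighborhood == 1)
    applicants.append((race, neighborhood, historically_denied))

# A model fit to that history learns to deny by neighborhood,
# because neighborhood perfectly explains the historical denials.
def model(neighborhood):
    return "deny" if neighborhood == 1 else "approve"

# Denial rate per racial group, even though race was never a feature.
def denial_rate(group):
    members = [a for a in applicants if a[0] == group]
    denied = [a for a in members if model(a[1]) == "deny"]
    return len(denied) / len(members)

print(f"denial rate, group A: {denial_rate('A'):.2f}")
print(f"denial rate, group B: {denial_rate('B'):.2f}")
```

Group B ends up denied at roughly the rate of the neighborhood correlation (about 90%) despite race being absent from the feature set, which is exactly the disparate-impact pattern a fairness review is meant to catch.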

murph-almighty | 4 years ago

I should clarify: the point is not to discriminate against a protected class.

criley2 | 4 years ago

Tell that to the legislators and prosecutors who create laws and enforce laws against you.

Thorrez | 4 years ago

Yes, but we should exclude people for valid reasons, not for their race.

londons_explore | 4 years ago

A review doesn't necessarily mean you need to resolve all diversity/inclusion issues. It can merely require that you identify the issues and understand the risks of not resolving them.