With a moment's thought, even the most emotive among us should see that the mugshots will be part of the training set: the photographed individuals are, after all, the class of true positives.
You train a model on a bunch of photos of white people and only a few photos of black people.
You then deploy that model and use it to match a black person, detained by racist officers, against a database of photos the police already have. In that database, the majority of people are black.
The shitty AI, which was never properly taught what black people look like because most of its training data was white faces, reports a probable match for the detained black person.
The racist officers don't second-guess the computer, so they throw an innocent black person into their car and drive off to the police station.
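To make the failure mode concrete, here's a toy sketch (everything in it, the 8-dimensional "embeddings", the spread values, the match threshold, is made up for illustration, not any real face-recognition system): if a model compresses an under-represented group into a narrow region of feature space because it never learned distinguishing features for them, then matching a stranger from that group against a database produces far more false "probable matches" than it does for the well-represented group.

```python
import math
import random

def embed(person_id, spread):
    # Toy stand-in for a face-recognition embedding: each identity is a
    # point in feature space. A group the model learned well is spread
    # out (distinct identities land far apart); a group it barely
    # learned is compressed into a narrow region, so to the model,
    # strangers from that group all look alike.
    rng = random.Random(person_id)
    return [rng.gauss(0, spread) for _ in range(8)]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

THRESHOLD = 1.0  # declare a "probable match" below this distance

def false_match_rate(spread, n=200):
    # Database of n distinct people; the probe is someone NOT in it,
    # so every match reported here is a false positive.
    db = [embed(i, spread) for i in range(n)]
    probe = embed(n + 1, spread)
    return sum(dist(probe, e) < THRESHOLD for e in db) / n

well_learned = false_match_rate(spread=2.0)    # group well represented in training
poorly_learned = false_match_rate(spread=0.25) # group the model barely saw

print(f"false matches, well-represented group:  {well_learned:.0%}")
print(f"false matches, under-represented group: {poorly_learned:.0%}")
```

The probe person is in neither database, so the correct answer is always "no match"; the under-represented group's compressed feature space is the only thing driving its higher false-match rate.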
codetrotter|2 years ago