item 19661727


tzar | 6 years ago

Isn't the problem, then, that the software doesn't work well enough with dark skin?


FakeComments | 6 years ago

I’ve seen no evidence of it.

What I’ve seen is differential comparisons, e.g., comparing the rate of white detection to black detection, or the difference in certainty scores between the two — but I’d really appreciate it if people could show me the actual certainty numbers on black faces, so I can see whether the system is failing to recognize them, misrecognizing them, or just less certain than it is on white faces.

got2surf | 6 years ago

The evidence is literally in the original article: "Darker-skinned women were the most misclassified group, with error rates of up to 34.7%. By contrast, the maximum error rate for lighter-skinned males was less than 1%"

Regardless of whether the higher error rate is driven by race, gender, or both, it's still a huge issue. Granted, that study was from a year ago, and other companies have since improved their facial recognition systems. But an overall precision/accuracy/F1 score doesn't mean much when accuracy varies that much by group. Sure, you can market it as "accurate on white males", but you can't market it as "accurate".
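To make the point concrete: a sketch of how an aggregate accuracy number can hide a large per-group disparity. All numbers here are made up for illustration (loosely echoing the article's 34.7% vs. <1% figures); `lighter_male` and `darker_female` are hypothetical group labels, not from any real dataset.

```python
from collections import defaultdict

# Hypothetical evaluation results: (group, classified_correctly) pairs.
# 100 lighter-skinned male faces with 1 error, 100 darker-skinned female
# faces with 35 errors -- figures invented for this sketch.
results = (
    [("lighter_male", True)] * 99 + [("lighter_male", False)] * 1 +
    [("darker_female", True)] * 65 + [("darker_female", False)] * 35
)

# Overall accuracy looks respectable on its own.
overall_accuracy = sum(ok for _, ok in results) / len(results)  # 0.82

# Per-group error rates tell a very different story.
per_group = defaultdict(lambda: [0, 0])  # group -> [correct, total]
for group, ok in results:
    per_group[group][0] += ok
    per_group[group][1] += 1

error_rates = {g: 1 - correct / total
               for g, (correct, total) in per_group.items()}
# lighter_male: 1% error; darker_female: 35% error
```

A single headline metric averages these together, which is exactly why "accurate overall" and "accurate for everyone" are different claims.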