top | item 28020851


username90 | 4 years ago

They never promised that the output would be error-free, and output with errors is still useful for many applications. The issue you are talking about was fixed as soon as it was discovered, and since then Google has made sure to always diversify its datasets by race. Nowadays it is common knowledge that you need to do this, but back then it wasn't obvious that a model wouldn't generalize across human races, and it is largely thanks to that mistake that everyone now knows it is an issue.


3gg | 4 years ago

It was discovered by others, not by them; they fixed the issue only retroactively, after it was called out in public. This lack of oversight is part of what I mean by applying things with caution.

And why would they have assumed in the first place that the model _would_ generalize across human races, or across any other factor for that matter?