Seems like more of a bug than bias. The problem is that it ignores the person's appearance in the first place. It's a statistical model, and of course there are more black rappers and white investment bankers. If it noticed that the person was white to begin with, and applied that trait, it wouldn't have to guess about the race at all.
prododev|1 year ago
Yes, this is what the author is pointing out - there's a statistical bias in the dataset that shows up in the results.
gruez|1 year ago