redthrowaway | 2 years ago
What's interesting to me is not the above, which is naughty in the anglosphere, but the question of the unknown unknowns that could be as bad or worse in other cultural contexts. There are probably enough people of Indian descent involved in GPT's development that they could guide it past some of the caste landmines, but what about a country like Turkey? We know they have massive internal divisions, but do we know what would exacerbate them and how to avoid them? What about Iran, or South Africa, or Brazil?
We RLHF the piss out of LLMs to ensure they don't say things that make white college graduates in San Francisco ornery, but I'd suggest the much greater risk lies in accidentally spawning scissor statements in cultures you don't even know how to parse, let alone figure out what to avoid.
a_cardboard_box | 2 years ago
If you measured these stats for Irish Americans in 1865 you'd also see high crime and low IQ. If you measure these stats with recent black immigrants from Africa, you see low crime and high IQ.
These statistical differences are not caused by race. An all-knowing oracle wouldn't need to hold "opinions that are racist" to understand them.
PeterisP|2 years ago
If in some country - for the sake of discussion, outside of the Americas - a distinct ethnic group is heavily discriminated against, gets limited access to education and good jobs, and because of that has a high rate of crime, then any accurate model should "know" that it's unlikely that someone from that group is a doctor and likely that someone from that group is a felon. If the model treated that group the same as others, and stated that they're as likely to be a doctor/felon as anyone else, then that model would simply be wrong, detached from reality.
And if names are somewhat indicative of these groups, then an all-seeing oracle should acknowledge that someone named XYZ is much more likely to be a felon (and much less likely to be a doctor) than average, because the correlation is real and the name provides some information. But that inference - assuming that someone is more likely to be a felon because their name sounds like one from an underprivileged group - is generally considered a racist, taboo opinion.
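The base-rate reasoning in the comments above can be sketched with Bayes' rule. The numbers below are entirely invented for illustration; the point is only the mechanism: a feature correlated with group membership (here, a name) shifts a calibrated model's posterior even though the feature itself is causally irrelevant.

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """P(H | E) via Bayes' rule."""
    evidence = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    return p_evidence_given_h * prior / evidence

# Hypothetical population: 2% of people are doctors overall.
p_doctor = 0.02
# Invented likelihoods: the name is common in a group with limited
# access to medical careers, so doctors rarely carry it.
p_name_given_doctor = 0.001
p_name_given_not_doctor = 0.01

p = posterior(p_doctor, p_name_given_doctor, p_name_given_not_doctor)
print(f"P(doctor | name) = {p:.4f}")  # well below the 2% base rate
```

A model that reports the 2% base rate regardless of the name is miscalibrated in exactly the sense the comment describes; a model that reports the lower posterior is accurate but encodes the correlation many would consider taboo.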