jsinai|5 years ago
To roll with the example "man" ~ "doctor", "woman" ~ "nurse", the harm is having a giant and widely used search engine reinforce baseless gender biases, when there is no underlying reason why women should be nurses and men doctors. What is the harm, you may ask? It may be subtle, e.g., being surprised when you find out your next doctor is a woman or your next nurse is a man. It could suppress career choices and aspirations, and it could even be financial, e.g., by reinforcing systemic pay gaps.
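(For readers unfamiliar with where the "man" ~ "doctor", "woman" ~ "nurse" example comes from: it falls out of word-embedding analogy arithmetic. A minimal sketch, using invented 2-d toy vectors rather than any real model's learned vectors, just to show the mechanism:)

```python
# Toy sketch (invented data, not a real model): how word-embedding analogy
# arithmetic can surface a gender-occupation association. Real systems use
# high-dimensional learned vectors (e.g. word2vec/GloVe); the two toy
# dimensions below stand in for a "gender" direction and a "medical
# profession" direction that a biased corpus could teach a model.
import math

vectors = {
    "man":    [1.0, 0.2],
    "woman":  [-1.0, 0.2],
    "doctor": [0.8, 0.9],
    "nurse":  [-0.8, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Classic analogy arithmetic: man - woman + nurse ≈ ?
query = [m - w + n for m, w, n in
         zip(vectors["man"], vectors["woman"], vectors["nurse"])]

# The query lands nearest "doctor": the association falls out of the
# geometry the training text induced, not from any fact about who can
# or should hold which job.
nearest = max(("doctor", "nurse"), key=lambda w: cosine(query, vectors[w]))
print(nearest)  # doctor
```

The point of the sketch is that the model is only echoing co-occurrence statistics of its training text, which is exactly why a biased corpus yields biased analogies.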
s__s|5 years ago
You’ve essentially created a fictional data set, because it’s biased by the underlying prejudice (a preconceived opinion not based on reason or actual experience) that men ought to be nursing more, despite that not being reality.
We’re in a strange situation where large, concerted efforts by activists aim to inject fiction into our facts (whatever the medium), distorting perceptions in order to somehow correct what they perceive to be injustice in the real world.
visarga|5 years ago
jsinai|5 years ago