
Machine Bias: Man Is to Computer Programmer as Woman Is to Homemaker?

15 points|acoravos|9 years ago|fatml.org|reply

53 comments

[+] backpropaganda|9 years ago|reply
If I were training a classifier to predict whether a sentence is talking about household activities or not, wouldn't the occurrence of man/woman in the sentence be a good feature? Today, women do perform household activities more (whether we like it or not), and wouldn't it make sense to use that piece of information when performing some predictive analysis?

The technical sense of "bias" arises when the train and test distributions differ. Obviously, if you train on a dataset of text from a foreign country's news and then apply the model in an American context, the difference in the data distributions will introduce bias, but why do we need a social twist on this already well-functioning term? If the same classifier is trained and evaluated in India (with its sexist roles, say), then there's no (technical) bias, and I don't see why it's a bad application.
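
To make the technical point concrete, here is a minimal sketch in scikit-learn. The toy corpus, labels, and test sentences are invented for illustration, not taken from any real dataset:

    # Toy sketch: a bag-of-words classifier trained where "woman"
    # co-occurs with household sentences learns that word as a
    # predictive feature; under a shifted test distribution the same
    # feature misleads it. All data here is made up.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    train_texts = [
        "the woman cooked dinner for the family",  # household
        "the woman cleaned the kitchen",           # household
        "the man wrote code all day",              # not household
        "the man debugged the server",             # not household
    ]
    train_labels = [1, 1, 0, 0]

    vec = CountVectorizer()
    X = vec.fit_transform(train_texts)
    clf = LogisticRegression().fit(X, train_labels)

    # Same distribution as training: the gendered feature "works".
    print(clf.predict(vec.transform(["the woman swept the floor"])))       # [1]

    # Shifted distribution: gender no longer tracks the label, and the
    # learned feature now points the wrong way.
    print(clf.predict(vec.transform(["the woman deployed the service"])))  # likely [1]

The second prediction is exactly the train/test mismatch described above: nothing in the model is broken, it simply reflects the data it was given.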

[+] praxulus|9 years ago|reply
> wouldn't it make sense to use that piece of information when performing some predictive analysis?

No, because eventually your system will graduate from predicting the results of society's bias to reinforcing society's bias. That is a bad thing.
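
As a stylized illustration of that loop (all numbers invented): suppose the model's skewed labels are recycled into its next training set, nudging the skew a bit further each round.

    # Stylized feedback loop, all numbers invented: model labels are
    # fed back as training data, compounding an initial 60/40 skew.
    share = 0.60  # round-0 fraction of "woman" sentences labeled "household"
    for r in range(1, 9):
        # Assume the model resolves ambiguous cases toward the majority
        # association, pushing the skew 10% further past even odds.
        share = min(1.0, 0.5 + 1.1 * (share - 0.5))
        print(f"round {r}: {share:.3f}")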

[+] xupybd|9 years ago|reply
I think you have a really good point here. The problem is that we have this current bias in society and people wish to change it. I think there is a fear that if we reflect this bias in the way we talk, we reinforce it.

It seems an effective tool: if you want to change thinking, police the way words can be used around the topic. It is, however, worrying that machines could start playing a role in this. It could become a powerful tool for steering public opinion. That doesn't seem too bad, but it could be used to favour an incumbent political party, or more likely to sell products we otherwise don't really want.

But you are right: machines need accuracy, and removing that bias could be detrimental to the task they're solving.

[+] anigbrowl|9 years ago|reply
No, it would not be a good feature. For one thing, baking in the bias of existing practices, as opposed to actual constraints, risks reinforcing those practices as more and more decision-making is left to ML. Second, it makes your system vulnerable to verbal paradoxes designed to exploit that bias.
[+] vtange|9 years ago|reply
This is the tug-of-war of influencer vs. influencee. A machine that just tells it as it is might hold an advantage over one that willingly ignores some data to promote a different view of the world.

Personally, I see more danger in people trying to make machines that evangelize their own biases to the world than in machines being molded by the existing social assumptions of society, given that we expect machines to perform most of the work and control most of the resources in the future.

[+] mkrum|9 years ago|reply
If you are going to "debias" your model, what is the point of training the model on this data in the first place? Not surprisingly, human language can be biased. If you train a model on human language, it will not magically transcend those biases. The problem is that people have this expectation that ML is going to produce perfect decision makers.

Machine Learning creates models that reflect the data, not the truth.
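
For what it's worth, the article's title is literally this effect measured in word embeddings. A sketch using gensim's pretrained Google News vectors (the exact nearest neighbors are an assumption here; results vary by embedding):

    # Sketch of the analogy in the article's title, using pretrained
    # word2vec vectors via gensim (first run downloads a large model).
    import gensim.downloader as api

    wv = api.load("word2vec-google-news-300")

    # "man" is to "computer_programmer" as "woman" is to ... ?
    # The paper reports "homemaker" near the top of this list; exact
    # neighbors depend on the embedding used.
    print(wv.most_similar(positive=["woman", "computer_programmer"],
                          negative=["man"], topn=5))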

[+] reader5000|9 years ago|reply
In the sjw-religion, why is "homemaker" considered inferior to "computer programmer"? One of the oldest and most important human occupations, versus being hunched over a desk slaving for a salary until being outsourced to a bot in 5 years? I've never understood the default sjw/"feminism" assumption that anything feminine is "bad".
[+] angmarsbane|9 years ago|reply
It isn't so much bad or negative as it is risky. A homemaker, male or female, becomes financially reliant on a partner who can die, become too disabled or ill to work, or leave after the homemaker has missed his/her key career- and skill-building years.

Women are more likely to be pushed or encouraged to take this important supportive role to benefit others while putting themselves at risk.

[+] AstralStorm|9 years ago|reply
Because it pays less (thereby enforcing financial gender inequality for which there is no good reason) and has been stereotyped for longer. It is a sort of backwards thinking.

The often unspoken assumption is that stereotypes are strictly bad and evil. Yet without stereotypes, social conduct just breaks down: people suddenly become unpredictable.

However, stereotypes (including the SJW stereotype) can cause serious friction between groups, even more so when they're inaccurate, invalid, or misapplied.

Another thing is something called "stereotype threat", which reinforces certain behaviors while punishing others, a kind of self-fulfilling prophecy at times: you expect some label to be applied to you, so you behave to fit it. The drive to fit in is human, social, and often subconscious.

[+] jakelazaroff|9 years ago|reply
It's not that anything feminine is "bad"; it's that women are often pressured into doing "feminine" things because reasons.

There's no reason "homemaker" should be inferior to "computer programmer", but there's also no reason that homemaker should be "feminine". Men are equally capable of taking care of the house, and women are equally capable of programming computers.

[+] kingbirdy|9 years ago|reply
No one ever became a billionaire by homemaking
[+] anigbrowl|9 years ago|reply
> In the sjw-religion, why is "homemaker" considered inferior to "computer programmer"?

It's not a religious position to observe the existence of economic and social disparities.