ewjordan | 7 years ago
Unless they presented lots of unqualified resumes of people not in tech as part of the training data, which seems like something someone might think reasonable. Then the model would (correctly) determine that very few people coming from women's colleges are CS majors, and penalize them. However, I'd still expect a well-built model to account for that: if someone was a CS major, it would weigh that accordingly and get rid of any default penalty for attending a particular college.
If the whole thing was hand-engineered, then of course all bets are off. It's hard to deal well with unbalanced classes, and as you mentioned, without knowing what their data looks like we can only speculate on what really happened.
But I will say this: this is not a general failure of ML. These sorts of problems can be avoided if you know what you're doing, unless your data is garbage.
lalaland1125 | 7 years ago
That's exactly the issue we are talking about here. Women's colleges would have less training data, so they would get updated less. For many classes of models (such as neural networks with weight decay or common initialization schemes) this would encourage the model to be more "neutral" about women and assign predictions closer to 0.5 for them. This might not affect overall accuracy for women (it might not change whether they land above or below 0.5), but it would make the predictions for women less confident, and thus give them a lower ranking (closer to the middle of the pack as opposed to the top).
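A minimal sketch of that shrinkage effect, with hypothetical numbers and a plain logistic regression standing in for a neural net: two one-hot "school" features with identical 80% positive rates but a 50x difference in sample count. With a weight-decay (L2) term on the summed loss, the rare school's weight has far less data gradient opposing the decay, so its predictions sit closer to 0.5 even though its hire rate is the same.

```python
import numpy as np

# Hypothetical numbers: 1000 resumes from a common school, 20 from a
# rare one, with an identical 80% positive rate in both groups.
n_common, n_rare = 1000, 20
X = np.zeros((n_common + n_rare, 2))
X[:n_common, 0] = 1.0            # one-hot: common school
X[n_common:, 1] = 1.0            # one-hot: rare school
y = np.concatenate([np.repeat([1.0, 0.0], [800, 200]),  # 80% positive
                    np.repeat([1.0, 0.0], [16, 4])])     # also 80% positive

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic regression trained on summed log-loss + l2 * ||w||^2, so the
# decay term competes against a 50x smaller data gradient for the rare school.
w = np.zeros(2)
l2, lr = 1.0, 1.0
for _ in range(20000):
    p = sigmoid(X @ w)
    w -= lr * (X.T @ (p - y) + 2.0 * l2 * w) / len(y)

p_common = sigmoid(w[0])   # stays near 0.8
p_rare = sigmoid(w[1])     # pulled noticeably toward 0.5 by weight decay
```

Both groups stay above 0.5, so accuracy looks fine, but ranking candidates by score would push the rare school's graduates toward the middle of the pack.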
ewjordan | 7 years ago
A class imbalance doesn't change that: if there's no gradient to follow, then the class in question will be strictly ignored unless you've somehow forced the model to pay attention to it in the architecture (which is possible, but would take some specific effort).
What I'm suggesting is that it's likely that they did (perhaps accidentally?) let a loss gradient between the classes slip into their data, because they had a whole bunch of female resumes that were from people not in tech. That would explain the difference, whereas at least with NNs, simply having imbalanced classes would not.
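The distinction can be sketched with the same toy setup as above (hypothetical counts, logistic regression with weight decay as a stand-in): fit once with a rare school whose hire rate matches everyone else's, and once after mixing in a batch of label-0, non-tech resumes from that school. Imbalance alone only shrinks the weight toward zero; the extra negative examples are what flip it negative, i.e. an actual learned penalty.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit(y_rare, l2=1.0, lr=1.0, steps=20000):
    """One-hot school features; logistic regression with weight decay."""
    X = np.zeros((1000 + len(y_rare), 2))
    X[:1000, 0] = 1.0                 # common school
    X[1000:, 1] = 1.0                 # rare school
    y = np.concatenate([np.repeat([1.0, 0.0], [800, 200]), y_rare])
    w = np.zeros(2)
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * (X.T @ (p - y) + 2.0 * l2 * w) / len(y)
    return w

# Same 80% hire rate as everyone else: class imbalance alone leaves the
# rare school's weight shrunk toward zero, but still positive.
w_balanced = fit(np.repeat([1.0, 0.0], [16, 4]))

# Now mix in 100 unqualified (label-0) resumes from the same school, the
# kind of accidental gradient the comment above hypothesizes: the weight
# goes negative, a genuine penalty rather than mere low confidence.
w_skewed = fit(np.repeat([1.0, 0.0], [16, 104]))
```

In the first fit the rare school scores below the common one but above 0.5; in the second it drops below 0.5 outright, which is the difference between "less confident" and "penalized."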