VancouverMan | 1 year ago
This was at a long-established mall shop that specialized in photography products and services. The same photographer had taken suitable photos of some other people in line ahead of us rather quickly.
The studio area was professional enough: a backdrop, dedicated photography lighting, ample lighting in the shop beyond that, and an adjustable stool for the subject to sit on.
The camera appeared to be a DSLR with a lens and a lens hood, similar enough to what I've seen professional wedding photographers use. It was initially on a tripod, although the photographer eventually removed it during later attempts.
Despite being in a highly controlled, purpose-built environment, and using photography equipment far better than a typical laptop or phone camera, the photographer still couldn't take a suitable photo of this particular woman, even after repeated attempts and adjustments to the camera's settings and to the environment.
Was the photographer "racist"? I would guess not, given the effort he put in, and the frustration he was exhibiting at the lack of success.
Was the camera "racist"? No, obviously not.
Sometimes it can just be difficult to take a suitable photo, even when using higher-end equipment in a rather ideal environment.
It has nothing to do with "racism".
red_admiral|1 year ago
I don't think anyone is saying that the universities or the software companies have some kind of secret agenda to keep black people out. As far as I can tell there's good evidence they're mostly trying to get more black people in (and in some cases to keep Asians out, but that's another story). I also don't think anyone here was acting out of fear or hatred of black people.
What I am claiming is that the universities in question ended up with a proctoring product that was more likely to produce false positives for students with darker skin, and did not apply enough human review (or give people the benefit of the doubt) to cancel out those effects. It is quite likely that whatever model training and testing the software companies did was done mostly on fair-skinned people in well-lit environments; otherwise they would have caught this problem earlier. This is not super-woke Ibram X Kendi applied antiracism, this is doing your job properly: making sure your product works for all students, especially as the students have no way to opt out of the proctoring software short of quitting their college.
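The kind of check the comment implies was skipped here is a per-group false-positive-rate comparison on the evaluation set. Below is a minimal sketch of that idea; the records, group labels, and numbers are entirely made up for illustration, not taken from any real proctoring product.

```python
# Hypothetical evaluation records: (group, flagged_by_software, actually_cheating).
# All data here is invented to illustrate the metric, not drawn from a real system.
records = [
    ("light", True,  True), ("light", False, False), ("light", False, False),
    ("light", False, False), ("light", True,  False),
    ("dark",  True,  False), ("dark",  True,  False), ("dark",  False, False),
    ("dark",  True,  True),  ("dark",  True,  False),
]

def false_positive_rate(records, group):
    """FPR within one group: innocent students flagged / all innocent students."""
    innocent = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in innocent if r[1]]
    return len(flagged) / len(innocent) if innocent else 0.0

for group in ("light", "dark"):
    print(group, false_positive_rate(records, group))
```

If the rates differ widely between groups (as in this toy data), that is exactly the disparate impact being described, and a vendor testing on a demographically narrow sample would never see it.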
To me it's on the same level as having a SQL injection vulnerability: maybe you didn't intend for your users' data to be exposed (nearly 100% of the time, the company involved very much did not intend to have a data breach), but it happened anyway; you were incompetent at the job, and your users are now dealing with the consequences.
And to the extent that those consequences here fall disproportionately on skin colors (and so, by correlation, on ethnicities) that have historically been disadvantaged, calling this a type of racism seems appropriate. It's very much not the KKK type of racism, but it could very well still meet legal standards for discrimination.
zahlman|1 year ago
The issue is that, for most people, the term "racism" connotes a moral failing comparable to the secret agendas, fear and hatred, etc. Specifically, an immoral act motivated by a deliberately applied, irrational prejudice.
Using it to refer to this sort of "disparate impact" is at best needlessly vague, and at worst a deliberate conflation known to be useful to (and used by) the "super-woke Ibram X Kendi" types - equivocating (per https://en.wikipedia.org/wiki/Motte-and-bailey_fallacy) in order to attach the spectre of moral outrage to a problem not caused by any kind of malice.
If you're interested in whether someone might have a legal case, you should be discussing that in an appropriate forum - not with lay language among laypeople.
dpkirchner|1 year ago
> Despite being in a highly-controlled purpose-built environment
Frankly it sounds like the environment was not purpose-built at all. It was built to meet insufficient standards, perhaps.
realitychx2020|1 year ago
Every major admissions system in US academia is aimed at reducing the Asian population. It often comes in the guise of DEI, with a very wide definition of "Diversity" that rarely includes Asians.
These systems will use subtle features to black-box the racism. They may simply be overt and leak it through metadata, or get smart and use writing styles.