
Google contractors reportedly targeted homeless people for facial recognition

185 points| jamesgagan | 6 years ago |theverge.com

147 comments


username90|6 years ago

I remember the outrage when people discovered that Google's AI wasn't properly trained on black faces. It makes sense that they try hard to avoid that happening again by paying black people to let Google scan their faces. It is not unethical to try to diversify your training data.

https://www.telegraph.co.uk/technology/google/11710136/Googl...

Anyway, this part sounds outright illegal. It seems like it was just Randstad being greedy, but if anyone from Google knew about it then that's bad too, though I doubt they couldn't budget enough money to get the scans legally:

> They said Randstad project leaders specifically told the TVCs to (...) conceal the fact that people’s faces were being recorded and even lie to maximize their data collections.

https://www.nydailynews.com/news/national/ny-google-darker-s...

daniel-cussen|6 years ago

Randstad is dishonest in general.

A friend went through the application and they wanted her to digitally sign 46 contracts, one after the other, without a chance to read the next contract before signing the current one, including one with an arbitration clause. She did see that the first contract offered to send the rest of the contracts printed, by mail, but when she talked to the rep, he acted like he didn't have access to the contracts he wanted her to sign (yeah right; later he'll be like, "well, you signed y, so you gave up the right to x"; he probably knows them by heart) and said she should simply sign them and then go back and print them.

Presumably they have to offer to send them by mail for a contract based on online signatures to be binding, so it's interesting that the rep refused to do so. It was especially sad that they have a deal with unemployment offices that funnel workers to them using state funds.

dclusin|6 years ago

I feel like they were needlessly dishonest and misleading, and that being truthful would have gotten more people on board. They could have said something along the lines of "hey, we noticed our facial recognition algorithms don't work so well for African Americans. Could you help us fix that by letting us take a picture of your face? We'll give you a $5 gift card for your troubles."

jeromebaek|6 years ago

There is a premise in this deduction, which Randstad made: 1. We need darker faces in our training data. 2. Therefore, gather training data from homeless people.

How do you go from 1 to 2? With the premise "darker-faced people tend to be homeless".

This is not necessarily a false premise -- statistically, it is true, and it is a reflection of systemic injustice -- but the outrage is not about whether it's true or false; the outrage is that Randstad exploited this painful fact.

kauffj|6 years ago

Kudos to Google's contractor for offering this opportunity to the people who need it most.

I would happily sell anyone a picture or scan of my face for $5. But I would even more happily have that chance go to someone who needs it more than myself.

This article also mentions that the contractor may have lied to or misled the homeless, which is deplorable. But the behavior described by the title itself is nothing objectionable. The fact that many will object is a phenomenon I've seen called "Copenhagen Ethics": https://blog.jaibot.com/the-copenhagen-interpretation-of-eth...

tobib|6 years ago

> I would happily sell anyone a picture or scan of my face for $5. But I would even more happily have that chance go to someone who needs it more than myself.

Would you really? My gut feeling tells me that's not the case for most people, for privacy or ethical reasons. Just because those people are poor, we expect them to have lower privacy or ethical standards.

The link you posted has the following example; I think that's what you're referring to:

> BBH Labs was an exception – they outfitted 13 homeless volunteers with WiFi hotspots and asked them to offer WiFi to SXSW attendees in exchange for donations. In return, they would be paid $20 a day plus whatever attendees gave in donations.

That's completely different. Offering WiFi has zero long-term effects. It's providing a "business opportunity" to people who wouldn't have access to one otherwise. Giving someone 5 bucks for their face picture (or other biometrics) is totally different and has long-term negative effects.

gbanfalvi|6 years ago

What if it was one dollar instead of five? It would have saved Randstad hundreds of thousands. Or if Google is paying for it and this feature motivates 1% more people to buy the phone, they could be making a lot of money. I bet loads of homeless people would be fine with $1 for a snapshot. How does the outcome differ for all these homeless people in this case?

What if they could have bargained for $10 or even more instead? I don’t think either company would even blink at the sum, but many desperate people out there would be a lot better off.

I agree with you that some observers are never going to be satisfied and to them there’s always more an individual or a company can do. There is definitely an observer effect.

Similarly, if we took my line of questioning all the way to an absurd extreme, the best outcome would be if all these people got permanent shelter, jobs, and a stable life. But we can't expect companies with profit targets to do this for them. Nobody would feel bad about that exchange, but it would be pretty unrealistic.

So I guess I need to reframe my original question. Why do certain exchanges feel ok while others leave a sour taste in everybody's mouth?

To me it seems like the answer is because the exchange felt unfair. Both parties stand to benefit but, instead of doing something genuinely beneficial for both, the party in power offered the (almost) bare minimum. That sense of unfairness is multiplied when you contextualize the exchange as Very Large Business vs. Small Homeless Person.

Similarly, the link to the phenomenon discusses our role as observers, but it doesn't discuss the parties' roles in the exchanges. They're not only observers; they're also actors. The people performing the homeless study could, for example, have offered something to the control group at the completion of the experiment.

jakelazaroff|6 years ago

The issue is not that the contractor only marginally helped these people. It’s that they exploited a massive power imbalance in order to reap a vastly larger reward than they offered.

“Copenhagen Ethics” really just strikes me as a rhetorical tool to defend exploitation. “What, just because I offered this person a job I have to pay them a minimum wage?”

blub|6 years ago

$5 isn't much. Why not post here a picture and 3d scan of your face as a token of good will?

It could help some start-ups that need such a face for demo purposes or other experiments.

o_p|6 years ago

This. If $0 and keeping their face data were a superior choice to $5, the homeless would have chosen it, but they didn't.

andrerm|6 years ago

> which is deplorable. But

There is always a but

chooseaname|6 years ago

Giving $5 to a homeless person doesn't help them.

phjesusthatguy3|6 years ago

You're selling yourself short.

Google, et al, want to use my likeness to facilitate database lookups. They are welcome to a perpetual, exclusive license of that data at a quarter of a trillion USD. They know how to get in touch with me; I'm 100% serious.

SamBam|6 years ago

It sounds like there were three issues:

1. The contractor targeted homeless people

2. They targeted people with darker skin

3. They may not have been forthright or truthful about what they were doing.

Number 3 is clearly wrong. But so long as the contractors were upfront and truthful about what they were doing, I don't know that 1 or 2 are problematic.

The only argument I can see for why they shouldn't pay homeless people money for an easy job is that the prospect of money might be so enticing that they're willing to give up personal rights or freedoms (the same argument why we don't allow selling of organs). But $5 neither seems high enough, nor the process invasive enough, that this argument would hold water.

As for ensuring that enough of a sample range is in the database as an attempt at avoiding data bias, this should be a no-brainer good thing.

aeturnum|6 years ago

If this report is correct it seems like they were targeting a vulnerable population based on the logic that their vulnerability made them less likely to insist on being treated better than google wanted to treat them. That seems really bad to me.

If you're asking folks on the street and happen to get a lot of unhoused folks because they're around, that's fine. Writing memos telling people to target vulnerable populations because they're vulnerable is gross and deeply unethical.

ehsankia|6 years ago

At least The Verge has the less clickbaity headline, mentioning that it was contractors. The original source mentions Google in the headline, but the rest of the article only refers to Randstad.

One part that is a bit confusing to me is, the original source makes no references whatsoever to any consent form. Usually you can't collect this sort of data without signed consent, and previous reports [0] do mention such a form. I know most people don't read the form, but I'm curious how you can get away with telling someone you're just playing a game and lie so much when the form should clearly state what you're collecting.

Still, there should definitely be better vetting of contractors and stories like this definitely look very bad, even if the intentions were actually to help reduce ML bias.

[0] https://www.engadget.com/2019/07/29/google-paid-for-face-sca...

EDIT: The original article does indeed mention and show a picture of the "agreement".

domador|6 years ago

I'm on the fence regarding whether it's unfair for a news source to attribute this sin directly to Google via its headline. "Actually, it was our contractors who did it" seems like too common and easy an excuse for companies and governments that want to outsource the blame and the fallout for their questionable projects.

SolaceQuantum|6 years ago

"I know most people don't read the form, but I'm curious how you can get away with telling someone you're just playing a game and lie so much when the form should clearly state what you're collecting."

Mental illness, addiction, the constant 24/7 stress of being homeless, and potentially systemic issues starting from childhood that get in the way of developing the reading skills and background knowledge needed to understand what's on the page, etc. are all factors that would make reading and understanding a consent form full of technological-legal terminology a very difficult task.

xenocyon|6 years ago

IMHO an entity is ethically responsible for anything its contractor does, unless there is reasonable clarity that the contractor acted in opposition to the client's wishes. Having someone else do your dirty work doesn't make it less dirty.

journalctl|6 years ago

> ...I’m curious how you can get away with telling someone you’re just playing a game and lie so much...

Since you’re curious: homeless people aren’t important to Google or society in general, because they have so little and everyone has all but stopped caring about them. They’re poor, they’re unfortunate, and so they’re exploited. This has been happening since... *leafs through book* ...forever. Serfs used to toil away in fields until they perished, and nobody gave a damn about them either.

They did this because they could get away with it, because they (and probably Google) knew there wouldn’t be any consequences. It’s the same old song and dance: the poor get exploited for the benefit of the rich, and most people don’t seem to care.

JohnFen|6 years ago

> The original source mentions Google in the headline but the rest of articles only refers to Randstad.

If you hire a contractor, you are responsible for what the contractor is doing unless the contractor is operating outside of the terms of the contract.

If the contractor is within the terms of the contract, then saying "Google is doing this" is neither deceptive nor inaccurate.

lovich|6 years ago

Why do you think a form would stop this from happening? It's just words on paper until there is enforcement behind it.

Who is going to enforce the law against Google in the name of homeless people? It's not like the government in the US has been a champion of the downtrodden in recent years.

TallGuyShort|6 years ago

A while ago Google got bad press because some image-tagging service identified some people as "gorillas", and IIRC it was blamed on not having enough diversity of skin color in the training data. So... it sounds like at least the "instructed them to target people of color" part of this is them trying to correct that. But in isolation that sounds even worse than the first instance. I guess you're damned if you do, damned if you don't.

JohnFen|6 years ago

> I guess you're damned if you do, damned if you don't.

This is not a case of that.

Google (or its contractor) could easily have done this in a way that was not objectionable. They simply decided not to.

kpx11|6 years ago

Next news: Google engineer sneezes in subway, google trying to infect people.

I mean there's no need to have google's name in there, other than to click-bait-trick people into viewing their subpar journalism with ads.

Still, it's kinda shitty to cheat people no matter what. Obviously you can't say "hey, the Pixel 4 is gonna have face unlock and we want your face scanned for that," but the contractor should have done a better job.

DoreenMichele|6 years ago

There's a long, long history of leaving women, people of color, poor people and other groups out of data sets. For example, I've read articles that indicate we can't create good photos of people of color because film standards were normalized to white skin.

So, try to fix that and... there's hell to pay?

File under: "No good deed goes unpunished."

mattnewton|6 years ago

I don't think there is anything wrong with the memo saying "we need more people of color in our data set." I hope everyone agrees with that.

What seems to have been bad is the contractor misinforming people about what data would be collected (and for what use), and it's not clear what Google had in their contract to prevent that kind of unethical behavior. It is also very questionable IMO to target the homeless "because they won't talk to the media", which was allegedly in the instructions the contracting firm Randstad gave to its workers.

Disclaimer: While I work as a low level employee at an unrelated team in Google, my opinions are my own and do not represent those of my employer, and this is the first I am hearing of this.

umvi|6 years ago

It's not as easy a problem as you imply. It's not that you are "leaving out" groups so much as that you have to go out of your way to include minorities in your data set, by the very definition of the word minority. This can make your project orders of magnitude more complex.

No matter what you make, some minority corner case will break your tech and generate outrage. ("How DARE your speech recognition not work on AAVE!", "How DARE your facial recognition not work on burn center victims!" etc.)

microtherion|6 years ago

But now they have a dataset where a disproportionate share (possibly the vast majority) of the people of color represented were homeless.

That's bound to introduce other kinds of bias into the data.

gerash|6 years ago

This seems like fake outrage to me. If I were homeless, I'd be more than happy for someone to take a photo of me for $5. In fact, I'd find it pretty hypocritical if someone was spending their energy fighting against possible infringement of my rights in such a scenario instead of actually providing me with money or food.

autoexec|6 years ago

> If I were a homeless I'd be more than happy for someone to take a photo of me for $5.

The problem isn't that they were offering money in exchange for photos of homeless people; it's that they were tricking homeless people into giving up their biometric data by telling them they'd get $5 just to play with a phone for a few minutes.

If they were honest about what they were taking and why I wouldn't have a problem with it.

l_t|6 years ago

> Google and Randstad didn’t immediately reply to requests for comment.

The content of the article is interesting enough, but this line at the end caught my attention.

Is it reasonable to expect someone to "immediately reply" before you publish the article? Because that doesn't sound like ethical journalism to me, unless I'm misunderstanding the meaning of "immediately" in this context.

ry_ry|6 years ago

Doesn't matter, I suspect: it works within the narrative and implies they have something to hide. It's good /tabloid/ journalism, and poor investigative journalism.

Randstad are very much the former.

jonas_kgomo|6 years ago

This is too desperate. It seems AI will help in criminalizing and biased profiling of people of color; as a person of color, it feels really hard to imagine a future where justice is done with due diligence.

Joy at Media Lab has been looking at this issue for a while and advocating for balance. https://www.technologyreview.com/s/612775/algorithms-crimina...

Also, I find it weird that Nvidia was able to simulate realistic-looking people last year, yet Google is struggling to find humans. Can't they use that as ground truth?

willwashburn|6 years ago

The first thing they teach you about research is: do not do it on vulnerable populations. And it's not like people going public would have mattered; the Pixel 4's features and hardware all leaked way before the announcement.

andrerm|6 years ago

> “They [Google contractor] said to target homeless people because they’re the least likely to say anything to the media,” the ex-staffer said. “The homeless people didn’t know what was going on at all.”

curiousgal|6 years ago

How dare they give $5 to homeless people instead of college students walking around campus with airpods.

journalctl|6 years ago

Yeah, usually you have to pay at least minimum wage to exploit college students.

kapuasuite|6 years ago

This really counts as “extreme and unsavory” these days?

rootsudo|6 years ago

Same thing Panasonic does in Japan to improve facial recognition, except they just pay foreigners 5000 JPY and tell you what they're doing.

jdkee|6 years ago

Google seems to be outsourcing their ethics.

olalonde|6 years ago

Does anyone know why they were giving gift cards instead of cash? I suppose there's a legal reason.

Notorious_BLT|6 years ago

I can think of several reasons that homeless activism groups would suggest as reasons to use gift cards instead of cash. Most of them are pretty evident if you consider what kinds of goods are only available through cash-only purchases.

sarcasmatwork|6 years ago

Selling one's privacy for a $5 Starbucks G/C. They know their gullible audience!

NN88|6 years ago

This is blatantly unethical.

Expanding your database? Great

Forgetting situational ethics? Disgusting

acdc4life|6 years ago

Why doesn’t a model trained on one race generalize to different races? That sounds inferior to human vision.

kllrnohj|6 years ago

"All X's look the same" where X is any of [Asians, Black people, white people, etc..] is a very common refrain.

Human eyes also need to be trained on diverse data. It's the cross-race effect: https://en.wikipedia.org/wiki/Cross-race_effect

rcar|6 years ago

Facial features can vary from ethnicity to ethnicity (see e.g., https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3074358/), and so a machine learning model trained solely on pictures from one ethnic group may not understand how to reliably distinguish different people of another ethnicity.
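To make this concrete, here is a minimal sketch of how one might audit a matcher's accuracy per demographic group; the group names, IDs, and numbers are purely hypothetical, not from the article or any real system:

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Compute match accuracy broken down by demographic group.

    records: iterable of (group, predicted_id, true_id) tuples.
    Returns a dict mapping each group to its fraction of correct matches.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical audit results: the matcher is far more reliable on the
# group that dominated its training data.
records = [
    ("group_a", "id1", "id1"), ("group_a", "id2", "id2"),
    ("group_a", "id3", "id3"), ("group_a", "id4", "id5"),
    ("group_b", "id6", "id7"), ("group_b", "id8", "id8"),
    ("group_b", "id9", "id10"), ("group_b", "id11", "id12"),
]
print(per_group_accuracy(records))  # group_a: 0.75, group_b: 0.25
```

A gap like the one above, rather than aggregate accuracy alone, is what disaggregated evaluation is meant to surface.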

mgraczyk|6 years ago

This strikes me as the ethically best possible way to collect this data. Google is paying people who need the money for something simple and completely harmless.

The main counterargument appears to be that those who sold data "didn't understand what was going on". It's hard to imagine moral convictions in which someone could consistently argue that the homeless don't understand money in exchange for photos, but it's acceptable to leave them to fend for themselves on the street.

Google is, at worst, helping people who need help.

rmsaksida|6 years ago

> This strikes me as the ethically best possible way to collect this data

"a contracting agency named Randstad sent teams to Atlanta explicitly to target homeless people and those with dark skin, often without saying they were working for Google, and without letting on that they were actually recording people’s faces"

How can this be the most ethical way to collect data?

The problem isn't in acquiring facial recognition data from homeless people, but in mischaracterising the nature of the experiment when doing so. If the reporting is accurate, they lied to vulnerable people and tricked them into selling their data for cheap.

Companies can't go around hustling people into giving away their private information. It doesn't matter if you think this is "for their own good", a homeless person may want to refuse being catalogued by Google for a variety of reasons.

JohnFen|6 years ago

> This strikes me as the ethically best possible way to collect this data.

I don't see how failing to get informed consent counts as "the ethically best possible way".

moate|6 years ago

Oof. That's not how this works, sir. Did you read the article? Some quotes:

“They said to target homeless people because they’re the least likely to say anything to the media,” the ex-staffer said. “The homeless people didn’t know what was going on at all.”

Some were told to gather the face data by characterizing the scan as a “selfie game” similar to Snapchat, they said. One said workers were told to say things like, “Just play with the phone for a couple minutes and get a gift card,” and, “We have a new app, try it and get $5.”

Google (or their contractor, if you're going to fight about the semantics here) is, at worst, guilty of misleading people about what they were doing, targeting vulnerable people with the express idea that they would be less likely to create problems, and not actually improving anyone's conditions in a real way.

Here's the moral conviction I have: lying to people about what is happening to them in order to run a functioning business is bad business. That's entirely separate from the fact that small amounts of money were given to some homeless people. I don't get to abuse homeless people as long as I give them 5 dollars afterwards. That's not how morality works. These people weren't lifted out of their conditions by this life-changing sum. They weren't put into treatment centers or given job training. They were purposefully misled and then compensated with less than the price of a combo meal at McDonald's.