
Facebook apologises for removing breast cancer awareness video

105 points | xufi | 9 years ago | bbc.com

92 comments


jacquesm|9 years ago

Just imagine the harm seeing a breast can do to a vulnerable young child. /s

Facebook needs to literally grow up, removing porn is one thing but clinical images, mothers feeding their children and such should not even be up for discussion.

It's a fine line between protecting your users from seeing offensive content and outright censorship, good to see them doing the right thing in this case, pity it is still on a 'case-by-case' basis instead of a healthy review of their policies.

The main criterion seems to be 'is the internet raising a large enough stink'? If yes then restore the image.

protomyth|9 years ago

The weird thing is I'm around a lot of conservatives in a conservative place in this country, you know, the people who supposedly are the ones getting offended. I haven't found one damn person who thinks Facebook is doing the right thing. I'm really trying to find these offended people, but I cannot.

The Vietnam photo censorship was met with quite a bit of confusion, since I think most folks remember it being in our high school history book.

I'm starting to think that one group who keeps calling the FCC over TV shows is alive and well online.

Joof|9 years ago

I had the unique opportunity of visiting a nudist event recently. While children weren't allowed, some of the people had grown up in nudist communities. They are incredibly capable people with no trauma; nudity itself just isn't damaging -- not all nudity is sexual.

TheSpiceIsLife|9 years ago

We all need to literally grow up.

We need to stop calling naked female breasts "porn".

For the simple reason that, well ... they're fucking not.

How the fuck can the naked human body be offensive? Since it's a purely historico-religious ideology, it equates to "God made a mistake".

What nonsense.

codemac|9 years ago

> The main criterion seems to be 'is the internet raising a large enough stink'? If yes then restore the image.

That's the same criterion for taking things down though.

raverbashing|9 years ago

Puritanism at its worst, also a reflection of upper management attitude I'm sure.

EdHominem|9 years ago

They don't need to remove porn though. They could keep a NSFW counter and increment it for every piece of suspected porn you post. If your NSFW counter / post count is greater than 0.2, start hiding your posts from people with a low NSFW ratio.

This way, even if they mis-categorize a medical image it won't be banned, just not widely disseminated.
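The ratio scheme described here can be sketched roughly as follows. This is purely illustrative: the `User` class, the 0.2 threshold, and the hide rule are the commenter's hypothetical, not any real Facebook mechanism.

```python
# Sketch of the per-user NSFW-ratio idea: throttle distribution
# instead of banning, so a mis-categorized medical image is merely
# shown to fewer people rather than removed outright.

class User:
    def __init__(self):
        self.post_count = 0
        self.nsfw_count = 0

    @property
    def nsfw_ratio(self):
        # Fraction of this user's posts flagged as suspected NSFW.
        return self.nsfw_count / self.post_count if self.post_count else 0.0

    def record_post(self, suspected_nsfw):
        self.post_count += 1
        if suspected_nsfw:
            self.nsfw_count += 1

def should_hide(author, viewer, threshold=0.2):
    # Hide posts from high-ratio authors only from low-ratio viewers.
    return author.nsfw_ratio > threshold and viewer.nsfw_ratio <= threshold
```

A mis-flagged author (say 1 flagged post out of 3) ends up above the threshold and is hidden from low-ratio viewers, but remains visible to viewers who themselves post NSFW content, which is exactly the "not banned, just not widely disseminated" behavior described.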

DanBC|9 years ago

> pity it is still on a 'case-by-case' basis instead of a healthy review of their policies.

They reviewed their policies over a year ago and images of breasts are allowed if they're breastfeeding, or after mastectomy (either reconstructed or tattooed or just left) or raising awareness.

http://www.huffingtonpost.co.uk/2015/03/16/breastfeeding-fac...

https://www.facebook.com/help/340974655932193

https://www.facebook.com/help/318612434939348?helpref=search

What you see here are people reporting images, and facebook algos / employees mistakenly banning.

mtgx|9 years ago

It also shows just how bad Facebook is at machine learning compared to the other top machine learning companies. They can't even distinguish "cartoon breasts" from real breasts?

Just look at this video, and see how ridiculous it is that Facebook had an automated tool that would take that down:

https://www.youtube.com/watch?time_continue=9&v=a7836jKqJag

jwebb99|9 years ago

> Facebook needs to literally grow up, removing porn is one thing but clinical images, mothers feeding their children and such should not even be up for discussion.

But people are offended by seeing clinical images of nudity, mothers feeding their babies, etc. And Facebook wants to placate all of their users so nobody has a reason to leave.

Of course, they can't please everyone, but they'd rather make a fool of themselves before they'd become a more morally/socially opinionated corporation like Starbucks.

lukasm|9 years ago

P0rn is illegal in many countries. Facebook wants to do business in these countries, hence aggressive filtering and minimum liability.

ravenstine|9 years ago

Why can't Facebook filter out whatever they want to filter out? I mean, you wouldn't want to see penises all over your feed, and not everyone necessarily agrees with you about the appearance of breasts (except me, I want more breasts on my Facebook), so I can understand why Facebook would make such a decision without necessarily deeming it as wrong. Nobody has to use Facebook. And their main criterion, "is the internet raising a large enough stink", seems perfectly reasonable to me. But what a weird world we live in that a short-lived incident about women's breasts ending with an apology from Facebook gets a news story. Then again, it's the BBC, so I shouldn't be surprised.

Swizec|9 years ago

I for one welcome our new robotic overlords and agree that it is uncomfortable to be reminded that my favorite sexual objects are in fact meant for other purposes as well.

I wonder if seeing breast feeding triggers a fundamental paradox in us humans. Seeing as how we're the only primates who have boobs even when not breast feeding. Originally boobs meant that this female is not available for sex. In humans it means that she is very available BUT ALSO that she isn't.

This is confusing and this conflict causes discomfort.

alex-|9 years ago

I find it surprising that this is newsworthy.

It appears that Facebook never argued that the image was in breach of its policies, just that some software it runs had a bug that misclassified this image.

Then when challenged they apologized and approved the ad.

So to me the summary appears to be "Software company has bug that affected one customer, apologizes and fixes the issue", which must happen every hour of every day...

Am I missing something?....

h4nkoslo|9 years ago

"Breast cancer awareness" is a thing only because it is a form of signaling that privileges female bodies. That's why it has such tremendous buy-in despite already saturated "awareness" going back 20 years, or the fact that there are a half-dozen causes of death with greater preventability and lethality even just considering women.

I'm really tired of people engaging in pointless signaling campaigns and expecting to get points for being So Brave in the face of ~ universal consensus that they are correct, or taking minor bureaucratic snafus like this as evidence that they are somehow not in a position of complete victory.

pyrophane|9 years ago

Every time I see these I think the same thing: this shouldn't be an issue. If we weren't allowing so few players to define so much of our experience of the internet, it wouldn't matter that much what any single one of them decides to censor. Hopefully it will be that way again someday.

tdb7893|9 years ago

I think part of the problem is that companies need to censor these sorts of images or risk regulatory trouble: being classified as an "adult" site has implications they don't want to run afoul of.

fnbr|9 years ago

This reminds me of the incident with the Norwegian newspaper posting the "Napalm Girl" photo.

Facebook is trying to automate the detection of illegal/unwanted images, and it seems extremely difficult to detect the context of the image to the extent that you can differentiate between acceptable images of human bodies, and unacceptable images (which would be, I assume, the vast, vast majority of such images posted).

I wonder how they could proceed with this: maybe with some sort of anomaly detection, where you do a first pass to detect all images containing the unwanted features (e.g. naked bodies), and then a second pass to try to detect the activity that's going on, or to detect if the image is famous (e.g. a picture of David, the famous Italian statue, would be acceptable, while a photo of a naked man in the same position would presumably not be).
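The two-pass idea above can be sketched as a tiny pipeline. Everything here is a stand-in: the classifier functions are stubs over plain dicts, and the allowed-context list is invented for illustration; no real model or Facebook policy is implied.

```python
# Illustrative two-pass moderation pipeline: first flag images with
# unwanted features, then classify the context before deciding.

def contains_nudity(image):
    # First pass: feature detector (stub standing in for a model).
    return image.get("nudity", False)

def classify_context(image):
    # Second pass: context classifier (stub standing in for a model).
    return image.get("context", "unknown")

# Hypothetical contexts in which nudity would be acceptable.
ALLOWED_CONTEXTS = {"artwork", "medical", "news", "breastfeeding"}

def moderate(image):
    """Return 'allow' or 'remove' for a hypothetical image record."""
    if not contains_nudity(image):
        return "allow"
    if classify_context(image) in ALLOWED_CONTEXTS:
        return "allow"
    return "remove"
```

Under this sketch, a flagged image classified as "artwork" (the David case) passes, while one with an unknown or explicit context is removed; the hard part, of course, is the second-pass classifier, not the plumbing.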

[1] http://www.siliconbeat.com/2016/09/12/sheryl-sandberg-respon...

geff82|9 years ago

As long as they remove breast pictures instead of pictures of beheadings (I complained several times as they appeared in my newsfeed, but they were always deemed as "ok"), our world is not going anywhere in terms of peace.

mcbits|9 years ago

They do censor pictures of beheadings, even in "private" messages between consenting adults. Or at least they did several years ago.

striking|9 years ago

I understand why it is this way. "Perfect is the enemy of good" and all that. It's probably much cheaper to run a system that is imperfect.

But it bothers me that we leave so much of our discourse to such imperfect systems.

tomcam|9 years ago

Good for them. Next: stop the Prager University videos from being suppressed.

judah|9 years ago

I think you're mistaken: it's Google/YouTube that has placed several of the excellent Prager University videos under restricted mode. Not Facebook.

I'm not aware of Facebook suppressing the Prager University videos.

And in the YouTube case, it's likely an automated response to (mis)flagging by users who politically differ.

turblety|9 years ago

I do agree this is ridiculous and they should be embarrassed about this kind of stuff, but I do want to remind people that Facebook is an advertising product selling "you" to advertisers. They have to please their advertisers, not users. I am glad to see people complaining. I hope people continue to complain and news outlets like this ridicule them, but until we stop giving our information away for free to these corporations, they are going to continue to do immoral and dangerous activities. Facebook is not the problem here. We are!

tn13|9 years ago

Facebook is eventually going to get into a more systematic mess over this sort of rubbish.

Have one single clear principle and apply it consistently. Change the principle if needed; don't make exceptions. "Educational videos won't be removed" could have been a good policy, and Google has had something like it for YouTube.

Or even "No Breasts" can be a good policy too. If you want to show breast cancer videos, do it on YouTube, shoot it with a prop, or link to another page. I don't see why that does not work.