
Facebook's internal rulebook on sex, terrorism and violence

108 points | eatsfoobars | 8 years ago | theguardian.com

84 comments

[+] kyledrake|8 years ago|reply
I hope the takeaway that comes out of the leaked documents is this: community moderation is very, very hard. It's not just a simple cut-and-dried "free speech vs censorship" issue. There's an enormous amount of nuance involved to sustain a community that won't just devolve into a toxic waste dump that's poison for 99% of the people there.

In my experience working with startups, many of them don't treat their content moderation team well, or it tends to be an afterthought. In reality, this can be the hardest and most important job at a company. PTSD is a real problem with these teams, and pay is often sub-par. And when they make mistakes (or when they don't but people still get mad), they tend to get the bulk of the criticism for it. And then at the end, they usually aren't allowed to talk publicly about any of the work they do.

Please take your content moderation department seriously. If you have one, read through the crap Facebook's team has to deal with on a daily basis, and then go over and hug them.

[+] coldtea|8 years ago|reply
>There's an enormous amount of nuance involved to sustain a community that won't just devolve into a toxic waste dump that's poison for 99% of the people there.

Is it? I always admired the much-scoffed-at YouTube as the freest place to talk on the internet. Where a vegan gay liberal from NY can discuss the same video with a neo-con Mongolian and a 15-year-old hipster from Sweden, an Israeli Jew, an Arab, and a black South African truck driver, and even agree and share experiences, and at worst you just get a few insults thrown, and that's it.

[+] sillysaurus3|8 years ago|reply
To rephrase a flagged comment: 4chan is a pretty good counterexample. It's not necessarily true that a community destroys itself if you have no moderation. This community would; 4chan hasn't. Why?
[+] tripzilch|8 years ago|reply
> a community that won't just devolve into a toxic waste dump that's poison for 99% of the people there

Have you seen Facebook?

That's not what it's optimizing for; it's optimizing for profit and eyeballs. Don't delude yourself. Toxic? Poison? Sure thing, as long as they can sell your attention span, Facebook don't care.

Honestly. You know what it's doing. Doing it to everybody.

In as far as this team is building or sustaining a community, it's a tiny sandcastle on a beach facing the automated tidal wave of corporate exploitation of our dopamine/addiction response.

Sustaining a community of junkies is very, very hard. It's not just a simple cut-and-dried "freedom to do whatever to your body" vs "personal health" issue. There's an enormous amount of nuance involved to sustain a crack house that won't just devolve into a toxic waste dump that's poison for 99% of the people there.

In my experience working at crack houses, many of them don't treat their bouncers well, or it tends to be an afterthought. In reality, this can be the hardest and most important job. PTSD is a real problem with these teams, and pay is often sub-par. And when they make mistakes (or when they don't but people still get mad), they tend to get the bulk of the criticism for it. And then at the end, they usually aren't allowed to talk publicly about any of the work they do.

(the analogy holds right up until you try to hug them--which is when they'll punch your lights out. but they do mean well, or something)

[+] nextlevelwizard|8 years ago|reply
>PTSD is a real problem with these teams

Yeah, that's not even close. They might get stressed, but it's not like seeing your best friend(s) getting blown to pieces every time you close your eyes.

[+] wiz21c|8 years ago|reply
Listen. I've used the phone for 40 years. With the phone I've had conversations with people that could have, well, annoyed others, to say the least. I've said super bad things over my phone. Was I moderated? No. Was it a problem? No. Moderation is just there to make sure that Facebook can still operate. It's just a matter of Facebook protecting its image.

Let's imagine that there is no moderation. What will happen? OK, some people will have arguments, there will be insults, blood, etc. And, well, depending on a tipping point, we may realize that oops, letting people talk in the open is not so good. FB would be blamed and then disappear. So be it. Now, another thing to do is: prosecute those who say unacceptable things. With prosecution, you have the justice system handling the case. That'll be slower, but there will be discussions and, hopefully, better laws.

Facebook is becoming the Judge Dredd of free speech. It decides what's criminal and what's not, and it applies its own power to handle the case directly.

[+] mythrowaway1234|8 years ago|reply
My first thought on reading this is deep empathy for the people who have to deal with these questions. It must be extremely draining and emotionally difficult work, probably traumatic. Just having to write a policy about what constitutes acceptable nudity in the context of the Holocaust is by itself pretty terrible. And yet it's important for people to see those images. Just like the controversy over the image of the naked Vietnamese girl running down a road with napalm burns. That too is an important image. But to allow that image, someone has to look at these kinds of images and decide what's OK and what's not. I can't even imagine all the disturbing shit one has to look at to come up with those rules (which will be imperfect).

I can't imagine that there are many other single entities that have to deal with the ugliness of humanity at the scale of FB.

The sheer volume of content is unlike anything humans have ever seen before. By comparison, news outlets like the Guardian have the luxury of struggling with these kinds of questions on a one-off basis.

Given all that, I have to wonder how government regulation would make this better.

[+] henrikschroder|8 years ago|reply
> Given all that, I have to wonder how government regulation would make this better.

No no, that would impact free speech.

You want all large user content-handling companies to be able to tell users wanting to talk about X to go somewhere else, and it then has to be legal for those users to make that somewhere else themselves. There are enough idiots screaming about free speech on private platforms already; we don't want more of them.

[+] minimaxir|8 years ago|reply
> Anyone with more than 100,000 followers on a social media platform is designated as a public figure – which denies them the full protections given to private individuals.

100k followers is a curiously high threshold to be labeled as a public figure. I'd wager many people who are notable enough to have Wikipedia pages would not be able to hit that threshold even on Twitter.

[+] tptacek|8 years ago|reply
Pedantry: you can keep a Wikipedia page with virtually no followers or following of any kind. The criteria for Wikipedia pages are simply:

(a) That there be some recognizable claim of notability --- this is an extremely low bar.

(b) That the notability claim, and any other material in the article, be backed entirely by reliable secondary sources.

(c) That after the article is stripped down to facts that can be verified in reliable secondary sources, there's still an encyclopedia article's worth of content left. This, too, is a very low bar.

Where pages tend to run afoul of WP's notability requirement is item (b).

But it's easy to see --- and to provide examples of --- people with no public influence at all who still have a Wikipedia page.

[+] matt4077|8 years ago|reply
It's a sufficient, not a necessary condition for being considered a public figure.

(Although I'd disagree with your implication that having a Wikipedia page should constitute proof of the public's interest in someone's life, and all the consequences being a "public figure" entails in some jurisdictions.)

[+] maxerickson|8 years ago|reply
What is the downside of having a high threshold for removing protections?
[+] ouid|8 years ago|reply
Why is there so much emphasis on the credibility of the threat? Why not just blanket remove all explicit threats? Establishing credibility is impossibly hard.
[+] adiabatty|8 years ago|reply
Most (≈99.99%?) death threats aren't at all serious.
[+] johnthealy3|8 years ago|reply
This is the type of thing that can really benefit from public disclosure and discourse. What's "acceptable" varies so much by person and over time that having the censoring done in a black box will never be the answer.

The guidelines in the article seem overall reasonable given the range of possibilities versus the number of people working on it. I hope this stays out of the reach of government as long as possible.

[+] johnnydoe9|8 years ago|reply
There was a recent documentary, shared by a couple of blogs, about the moderators from Facebook, YouTube, etc., and how they now have to deal with depression because of the job.

The full documentary is free to watch, published by the creators themselves: https://vimeo.com/213152344

[+] chroem-|8 years ago|reply
It's interesting that they have specific provisions for Zionism.
[+] DanBC|8 years ago|reply
That's because there's a specific campaign that actually happened with "how to stab a Jew" and similar.

https://news.vice.com/video/palestinian-social-media-uprisin...

> Leaderless Palestinian youth, inspired by instructional videos and photos on social media encouraging people to "Stab a Jew," are thought to be behind a new wave of violence in Israel and the West Bank. Uncoordinated and spontaneous attacks by individual young Palestinians, mostly under the age of 25, started to occur almost daily from October 2015, with assailants often using a household weapon — a knife, axe, meat cleaver, screwdriver — before being fired upon by nearby Israeli security forces. So far, the bloodshed has claimed the lives of at least 28 Israelis and 189 Palestinians, 128 of whom Israel says were assailants.

[+] pottersbasilisk|8 years ago|reply
Yes, it is. I'd expect general provisions just for religion. I guess some religions are more equal than others.
[+] Udik|8 years ago|reply
If you refer to "#stab and be the fear" etc., I guess it's not because it's about Zionism, but rather because it's an incitement to commit terrorist acts (totally credible, given the situation). So I guess it makes sense. (And I say this as a total sympathizer with the plight of the Palestinians and their struggle against the Israeli occupation.)
[+] literallycancer|8 years ago|reply
Might have been because of the hashtag, but yeah, it's a bit weird.
[+] jmacpore|8 years ago|reply
Are the original files out there for people to go through themselves? Or do we only have what the Guardian will show us?
[+] CaliforniaKarl|8 years ago|reply
I doubt documents like that would ever be published willingly.

If those documents were public, that (in my opinion) would just enable bullies etc., as they would know exactly what they could get away with. That would then trigger a tightening of the rules, which would then trigger much complaining about Facebook suppressing free speech.

There are also the motivations of whoever wants this out there. Some people may only want the debate, but some want it out there so as to enable the bullies. And others just want the fame and fortune of being the one who published it.

Honestly, this all sucks.

[+] marmaduke|8 years ago|reply
Parts of this read like a mundane description of Minority Report.
[+] anigbrowl|8 years ago|reply
As I have said many times, violence is considered more tolerable than sex. Guess what you're gonna get more of. And yet there is no mechanism to decide who has input into FB's community standards.
[+] pottersbasilisk|8 years ago|reply
Do you have any legal citations about the right to be heard?
[+] delinka|8 years ago|reply
Do you have any legal citations about any party being compelled to listen to another such that the other's right to free speech is preserved?
[+] afuchs|8 years ago|reply
Are you asking for proof that a speaker has a right to force other people to listen to their argument?
[+] darz0re|8 years ago|reply
I'm more surprised that they haven't moved to Machine Learning yet.
[+] s_kilk|8 years ago|reply
The cult around ML here is pretty amusing. As another poster said, AI is currently staggeringly bad at this task, and I'm not sure it's going to be good enough in the near future either.
[+] lwansbrough|8 years ago|reply
Machine learning is incredibly bad at this kind of stuff. There are far too many false positives and special cases to leave to ML. And the stakes are fairly high for Facebook: if they block content that shouldn't be blocked, it's newsworthy.
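The false-positive problem above is really a base-rate problem, and a back-of-the-envelope calculation makes it concrete. A minimal sketch (all numbers are hypothetical, chosen only for illustration):

```python
# Why false positives dominate automated moderation at scale.
# Every number here is an assumption for illustration, not a real Facebook figure.

posts_per_day = 1_000_000_000    # assume ~1B pieces of content screened daily
violating_rate = 0.001           # assume 0.1% of posts actually violate policy

# Assume a classifier with 95% recall and a 1% false-positive rate --
# which would be a strong model for messy text and images.
recall = 0.95
false_positive_rate = 0.01

violating = posts_per_day * violating_rate
benign = posts_per_day - violating

true_positives = violating * recall            # violations correctly flagged
false_positives = benign * false_positive_rate # benign posts wrongly flagged

print(f"correctly flagged: {true_positives:,.0f}")
print(f"wrongly flagged:   {false_positives:,.0f}")
```

Because benign posts outnumber violations roughly 1000:1 in this sketch, even a 1% false-positive rate wrongly flags about ten million benign posts per day, dwarfing the roughly one million real violations caught. That is the sense in which the stakes make pure ML moderation newsworthy-risky.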