
CarelessExpert | 4 years ago

> It just gives people with bad thoughts a thing to look at.

The word "just" is doing a lot of work here. By your line of reasoning, propaganda isn't anything anyone should worry about.

Not to go full Godwin, but the Nazi regime is an instructive analogy here. By your reasoning, all the Nazis did was "just" promote a lot of anti-Jewish propaganda. Would you really go on to say "Blaming the Nazis for negative outcomes in the world is just like blaming violent video games. All they did was put the messages out there, then the people with the bad thoughts did the mass murder..."?

And to preempt the objection that Facebook isn't intentionally distributing material that, say, promotes ethnic violence: while Facebook corporate obviously does not have that as an official policy, its algorithm is doing precisely that by actively promoting these types of material. Facebook's own internal research has shown that it can and does steer individuals toward increasingly extreme content.

So if we agree that a) Facebook's systems steer individuals toward violent or extremist content (as shown by their own research), b) propaganda is a tool that works to steer public opinion and drive human behaviour (as proven by historical precedents like the Nazi regime), c) extremist content serves as effective propaganda (which is well-trodden ground in research on extremism), and d) Facebook knew all these things and failed to curtail what was going on (as now revealed in these internal documents), then I don't see how you can possibly defend Facebook here.
