leepowers | 4 years ago
1) If they take false content down, that will reinforce her beliefs.
2) If they leave false content up, she will keep consuming it, which will reinforce her beliefs.
3) So - when it comes to changing her mind (and the minds of the millions like her), 1 and 2 are a wash. She's adopted an unfalsifiable position. There is no policy Facebook, Twitter, YouTube, et al. can adopt to reason her out of this position. She will have to reason herself out of it at some point.
4) The purpose of taking down false content is not to change her mind, or the mind of anyone else who has adopted an unfalsifiable position. The purpose is to stop the spread of false information. If 10 million people have taken the unfalsifiable position, the goal is to prevent another 10 million from adopting the same viewpoint.
5) However, I can't be sure #4 will actually work. It's very difficult to lock down information and prevent its spread.
6) And can these platforms moderate edge cases with accuracy? If they bungle the job, users will lose trust in them as an information source. But - since these platforms are the main driver of misinformation, discrediting them as information sources would be a net good.
7) So - no moderation means we continue the status quo of misinformation and vaccine hesitancy.
8) Requiring moderation might combat hesitancy by a) preventing the spread of misinformation and b) discrediting platforms in the eyes of the hesitant or hesitant adjacent.
9) Because these platforms are public and performative, they are ill-suited for mea culpas. Rarely do people relish engaging with ideas that might prove them wrong, especially in a public setting. The work of helping people reason themselves out of unreason will be done outside these platforms.