top | item 25709872

ebilgenius | 5 years ago

And yet Amazon managed to get a plan out of Parler that promised to increase moderation of violent content manually with the help of volunteers. Whether or not that plan was workable is moot now, it seems.

dagmx | 5 years ago

It is moot because the Parler system has no accountability for the company or the "moderators". Parler can just say the content was moderated by these random accounts and then try to wash its hands of it. The point is to have accountable, enforceable moderation.

For people interested, this is the moderation system according to an interview the CEO gave ( https://www.nytimes.com/2021/01/07/opinion/sway-kara-swisher... ).

> Well, the way we work on our platform is we put everything to a community jury. So everyone’s judged by a jury of their peers in determining whether the action is illegal or against our rules. And so if reported, it goes to a jury of people’s peers. And if it’s deemed illegal, promptly deleted. But, you know, the jury of five people get to decide. And it’s a random jury, so they don’t know each other. They don’t know what they’re voting. They just get the independent facts of the situation and they make their own judgment call. We’ve actually been inviting journalists and other people to join the jury as well, so that we have a nice transparent jury system.
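The mechanism the CEO describes (a reported item goes to a random, anonymous jury of five peers, each voting independently, with removal if the jury deems it a violation) can be sketched roughly as below. This is purely an illustrative reconstruction from the interview quote, not Parler's actual implementation; all names, the majority-vote rule, and the data shapes are assumptions.

```python
import random

JURY_SIZE = 5  # "the jury of five people get to decide" (from the interview)

def pick_jury(eligible_users, size=JURY_SIZE):
    """Select a random jury; jurors are drawn independently so they
    don't know each other (hypothetical selection logic)."""
    return random.sample(eligible_users, size)

def judge_report(votes):
    """Each vote is True if that juror deems the content a violation.
    Assumed rule: a simple majority triggers deletion."""
    return sum(votes) > len(votes) // 2

# Illustrative run: a reported post judged by a random jury.
users = [f"user{i}" for i in range(100)]
jury = pick_jury(users)
votes = [True, True, True, False, False]  # each juror's independent call
should_delete = judge_report(votes)       # majority says violation -> delete
```

Note that nothing in this sketch addresses dagmx's objection: the jurors are anonymous accounts, so the process carries no accountability for the platform itself.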