woko | 4 years ago

It depends on the scale. I can vouch that crowd moderation works fine for a small forum (~1000 members) that I am part of, and there is no karma system. You can report posts (3 reports and the post is deleted) and warn users (a 24-hour ban after 3 "active" warnings, scaling up to a permanent ban after 15 "active" warnings). Warnings become "inactive" after a month.
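For what it's worth, the whole scheme fits in a few lines. Here is a rough sketch of the thresholds (not the forum's actual code; the names are made up, I'm treating "a month" as 30 days, and the escalation steps between the 3- and 15-warning marks are left out because I didn't spell them out above):

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    REPORTS_TO_DELETE = 3        # a post is deleted once 3 members report it
    WARNINGS_FOR_24H_BAN = 3     # 3 active warnings => 24-hour ban
    WARNINGS_FOR_PERMA_BAN = 15  # 15 active warnings => permanent ban
    WARNING_LIFETIME = timedelta(days=30)  # warnings go "inactive" after a month

    @dataclass
    class Member:
        warning_dates: list = field(default_factory=list)

        def active_warnings(self, now: datetime) -> int:
            # Only warnings issued within the last month still count.
            return sum(1 for d in self.warning_dates if now - d < WARNING_LIFETIME)

        def sanction(self, now: datetime) -> str | None:
            active = self.active_warnings(now)
            if active >= WARNINGS_FOR_PERMA_BAN:
                return "permanent ban"
            if active >= WARNINGS_FOR_24H_BAN:
                return "24-hour ban"
            return None

    def post_should_be_deleted(report_count: int) -> bool:
        return report_count >= REPORTS_TO_DELETE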

It also depends on the threat model. If the community is the target of a harassment campaign coordinated by external actors, then you might need additional tools, or people dedicated to the job. Even that won't necessarily solve the problem, as external actors can double down and moderators can lose their minds (suspecting a troll behind every post, abusing their power, operating with no oversight, possibly harbouring a spy/agitator among the moderation team, etc.). I won't name the forum or the community, but I have a specific one in mind. It does not help that it is a source of information for gaming media, which means it is often linked to in press articles, and that attracts a lot of attention from all kinds of people.

That said, to get back to the subject: user-generated content on platforms (and not just forums). If the goal is to reach a large scale, then I fully agree with you.

spurgu | 4 years ago

> You can report posts (3 reports and the post is deleted) and warn users (a 24-hour ban after 3 "active" warnings, scaling up to a permanent ban after 15 "active" warnings). Warnings become "inactive" after a month.

Sounds like I could do some damage there by signing up with three accounts?

woko | 4 years ago

In practice this has not happened yet, and it has been 3 years since the forum's inception.

One obstacle I forgot to mention is that an account cannot report posts or warn other members unless it is at least 3 months old *and* has created at least 300 posts. Both conditions have to be met. I guess that is enough of a hindrance for most Internet trolls to forget about the forum if they had no intention of taking part in the community in the first place.
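In code terms the gate is just a conjunction, something like this (again made-up names, and "3 months" approximated as 90 days):

    from datetime import datetime, timedelta

    MIN_ACCOUNT_AGE = timedelta(days=90)  # roughly 3 months
    MIN_POST_COUNT = 300

    def can_report_or_warn(account_created: datetime, post_count: int, now: datetime) -> bool:
        # Both conditions must hold before an account may report posts or warn members.
        return (now - account_created >= MIN_ACCOUNT_AGE) and (post_count >= MIN_POST_COUNT)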

michaelpb | 4 years ago

Yeah, I'll agree with you that there are some smaller-scale systems that work fine. That said, in these cases, in my experience, it's always a few key "hero" mods who are just very committed to volunteering to keep things cleaned up.

Without actually hiring people, it's hard to get that level of commitment, and, just as you're saying, as soon as the work gets hard enough (e.g. clever trolls that turn users against each other, or paranoid political crusaders who think the mods are in league with unseen forces), even the best volunteers end up quitting at the worst times.

In the long run I think the solution is just hiring moderators. It costs money, but if you want a job done well and consistently ya gotta pay.