woko | 4 years ago
It also depends on the threat model. If the community is the target of a harassment campaign coordinated by external actors, then you might need additional tools, or people dedicated to the job. However, this won't necessarily solve the problem, as external actors can double down, and moderators can lose their minds (seeing a troll behind every post, abusing their power, operating without oversight, a possible spy/agitator among the moderation team, etc.). I won't name the forum or the community, but I have a specific one in mind. It doesn't help that it is a source of information for gaming media, which means it is often linked to in press articles, which attracts attention from all kinds of people.
That said, back to the subject: user-generated content on platforms (not just forums). If the goal is to operate at a large scale, then I fully agree with you.
spurgu | 4 years ago
Sounds like I could do some damage there by signing up with three accounts?
woko | 4 years ago
One obstacle I forgot to mention is that an account cannot report posts or warn other members unless it is at least 3 months old *and* has created at least 300 posts. Both conditions have to be met. I guess that's a sufficient hindrance for most Internet trolls to forget about the forum if they had no intention of taking part in the community in the first place.
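To make the rule concrete, here is a minimal sketch of such a gate in Python. This is not the forum's actual code; the Account fields and the 90-day reading of "3 months" are my assumptions.

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    @dataclass
    class Account:
        created_at: datetime  # hypothetical field name
        post_count: int       # hypothetical field name

    MIN_ACCOUNT_AGE = timedelta(days=90)  # assumption: "3 months" = 90 days
    MIN_POST_COUNT = 300

    def can_report_or_warn(account: Account, now: datetime | None = None) -> bool:
        """Both conditions must hold: old enough AND active enough."""
        now = now or datetime.now(timezone.utc)
        old_enough = now - account.created_at >= MIN_ACCOUNT_AGE
        active_enough = account.post_count >= MIN_POST_COUNT
        return old_enough and active_enough

The conjunction is what does the work: a dormant throwaway account passes the age check alone, and a burst of spam posts passes the post-count check alone, but satisfying both requires sustained participation.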
michaelpb | 4 years ago
Without actually hiring people, it's hard to get that level of commitment, and just as you say, as soon as the work gets hard enough (e.g. clever trolls that turn users against each other, or paranoid political crusaders who think the mods are in league with unseen forces), even the best volunteers end up quitting at the worst times.
In the long run I think the solution is just hiring moderators. It costs money, but if you want a job done well and consistently ya gotta pay.