So here's a key fact from the comments after the paper was published, one I expect all the blogs to miss: I asked the author how many of the reported CSAM hits came from servers covered by a very common "bare minimum" blocklist of bad servers, and he checked and said 87%.
So the majority of the CSAM they detected never reaches most of the fediverse. Furthermore, the study's methodology relies on what is essentially Mastodon's firehose, which is instantaneous, so the hits they catch appear before moderators have had a chance to handle them. Most of the remaining percentage likely gets reported and blocked quickly.
So most of the fediverse is blocking the vast majority of this garbage already.
Sure, there's still a lot of work to do, and plenty of ideas for improving moderation, but the fediverse doesn't have VCs. Donate to Mastodon if you want to help fund work on this stuff.
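To make the blocklist point above concrete: the mechanism amounts to dropping inbound federated content whose originating server is on a shared denylist, before any moderator ever sees it. A minimal sketch (hypothetical domains and function names, not Mastodon's actual implementation):

```python
from urllib.parse import urlparse

# Hypothetical "bare minimum" denylist of known-bad servers.
BLOCKLIST = {"bad.example", "worse.example"}

def domain_of(actor_uri: str) -> str:
    """Extract the host part of an ActivityPub actor URI."""
    return urlparse(actor_uri).hostname or ""

def accept_inbound(actor_uri: str, blocklist: set[str] = BLOCKLIST) -> bool:
    """Return True only if the sending server is not on the denylist."""
    return domain_of(actor_uri) not in blocklist

# A post federated from a blocked server is rejected outright and never
# enters the instance's timelines or moderation queue:
accept_inbound("https://bad.example/users/spammer")  # False: dropped
accept_inbound("https://ok.example/users/alice")     # True: delivered
```

Because a study scraping the raw firehose observes posts before this filtering and before moderator action, it will count material that subscribers to such a blocklist never receive.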
A misleading headline. Not too different from the last round of headlines claiming there are card-carrying fascists in the fediverse. Both are true, and both are blocked by all well-run servers.
This is like the web: just as we don't say that HTTP has an X problem, it doesn't make sense to say that open-source software run by thousands of people across the world, under different laws, has any specific purpose or problem.
Improvements can be made to the software, but bad admins can just rip that code out. Still, I bet good instances will make changes.
A social network has a duty to take reasonable steps to ensure that bad people can't easily conspire to do their bad deeds. Society has a problem in that people engage in shitty behavior, and one way societies combat this is by limiting the ability of like-minded people to openly share the effects of their misdeeds.
Do you think it’s OK for a convention center to host a meeting where attendees are passing around photographs of CSAM to each other? Why treat the online world differently than the offline world in this regard?
Saying "your car has a flat tire" isn't blaming your car, and saying "Mastodon has a CSAM problem" isn't blaming Mastodon.
This article seems to do a pretty fair job of pointing out that they investigated this issue on many platforms but saw higher rates of CSAM on Mastodon than on the others. They also specifically, and accurately, call out that decentralized platforms face different challenges in addressing these problems compared to a platform with a central arbiter.
Distinguishing real CSAM (which everyone agrees should be banned) from generated "CSAM" (no child was abused to produce it, and the lines between photorealistic and drawn, and around apparent age, are a bit blurry) is important. IMO the latter shouldn't be called "CSAM", to avoid confusion. Whether we should ban it is a different discussion.