Regardless of how you feel about content moderation, 48 hours is a ridiculously long time given what AI can do today. A “bad” image could propagate around the world to millions of people in that time. It can and should be removed in minutes, because AI can evaluate the image quickly and a human moderator is no longer required. However, the compute costs would eat into profits… Again, I’m not passing judgment on content moderation, but this is an extremely weak initiative.
Ukv|12 days ago
CSAM can be detected through hashes or a machine-learning image classifier (with some false positives), whereas determining whether an image was shared nonconsensually seems like it'd often require context that is not in the image itself, possibly even contacting the parties involved.
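To illustrate the hash half of that distinction, here's a minimal Python sketch; the hash set and its contents are hypothetical placeholders, and real deployments match against perceptual-hash databases (e.g. PhotoDNA) so resized or re-encoded copies still match, which is also where the false positives come from. Exact cryptographic hashing, as below, only catches byte-identical copies:

    import hashlib

    # Hypothetical set of hashes of known-bad images (placeholders, not
    # real values). Real services use perceptual hashes, not SHA-256.
    KNOWN_BAD_HASHES: set[str] = {
        "placeholder-hash-1",
        "placeholder-hash-2",
    }

    def is_known_bad(image_bytes: bytes) -> bool:
        """Return True if this exact file matches a known-bad hash."""
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

Note that nothing in this lookup can tell you whether the people depicted consented to the upload -- that's context the bytes don't carry.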
thaumasiotes|12 days ago
Everything can be detected "with some false positives". If you're happy with "with some false positives", why would you need any context?
timr|12 days ago
Someone reports something for Special Pleading X, and you (the operator) have to ~instantly take down the thing, by law. There is never an equally efficient mechanism to push back against abuses -- there can't be, because pushing back exposes the operator to legal risk. So you effectively have a one-sided mechanism for removal of unwanted content.
Maybe this is fine for "revenge porn", but even ignoring the slippery slope argument (which is real -- we already have these kinds of rules for copyrighted content!), it's not so easy to cleanly define "revenge porn".
Ray20|12 days ago
The next step would be for the government to demand direct access to these tools. Then the government could carry out a holocaust against any ethnic group, ten times more effectively and inescapably than Hitler did.
happymellon|12 days ago
You've got this the wrong way around. These are social media sites.
People are publicly publishing revenge porn, and the government has told sites that if they receive a request to take down revenge porn, they have to comply.
They don't have to monitor, because they are being told of its existence.