logankeenan | 12 days ago

Regardless of how you feel about content moderation, 48 hours is a ridiculously long time given what AI can do today. That “bad” image could have been propagated around the world to millions of people in that time. It can and should be removed in minutes because AI can evaluate the “bad” image quickly and a human moderator isn’t required anymore. However, the compute costs would eat into profits…

Again, I’m not passing judgment on content moderation, but this is an extremely weak initiative.

Ukv | 12 days ago

> It can and should be removed in minutes because AI can evaluate the “bad” image quickly and a human moderator isn’t required anymore.

CSAM can be detected through hashes or a machine-learning image classifier (with some false positives), whereas whether an image was shared nonconsensually seems like it'd often require context that is not in the image itself, possibly contacting the parties involved.
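
For the hash route, here's a minimal sketch of matching against known material, using the open-source Python imagehash library as a stand-in for proprietary systems like PhotoDNA (known_hashes is a hypothetical set of perceptual hashes of previously flagged images):

    from PIL import Image
    import imagehash

    # Hamming-distance threshold: 0 requires an exact hash match; larger
    # values tolerate small edits like re-encoding or resizing.
    MAX_DISTANCE = 5

    def matches_known_image(path, known_hashes):
        """True if the image is perceptually close to any known hash."""
        candidate = imagehash.phash(Image.open(path))
        return any(candidate - known <= MAX_DISTANCE for known in known_hashes)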

voidUpdate | 12 days ago

I would not want to be the supervisor who has to review CSAM positives to check for false ones

thaumasiotes | 12 days ago

> CSAM can be detected through hashes or a machine-learning image classifier (with some false positives), whereas

Everything can be detected "with some false positives". If you're happy with "with some false positives", why would you need any context?

pjc50 | 12 days ago

Indeed. It seems that the process being described is some kind of one-stop portal, operated by or for OFCOM or the police, where someone can attest "this is a nonconsensual intimate image of me" (hopefully in some legally binding way!), triggering a cross-system takedown. Not all that dissimilar to the DMCA.

Manuel_D | 12 days ago

The issue is that if you need to achieve 0% false negatives, you're going to get a lot of false positives.
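
A toy illustration with made-up classifier scores shows why: to guarantee you never miss a violating image, the decision threshold has to drop to the worst-scoring true positive, and every benign image above that line gets flagged too:

    # Hypothetical classifier scores (higher = "looks more like banned content").
    violating = [0.91, 0.75, 0.62, 0.41]           # true positives
    benign = [0.88, 0.55, 0.43, 0.30, 0.12, 0.05]  # true negatives

    # Zero false negatives forces the threshold down to the weakest true positive.
    threshold = min(violating)
    flagged = [s for s in benign if s >= threshold]
    print(f"threshold={threshold}: {len(flagged)} of {len(benign)} benign images flagged")
    # -> threshold=0.41: 3 of 6 benign images flagged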

timr | 12 days ago

Another, related issue is that the takedown mechanism becomes a de facto censorship mechanism, as anyone who has dealt with DMCA takedowns and automated detectors can tell you.

Someone reports something for Special Pleading X, and you (the operator) have to ~instantly take it down, by law. There is never an equally efficient mechanism to push back against abuses -- there can't be, because pushing back exposes the operator to legal risk. So you effectively have a one-sided mechanism for removing unwanted content.

Maybe this is fine for "revenge porn", but even ignoring the slippery slope argument (which is real -- we already have these kinds of rules for copyrighted content!) it's not so easy to cleanly define "revenge porn".

Ray20 | 12 days ago

Regardless of how you feel about content moderation, we are talking about a situation where the government is DEMANDING that corporations implement automated, totalitarian surveillance tools. This is the key factor here.

The next step would be for the government to demand direct access to these tools. Then the government could carry out holocausts against any ethnic group, ten times more effectively and inescapably than Hitler did.

happymellon | 12 days ago

> the government is DEMANDING corporations to implement automated, totalitarian surveillance tools

You've got this the wrong way around. These are social media sites.

People are publicly publishing revenge porn, and the government has told sites that if they are asked to take down revenge porn, they have to.

They don't have to monitor anything, because they are being told of its existence.

oneeyedpigeon | 12 days ago

Are you conflating this specific principle with the much wider Online Safety Act? Because, while the latter has certain privacy-undermining elements to it, I'm not sure how asking social media companies to take down content has anything to do with 'surveillance'.

Manuel_D | 12 days ago

As bad as I think this law is, it isn't demanding surveillance in the sense of real human beings having their information or activity tracked. It mandates taking down content, not surveilling anyone.

pjc50 | 12 days ago

Every social media site already has a system for removing porn.