Ukv | 12 days ago
CSAM can be detected through hashes or a machine-learning image classifier (with some false positives), whereas whether an image was shared nonconsensually seems like it'd often require context that is not in the image itself, possibly contacting the parties involved.
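The hash-matching half of this can be sketched minimally. Note this uses an exact cryptographic hash for illustration only; real systems (e.g. PhotoDNA) use perceptual hashes that tolerate resizing and re-encoding, and the blocklist contents here are hypothetical placeholders:

```python
import hashlib

# Hypothetical blocklist of known-image hashes (hex digests).
# SHA-256 only catches exact byte-for-byte copies; production
# systems use perceptual hashes robust to re-encoding.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_known_image(b"example-known-image-bytes"))  # True
print(is_known_image(b"some-other-image"))           # False
```

The contrast with the consent question is the point: a hash lookup needs only the bytes of the image, while "was this shared with permission?" is a fact about people, not pixels.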
voidUpdate | 12 days ago
thaumasiotes | 12 days ago
Everything can be detected "with some false positives". If you're happy with "with some false positives", why would you need any context?
unknown | 12 days ago
[deleted]
pjc50 | 12 days ago