JulianMorrison | 3 years ago
Permanently deleting his account on an automated accusation is bad. That should hinge, at minimum, on law enforcement's decision to charge a crime. [Edit: unless the criminality of the images is obvious - and again, a human needs to be in the loop.]
pas | 3 years ago
Citation needed.
Do these CSAM scanning systems actually help reduce child exploitation?
And if they do, is this the best use of our resources?
raxxorraxor | 3 years ago
No, there really should not be. I would not want a Facebook employee looking at my pictures. I don't use their services, but the thought is still off-putting. The idea that these companies have to police content is what's wrong here.
There are other ways to reach offenders. An environment that takes good care of kids will spot abuse - not some poor soul forced to review private images.
hedora | 3 years ago
Even an account lock is probably a bad idea: it alerts the pedophile that they're under investigation, allowing them to destroy evidence, cut ties with co-conspirators, etc.
Best to let law enforcement deal with it. In this case, assuming it somehow went to trial, the jury would almost certainly acquit, and the account would be restored.
There is still the matter of the accused losing access to the account while the case was active, though. That's potentially a big deal.