
psychometry | 4 years ago

Hence the manual review. No one's going to prison over a hash collision here.


gameman144|4 years ago

But a manual reviewer in Cupertino or elsewhere still gets access to your personal (possibly very intimate or otherwise private) photos. Privacy from law enforcement is hardly the only privacy that people value.

Jtsummers|4 years ago

If you desire privacy, never upload your images to any cloud service that doesn't offer true end-to-end encryption of the data (that is, one where they do not have the key). Use a service where data is only decryptable on your own devices or devices that you personally authorize. Which is, presently, none of the popular services that I'm aware of.

threeseed|4 years ago

They would only have access to the photos that are being reviewed.

And you can either choose between (a) someone having to see your photos or (b) relying on an automated but imperfect process. You have to pick one.

b112|4 years ago

I used to work in the same building as a department with legal authorities (purposefully vague here), and the burnout rate was astronomical.

Good, decent people, waking up screaming, cold shakes, permanently damaged by what they could not unsee.

You couldn't pay me enough to go through images of such sickness.

Setting aside all the yes/no, on/off phone debate: how are they going to hire, and keep staffed, a department of people who have to look at this material?

And how are they going to insure it?!

Jtsummers|4 years ago

Right. Requiring exact matches for this kind of material is absurd, since a single-pixel change would foil detection. So practically everyone trying to detect it uses some form of perceptual hashing. And every hash algorithm, by definition, permits collisions and therefore false positives. Which is why any sensible program uses a manual review step before pushing anything forward to law enforcement. Apple's system, requiring ~30 matches, means you'd need 30 or so false positives that also happen to look like CSAM to manual reviewers before a false case ever got sent to law enforcement.
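To illustrate the point about inexact matching: a toy "average hash" sketch (this is not Apple's NeuralHash, just the simplest possible perceptual hash) shows why detection compares hashes by Hamming distance rather than equality, and why collisions are inherently possible.

```python
# Toy perceptual hash: threshold each pixel against the image's mean brightness.
# Small pixel tweaks usually leave most bits unchanged, so near-duplicates still
# match -- but unrelated images can also land within the distance threshold,
# which is exactly the false-positive risk that motivates manual review.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return "".join("1" if p > avg else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200, 30, 220],
       [15, 210, 25, 230],
       [12, 205, 28, 225],
       [11, 208, 27, 228]]

# A single-pixel tweak: an exact-match scheme would miss this copy entirely.
tweaked = [row[:] for row in img]
tweaked[0][0] = 14

h1, h2 = average_hash(img), average_hash(tweaked)
print(hamming(h1, h2))  # -> 0: the tweaked copy still matches
```

Real systems use far more robust hashes over larger images, but the structure is the same: a distance threshold that tolerates edits necessarily also admits some unrelated collisions.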