top | item 28081738


ducadveritatem | 4 years ago

That isn't accurate. They're not blindly handing lists of users over to the government.

If an account uploads multiple images that match known exploitative images and the match count exceeds a threshold, the account is flagged for review by Apple. (Note the threshold is selected to give a ~1 in 1 trillion probability of incorrectly flagging an account.) Once Apple reviews and confirms a match, it's forwarded to the National Center for Missing & Exploited Children for further action (and presumably referral to law enforcement).
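The flow described above can be sketched in a few lines. Everything here is hypothetical — the hash values, the threshold, and the function names are stand-ins, and the real system uses perceptual hashing and cryptographic threshold secret sharing rather than a plaintext set lookup:

```python
# Minimal sketch of threshold-based flagging (hypothetical values only).

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the NCMEC hash database
THRESHOLD = 3  # stand-in; the real value targets ~1-in-1-trillion accidental flags

def matches(uploaded_hashes):
    """Count uploaded image hashes that match the known database."""
    return sum(1 for h in uploaded_hashes if h in KNOWN_CSAM_HASHES)

def flagged_for_review(uploaded_hashes):
    """An account is surfaced for human review only past the threshold."""
    return matches(uploaded_hashes) >= THRESHOLD

print(flagged_for_review(["hash_a", "hash_x"]))            # → False (one match, below threshold)
print(flagged_for_review(["hash_a", "hash_b", "hash_c"]))  # → True
```

The point of the threshold is that no single match ever triggers review; only an accumulation of matches does.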

More details in their whitepaper: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...


hapless | 4 years ago

There are no details in the whitepaper.

The "1 in 1 trillion" figure covers accidental flagging against the target database, but there is no validation whatsoever of the target database itself. How can you, or I, or any other citizen know whether non-CSAM items are present in it?
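To make concrete what a per-account threshold does and does not bound: with independent per-image errors, the accidental-flagging probability is a binomial tail, and raising the threshold shrinks it by many orders of magnitude. The numbers below are made up for illustration; Apple has not published its per-image false-match rate.

```python
from math import comb

def p_flagged(n_images, p_false_match, threshold):
    """P(at least `threshold` false matches) for X ~ Binomial(n, p),
    computed as 1 - P(fewer than `threshold` matches)."""
    n, p = n_images, p_false_match
    return 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(threshold))

# Made-up numbers: 10,000 photos, 1-in-10,000 per-image false-match rate.
print(p_flagged(10_000, 1e-4, 1))   # ~0.63: flagging on a single match would be useless
print(p_flagged(10_000, 1e-4, 10))  # ~1e-7: a threshold of 10 is a different regime
```

Note what this arithmetic covers: accidental matches *against* the database. It says nothing about whether the database itself contains non-CSAM entries, which is the objection here.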

-------

The NCMEC is a patsy for the police state on this one. It's gross, it's ugly, and it is a terrible outcome for the charity.

In their participation in this program, they make themselves into a front for the CIA, FBI, and DIA forces that are aching for opportunities to crack down on dissent in America. This is an awful, terrible outcome.

--------

The whole thing is an incredibly thin, easily pierced veil for any government. Even if you think the secret police forces of the United States generally do well by citizens, how do you feel about China, or Russia, or Eritrea, or Burma, or Turkmenistan using these tools to flag people who traffic in images with undesirable fingerprints?

DSingularity | 4 years ago

This is a good point. I had believed that Apple's manual review of flagged content implied they compare the images against those in the database. But since Apple never holds the database content itself, outside organizations are effectively uploading bare hashes to Apple, and Apple cannot determine the scope of what those hashes cover.

However, that does not invalidate the fact that Apple is in the loop! It's not just the NCMEC that has to be corrupted, it's also Apple employees. Apple has stated in their whitepaper that they review all flagged content before forwarding it to the NCMEC. If Apple employees forward non-CSAM matches, that is a failure of the reviewers, who have betrayed their duty to prevent authoritarian abuse of this system.