(no title)
ducadveritatem | 4 years ago
If an account uploads multiple images that match known exploitative images and exceeds a threshold, the account is flagged for review by Apple. (Note: the threshold is selected to give a roughly 1 in 1 trillion probability of incorrectly flagging an account.) Once Apple reviews and confirms a match, it is forwarded to the National Center for Missing & Exploited Children (NCMEC) for further action (and presumably referral to law enforcement).
More details in their whitepaper: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
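To make the threshold idea concrete, here is a minimal sketch of the account-level math, assuming (purely for illustration; Apple does not publish a per-image false-match rate, and the parameter values below are made up) that each innocent image independently false-matches the hash database with some small probability:

    # Illustrative sketch only -- not Apple's implementation or parameters.
    from scipy.stats import binom

    def prob_account_wrongly_flagged(n_images: int, per_image_fp: float, threshold: int) -> float:
        # P(at least `threshold` false matches among n_images independent images)
        # is a binomial tail probability.
        return binom.sf(threshold - 1, n_images, per_image_fp)

    # Hypothetical numbers: 10,000 photos, 1-in-a-million per-image false match,
    # threshold of 30 matches before the account is flagged.
    print(prob_account_wrongly_flagged(10_000, 1e-6, 30))

Even with made-up numbers like these, the account-level probability comes out astronomically small, which is the point of setting the threshold well above a single match.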
hapless | 4 years ago
The "1 in 1 trillion" figure is accidental flagging on the target database, but there is no validation whatsoever on the target database. How can you, or I, or any other citizen, know whether non-CSAM items are present in the target database?
-------
The NCMEC is a patsy for the police state on this one. It's gross, it's ugly, and it is a terrible outcome for the charity.
By participating in this program, they make themselves into a front for the CIA, FBI, and DIA, agencies that are aching for opportunities to crack down on dissent in America. This is an awful, terrible outcome.
--------
The whole thing is an incredibly thin, easily pierced veil for any government. Even if you think the secret police forces of the United States generally do well by citizens, how do you feel about China, or Russia, or Eritrea, or Burma, or Turkmenistan using these tools to flag people trafficking images with undesirable fingerprints?
DSingularity | 4 years ago
However, that does not change the fact that Apple is in the loop! It's not just the NCMEC that has to be corrupted - it's also Apple employees. Apple states in the whitepaper that it reviews all flagged content before forwarding it to the NCMEC. If Apple employees forward non-CSAM matches, that is a failure of the reviewers, who have betrayed their duty to prevent authoritarian abuse of this system.
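For what it's worth, the flow the whitepaper describes looks roughly like this (hypothetical names and types, not Apple's actual code):

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class FlaggedAccount:
        account_id: str
        matched_derivatives: List[bytes]  # low-res visual derivatives unlocked once the threshold is crossed

    def process_flagged_account(account: FlaggedAccount,
                                reviewer_confirms_csam: Callable[[FlaggedAccount], bool]) -> str:
        # Per the whitepaper, a human reviewer at Apple inspects the matches first.
        if not reviewer_confirms_csam(account):
            # This review step is the backstop against non-CSAM entries in the
            # hash database ever producing a report.
            return "dismissed: no report filed"
        # Only confirmed matches are forwarded to the NCMEC.
        return "reported to NCMEC"

So two independent parties - whoever curates the hash database and Apple's reviewers - both have to fail (or collude) before a non-CSAM match turns into a report.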