romeovs | 4 years ago
They are trying to prevent CSAM images from being stored and distributed using Apple products. If that goal is easily circumvented, the whole motivation for this (anti-)feature becomes invalid.
In a way, this architecture could potentially even make Apple products more attractive for CSAM distributors, since they now have a known way to fly under the radar (something that is arguably harder/riskier on other image-sharing platforms, where the matching happens server-side).
One reasonable strategy Apple could use against that is to continually fine-tune the NeuralHash algorithm to catch more and more offenders. If that works reasonably well, it might deter criminals from using the platform, because an image that flies under the radar now might not fly under the radar in the future.
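To make the evasion trade-off concrete: NeuralHash is a proprietary perceptual hash, so the sketch below uses a much simpler stand-in (a classic "average hash" over raw pixel values) purely to illustrate the idea that small perturbations survive matching while larger edits evade it. The image data and threshold here are made up for the example.

```python
# Illustrative sketch only: NeuralHash is proprietary, so this uses a
# simple "average hash" to show how perceptual matching behaves.
# Images are assumed to be tiny grayscale 2D lists of pixel values (0-255).

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is above the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of differing bits; small distance = likely the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# A "known" image, a lightly perturbed copy, and a heavily edited copy.
original  = [[10, 200], [220, 30]]
perturbed = [[12, 198], [219, 33]]   # minor pixel noise
rearranged = [[200, 10], [30, 220]]  # heavier edit (pixels moved around)

h_orig = average_hash(original)
print(hamming_distance(h_orig, average_hash(perturbed)))   # -> 0 (still matches)
print(hamming_distance(h_orig, average_hash(rearranged)))  # -> 4 (evades the match)
```

The cat-and-mouse dynamic in the comment above amounts to tightening this kind of matcher so that fewer perturbations land below the match threshold, at the cost of more false positives on unrelated images.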
NB. I'm not trying to say Apple is doing the right thing here, especially since the above arguments put the efficacy of this architecture under scrutiny.
nonbirithm | 4 years ago
If what Apple is aiming for is a more complete version of E2EE on their servers, maybe that's just an unintended consequence of the implementation, and the very reason why they're surprised that this received so much pushback. If Apple wanted to offer encryption for all user files in iCloud and leave no capability to decrypt the files themselves, they'd still need to be able to detect CSAM to protect themselves from liability. In that case, scanning on the device would be the only way to make it work.
If that were the case, I still wouldn't believe that moving the scan to the device fundamentally changes anything. Apple has to conduct a scan regardless, or they'll become a viable option for criminals to store CSAM. But in Apple's view, their implementation would mean they'd likely be the first cloud company that could claim to have zero knowledge of the data on their servers while still satisfying the demands of the law.
Supposing that's the case, maybe what it would demonstrate is that no matter how you slice it, trying to offer a fully encrypted, no-knowledge solution for storing user data is fundamentally incompatible with societal demands.
But since Apple didn't provide such an explanation, we can only guess at their strategy. They could have done a much better job of describing their motivations, instead of hoping that public sentiment would let it pass the way other scanning mechanisms actually have in the past.