romeovs | 4 years ago
But couldn't this also be used to circumvent the CSAM scanning, by converting images that are in the CSAM database into visually similar images that no longer match the hash? That would effectively defeat the CSAM scanning Apple and others are trying to put in place, and render the system moot.
One could argue that these spoofed images could also be added to the CSAM database, but what if you spoof them to have hashes of extremely common images (like common memes)? Adding memes to the database would render the whole scheme unmanageable, no?
Or am I missing something here?
So we'd end up with a system that:
1. Can't be reliably used to track actual criminal offenders (they'd just be able to hide) without rendering the whole database useless.
2. Can be used to attack anyone by making it look like they have criminal content on their iPhones.
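To make point 1 concrete: perceptual hashes are deliberately tolerant of small pixel changes, but that tolerance has edges, and nudging a pixel across a decision boundary flips hash bits even though the image looks unchanged. The toy sketch below uses a simple "average hash" (not Apple's NeuralHash, whose internals differ) over a hypothetical 8x8 grayscale image to show a one-pixel, visually imperceptible edit changing the hash:

```python
# Toy illustration (NOT NeuralHash): an "average hash" assigns one bit
# per pixel -- 1 if the pixel is brighter than the image mean, else 0.
# A tiny edit to a pixel sitting near the mean flips a hash bit.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale ints (0-255). Returns a 64-bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

# Hypothetical image: a simple brightness gradient.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
h1 = average_hash(img)

# Perturb one pixel that sits just below the mean by an imperceptible amount.
img2 = [row[:] for row in img]
img2[3][7] += 3
h2 = average_hash(img2)

flipped = sum(a != b for a, b in zip(h1, h2))
print(flipped)  # -> 1: the hashes no longer match exactly
```

A real attack on NeuralHash would use gradient-based perturbation rather than hand-picked pixels, but the principle is the same: small, targeted changes can push an image out of (or into) a hash bucket.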
chongli | 4 years ago
Wouldn't it be easier for offenders to avoid Apple products? That requires no special computer expertise and involves no risk on their part.
romeovs | 4 years ago
They are trying to prevent CSAM images from being stored and distributed using Apple products. If that goal is easily circumvented, the whole motivation for this (anti-)feature becomes invalid.
In a way, this architecture could potentially even make Apple products more attractive to CSAM distributors, since they now have a known way to fly under the radar (something that is arguably harder/riskier on other image-sharing platforms, where the matching happens server-side).
One reasonable counter-strategy for Apple would be to constantly fine-tune the NeuralHash algorithm to catch more and more offenders. If that works reasonably well, it might deter criminals from the platform, because an image that flies under the radar now might not fly under the radar in the future.
NB. I'm not trying to say Apple is doing the right thing here, especially since the above arguments put the efficacy of this architecture under scrutiny.