twhb|3 years ago
The abandoned plan was perceptual hashing, which should return the same hash for very similar photos, while the new one is a checksum, which should return the same hash only for identical photos. I don’t think that invalidates the point, but it does seem relevant. It certainly makes the new system much less useful for CSAM scanning or for enforcing local dictators' whims, since a checksum is trivial to defeat if you actually try: changing a single byte of the file changes the hash.
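To illustrate the distinction, here is a minimal sketch using a toy "average hash" (a simple perceptual hash, not Apple's NeuralHash) next to a SHA-256 checksum. The 4x4 "images" and the `average_hash`/`checksum` helpers are invented for the example; a slightly brightened copy of an image keeps the same perceptual hash but gets a completely different checksum.

```python
import hashlib

# Toy 4x4 grayscale "images" (pixel values 0-255).
# img_b is img_a with a small uniform brightening -- "very similar".
img_a = [
    [200, 198,  60,  58],
    [201, 199,  61,  59],
    [ 55,  57, 190, 192],
    [ 54,  56, 191, 193],
]
img_b = [[px + 2 for px in row] for row in img_a]

def average_hash(img):
    """Toy perceptual hash: one bit per pixel, set if the pixel
    is brighter than the image's mean brightness."""
    pixels = [px for row in img for px in row]
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if px > mean else "0" for px in pixels)
    return f"{int(bits, 2):04x}"

def checksum(img):
    """Cryptographic checksum: any change to any pixel flips the digest."""
    raw = bytes(px % 256 for row in img for px in row)
    return hashlib.sha256(raw).hexdigest()

print(average_hash(img_a) == average_hash(img_b))  # True  (similar photos match)
print(checksum(img_a) == checksum(img_b))          # False (only identical bytes match)
```

This is why a checksum-based duplicate finder is easy to defeat deliberately, while a perceptual hash survives small edits like recompression or brightness changes.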
nomilk|3 years ago
Although the system was starting with CSAM material, it wasn't clear which other illegal activities Apple would assist governments in tracking. In countries where [being gay is illegal](https://www.humandignitytrust.org/lgbt-the-law/map-of-crimin...), having Apple employees aid law enforcement by pointing out photographic evidence of unlawful behaviour (for example, a man hugging his husband) would have been a recipe for grotesque human rights abuses.
With photos encrypted, Apple can't be pressured to hire human reviewers to inspect them, and thus can't be pressured by governments enforcing absurd laws to pass on information about who might be engaging in "unlawful" activities.
[1] https://www.eff.org/deeplinks/2021/08/apples-plan-think-diff...
drbawb|3 years ago
Is there any proof they actually abandoned this? NeuralHash seems alive and well in iOS 16[1]. Supposedly the rest of the machinery around comparing these hashes against a blinded database, encrypting the matches, and sending them to Apple et al. for review has all been axed. However, that's not exactly trivial to verify, since Photos is closed source.
[1]: https://support.apple.com/guide/iphone/find-and-delete-dupli...
TimTheTinker|3 years ago
miohtama|3 years ago