top | item 28119295

throwprvcyaway | 4 years ago

Doesn't the hash change by exporting a photo as a new file type or by changing a few pixels in Photoshop?

If this were the FINAL solution to catch every last child pornographer in one glorious roundup, MAYBE it would be worth the massive risk of authoritarian abuse. But this algorithm sounds stupidly easy for the deviants to get around, while still throwing our collective privacy under the bus.

ratww | 4 years ago

This is a PhotoDNA hash: a perceptual hash of the image content, not a cryptographic hash of the file's bytes. It is designed to survive exactly those kinds of edits:

> In the same way that PhotoDNA can match an image that has been altered to avoid detection, PhotoDNA for Video can find child sexual exploitation content that’s been edited or spliced into a video that might otherwise appear harmless

https://en.wikipedia.org/wiki/PhotoDNA
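The distinction can be sketched in a few lines. PhotoDNA itself is proprietary, so this uses a much simpler analogous technique, an "average hash" (aHash), purely for illustration: a cryptographic hash diverges completely when a single byte changes, while a perceptual hash of the pixel content stays close under small edits.

```python
# Sketch: cryptographic file hash vs. a simple perceptual hash.
# NOTE: PhotoDNA is proprietary; average_hash below is a toy stand-in
# (aHash), not the actual PhotoDNA algorithm.
import hashlib

def average_hash(pixels):
    """Perceptual hash: one bit per pixel, set if brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two integer hashes."""
    return bin(a ^ b).count("1")

# An 8x8 synthetic grayscale "image", and a copy with a couple of pixels
# nudged, mimicking a re-encode or a small Photoshop edit.
img = [[(x * y) % 256 for x in range(8)] for y in range(8)]
edited = [row[:] for row in img]
edited[0][0] = (edited[0][0] + 3) % 256
edited[4][4] = (edited[4][4] + 2) % 256

# Cryptographic hashes of the raw bytes diverge completely...
h1 = hashlib.sha256(bytes(p for row in img for p in row)).hexdigest()
h2 = hashlib.sha256(bytes(p for row in edited for p in row)).hexdigest()
print(h1 == h2)  # False

# ...while the perceptual hashes stay within a small Hamming distance,
# so a near-duplicate can still be matched against a known hash.
d = hamming(average_hash(img), average_hash(edited))
print(d)  # small for this example
```

Real systems like PhotoDNA are far more robust (resilient to resizing, cropping, and recoloring), but the matching principle is the same: compare distances between content-derived fingerprints rather than test byte-level equality.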