_manifold | 3 years ago

>To use Take It Down, anyone—minors, parents, concerned parties, or adults concerned about their own underage images being posted online—can anonymously access the platform on NCMEC’s site. Take It Down will then generate a hash that represents images or videos reported by users as sexualizing minors, including images with nudity, partial nudity, or sexualized poses. From there, any online platform that has partnered with the initiative will automatically block uploads or remove content matching that hash.

This sounds impressive if you don't know how file hashing works. If a malicious actor wants to get around this, all they would have to do is change a single pixel and/or re-export as a different format.
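The fragility of plain file hashing is easy to demonstrate in a few lines. This toy sketch uses a buffer of raw bytes as a stand-in for image data; flipping one bit plays the role of changing one pixel:

```python
import hashlib

data = bytearray(b'\x80' * 1024)  # stand-in for an image's raw bytes
h1 = hashlib.sha256(data).hexdigest()

data[0] ^= 1  # the "change a single pixel" attack: flip one bit
h2 = hashlib.sha256(data).hexdigest()

print(h1 == h2)  # False: any one-byte change produces a completely new hash
```

Re-exporting in a different format has the same effect, since the encoded bytes change even when the picture looks identical.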


AdamJacobMuller|3 years ago

Not necessarily. There are much newer technologies than simple file hashes now: content-aware image hashing algorithms that are highly resistant to manipulation (re-encoding, resizing, even things like rotation and blur). They are, of course, tunable: the more you want to catch, the higher the false-positive rate. But you can already do much better today than a simple file hash.

Look at https://www.microsoft.com/en-us/photodna and https://openbase.com/python/ImageHash/documentation
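The idea behind libraries like ImageHash can be sketched with a toy "average hash" (aHash), one of the simplest perceptual hashes. This is a minimal illustration using synthetic pixel data and crude block averaging, not how PhotoDNA or any production system actually works, but it shows why a one-pixel edit doesn't change the hash the way it changes a file hash:

```python
# Toy average hash: downscale to an 8x8 grid, then emit one bit per
# cell (brighter or darker than the mean). Real perceptual hashes are
# far more sophisticated; this is only a sketch of the principle.

def average_hash(pixels, hash_size=8):
    """pixels: square 2D list of grayscale values (0-255)."""
    n = len(pixels)
    block = n // hash_size
    # Downscale by block averaging (a crude stand-in for a real resize).
    small = [
        [
            sum(pixels[by * block + y][bx * block + x]
                for y in range(block) for x in range(block)) // (block * block)
            for bx in range(hash_size)
        ]
        for by in range(hash_size)
    ]
    flat = [v for row in small for v in row]
    mean = sum(flat) / len(flat)
    # One bit per cell: above or below the mean brightness.
    return ''.join('1' if v > mean else '0' for v in flat)

def hamming(a, b):
    """Number of differing bits; small distance = perceptually similar."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 64x64 "image": bright left half, dark right half.
img = [[200 if x < 32 else 40 for x in range(64)] for y in range(64)]
h1 = average_hash(img)

# The attack from the top comment: change a single pixel.
img[10][10] = 0
h2 = average_hash(img)

print(hamming(h1, h2))  # 0 -- the perceptual hash is unchanged
```

Matching is then done by thresholding the Hamming distance between hashes rather than demanding exact equality, which is what makes these schemes tolerant of re-encoding and minor edits.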

_manifold|3 years ago

I think it's definitely more useful, especially long term, in a more controlled system where the government agency handling the actual CSAM simply submits hashes of the content for the company (Microsoft, Apple, or whoever else) to add to its database, which it can then use to flag/review suspicious content.

However, the system described in the article is open to the public, and simultaneously privacy/anonymity oriented. I see this as a double-edged sword. While it does protect the identity of legitimate users, it also opens the system up to nefarious actors flooding it with images/videos taken from legitimate content creators on OnlyFans or other sites, potentially getting those creators' content flagged/removed. Even if this simply triggers a manual review, you could feasibly spam the system with so many reports that it grinds to a halt.

zamnos|3 years ago

Good thing it's open source, now I know how much I need to change the image in order for the hash to change!

mikestew|3 years ago

>all they would have to do is change a single pixel and/or re-export as a different format

This sounds plausible if you don't know how perceptual hashing works:

https://en.wikipedia.org/wiki/Perceptual_hashing

mrguyorama|3 years ago

Your own link talks about how perceptual hashing hasn't been proven robust enough for this use case, and it also introduces a new problem: hash collisions, such that you can generate innocuous images that hash to the same perceptual hash as an illicit image.

cyanydeez|3 years ago

Perceptual hashing exists.