item 28118739

briefcomment | 4 years ago

As long as false positives are a thing, there are very practical reasons for switching, especially for those who have kids.

I wouldn't be surprised if a completely innocent false positive gets you put on a list indefinitely, with little recourse.


whisps | 4 years ago

False positives of the kind you're thinking of aren't possible: it's checking for hashes that match known bad images, not running machine learning/image detection to decide whether the photo you just took contains bad content. The issue is that there's nothing stopping Apple/the government from marking anything it finds objectionable, like anti-government free speech, as a Bad Image, not just CSAM.
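The matching model described above can be sketched in a few lines. This is only an illustration of the "compare against a known list" idea: the blocklist contents and the use of SHA-256 are stand-ins here, since the real system uses a perceptual NeuralHash rather than a cryptographic hash.

```python
import hashlib

# Hypothetical blocklist of known-bad image hashes (hex digests).
# Apple's system uses a perceptual NeuralHash, not SHA-256; this
# sketch only shows the "match against a known list" structure.
BLOCKLIST = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST

print(is_flagged(b"known-bad-image-bytes"))  # True: exact byte match
print(is_flagged(b"a brand-new photo"))      # False: not on the list
```

Nothing here inspects the photo's content; the only question asked is whether its hash is on the list, which is the distinction the comment is drawing.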

simion314 | 4 years ago

The thing is, Apple uses a custom perceptual hash with parameters generated by a neural network. As another article shows, you can get colliding hashes when certain color patterns and shadows match. Also, the threshold they mentioned is secret, so it could be 1 or 2, or it could change in the future.
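The reason collisions like that are possible is that perceptual hashes are deliberately tolerant of small changes. A toy sketch with a naive "average hash" (not NeuralHash, which is a neural network, but the same design principle) shows two slightly different pixel arrays landing on the identical hash:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    above the mean brightness. Real NeuralHash is a neural network;
    this only illustrates why similar inputs share a hash."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

original = [10, 200, 30, 220, 15, 210, 25, 205]   # stand-in pixel values
tweaked  = [12, 198, 28, 223, 14, 212, 26, 204]   # lightly edited copy

print(average_hash(original) == average_hash(tweaked))  # True: same hash
```

That robustness is a feature for catching re-encoded copies, but it is exactly what makes adversarially constructed collisions (the "color patterns and shadows" case) possible in a way they never are for cryptographic hashes.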

watt | 4 years ago

Once the policy decision is made that it can run some kind of scanning, it opens the doors for any kind of scanning. Today it's that "neural hash", tomorrow it's going to do something even more invasive.

briefcomment | 4 years ago

Really, are the hashes sufficiently unique? The objections I saw to this news were along the lines that random images could be manipulated to have hashes matching the flagged ones, in a way undetectable to the naked eye.

throwprvcyaway | 4 years ago

Doesn't the hash change by exporting a photo as a new file type or by changing a few pixels in photoshop?
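For a cryptographic hash like MD5 or SHA-256, yes: any edit, even one flipped bit, produces a completely different digest (the avalanche effect), which is why perceptual hashes are used instead. A minimal demonstration, with arbitrary bytes standing in for an image file:

```python
import hashlib

photo = bytes(range(256)) * 4      # stand-in for an image file's bytes
edited = bytearray(photo)
edited[0] ^= 1                     # flip a single bit ("change one pixel")

h1 = hashlib.md5(photo).hexdigest()
h2 = hashlib.md5(bytes(edited)).hexdigest()
print(h1 == h2)  # False: the digest changes completely
```

Perceptual hashes like NeuralHash are designed to survive exactly these edits (re-export, recompression, small pixel changes), which is the trade-off the thread is circling: robustness to evasion also means robustness is available to collision-crafters.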

If this was the FINAL solution to catch every last child pornographer in one glorious roundup MAYBE it would be worth the massive risk of authoritarian abuse but this algorithm sounds stupidly easy to get around for the deviants while still throwing our collective privacy under the bus.

thekyle | 4 years ago

False positives are possible. Apple states that the system has about a 1 in 1 trillion chance per year of incorrectly flagging a given account.
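A back-of-envelope check of what that figure implies at scale, assuming (this is my assumption, not Apple's statement) that accounts flag independently and taking a round one billion accounts as the order of magnitude:

```python
# Apple's published figure: ~1-in-1-trillion chance per account per year
# of an incorrect flag. Account count is a rough assumption for scale.
p_account_year = 1e-12
accounts = 1e9

expected_flags_per_year = p_account_year * accounts
print(expected_flags_per_year)  # 0.001, i.e. ~one false flag per 1000 years
```

If the per-account figure is accurate, accidental flags would be rare even fleet-wide; the thread's concern is less the random error rate than crafted collisions and list abuse, which this number says nothing about.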

acuozzo | 4 years ago

> it's checking for hashes that match known bad images

Many of the hashes provided by the NCMEC are MD5. There are going to be false positives left and right.
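Worth separating two failure modes for a 128-bit hash like MD5. A birthday-bound estimate (the standard approximation, sketched here) shows accidental collisions stay negligible even at enormous photo counts; MD5's real weakness is that collisions can be constructed deliberately, which is the abuse scenario:

```python
import math

def collision_prob(n, bits):
    """Birthday-bound approximation: probability of at least one
    random collision among n items in a space of 2**bits hashes,
    P ~= 1 - exp(-n*(n-1) / 2**(bits+1))."""
    return -math.expm1(-n * (n - 1) / 2 ** (bits + 1))

# Even a trillion random images barely dent a 128-bit space...
print(collision_prob(10**12, 128) < 1e-6)  # True: accidental risk is tiny
# ...but MD5 collisions can be *engineered* cheaply, so an attacker
# could craft an innocuous file matching a listed hash on purpose.
```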