top | item 28159154

fredsir | 4 years ago

Except the list of hashes is a black box. Today it's child porn, tomorrow it's anti-government or something similar. It's a slippery slope that's only going in one direction.

imwillofficial | 4 years ago

Agreed, but as of today, we are at the top of the slope. Maybe we slip, maybe we don’t.

As implemented, I agree with Apple

fredsir | 4 years ago

Where there is a slope, riders will come. The only way to prevent people from riding the slope is to never build the slope in the first place.

robertoandred | 4 years ago

“Anti-government” is absurdly vague. Not how this system works.

fredsir | 4 years ago

> “Anti-government” is absurdly vague.

That's the point. It's not a list open to the public; it's secret and controlled by a few. It can contain whatever they want it to contain.

> Not how this system works.

That's absurdly naive.

stetrain | 4 years ago

But Apple doesn't report to authorities based only on a hash match.

themaninthedark | 4 years ago

They hash the files on your device.

The hashes are then sent out for comparison against the hash table.

Right now they are only comparing against the CSAM table, but they could add other tables later.

Also, how are the hashes sent? Can they be intercepted?
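The matching step described above can be sketched in a few lines. This is only a toy illustration of hash-list matching: Apple's actual system uses a perceptual hash (NeuralHash) and a blinded private-set-intersection protocol, not plain SHA-256 against a local set, and the function names here are invented for the example.

```python
import hashlib
from pathlib import Path

# Toy sketch: flag files whose hash appears in an opaque hash table.
# NOT Apple's real scheme -- real systems use perceptual hashes and
# cryptographic protocols so the device never sees the table itself.

def file_hash(path: Path) -> str:
    """Cryptographic hash of the file's bytes (any bit flip changes it)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flagged(paths: list[Path], known_hashes: set[str]) -> list[Path]:
    """Return the files whose hashes appear in the known-hash table."""
    return [p for p in paths if file_hash(p) in known_hashes]
```

The slippery-slope worry in the thread maps directly onto `known_hashes`: the matching code is identical no matter what the table contains, so the policy question is entirely about who controls the table's contents.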

stetrain | 4 years ago

A human at Apple reviews thumbnails of the flagged images to see if they are actually CSAM. You do not get reported just because of a hash match.