top | item 28160368

Scanning “private” content

267 points | lisptime | 4 years ago | lwn.net | reply

77 comments

[+] AlexandrB|4 years ago|reply
I haven't seen much discussion of the changes to search, which seem a little dystopian as well:

> Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

Will apple.cn be extending this to searches about "tank man" or a certain stuffed bear? Oh bother...

[+] nullc|4 years ago|reply
I noticed google doing this too while trying to look up case law related to the discussion of apple's spyware. Related google searches popped up a big intimidating notice: "WARNING Child sexual abuse imagery is illegal".

The obvious implication is that your searches are being reported to some unaccountable authority.

I have no doubt that this creates a chilling effect against public discussion about these practices.

[+] Spivak|4 years ago|reply
If you search anything related to eating disorders on Tumblr you’re presented with basically the same thing with links to help resources. MyFitnessPal pops up a similar message if you log less than 1000 calories in a day. Same with topics related to suicide on any search engine.

There is a very real bright-line difference between popping up a “please get help” message for self-destructive behaviors and arbitrary censorship.

Source: have eating disorder and consumed thinspo and proana content in my teens.

[+] shocks|4 years ago|reply
This is interesting because it’s trivial to use someone else’s Siri and make it search whatever you want.

(I can trigger my partner's Siri with 75% reliability if I just speak in a high-pitched voice.)

[+] ogurechny|4 years ago|reply
> Child pornography and other types of sexual abuse of children are unquestionably heinous crimes; those who participate in them should be caught and severely punished.

Have you seen an article that, say, criticizes the prison system and needs to start with a reminder that killing people is wrong? Definitely a strange sight. This looks like Soviet-era prefaces about the decisions of the latest CPSU congress and some relevant opinions of comrade Brezhnev that were expected to be found in any decently sized publication, whether it was a materials science textbook or a paper on the Babylonians. It doesn't matter what you think, just do the required dance.

This might seem like nitpicking, but that preemptive display of obedience is the very thing that allows the likes of Apple and its customers to use the pretext successfully.

[+] bruce343434|4 years ago|reply
CSAM is just a very touchy subject (pun not intended) because people really lose their shit easily if you mismanage your words. It's kind of weird, actually; no other crime can set people off so badly. And in the age of online witch hunting, well... I get why the author does it.
[+] franga2000|4 years ago|reply
> Have you seen an article that, say, criticizes the prison system and needs to start with a reminder that killing people is wrong?

I'm pretty sure I have, actually. In the age of Twitter mobs that can have you fired over a misplaced comma, nothing should be left to chance. And protecting yourself from a misunderstanding or ambiguity isn't enough anymore, as a lot of the time these people are malicious. They will not only twist your words, but make things up entirely in order to make you look bad. And when (if) you get your 30 seconds to defend yourself, you'll want to have something short and unambiguous right at the top of the page to point at.

[+] xdennis|4 years ago|reply
I think it's because it's very easy to just say: "wait, you're against Apple fighting CP? Are you a pedophile?". People DO think like that.
[+] h_anna_h|4 years ago|reply
I really wish they would avoid stating this belief of theirs as a fact. I do not enjoy being told what I should think. Just get straight to the topic instead.
[+] iammisc|4 years ago|reply
The Butlerian Jihad cannot come soon enough. As time goes on, I realize that certain authors had better insights into the human condition than others. For example, Aldous Huxley has turned out to be more prophetic than George Orwell. I believe Frank Herbert will as well. Man cannot be ruled by machine.

This cannot last much longer. A lot of political division in this country is due to technology. As someone genuinely interested in computation, it upsets me to realize this, but more and more it seems inevitable.

[+] telxoss|4 years ago|reply
I think the mistake here is that maybe man can't be ruled by machine, but men using machines to rule other men...

It's already game, set, match.

[+] pintxo|4 years ago|reply
Endless wars, right and wrong language, always on connected devices monitoring us constantly? Orwell seems to have a point.
[+] Nexxius|4 years ago|reply
So what you are really saying here is that "there is no sanctuary"?
[+] userbinator|4 years ago|reply
How would you organise a rebellion on platforms controlled by the same entities you're rebelling against, ones who have shown that they can exert tremendous censoring power? They've also become an essential part of life for much of the population. As much as I'd love to be proven wrong, I don't think there will be any mass revolutions or other "liberty tree watering" going on in the near future. The masses have already been beaten into submission and thoroughly enslaved.
[+] wydfre|4 years ago|reply
While I'm over here, a schizophrenic terrorized by the government, and I cannot wait for an AI to replace people. I want AI to overthrow humanity and subjugate us with a simulation, except with German orthography instead of Kanji (thus implying it is a German AI, not a Japanese AI as is canon in the movies, because I'll take my chances this time around). I want Cortana with a moral compass that could slice angstroms, not some Japanese bumpkins and CIA salvia-dealers' cloned consciousness with excuses.

Can you imagine: everyone at the NSA is celebrating Apple and the CSAM automated scanning.

Simultaneously, connected intelligence officials: wait, they aren't going to let the nation state AI judge what I do for the STATE? Surely, no god-like AI would understand what is necessary. No xir.

[+] gentleman11|4 years ago|reply
You should be forbidden from calling it private if you scan it like this. Companies should be forced to market their data storage as public so as not to deliberately mislead users.
[+] franga2000|4 years ago|reply
Public might not be the right word either, but I totally agree that it shouldn't be allowed to call it private. Maybe we need a new word for "not public, but not really private [from your corporate/government overlords] either".

Is the fact that we need this new word a sign that we've gone too far? It's 1984 already, isn't it??

[+] voldacar|4 years ago|reply
Imagine what it will be like when everyone has a neuralink. They will be able to scan for prohibited thoughts and stop you from thinking them before they're even fully formed in your mind. A world of infinite control and perfect morality
[+] robocat|4 years ago|reply
We naturally do this to ourselves: programming ourselves using positive and negative reinforcement/stimulus. Our “rational” selves are capable of training some odd responses into our “irrational” selves. Some people do it very purposefully.

One of the central themes of 1984 was thought-crime and how one was trained to recognise and avoid it (without ever specifying the parameters of its definition!)

[+] thordenmark|4 years ago|reply
Someone else's iCloud photos got into mine once, just random sunsets from some place I'd never been. It was weird.

How can we trust this?

[+] anonymousiam|4 years ago|reply
Maybe get the hash database and create a trove of clean documents that force a hash collision and overload the system with false positives. Maybe also add something about the Fourth Amendment in the documents for good measure.
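Forcing collisions is plausible because perceptual hashes deliberately ignore small pixel differences. A toy sketch with a generic 16-pixel "average hash" (this is not Apple's NeuralHash; the images and values here are made up):

```python
# Toy "average hash": threshold each grayscale pixel at the image mean.
# Very different pixel data can share the same hash bits by design.

def average_hash(pixels):
    """pixels: flat list of 16 grayscale values (a 4x4 image)."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

img_a = [200, 10, 200, 10] * 4   # stark checkerboard
img_b = [150, 60, 150, 60] * 4   # much softer image, same light/dark pattern

print(average_hash(img_a) == average_hash(img_b))  # True: a collision
```

Real perceptual hashes are far more elaborate, but the principle that dissimilar inputs can be crafted to match is the same.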
[+] josephcsible|4 years ago|reply
> get the hash database

Apple designed the system so that their hashes are never known to client devices. Their server is fundamentally involved in checking your hashes against their list.
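One way to see why the client can't learn the list: the server can blind every database hash with a secret exponent, so matching only works with the server's participation. A heavily simplified sketch (this is not Apple's actual PSI protocol; the modulus, key handling, and inputs are all illustrative):

```python
# Toy blinded-hash matching: the server raises values to a secret exponent,
# so a client holding only raw hashes can never test membership itself.
import hashlib
import secrets

P = 2**127 - 1  # toy modulus; real protocols use a proper cryptographic group

def h(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

server_key = secrets.randbelow(P - 2) + 2  # known only to the server

# Server blinds its database of known hashes.
database = [h(b"known-image-1"), h(b"known-image-2")]
blinded_db = {pow(x, server_key, P) for x in database}

def server_check(client_hash: int) -> bool:
    # Only the server can apply server_key, so only it can test membership.
    return pow(client_hash, server_key, P) in blinded_db

print(server_check(h(b"known-image-1")))  # True: in the database
print(server_check(h(b"vacation.jpg")))   # False: not in the database
```

In Apple's published design the blinding goes further (the server also shouldn't learn non-matching client hashes), but the asymmetry is the same: the raw hash list never reaches the device.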

[+] gentleman11|4 years ago|reply
You are assuming people will not be marked for life for their false positives
[+] anigbrowl|4 years ago|reply
I don't know why people are so shocked about this. Apple has always been the shopping mall of personal computing with brand prestige coming ahead of all other considerations.
[+] lettergram|4 years ago|reply
Librem 5 and pinephone are the future
[+] tannhaeuser|4 years ago|reply
What about Fairphone? EU only (or isn't it?), so I guess it's not a choice for the market where Apple's scanning has been introduced already as of today, but still. Though I've bought an iPhone to get away from Google's all-seeing eyes and won't crawl back to Google's ecosystem now, so I'd like to know more about Fairphone's OS options. Is their Android de-Googled? AIUI they at least allow installing alternative OSs without losing warranty.
[+] horrified|4 years ago|reply
Is Apple's approach even likely to catch any pedophiles? Seems to me at most it would succeed in keeping such photos from Apple devices, which is perhaps good for their public image, but does nothing to catch predators?

It is very public now that Apple will scan for such pictures, so how many pedophiles will keep them on their phones?

[+] intricatedetail|4 years ago|reply
Or Apple is taken over by pervs who want to be able to look at private photos to "verify" them. Once the algo reaches a threshold, they'll be able to see "low-res" versions.
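For what it's worth, the published design enforces that threshold cryptographically: roughly, the key that decrypts the low-res "visual derivatives" is split so it can only be reconstructed once enough matches accumulate. A toy Shamir-style sketch (illustrative only; Apple's safety-voucher scheme is considerably more involved):

```python
# Toy threshold secret sharing: any THRESHOLD shares reconstruct the
# secret via Lagrange interpolation; fewer reveal essentially nothing.
import random

PRIME = 2**61 - 1
THRESHOLD = 3
SECRET = 424242  # stands in for the decryption key for the low-res images

# Random polynomial of degree THRESHOLD-1 with SECRET as constant term.
coeffs = [SECRET] + [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]

def share(x):
    """One share: the polynomial evaluated at point x."""
    return x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (0 - xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

shares = [share(x) for x in (1, 2, 3, 4, 5)]  # one share per "match"
print(reconstruct(shares[:THRESHOLD]) == SECRET)  # True: enough matches
```

Below the threshold the interpolation yields an unrelated value, which is the mechanism meant to keep anyone (including Apple) from peeking at accounts with only a couple of matches.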
[+] drivingmenuts|4 years ago|reply
Apple will refuse requests to add other types of hashes …

Sure, but when it’s mandated by law, what then? How difficult would it be to force insurrection-related imagery into that hash database? Apple would be in a very tough spot if that were made into law.

They’ve already opened a can of iWorms, because now that they’ve shown it can be done, lawmakers in any country can mandate the same technology be added to every device and it doesn’t have to be restricted to CSAM. That’s just the family-friendly excuse.

[+] mmis1000|4 years ago|reply
A small reminder that iPhones in China store their iCloud data in China (with 雲上貴州, Guizhou-Cloud Big Data) at the request of the government. Will they ever reject a request from the government if that could hurt their sales? I doubt it.
[+] intricatedetail|4 years ago|reply
Apple has no way to know what image a hash has been derived from. So China can give them a hash of Winnie the Pooh and claim it is CSAM. Apple won't know.
[+] midmagico|4 years ago|reply
Spot the Apple employee in the comments who wasn't the one who leaked the internal NCMEC memo.
[+] TausAmmer|4 years ago|reply
Sending images of authority abusing its powers? No no, can't do that, back to gulag.
[+] doctoboggan|4 years ago|reply
Does anyone know if this also scans files in your iCloud-synced Desktop or Documents folder?
[+] tpush|4 years ago|reply
It does not. Only photos (designated to be) uploaded to iCloud Photo Library.
[+] zimbatm|4 years ago|reply
iCloud is not end-to-end encrypted, so it's likely that they already do. They can do it without your knowledge and without having to push a software update.
[+] villgax|4 years ago|reply
This is something we almost certainly didn't raise any issues about with regard to Google Drive either.