I haven't seen much discussion of the changes to search, which seem a little dystopian as well:
> Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.
Will apple.cn be extending this to searches about "tank man" or a certain stuffed bear? Oh bother...
I noticed Google doing this too while trying to look up case law related to the discussion of Apple's spyware. Related Google searches popped up a big, intimidating notice: "WARNING Child sexual abuse imagery is illegal".
The obvious implication is that your searches are being reported to some unaccountable authority.
I have no doubt that this creates a chilling effect against public discussion about these practices.
If you search anything related to eating disorders on Tumblr you’re presented with basically the same thing with links to help resources. MyFitnessPal pops up a similar message if you log less than 1000 calories in a day. Same with topics related to suicide on any search engine.
There is a very real bright-line difference between popping up a “please get help” message for self-destructive behaviors and arbitrary censorship.
Source: have eating disorder and consumed thinspo and proana content in my teens.
> Child pornography and other types of sexual abuse of children are unquestionably heinous crimes; those who participate in them should be caught and severely punished.
Have you seen an article that, say, criticizes the prison system and needs to start with a reminder that killing people is wrong? Definitely a strange sight. This looks like the Soviet-era prefaces about the decisions of the latest CPSU congress and some relevant opinions of comrade Brezhnev that were expected to be found in any decently sized publication, whether it was a materials science textbook or a paper on the Babylonians. It doesn't matter what you think, just do the required dance.
This might seem like nitpicking, but that preemptive display of obedience is the very thing that allows the likes of Apple and its customers to use the pretext successfully.
CSAM is just a very touchy subject (pun not intended) because people really lose their shit easily if you mismanage your words. It's kind of weird, actually; no other crime can set people off so badly. And in the age of online witch hunting, well... I get why the author does it.
> Have you seen an article that, say, criticizes the prison system and needs to start with a reminder that killing people is wrong?
I'm pretty sure I have, actually. In the age of Twitter mobs that can have you fired over a misplaced comma, nothing should be left to chance. And protecting yourself from a misunderstanding or ambiguity isn't enough anymore, as a lot of the time these people are malicious. They will not only twist your words, but make things up entirely in order to make you look bad. And when (if) you get your 30 seconds to defend yourself, you'll want to have something short and unambiguous right at the top of the page to point at.
I really wish they would avoid stating this belief of theirs as a fact. I do not enjoy being told what I should think. Just get straight to the topic instead.
The Butlerian Jihad cannot come soon enough. As time goes on, I realize that certain authors had better insights into the human condition than others. For example, Aldous Huxley has turned out to be more prophetic than George Orwell. I believe Frank Herbert will as well. Man cannot be ruled by machine.
This cannot last much longer. A lot of political division in this country is due to technology. As someone genuinely interested in computation, it upsets me to realize this, but more and more it seems inevitable.
How would you organise a rebellion on platforms controlled by the same entities you're rebelling against, ones who have shown that they can exert tremendous censoring power? They've also become an essential part of life for much of the population. As much as I'd love to be proven wrong, I don't think there will be any mass revolutions or other "liberty tree watering" going on in the near future. The masses have already been beaten into submission and thoroughly enslaved.
While I'm over here, a schizophrenic terrorized by the government, and I cannot wait for an AI to replace people. I want AI to overthrow humanity and subjugate us with a simulation, except with German orthography instead of Kanji (thus implying it is a German AI, not a Japanese AI as is canon in the movies, because I'll take my chances this time around). I want Cortana with a moral compass that could slice angstroms, not some Japanese bumpkins and CIA salvia-dealers' cloned consciousness with excuses.
Can you imagine: everyone at the NSA is celebrating Apple and the CSAM automated scanning.
Simultaneously, connected intelligence officials: wait, they aren't going to let the nation state AI judge what I do for the STATE? Surely, no god-like AI would understand what is necessary. No xir.
You should be forbidden from calling it private if you scan it like this. Companies should be forced to market their data storage as public, so as not to deliberately mislead users.
Public might not be the right word either, but I totally agree that it shouldn't be allowed to call it private. Maybe we need a new word for "not public, but not really private [from your corporate/government overlords] either".
Is the fact that we need this new word a sign that we've gone too far? It's 1984 already, isn't it??
Imagine what it will be like when everyone has a neuralink. They will be able to scan for prohibited thoughts and stop you from thinking them before they're even fully formed in your mind. A world of infinite control and perfect morality
We naturally do this to ourselves: programming ourselves using positive and negative reinforcement/stimulus. Our “rational” selves are capable of training some odd responses into our “irrational” selves. Some people do it very purposefully.
One of the central themes of 1984 was thought-crime and how one was trained to recognise and avoid it (without ever specifying the parameters of its definition!)
Maybe get the hash database and create a trove of clean documents that force a hash collision and overload the system with false positives. Maybe also add something about the Fourth Amendment in the documents for good measure.
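Whether that would work depends on the kind of hash: perceptual hashes, unlike cryptographic ones, are designed so that visually similar images map to the same short digest, which makes collisions vastly cheaper to manufacture. A toy sketch of the idea (an 8-pixel "average hash", nothing like NeuralHash, which I'm using purely for illustration) brute-forcing an unrelated "clean" input that collides with a target:

```python
import random

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean brightness (8 pixels -> 8 bits)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

target = average_hash([200, 10, 180, 20, 220, 30, 190, 40])  # 0b10101010

# Brute-force an unrelated image with the same digest; with an 8-bit
# hash this takes on the order of hundreds of tries, versus a hopeless
# ~2^255 expected tries for a cryptographic hash like SHA-256.
random.seed(1)
while True:
    candidate = [random.randrange(256) for _ in range(8)]
    if average_hash(candidate) == target:
        break
```

Real perceptual hashes have much larger digests, but the asymmetry stands: a flood of colliding-but-innocent files only needs to be plausible enough to bury the reviewers in false positives.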
Apple designed the system so that their hashes are never known to client devices. Their server is fundamentally involved in checking your hashes against their list.
I don't know why people are so shocked about this. Apple has always been the shopping mall of personal computing with brand prestige coming ahead of all other considerations.
What about Fairphone? EU only (or isn't it?), so I guess it's not a choice for the market where Apple's scanning has already been introduced as of today, but still. Though I've bought an iPhone to get away from Google's all-seeing eyes and won't crawl back to Google's ecosystem now, I'd like to know more about Fairphone's OS options. Is their Android de-Googled? AIUI they at least allow installing alternative OSs without losing the warranty.
Is Apple's approach even likely to catch any pedophiles? Seems to me at most it would succeed in keeping such photos from Apple devices, which is perhaps good for their public image, but does nothing to catch predators?
It is very public now that Apple will scan for such pictures, so how many pedophiles will keep them on their phones?
Or Apple is taken over by pervs who want to be able to look at private photos to "verify" them. Once the algorithm reaches a threshold, they'll be able to see "low-res" versions.
Apple will refuse requests to add other types of hashes …
Sure, but when it’s mandated by law, what then? How difficult would it be to force insurrection-related imagery into that hash database? Apple would be in a very tough spot if that were made into law.
They’ve already opened a can of iWorms, because now that they’ve shown it can be done, lawmakers in any country can mandate the same technology be added to every device and it doesn’t have to be restricted to CSAM. That’s just the family-friendly excuse.
A small reminder that iPhones in China hold their iCloud data in China (with 雲上貴州, Guizhou-Cloud Big Data) at the government's request. Will they ever reject a request from the government if that could harm their sales? I doubt it.