Since this news broke a few weeks ago, I've seen many people suggesting that Apple miscalculated and might be surprised by the push-back they're receiving; that some misunderstanding may have occurred.
I find it exceedingly difficult to imagine that one of the most sophisticated companies in the world, with some of the brightest minds out there, did not consider and calculate this precisely; that there is any way any of this has come as a surprise to Apple. Extending Apple the benefit of the doubt does not seem possible in this case.
Yesterday we saw OnlyFans exit the adult industry. Two weeks ago we saw Apple exit the privacy industry.
EDIT:
Some are questioning whether Apple was ever in the privacy industry. That's a good question. Even though their devices were never perfectly secure and could be compromised, I think they were certainly in the privacy industry in the sense that they marketed an intention to make their devices as private and secure as possible[0]. Which is basically all a consumer can ask of a computer company.
[0] https://9to5mac.com/wp-content/uploads/sites/6/2019/01/DwGoq...
I think you're underestimating the degree to which companies have their own internal logic that has nothing to do with the external reality. That tendency increases the larger a company grows. If you're in a one-room house, you can always see outside. If you're inside a giant office building, mostly you see the building.
Did somebody raise the concern of push-back? I'm sure. But the moral questions around CSAM were settled long ago internally. When I was at Twitter fighting abuse, the CSAM stuff was handled by a separate group. My boss called them The Department of Mysteries because we almost never saw them or spoke to them. The group was led by a serious person, an ex-FBI agent or something like that. They did what they did and we were all ok with it and grateful for it, because that shit is horrific and we didn't want it on our platform and we didn't want to have to deal with it ourselves.
My cousin was a PO for sex offenders, and one of our regular discussion topics at family reunions was how sex offenders were way more technologically savvy than a state parole department. How they really needed more help in making sure offenders weren't reoffending while on parole, while also not forcing them to just not use computers and phones altogether. If even I've heard this, I'm sure that Apple execs have heard it from law enforcement a zillion times.
It's also clear Apple put a lot of thought into addressing the privacy concerns for this. Technologically, it's sophisticated, impressive.
So I can easily believe the people at Apple said, "Sure, there are reasonable concerns, but we think we have addressed them." And that they're surprised by the level of sustained pushback.
>> that one of the most sophisticated companies in the world
Microsoft Zune. The Super League. 47 Ronin. Wonder Woman 1984. Google+. Big companies make big mistakes. No matter how many focus groups you collect, no matter how many phone surveys you do, things can go wrong.
I think many at Apple really are surprised by this, and I put it down to a failure of elite consensus.
I think that a number of folks at high levels in the SV executive suite have accepted a manufactured consensus along with their peers in Washington, and that consensus is something like: "people don't care about the privacy of what's on their device and they'll put up with anything to stop CSAM, even if it means scanning personal backups and local files (as opposed to shared files.)"
This seems like a reasonable thing to believe, since server-side scanning of (mostly shared) files has been going on for years and nobody has pushed back very hard on it. But what I think the consensus missed is that the reason for this lack-of-pushback is that nobody in the wider world had really been asked to weigh in on it before. It was something that a few elite tech busybodies were aware of, and most people accepted the idea that providers needed to check out photos that lived (unencrypted) on their servers. Apple accepted this logic and extended it unthinkingly beyond shared photos to unshared private photo libraries on the user's personal device (even if they are staged for backup as part of the iCloud Photos synchronization service, which is just a policy choice.) This was a second mistake because it assumed that because users mostly ignored the scanning of shared server-hosted files, they had somehow given consent to having their private files searched on their device. I don't think they had.
Overall, this announcement is the first time anyone has attempted to have an actual public debate to see how real users feel about this kind of surveillance, particularly automated surveillance of private photos (and an automated system with potential flaws.) Apple's mistake here was to assume that their user base had already given consent -- when they'd just never been asked. It's a very human mistake to make, frankly. The question is whether Apple will listen to their users or if they'll double down and push this through against their users' pushback. I can forgive Apple for misunderstanding their users once, but continuing down this path will be a lot harder to understand.
ETA: To illustrate how much more pervasive Apple's surveillance is than the industry standard (setting aside the PSI protocols), consider this quote from an EU Parliament briefing: "Others, such as Dropbox, Google and Microsoft perform scans for illegal images, but 'only when someone shares them, not when they are uploaded'." (I can only trust that this is factually true.) In this sense, Apple's move to scan all photos in your library is a significant functional escalation. https://www.europarl.europa.eu/RegData/etudes/BRIE/2020/6593...
It's easy for a large company, especially its leadership, to be out of touch with their customers. Many instances of this are listed in another reply, but it's not all that unusual. I would definitely believe that Apple thought this solution was more private than the kind of scanning Google and Facebook do. Especially now that they're drinking their own Kool-Aid about being a "services" company, where the line between "my device" and "your cloud" starts to blur.
Edit: It's also notable that news of this was leaked before Apple was able to officially announce anything. This means that Apple's marketing department was not able to control the tone/narrative as well as they normally do. The first thing many people heard was "Apple will be scanning your phone" without any of the nuances that came later.
> Yesterday we saw OnlyFans exit the adult industry. Two weeks ago we saw Apple exit the privacy industry.
One of those two statements I disagree with.
Apple exiting the privacy industry would look like this to me: "we've decided from now on that, like almost all other cloud providers, we'll give ourselves access to your stuff for (ahem) legitimate purposes".
Not like this: "we'll implement NeuralHash on the client device rather than on the servers, and then do a cryptographic private-set-intersection protocol with them on the server".
That's a lot of cost and effort to prevent themselves, as a company, from misusing the CSAM detector for other purposes. If there are government agencies involved, Apple is also making sure that they can't just use this as a backdoor to get access to everyone's files. It's as if the government said "we need to prevent child abuse, give us a backdoor" and Apple went "ok, we'll give you a small backdoor that's ok at detecting abuse images and nothing more" - if the government was expecting to use the backdoor for more than this, they'll be disappointed.
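To make the division of labour concrete, here's a minimal toy sketch of the general shape being described: this is emphatically not Apple's actual protocol. The real system uses a perceptual NeuralHash, private set intersection and threshold secret sharing; all of that is replaced below by an ordinary hash and a plain count, purely to show which side computes what.

    import hashlib

    KNOWN_HASHES = {"placeholder-fingerprint"}  # server-side blocklist of known-image fingerprints
    MATCH_THRESHOLD = 30                        # server takes no action below this many matches

    def client_make_voucher(photo_bytes: bytes) -> str:
        # Runs on the device before upload. Apple derives a perceptual NeuralHash
        # and wraps it cryptographically; a plain sha256 is just a stand-in here.
        return hashlib.sha256(photo_bytes).hexdigest()

    def server_check(vouchers: list[str]) -> bool:
        # Runs on the server. In the real design PSI hides non-matches from Apple,
        # and threshold secret sharing hides matches until the threshold is crossed;
        # here it is just a count against the blocklist.
        matches = sum(1 for v in vouchers if v in KNOWN_HASHES)
        return matches >= MATCH_THRESHOLD

The point of all the extra cryptography is that neither side learns anything about individual photos below the threshold, which is considerably more work than just scanning plaintext photos server-side.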
(I'm pretty sure they have other backdoors already, by the way. My guess would be a zero-day on the baseband processor firmware.)
I'm not saying I agree or disagree with Apple's latest move, but "exit the privacy industry" feels like a bit of a strong statement to me. You have less privacy than you did three weeks ago, and an option on even less privacy in the future (but then again Apple could just change the T&C), but you're still better off than with competitors that offer similar functionality.
One speculated motive is that Apple wanted an end-to-end encryption (E2EE) system for iCloud, but in 2018 the FBI put immense pressure on Apple internally to shut that project down because it could spread CSAM, among other things.
Then in 2020, there was the EARN IT Act proposal, which nearly passed and would have required scanning for CSAM on pretty much every online platform that wanted Section 230 immunity.
Apple puts two and two together, realizes Congress is concerned about CSAM's spread and isn't interested in changing, and still wants E2EE on iCloud. OK, put the scanning client-side, then the way is paved for E2EE iCloud because the FBI's biggest argument against E2EE is neutralized, and so is Congress' argument for EARN IT (which would basically have banned E2EE).
We've seen a recent wave of big tech companies moving into quasi-government roles in relation to censorship and rule enforcement. The Apple thing is in keeping with the general trend.
Especially “all content must be reviewed for child porn before publishing, or in real time if streaming”.
It now seems not at all implausible that Apple’s half-baked attempt to scan everything for child porn is due to this too.
> I find it exceedingly difficult to imagine that one of the most sophisticated companies in the world, with some of the brightest minds out there, did not consider and calculate this precisely; that there is any way any of this has come as a surprise to Apple
After having read a lot of internal Apple emails between executives[0], I find it extremely easy to imagine that they were completely dumbfounded that the rest of the world did not see things the same way they did.
[0] https://twitter.com/TechEmails
They have serious problems admitting they did something stupid historically. Butterfly keyboards, reliability issues, you’re holding it wrong etc.
I’ll note reports that Apple significantly increased iPhone manufacturing volume for the new phones coming out this fall. It will continue to be interesting to see what happens next.
> I find it exceedingly difficult to imagine that one of the most sophisticated companies in the world, with some of the brightest minds out there, did not consider and calculate this precisely; that there is any way any of this has come as a surprise to Apple. Extending Apple the benefit of the doubt does not seem possible in this case.
Why is that so difficult to imagine? Apple's security model has always been "just trust us and don't question it"; questioning their own practices is something they simply never do.
That's already how it works with all other Apple software: their own services are off-limits in their threat model, explicitly excluded and explicitly trusted. This is also reflected in the security threat document they published; they never list themselves among the potential threats.
This is just a continuation of how they usually operate.
With that move I don't see how Apple will be able to refuse when a government asks it to scan for images of Winnie the Pooh, for example. They say they won't, but they are too reliant on Foxconn's Chinese plants for manufacturing; they could easily be blackmailed into compliance. (Android devices with a Chinese OS probably already report a wide array of stuff to the government.)
This is another step towards total global surveillance of citizens. I don't see what can be done about it, technology makes it possible so it will happen, it is just too juicy for governments, they can't resist it.
There is an easy way for Apple to handle this. During iPhone setup present the following screen:
"We here at Apple do not want our servers to host images of exploited children, but we also respect your privacy. So you're free to use your phone with the photos stored locally, but if you'd like to enable iCloud we need you to press the button below to *install* our CSAM detector."
Then the argument basically goes away. It's like a virus scanner that you voluntarily installed. But having it baked into the phone as it ships rubs me the wrong way.
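The gating logic for that would be trivial; a rough sketch (all names hypothetical, nothing here is a real iOS API) looks something like:

    def install_csam_detector() -> None:
        # Stand-in for downloading and enabling the scanner; only ever reached
        # after the explicit consent check below.
        print("detector installed")

    def setup_icloud_photos(user_pressed_install: bool) -> bool:
        # Returns True if iCloud Photos was enabled.
        if not user_pressed_install:
            return False             # photos stay local, nothing is scanned
        install_csam_detector()      # scanning exists only because the user opted in
        return True

The hard part isn't the code, it's the product decision to ship the scanner only on opt-in rather than baked into the OS image.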
The problem is once you add this functionality, the cat's out of the bag and replies to requests to further invade users' privacy on-device change from "we can't" to "we won't", a position Apple can't possibly hope to maintain, esp. with the likes of China or the US government.
> It's like a virus scanner that you voluntarily installed.
Not really. Virus scanners don't snitch on users to governments.
Yeah, I've alerted family to this and they're asking if they're going to need to move off of iOS. I told them to wait and see for now...fingers crossed that this gets walked back.
But for the future I probably need Android phones with custom ROMs. Without spyware.
I've turned off auto-updates and I'm reconsidering my decade-long allegiance to this brand. I love Apple, I love the look and feel, but I could contribute to Linux to make it better. This is the final straw.
> why they enables CSAM in the first place?
Apple's privacy measures, such as not scanning your iCloud photos, are what help enable CSAM.
Sexual abusers very often take photos and upload them to their communities, and Apple has given them a secure device with which to do that. This is becoming increasingly widespread: the number of reported CSAM images and videos grew by more than 50% last year, to nearly 70 million [1].
Due to Apple's privacy, the numbers look like this: Facebook reported over 50 million combined images and videos, Google reported 3.5 million, and Dropbox, Microsoft, Snap, and Twitter reported over 100,000 images and videos. Apple reported only 3,000 photos in the same period, and no videos. [1]
Sexual abuse is experienced at some point in childhood by around 1 in 9 girls and 1 in 53 boys. 93% of perpetrators are known to the victim. In 2016, CPS substantiated or found strong evidence to indicate sexual abuse of 57,329 children [2]. And that's just CPS cases, in the US alone.
> They must have known the consequences, PR and otherwise.
Interestingly, the PR consequences of that have gone largely unnoticed; it's their attempts to curb it that are getting flak from the media.
I suspect this is because we can't create statistics on what we can't detect. There can be no "Apple is enabling, and allowing to continue, the sexual abuse of 50,000 children a year."
[1] https://www.nytimes.com/2020/02/07/us/online-child-sexual-ab... [2] https://www.rainn.org/statistics/children-and-teens
Companies try to do stupid things and end up with egg on their face all the time.
They'll backtrack from this and people will forget it ever happened in a few months.
Pressure from government(s). We are just one stupid terrorist attack away from this framework being used as a dragnet for catching “terrorists” and “terrorist sympathizers”.
It's probably not their real internal motive, but I think it is somewhat interesting that they've done it in a way that was publicly discussed, sparking valuable debate over this and similar practices and forcing people, and even governments, to consciously and publicly decide whether they like this or not. I also think it could be a nice way out of this negative PR if they just claim they wanted to open the debate over this apparently not-so-uncommon practice of scanning user data, whether it happens on-device or not...
My unproven theory is that they're moving away from tech-savvy users (who won't like this) towards soccer mums (who want this sort of thing, and all the parental control stuff they've been doing recently, and the weird no-porn-on-Tumblr thing, etc).
The second market is bigger and less discerning. It's a purely economic decision: come and bring your kids to our nice safe child-proof walled garden.
There is also the parallel requirement in places like China and Russia to police what people have on their phones. This move clearly differentiates Apple from other Western tech companies and maybe protects them from bans in big and fast growing markets.
The problem is that the definition of what is "illegal" can shift very quickly, esp. in certain countries, so moving the ability to scan for "illegal" stuff to user's devices is absolutely disastrous privacy/freedom-wise.
You seem to be concerned about actual CSAM "users" getting caught. What is being argued is that this creates a scenario where authoritarian governments can slip embarrassing photos into the list and find where they originated, find memes, etc.
Who's to say drawn images won't get flagged? I mean, it's a particular genre that exists, but in all essence it's not real and not illegal. Creepy, weird, and disturbing, but not nearly as bad as the former.
The overarching trend I see recently is that your rights, as an individual citizen, simply do not matter. They will be brushed away by any agenda that has a semblance of globalist right-think.
Your right to privacy, in the face of government agencies executing their mission, does not matter.
Your right to free movement, in the face of a flu that overburdens the hospital system, does not matter.
Your right to free speech, in the face of the need to eliminate outsider politicians, does not matter.
Your right to election security, in the face of the establishment getting their preferred candidate, does not matter.
Your right to raise your children with traditional values, in the face of social engineering guidelines, does not matter.
Your right to bodily autonomy, in the face of globally coordinated medical interventionism, does not matter.
Your right to closed borders, in the face of foreign policy expediency, does not matter.
Your right to eat what you want, in the face of “climate change” activism, will not matter.
Your preferences are simply not safeguarded by your rights, which can be overruled by the whim of “experts.” If you want to imagine how any particular future scenario unfolds, just ask yourself whether your rights would be an inconvenience to the plans of, say, Bill Gates. As a sort of stand-in for the general careerpol/NGO/billionaire/Harvard class running things.
(and most of the media and many of the public intelligentsia merrily support this)
Mixing together a bunch of weak arguments does not create a strong argument.
Your argument is completely undermined by being full of references to enthusiastic ignorance. Covid was a massive IQ test, which many failed by following con men who told them to take the easy path, rather than the path of self responsibility. Rather than insisting that doubling down on bad decisions is somehow defending your "rights", you need to come to terms with how you were led so far astray in the first place.
Furthermore, "right to closed borders" ? You previously invoked "free movement" yet there is also a right to closed borders? It appears you've just cloaked the same tired red team talking points in the language of freedom. Please, as a libertarian, stop trying to use freedom to justify what is a highly authoritarian movement. You're doing freedom no favors.
Policy groups ask Apple to drop plans to inspect messages, scan for abuse images - https://news.ycombinator.com/item?id=28230248 - 284 points, 1 day ago, 190 comments
Policy Groups Urge Apple to Abandon Building Surveillance Capabilities - https://news.ycombinator.com/item?id=28232068 - 92 points, 1 day ago, 25 comments
Policy groups ask Apple to drop plans to inspect iMessages scan for abuse images - https://news.ycombinator.com/item?id=28231094 - 33 points, 1 day ago, 3 comments