_m8fo|2 years ago
This is not to say they should scan locally, but my understanding of the CSAM proposal was that photos would only be scanned on their way to the cloud anyway, so users who didn’t use iCloud would never have been scanned to begin with.
Their new proposed set of tools seems like a good enough compromise from the original proposal in any case.
matwood|2 years ago
I speculated when this new scanning was announced (and now we know) that it was in preparation for full E2EE. Apple came up with a privacy-preserving method of trying to keep CSAM off their servers while also offering E2EE.
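Roughly, the flow looked something like this (a heavily simplified Python sketch, not Apple's actual implementation; the real design used NeuralHash, private set intersection, and threshold secret sharing, none of which are reproduced here, and every name below is made up):

    import hashlib

    # Hashes derived from NCMEC's database, shipped with the OS
    # (blinded in the real design so the device can't read them).
    known_csam_hashes: set = set()

    def perceptual_hash(image_bytes: bytes) -> str:
        # Stand-in only: a real perceptual hash survives resizing and
        # re-encoding; SHA-256 does not. Used here just to stay runnable.
        return hashlib.sha256(image_bytes).hexdigest()

    def safety_voucher(image_bytes: bytes) -> dict:
        # Attached to every upload. In the real design neither the device
        # nor the server learns whether any single photo matched; vouchers
        # only become decryptable once an account crosses a match threshold.
        return {
            "matched": perceptual_hash(image_bytes) in known_csam_hashes,
            "payload": "encrypted derivative of the photo",
        }

The point being that the photos themselves could stay end-to-end encrypted, with the server learning nothing until the voucher threshold was crossed.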
The larger community's arguments swayed Apple from going forward with their new detection method, but did not stop them from moving forward with E2EE. At the end of the day they put the responsibility back on governments to pass laws around encryption - where it should be, though we may not like the outcome.
gonehome|2 years ago
At the time I also thought it was obvious it was in preparation for e2ee (despite loud people on HN who disagreed).
I do wonder if they had intended for it to be on by default, though. Maybe not, since it's probably better for most users to have a recovery option.
theshrike79|2 years ago
To counter the "think of the children" argument governments use to justify surveillance, Apple tried scanning stuff on-device, but the internet threw a collective hissy fit of intentionally misunderstanding the feature, and it was quickly scrapped.
Shank|2 years ago
They basically did. If you turn on Advanced Data Protection, you get all of the encryption benefits, sans scanning. The interesting thing is that even with ADP on, binary file hashes remain unencrypted on iCloud, which would theoretically allow someone to obtain those hashes through a legal request. But a binary hash is obviously not as useful for CSAM detection as, say, PhotoDNA hashes. See: https://support.apple.com/en-us/HT202303
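To make the difference concrete (illustrative Python only, nothing from Apple's implementation): a binary file hash matches only bit-identical files, so any re-encode or resize breaks it, whereas PhotoDNA-style perceptual hashes are designed to survive exactly those transformations.

    import hashlib

    original = b"...the image's original bytes..."
    reencoded = b"...the same picture, saved again..."  # bytes differ slightly

    # Cryptographic hashes have the avalanche property: the two digests
    # below share no structure, so only exact copies can ever match.
    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(reencoded).hexdigest())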
kobalsky|2 years ago
How was it misunderstood? Your device would scan your photos and notify Apple or whoever if something evil was found. Wasn't that what they were trying to do?
bryan_w|2 years ago
> scanning stuff on-device
What do you think they were going to do once the scanning turned up a hit? Access the photos? Well, that negates the first statement.
no_time|2 years ago
I don’t see the problem with this status quo. There is a clear demarcation between my device and their server. Each serving the interests of their owner. If I have a problem with their policy, I can choose not to entrust my data to them. And luckily, the data storage space has heaps of competitive options.
chatmasta|2 years ago
The generic space does, yes. But if you want native integration with iOS, your only choice is iCloud. It would certainly be nice if this was an open protocol where you could choose your own storage backend. But I think the chances of that ever happening are pretty much zero.
turquoisevar|2 years ago
No outright statement confirming or denying this has ever been made to my knowledge, but the implication, based on both Apple's statements and those of stakeholders, is that this isn't currently the case.
This might come as a surprise to some, because many companies scan for CSAM, but that's done voluntarily because the government can't force companies to scan for CSAM.
This is because, based on case law, companies forced to scan for CSAM would be considered deputized, and it would therefore be a breach of the 4th amendment's safeguards against "unreasonable search and seizure".
The best the government can do is force companies to report "apparent violations" of CSAM laws. This seems like a distinction without a difference, but the difference is between being required to actively search for it (and thus becoming deputized) v. reporting it when you come across it.
Even then, the reporting requirement is constructed in such a way as to avoid any possible 4th amendment issues. Companies aren't required to report it to the DOJ, but rather to the NCMEC.
The NCMEC is a semi-governmental organization, autonomous from the DOJ albeit almost wholly funded by it, and it is the one that subsequently reports CSAM violations to the DOJ.
The NCMEC is also the organization that maintains the CSAM database and provides the hashes that companies, who voluntarily scan for CSAM, use.
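In code terms, the voluntary arrangement looks something like this (a hypothetical sketch; the function names are mine):

    # A provider that *chooses* to scan matches uploads against the
    # NCMEC-supplied hash list and reports hits to NCMEC, not the DOJ.
    def handle_upload(file_hash: str, ncmec_hash_list: set) -> None:
        # Nothing compels this check to exist; the law only requires
        # reporting "apparent violations" once a provider finds them.
        if file_hash in ncmec_hash_list:
            report_to_ncmec(file_hash)

    def report_to_ncmec(file_hash: str) -> None:
        # CyberTipline submission; NCMEC reviews and, if confirmed,
        # forwards to the DOJ, preserving the 4th amendment distance.
        pass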
This construction has proven to be pretty solid against 4th amendment concerns. Courts have historically found that the separation between companies and the DOJ, plus the fact that only confirmed CSAM makes its way to the DOJ after review by the NCMEC, creates enough distance between the DOJ and the act of searching through a person's data that there aren't any 4th amendment concerns.
The Congressional Research Service did a write-up on this last year for those who are interested[0].
Circling back to Apple: as it stands, there's nothing indicating that they already scan for CSAM server-side, and most comments by both Apple and child safety organizations seem to imply that this is in fact not currently happening.
Apple's main concerns, as stated in their letter, echo the same concerns raised by security experts back when this was being discussed: namely, that it creates a target for malicious actors, that it is technically not feasible to create a system that can never be reconfigured to scan for non-CSAM material, and that governments could pressure or regulate Apple into reconfiguring it for other materials as well (and place a gag order on them, prohibiting them from informing users of this).
At the time, some of these arguments were brushed off as slippery-slope FUD. Then the UK started considering something that would defy even the most cynical security researcher's nightmares: a de facto ban on security updates whenever the UK's intelligence and law enforcement services happen to be exploiting the security flaw that the update aims to patch.
Which is what Apple references in their response.
0: https://crsreports.congress.gov/product/pdf/LSB/LSB10713
Shank|2 years ago
> Nothing in this section shall be construed to require a provider to—
> (1) monitor any user, subscriber, or customer of that provider;
> (2) monitor the content of any communication of any person described in paragraph (1); or
> (3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).
The core of 18 U.S. Code § 2258A ("Reporting requirements of providers") is available at https://www.law.cornell.edu/uscode/text/18/2258A.
MBCook|2 years ago
Since it all went down, they added Advanced Data Protection, which encrypts photos, messages, and even more.
But that option is opt-in, since if you mess it up they can’t help you recover your data.
Gigachad|2 years ago
It’s objectively better than what Google does, but I’m glad we somehow ended up with no scanning at all.