top | item 37349151

_m8fo | 2 years ago

I’m not sure I understand Apple’s logic here. Are iCloud Photos in their data centers not scanned? Isn’t everything by default for iCloud users sent there automatically to begin with? Doesn’t the same logic around slippery slope also apply to cloud scans?

This is not to say they should scan locally, but my understanding of the CSAM proposal was that photos would only be scanned on their way to the cloud anyway, so users who didn’t use iCloud would never have been scanned to begin with.

Their new proposed set of tools seems like a good enough compromise from the original proposal in any case.

matwood|2 years ago

You are correct: the original method would only have scanned items destined for iCloud, and would only have transmitted safety vouchers for matching hashes. And yes, similar slippery-slope arguments apply to any provider that stores images unencrypted. They are all scanned today, and we have no idea what they are matched against.
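The flow described above can be sketched roughly as follows. This is a minimal illustration, not Apple's actual NeuralHash/safety-voucher protocol: the hash function is a toy stand-in (a real perceptual hash maps visually similar images to nearby values), and the database entries and threshold are hypothetical.

```python
import hashlib

def toy_hash(image_bytes: bytes) -> int:
    # Toy stand-in for a perceptual hash; NOT Apple's NeuralHash.
    return int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")

# Hypothetical database of known-bad image hashes.
known_hashes = {toy_hash(b"known-bad-image-1"), toy_hash(b"known-bad-image-2")}

# Matches required before anything is reported at all (illustrative value;
# the real proposal only revealed vouchers past a similar threshold).
THRESHOLD = 2

def scan_outgoing(photos: list[bytes]) -> bool:
    """Scan only photos destined for iCloud; report only past the threshold."""
    matches = sum(1 for p in photos if toy_hash(p) in known_hashes)
    return matches >= THRESHOLD
```

The key property being illustrated: a single match (or photos never queued for upload) produces no report at all.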

I speculated when this new scanning was announced (and now we know) that it was in preparation for full E2EE. Apple came up with a privacy-preserving method of trying to keep CSAM off their servers while also offering E2EE.

The larger community's arguments swayed Apple from going forward with its new detection method, but did not stop it from moving forward with E2EE. At the end of the day, they put the responsibility back on governments to pass laws around encryption, which is where it should sit, though we may not like the outcome.

gonehome|2 years ago

There are also ways to detect matches even with E2EE, IIRC, and I suspect they found doing that easier than pressing on with the previous approach.

At the time I also thought it was obvious it was in preparation for e2ee (despite loud people on HN who disagreed).

I do wonder whether they had intended to make it on by default, though. Probably not, since it's better for most users to keep a recovery option.

theshrike79|2 years ago

In my opinion their goal was to get stuff to a state where they could encrypt everything on iCloud so that even they can't access it.

To counter the "think of the children" argument governments use to justify surveillance, Apple tried scanning stuff on-device, but the internet threw a collective hissy fit of intentionally misunderstanding the feature, and it was quickly scrapped.

Shank|2 years ago

> In my opinion their goal was to get stuff to a state where they could encrypt everything on iCloud so that even they can't access it.

They basically did. If you turn on Advanced Data Protection, you get all of the encryption benefits, sans scanning. The interesting thing is that with ADP on, binary file hashes remain unencrypted in iCloud, which would theoretically allow someone to ask for those hashes in a legal request. But a cryptographic file hash is obviously not as useful for CSAM detection as, say, a PhotoDNA hash. See: https://support.apple.com/en-us/HT202303
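To illustrate why a cryptographic file hash is much less useful here than a perceptual hash: flipping even one bit of the file (a re-encode, a metadata edit) yields a completely different digest, so a file hash only matches byte-identical copies, while a perceptual hash like PhotoDNA is designed to survive resizing and re-encoding. A quick sketch using SHA-256:

```python
import hashlib

# Stand-in for an image file's raw bytes.
original = b"\x89PNG fake image payload"
tweaked = bytearray(original)
tweaked[-1] ^= 0x01  # flip a single bit, as a re-encode or metadata edit might

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# The digests share no useful similarity: matching on file hashes
# only catches exact byte-for-byte copies.
print(h1 != h2)  # True
```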

kobalsky|2 years ago

> but the internet got a collective hissy-fit of intentionally misunderstanding the feature

How was it misunderstood? Your device would scan your photos and notify Apple, or whoever, if something evil was found. Wasn't that what they were trying to do?

bryan_w|2 years ago

> so that even they can't access it.

> scanning stuff on-device

What do you think they were going to do once the scanning turned up a hit? Access the photos? Well, that negates the first statement.

no_time|2 years ago

> I’m not sure I understand Apple’s logic here. Are iCloud Photos in their data centers not scanned? Isn’t everything by default for iCloud users sent there automatically to begin with? Doesn’t the same logic around slippery slope also apply to cloud scans?

I don’t see the problem with this status quo. There is a clear demarcation between my device and their server. Each serving the interests of their owner. If I have a problem with their policy, I can choose not to entrust my data to them. And luckily, the data storage space has heaps of competitive options.

judge2020|2 years ago

The status quo is that a lot of countries want to use the CSAM argument to push privacy-invasive technology (cough, UK), e.g. forcing companies to let the government break E2EE to catch CSAM distributors. Apple made this feature while planning to move iCloud Photos to E2EE so that they could argue: "Look, we still catch x CSAM distributors with an n < 0.x% false positive rate, even with E2EE photos; therefore you don't need to pass these laws that break E2EE."

chatmasta|2 years ago

> the data storage space has heaps of competitive options

The generic space does, yes. But if you want native integration with iOS, your only choice is iCloud. It would certainly be nice if this was an open protocol where you could choose your own storage backend. But I think the chances of that ever happening are pretty much zero.

mindslight|2 years ago

Precisely! The software running on the phone should be representing the owner of the phone, period. We begrudgingly accept cloud scanning because that ship has already sailed, despite it being a violation of the analog of fiduciary duty. But setting the precedent that software on a user's device should run actions that betray the user is from the same authoritarian vein as remote attestation.

The option ignored by the "isn't this a good tradeoff" question is one where the device encrypts files before uploading them to iCloud, iCloud may scan the encrypted bits anyway to satisfy its legal duty, and that's the end of the story. This is what we'd expect if device owners' interests were represented by the software on the device, and so we should demand no less despite the software being proprietary.
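The arrangement described here, where the device encrypts before uploading and the server only ever sees ciphertext, can be sketched as below. The SHA-256-based XOR keystream is a toy stand-in used only to keep the example dependency-free; a real client would use a vetted AEAD cipher such as AES-GCM, not this construction.

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256(key || nonce || counter) blocks.
    # Illustrative only; do not use for real encryption.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_for_upload(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt on-device; only this opaque blob ever reaches the server."""
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Only the key holder (the device owner) can recover the photo."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
```

The server can still "scan" the blob to satisfy any obligation it has, but since the bits are ciphertext, the scan reveals nothing, which is exactly the commenter's point.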

turquoisevar|2 years ago

> Are iCloud Photos in their data centers not scanned?

No outright statement confirming or denying this has ever been made, to my knowledge, but the implication, based both on Apple's statements and those of stakeholders, is that this isn't currently the case.

This might come as a surprise to some, because many companies scan for CSAM, but that's done voluntarily because the government can't force companies to scan for CSAM.

This is because, under existing case law, companies forced to scan for CSAM would be considered deputized, and the search would thus breach the 4th amendment's safeguards against "unreasonable search and seizure."

The best the government can do is force companies to report "apparent violations" of CSAM laws. This seems like a distinction without a difference, but the difference is between being required to actively search for it (and thus becoming deputized) and reporting it when you come across it.

Even then, the reporting requirement is constructed in such a way as to avoid any possible 4th amendment issues. Companies aren't required to report it to the DOJ, but rather to the NCMEC.

The NCMEC is a semi-government organization, autonomous from the DOJ, albeit almost wholly funded by the DOJ, and they are the ones that subsequently report CSAM violations to the DOJ.

The NCMEC is also the organization that maintains the CSAM database and provides the hashes that companies, who voluntarily scan for CSAM, use.

This construction has proven pretty solid against 4th amendment challenges: courts have historically found that the separation between companies and the DOJ, plus the fact that only confirmed CSAM reaches the DOJ after review by the NCMEC, creates enough distance between the DOJ and the act of searching through a person's data that no 4th amendment concerns arise.

The Congressional Research Service did a write-up on this last year, for those who are interested[0].

Circling back to Apple: as it stands, there's nothing indicating that they already scan for CSAM server-side, and most comments by both Apple and child safety organizations seem to imply that this is in fact not currently happening.

Apple's main concerns, however, as stated in their letter, echo the concerns raised by security experts back when this was being discussed: namely, that it creates a target for malicious actors, that it is technically infeasible to build a system that can never be reconfigured to scan for non-CSAM material, and that governments could pressure or regulate it into being reconfigured for other materials as well (and place a gag order on Apple, prohibiting it from informing users).

At the time, some of these arguments were brushed off as slippery-slope FUD. Then the UK started considering something beyond even the most cynical security researcher's nightmare: a de facto ban on security updates whenever the UK's intelligence and law enforcement services happen to be exploiting the very flaw an update aims to patch.

Which is what Apple references in their response.

0: https://crsreports.congress.gov/product/pdf/LSB/LSB10713

Shank|2 years ago

To add a bit more color, 18 U.S. Code § 2258A specifically states:

> Nothing in this section shall be construed to require a provider to—

> (1) monitor any user, subscriber, or customer of that provider;

> (2) monitor the content of any communication of any person described in paragraph (1); or

> (3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

The core of 18 U.S. Code § 2258A - Reporting requirements of providers is available at https://www.law.cornell.edu/uscode/text/18/2258A.

MBCook|2 years ago

It WASN’T the case. Photos are listed on their page of stuff that’s not end-to-end encrypted.

Since it all went down they added the advanced security option that encrypts photos, messages, and even more.

But that option is opt-in since if you mess it up they can’t help you recover.

Moldoteck|2 years ago

> so users who didn’t use iCloud would’ve never been scanned to begin with.

So why not implement CSAM scanning for iCloud only, without local scanning?

Gigachad|2 years ago

Because the idea was that the iCloud data would be encrypted, so their servers couldn’t scan it. The plan was to do on-device scanning of only those photos marked for storage in iCloud.

It’s objectively better than what Google does, but I’m glad we somehow ended up with no scanning at all.