top | item 43160961

LittleTimothy | 1 year ago

I think it's actually valuable to hear from one of the former Tory ministers who was in favour of the bill[1]. I don't necessarily agree with him, but it's interesting that he essentially argues you don't have the security you think you do: if a bad actor wants to pwn you, they'll do it on your device and you can't stop them. I think that's broadly true of some actors. If you personally are being targeted by a motivated opponent, then yes, they will likely go after your personal device first, and encrypted cloud storage is essentially moot. It's also an interesting move to not say "We need this to tackle CSAM" but instead "We need this so that these companies can't enable CSAM whilst claiming to be unaware" - on a practical level I think that does hold more water.

At the end of the day though, he doesn't address the clearest problem with these backdoors: the payoff from being able to read unencrypted cloud data in bulk is so high that the backdoor is extremely likely to get exploited, and the average person should be more worried about being exposed in a broad attack on infrastructure than in a targeted attack on them individually.

It's also pretty difficult to give credence to the idea that they need this tool to tackle CSAM or organised crime, because by and large they don't tackle CSAM or organised crime. The UK government simply hasn't prioritised policing it, so we're not in a context of "we're doing all we can but we need more powers"; we're in a context of "we can't be bothered, so curtail people's rights to make our job easier". I'm sure Apple is not in favour of CSAM, but Apple isn't a member of the British police responsible for investigating and tackling it, so why are we trying to recruit them to be?

[1]https://x.com/BenWallace70/status/1893936287477912035


matthewdgreen | 1 year ago

I don't think that's very persuasive. Targeted compromise of iPhones is incredibly expensive, and relatively hard for mere criminals to access. If that's the only way for a bad actor to access your data, you've instantly taken everyone but the most wildly sophisticated (and wealthy) criminals and state actors off the table.

Meanwhile iCloud backups are available not only to sophisticated folks who can compromise Apple's servers, but also to anyone who can social-engineer a password recovery flow or bribe an Apple customer service agent.

Second, re: CSAM, the iCloud ADP system is focused on backing up your personal devices. It is not designed to share data with other users. So a criminal can have CSAM on their phone and simply turn off iCloud Backup (and thus be "invisible") or they can use ADP. The two things are equivalent, and both assume a sophisticated user. I'm sure there's some bizarre and painful scheme where you could use ADP to distribute CSAM to other folks, but there are many easier ways to do that. Once you grant the CSAM point, you're just saying it's necessary for all personal device data to be constantly available for search by the government. (And while I disagree with that opinion, it is an opinion and should be fully fleshed out.)

rightbyte | 1 year ago

> If a bad actor wants to pwn you they'll do it on your device and you can't stop them. I think that's broadly true of some actors.

I mean, that is correct in the literal sense. Both Google and probably Samsung could hack my device remotely via remote code execution in a targeted update, and so could American and South Korean authorities.

But I don't think any "bad actor" could do it?

Like, the Foobarland police. Is that a reasonable take?