
Apple's plan to “think different” about encryption opens a backdoor to your life

2260 points | bbatsell | 4 years ago | eff.org

824 comments

[+] trangus_1985|4 years ago|reply
I've been maintaining a spare phone running LineageOS precisely in case something like this happened - I love the Apple Watch and Apple ecosystem, but this is such a flagrant abuse of their position as Maintainers Of The Device that I have no choice but to switch.

Fortunately, my email is on a paid provider (Fastmail), my photos are on a NAS, and I've worked hard to get all of my friends on Signal. While I still use Google Maps, I've been trialing OSM alternatives for a while.

The things they've described are, in general, reasonable and probably good in the moral sense. However, I'm not sure that I support what they are implementing for child accounts (as a queer kid, I was terrified of my parents finding out). On the surface, it seems good - but I am concerned about the other snooping features this portends.

With the iCloud Photos CSAM scanning, though, it also sets a horrifying precedent: the device I put my life into is scanning my photos and reporting on bad behavior (even if the initial dataset is the most reprehensible behavior).

I'm saddened by Apple's decision, and I hope they reverse course, because that's the only way I will continue to use their platform.

[+] triska|4 years ago|reply
I remember an Apple conference where Tim Cook personally assured us that Apple is fully committed to privacy, that everything is so secure because the iPhone is so powerful that all necessary calculations can happen on the device itself, and that we are "not the product". I think the Apple CEO said some of this in the specific context of speech processing, yet it seemed like a specific instance of a general principle upheld by Apple.

I bought an iPhone because the CEO seemed to be sincere in his commitment to privacy.

What Apple has announced here seems to be a complete reversal of what I understood the CEO to be saying at that conference only a few years ago.

[+] Klonoar|4 years ago|reply
I think the EFF is probably doing good by calling attention to the issue, but let's... actually look at the feature before passing judgment, e.g.:

https://twitter.com/josephfcox/status/1423382200880439298/ph...

- It's run for Messages in cases where a child is potentially viewing sexually explicit material.

- It's run _before upload to iCloud Photos_ - where it would've already been scanned anyway, as they've done for years (and as all other major companies do).

To me this really doesn't seem that bad. Feels like a way to actually reach encrypted data all around while still meeting the expectations of lawmakers/regulators. Expansion of the tech would be something I'd be more concerned about, but considering the transparency of it I feel like there's some safety.

More info here as well: https://www.apple.com/child-safety/

[+] nerdponx|4 years ago|reply
The cynical take is that Apple was never committed to privacy in and of itself, but they are committed to privacy as long as it improves their competitive advantage, whether by marketing or by making sure that only Apple can extract value from its customers' data.

Hanlon's razor does not apply to megacorporations that have enormous piles of cash and employ a large number of very smart people, who are either entirely unscrupulous or for whom scruples are worth less than their salaries. We probably aren't cynical enough.

I am not arguing that we should always assume every change is always malicious towards users. But our index of suspicion should be high.

[+] cronix|4 years ago|reply
As soon as Cook became CEO, he let the NSA's PRISM program into Apple. Everything since then has been a fucking lie.

> Andrew Stone, who worked with Jobs for nearly 25 years, told the site Cult of Mac last week that Steve Jobs resisted letting Apple be part of PRISM, a surveillance program that gives the NSA access to records of major Internet companies. His comments come amid speculation that Jobs resisted cooperating. “Steve Jobs would’ve rather died than give into that,” Stone told the site.

> According to leaked NSA slides about PRISM, Apple was the last tech behemoth to join the secret program — in October 2012, a year after Jobs died. Apple has said that it first heard about PRISM on June 6 of this year, when asked about it by reporters.

https://www.huffpost.com/entry/apple-nsa-steve-jobs_n_346132...

I mean, maybe they didn't call it "PRISM" when talking about it with Cook, so it could technically be true that they didn't hear of PRISM until media stories. Everyone knows the spy agency goes around telling all of their project code names to companies they're trying to compromise. Hello, sir. We're here to talk to you about our top secret surveillance program we like to call PRISM where we intercept and store communications of everyone. Would you like to join? MS did. So did Google. Don't you want to be in our select cool club?

[+] ksec|4 years ago|reply
Tim Cook doesn't lie. I think he has convinced himself that what he said wasn't lying. That Apple, and he himself, are so righteous. Which is actually worse, because that mentality filters through from top to bottom. And it shows in their marketing and PR messages. He is also following Steve Jobs's last advice to him exactly: do the right thing. Except "the right thing" is so ambiguous it may turn out to be some of the worst advice ever given.

My biggest turning point was Tim Cook flat out lying in the Apple case against Qualcomm. Double dipping? Qualcomm's patents being worth more than double all the other six combined? And the tactics they used in court, which were vastly different from the Apple vs. Samsung case. And yes, they lost. (Or settled.)

It is the same with privacy. They simplify their PR message to tracking = evil; tracking is invading your privacy. Which is all good. But at the same time, Apple is tracking you: everything you do on Apple Music, Apple TV+, the App Store, and even Apple Card. (They only promise not to sell your data to third parties; they still hold some of that data.) What that means is that only Apple is allowed to track you, but anyone else doing it is violating your privacy? What Apple really means by the word privacy, then, is that data should not be sold to third parties. But no, they intentionally keep it unclear and created a war on data collection while doing it themselves. And you now have people flat out claiming Apple doesn't collect any data.

Then there is the war on ads. Which got so bad that the ad industry pushed back and Tim Cook had to issue a mild statement saying they are not against ads, only targeted ads. What?

Once you start questioning all of his motives, and find concrete evidence that he is lying, along with all the facts from court cases about how Apple has long-term plans to destroy other companies, they all line up and shape how you view Tim Cook's Apple. And it isn't pretty.

And that is coming from someone who has been an Apple fan for more than two decades.

[+] nonbirithm|4 years ago|reply
What I want to know is why they decided to implement this. Is Apple just trying to appear virtuous and taking action independently? Or was this done at someone else's request?

For all the rhetoric about privacy coming from Apple, I feel that such an extreme measure would surely cause complaints from anyone deeply invested in privacy. And maybe they're just using words like "significant privacy benefits compared to previous techniques" to make it sound reasonable to the average user who's not that invested in privacy.

[+] JohnFen|4 years ago|reply
> because the CEO seemed to be sincere in his commitment to privacy.

The sincerity of a company officer, even the CEO, should not factor into your assessment. Officers change over time (and individuals can change their stance over time), after all.

[+] dilap|4 years ago|reply
There was a funny, tiny thing that happened a few years back that made me think Tim Cook is a liar.

It was back when Apple had just introduced the now-abandoned Force Touch feature (i.e., pressure-sensitive touch; it turns out pushing hard on an unyielding surface is not very pleasant or useful).

To showcase the capability, Apple had updated many of its apps with new Force Touch features. One of them was Mail: if you pushed just right on the subject line of a message, you'd get a tiny, unscrollable popout preview of its contents.

It was totally useless: it took just as much time to force-touch for the preview as to tap normally and view the message, and the result was less useful. It was also fairly fiddly: if you didn't press hard enough, you didn't get the preview; if you pressed too hard, it opened the full email anyway.

So Tim Cook, demoing the feature, said a funny thing. He said, "It's great, I use it all the time."

Which maybe, just maybe, is true, but personally I don't believe it, not for a second.

So since then, I've had Tim down in my book as basically a big liar.

[+] avnigo|4 years ago|reply
I’m still waiting on the iCloud backup encryption they promised a while back. There were reports that they scrapped those plans because the FBI told them to, but nothing official has been announced since 2019.
[+] BiteCode_dev|4 years ago|reply
So you mean the company that was part of PRISM, that has unfair business practices and had a bully as a founder, was not really the world savior their marketing speech said they were?

I'm in shock. Multi-billion-dollar companies usually never lie to make money! And power-grabbing entities have such a neat track record in human history.

Not to mention nobody saw this coming or warned repeatedly that one should not get locked into such a closed and proprietary ecosystem in the first place.

I mean, dang, this serial killer was such a nice guy. The dead babies in the basement were weird, but apart from that he was a stellar neighbour.

[+] anonuser123456|4 years ago|reply
Not really. This only applies to photos uploaded to iCloud. And photos uploaded to iCloud (and Google Drive, etc.) are already scanned on the server for CSAM.

Apple is moving that process from the server to the phone in a way that protects your privacy better than current standards.

In the current system, all your photos are available to Apple unencrypted. In the new system, nothing will be visible to Apple unless you upload N images with database hits. From those N tokens, Apple is then able to decrypt your content.
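For intuition, here is a minimal sketch of that threshold idea using Shamir secret sharing over a prime field. This is illustrative only; Apple's actual construction (threshold PSI with "safety vouchers") differs in its details, and every name and parameter below is made up:

```python
# Minimal sketch: a per-account key split so the server can decrypt
# nothing until THRESHOLD shares (one per database hit) have arrived.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy secret
THRESHOLD = 5       # N matches needed before reconstruction is possible

def make_shares(secret: int, n_shares: int, threshold: int = THRESHOLD):
    """Split `secret` into shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret (Python 3.8+)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)  # stand-in for a per-account decryption key
shares = make_shares(key, 12)  # imagine one share riding along per voucher
assert reconstruct(shares[:THRESHOLD]) == key       # N shares: key recovered
assert reconstruct(shares[:THRESHOLD - 1]) != key   # N-1 shares: nothing (w.h.p.)
```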

So when this feature lands, it improves your privacy relative to today.

[+] blakeinate|4 years ago|reply
This year I purchased my first iPhone since the 3G, after today I am starting to regret that decision. At this point, I can only hope Linux on mobile picks up steam.
[+] robertoandred|4 years ago|reply
Except the hashing and hash comparison are happening on the device itself.
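For anyone unfamiliar with what that on-device matching step looks like, here is a toy sketch using a simple "average hash". Apple's NeuralHash is a neural-network-based perceptual hash and the real database is cryptographically blinded, so treat every name and value here as illustrative:

```python
# Toy on-device perceptual hash matching (requires Pillow). The "average
# hash" stands in for NeuralHash; the plain set stands in for the blinded,
# NCMEC-derived database, which the device cannot actually read.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """64-bit perceptual hash: one bit per pixel of a greyscale thumbnail."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def is_match(h1: int, h2: int, max_hamming: int = 4) -> bool:
    """Near-duplicate images differ in only a few bits of the hash."""
    return bin(h1 ^ h2).count("1") <= max_hamming

known_hashes = {0x8F3A1C5E90B27D44}  # hypothetical database entry
flagged = any(is_match(average_hash("upload.jpg"), h) for h in known_hashes)
```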
[+] dylan604|4 years ago|reply
It is secure, as long as you have nothing to hide. If you have no offending photos, then the data won't be uploaded! See, it's not nefarious at all! /s
[+] mtgx|4 years ago|reply
It's all been downhill since we heard that they stopped developing the e2e encrypted iCloud solution because it might upset the FBI even more.
[+] c7DJTLrn|4 years ago|reply
Catching child pornographers should not involve subjecting innocent people to scans and searches. Frankly, I don't care if this "CSAM" system is effective - I paid for the phone, it should operate for ME, not for the government or law enforcement. Besides, the imagery already exists by the time it's been found - the damage has been done. I'd say the authorities should prioritise tracking down the creators but I'm sure their statistics look much more impressive by cracking down on small fry.

I've had enough of the "think of the children" arguments.

[+] geraneum|4 years ago|reply
Didn’t they [Apple] make the same points that the EFF is making now, to avoid giving the FBI a key to unlock an iOS device that belonged to a terrorist?

“Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”

“… We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

Tim Cook, 2016

[+] Shank|4 years ago|reply
I really love the EFF, but I also believe the immediate backlash is (relatively) daft. There is a potential for abuse of this system, but consider the following too:

1. PhotoDNA is already scanning content from Google Photos and a whole host of other service providers.

2. Apple is obviously under pressure to follow suit, but they developed an on-device system, recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

3. Nobody, and I mean nobody, is going to successfully convince the general public that a tool designed to stop the spread of CSAM is a "bad thing" unless they can show concrete examples of the abuse.

For one and two: given the two options, would you rather that Apple implement serverside scanning, in the clear, or go with the on-device route? If we assume a law was passed to require serverside scanning (which could very well happen), what would that do to privacy?

For three: It's an extremely common trope to say that people do things to "save the children." Well, that's still true. Arguing against a CSAM scanning tool, which is technically more privacy preserving than alternatives from other cloud providers, is an extremely uphill battle. The biggest claim here is that the detection tool could be abused against people. And that very well may be possible! But the whole existence of NCMEC is predicated on stopping the active and real danger of child sex exploitation. We know with certainty this is a problem. Compared to a certainty of child sex abuse, the hypothetical risk from such a system is practically laughable to most people.

So, I think again, the backlash is daft. It's been about two days since the announcement became public (via leaks). The underlying mathematics behind the system has barely been published [0]. It looks like the EFF rushed to make a statement here, and in doing so, it doesn't look like they took the time to analyze the cryptography, to consider attacks against it, or to consider possible motivations and outcomes. Maybe they did, and they had advance access to the material. But it doesn't look like it, and in the court of public opinion, optics are everything.

[0]: https://www.apple.com/child-safety/pdf/Alternative_Security_...

[+] feanaro|4 years ago|reply
> that a tool designed to stop the spread of CSAM is a "bad thing"

It's certainly said to be designed to do it, but have you seen concerns raised in the other thread (https://news.ycombinator.com/item?id=28068741)? There have been reports from some commenters of the NCMEC database containing unobjectionable photos because they were merely found in a context alongside some CSAM.

Who audits these databases? Where is the oversight to guarantee only appropriate content is included? They are famously opaque because the very viewing of the content is illegal. So how can we know that they contain what they are purported to contain?

This is overreach.

[+] randcraw|4 years ago|reply
You presume Apple and the DoJ will implement this with human beings at each step. They won't. Both parties will automate as much of this clandestine search as possible. With time, the external visibility and oversight of this practice will fade, and with it, any motivation to confirm fair and accurate matches. Welcome to the sloppiness inherent in clandestine law enforcement intel gathering.

As with all politically-motivated initiatives that boldly violate the Constitution (consider the FISA Court, and its rubber stamp approval of 100% of the secret warrants put before it), the use and abuse of this system will go largely underground, like FISA, and its utility will slowly degrade due to lack of oversight. In time, even bad matches will log the IDs of both parties in databases that label them as potential sexual predators.

Believe it. That's how modern computer-based gov't intel works. Like most law enforcement policy recommendation systems, Apple's initial match algorithm will never be assessed for accuracy, nor be accountable for being wrong at least 10% of the time. In time it will be replaced by other third party screening software that will be even more poorly written and overseen. That's just what law enforcement does.

I've personally seen people suffer this kind of gov't abuse and neglect as a result of clueless automated law enforcement initiatives after 9/11. I don't welcome more, nor the gradual and willful tossing of everyone's basic Constitutional rights that Apple's practice portends.

The damage to personal liberty inherent in conducting secret searches without cause or oversight is exactly why the Fourth Amendment requires a warrant before conducting a search. NOW is the time to disabuse your sense of 'daftness'; not years from now, after the Fourth and Fifth Amendments become irreversibly passé. Or should I say, 'daft'?

[+] shivak|4 years ago|reply
> recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

Apple employs cryptographers, but they are not necessarily acting in your interest. Case in point: their use of private set intersection to preserve the privacy... of law enforcement, not users. Their less technical summary:

> Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection...

The matching is performed on device, so the user’s privacy isn’t at stake. But, thanks to PSI and the hash preprocessing, the user doesn’t know what law enforcement is looking for.
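To make the primitive concrete, here is a toy Diffie-Hellman-style PSI in which neither side sees the other's raw set, only blinded values. This is a simplified sketch under assumed parameters, not Apple's protocol, which layers the threshold machinery discussed elsewhere in the thread on top:

```python
# Toy DH-style private set intersection. Each side blinds items with a
# secret exponent; an item both sides hold yields the same double-blinded
# value H(x)^(a*b), so only matches are revealed. Illustrative only.
import hashlib
import secrets

P = 2**255 - 19  # prime modulus for a toy multiplicative group

def to_group(item: bytes) -> int:
    """Hash an arbitrary item into the group."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

a = secrets.randbelow(P - 2) + 2  # client (device) secret exponent
b = secrets.randbelow(P - 2) + 2  # server secret exponent

server_set = [b"hash-of-known-image-1", b"hash-of-known-image-2"]
client_set = [b"hash-of-known-image-2", b"hash-of-holiday-photo"]

# Server publishes its set blinded under b; entries are unreadable.
server_blinded = [pow(to_group(x), b, P) for x in server_set]

# Client sends its items blinded under a; server raises them to b.
client_blinded = [pow(to_group(x), a, P) for x in client_set]
double_blinded = [pow(c, b, P) for c in client_blinded]  # H(x)^(a*b)

# Client raises the server's blinded set to a and compares.
server_double = {pow(s, a, P) for s in server_blinded}   # H(y)^(b*a)
matches = [i for i, d in enumerate(double_blinded) if d in server_double]
print(matches)  # -> [0]: only the shared item is revealed, nothing else
# (In Apple's variant the roles are arranged so the server, not the
# device, learns about matches - hence shivak's point above.)
```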

[+] echelon|4 years ago|reply
> There is a potential for abuse of this system, but consider the following too

> I think again, the backlash is daft.

Don't apologize for this bullshit! Don't let your love of brand trump the reality of what's going on here.

Machinery is being put in place to detect what files are on your supposedly secure device. Someone has the reins and promises not to use it for anything other than "protecting the children".

How many election cycles or generations will it take for the climate to turn unfavorable, and for this to become a tool of great asymmetric power against the public?

What happens when the powers that be see that you downloaded labor union materials, documents from Wikileaks, or other files that implicate you as a risk?

Perhaps a content hash on your phone puts you in a flagged bucket where you get pat downs at the airport, increased surveillance, etc.

The only position to take here is a full rebuke of Apple.

edit: Apple apologists are taking a downright scary position now. I suppose the company has taken a full 180 from their 1984 ad centerpiece. But that's okay, right, because Apple is a part of your identity and it's beyond reproach?

edit 2: It's nominally iCloud only (a key feature of the device/ecosystem), but that means having to turn off a lot of settings. One foot in the door...

edit 3: Please don't be complicit in allowing this to happen. Don't apologize or rationalize. This is only a first step. We warned that adtech and monitoring and abuse of open source were coming for years, and we were right. We're telling you - loudly - that this will begin a trend of further erosion of privacy and liberty.

[+] indymike|4 years ago|reply
> backlash is daft

Fighting to preserve a freedom is not daft, even if it is David vs. Goliath's bigger, meaner brother and his friends.

[+] vorpalhex|4 years ago|reply
Who verifies the CSAM databases? Is there a way to verify that the CSAM hash list hasn't been tampered with and additional hashes inserted?

Would it be ok to use this approach to stop "terrorism"? Are you ok with both Biden and Trump defining that list?

[+] avnigo|4 years ago|reply
I’d be interested to see how Apple executives would respond to these concerns in interviews, but I don’t expect Apple to issue a press release about them.
[+] cblconfederate|4 years ago|reply
What is the point of E2EE vs TLS/SSL based encryption?
[+] wayneftw|4 years ago|reply
This is an abuse of my property rights. The device is my property, and this activity will be using my CPU, my battery, and my network bandwidth. That's the abuse right there.

They should just use their own computers to do this stuff.

[+] api|4 years ago|reply
(2) is important. Apple put effort into making this at least somewhat privacy-respecting, while the other players just scan everything with no limit at all. They also scan everything for any purpose including marketing, political profiling, etc.

Apple remains the most privacy respecting major vendor. The only way to do better is fully open software and open hardware.

[+] Wowfunhappy|4 years ago|reply
This isn't the biggest issue at play, but one detail I can't stop thinking about:

> If an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. [...] For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

Why is it different for children under 13, specifically? The 18-year cutoff makes sense, because turning 18 carries legal weight in the US (as decided via a democratic process), but 13?

13 is an age when many parents start granting their children more freedom, but that's very much rooted in one's individual culture—and the individual child. By giving parents fewer options for 13-year-olds, Apple—a private company—is pushing their views about parenting onto everyone else. I find that a little disturbing.

---

Note: I'm not (necessarily) arguing for greater restrictions on 13-year-olds. Privacy for children is a tricky thing, and I have mixed feelings about this whole scheme. What I know for sure, however, is that I don't feel comfortable with Apple being the one to decide "this thing we've declared an appropriate invasion of privacy for a 12-year-old is not appropriate for a 13-year-old."

[+] strogonoff|4 years ago|reply
If Mallory gets a lawful citizen Bob to download a completely innocuous looking but perceptual-CSAM-hash-matching image to his phone, what happens to Bob? I imagine the following options:

- Apple sends Bob’s info to law enforcement; Bob is swatted or his life is destroyed in some other way. Worst, but most likely outcome.

- An Apple employee (or an outsourced contractor) reviews the photo, comparing it to the CSAM source image used for the hash. Bob is swatted only if the image matches according to human vision. This requires some sort of database of CSAM source images, which strikes me as unlikely.

- An Apple employee or a contractor reviews the image for abuse without comparing it to the CSAM source, using their own subjective judgment. Better, but it implies Apple employees could technically SWAT Apple users.

[+] farmerstan|4 years ago|reply
Police routinely get drug sniffing dogs to give false positives so that they are allowed to search a vehicle.

How do we know Apple or the FBI don't do this? If they want to search someone's phone, all they need to do is enter a hash of a photo they know is on the target's phone and, voila, instant access.

Also, how is this not a violation of the 14th Amendment? I know Apple isn't part of the government, but they are basically acting as a de facto agent of the police by scanning for crimes. Using child porn as a completely transparent excuse to start scanning all our material for anything they want makes me very angry.

[+] cwizou|4 years ago|reply
The FT article mentioned it was US only, but I'm more afraid of how other governments will try to pressure Apple to adapt said technology to their needs.

Can they trust a random government to give them a database of only CSAM hashes and not insert some extra politically motivated content that it deems illegal?

Because once you've launched this feature in the "land of the free", other countries will require their own implementations for their own needs and will demand (through local legislation, which Apple will need to abide by) control of said database.

And how long until they also scan browser history for the same purpose? Why stop at pictures? This is opening a very dangerous door that many here will be uncomfortable with.

Scanning on their premises (which, as far as we know, they can do) would be a much better choice; this is anything but privacy-forward, whatever the linked "paper" tries to say.

[+] lovelyviking|4 years ago|reply
- Apple: Dear User, We are going to install Spyware Engine in your device.

- User: Are you out of your f... mind?

- Apple: It's for child protection.

- User: Ah, ok, no problem, please install the spyware, do whatever you wish later, and forget about any privacy, the very basis of rights, freedom and democracy.

This is, by the way, how Russia started filtering political opponents off the web. All the necessary controls were put in place under the same slogan: "to protect the children".

Yeah, right.

Are modern people so naive and dumb that they can't think two steps ahead? Is that why this is happening?

Edit: Those people would still need to explain how growing up in a society with authoritarian practices and without privacy, freedom, and democracy will make those children any 'safer'...

[+] iamleppert|4 years ago|reply
It’s pretty trivial to iteratively construct an image that has the same hash as another, completely different image if you know what the hash should be.

All one needs to do, in order to flag someone or get them caught up in this system, is to gain access to this list of hashes and construct an image. This data is likely to be sought after as soon as this system is implemented, and it will only be a matter of time before a data breach exposes it.

Once that is done, the original premise and security model of the system will be completely eroded.
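For a sense of how such an attack works, here is a toy hill-climbing sketch against the kind of simple "average hash" sketched earlier in the thread. Real demonstrations against NeuralHash used gradient descent on the network itself, but the iterative principle is the same; every name and value here is illustrative:

```python
# Iteratively forging a perceptual-hash collision by random hill-climbing
# (requires Pillow). A toy 64-bit "average hash" stands in for a real
# perceptual hash; nothing here is Apple's actual algorithm.
import random
from PIL import Image

def avg_hash(img: Image.Image, size: int = 8) -> int:
    """64-bit toy perceptual hash of a greyscale thumbnail."""
    px = list(img.convert("L").resize((size, size)).getdata())
    mean = sum(px) / len(px)
    return sum(1 << i for i, p in enumerate(px) if p > mean)

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def forge(start: Image.Image, target: int, steps: int = 100_000) -> Image.Image:
    """Mutate random pixels, keeping only changes that approach the target."""
    img = start.convert("L")              # work on a greyscale copy
    best = hamming(avg_hash(img), target)
    for _ in range(steps):
        x, y = random.randrange(img.width), random.randrange(img.height)
        old = img.getpixel((x, y))
        img.putpixel((x, y), random.randrange(256))
        dist = hamming(avg_hash(img), target)
        if dist <= best:
            best = dist                   # keep the helpful mutation
        else:
            img.putpixel((x, y), old)     # revert a harmful one
        if best == 0:
            break                         # same hash, visually different image
    return img

innocuous = Image.new("L", (64, 64), color=128)  # any starting image
target_hash = 0x0123456789ABCDEF  # stand-in for a leaked database hash
collision = forge(innocuous, target_hash)
```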

That said, if this does get implemented I will be getting rid of all my Apple devices. I’ve already switched to Linux on my development laptops. The older I get, the less value Apple products have to me. So it won’t be a big deal for me to cut them out completely.

[+] haskaalo|4 years ago|reply
At this point, I think phones can be compared to a home in terms of privacy.

In your house, you might have private documents, or do things you don't want other people to see, just like what we have on our phones nowadays.

The analogy I'm trying to make: if the government suddenly decided to install cameras in every house, on the premise of making sure no pedophile is abusing a child, promising that the cameras never send data unless locally run AI detects something, I believe that would shock everyone.

[+] skee_0x4459|4 years ago|reply
wow. in the middle of reading that, i realized that this is a watershed moment. why would apple go back on their painstakingly crafted image and reputation of being staunchly pro-privacy? it's not for the sake of the children (lol). no, something happened that has changed the equation for apple. some kind of decisive shift has occurred. maybe apple has finally caved in to the chinese market, like everyone else in the US, and is now making their devices compatible with chinese surveillance. or maybe the US government has finally managed to force apple to crack open its shell of encryption in the name of a western-flavored surveillance. either way, i think it is a watershed moment, because securing privacy will from this moment onward be a fringe occupation in the west. unless a competitor rises up -- but that's impossible, because there aren't enough people who care about privacy to sustain a privacy company. that's the real reason why privacy died today.

if you really want to save the children, why not build the scanning into safari? scan the whole phone! just scan it all. it's really no different from what they are doing. it's not like they would have to cross the rubicon to do it, not anymore anyway.

also, i think it's interesting how kids will adjust to this. i think a lot of kids won't hear about this and will find themselves caught up in a child porn case.

i'm so proud of the responses people generally seem to have. it makes me feel confident in the future of the world.

isn't there some separate device to encrypt and decrypt messages that couples to your phone? like a device that fits into a case and has a keyboard interface built into a screen protector with indium oxide electrodes.

[+] zionic|4 years ago|reply
You can’t “save the children” by building a dystopia for them to grow up in.
[+] Waterluvian|4 years ago|reply
If I go on 4chan and an illegal image loads and caches into my phone before moderators take it down or I hit the back button, will Apple’s automated system ruin my life?

This kind of stuff absolutely petrifies me because I’m so scared of getting accidentally scooped up for something completely unintentional. And I do not trust police one bit to behave like intelligent adult humans.

Right now I feel like I need to stop doing ANYTHING that goes anywhere outside the velvet ropes of the modern commercial internet. That is, anywhere that cannot pay to moderate everything well enough that I don’t run the risk of having my entire life ruined because some #%^*ing algorithm picks up on some content I didn’t even choose to download.

[+] roody15|4 years ago|reply
My two cents: I get the impression this is related to NSO's Pegasus software. Once the Israeli firm's leaks were made public, Apple had to respond, and it has patched some security holes that were exposed publicly.

NSO used exploits in iMessage to grab photos and texts, among other things.

Now, shortly after Apple's security patches, we see them pivot and want to "work" with law enforcement. Hmm, almost as if once access was closed, Apple needed a way to justify "opening" access to devices.

Yes, I realize this could be a stretch based on the info. It just seems like an interesting coincidence… back door exposed and closed… now it's back open… almost like governments demand access.

[+] shrimpx|4 years ago|reply
From Apple's original text[0]:

> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching [...]

It's incredible that Apple arrived at the conclusion that client-side scanning that you cannot prevent is more private than cloud-scanning.

Since they claim they're only scanning iCloud content, why not scan in the cloud?

They decided the most private way is to scan iCloud content before it's uploaded to the cloud... because scanning in the cloud would be seen as a breach of privacy and bad optics for a privacy-focused company? But scanning on the physical device they have described as "personal" and "intimate" has better optics? That's amazing.

This decision can only be read as Apple paving the way to scanning all content on the device, to get around the pesky "Backup to iCloud" toggle being turned off.

[0] https://www.apple.com/child-safety/

[+] nicetryguy|4 years ago|reply
I'm looking forward to this platform being expanded to facially ID against more databases such as criminals, political dissenters, or anyone with an undesirable opinion so that SWAT teams can barge into the homes of false positive identifications to murder them and their dogs.
[+] blintz|4 years ago|reply
One disappointing development from a larger perspective is that many privacy-preserving technologies (multi-party computing, homomorphic encryption, hardware enclaves, etc) are actually getting used to build tools that undermine once-airtight privacy guarantees. E2E starts to become… whatever this is.

A more recent example is how private set intersection became an easy way to get contact tracing tech everywhere while maintaining an often perfunctory notion of privacy.

I wonder where large companies will take this next. It behooves us cryptography/security people who actually care about not walking down this slippery slope to fight back with tech of our own.

This whole thing also somewhat parallels the earlier use of better symmetric encryption and enclave technologies for DRM and copyright protection.