top | item 32003756

Apple previews Lockdown Mode

1577 points | todsacerdoti | 3 years ago | apple.com

742 comments

[+] alwillis|3 years ago|reply
Let's not let the perfect be the enemy of the good.

This is a huge step forward for iPhone users. Look, I get it. From the typical HN perspective, this potentially looks like a lot of hype. But many of you aren't looking at it from a high level.

In the world we are now living in, and especially given what's happening in the United States right now, the ability for the average person to protect themselves from well-funded, determined attackers couldn't come at a better time.

There's a huge gap between Fortune 500 executives, government officials, etc. and regular people in terms of the resources available to them to prevent state-sponsored attackers. It doesn't take much these days to go from a nobody to being on somebody's radar.

If you're a woman seeking an abortion in a state where it's illegal or severely restricted, you could be the target of malware from your local or state government or law enforcement. In Texas, you can sue anyone who aids and abets a woman who attempts to get an abortion for $10,000, which is enough to get someone to trick someone into installing malware on a phone.

No, it's not China or Russia coming for you but it doesn't take much to ruin someone's life.

I don't think this is virtue signaling or marketing hype by Apple; if anything, this is right in alignment with the stance they've had on privacy for years. Even for a company the size of Apple, putting up $10 million to fund organizations that investigate, expose, and prevent highly targeted cyberattacks isn't pocket change.

At the end of the day, this is all good news for user privacy and security going forward. I also suspect that if I lock down my iPhone, my other compatible devices using the same Apple ID will also lock down. No IT department required.

[+] dkarl|3 years ago|reply
> In Texas, you can sue anyone who aids and abets a woman who attempts to get an abortion for $10,000, which is enough to get someone to trick someone into installing malware on a phone.

Anecdata for people who think this is unlikely: my wife had an issue getting unclaimed property back from the state of Texas and hired someone who advertised the ability to help. She turned out to be a bulldog with a ton of knowledge of the necessary bureaucracy. She put hours per week into it on our behalf for months, through many rounds of filing paperwork and then hounding bureaucrats on the phone by telling them exactly how and why we could sue if they ignored it. She did all that for a cut that was a fraction of the $10k abortion bounty. The $10k might seem like a symbolic gesture, but it will spawn a cottage industry of bounty hunters. No doubt most of them will be ideologically excited wannabes who quickly give it up, but some will be dogged and effective and will cultivate an expanding repertoire of skills. It's a terrifying prospect.

There will be many, many people who never previously entertained the idea of getting involved in serious criminality who now need protection from the prying eyes of the state and their fellow citizens. To look at it from a cold and opportunistic viewpoint, this could change the public perception of digital privacy from being just for dangerous creepy people to something that everybody should value.

[+] kelnos|3 years ago|reply
I have mixed feelings about this.

Lockdown Mode basically cripples the phone, feature-wise. It's not quite to the point where I'd (even hyperbolically) say "why don't you just get an old dumb phone instead", but still...

The right thing to do would be to redesign the system from the bottom up to actually be secure in the face of vulnerabilities in any of these features that get disabled because they can be dangerous for people. (And maybe Apple is working on this behind the scenes, which will take them years to complete.)

But, agreed: let's not let perfect be the enemy of the good. It's better to have this option than to not have it, even though it likely creates a super restricted user experience that probably isn't particularly pleasant to use.

[+] heavyset_go|3 years ago|reply
If the state is after you, even low-level state actors, all it takes is a court order or subpoena to compel any of the parties involved with your phone or data to hand over your data or start collecting it.

If your threat model includes any level of the US government, and that includes women seeking abortions in states where it is illegal, you cannot rely on a US-based company's tech to protect you from the law.

[+] Sebb767|3 years ago|reply
> There's a huge gap between Fortune 500 executives, government officials, etc. and regular people in terms of the resources available to them to prevent state-sponsored attackers. It doesn't take much these days to go from a nobody to being on somebody's radar.

It's also a question of whether you want that. Anyone can take anti-phishing training; it just takes a lot of time. Want to download a mod for a game? You'd better have a separate gaming machine with no important data on it and, to be safe, on a separate network. Want to buy a phone? Better drive to a random store; ordering is too dangerous.

Sure, it's easy to get on the radar, but avoiding a state-sponsored hack is also a lot of effort. Fortune 500 executives need to put that effort in, and they have the money to make it happen; but for most people, the problem is not the cost, it's the effort.

[+] datavirtue|3 years ago|reply
A prominent activist was targeted and her iPhone compromised (owned). She ended up in prison/tortured because of it.

Did not look good for Apple. For a company of their means, they had to do something.

[+] captainmuon|3 years ago|reply
I wonder: why doesn't Apple (and MS, Google, ...) throw all their weight into the ring and lobby for making it a crime to sell exploits commercially? It should be up there with counterfeiting money or selling nuclear secrets. NSO Group should be on sanctions lists. Politicians should be ranting about how dangerous it is that foreign companies and countries can spy on US citizens (instead of what they are usually ranting about).

You could wake up one morning, and every billboard in Washington, every newspaper will have ads for this issue. Every representative would be followed around by lobbyists. And Apple could pay it from their coffee money.

Now, I get why we don't crack down harder on selling exploits. First, intelligence agencies love NOBUS (No one but us) exploits and believe something like this exists. Second, it is convenient because sometimes foreign intelligence agencies are used to spy where domestic agencies are not allowed to; and third, the US could probably do little (officially) against companies, say, in Israel.

But this is totally the kind of issue that you could escalate into a bipartisan national security thing. And it would be an incredible marketing, and security win if Apple could push any stricter legislation in that direction.

[+] gtvwill|3 years ago|reply
Lol this is a whole lotta faith based on nothing. Sorry bud, Aussie laws gonna puck you here. Your Apple device can be backdoored courtesy of Aus laws, and Apple's not allowed to inform you it's happened. If you think lockdown mode gonna prevent this you're 100% dreaming. Much lulz. Y'all should just put less data on your phone if you're concerned with others knowing that data.
[+] jorvi|3 years ago|reply
I agree with the rest of your comment, but this

> Even for a company the size of Apple, putting up $10 million to fund organizations that investigate, expose, and prevent highly targeted cyberattacks isn't pocket change.

is kind of funny, as it’s about 1/20000 of their total cash reserves. With 20000 in my savings account, it’d be equivalent to giving 1 dollar to charity. In other words, pocket change :)

[+] switch007|3 years ago|reply
Is there any topic Roe v Wade can't be shoehorned in to?
[+] smoldesu|3 years ago|reply
> If you're a woman seeking an abortion in a state where it's illegal or severely restricted, you could be the target of malware from your local or state government or law enforcement.

Let's not get in above our heads, here: if the US government wants to know what's on your iPhone, they still have the faculties to retrieve that information. Setting your iPhone in a lockdown mode isn't going to let you escape the purview of government surveillance, and if it did then Apple wouldn't be announcing it today. We're all targets of government malware, and the way they ensure we all keep it installed is simple: they just make Apple and Google write it for them. This pervasive idea that Apple is somehow escaping the jurisdiction of PRISM is pretty hysterical, and it makes me excited for the first Senators to get caught paying for prostitution services with Apple Pay inside Lockdown Mode. The only enemy of "good" in a threat model is the unknown, and Apple makes sure there's plenty of unknown factors in your iPhone.

Edit: For all HN loves to rant about the Halloween Documents, you lot seem awfully unfamiliar with the Snowden leaks...

[+] hk1337|3 years ago|reply
I kind of want to turn it on and leave it on. I'm assuming since it's a "mode" that I can turn it off when I need to, do what I know is legit, then turn back on again.
[+] anshumankmr|3 years ago|reply
>$10,000, which is enough to get someone to trick someone into installing malware on a phone.

People have done far worse for far less.

[+] hulitu|3 years ago|reply
> At the end of the day, this is all good news for user privacy and security going forward.

What can I say? Good luck then with your "privacy and security going forward". And remember later, when they knock at your door, that it was for yours and (mostly) their security.

[+] mkd1964|3 years ago|reply
On the other hand, if you're a ballot trafficker, this is good news for you and the non-profits and NGOs abetting you.
[+] rmbyrro|3 years ago|reply
> putting up $10 million isn't pocket change

10 Million = 0.0027% of Apple's sales in 2021.

Equivalent to an Apple developer who made 300K in 2021 donating 8 dollars.

If this doesn't classify as pocket change, it's quite close.

[+] lrvick|3 years ago|reply
Let us not gloss over the fact that in China Apple willingly handed over their HSMs to the CCP granting them full control of Apple devices there, even if it means aiding in Uyghur genocide.

When it comes down to money, or protecting the freedom or privacy of users, they will choose money. In this case the money is in good PR to help them secure more government contracts. They are playing all sides.

I do not feel anyone that needs high freedom, security, and privacy is well served by proprietary walled gardens. Particularly those that only grant holes in the walls to corrupt state actors.

https://www.nytimes.com/2021/05/17/technology/apple-china-ce...

[+] mlindner|3 years ago|reply
> If you're a woman seeking an abortion in a state where it's illegal or severely restricted, you could be the target of malware from your local or state government or law enforcement. In Texas, you can sue anyone who aids and abets a woman who attempts to get an abortion for $10,000, which is enough to get someone to trick someone into installing malware on a phone.

Can we stop spreading these lies?

[+] akira2501|3 years ago|reply
The disconnect here is that Apple already monopolizes the devices, the service, and the application distribution platform. Now, they're expecting you to be satisfied with them monopolizing the security controls and monitoring on your phone.

We expect so little of our phones compared to our desktops, when we know full well there's no legitimate reason to do so. Particularly now: if you're imagining that one needs security against state-level actors, then the notion that a single vendor is required to simplify the ecosystem and broaden adoption is directly in conflict with this future you have declared we are now in. It's literally the weakest possible model of defense available.

This isn't the perfect being the enemy of the good... this is Apple monopolizing yet another aspect of the platform for themselves at the cost of true innovation.

[+] andrewmcwatters|3 years ago|reply
"Silly HN reader, you're just not seeing the big picture." Could you not?

You know what people do when they're targeted by state actors? They don't use computers. And if they have to, they air gap.

[+] Veserv|3 years ago|reply
Let’s not let better be the enemy of good either. Better than terrible is still bad and is nowhere near good.

It is frankly ridiculous that anybody should believe Apple when they claim to provide even minimal resistance to well-funded determined attackers. Protecting against well-funded determined attackers has been the holy grail of software security since forever and everybody in software security at least claims to be working toward that. Despite that, the prevailing state of “best-in-class” “best-practices” commercial software security is objectively terrible including Apple circa 1 year ago.

Are we supposed to believe that Apple, despite abject failure over the last few decades until as recently as the last time they announced security updates to the iPhone, has finally this time, for sure, pinky swear its true, jumped from terrible to the holy grail, or even good, because they said so?

No, this is absolute, utter, unequivocal garbage. Their claims are completely unsupported and they should be excoriated for spewing unsubstantiated bullshit that muddies the waters of the actual state of software security and misleads people into believing they are getting a meaningful degree of protection or software security.

If they want to make such claims, they should put their money where their mouth is and, instead of certifying iOS to EAL1+ and AVA_VAN.1 as they currently do, they should certify it in "Lockdown Mode" to EAL6-7 and AVA_VAN.5, which actually does certify protection against "high attack potential" attackers such as large organized crime and state-sponsored attackers. At the very least they could certify it to EAL5 and AVA_VAN.4, which certifies protection against "moderate attack potential" attackers. Until they do that, their claims to protect against state-sponsored attackers are complete unverifiable bullshit.

[+] blintz|3 years ago|reply
I am so excited about this news. I understand that some people are pessimistic, and view it as a "giving up" on complete security against nation-states. I think that's the wrong way to analyze the situation.

The dream I have is someone making a phone that is purpose-built to be secure against state actors. Unfortunately, this makes very little economic sense, and probably won't happen (maybe if some rich person started a foundation or something?). The phone would need to have pretty restricted functionality and would not be generally appealing to mass market consumers.

As it stands, securing a mass market modern smartphone, even from just remote attacks, is just intractable. We should not bury our heads in the sand and wishfully think that if they just spend a little more money, close a few more bugs, and make the sandboxing a little better, somehow iOS 16 or Android 13 will finally be completely secure against state actors. The set of features being shipped will grow fast enough that security mitigations will not someday 'catch up'.

This is the next best thing! The more we can give users the freedom to lock down their devices, the more the vision of an actual solution comes into view. This is the first step towards perhaps our only hope of solving this someday - applying formal methods and lots of public scrutiny to a small 'trusted code base', and finally telling NSO group to fuck off.

Even this dream may not pan out, but at least we can have hope.

[+] PuppyTailWags|3 years ago|reply
I would suspect any phone designed to resist a state-level actor, that is made available to me (a regular citizen) would 100% be a honeypot for a state level actor.
[+] dark_star|3 years ago|reply
Bunnie Huang is working on Betrusted [1], a communications device that is designed to be secure from state actors. The first step is Precursor (about: [2], purchase: [3]), the hardware and OS that will be the platform for the communications device.

It's designed to be secure even though it communicates via insecure wifi, for instance via tethering or at home. The CPU and most peripherals are in an FPGA with an auditable bitstream to program the device to ensure there are no back doors. Hardware and software are all open source. It has anti-tamper capability.

It looks well-thought-out.

1. https://betrusted.io/

2. https://www.bunniestudios.com/blog/?p=5921

3. https://www.crowdsupply.com/sutajio-kosagi/precursor

[+] ransom1538|3 years ago|reply
I want deniability. After watching the videos from Ukraine of Russians pulling citizens out of cars and forcing them to unlock their phones with guns to their heads -- I want a way to hand someone a phone, unlock it, and STILL be protected. I want my private things in a volume with deniability. TrueCrypt was close.
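
The hidden-volume design TrueCrypt pioneered can be illustrated with a deliberately insecure toy: one container of random-looking bytes, two passwords, each revealing a different region, so someone holding only the decoy password cannot prove a second volume exists. Everything here (the SHA-256 XOR "cipher", the fixed slot layout, all the names) is invented for illustration and is NOT real cryptography:

```python
import hashlib

# Deliberately insecure toy illustrating TrueCrypt-style deniability.
# Do not use for anything real; it only demonstrates the structure.
SLOT = 64   # fixed-size regions, so region sizes leak nothing


def _keystream(password: str, n: int) -> bytes:
    # Toy keystream from iterated SHA-256; NOT a real stream cipher.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(f"{password}:{counter}".encode()).digest()
        counter += 1
    return out[:n]


def _xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))


def make_container(decoy: bytes, secret: bytes,
                   pw_decoy: str, pw_hidden: str) -> bytes:
    pad = lambda d: d.ljust(SLOT, b"\x00")
    # Both regions look like uniform noise without the matching password.
    return (_xor(pad(decoy), _keystream(pw_decoy, SLOT)) +
            _xor(pad(secret), _keystream(pw_hidden, SLOT)))


def open_container(blob: bytes, password: str, hidden: bool) -> bytes:
    region = blob[SLOT:] if hidden else blob[:SLOT]
    return _xor(region, _keystream(password, SLOT)).rstrip(b"\x00")
```

Handing over the decoy password reveals only the decoy region; nothing in the blob proves the second region is anything but padding. Real systems (TrueCrypt/VeraCrypt hidden volumes) achieve this with proper ciphers and by nesting the hidden volume inside the outer volume's free space.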
[+] gambiting|3 years ago|reply
>>The dream I have is someone making a phone that is purpose-built to be secure against state actors

I just don't see how anyone could build such a thing. State-level actors have the tools necessary to force you or your company to build in any backdoor they want, and to prevent you from ever talking about it to anyone. The US certainly does, and could just force Apple to add a backdoor to this lockdown mode, and Apple could never even hint at its existence under legal threat.

[+] RonMarken|3 years ago|reply
Realistically you cannot win against a resourceful adversary every time. But merely painting the situation through the lens of premature surrender is also a disservice.

It will be interesting to see what third-party researchers discover about these new protections. I seem to remember that Apple rewrote the format parsers for iMessage in a memory-safe language with sandboxing (BlastDoor), and it was later discovered there was still plenty of attack surface in the unprotected parsers.

[+] googlryas|3 years ago|reply
It might just be better to not rely on a phone, rather than rely on something achieving perfect security against the most malicious and capable of actors.

If I was really concerned about targeted cyber attacks against me, I think that I would exclusively use computers that I would buy from random people on Craigslist, take the hard drives out and only boot with live CDs using ram disks, and only connect via random public Wi-Fi locations.

[+] the_other|3 years ago|reply
With this announcement, Apple are saying "we will protect you from state actors", which is a role usually performed by states. Apple are saying "we operate at the same level as nation states; we are a nation-state-level entity operating in the digital world." It's a flag-raise.

It's the first such flag-raise I've seen. Security researchers talk about protections from state actors all the time, and there are tools which support that... but this is the first public announcement, and tool, from a corporation with more spare, unrestricted capital than many countries. It comes at a time when multiple nation states are competing for energy and food security; and Apple are throwing up a flag for a security-security fight (or maybe data-security). This is not just handy tech, it's full-on cultural zeitgeist stuff. Amazing.

[+] Terretta|3 years ago|reply
This is great, but also clever.

By offering users a more locked down option with clear tradeoffs, (a) users can make a choice between security and convenience, and (b) given user agency, negative press around hacks of not locked-down devices loses potency.

Meanwhile, the choice seems straightforward on most of these...

Lockdown Mode includes the following protections:

- Messages: Most message attachment types other than images are blocked. Some features, like link previews, are disabled.

GREAT!

- Web browsing: Certain complex web technologies, like just-in-time (JIT) JavaScript compilation, are disabled unless the user excludes a trusted site from Lockdown Mode.

GREAT!

- Apple services: Incoming invitations and service requests, including FaceTime calls, are blocked if the user has not previously sent the initiator a call or request.

GREAT!

- Wired connections with a computer or accessory are blocked when iPhone is locked.

GREAT! (Used to have to do this yourself with Configurator if you wanted to be hostile border-crossing proof.)

- Configuration profiles cannot be installed, and the device cannot enroll into mobile device management (MDM), while Lockdown Mode is turned on.

HMM ... there are hardening settings only available through Configurator or MDM profiles. Will those be defaulted on as well?

[+] matthewdgreen|3 years ago|reply
Last year I wrote: "In the world I inhabit, I’m hoping that Ivan Krstić wakes up tomorrow and tells his bosses he wants to put NSO out of business. And I’m hoping that his bosses say 'great: here’s a blank check.' Maybe they’ll succeed and maybe they’ll fail, but I’ll bet they can at least make NSO’s life interesting." [1]

Maybe this is the blank check :)

[1] https://news.ycombinator.com/item?id=27897975

[+] newscracker|3 years ago|reply
I hope Apple expands this quickly through minor updates to the OS rather than waiting for a next major release. This needs faster iteration than anything else.

Quoting what’s in the first release:

> At launch, Lockdown Mode includes the following protections:

> Messages: Most message attachment types other than images are blocked. Some features, like link previews, are disabled.

> Web browsing: Certain complex web technologies, like just-in-time (JIT) JavaScript compilation, are disabled unless the user excludes a trusted site from Lockdown Mode.

> Apple services: Incoming invitations and service requests, including FaceTime calls, are blocked if the user has not previously sent the initiator a call or request.

> Wired connections with a computer or accessory are blocked when iPhone is locked.

> Configuration profiles cannot be installed, and the device cannot enroll into mobile device management (MDM), while Lockdown Mode is turned on.

I’m not a target (I think, and hopefully don’t get to be one), but nevertheless I’d feel safer with this turned on (I very rarely use FaceTime, so not accepting it is not a big deal).

I’d also love more protections. Not allowing specific apps to connect to any network (WiFi included), Apple handling issue reports on apps with urgency (right now they seem to be ignored even when policy violations which are against the user’s interests are reported), etc.

[+] mcculley|3 years ago|reply
This is great but too big of a hammer for most use cases. What I really want is a per-application firewall.

For example, say I would like to install a photo editing application. It would need access to my photos. That is fine, so long as it is not allowed to connect to the Internet (or any other network). There is currently no way to ensure this.
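
A per-app firewall like the parent describes boils down to a default-deny policy table keyed on (app, destination). A toy sketch of the decision logic, with all names invented for illustration:

```python
# Toy per-app network policy table: default-deny for any app not listed.
# App names and the policy shape are hypothetical, not any real OS API.
APP_POLICY = {
    "photo-editor": set(),   # may read photos, but reach zero network hosts
    "browser": {"*"},        # unrestricted network access
}


def connection_allowed(app: str, host: str) -> bool:
    """Return True only if `app` is explicitly allowed to reach `host`."""
    allowed = APP_POLICY.get(app, set())   # unknown apps get the empty set
    return "*" in allowed or host in allowed
```

Real enforcement would have to live in the OS network stack rather than in the app itself; the point of the sketch is only that the policy is trivial to express once the platform exposes the hook.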

[+] _the_inflator|3 years ago|reply
"Web browsing: Certain complex web technologies, like just-in-time (JIT) JavaScript compilation, are disabled unless the user excludes a trusted site from Lockdown Mode."

Highly interesting that Apple is doing this. It's a real trend: MS and Google are also taking steps to harden Chromium against JIT compiler issues in JavaScript. https://www.zdnet.com/article/securing-microsoft-edge-switch...
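
Mechanically, the per-site exclusion quoted above amounts to a policy check before the engine picks its execution tier; a minimal sketch, with all names invented:

```python
# Sketch of Lockdown Mode's "trusted site" JIT exclusion: with lockdown on,
# the JavaScript engine stays interpreter-only unless the user explicitly
# excluded the host. The set name and function are hypothetical.
TRUSTED_HOSTS = {"intranet.example.com"}   # user-excluded sites


def jit_enabled(host: str, lockdown: bool) -> bool:
    if not lockdown:
        return True                  # normal mode: JIT everywhere
    return host in TRUSTED_HOSTS     # lockdown: JIT only on excluded sites
```

The security argument is that a JIT must write and then execute machine code at runtime, which makes engine bugs far easier to weaponize; interpreter-only execution trades speed for a much smaller attack surface.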

[+] egberts1|3 years ago|reply
Too bad that Google does not offer this same “Lockdown Mode” as Apple does.

Instead, they (Google Play Store) removed our ability to see what "app privileges" an app would require BEFORE we do the installation step from the Google Play Store. What we got instead was an obfuscated "Data Security" section that is pretty much always "blank".

My flashlight app should not require a GAZILLION app privileges, nor hide that fact before I can determine whether I can safely install it, much like the Apple App Store does with its CRUCIAL pre-reveal of any needed app privilege(s) … for our leisurely perusal and for applying any applicable but personalized privacy requirement BEFORE we do the app install.

[+] janandonly|3 years ago|reply
If Apple was really serious about this, they would add one more feature to Lockdown mode: To delete and scrub permanently and definitively all your iCloud data.

You can close the proverbial "front door" by enabling "Lockdown mode", but if that same government sends a subpoena to Apple, then they will just give them a copy of all your iCloud private data.

[+] pluc|3 years ago|reply
Apple's been making it real difficult to pick Android lately. The only thing Android still has going for it is the ability to flash custom ROMs, e.g. CalyxOS or Graphene.
[+] lisper|3 years ago|reply
Extreme? This sounds like the way I have my computing environment configured by default (to the extent that I'm able to do so with browser extensions and whatnot).
[+] tialaramex|3 years ago|reply
> Most message attachment types other than images are blocked.

Who wants to bet that this reflects minimum requirements dictated for user experience, rather than reflecting what Apple are actually securing today?

The correct model here, the one that would actually defeat these adversaries, is to start with what you can actually secure and expand from there, prioritising customer needs. This delivers security improvements for all customers, but it makes the calculus simple for Lockdown customers, whatever Lockdown allows will be OK.

Suppose today Apple has a working safe BMP reader, and a working safe WAV reader, but they're still using their ratty JPEG and MP3 implementations. As described, this feature says you can receive a JPEG attachment (which takes over your phone and results in your cousin who remains in the country being identified as a contact and imprisoned) but you can't listen to the WAV file an informant sent you because that's "dangerous"...
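
The model the parent proposes, where only formats with hardened parsers get through, amounts to an allowlist keyed on file signatures. A toy sketch (the choice of "hardened" formats is hypothetical, borrowing the BMP/WAV example above; the magic numbers themselves are real):

```python
# Toy attachment filter: admit only formats whose parsers are assumed
# hardened; block everything else, including JPEG. The policy is invented.
ALLOWED_MAGIC = {
    b"BM": "bmp",     # BMP files start with "BM"
    b"RIFF": "wav",   # WAV files start with the RIFF chunk header
                      # (so do AVI/WebP; a real filter checks the form type too)
}


def classify_attachment(data: bytes):
    """Return the allowed format name, or None if the attachment is blocked."""
    for magic, fmt in ALLOWED_MAGIC.items():
        if data.startswith(magic):
            return fmt
    return None   # unknown or unhardened format: block it

# A JPEG (leading bytes 0xFF 0xD8) falls through to None and is blocked --
# the inverse of the image-only policy Apple actually announced.
```

The point of the expand-from-what-you-can-secure model is exactly this inversion: the allowlist tracks parser quality, not user-experience requirements.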

[+] highwaylights|3 years ago|reply
This seems to mimic, or at least rival, Google's Advanced Protection Program which has been running for a few years to offer similar protections to Google/Android users.

My concern about enabling this would be that I'm unsure how much this puts barriers in place to prevent the owner of an account regaining access should it be stolen by a threat actor (i.e. could this backfire on the account owner?).

It's still unclear to me how much Apple really protects against (for example) sim swaps to take over an iCloud account - and the documentation around when they'll truly insist on having something like a Recovery Key if it's enabled is sparse. It almost reads as if the right amount of begging will socially engineer access to a locked iCloud account by a threat actor with the right personal information to hand, which if coupled with Lockdown mode, seems pretty dangerous to the true account holder.

[+] someguydave|3 years ago|reply
This lockdown mode looks like what ought to be default security behavior.
[+] TIPSIO|3 years ago|reply
If you are "a target" and going to take measures of basically disabling everything on your iPhone, wouldn't it just make sense to get a burner dumb phone?

Hasn't this been happening for years (drug dealers, anonymous, etc..)?

[+] Nextgrid|3 years ago|reply
Most of the features of this lockdown mode should be on by default.
[+] post_break|3 years ago|reply
When reading through this list at each feature I can't help but go "why isn't this in regular iOS?"