
Apple tells U.S. judge 'impossible' to unlock new iPhones

328 points | aaronbrethorst | 10 years ago | reuters.com

193 comments

[+] BWStearns|10 years ago|reply
I applaud Apple for having made it technically impossible for them to betray their customers' trust, but I was just wondering about third party applications which have been granted access to various data on the phone. If the prosecution is looking for evidence in photos, and the suspect has granted Facebook access to their photos, could a judge compel Facebook to use their access to the phone in question to retrieve photos that were never on Facebook? I ask this in both a legal sense as well as a technical sense as I am not familiar enough with iOS permissions/API.
[+] avn2109|10 years ago|reply
>> "...technically impossible for them to betray their customers' trust..."

Impossible is a very strong word in this context. Let me take the opportunity to remind everyone that an iPhone is a very complicated device running nearly 100% closed source hardware and software, including all sorts of opaque cryptographic hardware and a known-to-be-compromised secondary baseband computer, such that the security of the device's entire stack, top-to-bottom, could not possibly be verified by a third party in principle, let alone in practice.

In light of that fact, security claims by Apple could be regarded as changing (reducing?) the probability of the law getting your data by technical means or collusion with Cupertino, but certainly not as ensuring that the probability = 0.

Edit: This is not an indictment of Apple per se, since the same is true of literally every smartphone ever constructed, but at least e.g. Blackberry isn't out there claiming that they're unable to compromise your data, full stop.

[+] kevinchen|10 years ago|reply
From my understanding - important files are encrypted, and the ability to decrypt is lost when the user sleeps their phone. So iOS needs the user to enter a passcode before the data can be read again.
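The scheme kevinchen describes can be sketched roughly like this (a hypothetical simplification, not Apple's actual implementation): the passcode is stretched together with a device-bound secret, and the derived key is simply discarded when the phone locks.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, uid_key: bytes) -> bytes:
    """Stretch the passcode together with a device-bound secret.

    uid_key stands in for the per-device hardware key that never
    leaves the phone; without it, the derivation can't be run off-device.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode() + uid_key,
                               salt, 100_000)

salt = os.urandom(16)
uid_key = os.urandom(32)  # hypothetical hardware-fused secret

key = derive_key("1234", salt, uid_key)

# When the phone locks, the derived key is dropped from memory; only
# re-entering the same passcode can regenerate it.
assert derive_key("1234", salt, uid_key) == key
assert derive_key("0000", salt, uid_key) != key
```

Because the hardware secret is folded into the derivation, the passcode alone is not enough to rebuild the key off-device.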
[+] dmayle|10 years ago|reply
What's more, there's the app update mechanism. From a technical standpoint, Apple just has to auto-update the app to a version that has the necessary access.
[+] shalmanese|10 years ago|reply
For those who have questions about "Couldn't Apple just do X or Y?", Apple has published an eminently readable whitepaper on the software and hardware components of their security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Of course, Apple could be implementing things differently from as described but the whitepaper lays out what is and is not possible in the described system.

[+] zaroth|10 years ago|reply
The iOS Security Guide is 60 pages of technical goodness, but when dealing with encryption, it's really not enough to fully understand the system, even putting aside concerns about it being closed-source. There is such an incredible level of detail in the document, it would take a really long time to peel that onion back to be able to claim to really understand the trust model. It's some of the best technical documentation on encryption methods I've read, but even still there is plenty of room for ambiguity.

The key section I think most people really should take a look at is "iCloud Backup" starting on page 42. Almost everything you do with your device will end up in an iCloud Backup if you have enabled that, and while the data is encrypted for transport, note well the following:

  The backup set is stored in the user’s iCloud account and consists of a copy of the
  user’s files, and the iCloud Backup keybag. The iCloud Backup keybag is protected by
  a random key, which is also stored with the backup set. (The user’s iCloud password
  is not utilized for encryption so that changing the iCloud password won’t invalidate
  existing backups.)
In plain English, if you have enabled iCloud Backup, everything but your keychain itself is accessible in plaintext to Apple, and can be restored, without your password, to any new device that [you / the Feds] may provide.

I would very much love for Apple to provide an opt-in where the iCloud backup key is tied to the account password with extremely aggressive key-stretching. I would take the risk of losing my iCloud Backup over the trade-off of having my backups accessible to Apple and anyone they can be compelled to share them with. But I do appreciate that for the average user, it's not uncommon for iCloud Backup to be immediately preceded by a password reset (just look down-thread for an example).

Note, Apple says that they use a combination of S3 and Azure to actually store the iCloud data, but that they have an additional layer of encryption over the data before sending it out. So while backups technically reside on Amazon/Microsoft servers, it's a black box to them.
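The opt-in described above could look roughly like this (a sketch under assumptions, not anything Apple ships): derive the backup key from the account password with heavy stretching, so the server only ever holds ciphertext.

```python
import hashlib
import os

def stretched_backup_key(password: str, salt: bytes,
                         iterations: int = 600_000) -> bytes:
    """Derive the backup encryption key from the account password with
    aggressive PBKDF2 stretching; the server never sees a usable key."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

salt = os.urandom(16)
key = stretched_backup_key("correct horse battery staple", salt)

# The trade-off mentioned above: forget or change the password and old
# backups become undecryptable, because the key depends on it.
assert stretched_backup_key("correct horse battery staple", salt) == key
assert stretched_backup_key("new password", salt) != key
```

This is exactly why Apple's current design uses a random key instead: password resets don't invalidate existing backups, at the cost of Apple holding the key.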

[+] abtinf|10 years ago|reply
If they are doing things substantially differently than outlined in that whitepaper, and so actually do have access to locked devices, then they would have just perjured themselves and exposed themselves to extraordinary liability. Which makes me think they are probably telling the truth - short of 0-days, no one has access to locked iOS devices.
[+] faide|10 years ago|reply
In case anyone shares my confusion: they aren't talking about unlocking in the sense of removing carrier restrictions, but rather in the sense of disabling the security features of the device to access the data stored on it.
[+] alialkhatib|10 years ago|reply
I was similarly confused. While the HN title accurately reflects the Reuters title, I think it'd be reasonable to retitle the submission to something along the lines of "Apple: breaking encryption on devices running iOS 8 and newer 'would be impossible'", or something (point being to highlight that we're talking about the encryption issue, not carrier).
[+] daimyoyo|10 years ago|reply
I believe they're talking about trying to access a phone once it has been "locked" from too many wrong tries at the lock screen.
[+] supercoder|10 years ago|reply
Makes sense to anyone who uses an iPhone though.
[+] jmount|10 years ago|reply
I agree. For phones "unlock" means removing carrier restrictions and getting information off is something like "decrypt" or "crack."
[+] api|10 years ago|reply
So on one hand we're carrying around little location-aware sensors that upload all kinds of tracking data to who-knows-where, enabling unprecedented tracking of human beings and their movements at incredibly high resolution. We also voluntarily post so much data online any number of social media sites, financial institutions, etc. can see almost everything we do.

BUT on the other hand, with proper use of crypto it is now possible to engage in secure communication at a distance that can never be decrypted without our consent and (if certain protocols and methods are used) allow anonymity at both ends of the link (e.g. Tor hidden services). It is also possible to securely send money to others using Bitcoin without knowing their identity... ever.

Very interesting times... it's both a panopticon and an unprecedented age for anonymity and privacy.

[+] revelation|10 years ago|reply
Well no, you have already detailed the crucial vulnerability in this panopticon: the devices cannot be trusted. Trying to do encryption and secure communication with untrusted devices is utter folly.

I'm just astonished by the breadth of vulnerability in modern smartphones, particularly with Google's handling of Android security. We haven't even gotten into baseband vulnerabilities yet because the whole application processor security thing is just an entirely crumbling piece of cardboard.

[+] lemiant|10 years ago|reply
Can someone explain how Apple has managed to render brute force ineffective if there are only 10000 possible 4-digit passcodes?
[+] 0x0|10 years ago|reply
Depending on the settings, the phone could:

  * Wipe after 10 unsuccessful PIN attempts
  * Be configured with a 6 digit numeric PIN code
  * Be configured with an unlimited alphanumeric password
  * Exponentially increase delay between PIN attempts after
    unsuccessful entry - for example 3 attempts in 3 seconds,
    next attempt after 10 seconds, next attempt after 60 
    seconds, next attempt after 4 hours, next attempt after 
    24 hours, next attempt after a week, next attempt in a 
    year (making up numbers to prove a point)
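The escalating-delay policy in the last bullet can be sketched as a simple lookup table (illustrative numbers only, matching the made-up ones above, not Apple's actual schedule):

```python
def retry_delay_seconds(failed_attempts: int) -> int:
    """Return the lockout duration after n failed PIN attempts.
    Thresholds and delays are illustrative, not iOS's real values."""
    schedule = [
        (3, 0),             # first few attempts: no delay
        (4, 10),
        (5, 60),
        (6, 4 * 3600),
        (7, 24 * 3600),
        (8, 7 * 24 * 3600),
    ]
    for threshold, delay in schedule:
        if failed_attempts <= threshold:
            return delay
    return 365 * 24 * 3600  # effectively a year

assert retry_delay_seconds(2) == 0
assert retry_delay_seconds(5) == 60
```

Even with only 10,000 possible 4-digit PINs, delays like these push an exhaustive online search out to years.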
[+] dankohn1|10 years ago|reply
My 6-year-old was very excited to find the passcode feature on his iPad and disregarded Daddy's direct warnings that if he kept changing the passcode ("soccer", "baseball", etc.) he wouldn't be able to remember the result and would get locked out. This, of course, happened, and after some tears, the solution was for me to remote wipe the iPad and then restore from iCloud backup. The whole process was an extremely impressive mix of security and usability.
[+] mcculley|10 years ago|reply
After a few invalid attempts, it forces you to wait a while before trying again, increasing the wait time each time. One can also set the phone to wipe itself after 10 failed attempts.
[+] risk|10 years ago|reply
After six tries a rate limit is reached. There is no cool down period. You have to connect it to your computer and login to apple, or wipe it.
[+] djrogers|10 years ago|reply
Aside from the valid points below about rate limits and auto-wiping, the new default is a 6-digit passcode, so more zeros.
[+] jansenvt|10 years ago|reply
the same way everyone else does. they limit the number of attempts you get.
[+] draw_down|10 years ago|reply
You don't just get to try 10000 times.
[+] tartuffe78|10 years ago|reply
There is an option to "Erase Data after 10 failed passcode attempts"
[+] peterhadlaw|10 years ago|reply
To everyone replying to this comment saying that you can set a limit and the phone will wipe itself once the limit is reached: that point is moot. Although I wish it weren't so, a couple of people were able to set up a mechanism to brute-force an iPhone PIN by cutting power to the device before it registered an unsuccessful attempt, therefore giving you limitless attempts.
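The bypass described above works when the attempt counter is persisted only after a failed check. A defensive sketch (hypothetical logic, not iOS's implementation): commit the attempt to storage *before* verifying, so cutting power mid-check still costs an attempt.

```python
class PinGuard:
    """Increment-before-verify: the attempt is committed to 'storage'
    before the PIN is checked, so killing power mid-check can't roll
    the counter back. The dict stands in for flash/secure storage."""

    def __init__(self, pin: str, storage: dict):
        self._pin = pin
        self.storage = storage
        self.storage.setdefault("failed", 0)

    def try_unlock(self, guess: str) -> bool:
        # Persist the attempt FIRST; only a successful unlock resets it.
        self.storage["failed"] += 1
        if guess == self._pin:
            self.storage["failed"] = 0
            return True
        return False

storage = {}
guard = PinGuard("4821", storage)
guard.try_unlock("0000")
assert storage["failed"] == 1  # counted even if power is cut right now
assert guard.try_unlock("4821")
assert storage["failed"] == 0
```

With ordering like this, the power-off trick buys the attacker nothing: the failure is already on disk before the comparison happens.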
[+] guelo|10 years ago|reply
Theoretically, couldn't Apple push an auto-update to the phone that could decrypt the data?
[+] jacobwil|10 years ago|reply
Where would that software auto-update get the decryption key from? If the user has enabled a passcode, then their passcode is used in generation of the key used to protect the disk encryption key. Even if apple did push an update, the user would still have to unlock their device post-update.
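The layering described above can be illustrated with a toy example (hypothetical and simplified, using XOR in place of a real key-wrap cipher): the passcode-derived key only *wraps* the random disk key, so a pushed update without the passcode still can't reach the data.

```python
import hashlib
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Toy 'wrap' operation; real systems use AES key wrap, not XOR."""
    return bytes(x ^ y for x, y in zip(a, b))

# The random disk key is what actually encrypts the file system.
disk_key = os.urandom(32)

# The passcode-derived key-encryption-key (kek) only wraps the disk key.
salt = os.urandom(16)
kek = hashlib.pbkdf2_hmac("sha256", b"1234", salt, 100_000)
wrapped = xor_bytes(disk_key, kek)

# An update ships with only `wrapped` available on disk; without the
# passcode there is no kek, hence no disk_key, hence no plaintext.
unwrapped = xor_bytes(wrapped, hashlib.pbkdf2_hmac("sha256", b"1234", salt, 100_000))
assert unwrapped == disk_key
```

The point: code alone cannot conjure the key, because the key material is mathematically gated on the user's input.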
[+] vessenes|10 years ago|reply
There is US case-law around this issue; the Federal government cannot compel "speech"; code is speech.
[+] djrogers|10 years ago|reply
The update couldn't be installed without someone unlocking the phone first... Not to mention the legal side of that would be very murky, I doubt any judge/court in the US could compel that action.
[+] joosters|10 years ago|reply
Some (but not all) user data on the phone is encrypted with the password, so no update would extract it all.
[+] tacotuesday|10 years ago|reply
This sounds really similar to when the Snowden docs were first leaked. You have to listen very carefully to exactly what is said and read between the lines.

http://www.cnet.com/news/obama-denies-that-us-spied-on-germa...

"the United States is not monitoring and will not monitor the communications" means, "Was monitoring up until just now."

Here Apple says "We can't unlock new iPhones." Apple says nothing about their ability to restore iCloud backups or load software to capture evidence from a locked phone.

[+] PopeOfNope|10 years ago|reply
You hear that? That's the sound of a million security researchers simultaneously shouting 'Challenge Accepted!'
[+] asadhaider|10 years ago|reply
For users with Touch ID enabled, would it be legal for law enforcement to use your fingerprint on file to access the phone that way?

I'm not sure how easy it would be to fool the Touch ID sensor on the phone. I think I remember a MythBusters episode where they made a silicone mould of a fingerprint to bypass a security system.

[+] zaroth|10 years ago|reply
Yes, and there is case law that fingerprints are not testimony (since you leave them everywhere you go). However, as others stated below, it won't work because the PIN is still required after a timeout. Really, they thought this through.
[+] interpol_p|10 years ago|reply
Too many failed attempts (three, I think) and the phone reverts to requiring a password. So unless they can get it right on the first try it would still be a bit of a risk — because even using your actual finger can miss at times.
[+] 0x0|10 years ago|reply
TouchID requires a passcode after the phone has been locked for 48 hours or so.
[+] qyv|10 years ago|reply
The only direct quote in the article that offers any real context:

"Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand"

This quote, granted taken without full context, seems to indicate a pretty clear stipulation of "clear legal authority". Not sure what to make of that, but I do think that in matters of privacy and surveillance the words are always chosen very carefully indeed.

[+] joeyspn|10 years ago|reply
In security, impossible is a marketing word
[+] venomsnake|10 years ago|reply
There is situational impossibility though. Apple not being able to decrypt a powered down phone is possible if they did their design work properly.
[+] __sarcasm|10 years ago|reply
But what if some diabolical fiend attaches an encrypted iPhone to a nuclear bomb, and the only way to defuse the timer is by asking Apple to unlock the encryption to stop the timer???

Also, "child pornography" ...what if "child pornography" is involved? You wouldn't want THAT, right?

[+] xkiwi|10 years ago|reply
what if this is parallel construction? Could it be stated as 'impossible' to 'unlock' (crack/backdoor) at the civil level but unlockable at the DoD/NSA level?
[+] vbezhenar|10 years ago|reply
iPhone passcodes are very short by default. The iPhone protects them from brute force with delays after unsuccessful tries. But does the hardware where the passcode's hash is stored enforce those delays and protect it from a laboratory attack? I mean, the government could just open the iPhone, extract that chip, extract the passcode's hash, and crack it. You would need quite a long password to withstand such an attack.
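The laboratory-attack worry is easy to quantify: if the passcode hash could be extracted and attacked off-device, without the hardware key or delays, all 10,000 four-digit PINs fall in a fraction of a second. A sketch (assuming a plain salted SHA-256, which is deliberately weaker than any real design, precisely to make the point):

```python
import hashlib

def hash_pin(pin: str, salt: bytes) -> bytes:
    # Deliberately weak: a single fast hash, no hardware entanglement.
    return hashlib.sha256(salt + pin.encode()).digest()

salt = b"device-salt"
target = hash_pin("4821", salt)

# Exhaustive offline search of the entire 4-digit PIN space.
found = next(p for p in (f"{i:04d}" for i in range(10_000))
             if hash_pin(p, salt) == target)
assert found == "4821"
```

This is why tying the derivation to a non-extractable hardware key matters: it forces every guess through the device itself, where the delay and wipe policies apply.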

Another attack vector is the fingerprint scanner. After the iPhone is turned on, by default you can unlock it with the fingerprint scanner. And turning the iPhone off is not very quick: you need to hold the power button for a few seconds and then slide across the screen. If the police arrest you, they will take the phone from your hands and just force your finger onto it to open it, no passcode needed.

So yes, while the iPhone is probably safe when properly configured (ruling out possible vulnerabilities), the proper configuration is hard to use and is not the default.

That's from the point of view of a government attacker. If the attacker is just some thief or a commercial-espionage operation, the iPhone is probably a good device even with default settings.

[+] yyhhsj0521|10 years ago|reply
Chrome says that the page contains malicious software... oops?
[+] jlebrech|10 years ago|reply
So the US wants it to be impossible for the average person to unlock a phone (via law), but Apple makes it technically impossible and the US justice system cries foul. Swings and roundabouts.