item 17198481

Insider Attack Resistance

126 points | el_duderino | 7 years ago | android-developers.googleblog.com | reply

65 comments

[+] ISL|7 years ago|reply
This is the sort of development that makes me want a Pixel 2.

I've been leaning toward an iPhone, for the first time, on security grounds, so this is a welcome piece of news.

Thanks, Google. Please keep it up.

[+] saagarjha|7 years ago|reply
I'm pretty sure everything mentioned here has been a feature on iPhones for quite some time.
[+] cryptonector|7 years ago|reply
What about app security? I gave up on Android years ago. Is it still the case that apps expect to be given most/all permissions in order to function? If so then no thanks.
[+] kerng|7 years ago|reply
It shows that Google is behind... I think most systems (iPhone, even Windows with TPM and Bitlocker) had this stuff for many years.
[+] gustavmarwin|7 years ago|reply
The beauty of this can be seen in how these security features are leveraged and enhanced on CopperheadOS.

I criticise Google a lot for how much information they store on us, but the work they do on both Android (open source) and the Pixel phones' hardware deserves more praise.

[+] vuluvu|7 years ago|reply
"To prevent attackers from replacing our firmware with a malicious version, we apply digital signatures."

How about putting a read/write switch on the device that prevents writing to the firmware if the switch is in the off position.
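The signature check the quote describes can be sketched as a boot-time verification step. This toy uses an HMAC from the Python standard library as a stand-in for the asymmetric signature (RSA/ECDSA against a public key burned into ROM) that real verified boot uses; all names here are illustrative, not Google's actual firmware:

```python
import hmac
import hashlib

# Key provisioned at manufacture. A real device verifies an asymmetric
# signature against a public key in ROM; an HMAC stands in here so the
# sketch runs with only the standard library.
DEVICE_KEY = b"factory-provisioned-key"

def sign_firmware(image: bytes) -> bytes:
    """Produce the tag the factory attaches to a firmware image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def boot(image: bytes, signature: bytes) -> bool:
    """Refuse to execute any firmware whose signature does not verify."""
    if not hmac.compare_digest(sign_firmware(image), signature):
        return False  # fall into recovery instead of running the image
    return True       # jump to the verified image

fw = b"pixel-firmware-v1"
good_sig = sign_firmware(fw)
assert boot(fw, good_sig)                      # genuine firmware boots
assert not boot(b"malicious-firmware", good_sig)  # swapped image is rejected
```

Note this is exactly why a physical read/write switch is largely redundant for this threat: unsigned firmware never executes regardless of whether it can be written.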

[+] jakobegger|7 years ago|reply
> We recommend that all mobile device makers do the same.

Kind of insincere when the biggest competitor has been doing this since 2013 (Apple markets the feature as "Secure Enclave").

[+] mandevil|7 years ago|reply
The Apple Secure Enclave cannot defend against someone who has the signing keys to the password software (or at least couldn't as of 2016) - that's why the FBI wanted Apple's "help" over the San Bernardino shooter. Apple said no, but could have complied - fighting the FBI was a policy choice of theirs. Google has created a situation with the Pixel 2 where they can't do that sort of thing even if they wanted to. And they justified it without ever referencing search warrants or nation-state threat actors, even though those are the obvious driving forces here.
[+] bitmapbrother|7 years ago|reply
This is not simply about the presence of a "secure enclave" or security module, as Google calls it. It's about preventing the firmware on the security module from being compromised without knowing the user's password.

To mitigate these risks, Google Pixel 2 devices implement insider attack resistance in the tamper-resistant hardware security module that guards the encryption keys for user data. This helps prevent an attacker who manages to produce properly signed malicious firmware from installing it on the security module in a lost or stolen device without the user's cooperation. Specifically, it is not possible to upgrade the firmware that checks the user's password unless you present the correct user password. There is a way to "force" an upgrade, for example when a returned device is refurbished for resale, but forcing it wipes the secrets used to decrypt the user's data, effectively destroying it.
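The policy described above can be sketched as a tiny state machine: signed firmware alone is not enough to upgrade, and the forced path destroys the key material first. Class and method names here are hypothetical, a minimal model of the behavior, not Google's implementation:

```python
import hashlib
import secrets

class SecurityModule:
    """Toy model of a security module whose firmware can only be
    upgraded with the user's password, or by destroying key material."""

    def __init__(self, password: str):
        self._password_hash = hashlib.sha256(password.encode()).digest()
        self._data_key = secrets.token_bytes(32)  # wraps the user's data keys
        self.firmware = b"fw-v1"

    def upgrade_firmware(self, new_firmware: bytes, password: str) -> bool:
        # Even a properly signed image is rejected without the password.
        if hashlib.sha256(password.encode()).digest() != self._password_hash:
            return False
        self.firmware = new_firmware
        return True

    def force_upgrade(self, new_firmware: bytes) -> None:
        # Forced path (e.g. refurbishing a returned device): wipe the
        # key first, making the user's encrypted data unrecoverable.
        self._data_key = None
        self.firmware = new_firmware

sm = SecurityModule("hunter2")
assert not sm.upgrade_firmware(b"fw-v2", "wrong")  # signed-but-unauthorized
assert sm.upgrade_firmware(b"fw-v2", "hunter2")    # user-authorized upgrade
sm.force_upgrade(b"fw-v3")
assert sm._data_key is None                        # data destroyed by force path
```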

[+] praptak|7 years ago|reply
Does Apple market their solution as resistant to someone having all Apple's keys but not the user password?
[+] wffurr|7 years ago|reply
> biggest competitor

You mean Samsung?

[+] delvinj|7 years ago|reply
What if the attack is in the form of a court order?
[+] ZeroCool2u|7 years ago|reply
This constraint forces an attacker to target the user who actually holds the password. From a security perspective, the attacker must now coerce the secret out of the user rather than out of the company responsible for the firmware, in this case Google.

My guess is that in the U.S. the 4th and 5th Amendments would prevent the government from forcing you to reveal the secret, so long as you do not rely on biometric security, which in some cases has been ruled exempt from the protections afforded a password. IANAL though, so I can't really elaborate on why. I think if anything you're likely to be held on obstruction charges or have your assets frozen in an attempt to pressure you into cooperating. In other, perhaps less forgiving locales like North Korea, China, or Russia, I imagine one may end up the subject of persuasion of a more physical nature.

[+] TimTheTinker|7 years ago|reply
A court order can’t compel someone to do the impossible. The updates in the Pixel 2 make it impossible for Google to circumvent security measures on it, thus protecting them from being coerced to do so (by courts and criminals alike).
[+] conradev|7 years ago|reply
That's the same attack vector as far as this change is concerned.

The idea is that nothing, not even Google, can change the firmware without first wiping the device or entering the passcode.

[+] smattiso|7 years ago|reply
Have there been notable cases of a malicious actor installing a compromised OS on a target's phone for spying purposes?
[+] nmstoker|7 years ago|reply
Impressive, but given the prevalence of apps that demand full access to all USB contents, arming the user with only an all-or-nothing accept-or-decline choice, this seems like an electronic Maginot Line.

But to be fair, they've got to start somewhere and there is always hope they'll extend the permissions options to be more powerful.

[+] ec109685|7 years ago|reply
One attack this wouldn’t guard against is a malicious actor pushing a buggy version of the Secure Enclave code that couldn’t be updated without destroying all data on the phone.
[+] dredmorbius|7 years ago|reply
Does the firmware signature preclude flashing the devices with an alternate OS? (Considered independently of data stored.)
[+] bitmapbrother|7 years ago|reply
The locked bootloader would prevent flashing anything that was not signed by Google.
[+] ec109685|7 years ago|reply
It wouldn’t be able to read the encrypted data.
[+] gouggoug|7 years ago|reply
> There is a way to "force" an upgrade, for example when a returned device is refurbished for resale, but forcing it wipes the secrets used to decrypt the user's data, effectively destroying it.

It's interesting.

Why "[wipe] the secrets used to decrypt the user's data, effectively destroying it" instead of wiping the data itself too?

Is this to potentially allow a third-party with enough power (i.e. a government entity) to eventually decrypt the data?

[+] mdellavo|7 years ago|reply
This is the standard "remote wipe" technique and a standard cryptographic application.
[+] jpab|7 years ago|reply
I assume the data storage is on a separate chip which - in the context of this attack - is untrusted. That is, the firmware could try to wipe the data storage too, but that would be relatively easy to bypass and doesn't really gain you anything anyway.

Or from the opposite direction: Only the keys are stored within the trusted part of the hardware; they're the only thing you can reliably wipe.

[+] matthewmacleod|7 years ago|reply
Wiping data is time consuming. Wiping the keys that can be used to decrypt it is much less so.
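This is the "crypto-erase" idea: destroying a 32-byte key is instant, while overwriting gigabytes of flash takes as long as writing the whole disk. A toy sketch with a SHA-256 counter-mode keystream standing in for the hardware AES a real device would use (illustration only, not a real cipher construction to deploy):

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy SHA-256 counter-mode keystream (stand-in for hardware AES)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, ks: bytes) -> bytes:
    """XOR data against a keystream of the same length."""
    return bytes(a ^ b for a, b in zip(data, ks))

key = secrets.token_bytes(32)
plaintext = b"user data " * 1000  # gigabytes in practice
ciphertext = xor(plaintext, keystream(key, len(plaintext)))

# Decryption works while the key exists...
assert xor(ciphertext, keystream(key, len(ciphertext))) == plaintext

# ...but erasing just the 32-byte key renders all of it unreadable at
# once. Overwriting the ciphertext itself would gain nothing further.
key = None
```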
[+] mplewis|7 years ago|reply
How will you decrypt the data once the key has been erased?