What about app security? I gave up on Android years ago. Is it still the case that apps expect to be given most/all permissions in order to function? If so then no thanks.
The beauty of this can be seen in how these security features are leveraged and enhanced on CopperheadOS.
I criticise Google a lot for how much information they store on us, but this work they do, both on Android (open source) and on the Pixel phones' hardware, should receive more praise.
The Apple Secure Enclave cannot defend against someone who has the signing keys to the password software (or at least couldn't as of 2016). That's why the FBI wanted Apple's "help" over the San Bernardino shooter. Apple said no, but could have complied; it was a policy choice of theirs to fight the FBI. Google has created a situation with the Pixel 2 where they can't do that sort of thing even if they wanted to, and justified it without ever referencing "search warrants" or "nation-state threat actors", even though that is the obvious driving force here.
This is not simply about the presence of a "secure enclave", or security module as Google calls it. It's about preventing the firmware on the security module from being compromised without knowing the user's password.
To mitigate these risks, Google Pixel 2 devices implement insider attack resistance in the tamper-resistant hardware security module that guards the encryption keys for user data. This helps prevent an attacker who manages to produce properly signed malicious firmware from installing it on the security module in a lost or stolen device without the user's cooperation. Specifically, it is not possible to upgrade the firmware that checks the user's password unless you present the correct user password. There is a way to "force" an upgrade, for example when a returned device is refurbished for resale, but forcing it wipes the secrets used to decrypt the user's data, effectively destroying it.
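The policy quoted above can be sketched as a small state machine. This is a toy model only, assuming the behavior described in the post; the real module is tamper-resistant hardware, and the names here (`SecurityModule`, `upgrade_firmware`, `_kek`) are invented for illustration.

```python
import hashlib
import secrets

class SecurityModule:
    """Toy model of the Pixel 2 upgrade policy described above.
    Purely illustrative; not Google's actual firmware logic."""

    def __init__(self, password):
        # Key-encryption key that guards the user's data.
        self._kek = secrets.token_bytes(32)
        self._pw_hash = hashlib.sha256(password.encode()).digest()

    def upgrade_firmware(self, image_signed_ok, password=None):
        """Return True if the upgrade proceeds. A properly signed image
        alone is not enough to keep the secrets: without the correct
        password, the only path is a forced upgrade, which wipes the
        key-encryption key and so destroys the user's data."""
        if not image_signed_ok:
            return False  # signature verification always applies
        if password is not None and \
                hashlib.sha256(password.encode()).digest() == self._pw_hash:
            return True   # cooperative upgrade: user data stays decryptable
        self._kek = None  # forced upgrade: secrets wiped, data destroyed
        return True
```

The key point the model captures is that even an attacker with valid signing keys (the "insider") hits the forced-upgrade branch, which sacrifices the very data they were after.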
This constraint forces an attacker to focus on the user who actually has the password. From a security perspective, it shifts the attacker's target from the company responsible for creating the firmware (in this case Google) to the user who can reveal the secret.
My guess is that in the U.S. the 4th and 5th Amendments would prevent the government from forcing you to reveal the secret, so long as you do not rely on biometric security, which has in some cases been ruled exempt from the same protections as, say, a password. IANAL though, so I really can't elaborate on why. I think if anything you're likely to be held on obstruction charges or have your assets frozen in an attempt to apply pressure if you're unwilling to cooperate. In other, perhaps less forgiving locales like North Korea, China, or Russia, I imagine one may end up the subject of persuasion of a more physical nature.
A court order can’t compel someone to do the impossible. The updates in the Pixel 2 make it impossible for Google to circumvent security measures on it, thus protecting them from being coerced to do so (by courts and criminals alike).
Impressive, but given the prevalence of apps that demand full access to all USB contents and then arming the user with only the ability to accept or decline all-or-nothing, this seems like an electronic Maginot Line.
But to be fair, they've got to start somewhere and there is always hope they'll extend the permissions options to be more powerful.
One attack this wouldn’t guard against is a malicious actor pushing a buggy version of the Secure Enclave code that couldn’t be updated without destroying all data on the phone.
> There is a way to "force" an upgrade, for example when a returned device is refurbished for resale, but forcing it wipes the secrets used to decrypt the user's data, effectively destroying it.
It's interesting.
Why "[wipe] the secrets used to decrypt the user's data, effectively destroying it" instead of wiping the data itself too?
Is this to potentially allow a third party with enough power (e.g. a government entity) to eventually decrypt the data?
I assume the data storage is on a separate chip which - in the context of this attack - is untrusted. That is, the firmware could try to wipe the data storage too, but that would be relatively easy to bypass and doesn't really gain you anything anyway.
Or from the opposite direction: Only the keys are stored within the trusted part of the hardware; they're the only thing you can reliably wipe.
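That is the standard "crypto-erase" argument: if the data on the untrusted chip is ciphertext, forgetting the key is as good as wiping the storage. A minimal sketch of the idea, using a deliberately toy keystream (SHA-256 in counter mode — NOT real cryptography, just enough to make the point concrete):

```python
import hashlib
import secrets

def keystream(key, nonce, length):
    """Toy keystream for illustration only: SHA-256 over
    key || nonce || counter, truncated to the requested length."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

key = secrets.token_bytes(32)    # held only inside the trusted module
nonce = secrets.token_bytes(16)
plaintext = b"user data sitting on the untrusted flash chip"

# Encrypt: this ciphertext is all the untrusted storage ever holds.
ciphertext = bytes(p ^ k for p, k in
                   zip(plaintext, keystream(key, nonce, len(plaintext))))

# With the key, decryption recovers the data.
recovered = bytes(c ^ k for c, k in
                  zip(ciphertext, keystream(key, nonce, len(ciphertext))))

# "Wiping" the data is just forgetting the key. The ciphertext can stay
# on the untrusted chip; without the key it is unrecoverable.
key = None
```

This also explains why wiping the storage itself gains nothing: an attacker who controls the untrusted chip could copy the ciphertext off beforehand, so the only wipe that matters is the one inside the trusted boundary.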
ISL | 7 years ago
I've been leaning toward an iPhone, for the first time, on security grounds, so this is a welcome piece of news.
Thanks, Google. Please keep it up.
vuluvu | 7 years ago
How about putting a read/write switch on the device that prevents writing to the firmware if the switch is in the off position.
jakobegger | 7 years ago
Kind of insincere when the biggest competitor has been doing this since 2013 (the feature is marketed as "Secure Enclave" by Apple).
wffurr | 7 years ago
You mean Samsung?
conradev | 7 years ago
The idea is that nothing, not even Google, can change the firmware without first wiping the device or entering the passcode.