Doesn't this make the FBI's case weaker with the public and the court, given there would then be more points of attack for hackers?
Comey went on record saying the tool should be secure in Apple's hands because Apple knows what it is doing [1].
And, he didn't even think of this tactic until Darrell Issa suggested it at the Congressional hearing [2].
> Issa: Did you receive the source code from Apple? Did you demand the source code?
> Comey: Did we ask Apple for their source code? Not that I'm aware of
It seems like the FBI is grasping at straws here. Does anyone buy this charade? The PR on the DOJ's side is atrocious. From the beginning, "just about one phone" was obviously a lie, and it's all been downhill since then.
For once, the fear, uncertainty and doubt tactics of the government are not working. I'm happy about that, but concerned for our future when law enforcement blames technologists for not handing over data from phones. We need to continue educating each other on these issues regardless of what the courts say, and regardless of what ultimately comes out of Congress.
You're treating the public case the FBI made earlier as if it reflects its actual beliefs and motivations. Looking at the pattern of actions, rather than the rhetoric, suggests something else to me.
And it's a little early to say that it's not working. It's not working as well as they'd like. It's certainly not working for the echo chamber of HN and tech nerds, but what we think is irrelevant. It's not over 'til the fat lady sings, as they say....
> The FBI’s brief dismisses all of this as a marketing ploy, and then blasts Apple as a literal threat to American democracy, writing: “Apple’s rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights: the courts, the Fourth Amendment, longstanding precedent and venerable laws, and the democratically elected branches of government.”
Ironically, Apple giving users encryption doesn't weaken the Fourth Amendment; it makes it stronger, because it provides the ability for citizens to be "secure in their persons, houses, papers, and effects, against unreasonable searches and seizures" in a way that the courts have recently been unable to guarantee.
Not only the source, they want the signing keys as well:
"For the reasons discussed above, the FBI cannot itself modify the software on Farook's iPhone without access to the source code and Apple’s private electronic signature. The government did not seek to compel Apple to turn those over because it believed such a request would be less palatable to Apple. If Apple would prefer that course, however, that may provide an alternative that requires less labor by Apple programmers."
If the government compels Apple to give the FBI Apple's private keys, the spy state officially begins to exist.
Current best practice is to keep such "keys to the kingdom" (as Apple presumably does) only in guarded hardware devices, requiring the presence of more than one person to use, with every signature made with them deliberately reviewed and logged. Even if the FBI protected the keys, there would be no way to track its use of the things it signs with them: those would be just plain, easily copyable programs.
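A minimal sketch of why that custody matters: once the raw signing key leaves the guarded hardware, any holder can mint perfectly valid signatures with no audit trail. The key material, build strings, and the use of HMAC below are all illustrative stand-ins (real firmware signing uses asymmetric keys such as RSA or ECDSA), not Apple's actual mechanism.

```python
import hashlib
import hmac

# Hypothetical signing secret; in reality this would live inside an HSM.
SIGNING_KEY = b"keys-to-the-kingdom"

def sign_update(firmware: bytes) -> bytes:
    """Produce a signature the device would accept for this firmware."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """What the device checks at boot: does the signature verify?"""
    return hmac.compare_digest(sign_update(firmware), signature)

official = b"iOS 9.3 official build"
sig = sign_update(official)
assert device_accepts(official, sig)

# The point above: once the key is copied, nothing distinguishes an
# "official" signature from one minted by whoever copied it, and no
# HSM log records that the signature was ever made.
backdoored = b"iOS 9.3 with passcode limits removed"
rogue_sig = sign_update(backdoored)
assert device_accepts(backdoored, rogue_sig)
```

The hardware-device-plus-multi-person procedure exists precisely because nothing in the math prevents this; only physical custody does.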
Does anyone know how far they need to go to put themselves outside of US jurisdiction? I know some companies already sign their software releases outside the US for tax reasons. Is that also enough to exclude themselves from being forced to hand over the keys?
I'm starting to picture a world where software development (as well as corporate structures and finance) is off-shored to development havens, similar to the Swiss and their banking. We already have the start of data havens. How long until these large companies see the benefit, from a legal perspective, of defending themselves from an over-reaching government?
How would Wall Street react to such a move? Is such a move away from Wall Street even possible? Can companies switch exchanges?
Maybe one of the first everyday uses of space travel won't be colonisation or exploration, but corporations with nation-state level resources, like Apple, basing themselves off-planet for just these reasons...
The economic and security impacts of mandating back doors in phones go far beyond what the President realizes.
Once he sees that, he'll change his mind. However, a lot of damage has been done, in that law enforcement believes it could get access to data if only we technology wizards would stop being lazy and do some work for them.
Aside from the compelled speech argument, on balance, we are not more secure with back doors; we are less secure.
So long as law enforcement does not understand this, they will not be effective at their jobs, and they will point the finger at technologists in future terrorist attacks. Even if law enforcement officers do not point the finger, they've already convinced so much of the public that it is the technologists' fault.
We need to continue educating each other on how technology works. Seek out public figures and ask them to support our cause. This could go on for years, potentially coming up during every terrorist attack.
You mention Swiss banking, but even that has proven not to be safe from the US Government. The simple matter is, if you want to sell a product in the US or visit the US, you are going to have to play by whatever rules are currently in place.
So taking development offshore or the like won't do much. If anything, if the FBI is successful, Apple's business will be hurt worldwide, as China has already "checked" Apple products to ensure there weren't already US back doors or the like.
I am curious when it will come to the point where source code is published to ensure that there aren't back doors.
Well, Wall Street would react very negatively if the US could demand source code or back doors. It would severely hurt Apple's stock; the iPhone is their top performer. I suspect that, knowing this, Apple would go all the way, and with their roughly $150 billion in pure cash reserves, the FBI has an awfully big fight on its hands. Apple could easily tie a case like this up in red tape and appeals for 10-15 years, all the while designing the iPhone so that even with the source it is hack resistant.
Keeping the signing keys and servers outside the US wouldn't help much. The FBI could simply start arresting Apple executives who refuse to comply, charging them with obstruction of justice or whatever seems to apply.
I'm almost starting to believe there are forces in the FBI trying to thwart the FBI's own case, by making claims so ridiculous that legal precedent will be set for a long time, preventing the FBI or any other agency from getting this kind of power.
But then again, it's more likely they've simply lost sight of how the real world actually functions, and of their own mandate.
If this actually does go through, my view of the US will be closely aligned with my view of Iran and North Korea.
Could anyone explain to me, legally, why the FBI wouldn't be allowed to compel Apple to provide this? The government was able to compel private SSL keys from Lavabit. If the only difference is "Apple has money to fight the court order," then would a court's refusal to grant the FBI access to the source code and signing keys create a precedent that could protect future Lavabits?
The Lavabit case was decided on entirely procedural grounds—the Fourth Circuit never said that it was OK to compel the disclosure of the SSL keys, only that Lavabit had waived its objections in the district court (at which stage Lavabit was acting without counsel).
Source: I briefed and argued the Lavabit case in the Fourth Circuit.
If Lavabit refused, some scapegoat within Lavabit (say, the CEO, or whoever is deemed expendable) might get arrested, Lavabit would probably be ordered to close, and our lives wouldn't be affected too much. They're a tiny startup. They can be bullied quite easily without affecting national peace.
If Apple were ordered to close in an instant, there would be mass upheaval, unrest, protests, and huge economic losses just because of the sheer number of people that depend on their products on a daily basis for mission-critical things. Violence would break out. We might even end up facing nothing short of a civil war, and the federal government probably wouldn't want to risk that.
Surrendering the source code to iOS? That sounds like a huge overreach by a law enforcement agency. That thumb drive would be worth billions of dollars. A $60K/year bureaucrat is supposed to protect it from Apple's competitors? I wonder how long it would be before this prized intellectual property ended up in the hands of Samsung, various Chinese companies, maybe a copy floating around Russian and East European networks, and eventually Google labs?
The scary thing is that the FBI, with the support of the President and quite a few top Congressional leaders, could very well win.
I don't understand; can someone please clarify for me? They physically have the phone, right? Couldn't they just read the whole flash, try a PIN, write back the whole flash, and repeat (i.e., take the phone apart)?
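The attack being asked about can be modeled in a few lines: if the failed-attempt counter lives in flash, re-imaging the flash resets it before the wipe threshold is hit. Everything below (the 4-digit PIN space, the 10-try wipe limit, the flash layout) is a toy model for illustration, not the 5C's actual design.

```python
import hashlib

WIPE_LIMIT = 10  # device erases itself after this many failed attempts

class ToyPhone:
    """Toy device whose retry counter lives in rewritable 'flash'."""
    def __init__(self, pin):
        self.flash = {
            "pin_hash": hashlib.sha256(pin.encode()).hexdigest(),
            "failed_attempts": 0,
            "wiped": False,
        }

    def try_pin(self, guess):
        if self.flash["wiped"]:
            return False
        if hashlib.sha256(guess.encode()).hexdigest() == self.flash["pin_hash"]:
            return True
        self.flash["failed_attempts"] += 1
        if self.flash["failed_attempts"] >= WIPE_LIMIT:
            self.flash["wiped"] = True
        return False

def brute_force(phone):
    backup = dict(phone.flash)  # step 1: read (image) the whole flash
    for n in range(10000):
        guess = f"{n:04d}"
        if phone.try_pin(guess):
            return guess
        # step 2: before the wipe limit is reached, write the image
        # back, resetting the failed-attempt counter
        if phone.flash["failed_attempts"] >= WIPE_LIMIT - 1:
            phone.flash = dict(backup)
    return None

phone = ToyPhone("4831")
assert brute_force(phone) == "4831"
assert not phone.flash["wiped"]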
I suspect there are a myriad of technical means to pop the phone, including the method you outline, but I doubt the FBI has the talent or connections to get it done in way that would be presentable to court. (I.E. Nobody in house is capable and any contractors they would hire to do it don't want their methods made public in an open trial)
One assumes the NSA would make short work of this phone's lock if it had been recovered from the OBL compound, but using them in this instance also brings in some dicey legal issues (they "can't" operate domestically) and the NSA is even less willing to give up it's own tactics than a security consulting firm would be.
FBI seems to have chosen to go the "lawyer up and look for a court order and some wet blanket executives willing to hand things over" route-- Tim Cook is, thankfully, well principled enough to tell them to pound sand.
Surely, at some point in time, the FBI will figure out how to recruit technical staff that is capable of doing more than just extorting bitcoins from cyber drug kingpins.
I hope Apple will further improve iOS encryption so it can not be broken, even by people (FBI or Apple) having access to the source code. There should be no signing key that can be used without first entering the correct PIN in the first place. That's the real problem here, isn't it?
i can imagine reasons why apple would want or need to be able to update secure enclave firmware without the correct pin entered first. #2-3 are highly speculative:
1) fear of a bug. just the right firmware bug and you have 100 million phones lose data, and perhaps bricked too. unlikely, but consider the cost. i would be worried about this if i were in charge of the iphone project and the secure enclave feature were newish. we can imagine pretty good solutions to this one though, with work and time.
2) out of 100+ countries where they sell phones, over time, some will give them a confidential court order saying they must retain this capability. if a foreign court order, they could refuse to comply, but then would have to exit that market -- infeasible if multiple countries. and a different 'version' for just those countries would be noticed over time by security researchers?
3) they might already have an order as such from the U.S., for foreign intelligence purposes. as mentioned a different international version if noticed is a PR disaster for apple. so the easiest way to comply is just do it that way for all phones.
perhaps they push back on the fbi request because that's the one they can talk about, yet it templates the whole issue.
That is an interesting question. "signing a binary" is a form of endorsement. If code == speech, as Apple is arguing, then compelling the endorsement of code is, quite literally, compelled speech, which may fall afoul of the 1st Amendment.
I really hope if they manage to get the source code it leaks and weakens another American technology company, maybe when their industry is dead they will realize its a global market place.
The UK govt is equally eager to ban/compromise encryption, and I think the challenge for technologists is to find ways around this. How difficult would it be to develop a protocol that concealed encrypted payloads in innocent-looking plaintext traffic? A kind of "clandestine encryption."
I suppose the problem is that individuals might get away with using clandestine encryption, but no convenient service provided by companies like Apple would be able to.
Wouldn't it be straightforward for Apple to design future iPhones to accept device-targeted updates in order to comply with law enforcement requests like the one that started all this? Just add a step: 1) Is this update signed by Apple? 2) Does this update indicate a particular device and if so is this that device?
I wouldn't support the government compelling Apple to implement that, but I don't think I'd have a problem with Apple doing it voluntarily. Of course, if it's not a government mandate then anyone who cares (for criminal reasons or otherwise) can just use other devices that don't have the same system in place, so in the end it's just security theater. But at least it might detour some government attempts to overreach and effectively outlaw encryption entirely.
I'm very comfortable with President Obama's broad policies an intentions, but when it comes to privacy rights, I have one question: What would Richard Nixon do?
[+] [-] studentrob|10 years ago|reply
Comey went on record saying the tool should be secure in Apple's hands because Apple knows what it is doing [1]
And, he didn't even think of this tactic until Darrell Issa suggested it at the Congressional hearing [2].
> Issa: Did you receive the source code from Apple? Did you demand the source code?
> Comey: Did we ask Apple for their source code? Not that I'm aware of
It seems like the FBI is grasping at straws here. Does anyone buy this charade? The PR on the DOJ's side is atrocious. From the beginning, "just about one phone" was obviously a lie, and it's all been downhill since then.
For once, the fear, uncertainty and doubt tactics of the government are not working. I'm happy about that but concerned for our future when law enforcement blames technologists for not handing over data to phones. We need to continue educating each other on these issues regardless of what the courts say, and regardless of what ultimately comes out of Congress.
[1] https://youtu.be/g1GgnbN9oNw?t=2h43m12s
[2] https://youtu.be/g1GgnbN9oNw?t=1h21m33s
[+] [-] nobody_nowhere|10 years ago|reply
And it's a little early to say that it's not working. It's not working as well as they'd like. It's certainly not working for the echo chamber of HN and tech nerds, but what we think is irrelevant. It's not over 'til the fat lady sings, as they say....
[+] [-] ehartsuyker|10 years ago|reply
Ironically, Apple giving users encryption doesn't weaken the Fourth Amendment; it makes it stronger because it provides the ability for citizens to be "secure in their persons, houses, papers, and effects, against unreasonable searches and seizures" in a way that the courts recently have been unable to.
[+] [-] mstade|10 years ago|reply
[+] [-] rayiner|10 years ago|reply
[+] [-] woodman|10 years ago|reply
"For the reasons discussed above, the FBI cannot itself modify the software on Farook's iPhone without access to the source code and Apple’s private electronic signature. The government did not seek to compel Apple to turn those over because it believed such a request would be less palatable to Apple. If Apple would prefer that course, however, that may provide an alternative that requires less labor by Apple programmers."
[+] [-] acqq|10 years ago|reply
Currently, the best practices assume that such "keys to the kingdom" are, for example by Apple, only in some hardware devices, guarded, needing presence of more than one person to be used and that every signature made with them is permanently considered and logged. Even if FBI protects the keys, their use of the things they sign with them won't be able to be followed: these would be just plain easily copyable programs.
[+] [-] junto|10 years ago|reply
Does anyone know how far they need to go to put themselves outside of US jurisdiction? I know some companies already sign their software releases outside the US for tax reasons. Is that also enough to exclude themselves from being forced to hand over the keys?
I'm starting to picture a world, where software development (as well as corporate structures and finance) are off-shored to development havens, similar to the Swiss and their banking. We already have the start of data havens. How long until these large companies see the benefit from a legal perspective to defend themselves from an over-reaching government?
How would Wall Street react to such a move? Is such a move away from Wall Street even possible? Can companies switch exchanges?
[+] [-] mattkevan|10 years ago|reply
[+] [-] studentrob|10 years ago|reply
Once he sees that he'll change his mind. However, a lot of damage has been done in that law enforcement believes they could get access to data if only us technology wizards would stop being lazy and do some work for them.
Aside from the compelled speech argument, on balance, we are not more secure with back doors, we are less secure.
So long as law enforcement does not understand this, they will not be effective at their jobs, and they will point the finger at technologists in future terrorist attacks. Even if law enforcement officers do not point the finger, they've already convinced so much of the public that it is the technologists' fault.
We need to continue educating each other on how technology works. Seek public figures and ask them to support our cause. This could go on for years, potentially coming up during every terrorist attack.
[+] [-] Shivetya|10 years ago|reply
So taking development off shore or the like won't do much. If anything if the FBI is successful then Apple's business will be hurt world wide as China had already "checked" Apple products to insure there weren't US back doors or such already.
I am curious when it will come to the point where source code is published to ensure that there aren't back doors.
[+] [-] SEJeff|10 years ago|reply
[+] [-] ThrustVectoring|10 years ago|reply
[+] [-] kelnos|10 years ago|reply
[+] [-] bsder|10 years ago|reply
No judge is going to be willing to let this level of power grab against the judiciary to slide.
Apple's lawyers must be very happy right now.
[+] [-] soft_dev_person|10 years ago|reply
But then again, it's more likely they just completely lost sight of how the real world actually functions, and their own mandate.
If this actually does go through, my view of the US will be closely aligned with Iran and North-Korea.
[+] [-] spaceheeder|10 years ago|reply
[+] [-] isamuel|10 years ago|reply
Source: I briefed and argued the Lavabit case in the Fourth Circuit.
[+] [-] dheera|10 years ago|reply
If Apple were ordered to close in an instant, there would be mass upheaval, unrest, protests, and huge economic losses just because of the sheer number of people that depend on their products on a daily basis for mission-critical things. Violence would break out. We might even end up facing nothing short of a civil war, and the federal government probably wouldn't want to risk that.
[+] [-] restalis|10 years ago|reply
[+] [-] npunt|10 years ago|reply
[+] [-] blisterpeanuts|10 years ago|reply
The scary thing is that the FBI, with the support of the President and quite a few top Congressional leaders, could very well win.
[+] [-] dm_mongodb|10 years ago|reply
(given it is a 5C with no secure enclave)
[+] [-] Phlarp|10 years ago|reply
One assumes the NSA would make short work of this phone's lock if it had been recovered from the OBL compound, but using them in this instance also brings in some dicey legal issues (they "can't" operate domestically) and the NSA is even less willing to give up it's own tactics than a security consulting firm would be.
FBI seems to have chosen to go the "lawyer up and look for a court order and some wet blanket executives willing to hand things over" route-- Tim Cook is, thankfully, well principled enough to tell them to pound sand.
Surely, at some point in time, the FBI will figure out how to recruit technical staff that is capable of doing more than just extorting bitcoins from cyber drug kingpins.
[+] [-] hammock|10 years ago|reply
[+] [-] justncase80|10 years ago|reply
Meaning, just having the code won't let them crack a phone and it won't let them patch the OS so that its subsequently crackable would it?
When the FBI is worse than the terrorists they need to take a step back and think about the bigger picture a little bit.
[+] [-] DavideNL|10 years ago|reply
[+] [-] dm_mongodb|10 years ago|reply
1) fear of a bug. just the right firmware bug and you have 100 million phones lose data, and perhaps bricked too. unlikely, but consider the cost. i would be worried about this if i were in charge of the iphone project and the secure enclave feature were newish. we can imagine pretty good solutions to this one though, with work and time.
2) out of 100+ countries where they sell phones, over time, some will give them a confidential court order saying they must retain this capability. if a foreign court order, they could refuse to comply, but then would have to exit that market -- infeasible if multiple countries. and a different 'version' for just those countries would be noticed over time by security researchers?
3) they might already have an order as such from the U.S., for foreign intelligence purposes. as mentioned a different international version if noticed is a PR disaster for apple. so the easiest way to comply is just do it that way for all phones.
perhaps they push back on the fbi request because that's the one they can talk about, yet it templates the whole issue.
[+] [-] Piskvorrr|10 years ago|reply
[+] [-] jpgvm|10 years ago|reply
[+] [-] coreyp_1|10 years ago|reply
[+] [-] ctdonath|10 years ago|reply
[+] [-] ps4fanboy|10 years ago|reply
[+] [-] ehartsuyker|10 years ago|reply
[+] [-] cbeach|10 years ago|reply
I suppose the problem is that individuals might get away with using clandestine encryption, but no convenient service provided by companies like Apple would be able to.
[+] [-] dreamcompiler|10 years ago|reply
https://en.wikipedia.org/wiki/Deniable_encryption
[+] [-] chrramirez|10 years ago|reply
[+] [-] thoughtsimple|10 years ago|reply
[+] [-] evunveot|10 years ago|reply
I wouldn't support the government compelling Apple to implement that, but I don't think I'd have a problem with Apple doing it voluntarily. Of course, if it's not a government mandate then anyone who cares (for criminal reasons or otherwise) can just use other devices that don't have the same system in place, so in the end it's just security theater. But at least it might detour some government attempts to overreach and effectively outlaw encryption entirely.
[+] [-] thyrsus|10 years ago|reply
[+] [-] Glyptodon|10 years ago|reply