We don't need to make arguments like "I trust the state, but what about criminals?". Every US administration has exhibited strong evidence of criminality and has worked to suppress dissent and oppress minorities. The current administration makes AG Barr's machinations exceptionally clear as it employs an ever more vicious kidnapping force and (incompetently) attempts to start wars around the world.
If you want to be able to resist these assholes (e.g. having anti-war or anti-ICE meetings that aren't snooped on, enable whistleblowers, etc), you need secure communications. Don't give them a backdoor.
This was briefly confusing, as AG Barr is the soft drinks company which manufactures Irn Bru in Scotland, and I was surprised to see they had an opinion.
But I'm interested in what the constructive path forward is on the rather more important issue of ubiquitous encryption. I think most people generally accept that well-regulated wiretapping (and other forms of interception of communication) are a useful and important law enforcement tool. Easy-to-use, commodity encryption makes this tool mostly useless. But obviously any "back door" would be immediately leaked/abused, and more to the point would just be circumvented by anybody who really did still want privacy. You can't shut the stable door etc.
We do sometimes choose to outlaw useful and entirely legitimate technology because of the potential for abuse. Is there a reasonable argument for doing so with mass-market encryption? In reality, it seems that the public benefit of having encryption in place is so overwhelming that this is a total non-starter. Or is it simply the case that we now have to accept that this tool is no longer one which can be used? I think we'd have to accept that ubiquitous encryption is here to stay and stop trying to fight it.
Imagine the worst case ("worst" from this PoV): end-to-end encryption is widely adopted and effective. How hard is a cop's legitimate job, compared to before telephone wiretaps? They can get to ubiquitous (and always increasing) surveillance footage and cell-phone tracking, just for a start. It seems really phony to frame this scenario as "going dark".
The problem in the United States is that the 4th Amendment gives US persons the right "to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures".
The legal reality today is that while legally we broadly interpret things like "companies are people", we very narrowly look at "papers, and effects" as literal, physical objects. Equating backdoors with wiretapping is only similar on the surface. A traditional wiretap allows the police to hear a series of conversations. A backdoor in an iPhone or your Dropbox/OneDrive/iCloud account potentially allows the police access to the entirety of your "papers, and effects".
I would be willing to soften my opinion on the issue, but only if there were a meaningful change in how the law protects digital documents.
Note that a lot can be learned from "metadata" (keeping track of who is talking to who) and it's also hard to prevent without adding a lot of inefficiency via something like onion routing. So that would be the logical place for some kind of law enforcement compromise while still encrypting the messages themselves.
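The metadata point can be made concrete. A minimal sketch, assuming a purely hypothetical log of (caller, callee) records — no message contents at all:

```python
# A minimal sketch of what "metadata" alone reveals, using a hypothetical
# log of (caller, callee) records -- every message body could be encrypted.
from collections import Counter

# Hypothetical call-detail records (all names invented for illustration)
records = [
    ("alice", "clinic"), ("alice", "clinic"), ("alice", "journalist"),
    ("bob", "pizzeria"), ("alice", "journalist"), ("carol", "alice"),
]

# Who talks to whom, and how often: enough to infer relationships,
# sources, and habits, without reading a single message.
links = Counter(records)
for (src, dst), n in links.most_common():
    print(f"{src} -> {dst}: {n} contacts")
```

Onion routing exists precisely to hide this link graph, which is why it costs the extra hops and latency mentioned above.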
"This was briefly confusing, as AG Barr is the soft drinks company which manufactures Irn Bru in Scotland, and I was surprised to see they had an opinion"
Me too. I would have thought they would have been in favour of very strong encryption.... Made from girders.
I don't agree that encryption should be illegal, but I also don't think it's feasible to try to make it illegal without going all the way with our own version of the Great Firewall.
> The cost of encryption, he said, is measured in "victims" who might have been saved from crime if law enforcement had been able to lawfully intercept communications earlier.
The cost of backdoors is measured in hospitals paralyzed by ransomware, personal information stolen by hackers, government secrets obtained by hostile nations. Not to mention the threat of pervasive surveillance by our own government, which has long shown it's willing to intrude on our privacy as much as it possibly can, laws be damned.
I challenge him to describe what an encryption backdoor actually is. It's like putting a hidden switch behind a brick in a wall. The brick looks the same, but if you take the time to knock on every brick, you'll eventually find the switch. Every moderately powerful nation will have someone out there knocking on the bricks.
It takes a prohibitively long time to knock on 2^256 bricks. The real risks aren't from the crypto; they're from the humans who have access to the back door. Can they be trusted? Can they be compromised?
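To put numbers on "prohibitively long": a back-of-envelope sketch, assuming a deliberately over-generous hypothetical adversary with a billion machines each testing a trillion keys per second:

```python
# Brute-forcing a 256-bit key ("knocking on 2**256 bricks").
# The adversary figures below are wildly generous assumptions.
keys = 2**256
rate = 10**12 * 10**9            # keys/sec: 10^9 machines at 10^12 keys/sec each
seconds = keys // rate
years = seconds // (60 * 60 * 24 * 365)
print(f"{years:.2e} years")      # on the order of 10^48 years
```

For comparison, the universe is roughly 1.4e10 years old, which is why the practical risk sits with the humans holding the back-door key, not with the math.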
That's not to mention the business damage this would cause. The US is trying to block Huawei products due to back doors. Other countries will block US products if the US government starts requiring back doors.
That's a nice analogy, but it's still an analogy. It's easy for someone to come along and modify the analogy to fit their purpose e.g. "we'll have someone watch the wall of bricks so no one can knock on them!"
We simply can't let people legislate on issues they fundamentally don't understand. And understanding a backdoor is actually not that difficult - a smart, motivated person could literally learn the theory of this with some practical examples in a few hours of study.
> I try to assume incompetence first and malice second.

Except we have actual history to look at in Barr's case: Iran-Contra. I absolutely consider this malice first and incompetence second. That malice is classism: people are not equal, some people are inherently better than others, and the better people have more money. Everything is a product, and wealthy people can buy more of that product than others.
This is an attempt to make sure ordinary people cannot have privacy, cannot be protected from each other, from other governments, or from this government, while wealthy people will continue to be able to buy that protection one way or another. It's a feudalist's proposition. You cannot understand the motives without understanding the underlying ideologies. Not everyone can be a lord over their own data - it's a fact. Your data, peasant, is not your data! It belongs to your lord. And who the lord is varies case by case: it could be the government, a particular corporation, or a trade group of them.
>He also accused tech firms of "dogmatic" posturing, saying lawful backdoor access "can be and must be" done, adding, "We are confident that there are technical solutions that will allow lawful access to encrypted data and communications by law enforcement, without materially weakening the security provided by encryption."
In the last crypto wars, in the 90s, competent scientific explanations of the dilemma between trustworthy and backdoored systems didn't register with people who think in political terms either.
The government side of this is rhetorically, "we can imprison and kill engineers until one of you delivers us a flying pig, whether you still say that's impossible is your call." Some engineer is going to figure out how to meet their requirements. Pigs themselves can't fly per se, but when you get to discussing how far, in what direction, for how long, under what power, and the consequences of doing nothing, even flying pigs become a solvable problem. A technologist will say, "that's horrible, unethical, and wrong, you just ran those pigs off a cliff." Everyone else looks at the politicians and the guys with the guns and says, "See? Flying pigs." They also probably push that technologist over the cliff as well.
This is why they call politics the art of the possible.
What Barr is saying is that he plans to make tech companies accountable for his problems in justice. This kind of irrational posturing is probably a bargaining gambit in other regulatory areas, or just an empty signal to rank-and-file law enforcement that the White House and AG are on their side.
Yeah but what's to stop a bad actor simply switching to a provably secure cryptographic system or app? Don't forget there's an arms race going on, where bad actors consistently try to outpace law enforcement by switching to provably secure and private systems. Take for example when 3DES[0] was discovered as insecure and all the criminals switched to AES[1]. And don't forget the old adage: If you outlaw encryption, then only outlaws will use encryption
Does Mr. Barr not understand that criminals will simply use free non-backdoored software instead? Of course he does. It's hard not to read this as another attempt to listen in on the conversations of regular citizens instead.
You could say that about criminalizing TOR. If you knew that by using TOR you were going to get swatted at 3am you'd probably stop using it. Same goes for encryption. The second any law is passed google/apple are going to pull apps; whatsapp etc are going to remove e2e (or alter it so the UI is the same but it can be snooped on) and criminals are going to have to figure out how to use PGP or whatever.
If he doesn't right now, he soon will. That will mean, first, the government randomly checking whether encrypted comms are backdoored. If they discover that some are double-encrypted, and the inner encryption isn't backdoored, then they will do what we right now call a MITM on all encrypted comms. By law, with harsh penalties, because that's how we roll in the USA.
I think this is just another step in a War on Change. Snooping was good when the bulk of the comms weren't encrypted, and cellphones could be confiscated and read because plaintext. Change to HTTPS, dominance of a cellphone company that isn't too compliant, and here's a change that needs to be reversed, to be stopped.
Seems like a failure of the tech community that most people don’t understand the ramifications of backdoors. People can also point to the example of China where there are indeed government backdoors and things are fine (from the government can get access but others can’t point of view). Clearly we don’t want to be like China in this regard but it does give an example of a “working” backdoor setup.
Exactly. Strong encryption is not rocket science anymore. Criminals can even simply code their own encrypted tools, and in that case law enforcement is completely out of the game (and actually, professional crime does not rely on Signal and the like anyway).
It would be so much wiser to just keep quiet about the topic and rely on the negligence of the common criminal - below the line there is a plethora of information in metadata and connection data, still incredibly better than anything law enforcement had 15 years ago.
The effect of any public advance into "responsible encryption" is only causing criminals to harden their tools.
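The "code their own tools" point takes remarkably little code. A hedged sketch of a toy stream cipher (SHA-256 in counter mode, XORed with the plaintext) — illustrative of how unenforceable a software ban is, not an endorsement of rolling your own crypto for real use:

```python
# Toy stream cipher: SHA-256 in counter mode as a keystream generator.
# Illustration only -- do NOT use homemade constructions like this in practice.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Generate n pseudorandom bytes by hashing key || counter."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # XORing with the same keystream undoes itself

ct = encrypt(b"shared secret", b"no backdoor here")
print(decrypt(b"shared secret", ct))  # -> b'no backdoor here'
```

Twenty-odd lines, standard library only, and no vendor to serve a warrant on — which is the whole problem with legislating backdoors into mass-market products.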
> The existence of encryption means "converting the Internet and communications into a law-free zone" that criminals will happily take advantage of to do more crimes, Barr added, likening it to a neighborhood that local cops have abandoned.
This is such an asinine comparison. It's not like the cops have access to every locked door of every house in their jurisdiction. You don't need to build a special "cop door" into your house so they can come in whenever they want to, and you don't need to make an extra copy to give to the precinct. Why should my "internet house" be any different?
> The FBI ended up in possession of the shooter's iPhone during the investigation but was unable to unlock the device, as the attacker had been killed and therefore could not be compelled to share his PIN.
Is it not the case that he couldn't be legally compelled to share his PIN even if he were alive?
Make non-compliance a civil offense, subject to a daily penalty. Simply financially ruin people. Economic coercion is still coercion. And you don't have the same right to an attorney or a jury trial. And you can be found partially liable for consequences from non-compliance.
Criminal law is more binary, and it places a far stronger burden of proof on the claiming party, the government.
This is already the case in many states when it comes to vehicular traffic laws. Most speeding is a civil offense, with no right to a plea bargain or jury trial. It only becomes criminal above a certain speed (careless or reckless driving), or after a certain amount of time with the fine unpaid.
It's what happens when people in a democracy aren't paying attention to state and local law making.
No one can force you to share information. Let's say your PIN code is 1324, and you suffer from stress-induced transposia (just made that up). Every time you are asked to say or share your PIN code, you get stressed and use 1234, even though you meant to use 1324. There's no way to make you say the correct PIN code. In a more realistic example, what are you going to do to someone who simply forgot their PIN code?
You probably cannot be compelled to say what your password is, but you can be compelled to place your finger on the fingerprint scanner, provide retina scans, and provide other biometric data to unlock encryption. The difference being that you cannot be compelled to reveal information you hold in your own mind. The rest of your body, however, is fair game.
Attorneys for the United States "can and must" build a criminal case starting from "reasonable suspicion", through "probable cause", all the way to "beyond reasonable doubt", without violating at any step the Constitutional protections put in place for the express purpose of limiting the government's power to reshape or destroy the society it purportedly serves.
We say not to roll your own encryption, but it's one thing to build a lock that is easily picked, and quite another to build one that instantly and silently opens to a key held by a stranger, outside your supervision. Even if there weren't open source crypto projects out there, some managed on servers outside the US, just because I don't roll my own, doesn't mean I can't.
I wrote a paper while in high school on the Clipper Chip, explaining why it was a bad idea. I have learned a lot since 1994, and these folks have apparently learned nothing. Backdoors are inherently flawed. DRM is inherently flawed. You can't create security by trusting an untrustworthy party, or by declaring the intended recipient to be the eavesdropper, or by redefining up to be down.
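The escrow design behind the Clipper Chip can be sketched in a few lines. Assumptions: a made-up XOR "wrap" stands in for real key-wrapping, and the names (`escrow_key`, `leaf`) merely echo Clipper's Law Enforcement Access Field — this is a toy illustrating the structural flaw, not the actual protocol:

```python
# Toy key-escrow sketch: every message's session key is also wrapped
# under a master escrow key. Made-up XOR "cipher" for illustration only.
import os

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

escrow_key = os.urandom(32)   # held by "the authorities"

def send(message: bytes, session_key: bytes) -> tuple[bytes, bytes]:
    ciphertext = xor(message, session_key)
    # The session key ships alongside, wrapped under the escrow key
    # (the role Clipper's LEAF played).
    leaf = xor(session_key, escrow_key)
    return ciphertext, leaf

msg = b"meet at the usual place, 10pm xx"   # 32 bytes
session = os.urandom(32)
ct, leaf = send(msg, session)

# Anyone holding escrow_key -- lawfully, or after a single leak -- reads everything:
recovered_session = xor(leaf, escrow_key)
print(xor(ct, recovered_session))   # -> b'meet at the usual place, 10pm xx'
```

The design choice being criticized is visible in the last two lines: one static secret silently unlocks every conversation ever sent, so the system's security reduces to the trustworthiness of whoever guards that secret.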
Is this the same Bill Barr who blacked out entire pages of the Mueller Report? Why is transparency demanded of ordinary citizens while secrecy is so prized by politicians?
If you need to explain to non-technical friends and family exactly why this is a terrible idea, CGP Grey offers an easily digestible 5-minute primer: https://www.youtube.com/watch?v=VPBH1eW28mo
Computers either have unbreakable locks, or no locks whatsoever. There's no stable middle ground.
They do not need to hide the backdoor's existence, though.
Just add a 'USGOV BACKDOOR' tab and password/encrypt it or whatever the fuck. It doesn't need to be a real control panel or anything other than trivial options (a grayed-out unchangeable toggle box that says 'Enable backdoor', for instance). I'm not the guy to provide you UI tips.
Let users understand it's there without relying on security-through-obscurity. Disallow any temptation to use StO and ensure users are fully informed of the government's firm and fatherly hand.
Yes, I know that the GUI/man page information are not 'the backdoor'. It is also not actually any functionality for the software you're using. It is function through abstracted symbols.
Protest via abstracted symbolism - visible and immutable vulnerability.
> The cost of encryption, he said, is measured in "victims" who might have been saved from crime if law enforcement had been able to lawfully intercept communications earlier.
Yes, one can measure the cost of encryption like this, just as you can measure the cost of lax gun control or automobile traffic by their fatalities. But you must also weigh the benefits of encryption, lax gun control, and automobile traffic. If the balance comes out sufficiently positive, the negative costs might be accepted. There is no logical reason to argue from only one side of the calculation.
I never heard Barr say:
"The cost of privacy is measured in 'victims' who might have been saved from crime if law enforcement had put an officer in every bedroom throughout the country."
"The cost of lax gun controls is measured in dead pupils who might have been saved from death if legislation had passed stricter gun control laws."
It'd be nice if encryption fell under the 2nd Amendment. It's been classified as a munition before, and with the creation of USCYBERCOM, the government has a vested interest in ensuring that its citizens have experience using encryption software.
That's an interesting argument, but only the most extreme interpretation of the 2nd Amendment argues that it covers _all_ munitions. And I'm having trouble finding a contemporary example of someone making this claim.
On top of that, we now have years of SCOTUS jurisprudence that "arms" are absolutely subject to _some_ regulations, with the debate focusing on what "some" actually means.
Randall Munroe was onto this years ago: https://xkcd.com/504/ I think the case is strong that encryption is equally protected by the 1st, 2nd, and 4th amendments.
Even if the government required backdoors, it's not going to stop the highly invested terrorists. If someone was going to go through all the trouble of plotting another 9/11, I think they'd have more than enough money to create a secure messaging site of their own. All it takes is one little web site with a really strong encryption algorithm. It's pretty trivial to create your own messaging platform and not too hard to add strong enough encryption to it.
> AG Barr is the soft drinks company which manufactures Irn Bru in Scotland

An irreverent marketer could generate a ton of earned media out of this.
Made from girders: https://m.youtube.com/watch?v=H4PxuFQCDis
> We simply can't let people legislate on issues they fundamentally don't understand.

I believe this is how Australia is going to have their backdoors.
I really don't know if people like Barr are arguing in good faith. But then I try to assume incompetence first and malice second.
This reminds me of the classic article about trisectors https://web.mst.edu/~lmhall/WhatToDoWhenTrisectorComes.pdf
Or drying water, or any number of other impossibilities.
[0] https://en.wikipedia.org/wiki/Triple_DES
[1] https://en.wikipedia.org/wiki/Advanced_Encryption_Standard
> Computers either have unbreakable locks, or no locks whatsoever.

Replace that with "locks that have yet to be broken", and I agree.
I would imagine that if enough people knew everything had lawful intercept, then they would just obfuscate their communications:

"Your rental car payment is due."

"OK, I will pay it on Wednesday at 10am. Stop nagging me."