"If you want to see change, you need to incentivize change. For example, if you want to see Microsoft have a heart attack, talk about the idea of defining legal liability for bad code in a commercial product. If you want to give Facebook nightmares, talk about the idea of making it legally liable for any and all leaks of our personal records that a jury can be persuaded were unnecessarily collected. Imagine how quickly Mark Zuckerberg would start smashing the delete key.
Where there is no liability, there is no accountability... and this brings us to the State."
Yep, this definitely needs to eventually happen.
If this happens, it will be the end of open source and the indie web. Only large companies with large legal departments and serious liability insurance, and anonymous underground hackers, will be able to afford to make software public for commercial use or run a website.
I disagree with Snowden here. Liability is an extremely bad approach and would not solve the problem at all. It would also entrench the companies that can pay for guarantees nobody could ever actually give.
Security flaws are inevitable. Better languages might help, but they are no panacea.
I agree with Snowden on a lot, but this doesn't solve anything.
The result would be software certificates. By whom? Take a guess.
Nobody can guarantee absolute safety. This is a trap you don't want to fall into.
It would end open source and independent development generally. Quite surprisingly short-sighted of Snowden.
The problem with iMessage wouldn't be solved by liability. It is a security flaw that cannot be removed by law.
edit: To clarify: I agree with him on the Facebook example. They collected the data for their business and should be liable. "Bad" or "insecure" code is a different matter, however.
> talk about the idea of making it legally liable for any and all leaks of our personal records that a jury can be persuaded were unnecessarily collected
That's kinda sorta one of the goals of the GDPR, which goes even further: companies are liable for any leak of personal data, not just the unnecessarily collected data.
I'm not going to comment on Snowden's view of what liberal Western states do when it comes to surveillance. I have my own opinion, but he's been right before about things I disagreed with him on, so I'm gun-shy about confronting his ideas again.
On the topic of unsafe languages, though, he's absolutely right. We don't have to put up with this. We could pass a law banning new code in unsafe languages from national-security-critical devices like phones, or cyberphysical devices like self-driving cars, and we would be the better for it in under a decade. We could even use taxes to give breathing room during a transitional period before an outright ban, but we don't. We don't because it is so often less of a hassle for the government to trust the private sector than it is to take a real position on the regulatory issues that matter. That will probably remain the case until the government can evaluate talent, and pay salaries in accordance with that talent, the way the private sector does.
As a professional software engineer, I'm not sure "safe" versus "unsafe" programming languages is a coherent distinction; if it is, then all languages are unsafe in my eyes.
Yes, C/C++ have more footguns than Java, but there's no "hard line" in the safety differences, and there are real and important things that need doing that can't always, or at least not obviously, be done reasonably in another language.
If you haven't, I'd encourage you to read the paper "Some Were Meant For C" [0] on why C still doesn't have a real replacement (though it could in the future).
[0] https://www.cl.cam.ac.uk/~srk31/research/papers/kell17some-p...
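To make the "no hard line" point concrete, here is a minimal Rust sketch (an editorial illustration, nothing from the paper): the same out-of-bounds access is a checked failure through the safe API and classic undefined behavior through the unsafe one, so the line sits at an opt-in keyword rather than between whole languages.

    fn main() {
        let buf = [1u8, 2, 3, 4];
        let i = 4; // one past the end

        // Safe API: bounds-checked, returns None instead of reading out of bounds.
        assert_eq!(buf.get(i), None);

        // `buf[i]` would panic deterministically: "index out of bounds".

        // Unsafe API: compiles fine, but executing this read is undefined
        // behavior, i.e. the classic C footgun, available on request:
        // let b = unsafe { *buf.get_unchecked(i) };
    }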
You have an awfully rosy view of how regulation of programming languages would play out. What will happen (and this is already de facto the case in many "safety-critical" industries) is that Rust is banned and everyone is forced to write straitjacketed C89 and C++03. There's already a congressman on the record complaining about a "native Nigerian" committing code to Rust.
I had never thought of passing legislation against unsafe languages. It sounds like the kind of thorny legislative issue that leaves you with a half-broken system, like all the cookie notices that currently abound. Nice in spirit, impossible in practice. Regulation could be more sensible, but it's all in the enforcement. There's nothing preventing organizations from adopting safer languages and practices right now; a little incentive may help, but I think only so far.
Would you refuse to use Postgres, the Linux kernel or SQLite because they’re all written in C?
Certainly C and C++ have more footguns than many other languages, but highly insecure as well as highly secure software gets written in all languages. I think coming up with better/easier avenues for digital-security-breach related lawsuits and fines is a better idea than banning specific languages.
Where is your legislation supposed to draw the line? In Rust you occasionally need unsafe code: you can't even have a doubly linked list or a bidirectional graph without it. Would you also outlaw JNI calls in Java?
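For anyone who hasn't hit this, here is a minimal sketch (an editorial illustration, not a complete implementation) of why the back-links force the issue: each node is reachable from two directions, which Rust's ownership rules reject for plain references, so the usual approach, including the standard library's own LinkedList, reaches for raw pointers behind `unsafe`. Safe workarounds with Rc<RefCell<...>> do exist, but they are clunky enough that the point stands.

    use std::ptr;

    struct Node {
        value: i32,
        prev: *mut Node, // raw pointers: no ownership, no borrow checking
        next: *mut Node,
    }

    struct List {
        head: *mut Node,
        tail: *mut Node,
    }

    impl List {
        fn new() -> Self {
            List { head: ptr::null_mut(), tail: ptr::null_mut() }
        }

        fn push_back(&mut self, value: i32) {
            // Box::into_raw hands ownership over as a raw pointer; the list
            // now manages the allocation manually (a real one would need Drop).
            let node = Box::into_raw(Box::new(Node {
                value,
                prev: self.tail,
                next: ptr::null_mut(),
            }));
            if self.tail.is_null() {
                self.head = node;
            } else {
                // Writing through a raw pointer is exactly what `unsafe` gates.
                unsafe { (*self.tail).next = node };
            }
            self.tail = node;
        }
    }

    fn main() {
        let mut list = List::new();
        list.push_back(1);
        list.push_back(2); // the second push takes the `unsafe` branch
    }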
Instead of banning unsafe languages, how about allowing tort liability for software? There is a big difference in law between software and other engineering, like building bridges. It used to be that software failures were not as severe as a collapsing bridge, but we are close to the point where that reverses.
Even if we could clearly differentiate between "safe" and "unsafe" languages, what use cases and devices go on the list?
Would phones really rank that high, given that environments with high security standards usually don't allow people to carry them anyway? What about home appliances? Could a state actor hack a bunch of stoves and burn the houses down?
I can easily see some huge bureaucracy being put in place without much benefit (a "Federal Programming Language Commission"?).
Most high-security environments have a physical component, so why not learn from that? Phones could simply have a physical switch that cuts off the microphones and antennas, for example; much easier than trying to police millions of lines of code into being secure.
C and C++ are really versatile, so why are there no safe libraries for them? (Or are there?) If you want to make buffer overflows impossible, add a library that does so and use it. If you want safer memory handling, use a library. Or extend C and C++ with standard libraries that are safer to use. Java is implemented in C, so anything Java can do, C can do. Many languages are in fact implemented in C.
What is this fascination with creating a new language every time we want to add some features? Is it the fame of creating and naming your own language? From a security perspective, creating a new language arguably reduces security, because it will never have been as thoroughly vetted as the old ones.
What he is referencing is what I call "artificial complexity": a way of making problem-solving ludicrously wasteful in man-hours. This is accomplished primarily by limiting remixes of known technology - think, for example, of how easy it is to add types to Lisp, and yet it took decades for anyone to attempt it publicly.
It is the job of human intelligence to cause 'discontention' - discontent and contention. I like to think of it as how you can make powerful gears almost seize up if you understand their weaknesses properly.
I don't think you'd have to ban unsafe languages; you could pass a law that slowly decreases the amount of money the government may spend on 'unsafe' software (whether licensed directly or rented through clouds). The government is such a huge client of these companies that this would immediately create a large financial incentive to migrate.
I think this is probably one of the better, more practical ideas I've seen. If the government has to consider whether the underlying technology adequately addresses memory-safety issues (it doesn't have to be at the language level, but that's obviously the easiest way), that puts pressure on vendors to fund projects that use memory-safe approaches.
That's billions of dollars that will get slowly steered in the right direction.
Are there any compelling reasons why it should be legal to sell exploits to anyone other than the company whose software is vulnerable? By neatly bundling these exploits up and selling the hacking tool to the highest bidder, this company is giving nation-state spying capabilities to cartels and dictators who would otherwise not have them. I don't see why that shouldn't be regulated the same way as if they were selling nukes.
It's even more insane when you remember that strong encryption was regulated as munitions until the mid-'90s [1,2], while actual cyber-weapons aren't regulated today.
Software that can cause real damage, not just to data and business activities but to human lives, is completely legal to sell to despotic lunatics. If you sold a bag of fertilizer to a Syrian you'd go straight to Guantanamo, but somehow this is okay.
[1] https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...
[2] https://web.archive.org/web/20051201184530/http://www.cyberl...
I tend to bang on about software not as engineering but as literacy. It makes some sense even here: bad code is as common as bad law, and often for the same reasons - politics, money, and hard questions.
"Engineering" is a wide subject - the big stuff is carefully built and highly regulated - bridges and buildings. But as we go down the scale we see engineering give way to the problems of politics and money - tower blocks collapse for example, and then we see human level engineering - factory tools that try to meet inflicting goals, dangerous toys and so much more.
The software world should not beat itself up for not being like all those engineers - when lives are not on the line engineers get tied up just the same as the rest. And when lives are on the line, software and hardware engineering have learnt a few things - reduce the scope to the barest possible essentials - have a lot of redundancy and stick to well known designs.
Also, traditional engineers design things to withstand conditions that they would reasonably face in ordinary use, with some additional safety factor. They don't design them to withstand deliberate attacks by nation-level actors like we're seeing here.
If a car explodes because it got hit by an artillery shell, would anyone hold the automotive engineers responsible? If a building collapses because a bomb was dropped on it, would anyone hold the civil engineers responsible?
I doubt "real" engineering is better. It's just that attacking its artifacts doesn't scale, so it looks more secure. In fact, the average bridge or skyscraper is probably absolutely riddled with serious design and manufacturing flaws
It feels like we've come full circle back to 1980, when the US Department of Defense commissioned a 'safe' embedded language.
Jean Ichbiah's team won that contract with the language 'Green' in 1979; they went on to develop it further and standardize it as the Ada 83 language standard.
The more I think about it, the more it feels that Ada came about to solve the right problem at the wrong time.
I find Snowden's take on software engineering to be poorly reasoned and internally inconsistent.
The most important fact relevant to his argument is one he doesn't even bother mentioning, namely that Android is mostly written in a memory-safe language (Java). The thing he's asking for already exists and is deployed on most smartphones worldwide, yet all he has to say on the topic is this:
"While iPhones are more private by default and, occasionally, better-engineered from a security perspective than Google’s Android"
You cannot claim to be a spokesman for freedom, then demand memory-safe languages be used everywhere, and then praise the one system controlled exclusively by a single American firm and written almost entirely in (Objective-)C, a memory-unsafe language.
That sentence is the only mention of Android in the entire article, the word "Java" doesn't appear anywhere, and he seems to think Rust is the only memory-safe language in existence. Why should I care about this guy's opinions? Java has been drastically more successful than Rust at making software memory-safe. Nobody is going to choose to write the next Airbnb in Rust other than for fashion reasons, because it would simply be too unproductive. Developers already complain about Swift and its horrible compile times; Rust would be even worse.
If Snowden really cares about this topic, he should brush up his Java skills, download some OpenJDK early access builds and start experimenting with writing video codecs and 3D engines using the new vectorization, memory span and value types features. Java is getting the capabilities to do even higher performance work traditionally dominated by C++, but in ways that preserve memory safety. The engineering is very difficult and it's unclear if Google will ever adopt it into ART, but it's there for them if they want it.
Loved seeing language safety features called out. How can we prevent monoculture yet retain interoperability and profitability? Are these inherently at odds?
> The greatest danger to national security has become the companies that claim to protect it
No. The greatest danger is lack of software supply chain management followed by near-universal disrespect for formal complexity management methods.
The only way I have found to win at this "are we actually secure" game is to minimize the number of parties you have to trust. The smaller you get this figure, the easier it becomes to gain control over your circumstances again. How many of us can immediately state the exact number of unique parties that we have to trust as part of building solutions for other people?
What about complexity? Most of the time, something is insecure not because of malicious intent (covered by the trust angle above) but because it's so goddamn complex that no one can say for sure whether it's correct. Why do we tolerate this?
> Fixing the hardware, which is to say surgically removing the two or three tiny microphones hidden inside, is only the first step of an arduous process, and yet even after days of these DIY security improvements, my smartphone will remain the most dangerous item I possess.
What's the purpose of these microphones? Do they pose more threat than the standard non-hidden microphone?
> it is still hard for many people to accept that something that feels good may not in fact be good
This strikes me as surprising. I have always been taught the opposite: if it feels good, it's probably bad for you, or illegal, or immoral, or all three.
I definitely agree with Snowden's call to action on making spyware illegal; the execution is plausible but unlikely, for the same reasons he states in the article: every country is working to produce these things itself, for cyber-security self-defense.
I am less sure about his comments regarding unsafe code. As many people here have already said, that's a blurry line - well-intentioned, but nearly impossible to draw. I think more regulation and certification, for both employees and companies, is much more likely to succeed.
I thought this was going to be about Instagram making money by making people, especially women, feel worse about themselves. Turns out there are multiple insecurity industries operating today...
The problem is bigger than hardware and software being "insecure"; ultimately it comes down to trust in these corporations. Do we trust that Apple, Google, and so on will do the right thing? If not, maybe we can create new (perhaps public) companies that are accountable for our privacy, or hold the existing companies accountable. It's going to need a huge swing of power towards ordinary people, though.
Do people really think any of this is ever going to change? Governments don't give a shit; time and again they say they won't do something, only to later do it in secret.
We should accept at this point that this isn't going to change and move ahead with the assumption that our data is already hacked and no longer private. We should focus the discussion on the exact consequences of that and act accordingly.
I think we already know how to build secure tech; we just don't, because of the cost, which is not 2x or 10x but so much higher that it is deemed not worthwhile or not feasible, both for private companies and for governments. So it affects everyone, and we are not getting out of it anytime soon (if anything, we sometimes go in the opposite direction, chasing money and calling it innovation). It would take a book just to explain all the components of this argument, let alone do something meaningful about it. Safe versus unsafe languages is just one piece of it, and frankly a fairly easy one to wrap one's head around: we already have so much code in use everywhere that was written when memory safety was not really a concern, and nobody wants to rewrite it, catch up, and surpass it - there you go: because of the cost. So we get on with it, because meanwhile life goes on, bread needs to be put on the table, people want to play with their shiny phones, and so on.
If you want software where the vendors are liable, you can get that today.
IIRC it could be interpreted as covering a range of types of harm to people and the environment, not restricted to control systems.
https://en.wikipedia.org/wiki/IEC_61508
https://www.gdpreu.org/compliance/fines-and-penalties/
Some of what still gets written in C or other unsafe languages:
- Decoders and encoders for video, images, and audio
- Graphics libraries
- Many parts of fast cryptographic libraries
This is typically for performance-related reasons.
Yes on the theme, but no on "there ought to be a law" banning C/C++ because of an evolving goal of memory safety.
When Rust++ comes out surely there will be people complaining that Rust isn’t safe, and so on.
Best case is you make the consequence punishable (as was described in the article).
Code reflects the programmer's understanding of the world. If this understanding is flawed, the logic will also be flawed.
I can tell you exactly how this will end up: like PCI DSS.
Countries already do this. But they also put exemption clauses in the policies.
Ban exemptions (in all policies) first if you want to make progress.
But you won't like it.
Liability should be created when you expose third-party data. That would disincentivise data collection massively.
This raises the question: is there a compelling reason why selling weapons to other states should be legal?
https://en.wikipedia.org/wiki/Burroughs_large_systems
ESPOL/NEWP were the very first system programming languages to have UNSAFE code blocks, 10 years before C was even an idea.
Before that there was JOVIAL as well, https://en.wikipedia.org/wiki/JOVIAL