userbinator | 8 years ago:
This and Stallman's "Right to Read" (posted yesterday at https://news.ycombinator.com/item?id=14332257 ) are particularly relevant given the whole WannaCry situation. No doubt there will be plenty of authoritarian-minded people thinking computers (and maybe even programming them) should be locked down or regulated more, to stop such attacks.
In some sense, I take solace in the fact that hacks, attacks, cracks, leaks, etc. continue to happen regularly; they are a sign that there is still some freedom left in society. "Imagine a world without crime" is a fairly common phrase, and if you actually do imagine it, you will realise it would be pretty much the world of Orwell's 1984: there is no crime because there is no freedom of thought or action left; everything is under the control of some central authority.
This goes beyond computers, although they will be a large part of it; it's really a general war on freedom.
microtonal | 8 years ago:
I think this conflates several issues. First, WannaCry spreads through a weakness in Microsoft's SMB implementation. Such a weakness can occur in both walled-garden and open systems.
Secondly, sandboxing, besides improving security, can increase our freedom: we get to decide which application can access which data. In the model used on many computers today, an application can access all user files and exfiltrate usage patterns, address books, etc. WannaCry would be fairly ineffective if it were sandboxed and could not encrypt the user's files. As things stand, such malware (as we've seen with the Handbrake malware) can upload all your passwords, browser history, and more.
Sandboxing is fine; it should be up to the user to decide what gets sandboxed and what doesn't.
(I do agree that society is moving towards being totalitarian, or at least making itself extremely vulnerable to totalitarianism.)
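A minimal sketch of the sandboxing argument above. This assumes nothing about any real OS API: the `AppSandbox` class, the container layout, and the paths are all invented for illustration.

```python
from pathlib import PurePosixPath

class AppSandbox:
    """Toy write policy: an app may only write inside its own container.

    Real systems (iOS/macOS app containers, Flatpak, Linux Landlock)
    enforce this in the OS; this simulation only shows why sandboxed
    ransomware would fizzle: it could scramble its own scratch space,
    but not the user's documents.
    """

    def __init__(self, app_id):
        self.container = PurePosixPath("/containers") / app_id

    def may_write(self, path):
        p = PurePosixPath(path)
        # Writable only if the path is the container itself or lies below it.
        return p == self.container or self.container in p.parents

wannacry = AppSandbox("wannacry")

# Inside its own container: allowed, but harmless to the user.
assert wannacry.may_write("/containers/wannacry/scratch.bin")

# The files ransomware actually wants to encrypt: denied.
for target in ["/home/user/Documents/thesis.docx",
               "/home/user/Pictures/2017/family.jpg"]:
    assert not wannacry.may_write(target)
```

The same check also models the freedom-preserving variant discussed below: the user, not the vendor, would decide which containers get which grants.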
eksemplar | 8 years ago:
Big data, machine learning, and things like facial recognition in public spaces are steadily leading us toward a society of total control.
We are not there yet, but I personally fear that all it would take is the political realization that the ability to predict behavior leads to more efficient governance.
Technology has the ability to "set us free" or "enslave us", but in a democracy it's up to the people to decide which they want. In the current conservative climate, we're certainly heading toward dystopia, racing to make some of the darker cyberpunk visions come true.
TeMPOraL | 8 years ago:
We need to be careful with the scope here, I think. "War on general-purpose computing" is already abstract enough that most people just roll their eyes and move on; expanding it to a "war on freedom" guarantees that almost nobody will care.
I have mixed feelings about what to do. On the one hand, there would be merit in requiring a professional license for programmers, as in other engineering fields. A lot of the mess in our industry could be cleaned up if at least some jobs required a license, which would give both a recognized right to refuse work on ethical grounds (backed by a professional association) and liability in case you fucked up badly and people died.
On the other hand, I fear the day when Turing-complete systems get regulated and require a (probably expensive) license to use. Like many others here, I benefited a lot from being able to tinker with computers and programming languages in my teenage years, before I had access to formal education on the topic. I would like my children to have the same chance.
In a way, my feelings about sandboxing reflect this on a smaller scale. On the one hand, I appreciate the idea of isolation and don't want user-hostile software to be able to do whatever it wants on my system. On the other hand, I'd love to keep the right to breach the sandbox myself and mess with the software running in it. On Windows I can still alter the GUI elements of running applications; try that on unrooted Android.
If it were just security vs. freedom, the problem would be relatively easy; some compromise could be reached (e.g. an expanded definition of "life-critical" systems that require licenses, and hopefully licenses for jobs involving personally identifiable information). As it is, there are other selfish or malicious actors in play, like the music and movie industries pushing for DRM and corporations fighting for their walled gardens. It will be hard to navigate this problem space.
michaelbuckbee | 8 years ago:
It is interesting to compare WannaCry to the Google phishing scam that was going around a couple of weeks ago. [1]
They are obviously very different issues, but the resolution was that the phishing scam was shut down in short order: because it relied on Google for OAuth, Google could turn the app off and revoke its credentials.
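A toy model of that centralized kill switch. This assumes nothing about Google's real OAuth endpoints; the class, method names, and identifiers below are invented. The point it illustrates: if every API call re-checks the client registration, revoking one client invalidates every stolen grant at once.

```python
class OAuthProvider:
    """Toy OAuth provider: tracks registered client apps and per-user grants."""

    def __init__(self):
        self.active_clients = set()   # client ids allowed to call the API
        self.grants = set()           # (client_id, user) consent records

    def register_client(self, client_id):
        self.active_clients.add(client_id)

    def authorize(self, client_id, user):
        # The phishing step: the user clicks "Allow" on a genuine consent screen.
        if client_id in self.active_clients:
            self.grants.add((client_id, user))

    def revoke_client(self, client_id):
        # The provider-side kill switch.
        self.active_clients.discard(client_id)

    def api_call_allowed(self, client_id, user):
        # Every call re-checks both the grant and the client registration.
        return client_id in self.active_clients and (client_id, user) in self.grants

provider = OAuthProvider()
provider.register_client("google-docs-impostor")
provider.authorize("google-docs-impostor", "victim@example.com")
assert provider.api_call_allowed("google-docs-impostor", "victim@example.com")

# One revocation protects every victim at once; no per-machine cleanup needed.
provider.revoke_client("google-docs-impostor")
assert not provider.api_call_allowed("google-docs-impostor", "victim@example.com")
```

Contrast this with WannaCry, where no central party can flip one switch: every infected machine has to be patched or cleaned individually.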
0xdeadbeefbabe | 8 years ago:
I always like to try to pick out the single most useful sentence or two from long essays like this. Here's my go:
"So when I get into a car—a computer that I put my body into—with my hearing aid—a computer I put inside my body—I want to know that these technologies are not designed to keep secrets from me, or to prevent me from terminating processes on them that work against my interests."
Did we even fight the war? Around here I regularly see people advocating for this sort of lockdown. It used to be just Apple, but now it's in Windows and Chromebooks too.
My experiences with the EFF in real life in Europe have been abysmal. I tend not to donate to organizations that are contrarian by statute.
Especially when you go to conferences: you hear panels where an EFF representative chats with a Google lobbyist and ends up using sentences like (and I quote) "OMG, you are so right" and "Wow, I can't agree more". The theme was censorship of content, and the critique (clearly instrumental to Google's bottom line) was that no content should be regulated by law.
eecc | 8 years ago:
I don't mind a new security model for personal computing, one that sanely quarantines (or containerizes, or sandboxes) an application.
A browser or app process has no business in ~/Library (or the equivalent on Linux and Windows) beyond its own application support and preferences plist files.
If an app needs to access any artifact in my $HOME, it should be explicitly authorized for that specific one (e.g. when I want to embed an image in a presentation) by a UI element that is part of the OS itself.
Right now I might as well be running as root on my machine, because as soon as I'm 0wned, my whole life can be siphoned off to some dodgy server on the net.
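The grant flow sketched above (an OS-owned dialog authorizing one specific artifact) could look roughly like this. `BrokeredHome`, the dialog simulation, and the paths are all hypothetical, a sketch of the idea rather than any real API.

```python
class BrokeredHome:
    """Toy contrast between today's model (any process can read all of
    $HOME) and a brokered one, where each artifact needs an explicit
    grant issued by an OS-owned picker dialog (simulated as a method call).
    """

    def __init__(self, files):
        self.files = files      # path -> contents, a simulated $HOME
        self.granted = set()    # artifacts the user explicitly approved

    def read_legacy(self, path):
        # Current model: nothing stops an 0wned process from siphoning this.
        return self.files[path]

    def os_grant_dialog(self, path):
        # Simulates the user clicking "Allow" in a dialog owned by the OS,
        # not by the (possibly malicious) application.
        self.granted.add(path)

    def read_brokered(self, path):
        if path not in self.granted:
            raise PermissionError(f"no grant for {path}")
        return self.files[path]

home = BrokeredHome({
    "~/Documents/passwords.txt": "hunter2",
    "~/Pictures/logo.png": "<png bytes>",
})

# Legacy model: a compromised app reads everything.
assert len([home.read_legacy(p) for p in home.files]) == 2

# Brokered model: only the artifact chosen for the presentation is reachable.
home.os_grant_dialog("~/Pictures/logo.png")
assert home.read_brokered("~/Pictures/logo.png") == "<png bytes>"
try:
    home.read_brokered("~/Documents/passwords.txt")
    raise AssertionError("should have been blocked")
except PermissionError:
    pass
```

The design choice here is the "powerbox" pattern: the app never browses $HOME itself, so it needs no blanket filesystem permission at all.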
The trouble with this sort of article is that it takes too long to get anything useful out of it. A better structure would state the core idea at the beginning and use the rest of the article to explore and defend it. Writing is not an ordered series of deductions, as this author seems to assume; it's far more complicated.
bruo | 8 years ago:
This kind of text is an essay; it doesn't need to follow the rules for writing an article, because it isn't one.
The author probably knows how complicated writing is (he is a professional writer), and still manages to do it well enough to have won prizes for his work: https://en.wikipedia.org/wiki/Cory_Doctorow#Awards
gue5t | 8 years ago:
The problem is not a lack of access to a general-purpose computer. The problem is the lack of control over the computers you're forced to use in order to manipulate their attached I/O devices: lights, refrigerators, screens, speakers, insulin pumps, steering and brakes, Internet connections, etc.
dano | 8 years ago:
A corporate setting is completely different: a company can choose how you access its own hardware (putting aside personal gripes, for the sake of efficiency and convenience).
hyperhopper | 8 years ago:
However, the real problem is hardware-level DRM and non-open CPUs, which mean that I cannot truly own my own personal computer.
mark_l_watson | 8 years ago:
This six-year-old article is still relevant, but by now most of the public seems very satisfied with iPhones or Android phones as their primary digital devices. Even though I can hack on Haskell and Python code on my iOS devices, I see them, and am happy with them, basically as appliances.
When I program, I am in portable environments (Pharo Smalltalk, Emacs with Common Lisp or Haskell) that can sit on top of more or less any general-purpose OS.
If our government (I am in the USA) locks down computing devices to an extreme extent, that will just screw up our economy, and better-run tax jurisdictions (countries) will benefit.
pmiller2 | 8 years ago:
I'm really surprised there was no direct mention of the DMCA. That's turned out to be the thing that's given us unrepairable John Deere tractors and printers that won't accept third-party ink cartridges.
You will always be able to do general-purpose computing, maybe just with a loss of speed. No one is going to backdoor every 10-dollar microcontroller.
api | 8 years ago:
Knowing history, that's usually what they get. WannaCry used an OS vulnerability that would have existed regardless of how locked down user space is.
1 - http://gizmodo.com/a-huge-and-dangerously-convincing-google-...
daoubt | 8 years ago:
"So when I get into a car—a computer that I put my body into—with my hearing aid—a computer I put inside my body—I want to know that these technologies are not designed to keep secrets from me, or to prevent me from terminating processes on them that work against my interests."
Indeed.
eponeponepon | 8 years ago:
Five years down the line, though, I'm no longer sure that "We haven't lost yet," as the closing paragraph puts it.
confounded | 8 years ago:
Donation link for those interested: https://eff.org/donate
zardo | 8 years ago:
https://www.techdirt.com/articles/20151027/10131232649/libra...