I hate this practice; I have no idea how it became commonplace. Of course installation procedures can often be long and tedious, but it only takes one popular project's script server being compromised for tons of people to suddenly be running malicious commands.
I would go through manually installing dependencies and setting up my system, adding repos, etc. over running some script any day. But then again some projects wouldn't be that popular if they were hard to install.
Some of npm's installation instructions ask you to pipe curl into bash to run a lovely script [0] which makes things easier for you, but not by much. Is it really necessary? Would developers give up trying to get npm and node just because installing them is not as easy as "curl https://some.script.com/that-script.sh | sudo -E bash -; sudo apt-get install npm"?
Beyond building and installing programs, adding GPG/SSH keys as in the blog post can be just as dangerous, and while it wouldn't be simple, some mechanism could be built to make things easier without making you run commands you never even check.
Look at it this way: whenever you run a program you didn't write yourself, you're running a bunch of commands you never checked. This is no different from, say, downloading a precompiled executable and running it, with all the same problems and tradeoffs.
People are lazy, developers are people, and most developers are not interested in the process of system maintenance, so if something can be done fast and easy, they couldn't care less about the details. My colleague still uses the Windows 8 that came preinstalled on his laptop 5 years ago; it's slow as hell and riddled with all kinds of toolbars, but he doesn't want to make the effort of a full reinstall. He would happily pipe curl to bash; actually, I'm not even sure he understands what curl or bash is. He knows some Java and JavaScript, and that's enough to be paid.
I like tinkering with systems. I've reinstalled my home server maybe 20 times, and I always reinstall macOS from scratch when a new version is released, but honestly it's not very productive time spent, so I understand people who just want to get things done.
I think it's a failure of current operating systems: installing software is still too hard and tedious. Opening a terminal and copy-pasting strings is not a trivial activity. If people could install npm by pushing a button on its website, they would do that instead.
I very much like this formalism for describing installation procedures. Being curious and informed about the risks, I will not copy-paste curl ... | sudo .... I will curl to a temporary file, edit the file, and if there is nothing fishy, copy the commands (from the file, not the browser) into a terminal.
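That workflow can be sketched as a few shell steps (the URL is a placeholder, and `less` stands in for whatever editor or pager you prefer):

```shell
# Download to a temp file instead of piping straight into bash
curl -fsSL https://example.com/install.sh -o /tmp/install.sh

# Read it first; bail out if anything looks fishy
less /tmp/install.sh

# Only then run it (and only under sudo if it actually needs root)
bash /tmp/install.sh
```

Nothing clever, but the pause between download and execution is the whole point: the script you inspected is the script you run.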
If the content of the file is a very messy (or overly long) script, I take it that the software will also be messy and does not deserve my time.
This formalism flatters my ego, proves transparency, and saves me time. Full benefit. It is dangerous for noobs, but it's also a good opportunity to educate them.
> But then again some projects wouldn't be that popular if they were hard to install.
Slackware was my first distro back in 2007, and I still wish it had at the very least a decent installer and package manager. It is so well built, but upgrading and installing are just not as trivial as on Debian. I guess openSUSE is (afaik) the only remnant of Slackware that's still usable - I really loved openSUSE, but just like with every other distro in the world, I have to dance to get my Wi-Fi working properly.
I've wasted several days trying to get programs I write turned into debs and rpms, and I gave up. Each is a single executable you can download and put wherever you like, or you can download the source and run './configure.py; make'.
Also, I release new versions regularly, so being in the official repositories is no good, as they would get out of date; I would have to run my own repositories for several versions of Ubuntu and Red Hat. No chance.
Your example fetches the key from the keyserver without HTTPS. Fetching the key from the project's own site over HTTPS using curl is better.
Edited to add: Fetching from a keyserver is OK-ish if a) you use the long form of the key ID and b) your gpg is new enough to check that it got the key for the ID it requested. Still, the web page you copy the key ID from is as vulnerable to an attack on the server as the server serving the key directly.
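For illustration, the short-vs-long distinction looks like this (the fingerprint below is the one Docker publishes for its apt key; treat it as an assumption and verify it from the vendor's docs yourself):

```shell
# Short (32-bit) key ID: collisions are cheap to generate, don't use
# gpg --recv-keys 0EBFCD88

# Long form: request by the full 40-hex-char fingerprint, so a modern
# gpg can check that the key it received matches what was asked for
gpg --keyserver hkps://keyserver.ubuntu.com \
    --recv-keys 9DC858229FC7DD38854AE2D88D81803C0EBFCD88
```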
> Consider the case where download.docker.com starts serving an evil key file
At that point I can't trust the key ID in the Docker documentation either. Since Docker doesn't use the web of trust (who does, honestly?), there is no way I can verify the key ID in the provided key file. So I don't see how inspecting the key file before adding it to the apt keyring does any good.
On piping anything from the Internet directly to your system for execution: Don't be lazy. Don't be an ass.
When I am working in a persona that is responsible for managing a server or a service, I insist on knowing everything I need to know about how to keep that service, and the environment in which it operates, safe, alive, and performing usably.
I require good, clean, and coherent instructions for deploying something at production level, where all required components and their preferred method of interaction are clearly explained and documented by the developer, and can be repeated by me in a predictable manner.
If all I have to work with is "pipe this to the shell, alternatively read the code" I'm going to go with "nah, I'll find something professional".
Time spent installing a system should be only a minuscule fraction of time spent actually operating the system. Spending a few extra hours doing it right shouldn't make a difference.
... and then you have to check the fingerprints manually, and delete the ones you don't want manually.
No, what would really solve this specific issue is allowing apt-key to add only a single key, and giving it the expected fingerprint (as zimbatm explains).
Apt-key should just have a built-in way of importing keys from HTTP(s) URLs, preferably in interactive mode so you can confirm the keys are legitimate before adding them.
Of my first steps into the world of Linux this year, this sort of procedure has been one of the most glaringly disturbing. Another, similar one was packages being downloaded over HTTP.
Debian packages are signed, so they are safe to transmit over HTTP. See https://wiki.debian.org/SecureApt (which appears to have been written around the time of the transition, so it's somewhat out of date; e.g., SHA1 signatures are no longer trusted).
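The chain that makes plain-HTTP mirrors safe can be spot-checked by hand; a rough sketch (keyring path and mirror file names are assumptions based on a standard Debian setup, and the .deb path is just an example):

```shell
# 1. The Release file is signed; verify it against the distro keyring
gpgv --keyring /usr/share/keyrings/debian-archive-keyring.gpg \
     Release.gpg Release

# 2. Release lists checksums of every Packages index
grep -A3 '^SHA256:' Release | head

# 3. Packages in turn lists the checksum of each .deb, so one valid
#    signature transitively covers every package, whatever the transport
sha256sum pool/main/h/hello/hello_2.10-2_amd64.deb
```

An attacker on the wire can block updates, but can't alter a package without breaking a checksum somewhere up the chain.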
Anyways, hope projects grow out of this habit.

[0] https://deb.nodesource.com/setup_6.x
Is this bug fixed now?
It also apparently works by generating a bash script in /tmp/ and executing it...
http://thejh.net/misc/website-terminal-copy-paste
That's the downside of everything-is-just-strings in Unix.
What would really help: Publishers providing the key in a clear text copy-paste format, and providing instructions on adding the key to apt-key.