I used to think this way as well, but out of necessity I had to work with a commercial IDE for some time. Turns out, if you are confident that learning the keybindings of a robust IDE is worthwhile (e.g. you know you must use it for a particular project for a decent amount of time), the investment pays off just as well as learning shell commands. A good IDE can do everything a cobbled-together shell pipeline can, often in two keystrokes instead of N trial-and-error command inputs, because you forgot, yet again, that while `grep` takes the pattern argument before a path, `find` does the opposite. In spite of POSIX, there's an incredible lack of UX standardization when it comes to unix tooling.
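For example (a throwaway directory stands in for a real project):

```shell
# Scratch directory to demonstrate on:
cd "$(mktemp -d)" && mkdir src && printf 'TODO: fix\n' > src/a.c

# grep expects the pattern first, then the paths:
grep -r 'TODO' src/
# find expects the paths first, with matching expressions after:
find src/ -name '*.c'
```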
People like to pretend that learning shell commands is somehow better because it's more portable, but unless your work entails working across multiple machines and environments daily, it doesn't matter. Most of us do most of our work on one machine, in one environment, so at the end of the day, it really comes down to preference. Personally, I do prefer working in the shell, but if you prefer working in a modern IDE, don't let anyone fool you that you're somehow wasting time or being less proficient than you could be in a shell--if you took the time to learn the IDE keybindings and the IDE is on par with anything JetBrains puts out, you're not losing in efficiency at all and possibly making gains when it comes to certain tasks.
> People like to pretend that learning shell commands is somehow better because it's more portable,
No, it's better because if you aren't doing exactly the workflow an IDE or GUI tool designer has envisioned, it is almost invariably much easier to do it in shell (and then make it a script and then bind it to a command in your IDE or GUI tool, if they support that) than to beat the non-shell tool into, first, doing what you want, and then making it easily repeatable.
It's also more portable, which, contrary to your description, is very useful for most people, because even if there are some dev-only tasks that “I can only do it in the IDE” is fine for, for many things you may also want to do in a CI pipeline, on a deployment box, or other places where your IDE isn't running. Shell scripts generally work there.
"People like to pretend that learning shell commands is somehow better because it's more portable, but unless your work entails working across multiple machines and environments daily, it doesn't matter."
It is more portable. No pretending is necessary. It's true. I run multiple computers with different resource constraints and operating systems. I have neither the patience nor the time (not to mention the system requirements) to install an IDE on all of them. However, each one has an OS that comes with a POSIX-like shell, e.g., NetBSD's sh, FreeBSD's sh, OpenBSD's sh, and Linux's sh, which is derived from NetBSD's sh.
The author of this blog post begins his demonstration of shell wizardry with the "history" command. This command does not exist in POSIX sh. On NetBSD I use "fc -l 0". Go figure, it is more portable (never mind fewer keystrokes). The scripts I write in NetBSD sh run on FreeBSD, OpenBSD, Linux, and a number of other OSes without any modification. I do not have to learn multiple shells to do work. On BSD, I also use the POSIX-like scripting shell (sh) as the interactive shell.
The best part is I do not have to install anything; it is already included, so there are no worries about system requirements. All these OSes require a POSIX-like sh. None of them require an IDE.
"Personally, I do prefer working in the shell, but if you prefer working in a modern IDE, don't let anyone fool you that you're somehow wasting time or being less proficient than you could be in a shell"
I totally agree with that sentiment.
"[...] unless your work entails working across multiple machines and environments daily"
That situation might be a lot more likely than you'd think. I started out doing WordPress, and that's still mostly what makes me money.
But now I am using 30-40 different machines that host 300-400 sites, each with their own installs, and knowing how to do stuff efficiently with the shell is super useful.
To me, that situation feels kinda unlikely, but at the same time dang it's nice to just ssh into a machine and know what I'm doing.
I have coworkers and clients who just can't do certain kinds of tasks or troubleshoot stuff with certain kinds of tools because they just don't have the CLI knowledge.
So while I generally agree with your post, I will offer an alternate idea to balance it: if folks think that learning how to use a shell is a waste of time because they can do everything in a GUI, then if they took the time to learn the shell to the level of covering most of the things they use every day in an IDE, they too would be "not losing in efficiency at all and possibly making gains when it comes to certain tasks."
Yes, the IDE is supposed to be faster for the things it was designed to do. That's the whole reason for its existence. But the shell allows you to do much more than your IDE was supposed to do. If you don't learn it, you're confining yourself to the boundaries set by the IDE.
If you work with mechanical or electrical engineers, then you're going to run into Windows machines pretty quickly, and then you need to convince them to install MinGW to run your shell scripts. Sure, shell scripts are portable between Mac and *nix, but you're leaving out a large elephant.
The thing is, sometimes it's faster to go to the shell and use `grep -r`/`ag` than to wait for 'jump to definition' to do its job.
It depends on the language and the project, of course, but it's true for me on the big C++ project at work.
CLion is awfully slow to start; VSCode starts fast, but I can often find things faster in the shell than in the IDE :-( so I use it mostly as a dumb text editor.
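The kind of lookup meant here is just the following (the symbol and paths are made up for illustration; ripgrep's `rg` does the same with less typing):

```shell
# Scratch project to search in:
cd "$(mktemp -d)" && mkdir src
printf 'void parseConfig();\n' > src/config.h
printf '#include "config.h"\n' > src/main.cpp

# Recursively look up a symbol across the sources, with file:line output:
grep -rn --include='*.cpp' --include='*.h' 'parseConfig' src/
```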
Does your IDE run inside a 500 MB Linux container? After cloud adoption, I would have difficulty finding software developers whose work does not entail running across multiple machines and environments. (OK, the community of scientific researchers is probably exempt from this, but they are a minority.)
From my experience, the capabilities IDEs and unix tools offer are mostly orthogonal. Also, your complaint about inconsistent UX has a simple fix: write a small wrapper around the offending tool (or use newer replacements, e.g., ripgrep and fd). `tldr` also tells you the correct interface quickly.
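For example, a one-liner wrapper can give `find` a grep-like argument order (`ff` and its interface are made up for illustration):

```shell
# ff PATTERN [DIR...] -- like find, but pattern-first, grep-style
ff() {
    pattern=$1
    shift
    find "${@:-.}" -name "$pattern"
}

# Demo:
cd "$(mktemp -d)" && mkdir docs && touch docs/notes.md
ff '*.md' docs
```

Dropped into your shell rc file, `ff '*.md' docs` then reads the same way a grep invocation does.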
I can just never get past the arg/flag inconsistency/complexity across commands. Perhaps if the docs started with a simple example of what has been seen over time to be the most common incantation for each command, it might be tolerable. But as it is, I spend more time dealing with the idiosyncrasies of a command than I do expressively piping stuff.
Perhaps if Rust had five mutually incompatible borrow-checkers which get called based on who wrote the code for a particular language construction-- that might get across the frustration I feel when using the shell. (Well, honestly I just StackOverflow for the shell incantation I need and that seems to work well enough.)
You're right. I often think the man page should begin with a few examples, then launch into the neverending list of options.
That is no way to learn. In foreign language 101, they start you off with a small group of examples. "Como estas?" "Muy bien. Y tu?" Afterward, they explain the rules of the language (this is a noun, this is a verb, this is how you conjugate for first-person singular, etc.). In fact, this is how we learn our first language as babies. Anyone remember Mom and Dad pulling out a flip chart? Or did they just talk to you a lot?
The same goes with apprenticeships, I would think. The blacksmith starts the apprentice with simple tasks around the shop. I suppose he would intersperse it with the occasional pontification about principles and theory, but he would not sit down the pupil for weeks explaining everything before just letting him get his hands dirty.
The Linux man pages are upside down. Examples don't come till the very end, if at all.
Thankfully, like you said, there is now Stack Overflow.
> Perhaps if the docs started with a simple example of what has been seen over time to be the most common incantation for each command it might be tolerable.
You might enjoy https://cheat.sh which is usable via curl: `curl cheat.sh/grep`
Using Fish as my shell makes this somewhat more tolerable, as it checks the manpages and automatically suggests flags in a tab-complete fashion. It also remembers previously run commands and suggests them, which helps with not having to remember exactly which flag does what, because the example comes from the last time you ran the command.
> Perhaps if Rust had five mutually incompatible borrow-checkers which get called based on who wrote the code for a particular language construction-- that might get across the frustration I feel when using the shell.
You mean like the five mutually incompatible async solutions in rust?
The inconsistency sometimes bothers me too, but it makes me feel a little better to remember that so many of the CLI commands we take for granted are part of an old historical heritage. The inconsistency is part of that heritage--today's CLI wasn't designed all at once by one group, but rather evolved over 50+ years from many contributors, back when nobody expected that people would still be using `sed` in 2020.
For me, reflecting on that history helps dull the annoyance of having to type `grep --extended-regexp` but `sed --regexp-extended`.
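Concretely, with the GNU versions of both tools (the long-option spellings below are GNU-specific; BSD variants differ again):

```shell
# Same feature (extended regular expressions), different long-option names:
printf 'cat\n' | grep --extended-regexp 'c(a|o)t'
printf 'cat\n' | sed --regexp-extended 's/c(a|o)t/match/'
```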
At least when you go to Stack Overflow or read the man page, if you find an answer, then it is likely to work. In contrast, if you go to Stack Overflow to find out how to do something with a GUI program, there's a good chance that the answer you find is for a different version than you are working with, and does not help at all.
Almost nothing in computerdom was designed from a developer-centric perspective and it blows my mind.
Maybe I'm 'opposite brained', but it's the first thing that I think about when making something, sometimes at the expense of the algorithm.
There's a bird in the back of my mind literally every time I use the shell chirping at me to translate them all into something consistent, and then make actually useful manpages for them.
The world is complicated, we have too much to learn, communications and ramp-up are essential and part of the product even if there's genius under the hood. (I'm also looking at you Rust, Git).
> Perhaps if the docs started with a simple example of what has been seen over time to be the most common incantation for each command it might be tolerable.
Rsync’s man page opens with examples, and even though I never actually use those specific commands, they are usually enough for me to remember how to do whatever I wanted to do.
You would probably love the `tldr` command (https://tldr.sh/), then. It's a user-maintained library of common example incantations for any given command. Nowadays I check it before `man`.
Learning the shell was one of the best things I did as a programmer.
It helped a lot that 15+ years ago I committed to running desktop Linux as a daily driver. Especially back then when things were much rougher on the desktop, using Linux as a daily driver means occasionally doing something in the shell.
Ultimately, just like learning a programming language, one must have a practical reason, a project, to make it worth the while to learn as you go. For someone interested in learning to use the power of the shell, don't just go read a book or do some exercises or tape a cheat sheet to the wall. Instead, commit to using it for daily tasks for a month, or for maintaining a project entirely in the shell, not even using a GUI file manager. It's tough at first but so worth it!
At least it still gives you the choice. I learned a lot by stripping down my system and removing/replacing a lot of components to the extent that it didn't resemble Fedora (or Debian, or whatever) at all anymore. The amount of stuff you learn after completely trashing your only system and needing it to be restored and usable in a few hours…
Hear hear! I remember a senior dev at my first job being stymied by the command line; he was very adept in the IDE, but had a hard time navigating directories. It was definitely a disadvantage when it came to getting stuff done.
One thing the author didn't cover is how you can share reified knowledge when you write shell scripts. (It's the same with other programming languages, but they aren't as easy to write or modify.) That is really powerful, because you can share it not only with others but also with your future self.
I wrote about this more here: https://letterstoanewdeveloper.com/2019/02/04/learn-the-comm...
I used to joke at my previous position that I was a "programmer-lite", because I was in a support role and they wouldn't let me write code, but for some reason I was allowed to write and commit bash scripts, so all of the programmers started coming to me to write the shell script interfaces to their programs. I took pride in making the scripts "safe", running them through shellcheck and actually checking the exit status on things.
To this day, I've still never quite gotten the hang of the whole "trap" mechanic, though...
Yeah the way I think of this is that some people write a text or Markdown file with shell commands for documentation.
I INVERT this, and write a shell script with comments :) That way someone else can reuse your knowledge more easily (and your future self as well).
Examples:
Constructing a big curl command to use the Zulip API, and also using jq for the first time. I used this to easily make a blog post out of a long Zulip thread [1]
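A hedged sketch of that curl + jq combination: the sample JSON below stands in for a real response from Zulip's GET /api/v1/messages endpoint (which you would fetch with `curl -u EMAIL:API_KEY` against your realm); the field names follow the Zulip API.

```shell
cd "$(mktemp -d)"

# Stand-in for the API response a real curl call would produce:
cat > messages.json <<'EOF'
{"messages": [{"sender_full_name": "Ada", "content": "Shell tips?"},
              {"sender_full_name": "Bob", "content": "Use fc -l."}]}
EOF

# Turn the thread into post-ready text; adjust the formatting to taste:
jq -r '.messages[] | "**\(.sender_full_name)**: \(.content)"' messages.json
```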
Though one issue is that shell scripts don't really specify their environment, but there is a large number of tools growing around containers that can solve this problem. (Basically Docker is being refactored into something more sane; thank you to OCI and others.)
So I hope to integrate the Oil shell and containers more so shell scripts are more reproducible. I mean most of the container tools are already command line tools so in some sense it's already done, but you can imagine closer integration (e.g. not just passing code as text from outside the container to inside the container).
And one thing I've wondered is if Oil should literally run shell out of markdown, so you can create executable docs. I can see it being useful, but it might be something you should do with a separate tool that converts markdown code blocks to a shell script...
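A crude version of that separate tool fits in one awk invocation (assuming code blocks are tagged `sh`; `md2sh` is a made-up name):

```shell
# md2sh FILE: extract the bodies of ```sh fenced blocks from a
# markdown file, so the result can be piped to sh:
md2sh() {
    awk '/^```sh$/ { run = 1; next } /^```$/ { run = 0 } run' "$1"
}

# Demo document:
cd "$(mktemp -d)"
printf 'Setup steps:\n```sh\necho hello from the docs\n```\n' > doc.md
md2sh doc.md | sh
```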
Yes, you're absolutely spot on. I'm part of a remote team, and we made it our mission to use shell commands to communicate business-centric admin steps, such as: add this row to the table, modify this stored parameter, invoke that API call via curl, etc. They're usually shorter than 10 lines but illustrate exactly what needs to happen, with very little room for interpretation.
This week I got my drill bit bound up in a double stud I wanted to run some wire through. The drill went into thermal cut out and wouldn't budge. I was at a loss of how to get the bit out without destroying the stud until I remembered I had a hand drill in the shop. It came from my great-grandfather. It's solid and well maintained and I was able to hand crank that drill bit through the rest of the wood. That's not the first time a simple hand tool has saved the day when a power tool let me down.
I agree that IDEs and the tools that accompany them are powerful and can make you more productive. Learning them is an investment that can pay dividends. On the other hand, most of the IDEs I took the time to learn in school are now obsolete. I still use the shell today because those skills are still relevant and have gotten me out of a jam plenty of times. I've chosen to prioritize learning tools that are reliable and lasting even if it costs me some productivity.
I don't think that a shell should be a complete programming language. If you need a programming language, it's better to use one. There is xonsh if you are looking for something like this.
I think there should be a better bash with a very clean and consistent interface. Some of the most commonly used utils like awk '{ print $2 }', sed, grep, sort, ... should be included out of the box to improve performance and to provide consistency across distributions.
Absolutely agreed, shell is in many respects terrible.
>I don't think that a shell should be a complete programming language.
On the contrary, I think shell should be a more complete programming language! Drop the stringly typing and add actual types (hence eliminating 80% of bothersome awksedgrep magic; yes, no need to tell me it's a real tall order), add proper error handling...
I truly value simplicity where possible, but I think that the primary interface that I use to communicate with my computer should be as powerful as possible. Numerous times I've built up a shell pipeline only to realize near the end that I need to do something that shell is horrible at, and had to redo the whole thing in Python from scratch. I cannot really see a reason to limit it.
>Some of the most commonly used utils like awk '{ print $2}', sed, grep, sort, ... should be included out of the box to improve performance and to provide consistency across distributions.
UNIX "philosophy" got a lot of things wrong, but not this one. These common utilities you mention are good especially when they're separate, because it's not the shell's job to improve the performance, consistency, etc. of a dozen and a half different utilities, a number sure only to grow in time as people discover what commonly used thing they want in their shell. Though I'll give you that `awk '{ print $2 }'` should definitely be its own utility or a built-in.
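For what it's worth, the closest existing single-purpose tool, `cut`, is not a drop-in replacement, which is presumably why everyone reaches for awk:

```shell
# awk splits on runs of whitespace; cut treats every space as a separator:
printf 'a   b\n' | awk '{ print $2 }'    # prints: b
printf 'a   b\n' | cut -d ' ' -f 2       # prints an empty field
```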
That pipeline sure does look convenient. Let's see how it will handle a path with spaces.
    $ git status -s | grep '^ D' | awk '{ print $2 }' | xargs git checkout --
    xargs: unmatched double quote; by default quotes are special to xargs unless you use the -0 option

    $ git status -s
     D "g h i"
I'd be lying if I said I was surprised, to be honest.
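One robust way to rewrite that pipeline is to stop parsing human-readable output altogether: `git ls-files --deleted` lists exactly the paths the `grep '^ D'` was fishing for, and NUL separators (`-z`/`-0`) survive any whitespace. A sketch, demonstrated in a throwaway repo:

```shell
# Set up a repo with a committed file whose name contains spaces,
# then delete it from the worktree (shows as " D" in git status -s):
cd "$(mktemp -d)" && git init -q .
printf 'hello\n' > 'g h i'
git add . && git -c user.email=you@example.com -c user.name=you commit -qm init
rm 'g h i'

# Restore it: NUL-separated end-to-end, so spaces and quotes are safe:
git ls-files --deleted -z | xargs -0 git checkout --
```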
I spend many hours of each day ~programming~ wrangling text files and I use macos + zsh + textmate2 for my daily drivers. I see shell as an important proficiency because it helps maintain a lower-level understanding of how the "magical" GUI "works," which often helps in debugging obtuse errors, and I'm sad to encounter more engineers who are completely unfamiliar with it.
When it comes to examples like that posted by the OP, I like the combination of piping/pasting to mate and multi-caret editing for most scenarios where others would reach for awk/xargs.
Here's me following the same example scenario but with multi-caret editing (slowed down slightly):
1. git status -s | mate
2. select " D" with arrow keys + shift
3. command-E macos default for "use selection to find"
4. option-command-F to find all (multi-caret editing starts)
5. type "git checkout" to replace " D"
6. command-left to move cursor to start of line, then shift-command-right to select to end of line.
7. copy
8. select all + delete (clear document)
9. paste
10. press return to insert newlines
11. select all + copy
12. switch back to terminal
13. paste
To me, this is many small steps, but each step is more mechanical and flows naturally, and the general flexibility of multi-caret editing means it is applicable more often in my daily work.
> I can work with this. I add grep '^ D' to filter out any entries which were not deleted, and pipe it through awk '{ print $2 }' to extract just the filenames.
Best hope your file names don’t have spaces in them.
One of the downsides of the shell on Unix is that everything is usually text meant for humans so you have to spend an annoying amount of time dealing with delimiters that make it easy on the eyes but a pain in the ass.
Piping and shell commands are powerful, but generally a bad experience. You have to plan your command, try to run it and understand how all the piping steps work. If you get it wrong, it’s an annoying experience.
As an alternative, try Sublime with multiple selection (or other editors). With multiple selection skills you can transform your lines in a WYSIWYG interactive format which is much easier to work with for me.
(To use it well you need to know the following shortcuts: split selection to lines, select next, and the alt+arrows jumps to next/previous word)
This won’t mean you don’t need to learn how to use shells, but is still pretty fun to use.
Someone should write "become IDE literate" as a response to posts that open with using vim with no extensions and using grep a lot.
I've been using editors that are language aware since at least the late 90s. Depending on the language they'll show me all references, take me to the definition or declaration, stack those jumps so as I follow the links I can pop back a level to where I was. All of this is instant. 1000x faster than opening a shell and trying to grep all the project files for word phrases which, not being language aware, can't tell if one 'foo' is relevant or irrelevant from another 'foo'. They'll also let me refactor various things from renaming a method/class/variable/function and correctly fixing all the related files to doing things like changing class members to getter/setters and other things.
The difference is like using a hammer vs using a nail gun. There are times when the hammer is useful, but given the job of constructing a building, a nail gun will get it done much faster.
If you dislike POSIX sh, consider looking at Plan 9's rc before you look anywhere else (including bash): http://man.9front.org/1/rc
Also, I would advise you to learn about these four tools in depth: sh, sed, awk, and make. Learn when to use each, and don't use one when another would be better.
Learning some kind of common shell language is certainly a knowledge that will serve you well in the widest amount of circumstances. From managing headless servers, to local use on a workstation, to scripting within your U-Boot bootloader, to reverse engineering Linux based IoT/devices that oftentimes include busybox. Shell is included in a lot of places.
I agree that a programmer should know how to do things with shell, but when I read:
> In my workflow, I use Vim as my editor, and Unix as my “IDE”.
I immediately remember how great JetBrains IDEs are. I just can't imagine how someone could refactor code with such accuracy and so little hassle using only Linux vi or any CLI tool...
Nice post, but 2020 and still using an unauthenticated connection (plain netcat) to transfer data including executables (if I understood correctly) is not something you can justify when there are plenty of safe alternatives (e.g. wormhole, syncthing).
Maybe the recipient and the sender just compare hashes via a secure channel, but there's no mention of this.
It still strikes me that after all this time, the next step in the direction of the author is to use an editor like Acme (https://en.wikipedia.org/wiki/Acme_(text_editor)): use the shell to its maximum power, where the output of a command can be edited and used to control the existing interface.
Take for instance this video: https://www.youtube.com/watch?v=4djoOiLste0
The author shows how the output of "git status" is just text: you can modify the text, add "git add" just in front of the files you care about, and add the files to the index. There is no linear loop of command input becoming command output; it's all one big buffer that can feed itself. Consider how there could be another window that perpetually shows the status thanks to a combination of inotify and git status, and you have a git view. Fiddle with the command-line arguments and you can choose whether to show whitespace or not, whether to show a summary view or the full diff, or restrict the list to some files. Have another window where you can write some message, and a GitCommit command in the "Tag" of the window will use the whole buffer as a git commit message; boom, you have 60% of what I use git-cola for.
A bit more can be seen from one of its creators here: https://www.youtube.com/watch?v=dP1xVpMPn8M. The possibilities are truly endless, and I haven't seen anything else that resembles it. There is just no editor that embraces your platform the way Acme does.
No, Emacs is not the same, because Emacs doesn't integrate with your OS; Emacs is an OS unto itself. You can't really say it's integrated with the OS when everything is implemented in a language that only Emacs uses.
Our codebase uses tons of shell scripts and Python. Shell has been great for running installation tasks. Recently we added presubmits with the shellcheck utility to check for style and syntax. Any suggestions for adding unit tests for shell scripts would be great. I'm planning to unify the syntax across the board.
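On unit tests: bats-core and shunit2 are the usual frameworks, and both pair well with shellcheck in presubmits. Even a dependency-free helper gets you started; the `greet` function below is a made-up stand-in for code under test:

```shell
# assert.sh -- minimal test helper for shell scripts
assert_eq() {
    if [ "$1" != "$2" ]; then
        echo "FAIL: expected '$2', got '$1'" >&2
        exit 1
    fi
}

# Example: exercise a function sourced from the script under test
greet() { printf 'Hello, %s\n' "$1"; }
assert_eq "$(greet world)" "Hello, world"
echo "all tests passed"
```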
Sign me up as someone who hates the shell. I've always tried to write scripts in Python; even C++ is so much easier than bash, awk, etc. Bash has no REPL, no unit tests, no debugger; it's anachronistic. Don't get me started on emacs.
Some things, though, have stayed essentially the same for decades:
1) shell commands
2) SQL
3) vi
4) some programming languages: C, Pascal(?), BASIC(?)
Everything else has changed multiple times. Maybe that means it's worth investing in learning them.
[1] https://tldr.sh/
[+] [-] unknown|5 years ago|reply
[deleted]
[+] [-] acabal|5 years ago|reply
It helped a lot that 15+ years ago I committed to running desktop Linux as a daily driver. Especially back then, when things were much rougher on the desktop, using Linux as a daily driver meant occasionally doing something in the shell.
Ultimately, just like learning a programming language, one must have a practical reason, a project, to make it worth the while to learn as you go. For someone interested in learning to use the power of the shell, don't just go read a book or do some exercises or tape a cheat sheet to the wall. Instead, commit to using it for daily tasks for a month, or for maintaining a project entirely in the shell, not even using a GUI file manager. It's tough at first but so worth it!
[+] [-] scaladev|5 years ago|reply
[+] [-] mooreds|5 years ago|reply
One thing the author didn't cover is how you can share reified knowledge when you write shell scripts. (It's the same with other programming languages, but they aren't as easy to write or modify.) That is really really powerful, because you can not only share it with others but also with your future self.
I wrote about this more here: https://letterstoanewdeveloper.com/2019/02/04/learn-the-comm...
[+] [-] AdmiralAsshat|5 years ago|reply
To this day, I've still never quite gotten the hang of the whole "trap" mechanic, though...
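For what it's worth, the everyday use of trap is just cleanup-on-exit; a minimal sketch (POSIX sh; in common shells the EXIT trap also fires on most signal-driven exits):

```shell
#!/bin/sh
# Guarantee cleanup no matter how the script ends: normal completion,
# an error under `set -e`, or the shell exiting on a signal.
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' EXIT

echo "scratch data" > "$tmpfile"
wc -c < "$tmpfile"
# ...work with $tmpfile...
# When the script leaves, the trap removes the temp file automatically.
```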
[+] [-] chubot|5 years ago|reply
I INVERT this, and write a shell script with comments :) That way someone else can reuse your knowledge more easily (and your future self as well).
Examples:
Constructing a big curl command to use the Zulip API, and also using jq for the first time. I used this to easily make a blog post out of a long Zulip thread [1]
https://github.com/oilshell/oil/blob/master/services/zulip.s... (oops some tabs snuck in here)
Figuring out how to use uftrace (and successfully optimizing Oil with it [2]):
https://github.com/oilshell/oil/blob/master/benchmarks/uftra...
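To illustrate the style with a made-up example (not one of the linked scripts): the commands alone are trivial, but the comments carry the knowledge:

```shell
#!/bin/sh
# count-visitors: count unique client IPs in an access log. Kept as a
# script so future-me gets the *why*, not just the command. Assumes the
# common log format, with the client IP in field 1.
log=${1:-/tmp/access.log}

# Demo input so the script runs standalone (normally the log already exists):
[ -f "$log" ] || printf '1.2.3.4 - - GET /\n5.6.7.8 - - GET /\n1.2.3.4 - - GET /\n' > "$log"

# awk pulls field 1 (the IP); sort -u deduplicates; wc -l counts.
# (sort -u instead of sort | uniq: one fewer process, same result here.)
awk '{ print $1 }' "$log" | sort -u | wc -l
```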
Though one issue is that shell scripts don't really specify their environment, but there is a large number of tools growing around containers that can solve this problem. (Basically Docker is being refactored into something more sane; thank you to OCI and others.)
So I hope to integrate the Oil shell and containers more so shell scripts are more reproducible. I mean most of the container tools are already command line tools so in some sense it's already done, but you can imagine closer integration (e.g. not just passing code as text from outside the container to inside the container).
----
I wrote some notes about the documentation issue here: http://www.oilshell.org/blog/2020/02/good-parts-sketch.html#...
And one thing I've wondered is if Oil should literally run shell out of markdown, so you can create executable docs. I can see it being useful, but it might be something you should do with a separate tool that converts markdown code blocks to a shell script...
[1] http://www.oilshell.org/blog/2020/11/more-syntax.html
[2] http://www.oilshell.org/blog/2020/01/parser-benchmarks.html
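Such a markdown-to-script converter can start out tiny; a sketch (the md2sh name is invented, and the fence strings are constructed at runtime only so this block itself stays well-formed):

```shell
# md2sh: print the contents of sh-fenced code blocks in a markdown file.
fence=$(printf '\140\140\140')   # three backticks

md2sh() {
  awk -v open="${fence}sh" -v shut="$fence" '
    $0 == open { inblock = 1; next }  # opening fence: start collecting
    $0 == shut { inblock = 0 }        # closing fence: stop
    inblock                           # inside a block: print the line
  ' "$1"
}

# Tiny demo document:
printf '# notes\n%ssh\necho hello from markdown\n%s\n' "$fence" "$fence" > /tmp/notes.md
md2sh /tmp/notes.md
```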
[+] [-] BillyTheKing|5 years ago|reply
[+] [-] intrepidhero|5 years ago|reply
I agree that IDEs and the tools that accompany them are powerful and can make you more productive. Learning them is an investment that can pay dividends. On the other hand, most of the IDEs I took the time to learn in school are now obsolete. I still use the shell today because those skills are still relevant and have gotten me out of a jam plenty of times. I've chosen to prioritize learning tools that are reliable and lasting even if it costs me some productivity.
[+] [-] tutfbhuf|5 years ago|reply
I don't think a shell should be a complete programming language. If you need a programming language, you're better off using a real one. There is xonsh if you are looking for something like that.
I think there should be a better bash with a very clean and consistent interface. Some of the most commonly used utils, like awk '{ print $2 }', sed, grep, sort, ..., should be included out of the box to improve performance and to provide consistency across distributions.
[+] [-] theon144|5 years ago|reply
Absolutely agreed, shell is in many respects terrible.
>I don't think that a shell should be a complete programming language.
On the contrary, I think shell should be a more complete programming language! Drop the stringly typing and add actual types (hence eliminating 80% of bothersome awksedgrep magic; yes, no need to tell me it's a real tall order), add proper error handling...
I truly value simplicity where possible, but I think that the primary interface that I use to communicate with my computer should be as powerful as possible. Numerous times I've built up a shell pipeline only to realize near the end that I need to do something that shell is horrible at, and had to redo the whole thing in Python from scratch. I cannot really see a reason to limit it.
>Some of the most commonly used utils like awk '{ print $2}', sed, grep, sort, ... should be included out of the box to improve performance and to provide consistency across distributions.
UNIX "philosophy" got a lot of things wrong, but not this one. These common utilities you mention are good especially when they're separate, because it's not the shell's job to improve performance, provide consistency, etc... of a dozen and a half different utilities, a number sure only to grow in time as people discover what commonly used thing they want in their shell. Though I'll give you that `awk '{ print $2 }'` should definitely be its own utility or a built-in.
[+] [-] kaszanka|5 years ago|reply
[+] [-] larkinrichards|5 years ago|reply
When it comes to examples like that posted by the OP, I like the combination of piping/pasting to mate and multi-caret editing for most scenarios where others would reach for awk/xargs.
Here's me following the same example scenario but with multi-caret editing (slowed down slightly):
https://user-images.githubusercontent.com/226503/101993951-6...
Step by step:
To me, this is many small steps, but each step is more mechanical and flows naturally, and the general flexibility of multi-caret editing means it is applicable more often in my daily work.
[+] [-] simonw|5 years ago|reply
Other tools and languages come and go, but the shell and SQL just keep on providing value.
[+] [-] kortilla|5 years ago|reply
Best hope your file names don’t have spaces in them.
One of the downsides of the shell on Unix is that everything is usually text meant for humans so you have to spend an annoying amount of time dealing with delimiters that make it easy on the eyes but a pain in the ass.
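The usual escape hatch is NUL-delimited records; a small demonstration (both GNU and BSD findutils support `-print0` and `xargs -0`):

```shell
# Demo: a filename with a space survives a NUL-delimited pipeline.
dir=$(mktemp -d)
printf 'TODO: fix\n' > "$dir/my notes.txt"

# -print0 separates names with NUL (which cannot appear in a path);
# -0 tells xargs to split on NUL instead of whitespace:
find "$dir" -name '*.txt' -print0 | xargs -0 grep -l 'TODO'

rm -r "$dir"
```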
[+] [-] strulovich|5 years ago|reply
As an alternative, try Sublime with multiple selection (or other editors). With multiple selection skills you can transform your lines in a WYSIWYG interactive format which is much easier to work with for me.
(To use it well you need to know the following shortcuts: split selection to lines, select next, and the alt+arrows jumps to next/previous word)
This won’t mean you don’t need to learn how to use shells, but is still pretty fun to use.
[+] [-] gfxgirl|5 years ago|reply
I've been using editors that are language-aware since at least the late 90s. Depending on the language, they'll show me all references, take me to the definition or declaration, and stack those jumps so that as I follow the links I can pop back a level to where I was. All of this is instant: 1000x faster than opening a shell and trying to grep all the project files for word phrases, which, not being language-aware, can't tell one relevant 'foo' from another irrelevant one. They'll also let me refactor various things, from renaming a method/class/variable/function and correctly fixing all the related files, to things like changing class members to getters/setters.
The difference is like using a hammer vs using a nail gun. There are times when the hammer is useful, but given the job of constructing a building, a nail gun will get it done much faster.
[+] [-] ddevault|5 years ago|reply
If you dislike POSIX sh, consider looking at Plan 9's rc before you look anywhere else (including bash): http://man.9front.org/1/rc
Also, I would advise you to learn about these four tools in depth: sh, sed, awk, and make. Learn when to use each, and don't use one when another would be better.
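My own rough shorthand for two of those (not anything from the rc manual): sed for edits within a line, awk for logic over fields:

```shell
# sed: line-oriented text edits
printf 'name=alice\nname=bob\n' | sed 's/^name=//'

# awk: per-field logic and light computation
printf 'alice 30\nbob 25\n' | awk '$2 > 26 { print $1 }'
```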
[+] [-] megous|5 years ago|reply
[+] [-] gitowiec|5 years ago|reply
I immediately remember how great Jetbrains IDEs are. I just can't imagine how someone could refactor code with such accuracy, and without hassle, using only Linux vi or any CLI tool...
[+] [-] bredren|5 years ago|reply
With multiple contexts, and sometimes complex behavior, leveraging shell seems key to debug and gluing things together.
[+] [-] hiq|5 years ago|reply
Maybe the recipient and the sender just compare hashes via a secure channel, but there's no mention of this.
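For reference, the out-of-band check is one command per side; a sketch using sha256sum from GNU coreutils (macOS ships `shasum -a 256` instead):

```shell
# Each side hashes the file, then the digests are compared over a
# trusted channel (chat, phone, in person). Any change to the file
# changes the digest:
f=$(mktemp)
printf 'the payload\n' > "$f"
sha256sum "$f"
rm "$f"
```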
[+] [-] rakoo|5 years ago|reply
Take for instance this video: https://www.youtube.com/watch?v=4djoOiLste0 The author shows how the output of "git status" is just text: you can edit it, prepend "git add" to the files you care about, and execute that to add those files to the index. There is no linear loop of command input becoming command output; it's all one big buffer that can feed itself. Consider how there could be another window that perpetually shows the status, thanks to a combination of inotify and git status, and you have a git view. Fiddle with the command-line arguments and you can choose whether to show whitespace or not, whether to show a summary view or the full diff, or whether to restrict the list to some files. Have another window where you can write a message, and a GitCommit command in the window's "Tag" will use the whole buffer as the git commit message; boom, you have 60% of what I use git-cola for.
A bit more information can be seen from one of its creators here: https://www.youtube.com/watch?v=dP1xVpMPn8M. The possibilities are truly endless, and I haven't seen anything that resembles it. There is just no editor that embraces your platform the way Acme does.
No, Emacs is not the same, because Emacs doesn't integrate with your OS; Emacs is an OS unto itself. You can't really say it's integrated to the OS when everything is implemented in the language that only Emacs uses.
[+] [-] spicyramen|5 years ago|reply
[+] [-] xx789|5 years ago|reply