
Shell script best practices, from a decade of scripting things

958 points | sharat87 | 3 years ago | sharats.me | reply

490 comments

[+] xelxebar|3 years ago|reply
Hands down, shell scripting is one of my all time favorite languages. It gets tons of hate, e.g. "If you have to write more than 10 lines, then use a real language," but I feel like those assertions are more socially-founded opinions than technically-backed arguments.

My basic thesis is that Shell as a programming language---with its dynamic scope, focus on line-oriented text, and pipelines---is simply a different programming paradigm than languages like Perl, Python, whatever.

Obviously, if your mental model is BASIC and you try to write Python, then you encounter lots of friction and it's easy for the latter to feel hacky, bad and ugly. To enjoy and program Python well, it's probably best to shift your mental model. The same goes for Shell.

What is the Shell paradigm? I would argue that it's line-oriented pipelines. There is a ton to unpack in that, but a huge example where I see friction is the overuse of variables in scripts. Trying to stuff data inside variables, given shell's paucity of data types, is a recipe for irritation. However, if you instead organize all your data in a format that's sympathetic to line-oriented processing on stdin-stdout, then shell will work with you instead of against you.
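For a concrete (entirely made-up) illustration of that style: keep the data as lines on stdout and let each stage transform the stream, instead of stuffing intermediate results into variables.

```shell
# Hypothetical data: "user bytes" records, one per line.
# Aggregate and rank entirely in a pipeline -- no arrays, no loops.
printf '%s\n' 'alice 120' 'bob 340' 'alice 80' |
    awk '{sum[$1] += $2} END {for (u in sum) print u, sum[u]}' |
    sort -k2 -rn
```

Because each stage reads lines and writes lines, any stage can be swapped out or extended without touching the others.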

/2cents

[+] snidane|3 years ago|reply
Shell and SQL make you 10x more productive than any alternative. Nothing even comes close. I've seen people scramble for an hour to write some data munging, then spend another hour running it through a thread pool to utilize those cores, while somebody comfortable in shell writes a parallelized one-liner, rips through GBs of data, and delivers the answer in 15 minutes.

What Python is to Java, Shell is to Python. It speeds you up several times over. I now use inline 'python -c' more often than the Python REPL, as it stores the command in shell history and it is then one fzf search away.
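A sketch of that `python -c` habit (the input here is made up): the whole command lives on one line, and therefore in shell history, recallable next time.

```shell
# Sum a column of numbers with an inline python3 -c instead of
# opening the REPL; the command is recallable from shell history.
printf '3\n1\n2\n' |
    python3 -c 'import sys; print(sum(int(line) for line in sys.stdin))'
# prints 6
```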

While neither Shell nor SQL is perfect, there have been many ideas for improving them, and people can't wait for something new like Oil Shell to get production ready and finally get the shell quoting hell right, or for somebody to fix up SQL, bringing old ideas from Datalog and QUEL into it, fixing the goddamn NULL joins, etc.

But honestly, nothing else even comes close to this 10x productivity increase over the next best alternative. No, thank you, I will not rewrite my 10 lines of sh into Python and explode it into 50 lines of shuffling clunky objects around. I'll instead go and reread the man page on how to write an if expression in bash again.

[+] ilyt|3 years ago|reply
> Hands down, shell scripting is one of my all time favorite languages. It gets tons of hate, e.g. "If you have to write more than 10 lines, then use a real language," but I feel like those assertions are more socially-founded opinions than technically-backed arguments.

It is an "opinion" based on debugging scripts written by people (who might be "you, a few years ago") who don't know the full extent of the death-traps built into the language. Or on actually writing anything more complex.

About the only strong side of shell as a language is the pipe character. Everything else is less convenient at best, actively dangerous at worst.

Sure, "how to write something in a limited language" might be a fun mental exercise, but as someone who has sat in the ops space for a good part of 15 years, it's just a burden.

Hell, I'd rather debug a Perl script than a Bash one...

Yeah, if it is a few pipes and some minor post-processing I'd use it too (the pipe is the easiest way to do that of all the languages I've seen), but that's about it.

It is nice to write one-liners on the command line, but the characteristics that make it nice there make it a worse programming language. A bit like Perl in that regard.

[+] ducktective|3 years ago|reply
>However, if you instead organize all your data in a format that's sympathetic to line-oriented processing on stdin-stdout, then shell will work with you instead of against.

Not even that is necessary. Just use structured data formats like json. If you are consuming some API that is not json but still structured, use `rq` to convert it to json. Then use `jq` to slice and dice through the data.
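A small sketch of that workflow, with a made-up payload: `jq -r` turns structured JSON into plain lines that the rest of the pipeline can consume as usual.

```shell
# Hypothetical API response; jq extracts one field per line,
# ready for the ordinary line-oriented tools downstream.
echo '[{"name":"alice","id":1},{"name":"bob","id":2}]' |
    jq -r '.[].name' |
    sort
```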

dmenu + fzf + jq + curl is my bread and butter in shell scripts.

However, I still haven't managed to find a way to do a bunch of tasks concurrently. No, xargs and parallel don't cut it. Just give me an opinionated way to do this that is easily inspectable, loggable and debuggable. Currently I hack together functions in a `((job_i++ < max_jobs)) || wait -n` spaghetti.
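For what it's worth, the throttling idiom alluded to above can be written out roughly like this (a sketch; `wait -n` needs bash 4.3+, and the `sleep` stands in for a real task):

```shell
#!/usr/bin/env bash
max_jobs=4
for task in 1 2 3 4 5 6 7 8; do
    # Throttle: once max_jobs tasks are running, block until one exits.
    while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]; do
        wait -n
    done
    sleep 0.1 &   # stand-in for the real task
done
wait              # collect the stragglers
```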

[+] marklgr|3 years ago|reply
> "If you have to write more than 10 lines, then use a real language"

I swear, there should be an HN rule against those. It pollutes every single Shell discussion, bringing nothing to it and making it hard for others to discuss the real topic.

[+] usrbinbash|3 years ago|reply
> What is the Shell paradigm? I would argue that it's line-oriented pipelines.

Which Python can do relatively well, using the `subprocess` module.

Here is an example, including a https://porkmail.org/era/unix/award (useless use of cat), that finds all title lines in README.md and uppercases them with `tr`:

    import subprocess as sp
    cat = sp.Popen(
        ["cat", "README.md"],
        stdout=sp.PIPE,
    )
    grep = sp.Popen(
        ["grep", "#"],
        stdin=cat.stdout,
        stdout=sp.PIPE,
    )
    tr = sp.Popen(
        ["tr", "[:lower:]", "[:upper:]"],
        stdin=grep.stdout,
        stderr=sp.PIPE,
        stdout=sp.PIPE,
    )
    out, err = tr.communicate()
    print(out.decode("utf-8"), err.decode("utf-8"))
Is this more complicated than doing it in bash? Certainly. But on the other side of that coin, it's a lot easier in Python to run a complex regular expression (maybe depending on a command-line argument) over one of those streams, use the result in an HTTP request via the `requests` module, pack the results into a diagram rendered as a PNG, and send it via email.

Yes, that is a convoluted example, but it illustrates the point I am trying to make. Everything outlined could probably be done in a bash script, but I am pretty certain it would be much harder, and much more difficult to maintain, than doing it in Python.

Bash is absolutely fine up to a point. And with enough effort, bash can do extremely complex things. But as soon as things get more complex than standard unix tools, I'd rather give up the comfort of having specialized syntax for pipes and filehandles, and write a few more lines handling those, if that means I can do the more complex stuff easily using the rich module ecosystem of Python.

[+] throwawaaarrgh|3 years ago|reply
I would agree, with the caveat that Bourne Shell isn't really a programming language, and has to be seen as such to be loved.

Bourne Shell Scripting is literally a bunch of weird backwards compatible hacks around the first command line prompt from 1970. The intent was to preserve the experience of a human at a command prompt, and add extra functionality for automation.

It's basically a high-powered user interface. It emphasizes what the operator wants for productivity, instead of the designer in her CS ivory tower of perfection. You can be insanely productive on a single line, or paste that line into a file for repeatability. So many programmers fail to grasp that programming adds considerations that the power user doesn't care about. The Shell abstracts away all that unnecessary stuff and just lets you get simple things done quickly.

[+] AtlasBarfed|3 years ago|reply
Hard Disagree. Bash programming:

- no standard unit testing

- how do you debug except with printlns? Fail.

- each line usually takes a minimum of 10 minutes to debug unless you've done bash scripting for... ten years

- basic constructs like the arg array are broken once you have special chars and spaces and want to pass those args to other commands. and UNICODE? Ha.

- standard library is nil, you're dependent on a hodgepodge of possibly installed programs

- there is no dependency resolution or auto-install of those programs or libraries or shell scripts. since it is so dependent on binary programs, that's a good thing, but also sucks for bash programmers

- horrid rules on type conversions, horrid syntax, space-significant rules

- as TFA shows, basic error checking and other conventions are horrid; yeah, I want a crap 20-line header for everything

- effective bash is a bag of tricks. Bag of tricks programming is shit. You need to do ANYTHING in it for parsing, etc? Copy paste in functions is basically the solution.

- I'm not going to say interpreter errors are worse than C++ errors, but it's certainly not anything good.

Honestly since even effing JAVA added a hashbang ability, I no longer need bash.

Go ahead, write some bash autocompletion scripts in bash. Lord is that awful. Try writing something with a complex options / argument interface and detect/parse errors in the command line. Awful.

Bash is basically software engineering from the 1970s, oh yeah, except take away the word "engineering". Because the language is actively opposed to anything that "engineering" would entail.

[+] bitofhope|3 years ago|reply
There are workloads where shell scripts are the so-called right tool for a job. All too often I see people writing scripts in "proper" languages and calling os.system() on every other line. Shell scripts are good for gluing programs together. It's fine to use them for that.
[+] jayd16|3 years ago|reply
Eh, this is true, but I don't think it's because of the programming model of bash. I feel like this is conflating the *nix ecosystem with bash. If every programming language were configured by default and had access to standard unix tools with idiomatic bindings, Shell's advantages would be greatly reduced. You'd still get a scripting language with some neat tricks, but I don't think I would reach for it nearly as often if other things were an option.

And sure sure you can call any process from a language but the assumptions are different. No one wants to call a Java jar that has a dependency on the jq CLI app being available.

[+] bheadmaster|3 years ago|reply
I also like Bash - it's a powerful language, especially when combined with a rich ecosystem of external commands that can make your life easier, e.g. GNU Parallel.

Handling binary data can also work in Bash, provided that you just use it as a glue for pipelines between other programs (e.g. feeding video data into ffmpeg).

One time, while working on a computer vision project, I needed to hack up a video-capture-and-upload program for gathering training data during a certain time of day. It took me about 20 minutes and 50 lines of Bash to set up the whole thing, test it, and be sure it worked.

[+] mrlemke|3 years ago|reply
To add to this, it's designed to work in conjunction with small programs. You don't write everything using bash (or whatever shell) built-ins; that will feel like a crappier Perl. If there is some part of your script where you're struggling to use an existing tool (e.g. built-ins, system utils), write your own small program to handle that part of the stream and add it into your pipe. Since shell is a REPL, you get instant feedback and you'll know if it's working properly.

It's also important to learn your system's environment too. This is your "standard library", and it's why POSIX compatibility is important. You will feel shell is limited if you don't learn how to use the system utilities with shell (or if your target system has common utilities missing).

As an example of flexibility, you can use shell and system utilities in combination with CGI and a basic web server to send and receive text messages on an Android phone with termux. Similar to a KDE Connect or Apple's iMessage.

[+] wutbrodo|3 years ago|reply
> I feel like those assertions are more socially-founded opinions than technically-backed arguments

You think the complaints about rickety, unintuitive syntax are "socially founded"? I can't think of another language that has so many pointless syntax issues every time I revisit it. I haven't seen a line of Scheme in over a decade, and I'm still fairly sure I could write a simple if condition with less likelihood of getting it wrong than Bash.

I came at it from the other end, writing complex shell scripts for years because of the intuition that python would be overkill. But there was a moment when I realized how irrational this was: shell languages are enough of a garbage fire that Python was trivially the better choice for my scripts the minute flow control enters the picture.

[+] kazinator|3 years ago|reply
> with it's dynamic scope

Bash has dynamic scope with its local variables.

The standard POSIX language has only global variables: one pervasive scope.

[+] dimitar|3 years ago|reply
Line-oriented pipelines are great and have their place, but I'm still sticking to a high-level general-purpose programming language (let's abbreviate this as HGPPL) for scripts longer than 10 lines, for the following reasons:

* I like the HGPPL data structures and the convenient libraries for manipulating them (in my case this is Clojure, which has a great core library). Bash has indexed and associative arrays.

* Libraries for common data formats are also used in a consistent way in the HGPPL. I don't have to remember a DSL for every data format - i.e. how to use jq when dealing with JSON. Similarly for YAML, XML, CSVs; I can also do templating for configuration files for nginx and so on. I've seen way too many naive attempts to piece together valid YAML from strings in bash to know it's just not worth doing.

* I don't want to switch programming language from the main application, and I find it helps "break down silos" when everyone can read and contribute to some code. If a team is just sysadmins - sure, make bash the official language and stick to it.

* I can write scripts without repeating myself using namespaces and higher-order functions, which is my choice of paradigm for abstractions; others write cleanly with classes. You can follow best practices and avoid the use of ENV vars, but that requires extra discipline, and it is hard to enforce on others in the type of places where bash is used.

[+] Spivak|3 years ago|reply
Also, the fact that $() invokes a subparser, which lets you use double quotes inside an already double-quoted expression, is something I miss when using Python f-strings.
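A quick illustration of that subparser behavior: the inner double quotes live inside `$()` without terminating the outer double-quoted string.

```shell
# The $() starts a fresh parsing context, so nesting "..." works:
echo "result: $(echo "two words") end"   # prints: result: two words end
```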
[+] throw10920|3 years ago|reply
> My basic thesis is that Shell as a programming language---with it's dynamic scope, focus on line-oriented text, and pipelines---is simply a different programming paradigm than languages like Perl, Python, whatever.

This argument is essentially the same as "dynamic typing is just a different programming paradigm than static typing, and not intrinsically better or worse" - but to an even greater extent, because bash isn't really typed at all.

To those who think that static (and optional/gradual) typing brings strong benefits with little downsides over dynamic typing and becomes increasingly important as the size of a program increases, bash is simply unacceptable for any non-trivial program.

Other people (like yourself) that think that static typing isn't that important and "it's just a matter of preference" will be fine with an untyped language like bash.

Unfortunately, it's really hard to find concrete, clear evidence that one typing paradigm is better than the other, so we can't really make a good argument for one or the other using science.

However, I can say that you're conflating different traits of shell languages here. You say "dynamic scope, focus on line-oriented text, and pipelines" - but each of those is very different, and you're missing the most contested one (typing). Shell's untypedness is probably the biggest complaint about it, and the line-oriented text paradigm is really contentious, but most people don't care very much about the scoping, and lots of people like the pipelines feature.

A shell language that was statically-typed, with clear scoping rules, non-cryptic syntax, structured data, and pipelines would likely be popular and relatively non-controversial.

[+] strunz|3 years ago|reply
Eh, as soon as you have to deal with arrays and hash tables/dicts or something like JSON, bash becomes very painful and hard to read.
[+] psychstudio|3 years ago|reply
Kindred spirit. I particularly love variable variables and exploit them often. Some would call it abuse I guess.
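For readers unfamiliar with the trick: bash supports "variable variables" via the `${!name}` indirect expansion (and `declare -n` namerefs in bash 4.3+). A minimal sketch:

```shell
#!/usr/bin/env bash
count_red=3
var=count_red        # holds the *name* of another variable
echo "${!var}"       # indirect expansion: prints 3
```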
[+] arendtio|3 years ago|reply
The biggest issue is that error handling is completely broken in POSIX shell scripting (including Bash). Even errexit doesn't work as any normal language would implement it (one could say it is broken by design).

So if you don't care about error cases, everything is fine, but if you do, it gets ugly really fast. And that is the reason why other languages are probably better suited if you want to write something bigger than 10 lines.

However, I have to admit, I don't follow that advice myself...

[+] krylon|3 years ago|reply
I sometimes regret I never learned to "really" write shell scripts. I stumbled across Perl early on, and for anything more complex than canned command invocation(s) or a simple loop, I usually go for Perl.

There is something to be said in favor of the shell being always available, but Perl is almost always available too. FreeBSD does not have it as part of the base system, but OpenBSD does, and most Linux distros do, too.

But it is fun to connect a couple of simple commands via pipes and create something surprisingly complex. I don't do it all the time, but it happens.

[+] gtowey|3 years ago|reply
As someone who has used a lot of shell over my career, I do love it as a utility and a programming paradigm.

However the biggest issues I've had is that the code is really hard to test, error handling in shell isn't robust, and reusability with library type methods is not easy to organize or debug.

Those are deal breakers for me when it comes to building any kind of non trivial system.

[+] floitsch|3 years ago|reply
Shell scripting also inspired some choices (especially syntax) of the Toit language (toitlang.org).

Clearly, it's for a different purpose, and there are some things that wouldn't work in a general-purpose language that isn't as focused on line-based string processing, but we are really happy with the things we took from bash.

[+] TristanBall|3 years ago|reply
Aye.. I've been saying for years that shell scripting is how I meditate, and I'm only mostly joking

Shell quoting though, Aieeee...

I find I have to shift gears quite substantially moving from shell or powershell to anything else...

"I'll just pipe the output of this function into.. oh, right"

[+] cryptonector|3 years ago|reply
I've written a lot of shell scripts. I have my own best practices that work for me. I don't like it one bit. I mean, it's enjoyable to write shell scripts, it's just not enjoyable to deal with them long-term.
[+] Beltalowda|3 years ago|reply
> Use bash. Using zsh or fish or any other, will make it hard for others to understand / collaborate. Among all shells, bash strikes a good balance between portability and DX.

I think fish is quite a bit different in terms of syntax and semantics (I'm not very familiar with it), but zsh is essentially the same as bash except without most of the needless footguns and awkwardness. zsh also has many more advanced features, which you don't need to use (and many people are unaware of them anyway), but will very quickly become useful; in bash all sorts of things require obscure incantations and/or shell pipelines that almost make APL seem obvious in comparison.

In my experience few people understand bash (or POSIX sh) in the first place, partly because everything is so difficult and full of caveats. Half my professional shell scripting experience on the job is fixing other people's scripts. So might as well use something that doesn't accidentally introduce bugs every other line.

Most – though obviously far from all – scripts tend to be run in environments you control; portability is often overrated and not all that important (except when it is of course). Once upon a time I insisted on POSIX sh, and then I realised that actually, >90% of the scripts I wrote were run just by me or run only in an environment otherwise under my control, and that it made no sense. I still use POSIX sh for some public things I write, when it makes sense, but that's fairly rare.

I think bash is really standing in the way of progress, whether that progress is in the form of fish, zsh, oil shell, or something else, because so many people conflate "shell" with "bash", similar to how people conflate "Google" with "search" or "git" with "GitHub" (to some degree).

[+] hiepph|3 years ago|reply
I can't really stand Bash's arcane syntax; it drains my brain power (and time spent consulting the manual) every time I have to work with it. Switching to Fish has been a breath of fresh air for me. I think some people who want to use only Bash need to open their conservative minds. All of my personal shell scripts are now converted to Fish. If I want to run some POSIX-compatible script, I just use `bash scripts.sh`.

Of course Bash is ubiquitous so I use them whenever I can in the company. A golden rule for me is: if it has more than 50 lines then I should probably write in a decent programming language (e.g. Ruby). It makes maintenance so much easier.

[+] benreesman|3 years ago|reply
A little personal color: I’m kind of a terminal tweak-fanatic but I’ve stuck with bash.

Ten years or so ago the cool kids were using zsh: which is in general a pretty reasonable move, it’s got way more command-line amenities than bash (at least built in).

Today fish is the fucking business: fish is so much more fun as a CLI freak.

But I guess I’ve got enough PTSD from when k8s or its proprietary equivalents get stuck that I always wanted to be not only functional but fast in outage-type scenarios, so I kept bash as a daily driver.

Writing shell scripts of any kind is godawful, and the equivalent Python is the code you want to own, but shell’s universality is a real selling point. It’s like why I keep half an eye on Perl 5 even though I loathe it: it may suck, but it’s always there when the klaxon is going off.

The best possible software is useless if it’s not installed.

[+] pizza234|3 years ago|reply
I don't know fish, but I don't consider zsh a step in the right direction, as it tries to be just a cleaned up Bash, which is not enough.

There is a general problem in the fact that a radical evolution of glue languages wouldn't be popular because devs would rather use Python, and small evolutions (i.e. zsh) wouldn't be popular either, because they end up being confusing (since they're still close to Bash) while not bringing significant advantages.

I'm curious why there haven't been attempts to write a modern glue language (mind that languages like Python don't fit this class). I guess that Powershell (which I don't know, though) has been the only attempt.

[+] aasasd|3 years ago|reply
I think the article means using Bash for scripting, while the reader could use anything they want interactively. That's what I do—I use zsh, but I don't script in zsh.
[+] ilyt|3 years ago|reply
> Most – though obviously far from all – scripts tend to be run in environments you control; portability is often overrated and not all that important (except when it is of course)

If you're at that spot, don't use shell in the first place but whatever other scripting language your team uses. Well, unless it's "pipe this to that to that", sh has no parallel here

[+] selectnull|3 years ago|reply
> Use the .sh (or .bash) extension for your file. It may be fancy to not have an extension for your script, but unless your case explicitly depends on it, you’re probably just trying to do clever stuff. Clever stuff are hard to understand.

I don't agree with this one. When I name my script without an extension (btw, .sh is fine, .bash is ugly) I want my script to look just like any other command: as a user I do not care what language the program is written in, I care about its output and what it does.

When I develop a script, I get the correct syntax highlighting because of the shebang, so the extension doesn't matter.

The rest of the post is great.

[+] throwawaaarrgh|3 years ago|reply
"Use set -o errexit"

Only if it doesn't matter that the script fails non-gracefully. Some scripts are better to either have explicit error handling code, or simply never fail. In particular, scripts you source into your shell should not use set options to change the shell's default behavior.

"Prefer to use set -o nounset."

ALWAYS use this option. You can test for a variable that might not be set with "${FOO:-}". There is no real downside.
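That is, under `set -u` (nounset) the `${FOO:-}` form substitutes an empty default instead of aborting on the unset variable, so you can still test it safely:

```shell
set -u
# Without the :- default, referencing the unset FOO would abort here.
if [ -z "${FOO:-}" ]; then
    echo "FOO is unset or empty"
fi
```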

"Use set -o pipefail."

Waste of time. You will spend so much time debugging your app from random pipe failures that actually didn't matter. Don't use this option; just check the output of the pipe for sane values.

"Use [[ ]] for conditions"

No!!! Only use that for bashisms where there's no POSIX alternative and try to avoid them wherever possible. YAGNI!

"Use cd "$(dirname "$0")""

Use either "$(dirname "${BASH_SOURCE[0]}")" or grab a POSIX readlink -f implementation.
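A sketch of the `BASH_SOURCE` variant: unlike `$0`, it still names the script file when the script is sourced rather than executed.

```shell
#!/usr/bin/env bash
# Resolve the directory containing this script (bash-specific).
script_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
echo "running from: $script_dir"
```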

"Use shellcheck."

This should have been Best Practice #1. You will learn more about scripting from shellcheck than 10 years worth of blog posts. Always use shellcheck. Always.

Also, don't use set -o nounset when set -u will do. Always avoid doing something "fancy" with a Bashism if there's a simpler POSIX way. The whole point of scripts is for them to be dead simple.

[+] suprjami|3 years ago|reply
Use a linter.

Pass all scripts through https://www.shellcheck.net/ or use `shellcheck` on the commandline.

Learn the things it tells you and implement them in future scripts.

[+] Xophmeister|3 years ago|reply
There's a bug in his template.

He suggests to `set -eu`, which is a good idea, but then immediately does this:

    if [[ "$1" =~ ^-*h(elp)?$ ]]; ...
If the script is given no arguments, this will exit with an unbound variable error. Instead, you want something like this:

    if [[ "${1-}" =~ ^-*h(elp)?$ ]]; then
[+] xelxebar|3 years ago|reply
> set -o errexit

Unfortunately, `errexit` is fairly subtle. For example

    [ "${some_var-}" ] && do_something
is a standard way to `do_something` only when `some_var` is non-empty. With `errexit`, naively, this should fail when `some_var` is empty, since `false && anything` is always false. However, `errexit` in later versions of Bash (and dash?) ignores this case, since the idiom is nice.

However! If that's the last line of a function, then the function's return code will inherit the exit code of that line, meaning that

    f(){ [ "${some_var-}" ] && do_something;}; f
will actually trigger `errexit` when `some_var` is empty, despite the code being functionally equivalent to the above, non-wrapped call.

Anyway, there are a few subtleties like this that are worth being aware of. This is a good, but dated, reference: https://mywiki.wooledge.org/BashFAQ/105

[+] ndsipa_pomu|3 years ago|reply
I'm not convinced about having shell scripts end with ".sh" as you may be writing a simple command style script and shouldn't have to know or worry about what language it's using.

I'm a fan of using BASH3 boilerplate: https://bash3boilerplate.sh/

It's standalone, so you just start a script using it as a template and delete the bits you don't want. To my mind, the best feature is the consistent logging functions: you're encouraged to put in lots of debug commands to output variable contents, and when you change LOG_LEVEL, all the extraneous info doesn't get shown, so there's no need to remove the debug statements at all.

The other advantage is the option parsing, although I don't like the way that options have to have a short option (e.g. -a) - I'd prefer to just use long options.

[+] rgrau|3 years ago|reply
> If appropriate, change to the script’s directory close to the start of the script.

> And it’s usually always appropriate.

I wouldn't think so. You don't know where your script will be called from, and many times the parameters to the script are file paths, which are relative to the caller's path. So you usually don't want to do it.

I collected many tips&tricks from my experience with shell scripts that you may also find useful: https://raimonster.com/scripting-field-guide/

[+] bluetomcat|3 years ago|reply
Use the shell only if your script is mostly about calling other programs and filtering and redirecting their output. That's what the syntax of these languages is optimised for. As soon as you need any data manipulation (i.e. arrays, computation, etc.) it becomes a pain and Python is the much better fit.
[+] jph|3 years ago|reply
I favor POSIX and dash over bash, because POSIX is more portable.

If a shell script needs any kind of functionality beyond POSIX, then that's a good time to upgrade to a higher-structure programming language.

Here's my related list of shell script tactics:

http://github.com/sixarm/unix-shell-script-tactics

[+] lhoursquentin|3 years ago|reply
> [[ ]] is a bash builtin, and is more powerful than [ ] or test.

Agreed on the powerful bit, however [[ ]] is not a "builtin" (whereas [ and test are builtins in bash), it's a reserved word which is more similar to if and while.

That's why [[ ]] can break some rules that builtins cannot, such as `[[ 1 = 1 && 2 = 2 ]]` (vs `[ 1 = 1 ] && [ 2 = 2 ]`, or `[ 1 = 1 -a 2 = 2 ]`, -a being deprecated).

Builtins should be thought of as common commands (like ls or xargs), since they cannot bypass fundamental shell parsing rules (assignment builtins being an exception). The main advantages of being a builtin are speed (no fork needed) and access to the current shell process environment (e.g. read being able to assign a variable in the current process).

[+] lockedinspace|3 years ago|reply
This is not a best-practices guide; please refer instead to: https://mywiki.wooledge.org/BashGuide

For example, using cd "$(dirname "$0")" to get the script's location is not reliable; you could use a more robust option such as "$(dirname "${BASH_SOURCE[0]}")".

[+] bheadmaster|3 years ago|reply
> For copy-paste: if [[ -n "${TRACE-}" ]]; then set -o xtrace; fi

> People can now enable debug mode, by running your script as TRACE=1 ./script.sh instead of ./script.sh.

The above "if" condition will set xtrace even when user explicitly disables trace by setting TRACE=0.

A correct way of doing this would be:

    if [[ "${TRACE-0}" == "1" ]]; then set -o xtrace; fi
[+] grumbel|3 years ago|reply
What would be the justification for 'cd "$(dirname "$0")"'? Going to the script's directory does not seem very helpful. If I don't care about the current directory, I might as well go to '/' or a temporary directory; when I do care about it, I'd better stay in it, or interpreting relative command-line arguments is going to get difficult. When symbolic links are involved, dirname will also give the wrong directory.
[+] ndsipa_pomu|3 years ago|reply
I think BASH scripting is the opposite of riding a bike - you end up re-learning it almost every time you need to do it
[+] ghostoftiber|3 years ago|reply
Instead of implementing a -h or --help, consider using some code like "if nothing else matches, display the help". The asterisk is for this purpose.

  while getopts :hvr:e: opt
  do
      case $opt in
          v)
              verbose=true
              ;;
          e)
              option_e="$OPTARG"
              ;;
          r)
              option_r="$option_r $OPTARG"
              ;;
          h)
              usage
              exit 1
              ;;
          *)
              echo "Invalid option: -$OPTARG" >&2
              usage # call some echos to display docs or something...
              exit 2
              ;;
      esac
  done
[+] erlkonig|3 years ago|reply
Relying on errexit to save one from disaster is also often fatal, except for exceedingly simple scripts. Inside many different kinds of control structures, errexit is disabled, and it usually just provides a false sense of security.

For someone who knows errexit can't be trusted, and codes defensively anyway, it's fine.