Don't output ANSI colour codes directly - your output could be redirected to a file, or perhaps the user simply prefers no colour. Use tput instead, and add a little snippet like this to the top of your script:
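A sketch of such a snippet, reconstructed from the description below (the exact original isn't shown here, and the usage line is illustrative):

```shell
# If tput is missing, stdout is not a terminal, or NO_COLOR is set,
# shadow tput with a no-op shell function so later calls do nothing.
if ! command -v tput > /dev/null 2>&1 || ! [ -t 1 ] || [ -n "${NO_COLOR-}" ]; then
  tput() { :; }
fi

# Later in the script, tput calls are safe either way:
printf '%sError:%s something went wrong\n' "$(tput setaf 1)" "$(tput sgr0)" >&2
```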
This checks that the tput command exists (using the bash 'command' builtin rather than which(1) - surprisingly, which can't always be relied upon to be installed even on modern GNU/Linux systems), that stdout is a tty, and that the NO_COLOR env var is not set. If any of these conditions are false, a no-op tput function is defined.
This little snippet of setup lets you sprinkle tput invocations through your script knowing that it's going to do the right thing in any situation.
This reads like what I've named "consultantware": a type of software developed by security consultants who are eager to write helpful utilities but have no idea about the standards for how command-line software behaves on Linux.
It ticks so many boxes:
* Printing non-output information to stdout (usage information is not normal program output, use stderr instead)
* Using copious amounts of colours everywhere to draw attention to error messages.
* ... Because you've flooded my screen with an even larger amount of irrelevant noise which I don't care about (what is being run).
* Coming up with a completely custom and never before seen way of describing the necessary options and arguments for a program.
* Trying to auto-detect the operating system instead of just documenting the non-standard dependencies and providing a way to override them (inevitably extremely fragile and makes the end-user experience worse). If you are going to implement automatic fallbacks, at least provide a warning to the end user.
* ... All because you've tried to implement a "helpful" (but unnecessary) feature of a timeout which the person using your script could have handled themselves instead.
* pipefail when nothing is being piped (pipefail is not a "fix", it is an option; whether it is appropriate is dependent on the pipeline, and it's not something you should be blanket-applying to your codebase)
* Spamming output in the current directory without me specifying where you should put it or expecting it to even happen.
* Using set -e without understanding how it works (and where it doesn't work).
* #!/bin/bash instead of #!/usr/bin/env bash
* [ instead of [[
* -z instead of actually checking how many arguments you were passed, and trusting the end user if they do something weird like pass an empty string to your program
* echo instead of printf
* `print_and_execute sdk install java $DEFAULT_JAVA_VERSION` who asked you to install things?
* `grep -h "^sdk use" "./prepare_$fork.sh" | cut -d' ' -f4 | while read -r version; do` You're seriously grepping shell scripts to determine what things you should install?
* Unquoted variables all over the place.
* Not using mktemp to hold all the temporary files and an exit trap to make sure they're cleaned up in most cases.
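For the last point, the conventional shape looks something like this (a minimal sketch):

```shell
#!/usr/bin/env bash

# One private scratch directory for every temporary file the script makes...
tmpdir=$(mktemp -d) || exit 1

# ...removed on exit whether the script succeeds, fails, or is interrupted
# (an EXIT trap does not fire on SIGKILL, hence "in most cases").
trap 'rm -rf -- "$tmpdir"' EXIT

printf 'intermediate data\n' > "$tmpdir/stage1.txt"
```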
BOFH much? It’s not as if this script is going to be used by people that have no idea what is going to happen. It’s a script, not a command.
Your tone is very dismissive. Instead of criticism all of these could be phrased as suggestions instead. It’s like criticising your junior for being enthusiastic about everything they learned today.
> pipefail when nothing is being piped (pipefail is not a "fix" it is an option
I think it’s pretty good hygiene to set pipefail in the beginning of every script, even if you end up not using any pipes. And at that point is it that important to go back and remove it only to then have to remember that you removed it once you add a pipe?
On the scale of care, “the script can blow up in surprising ways” severely outweighs “error messages are in red.” Also, as someone else pointed out, what if I’m redirecting to a file?
It is impossible to write a safe shell script that does automatic error checking while using the features the language claims are available to you.
Here’s a script that uses real language things like a function and error checking, but which also prints “oh no”:
set -e

f() {
  false
  echo oh
}

if f
then
  echo no
fi
set -e is off when your function is called as a predicate. That’s such a letdown from expected- to actual-behavior that I threw it in the bin as a programming language. The only remedy is for each function to be its own script. Great!
In terms of sh enlightenment, one of the steps before getting to the above is realizing that every time you use “;” you are using a technique to jam a multi-line expression onto a single line. It starts to feel incongruous to mix single line and multi line syntax:
# weird
if foo; then
  bar
fi

# ahah
if foo
then
  bar
fi
Writing long scripts without semicolons felt refreshing, like I was using the syntax in the way that nature intended.
Shell scripting has its place. Command invocation with sh along with C functions is the de-facto API in Linux. Shell scripts need to fail fast and hard though and leave it up to the caller (either a different language, or another shell script) to figure out how to handle errors.
https://github.com/containerd/nerdctl/blob/main/extras/rootl...
I have since copied this pattern for many scripts: logging functions, grouping all global vars and constants at the top and creating subcommands using shift.
The following check for gtimeout means that other OSs that don't have the expected behaviour in either command won't break the script; you'll just get a warning message that isn't terribly relevant to them (but is more helpful than simply failing, or silently running without timeout/gtimeout). Perhaps improving that message would be the better option.
Though for that snippet I would argue for testing for the command rather than the OS (unless Macs or some other common arrangement has something incompatible in the standard path with the same command name?).
For rarely run scripts, consider checking if required flags are missing and querying for user input.
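Testing for the command could look like this (a sketch; the warning wording and variable name are mine):

```shell
# Prefer checking for the command itself over sniffing the OS name.
if command -v timeout > /dev/null 2>&1; then
  timeout_cmd=timeout
elif command -v gtimeout > /dev/null 2>&1; then
  timeout_cmd=gtimeout   # GNU coreutils installed via Homebrew on macOS
else
  timeout_cmd=
  echo "warning: timeout/gtimeout not found; running without a time limit" >&2
fi

# Run with a limit only when one is available.
${timeout_cmd:+$timeout_cmd 10} sleep 1
```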
Power users already know to always try `-h / --help` first, but this way even people who are less familiar with the command line can use your tool.
If it's a script that's run very rarely or only once, entering the fields sequentially could also save time compared to the common `try to remember flags -> error -> check help -> success` flow.
Use a better programming language. Go, Typescript, Rust, Python, and even Perl come to mind.

Bun has similar features: https://bun.sh/docs/runtime/shell
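A sketch of that flow (the flag and variable names are invented):

```shell
#!/usr/bin/env bash

host=${1-}   # optionally taken from the first argument

# Rarely-run script: ask interactively instead of erroring out.
# (read -p only prints the prompt when stdin is a terminal.)
if [ -z "$host" ]; then
  read -r -p "Target host: " host
fi

echo "deploying to $host"
```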
> shell scripts are the wrong solution for anything over ~50 lines of code.
I don't think LOC is the correct criterion.
I do solve many problems with bash and I enjoy the simplicity of shell coding. I even have long bash scripts. But I do agree that shell scripting is the right solution only if
* you can solve the problem quickly
* you don't need data structures
* you don't need math
* you don't need concurrency
Meanwhile, 10 year old Bash scripts I've written still run unmodified.
Winner by a mile (from a software-longevity and low-maintenance perspective at least): Bash
If you want a great script user experience, I highly recommend avoiding the use of pipefail. It causes your script to die unexpectedly with no output. You can add traps and error handlers and try to dig the offending failed intermediate pipe out of PIPESTATUS just to tell the user why the program is exiting unexpectedly, but you can't resume execution from where the exception happened. You're also now writing a complicated-ass program that should probably be in a more complete language.
Instead, just check $? and whether a pipe's output has returned anything at all ([ -z "$FOO" ]) or if it looks similar to what you expect. This is good enough for 99% of scripts and allows you to fail gracefully or even just keep going despite the error (which is good enough for 99.99% of cases). You can also still check intermediate pipe return status from PIPESTATUS and handle those errors gracefully too.
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
This is based on this SO answer: https://stackoverflow.com/questions/59895/how-do-i-get-the-d...
I never got why Bash doesn't have a reliable "this file's path" feature and why people always take the current working directory for granted!
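In bash that might look like the following sketch (the log file and pattern are stand-ins):

```shell
# Sample input standing in for real log data.
log=$(mktemp)
printf 'ok\nERROR: disk full\nok\n' > "$log"

# No pipefail: inspect the result instead of dying mid-script.
out=$(grep 'ERROR' "$log" | head -n 3)
if [ -z "$out" ]; then
  echo "no ERROR lines found (or grep failed); carrying on" >&2
fi

# PIPESTATUS (bash-specific) still records each stage of the last pipeline.
grep 'ERROR' "$log" | head -n 3 > /dev/null
echo "grep exited with status ${PIPESTATUS[0]}"

rm -f -- "$log"
```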
Every time I see a “good” bash script it reminds me of how incredibly primitive every shell is other than PowerShell.
Validating parameters - a built in declarative feature! E.g.: ValidateNotNullOrEmpty.
Showing progress — also built in, and doesn’t pollute the output stream so you can process returned text AND see progress at the same time. (Write-Progress)
Error handling — Try { } Catch { } Finally { } works just like with proper programming languages.
Platform specific — PowerShell doesn’t rely on a huge collection of non-standard CLI tools for essential functionality. It has built-in portable commands for sorting, filtering, format conversions, and many more. Works the same on Linux and Windows.
Etc…
PS: Another super power that bash users aren’t even aware they’re missing out on is that PowerShell can be embedded into a process as a library (not an external process!!) and used to build an entire GUI that just wraps the CLI commands. This works because the inputs and outputs are strongly typed objects so you can bind UI controls to them trivially. It can also define custom virtual file systems with arbitrary capabilities so you can bind tree navigation controls to your services or whatever. You can “cd” into IIS, Exchange, and SQL and navigate them like they’re a drive. Try that with bash!
I also hate bash scripting, and as far as Unix shell go, bash is among the best. So many footguns... Dealing with filenames with spaces is a pain, and files that start with a '-', "rm -rf" in a script is a disaster waiting to happen unless you triple check everything (empty strings, are you in the correct directory, etc...), globs that don't match anything, etc...
But interactively, I much prefer Unix shells over PowerShell. When you don't have edge cases and user input validation to deal with, these quirks become much more manageable. Maybe I am lacking experience, but I find PowerShell uncomfortable to use, and I don't know if it has all these fancy interactive features many Unix shell have nowadays.
What you are saying essentially is that PowerShell is a better programming language than bash, quite a low bar actually. But then you have to compare it to real programming languages, like Perl or Python.
Perl has many shell-like features, the best regex support of any language, which is useful when everything is text, many powerful features, and an extensive ecosystem.
Python is less shell-like but is one of the most popular languages today, with a huge ecosystem, clean code, and pretty good two-way integration, which means you can not only run Python from your executable, but Python can also call back into it.
If what you are after is portability and built-in commands, then the competition is Busybox, a ~1MB self-contained executable providing the most common Unix commands and a shell, very popular for embedded systems.
Bash also has a built-in to validate parameters; it's called test, and is usually invoked as [ ], or [[ ]] for some bash specifics.
Re: non-standard tools, if you’re referring to timeout, that’s part of GNU coreutils. It’s pretty standard for Linux. BSDs also have it from what I can tell, so its absence is probably a Mac-ism. In any case, you could just pipe through sleep to achieve the same thing.
That said, I’ve never used any of the BSDs, so I may be way off here.
> …inputs and outputs are strongly typed objects
And herein is the difference. *nix-land has everything as a file. It’s the universal communication standard, and it’s extremely unlikely to change. I have zero desire to navigate a DB as though it were a mount point, and I’m unsure why you would ever want to. Surely SQL Server has a CLI tool like MySQL and Postgres.
And for anyone who might be open to trying powershell, the cross platform version is pwsh.
Pythonistas who are used to __dir__ and help() would find themselves comfortable with `gm` (get-member) and get-help to introspect commands.
You will also find Python-style dynamic typing, except with PHP syntax. $a=1; $b=2; $a + $b works in a sane manner (try that with bash). There is still some funny business with type coercion, though: with $a=1 and $b="2", $a+$b gives 3, while $b+$a gives "21".
I also found "get-command" very helpful with locating related commands. For instance "get-command -noun file" returns all the "verb-noun" commands that has the noun "file". (It gives "out-file" and "unblock-file")
Another nice thing about powershell is you can retain all your printf debugging when you are done. Using "Write-Verbose" and "Write-Debug" etc allows you to write at different log levels.
Once you are used to basic powershell, there are a bunch of standard patterns, like how to do dry-runs and confirmation levels. Powershell also supports closures, so people create `make`-style build systems and unit test suites with them.
The big problem with trying to move on from BASH is that it's everywhere and is excellent at tying together other unix tools and navigating the filesystem - it's at just the right abstraction level to be the duct tape of languages. Moving to other languages provides a lot more safety and power, but then you can't rely on the correct version being necessarily installed on some machine you haven't touched in 10 years.
I'm not a fan of powershell myself as the only time I've tried it (I don't do much with Windows), I hit a problem with it (or the object I was using) not being able to handle more than 256 characters for a directory and file. That meant that I just installed cygwin and used a BASH script instead.
PowerShell blows bash out of the water. I love it.
My issue with powershell is that it’s a niche language with a niche “stdlib” which cannot be used as general purpose. The same issue I have with AHK. These two are languages that you use for a few hours and then forget completely in three weeks.
Both of them should simply be Python- and TypeScript-compatible DLLs.
> You can “cd” into IIS, Exchange, and SQL and navigate them like they’re a drive. Try that with bash!
This exists.

fish, Python, and oilshell (ysh) are ultimately on better footing.
I ask LLMs to modify the shell script to strictly follow Google’s Bash scripting guidelines[^1]. It adds niceties like `set -euo pipefail`, uses `[[…]]` instead of `[…]` in conditionals, and fences all but numeric variables with curly braces. Works great.
[^1]: https://google.github.io/styleguide/shellguide.html
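The result tends to open with a preamble along these lines (a sketch of the conventions mentioned, not an excerpt from the guide; the names are invented):

```shell
#!/usr/bin/env bash
set -euo pipefail

readonly RETRIES=3                  # numeric: braces not required
readonly out_dir="${HOME}/reports"  # everything else fenced as ${...}

main() {
  if [[ ! -d "${out_dir}" ]]; then
    mkdir -p -- "${out_dir}"
  fi
  echo "will retry up to ${RETRIES} times"
}

main "$@"
```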
> This matches the output format of Bash's builtin set -x tracing, but gives the script author more granular control of what is printed.
I get and love the idea but I'd consider this implementation an anti-pattern. If the output mimics set -x but isn't doing what that is doing, it can mislead users of the script.
The author could also consider trapping DEBUG to maybe be selective while also making it a little more automatic.
I can highly recommend using bash3boilerplate (https://github.com/kvz/bash3boilerplate) if you're writing BASH scripts and don't care about them running on systems that don't use BASH.
It provides logging facilities with colour usage for the terminal (not for redirecting out to a file) and also decent command line parsing. It uses a great idea to specify the calling parameters in the help/usage information, so it's quick and easy to use and ensures that you have meaningful information about what parameters the script accepts.
Also, please don't write shell scripts without running them through ShellCheck. The shell has so many footguns that can be avoided by correctly following its recommendations.
Even 4 can be generalized to "be deliberate about what you do with a failed function call (etc) - does it exit the command? Log/print an error and continue? Get silently ignored? Handled?"
A few months ago, I wrote a bash script for an open-source project.
I created a small awk util that I used throughout the script to style the output. I found it very convenient. I wonder if something similar already exists.
exit_with_help_message() {
  local exit_code=$1
  cat <<EOF | theme
CQL Playground

Sub-commands:
  help
    Show this help message
  hello
    Onboarding checklist — Get ready to use the playground
  build-cql-compiler
    Rebuild the CQL compiler
EOF
  exit "$exit_code"
}
Some screenshots in the PR: https://github.com/ricomariani/CG-SQL-author/pull/18
Let me know guys if you like it. Any comments appreciated.
Also, you'd want to put in a double dash to signify the end of arguments, as otherwise someone could set VAR="--no-preserve-root " and truly trash the system. And ${VAR} needs to be in double quotes for something as dangerous as an "rm" command.