salmo|1 year ago
If I want logic beyond that, then I skip the shell and write “real” software.
I personally find those both to be more readable and easier to fit in my head than long complex jq expressions. But that’s completely subjective and others may find the jq expression language easier to read than shell or (choose your programming language).
digdugdirk|1 year ago
I'm very interested but not a Linux person. Do you know of any good resources for learning the Linux shell as a programming language?
[0] https://jqlang.github.io/jq/
salmo|1 year ago
It’s a very old book and the audience was using dumb terminals. But it made me understand the why and the how. I think I’ve read every Kernighan book at this point, and most of the ones he was involved in, because he is so amazing at not just conveying facts, but teaching you how to think idiomatically in the topic.
I also used awk for 2 decades, kind of like how I use jq now. But when I read his memoir I suddenly “got it.” What I make with it now is intentional and not just me banging on the keyboard until it works. It’s a great middle ground for something a little sophisticated but not worth writing a full program for.
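That middle ground might look something like this: a per-user tally over whitespace-separated log lines, done in one awk pass (the input data here is made up for illustration):

```shell
# Count occurrences of the first field (e.g. a username) per line,
# then print one "user count" line per user. `sort` fixes the order,
# since awk's for-in iteration order is unspecified.
printf '%s\n' 'alice GET' 'bob GET' 'alice POST' \
  | awk '{count[$1]++} END {for (u in count) print u, count[u]}' \
  | sort
```

Short enough to type interactively, but doing real aggregation you would otherwise reach for a script to do.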
Something else that helped me was to install a minimal distro… actually a base FreeBSD install would be great… and read the man pages for all the commands. I didn’t retain the details, but I learned that things existed. There are man pages whose same options I look up every few months because I’m not positive I remember them right. Heck, I ‘man test’ all the time still. (‘test’ and ‘[’ are the same thing)
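To make that last point concrete: `test` and `[` really are the same command, just spelled differently (the trailing `]` is an argument `[` requires). These two lines are equivalent:

```shell
# Same check, two spellings; both print their message if /tmp is a directory.
if test -d /tmp; then echo "yes (test)"; fi
if [ -d /tmp ]; then echo "yes ([)"; fi
```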
I also had an advantage of 2 great coworkers. They’d been working on Unix since the 80s and their feedback helped me be more efficient, clean, and avoid “useless use of cat” problems.
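The classic “useless use of cat” is spawning an extra process just to feed a file to a tool that can read the file itself. A minimal example (demo.log is a made-up file created here for illustration):

```shell
printf '%s\n' 'ok line' 'error line' > demo.log

# Useless use of cat: an extra process only to pipe the file into grep.
cat demo.log | grep error

# Same result, one process fewer:
grep error demo.log
```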
I also highly recommend using shellcheck. I sometimes disagree with it when I’m intentionally abusing shell behavior, but it’s a great way to train good habits and prevent bugs that only crop up with bad input, scale, etc. I get new devs to use it and it’s helped them “ramp up” quickly, with me explaining the “why” from time to time.
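A typical bug of the “only crops up with bad input” kind, which shellcheck flags as SC2086, is an unquoted variable expansion: it works until a value contains whitespace, then silently word-splits. A minimal sketch:

```shell
file="my report.txt"

# Unquoted: the shell word-splits $file into two arguments ("my", "report.txt").
printf '%s\n' $file

# Quoted: one argument, preserved intact. shellcheck trains you to write this.
printf '%s\n' "$file"
```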
But yeah. The biggest problem I see is that people think there is more syntax than there really is (like my test and [ comment). And remember it’s all text, processes, and files. Except when we pretend it’s not ;).
NortySpock|1 year ago
Here's the thing. These tools have been used since the '70s to slice, dice, and filter log files, CSVs, or other semi-structured data. They can be chained together with the pipe operator. Sysadmins were going through 100MB logs with these tools before CPUs hit the gigahertz.
These tools are blisteringly fast, and they are basically installed on every Linux machine.
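As one illustration of that kind of pipe chain, here is the classic “top talkers” pipeline over a web access log (the log contents here are made up for the example):

```shell
# Fabricate a tiny access log whose first field is the client IP.
printf '%s\n' \
  '1.2.3.4 GET /' \
  '1.2.3.4 GET /a' \
  '5.6.7.8 GET /' \
  > access.log

# Extract the IP column, count duplicates, show the busiest clients first.
cut -d' ' -f1 access.log | sort | uniq -c | sort -rn | head -5
```

Each stage is a separate fast process; the same pipeline works unchanged whether the log is three lines or three gigabytes.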
https://github.com/onceupon/Bash-Oneliner
And for a different play-by-play example:
https://adamdrake.com/command-line-tools-can-be-235x-faster-...
coldtea|1 year ago
Or one of the longest-standing, most widespread clubs in computing: open standard languages :)
"jq is like sed for JSON data - you can use it to slice and filter and map and transform structured data with the same ease that sed, awk, grep and friends let you play with text."
Translation:
jq is like a (UNIX/POSIX staple command-line text-manipulation tool) but specialized for text structured in JSON format. You can use it to extract parts of a JSON document (slice), keep nodes based on some criteria (filter), and transform each element in a list of structured data to get a new list of the transformed versions (map), and do all that as easily as you can with sed (a basic command-line text-manipulation program), awk (a command-line text-manipulation program with a full-featured text-processing-oriented language), grep (a command-line program to search for strings), and other assorted unix userland programs.
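Those three operations fit in one short jq expression. A tiny sketch, with made-up data (`-c` prints compact instead of pretty-printed output):

```shell
# Filter: keep objects with n > 2. Map: replace each kept object with
# its .name. Slice-like collection: wrap the results back into an array.
echo '[{"name":"a","n":1},{"name":"b","n":5},{"name":"c","n":3}]' \
  | jq -c '[ .[] | select(.n > 2) | .name ]'
```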