These may be objectively superior (I haven't tested), but I have come to realize (like so many others) that if you ever change your OS installation, set up VMs, or SSH anywhere, preferring these is just an uphill battle that never ends. I don't want to have to set these up in every new environment I operate in, or even use a mix of these on my personal computer and the traditional ones elsewhere.
Learn the classic tools, learn them well, and your life will be much easier.
Some people spend the vast majority of their time on their own machine. The gains of convenience can be worth it. And they know enough of the classic tools that it's sufficient in the rare cases when working on another server.
Not everybody is a sysadmin manually logging into lots of independent, heterogeneous servers throughout the day.
Some are so vastly better that it's worth whatever small inconvenience comes with getting them installed. I know the classic tools very well, but I'll prefer fd and ripgrep every time.
One of the reasons I really like Nix: my setup works basically everywhere (as long as the host OS is either Linux or macOS, but those are the only two environments I care about). I don't even need root access, since there are multiple ways to install Nix rootless.
But yes, in the event that I don't have Nix, I can very much use the classic tools. It is not a binary choice; you can have both.
apt-get/pacman/dnf/brew install <everything that you need>
You'll need to install those and other tools (your favorite browser, your favorite text editor, etc.) anyway if you're changing your OS.
> or SSH anywhere
When you connect through SSH you don't have a GUI, but that's not a reason to avoid using GUI tools elsewhere, for example.
> even use a mix of these on my personal computer and the traditional ones elsewhere
I can't see the problem, really. I use some of those tools and they are convenient, but it's not like I can't work without them. For example, bat: it doesn't replace cat, it only outputs data with syntax highlighting. It makes my life easier, but if I don't have it, that's okay.
That goes against the UNIX philosophy IMO. Tools doing "one thing and doing it well" also means that tools can and should be replaced when a superior alternative emerges. That's pretty much the whole point of simple utilities. I agree that you should learn the classic tools first as it's a huge investment for a whole career, but you absolutely should learn newer alternatives too. I don't care much for bat or eza, but some alternatives like fd (find alt) or sd (sed alt) are absolute time savers.
> Learn the classic tools, learn them well, and your life will be much easier.
Agreed, but that doesn't stop you from using/learning alternatives. Just use your preferred option, based on what's available. I realise this could be too much to apply to something like a programming language (despite this, many of us know more than one) or a graphics application, but for something like a pager, it should be trivial to switch back and forth.
I indeed would not want to feel stranded with a bespoke toolkit. But I also don't think shying away from good tools is the answer. Generally I think using better tools is the way to go.
Often there are plenty of paths open to getting a decent environment as you go:
Mostly, I rely on ansible scripts to install and configure the tools I use.
One fallback I haven't seen mentioned, and that you can get a lot of mileage from: use sshfs to mount the target system locally. This allows you to use your local tools & setup effectively against another machine!
Along those lines, Dvorak layouts are more efficient, but I use QWERTY because it works pretty much everywhere. (Are small changes like AZERTY still a thing? Certainly our French office uses an "international" layout, and generally the main pains internationally are "@" being in the wrong place and \ not working; for the latter you can use user@domain when logging into a Windows machine, rather than domain\user.)
I know my way around vi well enough: although XEmacs was my editor during the 1990s when working on UNIX systems, when visiting customers there was a very high probability that they only had ed and vi installed on their servers.
Many folks nowadays don't get how lucky they are, not having to do UNIX development on a time-sharing system, although cloud systems kind of replicate the experience.
When I got my first Unix account [1] I was in a Gnu emacs culture and used emacs from 1989 to 2005 or so. I made the decision to switch to vi for three reasons: (1) less clash with a culture where I mostly use GUI editors that use ^S for something very different than what emacs does, (2) vim doesn't put in continuation characters that break cut-and-paste, (3) often I would help somebody out with a busted machine where emacs wasn't installed, the package database was corrupted, etc and being able to count on an editor that is already installed to resolve any emergency is helpful.
[1] Not like the time one of my friends "wardialed" every number in my local calling area and posted the list to a BBS and I found that some of them could be logged into with "uucp/uucp" and the like. I think Bell security knew he rang everybody's phone in the area but decided to let billing handle the problem because his parents had measured service.
I started a new job and spent maybe a day setting up the tools and dotfiles on my development machine in the cloud. I'm going to keep it throughout my employment so it's worth the investment. And I install most of the tools via nix package manager so I don't have to compile things or figure out how to install them on a particular Linux distribution.
I have a chef cookbook that sets up all the tools I like to have on my VMs. When I bootstrap a VM it includes all the stuff I want like fish shell and other things that aren’t standard. The chef cookbook also manages my SSH keys and settings.
I have some of these tools, they are not "objectively superior". A lot of them make things prettier with colors, bargraphs, etc... It is nice on a well-configured terminal, not so much in a pipeline. Some of them are full TUIs, essentially graphical tools that run in a terminal rather than traditional command line tools.
Some of them are smart, but sometimes I want dumb. For example, ripgrep respects .gitignore, and often I don't want that. Though in this case there is an option to turn it off (-u disables ignore files, -uu also searches hidden files, -uuu also searches binary files). That's a common theme with these tools: they try to be smart by default and you need options to make them dumb.
So no, these tools are not "objectively superior"; they are generally more advanced, but that is not always what you need. They complement classic tools, but in no way replace them.
For scripting, no doubt about that! But if you want to use some custom tool you can use sshfs to mount whatever is on the other side onto your system and work from there. That has its own set of limitations but it makes some stuff much easier.
Never will I ever set up tools and a home environment directly on the distro. Only in a rootfs that I can proot/toolbx/bwrap into. Not only do I not want to set things up again on a different computer; distro upgrades have nuked "fancy" tools enough times for it not to be worth it.
Not a comment on these particular tools, but I keep non-standard utilities that I use in my ~/bin/ directory, and they go with me when I move to a different system. The tools mentioned here could be handled the same way, making the uphill a little less steep.
Agreed, but some are nice enough that I'll make sure I get them installed where I can. 'ag' is my go to fast grep, and I get it installed on anything I use a lot.
As someone who logs into hundreds of servers in various networks, from various customers/clients, there is so little value in using custom tooling, as they will not be available on 90% of the systems.
I have a very limited set of additional tools I tend to install on systems, and they are in my default ansible-config, so will end up on systems quickly, but I try to keep this list short and sweet.
95% of the systems I manage are debian or ubuntu, so they will use mostly the same baseline, and I then add stuff like ack, etckeeper, vim, pv, dstat.
Another reason emacs as an OS (not fully, but you know) is such a great way to get used to things you have on systems. Hence the quote: "GNU is my operating system, linux is just the current kernel".
As a greybeard linux admin, I agree with you though. This is why when someone tells me they are learning linux the first thing I tell them is to just type "info" into the terminal and read the whole thing, and that will put them ahead of 90% of admins. What I don't say is why: Because knowing what tooling is available as a built-in you can modularly script around that already has good docs is basically the linux philosophy in practice.
Of course, we remember the days where systems only had vi and not even nano was a default, but since these days we do idempotent ci/cd configs, adding a tui-editor of choice should be trivial.
"servers" is the key word here. Some of the tools listed on that page are just slightly "improved" versions of common sysadmin utilities, and indeed, those are probably not worth it. But some are really development tools, things that you'd install on the small number of machines where you do programming. Those might be.
The ones that leap out at me are ripgrep (a genuinely excellent recursive grepper), jq (a JSON processor - there is no alternative to this in the standard unix toolkit), and hyperfine (benchmarking).
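For anyone who hasn't tried jq, a minimal example (assuming jq is installed; the JSON and field names here are made up for illustration):

```shell
# Extract one field from a JSON array -- something no classic tool does cleanly:
printf '[{"name":"fd"},{"name":"ripgrep"}]' | jq -r '.[].name'
```

This prints the two names, one per line; the same one-liner scales to filtering and reshaping deeply nested API responses.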
Is there any tool or ssh extension that would bring these apps into the remote session?
Is something like that possible? It seems like you could conceivably dump these small tools into a temp folder and use them, and that could be automated.
Is there a security issue with that? Do any of these tools need more permission than the remote session would have?
Maybe the main issue is portability of these apps?
This is certainly a common sentiment (I've felt it myself) so is it at all possible?
What's the relevance of these "as someone who ..." posts? Nobody cares that these tools don't happen to fit into your carefully curated list of tools that you install on remote computers. You can install these on your local computer to reap some benefits.
You're again confusing this website with your personal email inbox. This is a public message board, all messages you see haven't been written for you specifically - including this blog post.
Actual LOL. Indeed. I was working for a large corporation at one point and a development team was explaining their product. I asked what its differentiators were versus our competitors. The team replied that ours was written in Go. #facepalm
Many of the entries do include this detail — e.g. "with syntax highlighting", "ncurses interface", and "more intuitive". I agree that "written in rust", "modern", and "better" aren't very useful!
I always enjoy these lists. I think most folks out there could probably successfully adopt at least one or two of these tools. For me, that’s ripgrep and jq. The former is a great drop-in replacement for grep and the latter solves a problem I needed solving. I’ll try out a few of the others on this list, too. lsd and dust both appeal to me.
I just enjoy seeing others incrementally improve on our collective tool chest. Even if the new tool isn’t of use to me, I appreciate the work that went into it. They’re wonderful tools in their own right. Often adding a few modern touches to make a great tool just a little bit better.
Thank you to those who have put in so much effort. You’re making the community objectively better.
I think many of us linux admins have such a list. Mine in particular is carefully crafted around GPL-izing my stack as much as possible. I really like the format of this ikrima.dev one though! The other stuff is great too, worth a peruse.
Except that ripgrep isn't actually a drop-in replacement for grep as it behaves differently. It is a nice program don't get me wrong, but it is not interchangeable with grep.
I basically live in the terminal. However, every single one of these tools offers a solution to a problem that I don't have; aren't installed on my system; and mysteriously have many tens of thousands of github stars.

I genuinely don't know what is going on here.
> I basically live in the terminal. However, every single one of these tools offers a solution to a problem that I don't have; aren't installed on my system; and mysteriously have many tens of thousands of github stars.
> I genuinely don't know what is going on here.
I basically live in my music library. However, every single pop artist offers songs that I don't like, are not in my library, and mysteriously have many millions of albums sold.
I genuinely don't know what is going on here.
Joking aside, have you ever tried to use some of these tools? I used to not understand why people were using vim until I really tried.
You never use fzf? What a tough life in the terminal then, huh. It's not as useful to run it directly, but pretty much any shell has a plugin for fzf support that lets you Ctrl+R to fuzzy search over bash_history (or fish_history or whatever), and Ctrl+T lets you fuzzy search files in the current directory.
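For anyone setting this up: recent fzf bundles those bindings behind a single flag (this assumes fzf >= 0.48, which introduced the `--bash`/`--zsh`/`--fish` setup flags; older packages ship a key-bindings script to source instead):

```shell
# In ~/.bashrc: enables Ctrl+R (fuzzy history search) and Ctrl+T (fuzzy file
# insert) plus ** tab-completion. Needs fzf >= 0.48 on your PATH.
eval "$(fzf --bash)"
```

The zsh and fish equivalents are `eval "$(fzf --zsh)"` and `fzf --fish | source`.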
The core Unix toolset is so good that you can easily get by with it. Many of these tools are better, but still not necessary, and they certainly aren't widely available by default.
Out of curiosity, how would you recursively grep files, ignoring hidden files (e.g., `.git`) and only matching a certain file extension? (E.g., `rg -g '*.foo' bar`.)
I use the command line a lot too and this is one of my most common commands, and I don't know of an elegant way to do it with the builtin Unix tools.
(And I have basically the same question for finding files matching a regex or glob [ignoring the stuff I obviously don't want], e.g., `fd '.foo.*'`.)
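For what it's worth, a best-effort classic-tools approximation of both commands; the `-path '*/.*' -prune` part is what skips hidden directories like `.git` (a sketch, not an exact equivalent — the filename patterns are only approximations of the rg/fd ones):

```shell
# rg -g '*.foo' bar  ~  recursive grep in *.foo files, skipping hidden dirs:
find . -path '*/.*' -prune -o -name '*.foo' -exec grep -Hn bar {} +

# fd '.foo.*'  ~  list files matching a pattern, skipping hidden dirs:
find . -path '*/.*' -prune -o -name '*foo*' -print
```

It works, but it's hardly elegant next to the fd/rg spellings, which rather proves the point.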
The list includes jq. I’d frankly love to never have a problem that jq solves, but, well, here we are.
ripgrep is something I have installed, but use only via text editor integrations. fzf is nice for building ad-hoc TUIs. fd may make sense (I’m told it’s faster than find), but I already know enough find.
The “next gen ls” family of tools in the article is baffling.
I'd like to read this list, but the color scheme is among the least accessible that I've ever come across. Dark, greyish-blue text with dark, bluish-grey highlighting over a dark grey background. Wow.
If any fledgling designers are here, then take note and add this to your list of examples to avoid.
I find the opposite to be true. Most of these are really just reinventing the wheel of foundational GNU tools that are really powerful provided one has spent some time on them.
On the contrary, that's exactly what “modern” sounds like. I wonder when all those tools will go unmaintained. Coreutils, with all their problems, have been maintained since before the authors of many listed tools were born.
I’m on a Mac, and some of the default tooling feels dated: GNU coreutils and friends are often stuck around mid-2000s versions. Rather than replace or fight against the system tools, I supplement them with a few extras. Honestly, most are marginal upgrades over what macOS ships with, except for fzf, which is a huge productivity boost. Fuzzy-finding through my shell history or using interactive autocompletion makes a noticeable difference day to day.
>some of the default tooling feels dated: GNU coreutils and friends are often stuck around mid-2000s versions
That’s because they’re not GNU coreutils, they’re BSD coreutils, which are spartan by design. (FWIW, this is one of my theories for why Linux/GNU dominated BSD: the default user experience of the former is just so much richer, even though the system architecture of the latter is arguably superior.)
Every time such a list is posted it tends to generate a lot of debate, but I do think there are at least two tools that are a really good addition to any terminal:
`fd`: first, I find the argument semantics way better than `find`'s, but that is more a bonus than a real killer feature. It also being much, much faster than `find` on most setups is a valuable feature. But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co., but `fd` provides a very nice placeholder syntax, which removes the need to mess with `basename` and co. to parse the filename and make a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`
`rg`, a.k.a. `ripgrep`: honestly, it is just about the speed. It is so much faster than `grep` when searching through a directory that it opens up a lot of possibilities. Like, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.
But there is one other thing that I really like about `ripgrep`, and that I think should be a feature of any "modern" CLI tool: it can format its output as JSON. Not that I am a big fan of JSON, but at least it is a well-defined exchange format. "Classic" CLI tools just output a "human-readable" format which might happen to be "machine-readable" if you mess with `awk` and `sed` enough, but that makes piping and scripting that much more annoying and error- and bug-prone. Being able to output JSON, `jq` it, and feed it to the next tool is so much better, and feels like the missing link of the terminal.
The big advantage of the CLI is that it is composable and scriptable by default. But it is missing a common exchange format to pass data, and this is what you have to wrangle with a lot of the time when scripting. Having JSON, never mind all the gripes I have with the format, really ties everything together.
Also, an honorable mention for `zellij`, which I find to be a much saner alternative to `tmux` UX-wise, and the `helix` text editor, which for me is neovim but with, again, a better UX (especially for beginners) and a lot more batteries-included features, while remaining faster (in my experience) than nvim with matching plugins for feature parity.
EDIT: I would also add difftastic ( https://github.com/Wilfred/difftastic ), which is a syntax-aware diff tool. I don't use it much, but it does make some diffs so, so much easier to read.
> But the killer feature for me is the `-x` argument. [...] For example, it makes converting a batch of image a fast and readable one line : `fd -e jpg -x cjxl {} {.}.jxl`
That was inherited from find, which has "-exec". It even uses the same placeholder, {}, though I'm not sure about {.}.
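Right, and the `{.}` part can be emulated with shell parameter expansion, at the cost of some ceremony. A sketch (the `echo` makes it a dry run; `cjxl` is just the example converter from the comment above):

```shell
# find equivalent of `fd -e jpg -x cjxl {} {.}.jxl`: ${f%.jpg} strips the
# extension like fd's {.} does. Remove `echo` to actually run the conversion.
find . -name '*.jpg' -exec sh -c '
  for f; do echo cjxl "$f" "${f%.jpg}.jxl"; done
' sh {} +
```

Though unlike fd's `-x`, this runs sequentially.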
> Like, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.
grep will try to search inside .git. If your project is Javascript, it might be searching inside node_modules, or .venv if Python. ripgrep ignores hidden files, .gitignore and .ignore. You could try using `git grep` instead. ripgrep will still be faster, but the difference won't be as dramatic.
`jq` is pretty much the only one solving a real problem that no existing tool solves IMHO. Most of the others are rewrites for some opinionated 'better' (faster, syntax highlighting, written in Rust or whatever...).
It would be good to have an indicator of whether a tool is available with your distro by default, or which package you'll need to install, since tools are only as useful as they are available…
I used htop for many years because it seemed more accessible than top, until I found that it doesn't display kernel threads by default, and when troubleshooting a load issue, top found exactly what I needed while htop was generally unhelpful. Since then I've returned to top because it has everything I need and really is time-tested; the htop/btop/whatever UI is, in my opinion, just theatrics.
While folks were building improved ls, cat, and so on, and jq for manipulating JSON data, Nushell has been happily doing all this in a consistent way and making it easier, to boot. I'm surprised to see Nushell missing from this list.
duf is pretty good for drive space; it has some nice colours and graphs. But it's also not as useful for feeding into other tools.
btop has been pretty good for watching a machine to get an overview of everything going on, the latest version has cleaned up how the lazy CPU process listing works.
zoxide is good for cd-ing around the system to the same places. It remembers directories so you avoid typing full paths.
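Setting it up is a one-liner of shell config (bash shown; zoxide also ships init snippets for zsh, fish, and others):

```shell
# In ~/.bashrc: defines `z` (jump to the best-ranked match) and `zi`
# (interactive pick via fzf). zoxide learns directories as you cd around.
eval "$(zoxide init bash)"

# e.g. after visiting ~/projects/website once, `z web` jumps straight there.
```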
It looks quite fancy, but I actually like it more for its functionality, particularly its tree view for navigating the process list. I'm not a big fan of full multicolour in these kinds of tools, so I appreciate how easy it is to flip to greyscale mode from the built-in colour schemes (even from the TUI settings menu).
qq should be on this list. It's like jq but works with multiple file formats, including JSON, YAML, XML, &c. and has a really cool interactive TUI mode.
THAT looks like something worthwhile; I'll be looking into it.
I was going to top-post that the Unix/Linux command line tools were designed back in the day when data was pretty much line-oriented, e.g. one database record per line. Since then XML, and more recently JSON, have been invented, and tools like grep and sed just don't work for those formats. But you ninja'd me, sort of.
Modern doesn't always mean better. A better replacement for mplayer was mpv, and in some cases mplayer was faster than mpv (think about legacy machines).
- bat is a useless cat. Cat concatenates files; ANSI colour breaks that.
- alias ls='ls -Fh', problem solved. Now you have * for executables, / for directories, and so on.
- ncdu is fine, perfect for what it does
- iomenu is much faster than fzf and works almost the same
- jq is fine; it's a good example of a new Unix tool
- micro is far slower than even vim
- instead of nnn, sff https://github.com/sylphenix/sff with soap(1) (an xdg-open replacement) from https://2f30.org creates a mega-fast environment. Add MuPDF and sxiv, and nnn and friends will look really slow compared to these.

Yes, you need to set config.h for both sff and soap, but they will run much, much faster than any Rust tool on legacy machines.
> bat is a useless cat. Cat concatenates files; ANSI colour breaks that.
It's useless as a cat replacement, I agree. The article really shouldn't call it that, although the program's GitHub page does self-describe it as "a cat clone". It's more of a syntax highlighter combined with a git diff viewer (I do have an issue with that; it should be two separate programs, not one).
> bat is a useless cat. Cat concatenates files; ANSI colour breaks that.
From the README:
> Whenever bat detects a non-interactive terminal (i.e. when you pipe into another process or into a file), bat will act as a drop-in replacement for cat and fall back to printing the plain file contents
bat works as normal cat for normal uses of cat and a better cat for all those "useless cat" situations we find ourselves in.
I can't see bat as a "useless cat" or a replacement for cat, except for reading source code in the terminal. It's more like a less with syntax highlighting, or a read-only vim.
E.g., I have ls aliased to eza as part of my custom set of configuration scripts. eza pretty much works as ls in most scenarios.
If I'm in an environment which I control and is all configured as I like it, then I get a shinier ls with some nice defaults.
If I'm in another environment then ls still works without any extra thought, and the muscle memory is the same, and I haven't lost anything.
If there's a tool which works very differently to the standard suite, then it really has to be pulling its weight before I consider using it.
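A guarded alias keeps that setup portable. A sketch of the idea (the same pattern works for bat, fd, and friends):

```shell
# In ~/.bashrc: only shadow ls when eza is actually installed, so the same
# dotfiles work unchanged on machines that only have plain ls.
if command -v eza >/dev/null 2>&1; then
  alias ls='eza'
  alias tree='eza --tree'
fi
```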
Only to feel totally handicapped when logging into a busybox environment.
I'm glad I learned how to use vi, grep, sed..
My only change to an environment is the keyboard layout. I learned Colemak when I was young. Still enjoying it every day.
not really contradicted by:
> exa: modern replacement for ls/tree, not maintained
exa modern replacement for ls/tree, not maintained
"not maintained" doesn't smell "modern" to me...
throw_a_grenade|4 months ago
JohnKemeny|4 months ago
Hendrikto|4 months ago
arccy|4 months ago
eza: https://github.com/eza-community/eza
snide|4 months ago
twic|4 months ago
https://difftastic.wilfred.me.uk/
It's a huge improvement over purely character-based diffs.
ed_blackburn|4 months ago
MontyCarloHall|4 months ago
That’s because they’re not GNU coreutils, they’re BSD coreutils, which are spartan by design. (FWIW, this is one of my theories for why Linux/GNU dominated BSD: the default user experience of the former is just so much richer, even though the system architecture of the latter is arguably superior.)
demetris|4 months ago
I know I have hyperfine, fd, and eza on my Windows 11, and maybe some more I cannot remember right now.
They are super easy to install too, using winget.
maeln|4 months ago
`fd`: first, I find that the argument semantics are way better than `find`'s, but that is more a bonus than a real killer feature. It being much, much faster than `find` on most setups is something I would consider a valuable feature. But the killer feature for me is the `-x` argument. It allows calling another command on each individual search result, which `find` can also do with `xargs` and co. But `fd` provides a very nice placeholder syntax[0], which removes the need to mess with `basename` and co. to parse the filename and build a new one, and it executes in parallel. For example, it makes converting a batch of images a fast and readable one-liner: `fd -e jpg -x cjxl {} {.}.jxl`
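For comparison, here is a minimal sketch of what the same extension-swapping looks like with classic `find` (assuming a POSIX `sh`; `echo` stands in for a real converter like `cjxl`):

```shell
# Create two sample files in a scratch directory.
demo=$(mktemp -d)
touch "$demo/a.jpg" "$demo/b.jpg"

# Replicate fd's {.} placeholder with shell parameter expansion:
# ${1%.jpg} strips the extension so we can append .jxl ourselves.
find "$demo" -name '*.jpg' -exec sh -c 'echo "$1 -> ${1%.jpg}.jxl"' _ {} \;
```

Note that `-exec` runs the command once per file, sequentially; `fd -x` gives you the placeholder expansion and parallel execution for free.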
`rg` a.k.a. `ripgrep`: Honestly it is just about the speed. It is so much faster than `grep` when searching through a directory that it opens up a lot of possibilities. Like, searching for `isLoading` on my frontend (~3444 files) is instant with rg (less than 0.10s) but takes a few minutes with grep.
But there is one other thing that I really like about `ripgrep`, and that I think should be a feature of any "modern" CLI tool: it can format its output as JSON. Not that I am a big fan of JSON, but at least it is a well-defined exchange format. "Classic" CLI tools just output a "human-readable" format which might happen to be "machine-readable" if you mess with `awk` and `sed` enough, but that makes piping and scripting that much more annoying and error-prone. Being able to output JSON, `jq` it, and feed it to the next tool is so much better, and feels like the missing link of the terminal.
The big advantage of the CLI is that it is composable and scriptable by default. But it is missing a common exchange format to pass data around, and that is what you have to wrangle with a lot of the time when scripting. Having JSON, never mind all the gripes I have with the format, really joins everything together.
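A sketch of that pipeline idea, using `jq` on the kind of event stream `rg --json` produces (the two event objects below are hand-written stand-ins for real `rg` output; assumes `jq` is installed):

```shell
# Each line of `rg --json` output is one JSON event object; only
# the "match" events carry path/line data, so jq filters on .type.
printf '%s\n' \
  '{"type":"begin","data":{"path":{"text":"app.js"}}}' \
  '{"type":"match","data":{"path":{"text":"app.js"},"line_number":7}}' \
| jq -r 'select(.type == "match") | "\(.data.path.text):\(.data.line_number)"'
# → app.js:7
```

No `awk`/`sed` guesswork about column positions: the structure carries the meaning.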
Also, an honorable mention for `zellij`, which I find to be a much saner alternative to `tmux` UX-wise, and the `helix` text editor, which for me is neovim but with, again, a better UX (especially for beginners) and a lot more batteries-included features, while remaining faster (in my experience) than nvim with matching plugins for feature parity.
EDIT: I would also add difftastic ( https://github.com/Wilfred/difftastic ), which is a syntax-aware diff tool. I don't use it much, but it does make some diffs so much easier to read.
[0] https://github.com/sharkdp/fd?tab=readme-ov-file#placeholder...
rkomorn|4 months ago
Then I tried them and it was such a night and day performance difference that they're now immediate installs on any new system I use.
Izkata|4 months ago
That was inherited from find, which has `-exec`. It even uses the same placeholder, `{}`, though I'm not sure about `{.}`.
maleldil|4 months ago
grep will try to search inside .git. If your project is JavaScript, it might be searching inside node_modules, or .venv if it's Python. ripgrep ignores hidden files and respects .gitignore and .ignore. You could try using `git grep` instead; ripgrep will still be faster, but the difference won't be as dramatic.
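A quick sketch of the difference, in a throwaway repo (assumes `git` is available; the file names are made up for the demo):

```shell
# Build a tiny repo with one tracked file and one untracked junk dir.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo 'const isLoading = true;' > app.js
mkdir node_modules
echo 'var isLoading;' > node_modules/dep.js
git add app.js
git commit -qm init

# git grep only searches tracked files, so node_modules is skipped.
git grep -n 'isLoading'
# → app.js:1:const isLoading = true;
```

Plain `grep -r isLoading .` on the same tree would also descend into `node_modules` and `.git`.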
unknown|4 months ago
[deleted]
esafak|4 months ago
doktorn|4 months ago
Scotrix|4 months ago
commandersaki|4 months ago
gxonatano|4 months ago
foofoo12|4 months ago
Got featured here on HN a few weeks ago.
h4ch1|4 months ago
[0] https://github.com/aristocratos/btop
PaulKeeble|4 months ago
btop has been pretty good for watching a machine to get an overview of everything going on, the latest version has cleaned up how the lazy CPU process listing works.
zoxide is good for cding around the system to the same places. It remembers directories so you avoid typing full paths.
tomxor|4 months ago
It looks quite fancy, but I actually like it more for its functionality, particularly its tree view for navigating the process list. I'm not a big fan of full multicolor in these kinds of tools, so I appreciate how easy it is to flip to the greyscale mode among the built-in colour schemes (even from the TUI settings menu).
oniony|4 months ago
https://github.com/JFryy/qq
mcswell|4 months ago
I was going to top-post that the Unix/Linux command line tools were designed back in the day when data was pretty much line-oriented, e.g. one database record per line. Since then XML, and more recently JSON, have been invented, and tools like grep and sed just don't work for those formats. But you ninja'd me, sort of.
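To make that concrete: a single JSON record split across lines defeats line-oriented grep, while a structure-aware tool like `jq` doesn't care (a contrived example; assumes `jq` is installed):

```shell
# One valid JSON object, but the key and value land on different lines.
f=$(mktemp)
printf '{"user":\n "alice", "age": 30}\n' > "$f"

grep -c '"user": "alice"' "$f" || true   # → 0 (no single line matches)
jq -r '.user' "$f"                        # → alice
```

The same mismatch bites `sed`-based extraction too: both tools assume one record per line.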
esafak|4 months ago
stormed|4 months ago
tracker1|4 months ago
bandrami|4 months ago
Symmetry|4 months ago
anthk|4 months ago
oneeyedpigeon|4 months ago
It's useless as a cat replacement, I agree. The article really shouldn't call it that, although the program's GitHub page does self-describe it as "a cat clone". It's more of a syntax highlighter combined with a git diff viewer (I do have an issue with that; it should be two separate programs, not one).
stryan|4 months ago
From the README:
>Whenever bat detects a non-interactive terminal (i.e. when you pipe into another process or into a file), bat will act as a drop-in replacement for cat and fall back to printing the plain file contents
bat works as a normal cat for normal uses of cat, and as a better cat for all those "useless use of cat" situations we find ourselves in.
lucasoshiro|4 months ago
I can't see bat as a "useless cat" or a replacement for cat, except for reading source code in the terminal. It's more like a `less` with syntax highlighting, or a read-only vim.
hulitu|4 months ago
YMMV.