The complexity of the `find` command is the least of Unix's problems. How about defending these?
1. Unnecessary and confusing directory structure. `/etc`? Why not `/config`? `/usr` instead of `/system`, `/var` instead of ... well who knows. The maximum directory name length is no longer 3 characters.
2. Programs are mushed together and scattered through the filesystem rather than stored in separate locations. This basically means applications are install-only. Yeah, package managers try to keep track of everything, but that is just hacking around the problem, and most developers don't want to spend hours creating 5 different distro packages.
3. Not strictly Unix, but the mess of glibc with respect to ABI compatibility, static linking, etc. is ridiculous. Musl fixes most of this fortunately.
4. Emphasis on text-based configuration files. This is often ok, but it does make it hard to integrate with GUI tools, hence the lack of them.
5. Emphasis on shell scripts. Fortunately this is starting to change, but doing everything with shell scripts is terribly bug-prone and fragile.
6. X11. 'nuff said about that. When is Wayland going to be ready again?
7. General bugginess. I know stuff works 90% of the time, but that 10% is infuriating. Windows is a lot more reliable than Linux at having things "just work" these days.
Regarding #5, I quite enjoy text-based configuration files, and can't stand systems that force me to use a GUI to change settings. If I have a text-based config file, I know that it will play nicely with git. If there are many related settings, users can change them all quickly with their preferred text editor.
@7 - is that so? I just gave up installing F# developer tools on Windows last night after two hours, 3 general install methods (check the F# foundation site) and some 6+ installer packages (some of which wanted to eat 8GB of disk space).
- And yes, my copy of Windows is reasonably modern (8.1) and legal.
Contrast that with Ubuntu, where installing F# took all of 5 minutes, with one command, 200MB and I had a full IDE and F# support.
The one concession I'll make is the one you yourself seem ignorant of - ease of use pertains to your expertise with the system. If you grew up on Windows, you may get its idiosyncrasies.
8. Lack of a proper, well integrated, easy to use, expressive permissions system, ideally with a notion of complete isolation by default. Right now most users rely on the benevolence of software writers not to mess with their personal files, but sometimes things go awry (that Steam home-folder deletion disaster comes to mind).
Imagine mobile OSs with just the Unix permissions system, the malware spread on those would be so humongous, it'd almost be funny again (arguably this was a long-time problem anyway privacy-wise, with software requiring privileges that couldn't be faked (e.g. giving the application a fake address book instead of your own), but at least apps couldn't easily nuke/hijack all your personal files.)
> Windows is a lot more reliable than Linux at having things "just work" these days.
As long as you only do the things you're allowed to do. I just replaced my 6-year old Windows gaming machine, the only win machine I have. I can't even change the windowing theme - it has to be MS's preselected graphics. I wanted to turn off all the 'phone home' stuff except updates and windows defender; these are spread through half-a-dozen locations. I hadn't even got to install my first bit of software yet (firefox) and already I was limited over what can normally be done with a desktop.
"Just works" isn't really an argument when it's paired with "but you can only do these things".
Not to mention that, back in the server world, lightweight virtual servers are an impossibility in the Windows arena. *nix servers are small in volume, can run on fumes, and are largely disposable. Windows servers are (relatively) huge, slow to launch, require far more system resources, and require licensing. That isn't "just works" for me.
> [text config] This is often ok, but it does make it hard to integrate with GUI tools, hence the lack of them.
Only if your GUI tools are written from the viewpoint that nothing else should touch the config file. After all, if I can write a bash script that upserts a config value in a text ini file, why can't a GUI programmer?
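To illustrate the point, here is a minimal sketch of such an upsert in bash (the `upsert` helper, the file name, and the key are all made up for illustration; it assumes GNU sed and does no section handling or value escaping):

```shell
# Upsert key=value in an ini-style file: replace the line if the key
# exists, append it otherwise. (No section handling, no escaping of
# special characters in the value - just a sketch.)
upsert() {
  local file=$1 key=$2 value=$3
  if grep -q "^${key}=" "$file" 2>/dev/null; then
    sed -i "s|^${key}=.*|${key}=${value}|" "$file"   # GNU sed's in-place flag
  else
    printf '%s=%s\n' "$key" "$value" >> "$file"
  fi
}

cfg=$(mktemp)
upsert "$cfg" color blue     # first call appends color=blue
upsert "$cfg" color green    # second call replaces it with color=green
cat "$cfg"                   # prints: color=green
```

A GUI settings panel could call exactly this kind of logic on save and leave the rest of the file, including comments the user wrote by hand, untouched.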
Great. This kind of complication is spread everywhere for everyday tasks. Want to install something? You need to type:
apt-get install something
Since we only use apt-get for installing stuff, why not?
apt-get something
I will give a response I didn't see in any comment there, in the original post, in this post, or here:
Because installing like this could fail miserably for any package named like a subcommand.
apt-get remove    # is this installing a package named "remove",
                  # or is the remove subcommand failing without args?
You can work around that with apt-get install remove in that case, but the error on the first try is counter-intuitive.
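One way a hypothetical front end could make install the default while keeping real subcommands working is to fall back to install only when the first word is not a known subcommand. A sketch (`ag` is a made-up name, the subcommand list is abbreviated, and `echo` stands in for `sudo apt-get` so it runs dry):

```shell
# ag: hypothetical apt-get front end where "install" is the default
# action. echo stands in for "sudo apt-get" so the sketch is harmless.
ag() {
  case $1 in
    install|remove|purge|update|upgrade|search)
      echo apt-get "$@" ;;          # recognised subcommand: pass through
    *)
      echo apt-get install "$@" ;;  # anything else: assume a package name
  esac
}

ag htop          # prints: apt-get install htop
ag remove htop   # prints: apt-get remove htop
```

A package literally named "remove" would still need the explicit `ag install remove`, which is exactly the ambiguity described above.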
After opening the link I expected to see an article from the '90s; unfortunately it's from 2016... I can't believe that people still go into such debates.
Ask your parents or non-geek friends: given the task of finding the files whose names end with .txt, which of the following two commands would you choose?
And if we're going to be using the commandline, I'd much rather the unixy
find PATH -name FOO
than the powershelly
Get-ChildItem -Path PATH -Filter FOO -Recurse
I mean "dir FOO /s" is simple and all, but powershell was created because cmd was deficient in many areas.
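And the unixy version composes into processing directly, not just listing. A small sketch, checksumming every match (sha256sum assumed available, as it is on most Linux systems):

```shell
# Make a scratch directory with one match, then list-and-process
# in a single command: checksum every .txt file at any depth.
cd "$(mktemp -d)"
mkdir -p sub
echo hello > sub/notes.txt

find . -name '*.txt' -exec sha256sum {} +
```

The `-exec … {} +` form batches the matched files onto the command line, so the same pattern scales from one file to thousands.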
The blogpost referenced in the article is also stacking the deck a bit, as some of the 'complex' commands are normal commands, but with the verbosity turned up - the rsync command has three flags for increasing verbosity...
I cannot see how this article is defending UNIX by talking about only a single utility. It could be a better "defense" if it mentioned, for example, the power of being able to compositionally combine various commands through pipes, each of which does one thing well. Inputs and outputs, remember?
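The classic illustration of that composition is the word-frequency pipeline, where each stage does one small thing:

```shell
# Word frequency via a pipe chain: split into one word per line,
# sort so duplicates are adjacent, count them, sort by the count.
printf 'the cat and the dog and the bird\n' |
  tr -s ' ' '\n' |
  sort |
  uniq -c |
  sort -rn |
  head -n 2     # the two most frequent words: "3 the", "2 and"
```

None of these five tools knows anything about word frequency; the pipeline is the program.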
The post the author is responding to is uninformed and reads rather like a rant. The DOS vs UNIX comparison does not work, and I don't know how to take the `apt-get` example seriously, because you could easily achieve the `ai something` magic with something as simple as an alias, `alias ai='sudo apt-get install'`, a feature DOS does not even provide.
> find, in particular, very much has a fine UI, and I dare you to process, and not just list, the files with cmd.
I'm not really sure why he is comparing find with cmd in the first place. Nobody thinks cmd is good; most anybody actually doing anything in a shell on Windows would be doing it in Powershell.
In that case, I would do something like:
Get-ChildItem pathname -Filter test.txt -Recurse
Any processing I want to do is easy, because I'm getting back objects and not just text. Say that I want to get a hash of each file named test.txt:
Get-ChildItem C:\Users\Amezarak -Filter test.txt -Recurse | Get-FileHash
Let's make a little test. You pick ten Linux users on the next FOSDEM and ask them how to find a file on a subfolder. This will be called the "brito test".
If only 2 people don't know the answer, you are correct and simplifying the command line is not really needed.
If 5 of them don't know how to do it, they will be branded as "horribly ignorant" Linux users.
If 8 of those people fail, we keep closing our eyes and repeating that everything is OK.
If 10 out of 10 people that you ask fail this question, well, time to ask another 10 until you get a positive ratio of non-horribly-ignorant answers... ;-)
And btw, the blog is about the design of future (upcoming) command line tools, not about changing "ls" or any of the other examples given. For example, thinking about the most used function of a tool and making that as simple as possible to reach.
As an aside, I remember reading an article about how "the original Unix guys" found the syntax somewhat odd when find came about. But the command was useful enough that they kept it.
(The article was about how inconsistent the Unix commands are regarding syntax, and I think its conclusion was that it is much more important to design syntax to fit the problem than to maintain a superficial consistency with other commands).
In my opinion ls is one of the more broken bits in unix. But besides that, an arguably more unixy way (even if no proper UNIX supports it out of the box) to solve this is a recursive wildcard, i.e.
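The comment's example is cut off, but in bash the recursive wildcard is the globstar option (an assumption about what was meant; zsh has `**` built in):

```shell
# bash 4+: the globstar option lets ** cross directory boundaries.
cd "$(mktemp -d)"
mkdir -p a/b
touch top.txt a/mid.txt a/b/deep.txt

shopt -s globstar
printf '%s\n' **/*.txt   # matches at every depth: a/b/deep.txt, a/mid.txt, top.txt
```

With globstar on, `**/` matches zero or more directories, so the pattern covers the current directory as well as every subdirectory.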
@7 I see this point repeated often without further clarification: what exactly is more buggy?
I can tell you from personal experience that with a reasonably modern Linux distro, my laptop works out of the box without any problems and I've not experienced any significant system-level bugs in a long time.
Meanwhile in Windows, my sound doesn't work at all when I wake the laptop from sleep, Windows Update tries to override the GPU driver with an older version, and every time I have > 10 Chrome tabs open, the whole system locks up regularly.
Not to mention that the Windows registry is still a complete mess and trying to COMPLETELY remove a piece of software is an impossible task.
I am not saying Linux is perfect and yes, Windows works better in the games department, but I am not sure I'll call Windows less buggy.
Haha! Yes! +1 for this. I do exactly the same thing. Or even just grep for txt. The find command is just something I can't get into my muscle memory (despite using Linux for more than 15 years).
1) could be a running gag. If you know it, you know it; if you don't, why not learn it?
2) Mushed together? /bin, /sbin, /usr/bin, /usr/sbin, /usr/local... Well, yes, that may be hard to grasp if your executables are scattered around in every other directory, as on Windows. What a "big" difference.
3) Yes, DLL hell has never, ever happened to Windows users. Never.
4) Oh yes, it's much better to have one registry where nobody knows which key is for what. And if the registry is broken, the whole system doesn't even run any more; yes, that sounds as if it would be much better. And no, of course there is no graphical frontend for any of this in Webmin.
5) Shell scripts are programs, and you can use them for scripting. What problem do you have with that?
6) So? AFAICT it runs here without trouble, and updates/upgrades are just an apt-get upgrade away.
7) The IT backbones are servers, and most servers run Linux. That should give you a hint.
Your arguments are no arguments at all. They are just your opinion, which is not backed by any knowledge. So welcome to the land of do-gooders.
Reliability is a word that Windows users have only learned over the last few Windows incarnations. Long-running servers are usual with Unices; that's hardly the case for any Windows.
And nearly all malware works on Windows. But hey, who needs reliability if it's all nice and colourful?
Is Bash really better at wildcard expansion than cmd? I mean, sometimes you don't want to do wildcard expansion in the shell. `copy *.txt *.bak` would be much harder to write in Bash, I suppose.
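Indeed: cmd hands both raw patterns to copy, which pairs them up itself, whereas in bash the expansion has already happened by the time any program runs, so the pattern-to-pattern mapping becomes an explicit loop. A sketch of the bash equivalent:

```shell
# bash equivalent of cmd's "copy *.txt *.bak": copy every .txt file
# to the same name with a .bak extension instead.
cd "$(mktemp -d)"
touch a.txt b.txt

for f in *.txt; do
  cp -- "$f" "${f%.txt}.bak"   # strip the .txt suffix, add .bak
done
```

More typing, but the upside is that every program gets the same, predictable expansion for free.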
I like consistent and predictable rules for how wildcard arguments are expanded. There's no way that thousands of programs would all get it right if it were up to the developers themselves!
That said, it would perhaps have been nice if glob expansion would have, in some other timeline, been performed by a separate command, so that «echo 🞳» would print out a literal 🞳 character, and «glob echo 🞳🞳» would print out the result of the expansion.
glob could then accept flags to modify the rules for expansion, such as enabling the common shortcut for recursive expansion, rather than the user having to modify the behaviour of wildcard expansions by setting global variables.
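That alternate timeline can be approximated in today's bash: run with globbing disabled (`set -f`) and use a `glob` helper that re-expands its arguments (`glob` here is a made-up function, not a real utility):

```shell
set -f               # disable automatic wildcard expansion shell-wide

# glob CMD ARGS...: expand each argument as a pattern, then run CMD.
glob() {
  local cmd=$1; shift
  set +f
  local args=() pat
  for pat in "$@"; do
    args+=( $pat )   # unquoted on purpose: expansion happens right here
  done
  set -f
  "$cmd" "${args[@]}"
}

cd "$(mktemp -d)"; touch a.txt b.txt
echo *.txt           # prints the literal string: *.txt
glob echo *.txt      # prints: a.txt b.txt
```

Flags on such a helper could then tune the expansion rules per invocation, as the comment suggests, instead of via global shell options.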
Do you think the programs themselves should do the rest too? Tilde expansion? Quoting and escaping arguments? Word splitting? Variable substitution? Etc?
As jstimpfle mentioned, that doesn't work, which is why the argument is left off the alias. In general, you can do that in a function instead of an alias:
lsr() {
find . -name "$1"
}
Wrapping variable expansion with double quotes is a good habit, so spaces are handled properly.
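One gotcha when calling the function: quote the pattern, or the shell may expand it in the current directory before `lsr` ever sees it. A quick demonstration (same function as above, in a scratch directory):

```shell
lsr() {
  find . -name "$1"
}

# Quote the pattern! Unquoted, *.txt could be expanded by the calling
# shell in the current directory before lsr ever sees it.
cd "$(mktemp -d)"
mkdir -p sub
touch sub/x.txt

lsr '*.txt'      # prints: ./sub/x.txt
```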
Also, if you are using a modern-ish bash, you almost always want to use "$@" when there could be multiple arguments. The double-quoted @ special variable is guaranteed to always expand as multiple args, but with spaces handled correctly:
foo() {
bar --quux=42 "$@"
}
foo "a b c" "Spaces in my filename.txt"
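A quick way to see the difference is to count the arguments with and without the quotes (`count_args` and `demo` are just illustrative helpers):

```shell
count_args() { echo $#; }

demo() {
  count_args $@     # unquoted: the words are re-split on spaces
  count_args "$@"   # quoted: the arguments pass through intact
}

demo "a b c" "Spaces in my filename.txt"   # prints 7, then 2
```

The unquoted form turns two arguments into seven words, which is exactly how filenames with spaces get mangled.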
That doesn't work the way you think. Aliases don't receive arguments. The $1 will be expanded to the first positional argument of the surrounding environment when you call the alias -- not the first argument of the alias invocation.
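The failure is easy to demonstrate in bash (the `greet` alias is a made-up example; `shopt -s expand_aliases` is needed because scripts don't expand aliases by default):

```shell
shopt -s expand_aliases      # alias expansion is off in scripts by default

set -- outer                 # the surrounding environment's $1 is now "outer"
alias greet='echo hello $1'

greet world                  # prints "hello outer world", not "hello world"
```

The `$1` in the alias body picks up the caller's positional parameter, and the alias's own argument is simply appended after the expanded text, which is why the function form is the right fix.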
> Windows is a lot more reliable than Linux at having things "just work" these days.
Ha, I'm a server tech at a hosting company and this is spit-take worthy. I really can't see how anyone could possibly think this.
Unless you're talking about end users' local PCs, this is wrong, and even then it generally isn't Linux bugginess, it's user ineptness.
`locate` or `locate --regex` also works. It's faster, since it uses a path database (if you have root, you can run sudo updatedb to update it beforehand).
> apt-get something instead of apt-get install something
wat? How dumb do people get?
Modern Windows has PowerShell, which is leagues ahead of bash in terms of functionality and power.