item 11229025

In defense of Unix

110 points | ingve | 10 years ago | leancrew.com

185 comments

[+] IshKebab|10 years ago|reply
The complexity of the `find` command is the least of Unix's problems. How about defending these?

1. Unnecessary and confusing directory structure. `/etc`? Why not `/config`? `/usr` instead of `/system`, `/var` instead of ... well who knows. The maximum directory name length is no longer 3 characters.

2. Programs are mushed together and scattered through the filesystem rather than stored in separate locations. This basically means applications are install-only. Yeah, package managers try to keep track of everything, but that is just hacking around the problem, and most developers don't want to spend hours creating 5 different distro packages.

3. Not strictly Unix, but the mess of glibc with respect to ABI compatibility, static linking, etc. is ridiculous. Musl fixes most of this fortunately.

4. Emphasis on text-based configuration files. This is often ok, but it does make it hard to integrate with GUI tools, hence the lack of them.

5. Emphasis on shell scripts. Fortunately this is starting to change, but doing everything with shell scripts is terribly bug-prone and fragile.

6. X11. 'nuff said about that. When is Wayland going to be ready again?

7. General bugginess. I know stuff works 90% of the time, but that 10% is infuriating. Windows is a lot more reliable than Linux at having things "just work" these days.

[+] MereInterest|10 years ago|reply
Regarding #4, I quite enjoy text-based configuration files, and can't stand systems that force me to use a GUI to change settings. If I have a text-based config file, I know that it will play nicely with git. If there are many related settings, users can change them all quickly with their preferred text editor.
[+] pseu|10 years ago|reply
@7 - is that so? I just gave up installing F# developer tools on Windows last night after two hours, 3 general install methods (check the F# Foundation site) and some 6+ installer packages (some of which wanted to eat 8GB of disk space). And yes, my copy of Windows is reasonably modern (8.1) and legal.

Contrast that with Ubuntu, where installing F# took all of 5 minutes, with one command, 200MB and I had a full IDE and F# support.

The one concession I'll make is the one you yourself seem ignorant of - ease of use pertains to your expertise with the system. If you grew up on Windows, you may get its idiosyncrasies.

[+] RGamma|10 years ago|reply
8. Lack of a proper, well-integrated, easy-to-use, expressive permissions system, ideally with a notion of complete isolation by default. Right now most users rely on the benevolence of software writers not to mess with their personal files, but sometimes things go awry (that Steam home-folder deletion disaster comes to mind).

Imagine mobile OSs with just the Unix permissions system; the malware spread on those would be so humongous it'd almost be funny again. (Arguably this was a long-time problem anyway privacy-wise, with software requiring privileges that couldn't be faked, e.g. giving the application a fake address book instead of your own, but at least apps couldn't easily nuke/hijack all your personal files.)

[+] vacri|10 years ago|reply
> Windows is a lot more reliable than Linux at having things "just work" these days.

As long as you only do the things you're allowed to do. I just replaced my 6-year-old Windows gaming machine, the only Windows machine I have. I can't even change the windowing theme - it has to be MS's preselected graphics. I wanted to turn off all the 'phone home' stuff except updates and Windows Defender; these are spread through half-a-dozen locations. I hadn't even got to install my first bit of software yet (Firefox) and already I was limited in what can normally be done with a desktop.

"Just works" isn't really an argument when it's paired with "but you can only do these things".

Not to mention that, back in the server world, lightweight virtual servers are an impossibility in the Windows arena. *nix servers are small in volume size, can run on fumes, and are largely disposable. Windows servers are (relatively) huge, slow to launch, require much more system resources, and require licensing. That isn't "just works" for me.

> [text config] This is often ok, but it does make it hard to integrate with GUI tools, hence the lack of them.

Only if your GUI tools are written from the viewpoint that nothing else should touch the config file. After all, if I can write a bash script that upserts a config value in a text ini file, why can't a GUI programmer?
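For what it's worth, the upsert described above fits in a few lines of portable shell. This is only a sketch: the `upsert` function name and the `settings.ini` file are made up for illustration, and it assumes a flat KEY=VALUE file with no sections or exotic characters in the value.

```shell
# Upsert KEY=VALUE in a flat ini-style file: replace the line if the
# key already exists, append it otherwise.
upsert() {
    key=$1 value=$2 file=$3
    if grep -q "^${key}=" "$file" 2>/dev/null; then
        # Key present: rewrite that line via a temp file
        sed "s|^${key}=.*|${key}=${value}|" "$file" > "$file.tmp" &&
            mv "$file.tmp" "$file"
    else
        # Key absent: append it
        printf '%s=%s\n' "$key" "$value" >> "$file"
    fi
}

upsert theme dark settings.ini
upsert theme light settings.ini   # second call replaces, doesn't duplicate
```

If a shell one-liner can do this, a GUI config tool certainly can too.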

[+] ZoF|10 years ago|reply
>Windows is a lot more reliable than Linux at having things "just work" these days.

Ha, I'm server tech at a hosting company and this is spit-take worthy.

I really can't see how anyone could possibly think this.

Unless you're talking about end-users' local PCs, this is wrong, and even then it generally isn't 'Linux bugginess', it's 'user ineptness'.

[+] txutxu|10 years ago|reply
The post linked from this article says:

    Great. It is spread everywhere this kind of complication for everyday tasks. Want to install something, need to type:
    apt-get install something
    
    Since we only use apt-get for installing stuff, why not?
    apt-get something
I will give a response I didn't see in any comment there in the original post, nor in this post here:

Because trying to install any package named like a subcommand could fail miserably, like this:

    apt-get remove     # is installing package remove?
                       # or is failing the remove subcmd without args
You can work around that with `apt-get install remove` in that case, but the error on the first try is counter-intuitive.

Edit: fix my last example

[+] zenlot|10 years ago|reply
After opening the link I expected to see an article from the 90s; unfortunately it's from 2016... I can't believe that people still go into such debates.
[+] knz42|10 years ago|reply
The article could have been enhanced by highlighting that some shells (e.g. zsh) provide expansion patterns that recurse into subdirectories. e.g.

   ls **/*.txt
[+] nothrabannosir|10 years ago|reply
or in bash, with the globstar option set:

    shopt -s globstar # e.g. in your .bashrc

    ls **/*.txt
[+] Vieira|10 years ago|reply
Ask your parents or non-geek friends: Given the task of finding the files with the name ending with .txt which of the following two commands would you choose?

  find -name "*.txt"

  dir *.txt /s
[+] DanBC|10 years ago|reply
How about if you just give them the man page for "find" and the /? page for "dir" and see who can get the command correct first?
[+] vacri|10 years ago|reply
And if we're going to be using the commandline, I'd much rather the unixy

    find PATH -name FOO
than the powershelly

    Get-ChildItem -Path PATH -Filter FOO -Recurse
I mean "dir FOO /s" is simple and all, but powershell was created because cmd was deficient in many areas.

The blogpost referenced in the article is also stacking the deck a bit, as some of the 'complex' commands are normal commands, but with the verbosity turned up - the rsync command has three flags for increasing verbosity...

[+] hasenj|10 years ago|reply
I actually do this instead:

    find . | grep "\.txt$"
[+] tonyedgecombe|10 years ago|reply
Neither, they would search in Finder or Explorer.
[+] raldu|10 years ago|reply
I cannot see how this article is defending UNIX by talking only about a single utility. It could be a better "defense" if it mentioned, for example, the power of being able to compositionally combine various commands through pipes, each of which does one thing well. Inputs and outputs, remember?

The post the author is responding to is uninformed and seems rather like a rant. The DOS vs UNIX comparison does not work. I don't know how to take the `apt-get` example seriously, because you could easily do something like `alias ai='sudo apt-get install'` to achieve the `ai something` magic with something as simple as aliases, which DOS does not even provide.

[+] bpye|10 years ago|reply
Powershell also has pipes, and works with objects rather than with text. It can be very powerful and much easier than having to use awk, sed, etc.
[+] exprx|10 years ago|reply
The mentioned blog post is horribly ignorant, and lacks almost any valid points.

find, in particular, very much has a fine UI, and I dare you to process, and not just list, the files with cmd.

[+] Amezarak|10 years ago|reply
> find, in particular, very much has a fine UI, and I dare you to process, and not just list, the files with cmd.

I'm not really sure why he is comparing find with cmd in the first place. Nobody thinks cmd is good; most anybody actually doing anything in a shell on Windows would be doing it in Powershell.

In that case, I would do something like:

    Get-ChildItem pathname -Filter test.txt -Recurse

Any processing I want to do is easy, because I'm getting back objects and not just text. Say that I want to get a hash of each file named test.txt.

    Get-ChildItem C:\Users\Amezarak -Filter test.txt -Recurse | Get-FileHash
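For comparison, a rough Unix equivalent of that last pipeline works on text lines instead of objects. The `proj` tree below and the choice of `sha256sum` as the hash are illustrative, not from the comment above.

```shell
# Illustrative setup: two files named test.txt in a small tree
mkdir -p proj/a proj/b
echo one > proj/a/test.txt
echo two > proj/b/test.txt

# Find every file named test.txt under proj/ and hash it.
# '-exec ... +' batches file names into as few sha256sum calls as possible.
find proj -name test.txt -exec sha256sum {} +
```

The trade-off is exactly the one debated in this thread: PowerShell hands you structured objects, while the Unix pipeline hands you lines of text to parse downstream.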

[+] nunobrito|10 years ago|reply
Nuno Brito, blog post author here.

Let's make a little test. You pick ten Linux users at the next FOSDEM and ask them how to find a file in a subfolder. This will be called the "brito test".

If only 2 people don't know the answer, you are correct and simplifying the command line is not really needed.

If 5 of them don't know how to do it, they will be branded as "horribly ignorant" Linux users.

If 8 of those people fail, we keep closing our eyes and repeating that everything is OK.

If 10 out of 10 people that you ask fail this question, well, time to ask another 10 until you get a positive ratio of non-horribly-ignorant answers.. ;-)

And btw, the blog is about the design of future (upcoming) command line tools, not about changing "ls" or any of the other examples given. For example, thinking about the most used function of a tool and making that as simple as possible to reach.

[+] jstimpfle|10 years ago|reply
I do like find's UI as well.

As an aside, I remember reading an article about how "the original Unix guys" found the syntax somewhat odd when find came about. But the command was useful enough that they kept it.

(The article was about how inconsistent the Unix commands are regarding syntax, and I think its conclusion was that it is much more important to design syntax to fit the problem than to maintain a superficial consistency with other commands).

[+] ygra|10 years ago|reply

    for /r "C:\some path" %F in (*.exe) do process "%F"
forfiles also exists, which works similarly to find regarding passing the list of files to another command.
[+] zokier|10 years ago|reply
In my opinion ls is one of the more broken bits in Unix. But besides that, an arguably more unixy way (even if no proper UNIX supports it out of the box) to solve this is a recursive wildcard, i.e.

    ls **/*.txt
[+] twic|10 years ago|reply
I'd say this is less unixy, because it relies on the shell to walk the file tree, rather than delegating to a utility whose job is to do just that.

No comment on whether it's actually better or worse, mind!

[+] julian-klode|10 years ago|reply
bash supports that, just do

  shopt -s globstar
to enable it.
[+] AsyncAwait|10 years ago|reply
@7 I see this point repeated often without further clarification, what exactly is more buggy?

I can tell you from personal experience that with a reasonably modern Linux distro, my laptop works out of the box without any problems and I've not experienced any significant system-level bugs in a long time.

Meanwhile in Windows, my sound doesn't work at all when I wake up the laptop from sleep, Windows update tries to override the GPU driver with an older version from Windows update and every time I have > 10 Chrome tabs open, the whole system locks up regularly. Not to mention that the Windows registry is still a complete mess and trying to COMPLETELY remove a piece of software is an impossible task.

I am not saying Linux is perfect and yes, Windows works better in the games department, but I am not sure I'll call Windows less buggy.

[+] joosters|10 years ago|reply
I can never remember find's strange command line arguments, so I end up writing the easy replacement:

  find . |grep \\.txt
The backslashes are not very intuitive, but if you miss them out entirely, you'll still likely get good enough results.
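A quick illustration of what the backslashes buy you (file names here are made up): without them, `.` in the grep pattern matches any character, so the pattern can catch paths that merely contain `txt`.

```shell
# Illustrative files: one real .txt, one whose name only contains "txt"
mkdir -p docs
touch docs/report.txt docs/txt_archive.md

find docs | grep .txt        # '.' means "any char": this also matches
                             # docs/txt_archive.md (via the "/txt" part)
find docs | grep '\.txt$'    # escaped and anchored: only real .txt files
```

As the comment says, the unescaped version is usually good enough, it just casts a slightly wider net.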
[+] BozeWolf|10 years ago|reply
Haha! Yes! +1 for this. I do exactly the same thing. Or even just `grep txt`. The find command is just something I can't get into my muscle memory (despite using Linux for more than 15 years).
[+] rat87|10 years ago|reply
For files on disk (not an external disk) older than a day, you can run

`locate` or `locate --regex`

It's faster since it uses a path database. (If you have root, you can run `sudo updatedb` to update it beforehand.)

[+] OJFord|10 years ago|reply

    > This is not available in the version of bash that comes
    > with OS X
Almost nobody that doesn't comment "what's Terminal?" on an article/etc. should be using "the version of bash that comes with OS X".
[+] jkot|10 years ago|reply
I don't think it's fair to mention DOS in 2016. There is PowerShell, VBS scripting, etc...
[+] FDominicus|10 years ago|reply
1) Could be a running gag. If you know it, you know it; if you don't, why not learn it?

2) Mushed together? /bin, /sbin, /usr/bin, /usr/sbin, /usr/local. Well, yes, that may be hard to get if your executables are scattered around in every other directory, as in Windows. What a "big" difference.

3) Yes, DLL hell never ever has happened to Windows users. Never.

4) Oh yes, it's better to have one registry where nobody knows which key is for what. And if the registry is broken, the whole system does not even run any more. Yes, that sounds as if it would be much better. And no, there is no graphical frontend for whatever in webmin.

5) Shell scripts are programs, and you can use them for scripting. What problem do you have with that?

6) AFAICT it runs here without trouble, and updates/upgrades are just an apt-get upgrade away.

7) The IT backbones are servers, and most servers run under Linux. That should give you a hint.

Your arguments are none; they are just your opinion, which is not backed by any knowledge. So welcome to the land of good-doers.

Reliability is a word that Windows users have only learned in the last few Windows incarnations. Long-running servers are usual with Unices; that's hardly the case for any Windows.

And for Windows, nearly all malware works. But hey, who needs reliability if it's all that nice and colourful?

[+] someoneretarded|10 years ago|reply
"Since we only use apt-get for installing stuff, why not?

apt-get something instead of apt-get install something"

wat? How dumb do people get?

[+] skocznymroczny|10 years ago|reply
Is Bash really better at wildcard expansion than cmd? I mean, sometimes you don't want to do wildcard expansion in the shell. `copy *.txt *.bak` would be much harder to write in Bash I suppose.
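It is indeed wordier in Bash, since the shell expands `*.txt` before `cp` ever sees it; the usual workaround is a loop over the expansion (file names below are illustrative):

```shell
# Illustrative files
mkdir -p data
touch data/a.txt data/b.txt

# bash has no per-file wildcard rename like cmd's "copy *.txt *.bak",
# so loop over the matches and swap the suffix with ${f%.txt}
for f in data/*.txt; do
    cp "$f" "${f%.txt}.bak"
done
```

The suffix-stripping parameter expansion `${f%.txt}` does the work that cmd's second wildcard does implicitly.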
[+] gtf21|10 years ago|reply
People are still having windoze vs. unix fights? In this day and age?
[+] catnaroek|10 years ago|reply
There are good things about the Unix shell, but the fact that the shell, not the program being called, expands wildcards, ain't one of them.
[+] yrro|10 years ago|reply
I like consistent and predictable rules for how wildcard arguments are expanded. There's no way that thousands of programs would all get it right if it were up to the developers themselves!

That said, it would perhaps have been nice if glob expansion would have, in some other timeline, been performed by a separate command, so that «echo 🞳» would print out a literal 🞳 character, and «glob echo 🞳🞳» would print out the result of the expansion.

glob could then accept flags to modify the rules for expansion, such as enabling the common shortcut for recursive expansion, rather than the user having to modify the behaviour of wildcard expansions by setting global variables.
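That other timeline can be loosely approximated in today's shell with `set -f` (which turns off automatic wildcard expansion) plus a tiny wrapper. Everything below is a sketch, and `glob` is a made-up name:

```shell
set -f            # stop the shell from expanding wildcards itself

# Hypothetical 'glob' wrapper: expand the remaining arguments as
# patterns, then hand the results to the command.
glob() {
    cmd=$1; shift
    set +f            # re-enable expansion for this one call only
    "$cmd" $*         # unquoted on purpose: patterns expand here
    set -f
}

touch one.txt two.txt     # illustrative files
echo *.txt                # prints the literal string: *.txt
glob echo *.txt           # prints: one.txt two.txt
```

The obvious caveat of this sketch is that the unquoted `$*` also word-splits, so file names containing spaces get mangled, which is precisely the kind of corner case the built-in mechanism handles for every program at once.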

[+] jjnoakes|10 years ago|reply
Do you think the programs themselves should do the rest too? Tilde expansion? Quoting and escaping arguments? Word splitting? Variable substitution? Etc?

I would go mad working in such an environment.

[+] DiabloD3|10 years ago|reply
A lot of what this article talks about is why systems like msys2 exist: to bring a *nixy command line to Windows.
[+] bitwize|10 years ago|reply
Only useful for compiling software that depends on Unix toolchains.

Modern Windows has PowerShell, which is leagues ahead of bash in terms of functionality and power.

[+] iLemming|10 years ago|reply
And in the last example he gave, I think you don't have to wrap things in quotation marks if the alias is done like this:

   alias lsr='find . -name $1'
[+] pdkl95|10 years ago|reply
As jstimpfle mentioned, that doesn't work, which is why the argument is left off the alias. In general, you can do that in a function instead of an alias:

    lsr() {
        find . -name "$1"
    }
Wrapping variable expansion with double quotes is a good habit, so spaces are handled properly.

Also, if you are using a modern-ish bash, you almost always want to use "$@" when there could be multiple arguments. The double-quoted @ special variable is guaranteed to always expand as multiple args, but with spaces handled correctly:

    foo() {
        bar --quux=42 "$@"
    }

    foo "a b c" "Spaces in my filename.txt"
[+] jstimpfle|10 years ago|reply
That doesn't work the way you think. Aliases don't receive arguments. The $1 will be expanded to the first positional argument of the surrounding environment when you call the alias -- not the first argument of the alias invocation.