top | item 24372830

A Requiem for a Dying Operating System (1994)

246 points | aphrax | 5 years ago | user.eng.umd.edu | reply

276 comments

[+] reagent_finder|5 years ago|reply
Pretty much every point raised in this post(?) is correct, current, and relevant even 26 years later.

POSIX is a monolith and really deserves to be improved. It's been around forever, yes. It will probably keep on being around forever, yes.

    Take the tar command (please!), which is already a nightmare where lower-case `a' means "check first" and upper-case `A' means "delete all my disk files without asking" (or something like that --- I may not have the details exactly right). In some versions of tar these meanings are reversed. This is a virtue?
Raise your hand if you've never broken grep because the flags you gave it didn't work. Anyone? Congratulations: you've worked with a single version of grep your entire life. Have a cookie.

Pretty much the only grep flag I know to be consistent is -i. There has never been a standard for naming and abbreviating flags, which means that for EACH program you have to learn a new set of flags.
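A quick illustration (GNU versions assumed here; BSD variants may differ yet again): even "ignore case" has no agreed spelling across the basic tools.

```shell
# "Ignore case" alone is spelled differently from tool to tool
# (GNU versions assumed; BSD variants may differ again):
echo 'Foo' | grep -i foo             # grep: -i      -> prints "Foo"
printf 'B\na\n' | LC_ALL=C sort -f   # sort: -f      -> prints "a", then "B"
# less: -I (and -i, with subtly different semantics) -- interactive, so not run here
```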

This becomes truly terrible when you get around to, say, git and iptables. Have you ever tried to read git documentation? It is the most useless godawful piece of nonsense this side of the Moon.

There's Google now, which means that the fundamental design issues of POSIX will probably never get fixed. "Just google it and paste in from Stack Overflow" is already standard practice, and people are already doing that for 5-10-year-old code/shell commands. What about 10 years from now: will googling best DHCP practices still find that stupid post from 2008 that never actually got resolved? How about 20 years?

I honestly have no idea how to even start fixing the problem. A proper documentation system would be a start.

[+] lioeters|5 years ago|reply
> Have you ever tried to read git documentation? It is the most useless godawful piece of nonsense

I ran "man git" for the first time ever.

https://www.man7.org/linux/man-pages/man1/git.1.html

Heey, that's actually pretty good! I don't think it's "godawful". In the second sentence it recommends starting with gittutorial and giteveryday, for a "useful minimum set of commands".

https://www.man7.org/linux/man-pages/man7/gittutorial.7.html

https://www.man7.org/linux/man-pages/man7/giteveryday.7.html

I must admit, I still occasionally (regularly?) search for "magic incantations", particular combinations of flags for sed, git, rsync, etc. But the man pages are my first go-to, and they usually do the job as a proper documentation system. It's better than most software I've worked with outside (or on top) of the OS, with their ad-hoc, incomplete or outdated docs.

[+] arexxbifs|5 years ago|reply
Completely agree. One of the problems is of course the freedom of choice a Unix system gives you. Instead of a single shell with a single set of commands, people can pick and mix. For beginners it's a nightmare but for power users it's, in general, very empowering.

Getting help on Unix commands, particularly in Linux, has always been a mess. On most Linux distros, typing "help" will give you help about the shell built-ins. Then, discovering "man", you soon find that the bundled GNU utils of course would rather you use their "info" system, which in turn may refer to a web page(!) for info.

I remember coming from the Amiga to Linux: I would not have gotten far without word-of-mouth help (and helpful computer magazine articles explaining a lot of the particularities of Unix). The Amiga, on the other hand, was a cheap home computer with an exceptionally thick manual detailing every single command clearly and succinctly.

The Open Group publishes the POSIX utility spec online[0] and also allows free downloads of it for personal use. Since I discovered it, I find myself using it much more often than man pages. I've made a little alias in bash that launches Dillo with the appropriate command page.

[0] https://pubs.opengroup.org/onlinepubs/9699919799/utilities/c...
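Presumably something along these lines; the helper below is a hypothetical reconstruction (the real alias is the commenter's own), though the URL pattern is the spec's actual one:

```shell
# Hypothetical reconstruction: open the POSIX spec page for a given
# utility in Dillo (substitute whatever browser you prefer).
posix() {
    dillo "https://pubs.opengroup.org/onlinepubs/9699919799/utilities/$1.html"
}
# Usage: posix tar
```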

[+] folbec|5 years ago|reply
The 4 things I love most in PowerShell, from the point of view of a maintainer of scripts, are the relative verbosity of commands (at least I don't have to go hunting for obscure acronyms, or recursive puns such as yacc, when I read code), auto-completion, auto-documentation of scripts (for when I have to change something), and the object pipe (as a maintainer I hate awk and regular expressions in general).

Its only defect is that they did not go with Yoda speak (Get-<TAB> is a much worse filter than NetA<TAB> when searching for Get-NetAdapter, for instance).

It's still a bit green on Linux environments, but it already beats many of the alternatives.

[+] 3np|5 years ago|reply
One of my pet peeves, on a similar note, is the inconsistency between the different ssh-family commands over whether it's -p or -P that specifies the target port.

Apart from the obvious compatibility and legacy factor, I think a major reason is that by the time someone has both enough knowledge and experience to formulate a proper solution and have felt the pain-points, they're already deep enough that they've internalized that this is The Way It Is and are somewhat comfortable with it, those annoying flags aside.

We tend to settle on the lowest common denominator, because consistency and time-to-ready trump any actual improvement. For example, I'd much prefer vim bindings in tmux, but I stopped using customizations like that completely: it turns out it's less of a hassle to just get used to the crappy standard bindings than to customize them on every new host I spin up.

If you can't get your friends off Facebook, good luck getting engineers off POSIX.

[+] ahartmetz|5 years ago|reply
Two points about bad documentation:

First, the documentation (and syntax, or lack thereof) of "tc" is significantly worse than git's. Unfortunately, ip, the network management tool you are supposed to use these days, is made by the same people, though it is somewhat less bad.

Second, take git documentation with humor: https://git-man-page-generator.lokaltog.net/

[+] ACS_Solver|5 years ago|reply
I don't find `man git` to be bad at all. Git is complex, and its man pages do the right thing by referring readers to sources for basic info, like giteveryday, as well as to in-depth guides. Individual man pages are also pretty good; see `man git-rebase`. It starts with an overall explanation of what rebase does, with examples, and then covers configuration options and flags. It's a lot of material, but it's pretty good as far as documentation goes.

GNU packages often have documentation that's bad in the typical "Linux docs are bad" way. Try `man less`. First, it commits the grave sin of having a totally useless one-line summary, which reads "less - opposite of more". Funny, but it doesn't even remotely suggest what the command does (and if you know what more is on UNIX, you surely know what less does). Or `man grep`: it's a reference page, very good for knowing what all the options do, but with no useful everyday examples, and with gems like this:

> Finally, certain named classes of characters are predefined within bracket expressions, as follows. Their names are self explanatory, and they are [:alnum:], [:alpha:], [:cntrl:], [:digit:], [:graph:], [:lower:], [:print:], [:punct:], [:space:], [:upper:], and [:xdigit:].

Self explanatory? Yes, if you used grep in the 90s. Is alnum alphanumeric or all numbers? Is alpha short for alphabet, as in what most people would intuitively call letters? What's xdigit? Extra digits? Except digits? Oh, it's hex digits. Pretty obvious that periods and commas are in punct... but also + * - {} are punct, among other stuff.
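The class names can at least be checked empirically; a small sketch with grep (note that -o, "print only the match", is a GNU/BSD extension rather than POSIX):

```shell
# Testing the "self explanatory" class names empirically:
echo 'a1!'       | grep -oE '[[:alnum:]]+'    # a1        -- letters AND digits
echo 'a1!'       | grep -oE '[[:alpha:]]+'    # a         -- letters only
echo 'deadbeefg' | grep -oE '[[:xdigit:]]+'   # deadbeef  -- hex digits, not "extra digits"
echo '+{}*,'     | grep -oE '[[:punct:]]+'    # +{}*,     -- far more than periods and commas
```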

`man tar` is extremely comprehensive, an impressive reference, but very hard to figure out if you've never used tar.

I've been recently looking at FreeBSD documentation for common commands, and the source code as well. Both are so much better than the usual GNU versions you find on Linux.

[+] sirsuki|5 years ago|reply
> There's never been a standard for naming and abbreviating flags, which means that for EACH program you will have to learn new flags.

How is this different from web pages or GUI apps? Every one is different, and a button that does one thing in one app/page does something different in another.

Have you tried to read GUI help files? They are written for 5-year-olds and provide nothing you need as a dedicated user. Have you ever had to inspect the DOM of a website to try and intuit what something does or does not do?

At least with command-line apps you usually have a --help flag or a man page.

[+] ptero|5 years ago|reply
Those articles from the mid-1990s are great to read; they are both funny and informative. Even though many of the points they make are still valid today, even more valuable is the ability to look at successes and failures with 20+ years of hindsight.

I feel the pain of the user in this particular case. But I also understand the frustration of the people who wanted to write their own smaller programs with fewer restrictions than the well-architected but highly constrained VMS architecture allowed. Ignoring such users can topple a better technology. That is why (a technically horrible) DOS spread like wildfire on personal computers, why the super-unreliable Windows (Win 95 had to be rebooted daily) killed the much more robust OS/2, etc.

We can call such users, who want capabilities quickly even if they are not fully reliable, "ignorant lemmings" or whatever, but ignoring them when a competitor does not is very risky. My 2c.

[+] rcarmo|5 years ago|reply
We had to use VAX/VMS when I was a freshman, decades ago. The “Computer Center” had dozens of VT220s hooked up to it via serial “hubs” in the main building basement (the very stereotype of a nerd dungeon).

It wasn’t half bad. As multi-user systems went, it was actually quite good and we ran a number of projects on it before moving off to PCs, Sun Workstations and Linux in general.

I remember all the staples of the era: using Kermit to upload our assignments to the thing, dialing in from home at 2400bps, hacking our way “out” to the Internet, running out of our 50MB quota due to mailing-lists and uuencoded files fetched via mail gateways.

It was a lot of fun, and AFAIK there are some working VMS emulators around that I could install on a Raspberry Pi (and likely get a faster multi-user system than what we had then for hundreds of students).

I say it was an experience worth having, but largely (functionally) indistinguishable from a UNIX machine when accessing it via teletype (glass or otherwise).

[+] codesections|5 years ago|reply
> [the name] Grep suggests to me that the author of this one had been reading too much Robert Heinlein (you grok?), or possibly --- and this is in fact quite likely --- was under the influence of psychotropic substances at the time.

As funny as this is, the actual origin for "grep" is even more interesting – and, at least to me, quite mnemonic. "grep" comes from ed, and stands for the command "g/re/p", that is

  global/regular expression/print
https://en.wikipedia.org/wiki/Grep
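The ancestral command still works; a small demonstration (guarded, since ed is no longer installed everywhere, and sed/grep are shown as its descendants):

```shell
# g/re/p in ed: "globally, for lines matching /re/, print".
printf 'grok\ngrep\nglob\n' > words.txt

# In ed itself (skipped if ed isn't installed):
if command -v ed >/dev/null; then
    printf 'g/gr/p\nq\n' | ed -s words.txt   # prints: grok, grep
fi

sed -n '/gr/p' words.txt   # the same idea, surviving in sed's syntax
grep 'gr' words.txt        # the standalone descendant

rm -f words.txt
```

Each of the three commands prints the two lines matching /gr/ and skips "glob".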
[+] golergka|5 years ago|reply
> was under the influence of psychotropic substances at the time

He's saying it as if it's a bad thing?...

[+] a1369209993|5 years ago|reply
Note that it's not "global" or "print" - the commands are the literal single letters "g" and "p".
[+] kstrauser|5 years ago|reply
I once asked my then-mentor if it was an abbreviation of "GNU rep". He was highly amused.
[+] dhosek|5 years ago|reply
I really miss VMS. Adding a command wasn't simply a matter of throwing an executable on the path; it had to be declared, with two options: you could declare a command that did its own argument parsing, or, the better option, configure the arguments externally with (IIRC) a .cld file, which provided a way of specifying all the arguments and options for the command in a straightforward way. I did this for all the TeX programs around 1990 (I think this might have been the first TeX version that didn't use separate binaries for iniTeX vs. TeX).

It was nice because the OS automatically managed things like abbreviations, so you only needed to type as much of a verbose command or option as was necessary for it to be unique. So while the command might have been `DIRECTORY /FULL`, one could just type `DIR /FU` to get the same result. The VMS help system was also fantastic and allowed for easy browsing of available commands and their instructions in a way that man+apropos only approximates (not to mention that the tradition of using relatively verbose but understandable commands and options made it supremely usable).
[+] laksdjfkasljdf|5 years ago|reply
Never understood why so many people complain about the short commands, yet nobody contributes a central list of aliases with verbose names for them.

alias copy_files_from_one_place_to_another=cp

this tells me short names aren't a problem to begin with. And everyone would have the same discoverability problems with longer ones.

[+] somesortofsystm|5 years ago|reply
I remain amazed that the #1 shipper of Unix systems today is .. Apple.

I grew up on Unix in the 80's, cut my teeth on MIPS RISC/os and then Irix and SunOS and all the joys of the very first days of Linux, oh my .. and I was fully prepared to be an SGI fanboy for the rest of my life - and then, they abandoned Irix and shipped NT. sadface

So when the tiBook came out, and it was promised to have a Unix on it, I jumped off my Indy and O2 workstations onto Apple - a company I'd never imagined, in my wildest 80's and 90's fever dreams, would become the one company still standing in the Unix wars. The tiBook was just so good, and despite all of its warts in the early days, Mac OS X's underpinnings in Darwin were just good enough to swing the decision to use it as a Unix developer's machine. And it has been solid for 20 years as a platform in that regard, although the writing is definitely on the wall for us Unix geeks who nevertheless carry a MacBook.

If only SGI had made a laptop, and not been wooed away from sanity by the NT freaks. Can you imagine if SGI (Nvidia) had made that laptop before Apple did .. ? I sure can.

[+] kristopolous|5 years ago|reply
They made a rather strange O2 laptop that never made it to production.

A number of legendary companies with great potential misstepped: GRiD, Blackberry, Be, MasPar, Thinking Machines, GO Corp, Intergraph. Heck, go back to the Evans & Sutherland LDS-1 in 1969, or to when BBN decided not to get into hardware after making the first internet hardware ever, the IMP. Or how about how SRI fumbled the ball after Engelbart's work, or SDS, which made a bunch of the NASA Apollo hardware?

The world is littered with great technology companies that didn't stick around because we determine success and failure by handshakes on the golf course.

[+] ubermonkey|5 years ago|reply
It's been said pretty often -- and I think it's true -- that the introduction and success of OS X drastically slowed the adoption of Linux as a mainstream desktop option.

I take no position on whether this was good or bad overall; it's just a statement of fact. Lots of us who would've otherwise needed to shift to Linux for LAMP development or whatever after the dot-com crash migrated to the Mac instead, because it meant a unix laptop that Just Works that also allowed us to run native MS Office, etc. That was a powerful value proposition (and remains one).

> the writing is definitely on the wall for us Unix geeks who nevertheless carry a Macbook

I do not yet see this writing.

[+] JdeBP|5 years ago|reply
Were you amazed back in the days when one of the most popular Unix flavours was produced by Microsoft? Yes, Xenix.
[+] lalalandland|5 years ago|reply
"Anyway, have you ever tried to use man? It's fine as long as you know what you are looking for. How would you ever find out the name of command given just the function you wanted to execute? You can't. "

One of my gripes with UNIX systems is how opaque they are.

[+] webreac|5 years ago|reply
I learned Unix with SunOS in 1990. The man pages were excellent. The first command I learned was "man man". There was a paper version of the man pages available in the computer room (among many other Sun documents). I only became pissed at man pages when I switched to Linux. Back then, I discovered crypt(3) by looking at the index of the man pages. One week later, I had cracked 10% of the school's passwords (creating my own dictionary).
[+] C1sc0cat|5 years ago|reply
DEC documentation was always very good but then it was a product you paid for.
[+] jiggawatts|5 years ago|reply
PowerShell takes the UNIX philosophy, cranks it up to 11, and makes commands trivially discoverable.
[+] Lammy|5 years ago|reply
An anecdote this jogged from my memory: my first Unix experience was NetBSD on a SEGA Dreamcast game console, of all things. This must have been some time in 2000 or 2001, but I remember sitting there trying every possible command I could think of, looking for one that did anything at all so I could keep going and learn more. The DOS commands I knew obviously didn’t do anything. Neither did anything like ‘help’ or ‘?’. After five minutes or so of trying commands off the top of my head I tried ‘exit’, and that was the end of my Unix experience until I mail-ordered a Mandrake Linux CD-ROM set in 2002 :)
[+] shakna|5 years ago|reply
Whilst you have to know it to be able to use it, man -k THING will let you search for a manual, today. (--apropos if long options are supported, which isn't exactly clearer.)
[+] icedchai|5 years ago|reply
"man -k" (or "apropos") will let you search for commands. It is pretty primitive though, definitely not a Google quality search.
[+] mozey|5 years ago|reply
Read a blog, book or cheatsheet? Once you know the command it's much easier to remember (and explain) than click here, then there, then double click that, etc
[+] brazzy|5 years ago|reply
Are we going to completely ignore the fact that this purported Requiem for VMS tells us in great detail why Unix suxx - but literally (and I do mean literally) not a single thing about why VMS should be mourned?

If this is what VMS advocacy looks like, I'm not surprised it disappeared.

[+] thenoblesunfish|5 years ago|reply
The complaints here don't seem to be much about how Unix works in a deep way, but rather the particular language/syntax which is used to interact with it ("rm", etc.). While that's certainly annoying in many ways, so are almost all languages in common use. You could write a lot about how horrible English is (or German, as Mark Twain famously did). But English is a useful standard - politics, inertia, and the value of a common language are forces far stronger than the fact that the language is more annoying than it could be.

Am I being trolled? Probably :(

[+] alxlaz|5 years ago|reply
The back story to this is that the DIGITAL Command Language (more or less the equivalent of a Unix shell), with its excellent filesystem-level features and the environment around it (e.g. the well-written and extremely thorough documentation), was light-years ahead of anything you could get in most Unix environments at the time. Going back to the Unix shell felt like a step backwards.

FWIW, there are plenty of complaints about things other than syntax after the fifth paragraph or so, too ;).

[+] lokedhs|5 years ago|reply
You are right. Some other complaints have not aged well either (especially things like case insensitivity, which we all know is extremely difficult to get right once you leave the confines of English).

Unix has a lot of issues on a lower level, but very few people discuss that, since it's a complicated topic.

[+] 0-_-0|5 years ago|reply
Quote from the article:

Douglas Adams may well have had Unix in mind when he described the products of the Sirius Cybernetics Corporation thus: "It is very easy to be blinded to the essential uselessness of them by the sense of achievement you get from getting them to work at all. In other words --- and this is the rock solid principle on which the whole of [its] Galaxy-wide success is founded --- their fundamental design flaws are completely hidden by their superficial design flaws."

[+] fredsmith42|5 years ago|reply
I just finished a contract at a client that still has a live OpenVMS system running a mission critical application. It was interesting to compare my 25 year old fond memories of the OS with the practical experience of using it on a daily basis. No command line history, crazy long file paths, and case insensitive passwords were a shock. On the other hand, the built in DCL programming language was a saving grace. My wife, amazingly, found my decades-old copy of the "Writing Real Programs in DCL" book. Because of that, I looked like a VMS superhero.
[+] gcc_programmer|5 years ago|reply
I think the author is arrogant and closed-minded. He underestimates the complexity of software systems and strives for some utopian "elegance of design" which I think doesn't exist. The only thing that matters is to deliver results and move forward. Requirements change: back in 1990 the internet existed in a much, much smaller form, and the WWW was just about to be invented. Linux had yet to be written by Linus. Unix and Linux survived and adapted; VMS (or whatever it was named) didn't. Also, the argument about "speaking English" is again arrogant and closed-minded: not all of us are native English speakers. To me, rm or delete or del or banana doesn't really matter...
[+] normanmatrix|5 years ago|reply
I have a feeling that Windows Nano Server w/ PowerShell + .NET Core might make this text relevant again soon. Having a consistent OS that scales from containers to servers and desktops is a big benefit in corporate environments. Now, if it matched the performance of an Alpine system, played well with WSL, and Microsoft managed to push the major open-source stuff toward compatibility, I'd give Linux 5 years. But only time will tell...
[+] tromp|5 years ago|reply
> There is no effective way to find out how to use Unix, other than sitting next to an adept and asking for help

That's a rather unfair claim. Hundreds of books and tutorials have been written to introduce people to Unix...

[+] TheOtherHobbes|5 years ago|reply
Now.

This was written decades ago, when computers mostly lived in universities and the only way to learn Unix was, indeed, by sitting next to an adept.

[+] fouc|5 years ago|reply
You're kind of making the article's point. What's the need for hundreds of books and tutorials if the system doesn't provide access to great documentation with just a "help" command?
[+] ubermonkey|5 years ago|reply
My first corporate job in computing, in 1994, was at TeleCheck. They used an all-VMS environment, though by then the machines themselves were about 30% Alphas rather than actual VAXes.

The denial about the platform's future was SUPER strong. TeleCheck IT and software development, at least in those days, was staffed by people who would either move on quickly or stay forever. They mostly hired right out of college, too, so the lifers were people who had never worked anywhere else. (This kind of employment monoculture is pretty destructive, IMO -- get some new ideas in there!)

This created a super weird environment. Everywhere else I worked back in those days was rife with industry publications, curiosity about how other systems worked, excitement about developments in software or networking even if they were on stacks or platforms other than whatever the site used, etc.

Not so there. I think this may have been mostly because to read, say, InfoWorld in 1994 would have made it much harder to avoid how narrow a niche they were occupying. The nature of the systems and software there meant people who stayed were gaining skills not useful anywhere else; pretty much everything (even the database system) was built in-house.

People were out the door constantly, going to other big tech employers in the area to either (a) pick up more marketable skills or (b) pick up a huge bump in pay. That's what I did after 2 years. (Turnover in the dev group was something like 35% a year, which is HELL on institutional knowledge.)

All that said, you could see SOME of the appeal of staying on VMS from their POV. Clustering was a big deal, because downtime cost dollars. File versioning in the OS -- which I still haven't seen implemented the same way anywhere else -- was fucking genius and made rolling back a bad release almost trivial.

But when you decide to ignore where the market is going, and stay on a doomed platform, there are real costs to pay.

[+] soco|5 years ago|reply
I remember being upset when we moved off OpenVMS (because of market pressure), but it was so long ago that I don't even remember why I was upset. I definitely resented the incredible gamble they took on Itanium, though, even before the gamble crashed.
[+] ninefathom|5 years ago|reply
Meh... I think the author of this article was cherry-picking a bit. VMS is no cake-walk either, and has its fair share of idiosyncrasies.

So you're a system administrator, and you want to change a user's password?

$ set default sys$system

$ mcr authorize

UAF> modify jimbob /pass="whatever"

UAF> exit

...while on just about any *nix, one might simply:

# passwd jimbob

<< enter the new pw twice when prompted >>

For every example of the original author complaining about *nix, I could probably find a counter-example of VMS being awful. Really, I think it reduces down to this: people prefer to stick with what they're used to, and what they were trained on. Anything else is "awful" and "inferior."

[+] gnufx|5 years ago|reply
Some context might be useful.

Starlink was one of two somewhat similar subject-specific networks set up in a period of enlightenment by the UK Science Research Council (as I think it was then) operating in the 1980s, particularly for largely interactive analysis. The other was an un-named and ill-publicized one for nuclear structure as opposed to astronomy. I don't know about Starlink, not being an astronomer, but I guess it was also rather ahead of its time, like the nuclear structure one.

Obviously it had changed in astronomy by then, but I didn't see the attitude that physicists (and later, structural biologists) shouldn't write software or build the necessary hardware before or around that time. We had people and job titles like "physicist-programmer", and did what was necessary, and we did fit the facilities to the problem (when not working at foreign labs). The software systems were designed to be user-extensible anyhow, in our case.

Personally I was glad to have the lightning-fast interactive graphics system on the nuclear structure GEC systems, and not the stodgy performance -- even when they weren't running something like Macsyma -- of all the VAXen I used. VMS was somewhat inscrutable to a physics hacker anyhow, so I don't understand why Unix made it more difficult, though I hold no particular affection for Unix.