
LINUX is obsolete (1992)

78 points| edwincheese | 11 years ago |groups.google.com | reply

74 comments

[+] TaylorAlexander|11 years ago|reply
This is my favorite part:

"Linus Benedict Torvalds
In article <[email protected]> I wrote:
>Well, with a subject like this, I'm afraid I'll have to reply.

And reply I did, with complete abandon, and no thought for good taste and netiquette. Apologies to ast, and thanks to John Nall for a friendly "that's not how it's done"-letter. I over-reacted, and am now composing a (much less acerbic) personal letter to ast. Hope nobody was turned away from linux due to it being (a) possibly obsolete (I still think that's not the case, although some of the criticisms are valid) and (b) written by a hothead :-)

Linus "my first, and hopefully last flamefest" Torvalds"

This HN comment posted from a Linux machine in 2015. :)

[+] amirmc|11 years ago|reply
I find it interesting that this post can be taken two ways.

1. It's clear that Linux 'won' in the marketplace so we can all laugh at how wrong this guy was and the curiosity of these 'microkernel' things and that portability stuff. Lols all around.

2. We've reached a point where ideas are gaining ground about immutable infrastructure, and people are talking more about things with similarities to microkernels called unikernels [1] (and where they might be used [2]). Linux isn't going anywhere, but these new approaches have value and were being discussed as long ago as 1992. Of course, the author got things wrong, but that's par for the course. It's more interesting to see the ideas that are resurfacing. Incidentally, ARM is a RISC chip and now dominates Intel on mobile devices.

I prefer the second approach. So to anyone poking fun at the author, please consider that maybe this is one aspect of living in the future [3], albeit looking much further ahead than most (market timing is always underrated, and academics tend to think further ahead than most people). We can also remind ourselves that 'Better' is a tricky and subjective thing to define (cf. VHS tapes vs Betamax).

[1] http://queue.acm.org/detail.cfm?id=2566628

[2] http://nymote.org/blog/2013/introducing-nymote/

(disclaimer: I'm involved with both the above projects)

[3] http://paulgraham.com/startupideas.html

[+] asuffield|11 years ago|reply
So we now live in a world where both monolithic and "microkernel" (hybrid) systems are commonplace - there are a bit shy of a billion devices running Linux (Android) and no shortage of Windows and OS X hosts around. What can we learn from this?

Well, it doesn't really seem to matter whether you use a monolithic or microkernel design. They both work well enough. Driver development is hard for reasons that have got nothing to do with this, and doesn't appear to change much between the platforms. Success/popularity of operating systems appears to be determined by other factors.

I'm not sure why people feel this is still worth debating. All the evidence points to it being completely unimportant.

[+] qznc|11 years ago|reply
> It's clear that Linux 'won' in the marketplace

Arguable. There might be more instances of QNX and L4 running than of Linux. This is pretty much impossible to measure. Remember that Android smartphones have a second OS underneath or beside the Linux kernel.

Apple and Microsoft use a hybrid approach, so they just do not participate in the debate.

> OKL4 shipments exceeded 1.5 billion in early 2012, mostly on Qualcomm wireless modem chips. Other deployments include automotive infotainment systems.

http://en.wikipedia.org/wiki/L4_microkernel_family#Commercia...

[+] hristov|11 years ago|reply
I would very much like to laugh at Andy Tanenbaum, and we all should. He deserves to be ridiculed because he was being an a-hole. He was putting down another programmer based on theoretical mumbo jumbo when he could have proven his theories with actual programming.

I think it is obvious that any scientist who can prove his theories in a practical way should do so rather than flame and criticize others. In areas where theories are harder to prove, like astrophysics, social science, economics, etc., I guess people can debate and flame. We are all used to economist flame wars by now.

But in areas where theories are practically provable, such criticisms are useless and idiotic. If Tanenbaum thought microkernels were so great he should have made a usable microkernel OS.

This also shows a very negative practice in the software industry which should be discouraged: criticizing the thing that works now in favor of some shiny new project that is still in development but will be ever so great when it comes out. It is a very easy pitfall to fall into, because the thing working now usually has some problems and disadvantages, while the shiny new thing that does not exist yet is usually perfect in every way. But in practical matters, one usually does better going with the thing working now. Worst of all, this practice of worshiping the nonexistent solution is usually used to strike down newcomers and disruptors.

By the way, unikernels have absolutely nothing to do with microkernels. So no, the day of the microkernel has not arrived, and there is no indication that it will come any time soon. Maybe there will be a practical working microkernel-based OS some day. I have no idea what will happen in a thousand years or ten thousand years. But if that happens, it will be more of a coincidence than evidence that Andy Tanenbaum is some kind of genius living in the future. Given a long enough time frame, a lot of predictions about the future will eventually happen in one way or another.

Regardless, Linus never disputed the fact that microkernels are theoretically better. He just knew how impractical they were because he actually buckled down and did the work to make a useful OS instead of bloviating on theory.

[+] ginko|11 years ago|reply
>We've reached a point where ideas are gaining ground about immutable infrastructure, people are talking more about things with similarities to microkernels called unikernels [1] (and where they might be used [2]).

Not only this. Amoeba, Tanenbaum's research OS, is a distributed operating system, meaning that many devices could be used as a single system. This would be incredibly useful in our current age, where everyone has a laptop, a smartphone, a tablet, and lots of other devices that need to be synced.

[+] jheriko|11 years ago|reply
> It's clear that Linux 'won' in the marketplace

it depends a lot on what you mean by won and which marketplace you are talking about.

as a desktop os linux is essentially nonexistent, with an ever shrinking 1.5% or so of the market share, as more and more regular joe consumers get machines with windows or os x preinstalled at a much faster rate than expert-level geeks are installing linux.

as a software developer targeting the mainstream it is rare that targeting linux is a profitable venture, because nearly no customers use it, and those that do don't spend money on software in the same way as someone living in the Microsoft or Apple ecosystems. OS X also has a tiny market share (around 4% iirc), but people who are willing to pay over the odds to have a pretty workstation seem also not afraid of spending money on software.

as a server platform Linux dominates, despite the best efforts of MS and Apple to market slightly uglier versions of their consumer-level desktop OS as a server product. it's easier to use remotely, and can easily be configured not to rely on heavy UI features - there are even flavours that cater better to business philosophies on 'stability' (i use that term loosely), like RHEL and CentOS. stability and open source also go hand in hand: because you can fix bugs, or pay people to fix bugs, you can throw money at problems in a way which is impossible with the mainstream OSs.

as a software developer it makes sense to target linux first for your big expensive server product. if you don't then you are cutting out your customers...

i don't think any of these situations has much to do with the underlying technology. it's much more to do with the OS and what it provides than how it is built to achieve those aims.

[+] krick|11 years ago|reply
> Linux 'won' in the marketplace so we can all laugh at how wrong this guy was

Oh, it has been a long time since 1992, and these posts by Tanenbaum are pretty famous (actually, I'm surprised this still appears on HN). So we did laugh already. Later we suddenly stopped laughing, and it was more like your "second approach". But that also was quite a while ago. The next wave of philosophical thought in techno society was reflecting on all that "worse is better" stuff, "MIT vs Berkeley", "Lisp is older than C", "ML is 40 years old". Because, really, it's all about the same idea. And posts about "The Cathedral and the Bazaar" and how the dot-com bubble destroyed the notion of sound architecture are pretty much about the same thing.

So for me it isn't about any new "lessons" to learn from this anymore. Just a painful reminder of "sooner or later we'll have to throw all this tremendous work away and start anew… shall we?".

[+] caster_cp|11 years ago|reply
Tanenbaum said that Torvalds would not get a good grade in his course. Epic. "I still maintain the point that designing a monolithic kernel in 1991 is a fundamental error. Be thankful you are not my student. You would not get a high grade for such a design :-)"

Perfect display of how universities are good at judging people on how well they can "play the game" (which usually involves conforming to whatever frame of mind your professor thinks is right).

[+] carlosrg|11 years ago|reply
Well, a microkernel is harder to implement than a monolithic one, and from a theoretical standpoint it is a better design; IMO it deserves a higher grade in an operating system design course. Let's not mythologize the 1992 Linux kernel, which probably wasn't _that_ good.
[+] ezequiel-garzon|11 years ago|reply
There seems to be a response by "the" (please rise) Ken Thompson:

"viewpoint may be largely unrelated to its usefulness. Many if not most of the software we use is probably obsolete according to the latest design criteria. Most users could probably care less if the internals of the operating system they use is obsolete. They are rightly more interested in its performance and capabilities at the user level.

"I would generally agree that microkernels are probably the wave of the future. However, it is in my opinion easier to implement a monolithic kernel. It is also easier for it to turn into a mess in a hurry as it is modified."

[+] DanBC|11 years ago|reply
[+] sambeau|11 years ago|reply
Before we get into - "Oh not this again!" and "Duh! but Linux Won!" and "Microkernels are still a better design"...

Remember, today's lucky 10,000...

http://xkcd.com/1053/

Actually, I always enjoy reading this whenever it comes up. I was just about to start a university computing science degree course at this time and was reading Tanenbaum in preparation. This was a debate that was alive for years throughout the computing community I was part of.

In the end reality came down to what always wins in computing (and in life too).

  Running code beats design. Design is always 'better'.
[+] jacquesm|11 years ago|reply
There's a variation to that. Unless your company is a very successful multinational chances are that a stranger picked at random has not heard about your product. Makes you think a little longer about terms such as 'market saturation', most products don't even go near there.
[+] jacquesm|11 years ago|reply
The problem with this whole debate is that Linus was 'more wrong' than Tanenbaum, and Tanenbaum made plenty of mistakes with Minix which also made it more of a macrokernel than a true microkernel.

A true microkernel does one thing and one thing only: pass messages.

Now, that's an ideal, and in the real world you don't get to have your ideals realized, so rather than this spherical cow you're going to have to add in a few more system calls to make it work. But you'll end up with something a lot closer to Plan 9 or QNX, and compared to those Linux is very, very old hat indeed - it's basically a re-run of the 70's state of the art with a whole pile of modern-day hardware drivers and other goodies thrown in.
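The "pass messages and nothing else" ideal can be caricatured in a few lines. A toy sketch (all names hypothetical, not any real kernel's API) in which the kernel's sole primitive is moving a message from one task's mailbox to another's, while "servers" like a filesystem live outside it:

```python
import queue

# Toy sketch of the microkernel ideal: the "kernel" owns exactly one
# primitive -- deliver a message between task mailboxes. Drivers and
# filesystems would be ordinary tasks talking through it.
class ToyMicrokernel:
    def __init__(self):
        self.mailboxes = {}          # task name -> queue of pending messages

    def register(self, task):
        self.mailboxes[task] = queue.Queue()

    def send(self, src, dst, payload):
        # The single "system call": move a message between tasks.
        self.mailboxes[dst].put((src, payload))

    def receive(self, task):
        return self.mailboxes[task].get()

# Usage: a "filesystem server" task answers a read request from an "app".
k = ToyMicrokernel()
for t in ("app", "fs"):
    k.register(t)

k.send("app", "fs", {"op": "read", "path": "/etc/motd"})
src, req = k.receive("fs")            # fs server picks up the request
k.send("fs", src, {"data": "hello"})  # ...and replies by message, too
_, reply = k.receive("app")
print(reply["data"])                  # → hello
```

Everything interesting (naming, scheduling, who may talk to whom) is deliberately left out here; that is exactly the gap between the spherical cow and QNX or Plan 9.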

Tanenbaum's biggest mistake was to try to monetize Minix through Prentice-Hall. If he'd just tossed it out there it would have picked up steam a lot quicker, but likely he too had bills to pay, and his expenses at the time were probably a lot larger than Linus's, and so history was made.

So, Linux was obsolete, but so was Minix, and the future as we could have had it is still waiting to happen. And when it does, you'll finally appreciate just how obsolete Linux was back in '92, and how much more obsolete it is today.

Until then it's like democracy: not perfect but the best we've got (without shelling out lots of license fees for something better).

[+] wobbleblob|11 years ago|reply
> Tanenbaum's biggest mistake was to try to monetize Minix through Prentice-Hall

You make it sound like his capitalist motives worked against him.

Writing and publishing educational material was part of his job. Prentice-Hall had been publishing his books since the 1970s. I'm sure it seemed like the obvious way to publish his educational material - especially since Minix wasn't a standalone thing; it came with a book. Perhaps he even had an exclusive contract with PH. It probably didn't even occur to him that there was another way to publish the software.

Also, in 1991, making software available for download wasn't as straightforward as it is today. There was no WWW, and very few Europeans had access to the internet. For me, a book with a disk would have been the only reasonable way to get it.

[+] icebraining|11 years ago|reply
If cost was the only problem, why didn't Hurd take off? Did Minix not suffer from the same problems?
[+] ghshephard|11 years ago|reply
Something to meditate on any time there is an "Appeal to Expert" - particularly on an issue that is forward-looking.

Also, I love how confident he was with regard to this:

While I could go into a long story here about the relative merits of the two designs, suffice it to say that among the people who actually design operating systems, the debate is essentially over.

[+] qznc|11 years ago|reply
The funny thing is that I would mostly agree with him. The important detail is "Among the people who actually design operating systems." Linus would agree that Linux is not designed. It evolved. Very few OSs are designed these days.

The QNX microkernel is quite successful. L4 also seems to do well. The Mach microkernel is still around. Minix is still around too, although its reliability approach (similar to Erlang's: let processes/drivers fail and restart them) has not gained mindshare. Windows has been considered a hybrid since NT. Likewise, XNU/Darwin is Apple's hybrid.

Maybe the current state is more like: Neither micro nor monolithic won, but we know the tradeoffs now and an OS designer can decide per feature. For most applications the difference does not matter, though. Most applications build on a higher level platform (JVM, iOS, Browser, etc).
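The Minix-style "let it crash, then restart it" reliability approach mentioned above can be caricatured in a few lines. A hypothetical supervisor sketch (not Minix's actual reincarnation server, and the driver is made up) that restarts a failed "driver" instead of letting the failure propagate:

```python
# Caricature of restart-based reliability: run a driver, and on failure
# build a fresh instance and retry, up to a limit, instead of crashing
# the whole system. (Hypothetical sketch, names invented.)
def supervise(make_driver, max_restarts=3):
    restarts = 0
    while True:
        driver = make_driver()            # fresh instance each attempt
        try:
            return driver()               # driver ran to completion
        except Exception:
            restarts += 1
            if restarts > max_restarts:
                raise                     # give up: escalate the failure
            # otherwise loop and "reincarnate" the driver

# Usage: a flaky driver that crashes twice before succeeding.
attempts = {"n": 0}
def flaky_disk_driver():
    def run():
        attempts["n"] += 1
        if attempts["n"] < 3:
            raise IOError("transient controller fault")
        return "read ok"
    return run

result = supervise(flaky_disk_driver)
print(result)                             # → read ok
```

The design point is isolation: this only works if a driver crash is containable, which is what running drivers as separate user-space processes buys a microkernel.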

[+] Nux|11 years ago|reply
Also this (later in the thread), where he was pretty wrong:

"Making software free, but only for folks with enough money to buy first class hardware (x86) is an interesting concept. Of course 5 years from now that will be different, but 5 years from now everyone will be running free GNU on their 200 MIPS, 64M SPARCstation-5."

[+] RedNifre|11 years ago|reply
How feasible is it to radically refactor an operating system? Would it be possible to gradually incorporate all the insights of operating system research into Linux/BSD/$otherProductionReadyOS or is it always necessary to start again from scratch?
[+] justincormack|11 years ago|reply
Minix 3 is to a fairly large extent a refactor of NetBSD to have a microkernel base; OS X kind of did a similar thing with FreeBSD. There are also ways of running old OSs on top, e.g. Linux on Genode.

NetBSD is a good starting point: the drivers are portable (via rump kernels; e.g. Genode uses this for their filesystem drivers), so you can reuse them in another OS, plus it is reasonably simple as OSs go, plus the BSD license is friendly.

There were a bunch of related talks at https://operatingsystems.io/

[+] Danieru|11 years ago|reply
Through refactoring, Linux has been transformed into a real-time kernel using the linux-rt patch set. The patch set is big and aggressive, and is only slowly making its way upstream.

Still it proves that yes, refactoring can work. Real-time is one of those super-duper invasive features. So if it works for real-time we should be able to refactor in other features.

[+] sgt|11 years ago|reply
The following statement puts things in perspective, doesn't it?

"Speeds of 200 MIPS and more are likely in the coming years."

Now, a modern Mac Pro can do 488250 million instructions per second. Needless to say, we've made progress over the last couple of decades.

[+] jacquesm|11 years ago|reply
Forget about the Mac Pro (and isn't it the intel CPU that does those instructions to begin with?), take any modern graphics card.
[+] jestinjoy1|11 years ago|reply
On the portability aspect, why is Minix better than Linux? I didn't get that point.
[+] icebraining|11 years ago|reply
IIRC, Linux at the time had plenty of i386-specific code, while Minix could be compiled for multiple architectures.
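The portability difference being described comes down to where the hardware assumptions live. A hypothetical sketch (neither kernel's real code, all names invented): CPU-specific details behind a small interface, so the core never names a particular architecture:

```python
# Hypothetical sketch of the portability point: the portable core calls
# an architecture interface, while the CPU-specific details live in one
# swappable class per architecture. (Names and pseudo-ops invented.)

class I386ContextSwitch:
    def save_registers(self):
        return "pusha; save cr3"          # x86-flavored pseudo-steps

class ArmContextSwitch:
    def save_registers(self):
        return "stmfd sp!, {r0-r12, lr}"  # ARM-flavored pseudo-steps

def schedule(arch_ops):
    # The portable scheduler never mentions a CPU: porting the OS means
    # writing one new arch_ops class, not rewriting the scheduler.
    return f"scheduling after: {arch_ops.save_registers()}"

print(schedule(I386ContextSwitch()))
print(schedule(ArmContextSwitch()))
```

Early Linux, by this account, had the i386-specific parts woven through the core rather than isolated like this, which is what made porting it hard at the time.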
[+] Narishma|11 years ago|reply
I believe at that time, Linux only ran on x86.
[+] avinassh|11 years ago|reply
It's amazing to see how confidently a 20-year-old university student was debating a well-known OS researcher.

Btw, Linus called DOS "DOG":

> You mention OS/360 and MS-DOG as examples of bad designs as they were hardware-dependent, and I agree.

[+] sergiolp|11 years ago|reply
I made some small contributions to GNU Hurd years ago, but still, this made me giggle:

Of course 5 years from now that will be different, but 5 years from now everyone will be running free GNU on their 200 MIPS, 64M SPARCstation-5.

[+] josteink|11 years ago|reply
Looking back, it's amazing how much traction the GNU userland tools gained and how little (zero?) their kernel gained.

I'm not going to call it a wasted effort, because I appreciate there being other "Unix"-systems around in case Linux ever goes bad, but at this point it seems to be a whole lot of work invested for nothing.

Maybe things will change 5 years down the road, when people have abandoned systemd, dmd has matured, and the best way to run it is on Hurd - but I'm not putting any money on that bet ;)

[+] krasnov|11 years ago|reply
Is it still possible to access the old comp.os.minix archives through NNTP somewhere? Google Groups has the old messages, but I can't find a way to read them in a local email client.