item 19992818

Systems Software Research is Irrelevant (2000)

71 points | Philipp__ | 6 years ago | doc.cat-v.org

78 comments


geofft|6 years ago

"What happened" was two-fold:

1. Starting especially a few years before 2000 and continuing today, the software industry is quite profitable, pays well, and has lots of openings, while the academic job market in systems research continues to pay poorly and has far more limited openings. So if you want to do systems software research while also having an enjoyable quality of life, you might as well go to a company and get paid well instead of spending your days writing a thesis and grant proposals.

2. Computer science is a field where the cost of basic research equipment is low (a computer), and the more interesting research environments are generally beyond the scale of academia (tens of thousands of hardware nodes, hundreds or thousands or more QPS of production load, etc.). That makes it quite different from, e.g., biology or high-energy physics on one end, where you usually need to be in academia to get access to the equipment, and from, e.g., mathematics (including theoretical CS) and literature on the other, where it doesn't matter where you are; in systems research you only get access to the equipment by being in industry.

That doesn't mean that systems software research, done in industry, is (or was or will be) irrelevant; it means that the narrower definition of "research" as "that which is done in academia" is inaccurate (including industry with the trappings of academia, i.e., people at Google or Bell Labs writing papers in academic journals and hiring people with Ph.D.s). Systems software research happens in industry and is quite relevant to itself.

yingw787|6 years ago

Commercial research is quite different from academic research, which I think is what Rob may be referring to.

Commercial research needs to keep in mind the existing legacy systems used by the sponsor. Innovations are more evolutionary instead of revolutionary as the field matures. They may be more tailored to observable pain points of the research sponsor. They may not be widely shared if they yield results providing a competitive advantage. While it may not demand immediate returns, commercial research does have an axe to grind. All of this hampers advancement in the field of computer science in general.

I also don't know if there's any kind of commercial research on the scale of Xerox PARC or Bell Labs. I can't think of any off the top of my head. Microsoft and Google do some pretty neat research, but I don't think they've shipped anything quite on a similar scale.

There's really no organization hiring the best talent to work on the kind of black swan events commercial research may miss. For example, I think it'd be cool to have a microcode-based OS; I've heard it would help with keeping operating systems secure. But who would fund it, and who would work on it? Right now it doesn't look like anybody would, and that might be what Rob is concerned about.

pjc50|6 years ago

> more interesting research environments generally are beyond the scale of academia

I think the real impediment to OS research is deployment. If your idea isn't compatible with one of the existing OSs, in such a way that it can run a web browser, then nobody's going to use it. Heck, even Windows Phone couldn't get adoption. OS ideas that require people to completely rewrite applications and interaction paradigms are non-starters no matter what benefits they offer - unless they can fulfil a need that can't be fulfilled any other way. So quite a lot of work goes into bypassing the OS entirely for hardware-specific single-program networking applications, and everyone else has to stick with their existing paradigms.

wsetchell|6 years ago

Re 2): Biology and physics are both really expensive (think particle accelerators and the Human Genome Project). We could dramatically increase academic computer science funding to let researchers tackle those large, interesting problems.

cperciva|6 years ago

Virtualization. Capabilities. Kernel-bypass networking. Static code analysis. Verified-pointer microarchitectures. Coverage-guided fuzzing.

Systems software research has come a long way since 2000.
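One of those techniques is small enough to sketch. Here is a toy, hedged illustration of coverage-guided fuzzing (the idea popularized by AFL and libFuzzer): mutate inputs, and keep any input that reaches code the corpus hasn't reached before. The target, mutation strategy, and set-of-branches coverage representation are all simplified assumptions, not any real fuzzer's API:

```python
import random

def mutate(data: bytes) -> bytes:
    """Flip one random byte (a deliberately tiny mutation strategy)."""
    if not data:
        return bytes([random.randrange(256)])
    i = random.randrange(len(data))
    return data[:i] + bytes([random.randrange(256)]) + data[i + 1:]

def fuzz(target, seed_inputs, iterations=20000):
    """Keep any mutated input that reaches coverage the corpus hasn't seen."""
    corpus = list(seed_inputs)
    seen = set()
    for inp in corpus:
        seen |= target(inp)
    for _ in range(iterations):
        child = mutate(random.choice(corpus))
        cov = target(child)
        if cov - seen:            # new coverage -> interesting input, keep it
            seen |= cov
            corpus.append(child)
    return corpus

# Toy target: returns the set of "branches" an input reaches.
def toy_target(data: bytes):
    cov = {"start"}
    if data[:1] == b"A":
        cov.add("branch_A")
        if data[1:2] == b"B":
            cov.add("branch_AB")
    return cov

random.seed(0)
corpus = fuzz(toy_target, [b"xx"])
```

Starting from a seed that hits no branches, the loop typically discovers an input beginning with `A` because that input gets rewarded with a place in the corpus; real fuzzers add instrumentation, corpus scheduling, and crash detection on top of this skeleton.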

jascii|6 years ago

Virtualization: see VMS (1977). Kernel-bypass networking: see microkernels (1967). Need I go on?

eeZah7Ux|6 years ago

Virtualization and static code analysis existed before 2000.

Granted - the tooling improved.

tlb|6 years ago

The last 19 years of systems software research have not refuted Rob's thesis. Industry has made incremental progress; academia has written papers but not built much that people want to use. Despite massive increases in graphics processing power, desktop UIs are still about the same as in 2000, just with more shininess.

And the number one thing that could have gotten better in the last 19 years but didn't: security.

polskibus|6 years ago

There's a lot of churn in tech - people jump to new stacks for job prospects instead of solving hard problems in existing frameworks. This is part of the reason why tech keeps reinventing the wheel instead of delivering productivity improvements as perceived from the business perspective (customers' business needs).

GuiA|6 years ago

> desktop UIs are still about the same as in 2000, just with more shininess.

That is because the majority of people fundamentally do the same things with computers as they did 20 years ago: browse the web, edit pictures and videos, put together presentations, lay out documents, work in spreadsheets, etc.

Of course now your home videos are in 4K instead of 320p, and webpages are 10MB of JS instead of 10k of text... but these are changes in scale, not in kind.

However, shiny features are what attract people to your platform, so we get shininess (never mind if functionality actually gets lost in the process).

The perfect illustration of this for me is George RR Martin, a professional writer of indisputable success, doing all of his writing work on a 1980s workstation with WordStar 4.

i80and|6 years ago

> And the number one thing that could have gotten better in the last 19 years but didn't: security.

In 2000, people mostly still used Windows 9x. A single-user system with no sandboxing and no built-in firewall.

wyldfire|6 years ago

> And the number one thing that could have gotten better in the last 19 years but didn't: security.

This is an astonishing claim: what makes you think it hasn't gotten better? It's gotten a LOT better since 2000.

microtonal|6 years ago

> New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture -- and some do -- the first thing to build is the Unix emulation layer.
>
> How can operating systems research be relevant when the resulting operating systems are all indistinguishable?
>
> [...]
>
> Linux is the hot new thing... but it's just another Unix.

Although they are rooted in FP notions of purity and immutability, I would say that NixOS and Guix try to fundamentally change operating systems.

joker3|6 years ago

What do UIs have to do with systems research? And given how much easier it is to use an interface that you're already familiar with, isn't it a good thing that they've mostly stayed the same?

virgilp|6 years ago

> academia has written papers but not built much that people want to use.

Does "started in academia" not count? Because that'd give you easy counter-examples, e.g. Scala, Spark.

eeZah7Ux|6 years ago

The disconnect between research and development* has been constantly increasing.

* "development" as in making a technology usable, not software development

RcouF1uZ4gsC|6 years ago

I would argue that systems research has been incredibly relevant. First consider programming languages. Even though languages such as Java, C++, and C# are all widely used, they are very different languages from what they were in the early 2000’s. You can see the influence of academic research, especially from functional languages (monads), on these languages. Also, Rust is an exciting new language enabled by the systems software research of the past.

If you look at networking, there has recently been a move toward new protocols (QUIC) that resulted from systems research into the deficiencies of TCP. Another area is consensus algorithms. We now have large-scale real-life deployments of consensus algorithms, for example Spanner and etcd.

The late 90’s and early 2000’s were a weird time when hardware was improving so fast, and taking software along for a free ride, that a lot of software was good enough. Now, as we bump up against the end of Moore’s Law, we will see more research and real-life usage of multicore and heterogeneous computing, and of libraries, languages, and operating systems that try to make that easier.
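The consensus deployments mentioned above rest on a small set of rules; etcd, for instance, is built on Raft. A hedged, minimal sketch of just Raft's RequestVote rule in Python - the names and structure are illustrative, not etcd's actual code:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Node:
    current_term: int = 0
    voted_for: Optional[int] = None
    log: List[Tuple[int, str]] = field(default_factory=list)  # (term, command)

    def handle_request_vote(self, term, candidate_id,
                            last_log_index, last_log_term):
        # A higher term means our vote for the old term no longer counts.
        if term > self.current_term:
            self.current_term = term
            self.voted_for = None
        # Reject candidates from stale terms outright.
        if term < self.current_term:
            return False
        # Raft's election restriction: only vote for candidates whose log
        # is at least as up-to-date as ours.
        my_last_term = self.log[-1][0] if self.log else 0
        my_last_index = len(self.log)
        candidate_up_to_date = (
            last_log_term > my_last_term
            or (last_log_term == my_last_term
                and last_log_index >= my_last_index)
        )
        # Grant at most one vote per term.
        if self.voted_for in (None, candidate_id) and candidate_up_to_date:
            self.voted_for = candidate_id
            return True
        return False
```

A node thus votes for at most one candidate per term, and only for one whose log is no older than its own - the two properties that let a majority of votes imply a safe leader.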

lallysingh|6 years ago

While he mentions languages & networking up top, he's not really complaining about those. They have advanced, and continue to advance, nicely. OSs have definitely gone into small-increment-improvement mode.

dgellow|6 years ago

> The late 90’s and early 2000’s were a weird time where the hardware was improving so fast and taking software along for a free ride that a lot of software was good enough.

Would you say that wasn’t the case during the past 20 years (2000-2019)? Or do you consider all that period to be “early 2000’s”?

w8rbt|6 years ago

"Linux may fall into the Macintosh trap: smug isolation leading to (near) obsolescence." Well, that did not happen.

apta|6 years ago

This is how you end up with a language like golang.

zzzcpan|6 years ago

Well, to be fair, there is almost no actual scientific programming language research to begin with, so anything goes.

JdeBP|6 years ago

Would research into side-channel attacks count under M. Pike's criteria?

Granted, whilst it is system-level it is not system software. And it has not yielded demos that people have regarded as cool, rather ones that have been received by some as horrifyingly worrying.

But it has definitely influenced industry.

tlb|6 years ago

Great work has been done discovering side-channel attacks, but on the other hand most side channels have been created by sloppy microarchitectural design since 2000. So I dunno if that's progress. If we see some CPUs in the next few years that are both fast and not vulnerable, I'll count that as progress.

wayoutthere|6 years ago

This article predates it, but OS X (particularly after it mutated into iOS) represents probably the biggest source of systems innovation in the two decades after this article. Apple is very secretive, so their systems research often isn’t known outside the company until it’s actually going into a product.

OS X was modern for its time, but where they’ve really pushed the envelope is with iOS. They can simply move faster at scale than anyone else because they almost entirely own the IP for both the software and all major hardware components and can pivot on a dime compared to market-based coordination.

naasking|6 years ago

> This article predates it, but OS X (particularly after it mutated into iOS) represents probably the biggest source of systems innovation in the two decades after this article

There was almost nothing innovative about OS X, even when it came out. It was just packaged and marketed very well. Objective-C and NeXTSTEP were a userland improvement over typical C userlands, but that's not saying much.

> OS X was modern for its time

It really wasn't. The Mach "microkernel" was from outdated 80s research. It's bloated, slow and inflexible compared to the state of the art at the time.

kllrnohj|6 years ago

At launch, Android was way more innovative at a systems level, with per-application UID sandboxing, a permission system, and system-integration capabilities (broadcasts, services, intents, etc...).

iOS was innovative at a UI/UX level, definitely. But I can't really think of anything they did at a systems level that was at all innovative?

jascii|6 years ago

I always understood that the "systems" component of OS X was called Darwin, which is an open source project and anything but secretive. Did something change?

icedchai|6 years ago

OS X is NeXTStep at the core. It dates back to the 80's.

elchief|6 years ago

Hadoop? Spark? Are those not systems software?

syn0byte|6 years ago

Beowulf? MPI? Were these not already things 30 years ago?

https://github.com/intel/spark-mpi-adapter

Oh look, a paper: "For example, a recent case study found C with MPI is 4.6–10.2× faster than Spark on large matrix factorizations on an HPC cluster with 100 compute nodes."

Does it sound like large-scale data analytics has horribly stagnated?

marcinzm|6 years ago

Hadoop was not an academic project although Spark was.

ChrisRus|6 years ago

Soon it will take less time and be more cost-effective to commission the integration of an SoC for your application than to risk your business to software basket weavers.

phtrivier|6 years ago

I genuinely don't know what the author is referring to.

I'm amazed at how many comments revolve around "But wait, of course systems research has evolved, see XX and YYY", followed by responses along the lines of "Nah, he was not talking about XX and YYY, rather ZZZ, etc..."

I hate being the "please define xxx" guy, but is there a consensus definition of what "systems software" is?

yingw787|6 years ago

Rob defined systems software in the first part of his post as "Operating systems, networking, languages; the things that connect programs together."

perfmode|6 years ago

RAMCloud isn’t popping, but it did give us Raft.