item 10858299

Academics, we need to talk

141 points | ssn | 10 years ago | matt-welsh.blogspot.com

165 comments

[+] forgottenpass|10 years ago|reply
> Now, I don't think all academic research has to be relevant to industry.

I'm getting kind of sick of this tactic, where this fake concession is made before going on about how academia isn't designed well enough to deliver to industry.

If authors of articles like this didn't care about the efficiency of academia in delivering them research, what's the big deal if there are a bunch of people somewhere running around in circles? Hey, at least the CS academics occasionally produce something useful to industry, which is more than I can say about other groups running in circles. Sure, it could be about burning through money, or a real desire to improve the state of CS academia. But if that were the case I'd expect these articles to have categorically different discussion and calls to action. Or, at the very least, to address goals of academia other than performing research for industry.

I've never been an academic, and never want to be, but even I gave up reading after the "(I know, crazy, right?)" line. At that point, I knew for 100% sure that the target audience of this article is not academics. Nobody is that much of a condescending prick to the person they're trying to persuade. This is not a "we need to talk" conversation; this is an "I need to talk at you, so I can show off to other people" conversation.

[+] hackbinary|10 years ago|reply
> I'm getting kind of sick of this tactic, where this fake concession is made before going on about how academia isn't designed well enough to deliver to industry.

This +1, er +1,000,000.

I would firmly state that academia by definition is not supposed to 'deliver to industry', and not even to society. The whole point of academia is to give the very smartest people (whatever that means) the freedom to explore ideas, so that at some point humanity may benefit from their ideas, experiments, and research.

My point is that research which seems like quackery to the current establishment may, any number of generations later, turn out to be instrumental to our understanding of the universe. Take Galileo Galilei (and Copernicus), for instance. Their ideas (and research) did not sit well with the establishment of the Church (which was pretty much the central authority on everything at the time), but their ideas are now central to our understanding of our Solar system.

edit: formatting.

[+] jerf|10 years ago|reply
"Mostly I'm involved in mobile, wireless, and systems, and in all three of these subfields I see plenty of academic work that tries to solve problems relevant to industry, but often gets it badly wrong."

If you want to work on problems irrelevant to industry, go nuts.

If you want to work on problems relevant to the industry, it helps to double-check that your problems are relevant to the industry.

The desire to work on problems relevant to industry is coming from academia, not the author.

[+] mdwelsh|10 years ago|reply
(I'm the author of the original blog post.) Ad hominem attacks aside, I do think it's a big deal if a bunch of academics are spending time working on the wrong things, if for no better reason than maximizing efficiency: Don't forget that most academic research is funded by the government. Since I also happen to help decide which research proposals Google funds, I also care that academics are well-aligned with the problems we care about. Clearly we also need to invest in long term bets. But there is a big difference between doing long-term, potentially-groundbreaking research and bad industry-focused research.
[+] adrusi|10 years ago|reply
I don't think it's a fake concession. There is academic value in far-out research that isn't relevant to industry, just like there's value in math research and in metaphysics.

The point the article tries to make is that if you're researching something relevant to industry, it's probably not nearly as valuable academically as something more exotic would be, so you'd better make a good effort to make it practically valuable to compensate.

> Nobody is that much of a condescending prick to the person they're trying to persuade.

In context the line didn't appear condescending at all as I read it; a little sarcastic, that's all. The author made it clear that he doesn't think academics need to produce something that's actually used by consumers. And the author was/is an academic. If you'll allow me to speculate: you don't seem to have the highest opinion of academics, so you might further think that in order to be valuable they need to produce something economically valuable. If you read the "I know, crazy, right?" line with that attitude, I can see how it might appear much more condescending than it actually is.

> This is not a "we need to talk" conversation this is a "I need to talk at you, so I can show off to other people."

This is a non sequitur: "This article isn't trying to persuade academics, therefore it is trying to promote the author's social standing at academics' expense." The author identified a problem and offered a potential solution; does that mean he also has to persuade? Your statement presents a false dichotomy: that he has to either persuade academics or shame academics to promote himself. I say he doesn't attempt to persuade beyond identifying a problem and a solution because he doesn't want to bore the reader with rhetoric, nothing more nefarious.

[+] scott_s|10 years ago|reply
The author is on the program committee for some academic conferences, so he is not just an observer, but a participant. It is also common for academic computer science papers to use industry applications as motivation for their research - and, similar to the author, I often find some of these motivations misguided.

(I work in an industry research lab, write code for a real product, and publish academic computer science papers.)

[+] Gatsky|10 years ago|reply
Look, this blog post contains good advice for academics. You seem to be overreacting quite severely. The truth is that the utopia where anyone can work on interesting problems is long gone, or rather never really existed (http://amapress.gen.cam.ac.uk/?p=1537). Academia produces a massive oversupply of academics, and there isn't the capacity to fund them all. Funding bodies in general talk more and more about 'impact' as a deliverable. The original post is just saying: don't ruin your chances of getting funding with faulty thinking about industry applications.

You can argue, quite rightly perhaps, that focusing on impact/industry applications is a bad idea, and that we will miss out on important epochal discoveries, but that is a problem with society, not the author of the blog post.

[+] mizzao|10 years ago|reply
Matt Welsh used to be a tenured systems professor at Harvard. He probably knows more about academia than 99.9% of people in industry.
[+] Fede_V|10 years ago|reply
I think he is largely correct in identifying the flaws of 'academic research', but he does not spend enough time discussing the whys.

Academics are in an insanely competitive environment, where what is rewarded is bringing in grants/high impact publications. There are a very select few academics that are so brilliant and have such sterling reputations they can afford to not play this game (like Matt Welsh's former advisor) but most young researchers don't have this luxury.

For example, Peter Higgs, the Nobel prize winner who postulated the existence of his namesake Boson, flat out said that: "I wouldn't be productive enough for today's academic system" (http://www.theguardian.com/science/2013/dec/06/peter-higgs-b...). He spent several years of quiet research without publishing anything to develop his theory - a young professor doing the same now is unthinkable. The most highly successful young scientists I know now are incredibly career driven and optimize ruthlessly for the kind of output that tenure committees are looking for.

Basically, if you want researchers to incorporate best practices (tests, version control, well-commented code, etc.) and to actually attempt ambitious long-term research programmes, make sure that's what you reward, and remember that you cannot just reward success. By definition, something ambitious has a high possibility of failure; if failing means your career is destroyed, then people won't do it.

[+] nickpsecurity|10 years ago|reply
This is very important. Writers like Matt might do well to factor in the "academic" constraints, much as he encourages academics to do for industry's. I ran into this in a discussion with Anti on prior art for Rump Kernels. I learned a bit in that discussion, but his troubles getting approval stood out:

https://news.ycombinator.com/item?id=10141736

I had heard about the BS in academia where only paper output, grants, citations, and so on count. That Anti had to fight to get them to care about his work being implemented speaks volumes about how this works out in practice. Had he only cared about academic success, he could've dropped some light technical details and graphs into the paper and been done with it while the idea collected dust. Many benefited from him fighting the tide to produce both good papers and an implementation.

I'm not in academic circles, but I bet many face the same battle. With the pressure, it might be impossible for them to produce the desired output on their own within their constraints, or so difficult that many give up. Perhaps we should encourage them to have one good line of research they string out over years, for quality and to ensure delivery, while doing lots of nice papers in between to keep institutions happy. Think that might work?

[+] thearn4|10 years ago|reply
I agree with your points. In my case it was this hypercompetitiveness at the expense of meaningful contribution that drove me away from an academic career, post-PhD.
[+] samth|10 years ago|reply
Academic research that unknowingly (or sometimes even knowingly) duplicates secret industry work is much more valuable than this discussion indicates. Sure, it's not valuable _to Google_ for someone to publish things they already know. But everyone else benefits. If people at Google want their research to stop being duplicated, they should publish it.

Of course, if your goal is that Google adopt your new system in their data centers, then you need to know what they already do. But the problem with that model of research is the initial goal, not the way it's currently executed.

[+] mdwelsh|10 years ago|reply
I'm not so worried about duplication by academics -- that does not happen often -- but rather about academic research that's just wrong: makes bad assumptions, uses a flawed methodology, fails to address the general case.
[+] munin|10 years ago|reply
Today, in academia, it's considered risky to do research in computer vision, machine learning, or speech processing, for example, because it's likely that you will get "out-Googled". Google probably has an entire team of 20 working on what your one graduate student is doing. They'll have petabytes of real data to test against, hundreds of thousands of computers to run their jobs on, and decades of institutional experience. Your graduate student has a macbook air, six months of experience from an internship at Microsoft, and a BS in computer science. If you're lucky. They're going to lose. They should just go to work at Google.

Over time, fields of study become industrialized. There was a time when doing research in computer vision, machine learning, and speech processing was risky because the field was new, difficult to enter, and the prospects for commercialization were slim. That time has passed. Those 20 people working at Google are the people that helped that time pass. One could argue that the place for this work is now in industry - the motivations are all right and the resources and data are aligned to carry the work forward at a rapid pace.

This happens in other fields. For example, there's some word on the street that DARPA is going to stop funding so much basic research into applied robotics. Industry, they say, has got this covered. You can argue that they're right. The commercial sector is starting to get real thirsty for robots. Amazon talks about automated drone delivery. Everyone talks about self driving cars. The military wants to buy automated planes as a purchase, not as a research project. The time for basic research, it seems, is over.

As far as I can tell, this happened with systems about fifteen years ago, so the academic activity you see in systems is what is left over after all of the researchers that could do things moved into applying their research in industry. You no longer need to have weird hair and be buried in a basement to think about 20 computers talking to each other in parallel - you can go work at any technology company and think about two million computers talking to each other in parallel, and get paid two orders of magnitude more money. So the people doing systems research in academia are the people that cannot take their systems research into industry. If they could get internships, they would, and then they would get jobs. They haven't.

[+] anonymousDan|10 years ago|reply
Such nonsense. I know plenty of people doing great systems research that just doesn't align with the goals of current technology companies. Just look at the proceedings of the top systems conferences and there are plenty of good papers and ideas out there.
[+] tensor|10 years ago|reply
> Industry is far more collaborative and credit is widely shared.

This couldn't be farther from the truth. Your ideas are generally credited to the company, which in turn means they're credited to the CEO or some other higher-up. As for collaboration, it's only more collaborative within a given company, and not always even then. Between companies, the environment is outright hostile to collaboration by definition.

[+] accountatwork|10 years ago|reply
Something I've seen at every large tech company I've worked at (which includes the author's company) is that some people do the work on something cool, and the next step for them is to prepare a slide deck so that some big name can give a talk at a conference. That doesn't always happen, but it's common.

Depending on the managers and team leads involved, that kind of thing can also happen when promotions come around. At every place I've worked, a common complaint is that the TL for the project got promoted, despite not doing much, simply because they were the TL. Of course that's not supposed to happen, but it happens all the time.

The post seems to compare the worst case in academia vs. the best case in industry. You could just as easily flip things around and make industry sound bad.

[+] _delirium|10 years ago|reply
It's often not even collaborative within the company, but only within a specific fiefdom within the company.
[+] skywhopper|10 years ago|reply
> I know this sounds like a waste of time because you probably won't get a paper out of it

It's unfortunate that a "lessons learned" paper summarizing a sabbatical in industry doing customer-facing work would not be publishable. Surely it's far more useful to other academics than most papers. It'd definitely be more broadly relevant.

My wife is a professor in a practical field and I'm always sad to hear what counts as a "good" paper or a publishable article. The big journals in these fields drive the notion of what is and isn't legitimate research. That's the point where what constitutes career-advancing "academic output" has to be changed. But I'm not enough of an insider to have any idea of how to go about doing that.

[+] mdwelsh|10 years ago|reply
The issue is that when doing a sabbatical/internship at a company, it's often not possible to write a paper - either because there's not time, or the company may not want to publish the work (which could be confidential). I wouldn't go to a company expecting to be able to publish about the project.
[+] KKKKkkkk1|10 years ago|reply
People often misunderstand what it means to get a paper published. Nobody publishes a paper to "disseminate their work." You can do that perfectly well by making a blog post or posting on arXiv. A published paper is a badge recognizing your contribution to the scientific community. If the lessons I learned from my summer exploits at FooCorp do not constitute a significant scientific discovery, then they're not publishable, and justly so.
[+] boxy310|10 years ago|reply
I'm sad that journal reputation often trumps the practicality of articles that aren't exactly "breakthroughs" but are nonetheless extremely interesting. One could argue that blog reading should form an important part of being well-read in the literature, but again there's the concern over quality of research and insights that leads us to journals in the first place.
[+] irremediable|10 years ago|reply
I agree with some of this, but I wonder...

> It drives me insane to see papers that claim that some problem is "unsolved" when most of the industry players have already solved it, but they didn't happen to write an NSDI or SIGCOMM paper about it.

I've seen many examples of industry "solutions" that aren't documented, aren't published, and aren't even validated. There's a place for papers like these. I'm not quite your typical CS researcher (I do applied math and software for medical imaging), so YMMV, but I think this criticism is too harsh.

[+] mdwelsh|10 years ago|reply
That's a fair point. The issue is that many of these papers don't seem to acknowledge that industry has (unpublished) solutions, and are somewhat naive as a result.
[+] wfo|10 years ago|reply
If you're doing industry-relevant research and you're in academia, leave. Your work can be supported by corporate profits because it is, in essence, for corporate profit. Get a job in industry, make more money, and make room for academics who want to do honest-to-god academic work on theory or fundamental research. Or who want to do research relevant to improving society, not improving profit margins.

There aren't that many professor jobs out there. It's unbelievably greedy to be taking one up to do industry's dirty work.

You can always take an afternoon off a semester here and there to be an adjunct and teach an SE class or give a guest lecture.

[+] dgacmu|10 years ago|reply
Oh, my toosh, no. I don't agree with everything Matt said, but on this, you're totally off. I try pretty hard to do research that is (a) potentially industry-relevant (key word: potentially); and that (b) industry won't do for various reasons. Thus far, it's seemed to work pretty well. Using my favorite example of the week, take some of our work on cuckoo hashing -- it's contributed quite a bit to the literature of how to implement fast cuckoo hashes, contributed a new and fairly intriguing data structure for approximate set membership, and produced a design that I know to be in use in two Very Large Companies(tm).

The companies wouldn't have done this work, at least outside of their research labs, because the solution's theory is too far from the need of any one problem. But the result -- a more memory-efficient hash table design -- turns out to be broadly useful.
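For readers unfamiliar with the technique being discussed, here's a minimal sketch of plain two-choice cuckoo hashing. This is an illustration of the general idea only, not the specific published design or either company's implementation (the table size, hash choices, and single-key slots are all simplifying assumptions): every key has two candidate slots, lookups probe at most both, and inserts relocate resident keys when necessary.

```python
class CuckooHash:
    """Minimal two-choice cuckoo hash table (illustrative sketch only)."""

    MAX_KICKS = 500  # give up after this many evictions; a real design would grow/rehash

    def __init__(self, size=64):
        self.size = size
        self.slots = [None] * size  # one key per slot, for simplicity

    def _h1(self, key):
        return hash(key) % self.size

    def _h2(self, key):
        return hash((key, "salt")) % self.size  # second, roughly independent hash

    def contains(self, key):
        # Lookup probes at most two slots -- the key property of cuckoo hashing.
        return self.slots[self._h1(key)] == key or self.slots[self._h2(key)] == key

    def insert(self, key):
        if self.contains(key):
            return True
        pos = self._h1(key)
        for _ in range(self.MAX_KICKS):
            if self.slots[pos] is None:
                self.slots[pos] = key
                return True
            # Evict the resident key, place the new one, and move the
            # evicted key toward its alternate slot.
            key, self.slots[pos] = self.slots[pos], key
            pos = self._h2(key) if pos == self._h1(key) else self._h1(key)
        return False  # table too full
```

The constant worst-case lookup cost is what makes the structure attractive for memory-efficient designs: occupancy can be pushed high while reads stay bounded.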

And yes, I do consider this to be "industry-relevant" research. I'm not going to solve their problems for them -- but there can be great synergies between industry and academia for having broad impact through adoption.

(full disclosure: I'm an academic on sabbatical at Google for the year. It's likely I'm a little biased in my belief that both have value. But this isn't a bias unique to me; systems as a general area is close to industry, and most of my colleagues rotate in and out of industry periodically via sabbaticals or startups.)

[+] mdwelsh|10 years ago|reply
I don't agree with this at all. The partnership between industry and academia is long-standing and has proven to be extremely valuable -- much of the Internet came about because of it.
[+] sail|10 years ago|reply
What stood out for me:

> My PhD advisor never seemed to care particularly about publishing papers; rather, he wanted to move the needle for the field, and he did (multiple times).

> Racking up publications is fine, but if you want to have impact on the real world, there's a lot more you can do.

[+] yarrel|10 years ago|reply
Clickbaity title aside, this is sound advice for academics who wish to be relevant to industry from someone who has experience in both camps.

Other academics, for example those doing "stuff going way beyond where industry is focused today" as the author explicitly states, can safely ignore it.

[+] KKKKkkkk1|10 years ago|reply
Re collaboration. I work in a government research lab which prides itself on being a collaborative environment. The result is that we publish 10-author papers in which one author is doing all of the work and the other 9 are cheering from the sidelines. I don't think this is particular to my lab -- the typical scenario is that 90% of the work on any given project is done by 10% of the people. So when people praise their work environment for being collaborative, I'm sceptical. I'd much rather be in a situation where everyone gets the credit they deserve for the work they have actually done.
[+] Fomite|10 years ago|reply
"Second: don't get hung up on who invents what. Coming from academia, I was trained to fiercely defend my intellectual territory, pissing all over anything that seemed remotely close to my area of interest. Industry is far more collaborative and credit is widely shared."

In my experience, this is only true until there is money to be made. Or more specifically, that industry was more than willing to share credit, but ownership was theirs.

[+] jff|10 years ago|reply
> Coming from academia, I was trained to fiercely defend my intellectual territory, pissing all over anything that seemed remotely close to my area of interest.

Anyone who has ever been through a conference/journal submission process knows this pain. You can usually tell from the comments which of the reviewers is working in your field and wants to shut you out.

[+] nitinics|10 years ago|reply
I think industry research is mostly driven by constraints: the company's architecture, its business use cases, and how much $$$ it is willing to spend on research that adds value to its products or services. Academics, on the other hand, think outside the box; their research gives upcoming industries clues about where the problems might be and how they could be solved, eventually helping industry grow by validating ideas and allowing companies to turn them into "products" and "services".

Therefore, I don't think academics should stop doing what they do (i.e. wander around) and adopt a laser focus on industry's product-driven research.

[+] woah|10 years ago|reply
Matt, I was intrigued by your throwaway "not another multi hop routing protocol" comment. As far as I can tell, the field is very slow-moving. The state of the art, Babel, is at least 5 years old and is an incremental improvement on protocols that are at least 20 years old. Some very promising research was done into DHT-based routing with Scalable Source Routing, but this work is now from more than 10 years ago, and interest seems to have dropped off completely.

Are there a bunch of protocols that I don't know about?

Are you maybe referring also to centralized path finding algorithms? This would explain the comment.

[+] mdwelsh|10 years ago|reply
Nobody needs multihop routing protocols. Show me one instance in which they have been useful, despite 20+ years of academic work in the area.
[+] AngrySkillzz|10 years ago|reply
> "Coming from academia, I was trained to fiercely defend my intellectual territory, pissing all over anything that seemed remotely close to my area of interest."

Apparently the author was unable to break that habit.

[+] pklausler|10 years ago|reply
Program committees and the conferences they serve may be part of the problem, and hence part of the solution as well. Instead of picking the best N papers so as to fill out a conference schedule, pick the good papers and shorten the conference schedule if the number of good papers is <N. And then raise the standards to meet your expectations of reproducibility, code reviews, unit tests, etc. If a big conference like ISCA were to be shortened by one day by omitting the least worthy papers, you'd see much better work arriving the next year.
[+] jonsterling|10 years ago|reply
Who gives a damn if academic research is relevant to industry? Almost anything that could possibly be relevant to industry is highly uninteresting.

Imagine being someone who thinks that Capital could decide what is a good problem to work on...

[+] mdwelsh|10 years ago|reply
This is so completely wrong. The most exciting work happening in systems, networking, programming languages, crypto, computer architecture, mobile, and many other subfields of computer science is highly relevant to industry and very interesting academically.