snickmy | 11 months ago

This was inevitable and we'll see it playing out all over Europe.

You have a desire to be relevant in an important technological shift.

On one side, you have big tech companies laser-focused on attracting the best talent and putting them in a high-pressure cooker to deliver real business outcomes, under a leadership group that has consistently proven effective for the last XX years.

On the other side, you have universities, led by the remnants of that talent pool—those who were left behind in the acquisition race—full of principles and philosophical opinions but with little to no grounded experience in actual execution. Instead, you find a bunch of PhD students who either didn’t make the cut to be hired by the aforementioned tech companies or lack the DNA to thrive in them, actively avoiding that environment. Sprinkle on top several layers of governmental bureaucracy and diluted leadership, just to ensure everyone gets a fair slice of the extra funding.

I'm surprised anyone is surprised.


auggierose|11 months ago

I don't think universities should become industry. I mean, that is exactly what we have industry for. If you want to be put in a pressure cooker under leadership focused on business outcomes, great, do industry.

The problem really is that universities are treated as if they have the same mandate as industry. Government people shouldn't tell a professor what kind of research is interesting. They should let the best people do what they want to do.

I remember an acquaintance becoming a professor, promoted from senior reader, and he was going to be associated with the Alan Turing Institute. I congratulated him, and asked him what he was going to do now with his freedom. He answered that there were certain expectations of what he would be doing attached to his promotion, so that would be his focus.

This way you don't get professors, you turn good people into bureaucrats.

pjc50|11 months ago

Yes. The demand for increasing control, driven by the "taxpayer's money!" lot evident in this thread, strangles almost all state-funded research because it demands to know up front what the outcome will be. That instantly forces everyone to pick only sure-bet research projects, while trying to sneak off to do actual blue-sky research in the background on "stolen" fractions of the funding. Like TBL inventing the WWW at CERN: that wasn't in his research brief, and I'm sure it wasn't something that was funded in advance specifically for him to do.

Mind you, it was evident to me even twenty years ago, when briefly considering a PhD, that CS research not focused on applying itself to users would ... not be applied, and would languish uselessly in a paper that nobody reads.

I don't have a good answer to this.

(also, there is no way universities are going to come up with something which requires LLM like levels of capital investment: you need $100M of GPUs? You're going to spend a decade getting that funding. $10bn? Forget it. OpenAI cost only about half of what the UK is spending on its nuclear weapons programme!)

jimmaswell|11 months ago

That doesn't sound like a fair appraisal of university research at all. How much do we rely on day to day that came out of MIT alone? A lot of innovation does come from industry, but certain other innovation is impossible with a corporation breathing down your neck to increase next quarter's profits.

harvey9|11 months ago

The person you replied to is talking about the UK and Europe. I suspect that funding for research works differently at MIT and in the US generally.

snickmy|11 months ago

US universities (the usual suspects) have a substantially different approach to industry integration than European ones.

Yet European leaders have not got the memo, and expect the same level of output.

dingnuts|11 months ago

Your rhetorical question begs the question -- I can't think of anything more recent than the MIT license.

What DO we rely on that has come out of MIT this century? I'm having a real hard time thinking of examples.

sfpotter|11 months ago

The problem is the "desire to be relevant in an important technological shift".

There's loads of worthwhile research to do that has nothing to do with LLMs. A lot of it will not or cannot be done in an industrial environment because the time horizon is too long and uncertain. It stands to reason that people who thrive in a "high-pressure cooker" environment are not going to thrive when given a long-term, open-ended goal to pursue in relative solitude, one that requires "principles and philosophical opinions" that aren't grounded in "actual execution". That's what makes real (i.e. basic) research hard and different from applied research. There are lots of people in industry claiming to be researchers or scientists who are anything but.

daveguy|11 months ago

"actual execution" in the business world seems to be more and more synonymous with recklessly and incompetently fucking things up. See also: doge.

robertlagrant|11 months ago

Yes, this is so telling:

> For example, neither the key advance of transformers nor its application in LLMs were picked up by advisory mechanisms until ChatGPT was headline news. Even the most recent AI strategies of the Alan Turing Institute, University of Cambridge and UK government make little to no mention of AGI, LLMs or similar issues.

Almost any organisation struggles to stay on task unless there's a financial incentive or another driver, such as exceptional staff/management in place. Give them free money - the opposite of financial incentive - and the odds drop further.

gorgoiler|11 months ago

I’m sorry to read this — it just doesn’t feel grounded in my own lived experience.

Many of the best Engineering and Computer Science departments, around the world, operate a revolving door for people to go in and out of industry and academia and foster the strongest of relationships bridging both worlds.

Look at Roger Needham’s Wikipedia page and follow his academic family tree up and down and you’ll see what I mean.

https://en.m.wikipedia.org/wiki/Roger_Needham

aleph_minus_one|11 months ago

> remnants of that talent pool—those who were left behind in the acquisition race—full of principles and philosophical opinions but with little to no grounded experience in actual execution.

I do believe that these people at universities do have experience in the actual execution - of doing research. What they obviously have less experience in is building companies.

> Instead, you find a bunch of PhD students who either didn’t make the cut to be hired by the aforementioned tech companies

Or because they live in a country where big tech is not a thing. Or because these people simply love doing research (I am rather not willing to call what these AI companies are doing "research").

tene80i|11 months ago

Jesus… are you this judgmental about everyone in society? Some people just value the university environment. It doesn’t mean they’re incompetent and had no other options. Not everyone values money above all else, nor does choosing to opt out of the private sector mean people are “remnants”.

WhyOhWhyQ|11 months ago

From my perspective it's almost exactly opposite. Almost all of the people I consider exceptionally talented are vying for positions in academia (I'm in mathematics), and the people who don't make it begrudgingly accept jobs at the software houses / research labs.

I'm frequently and sadly reminded when I visit this website that a lot of (smart) people can't seem to imagine any form of success that doesn't include common social praise and monetary gain.

jimmaswell|11 months ago

Another point re: grounded experience, good professors/researchers make a point to take sabbaticals to work in industry for that purpose.

sfpotter|11 months ago

Have met lots of professors who are glorified managers doing no actual research, taking sabbaticals for a fat paycheck. I doubt very much they do any real work during these sabbaticals either. If I had to guess, I would bet that these sabbatical positions are frequently sinecures.