The massive boom in computer science enrollment over the last 20 years has been driven mostly by people chasing tech salaries, not by any real interest in computing itself. These students often show up completely unprepared for how difficult CS actually is, and universities have responded by dumbing down their programs to keep everyone happy and enrolled.
If this weeds out the people who are just there for the paychecks, it might actually be a relief to get back to teaching students who genuinely want to learn about computing.
I think CS departments are at least partially responsible for this development. They know that most of the students applying care nothing about Computer Science, have no interest in Computer Science and will never learn Computer Science, yet they keep accepting (and graduating) them. If CS departments actually wanted to teach CS, then they would advocate for setting up a new series of departments/degrees with names like Software Development and Engineering or Application Design and UX, and send most of the students there. Then those who want to learn/teach Computer Science can learn/teach Computer Science without having to deal with a classroom full of people who really don't want to be there.
Even 20 years ago, when I was in college, a sizable portion of kids came in to study computer science thinking it would be fun and games. They were then made to study formal logic in their first semester and debug segfaults in gdb the next, and by the end of the first year pretty much all of them had switched majors.
Anecdotally I've heard that very few CS programs even use C++ anymore, and schools now favor Python because students find it more accessible.
Having interviewed a number of code camp graduates, I can say they're definitely just chasing the salary.
Most of them have no actual passion for computing, their scope of knowledge is superficial, and they're asking for six-figure salaries out of the gate.
I had a relatively simple coding assignment (it shouldn't take more than 15 minutes) that I used to weed out those who were just copying and pasting sample code. It required handling a very large number of values and added a profiling step. The sample code wasn't performant at that scale, and was painfully slow unless you made minor adjustments to a few things.
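The actual assignment isn't shown, but as a hypothetical illustration of that kind of screen: sample code that does repeated list membership tests is quadratic and crawls on large inputs, and the "minor adjustment" is a one-line data-structure change. All names here are invented for illustration.

```python
import time

def dedupe_naive(values):
    # "Sample code" style: list membership is O(n) per check,
    # so the whole pass is O(n^2) and crawls on large inputs.
    seen = []
    for v in values:
        if v not in seen:
            seen.append(v)
    return seen

def dedupe_adjusted(values):
    # The minor adjustment: set membership is O(1) on average.
    seen = set()
    out = []
    for v in values:
        if v not in seen:
            seen.add(v)
            out.append(v)
    return out

if __name__ == "__main__":
    data = list(range(3_000)) * 2  # large enough to feel the difference
    for fn in (dedupe_naive, dedupe_adjusted):
        t0 = time.perf_counter()
        fn(data)
        print(f"{fn.__name__}: {time.perf_counter() - t0:.4f}s")
```

A candidate who only pasted the sample code ships the first version; the profiling step makes the difference impossible to miss.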
Thank you for saying this because it feels like these people entering the industry in such numbers over the 2010s completely killed what made this job fun in the first place. I call them "ticket completers". Sure they can mechanically perform the minimum requirements of the job, but there is zero interest at all in what is actually being done; just following PM directions to the letter with no further thought. The whole spirit of innovation and curiosity and discovery has been lost, replaced by lifestyle seekers who look at you like an insane person when trying to talk about software in the abstract (ha!).
The hackers and nerds will be just fine. They are like gold when we find them now. But if this makes CS "uncool" again, I am all for it.
"If this weeds out the people who are just there for the paychecks, it might actually be a relief to get back to teaching students who genuinely want to learn about computing."
It's not going to work that way. I was genuinely interested and took many high level electives. I felt the program was very good 15 years ago at the school I attended. I also got an MSIS at a different school, but feel that one was not any more advanced than BS, just a faster pace and weirdly less coding. I did well for years at my job. Now it looks like I might lose my job and probably won't get another IT one. I will probably end up working at Walmart or something.
This is how I've been feeling through the whole situation. I got my current job at the bottom of the last slump the other year so it doesn't seem to be affecting me.
Still I've been careful to set my life up so I could go many years without employment if I had to. It's hard to trust the rest of the economy in general.
About 15 years ago when I started my degree there were both the “I want a good job” people and another crowd that I’d describe as having followed a thought process of “I like video games; I want to make video games; I should study comp sci”. At that time at least I think the video game crowd was even less equipped than the job crowd. Not to disparage video games, they are a majority of my free time, but those who were joining the field to _have fun_ are going to have an even harder time than those looking for work.
Lots of discussion about choice of programming language in the comments below.
- In principle, it should not matter at all, but there are practical reasons why one PL may be better than another in a particular school or context.
- But, all this "choice of PL" discussion is really a discussion about CS1. A CS degree has at least seven other courses -- assuming 1 CS course per semester -- and in practice many more than that. So, if you're going to ask questions about CS1, the question to ask is, "Does CS1 set up students to succeed in the advanced courses?" Classically, these were courses in compilers, operating systems, networking, and so on. These days, you can add distributed computing, machine learning, etc. (but don't subtract the classics).
There really is too much hand holding of university students nowadays. I don't think degrees are really equivalent to what they were thirty years ago. Back then the university was about weeding out the wasters and lazy folk who didn't study. College courses are meant to be hard for a reason. Don't get me started with that extra credit crap.
Thanks for sharing. Is this similar to what attracts students to medicine (a guaranteed position with a high salary)? But the med-school-to-full-time-physician pipeline is long and can weed people out. CS is a difficult subject; certain ways of thinking are difficult but can certainly be learned, like recursive thinking.
Did colleges expand their computer science departments, or even just create them, to meet demand for the degree? The pipeline from a CS degree to employment is quite short: it doesn't require residency or board certification, so it's a quicker route to a job, but then you are competing with peers with stronger backgrounds and educations, and with seasoned professionals, for the same positions.
I started my degree in 1999, and even then this was a factor. I hope more articles like this are published, and that people who aren't really interested in computing stop choosing this path.
I'd argue it is actually part of a broader trend: the boom in computer science enrollment over the last 20 years has been driven mostly by people chasing a better return on the rising cost of the average four-year degree, and software pays better than the average four-year degree does. I do think that college being cheaper on average would help at least somewhat with CS being such a popular major.
If you were around in 2000 and 2009, you've seen this before. Our field has ups and downs, and every time we hit the bottom people say it won't come back. It will.
We had some cleaning up to do. I was a hiring manager during COVID and the resumes I saw were unbelievable. People with "web" boot camps being considered for 6 figure salaries. People who had absolutely no business being in this field were being hired.
It was due to the easy money from low interest rates. This field has always had solid salaries, but some people were making a million to sit in meetings and integrate frameworks into me-too websites.
The hammer is coming down and is unfortunately hitting many good people too. But they will recover, while the people who shouldn't be here will move on. Don't get your HVAC repair certification quite yet. Stop complaining about AI and go study it (the hard stuff, not ChatGPT for Dummies).
Well put. And this happened back in 2000 and 2009. I had people who I knew from direct experience to be non-technical slackers tell me about their IT Director jobs. I knew it wouldn't last then, and I'm not surprised now. Just get out of your comfort zone and start looking, and if you are in a bad situation, don't be cowed. Being truly technical is always valued; despite the easy answers from ChatGPT, you must understand what it is telling you to really make use of it.
It's not AI, it's outsourcing that's really killing IT jobs. Even from relatively cheap Eastern Europe, projects are being moved to India, Vietnam, and the Philippines.
I don't read too much into the fact that unemployment for nutrition science is at 0.4% - that doesn't mean those people are all working as nutritionists or even in a job that requires a degree. You can see this clearly in the underemployment rate which is 45%+.
Likewise, the top unemployment rate (9.4%) of those with an anthropology major probably doesn't mean all those people are living under a bridge - a fair number of them will be pretty well off, living off their parents and knew going in that their field doesn't hire millions.
So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)? I feel some more on-the-ground reporting is needed.
The quotes from randos reacting in this article don't really help. "Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag," because debugging (like Zuck!) is computer science, apparently.
> So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)?
That's a very important observation. It's much better to be in a field with a 6% unemployment rate than a 60% underemployment rate (like criminal justice, performing arts, and, surprisingly, medical technicians).
> Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag
I feel like I've seen this quote many times over the years.
Also, how do they calculate employment rate? If you get a job at McDonald's while having a civil engineering degree or nutrition science, that counts as employed as well, no?
Would be good to see how many are actually employed in their field of study
> If you get a job at McDonald's while having a civil engineering degree
That would be under_employment (vs un_employment).
Un_employment refers to people actively seeking work but unable to find it, while under_employment covers people who are working but not fully utilizing their skills, or working fewer hours than they would like.
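A toy calculation (entirely made-up numbers, not the Fed's data) shows why the two rates are independent and have different denominators:

```python
# Hypothetical cohort of 100 graduates in the labor force.
seeking_but_jobless = 6        # counts toward unemployment
employed_outside_field = 40    # counts toward underemployment
employed_in_field = 54

labor_force = seeking_but_jobless + employed_outside_field + employed_in_field
employed = employed_outside_field + employed_in_field

# Unemployment is measured against the whole labor force;
# underemployment only against those who are employed.
unemployment_rate = seeking_but_jobless / labor_force
underemployment_rate = employed_outside_field / employed

print(f"unemployment:    {unemployment_rate:.1%}")   # 6.0%
print(f"underemployment: {underemployment_rate:.1%}")
```

So a major can simultaneously post a scary unemployment number and a reassuring underemployment number, which is exactly the IT pattern discussed above.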
These numbers are all hard to measure. I only more or less worked in my engineering field of study (not CS) for 3 years but, other than going back to grad school for 2, was never underemployed by any serious definition of the term.
I left the software engineering field about 17 years ago to become a high school teacher. One of the things I taught was computer science (to high schoolers) and I recall sitting in many meetings of HS CS educators discussing the upcoming critical shortage of workers with CS degrees. I would tell them I left the field because there wasn't much work, and they would look at me like I was crazy. "Something wrong with that guy... He can't find work when there's a CRITICAL SHORTAGE of workers!!"
It’s a boom/bust profession that has been in a long boom.
Many of the big companies that have been on hiring orgies are advertising-dependent. Ads are the thing that gets slashed heading into a bad economy, and we’re in an economic mess that is going to get a lot worse.
20ys ago: you must study CS it's in high demand RN!
10ys ago: don't even apply w/o a master's degree!
2ys ago: sorry we're full!
1 week ago: you must study ML it's in high demand RN!
There are so many graduates who are not worth the paper their degree is printed on that it would be laughable (if it weren't sad).
That's a good part of the reason why hiring processes are so long and you need to re-check everything people are supposed to know. Filtering out hundreds of candidates to get a mediocre one at best, thousands to get a really good one.
There are job openings, but just having a piece of paper is not enough to get to those.
AI tools have made recruiting a miserable experience for everyone involved: there's so much cheating among applicants, you waste so much time filtering them out, and sadly, good candidates sometimes get lost in the noise.
Networking has the highest signal-to-noise ratio. A good recommendation from someone you trust helps a lot, but it penalizes people who are just starting their careers and have smaller networks.
It's a sad state of affairs.
"Despite computer science being ranked as number one by the Princeton Review for college majors, the tech industry may not be living up to graduates' expectations.
When it came to undergraduate majors with the highest unemployment rates, computer science came in at number seven, even amid its relative popularity.
The major saw an unemployment rate of 6.1 percent, just under those top majors like physics and anthropology, which had rates of 7.8 and 9.4 percent respectively.
Computer engineering, which at many schools is the same as computer science, had a 7.5 percent unemployment rate, calling into question the job market many computer science graduates are entering.
On the other hand, majors like nutrition sciences, construction services and civil engineering had some of the lowest unemployment rates, hovering between 1 percent to as low as 0.4 percent.
This data was based on The New York Fed's report, which looked at Census data from 2023 and unemployment rates of recent college graduates."

Source:
https://www.newyorkfed.org/research/college-labor-market (requires Javascript)

Data (no Javascript required):
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...

Civil Engineering 1.0%
Aerospace Engineering 1.4%
Mechanical Engineering 1.5%
Chemical Engineering 2.0%
Electrical Engineering 2.2%
General Engineering 2.4%
Miscellaneous Engineering 3.4%
Computer Science 6.1%
Computer Engineering 7.5%
> “Learn to code” ceases to be good advice if too many people do.
I believe "learn to code" is great advice nonetheless; the skill is highly applicable. The bad idea is thinking that alone will land you a cushy job.
Anecdotally, talking to a lot of people who really have their ears to the ground, the junior-roles thing seems to be very real. It probably isn't just AI -- with more senior folks available, why hire juniors? -- but it seems pretty pronounced (with the corollary that bootcamps are probably a bad idea these days). Which isn't a great trend, if real.
'Learn to code' is great advice for anybody. If you're a biology major and need to check the world molecules database (forgot the name, sorry), being able to write your own query goes a long way despite the no-code solutions.
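For instance, a few lines of Python against a local snapshot of such data beats clicking through a no-code UI for one-off questions. The table, schema, and values here are invented for illustration, not the actual database the commenter means:

```python
import sqlite3

# Hypothetical local snapshot of a molecules table (schema made up here).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE molecules (name TEXT, formula TEXT, mass REAL)")
con.executemany(
    "INSERT INTO molecules VALUES (?, ?, ?)",
    [("water", "H2O", 18.015),
     ("glucose", "C6H12O6", 180.156),
     ("caffeine", "C8H10N4O2", 194.19)],
)

# The kind of ad hoc question a point-and-click tool makes awkward:
rows = con.execute(
    "SELECT name, mass FROM molecules "
    "WHERE mass BETWEEN 100 AND 200 ORDER BY mass"
).fetchall()
print(rows)  # [('glucose', 180.156), ('caffeine', 194.19)]
```

That's the whole point: a non-programmer who can write one SELECT statement is self-sufficient for a large class of questions.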
It's mainly #1. For 20 years now we've been hearing non-stop about how computer science is this magical major where anyone can sleepwalk out of college into a 150k job. Parents have been pushing their kids into it whether they are interested or not. Colleges have been taking advantage by pushing sub-par programs and boosting graduation rates. The end result is a large number of CS graduates who can't write a for loop in an interview (and will then loudly complain about how the interview process is unfair).
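For reference, the bar being described is roughly a FizzBuzz-style warm-up (a classic generic screen, not any specific company's question):

```python
def fizzbuzz(n):
    # Classic interview warm-up: writing the loop itself is the point.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

Failing this is what "can't write a for loop" means in practice; it isn't a trick question.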
> 1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
"Learn to code" was the scam pushed to address the so-called "skills shortage" BS in programming. Even worse, the skills that were pushed were also the most automatable: HTML, CSS, and especially Javascript, chasing $250k roles in the most unsustainable ZIRP era.
Now you won't see the influencers screaming about web developer roles, given the current massive flush of those who joined for the $$$ just to rearrange a <div> or add accessibility styling for six figures.
Do you have a link to the post (or posts) from Michael O. Church? I have a vague recollection of the idea but I would like to reread it with what I know today!
I actually think some of big tech cough Apple cough is a decent short right now. I wanted to do it back in December but it's hard to bring yourself to short the largest companies like that.
“Learn to code” ceases to be good advice if too many people do.
Completely disagree. No matter what job you end up with, you will almost certainly be able to do it a bit better if you know how to code. Knowing how to code is basically always a plus when applying for a job. However "just learn to code a little bit, and nothing else" is probably bad advice.
This is nothing (it will get worse) compared to what will happen in 2030.
Just look at what has happened in the last 5 to 6 months since this prediction was made [0]. The definition of "AGI" was hijacked to mean all sorts of things by the companies that operate the AI systems, even conflicting with each other on timeframes and goals.
But the real definition of "AGI" is the blueprint inside the WEF's Future of Jobs Report 2025 [1], with its deadline of 2030: 40% of employers admit they anticipate reducing their workforce where AI can automate tasks, as I said before [2].
So what AGI actually means is a 10% increase in global unemployment by 2030 or 2035, with all those savings going to the AI companies.

[0] https://news.ycombinator.com/item?id=42490692
[1] https://www.weforum.org/publications/the-future-of-jobs-repo...
[2] https://news.ycombinator.com/item?id=42652402
> with all those savings going to the AI companies
I'm not even sure those savings will "go" anywhere, they will just stay with the companies. Right now, if I use my $20/mo ChatGPT subscription to automate away my secretary's job ($3,000/mo or whatever), it's not like those $3,000/mo is going to OpenAI. And I don't think in the future they will be able to jack up prices, because foundational LLM models have become a race to the bottom.
I'm not even sure those savings will "go" anywhere, they will just stay with the companies. Right now, if I use my $20/mo ChatGPT subscription to automate away my secretary's job ($3,000/mo or whatever), it's not like those $3,000/mo is going to OpenAI. And I don't think in the future they will be able to jack up prices, because foundational LLM models have become a race to the bottom.