How does one explain the drop starting in January 2023 (especially for roles like Customer Service Rep, an NLP-heavy task) when most corporations didn't even start LLM/NLP pilots until mid-to-late 2023? I skimmed through the 100+ page paper but didn't see an explanation for this strange leading effect.
SWE figures dropped in mid-2022 (almost magically in line with interest rate hikes), and LLM copilots weren't introduced for another year. The paper notes an adjustment for the end of ZIRP. I don't know enough econometrics to judge whether this adjustment was sufficient, but the chart doesn't make sense, since the labor effects seem to lead the actual technology by a year or more. From informal surveys, LLM-copilot usage didn't become widespread until late 2023 to mid-2024, certainly not widespread enough to cause macro labor effects in mid-2022.
The 2022 drop for SWE is easy for me to explain, and it's not on these analysts' list of factors (though I'm not an economic quant, and I don't know how you could really control for it). In 2017, a tax bill was passed that cut a particular tax incentive starting in 2022, in an effort to be scored as "revenue neutral" despite being a massive tax cut overall. The incentive in question was the immediate writeoff for research and development expenses, which from 2022 onward had to be amortized over five years instead. This means that in 2022 it got effectively much more expensive to hire anyone who falls under that category, including developers not directly necessary for the day-to-day function of a business (hell, one might argue they would have counted anyway) and scientists of most kinds. That this hit big firms first, which have relatively more R&D effort going at a given time, makes a lot of sense.
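To put rough numbers on that cost change, here is a back-of-the-envelope sketch. The firm and salary figures are hypothetical; the 5-year amortization with a 10% first-year deduction is the Section 174 schedule that took effect for tax year 2022.

```python
# Hypothetical firm: $1M/year of US developer salaries, 21% corporate rate
salaries = 1_000_000
corp_tax_rate = 0.21

# Pre-2022: R&D labor fully expensed in the year incurred
deduction_before = salaries

# 2022 onward: amortized over 5 years with a mid-year convention,
# so only 10% of the cost is deductible in year one
deduction_after = salaries * 0.10

extra_taxable = deduction_before - deduction_after
extra_tax_year_one = extra_taxable * corp_tax_rate  # roughly $189k per $1M of payroll
```

So the same developer payroll produces roughly $189k more first-year tax per $1M of salary, a material change in the marginal cost of an engineering hire.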
For customer service, my explanation is that companies literally do not care about customer service. Automated phone trees, outsourced call centers whose reps have no real power to help a customer, and poorly made websites have been frustrating people for decades, but businesses never seem to compete on doing better at it. Cutting the department yet further is a cheap win with investors who want to hear about AI initiatives. It doesn't matter if the quality of service declines; there are no market or regulatory forces punishing this strongly enough to expect firms to stop breaking it, let alone fix it.
I do consulting, so I'm constantly scouting clients. Right around November 2022 something very stark happened: I went from fighting off prospects with a stick to crickets, almost overnight. I deal mostly with startups and mid-size companies, nobody with insider knowledge or cutting-edge interests. I can tell you that GPT was not on the radar of anyone I dealt with as an opportunity to reduce costs.
Some sort of cultural zeitgeist occurred, but in terms of symptoms I saw with my own eyes, I think ZIRP ending (projects getting axed) and layoffs starting (projects getting filled within ~24 hours) were huge drivers. I have no proof.
I had the same thoughts: there are clear indicators that the weakness in the labor market started before LLMs and AI took over popular discourse.
All the more reason to believe that while correlated, LLMs are certainly not the largest contributor, or even the cause of the job market weakness for young people. The more likely and simple explanation is that there are cracks forming in the economy not just in the US but globally; youth employment is struggling virtually everywhere. Can only speculate on the reasons, but delayed effects from questionable monetary and fiscal policy choices, increasing wealth gaps, tariffs, geopolitics, etc. have certainly not helped.
> The paper notes an adjustment for the end of ZIRP. I don't know enough econometrics to judge whether this adjustment was sufficient
Looking at the paper [0], they attempted to do it by regressing the number of jobs y_{c,q,t} at company c, time t, and "AI exposure quintile" q, with separate parameters jointly controlling for company/quintile (a), company/time (b) and quintile/time (g). This is in Equation 4.1, page 15, which I have simplified here:
log(y_{c,q,t}) ~ a_{c,q} + b_{c,t} + g_{q,t}
Any time-dependent effects (e.g. end of ZIRP/Section 174) that would equally affect all jobs at the company irrespective of how much AI exposure they have should be absorbed into b.
They normalized g with respect to October 2022 and quintile 1 (least AI exposure), and plotted the results for each age group and quintile (Figure 9, page 20). There is a pronounced decline that only starts in mid-2024 for quintiles 3, 4, and 5 in the youngest age group. The plots shown in the article are misleading, and are likely primarily a reflection of ZIRP, as you say. The real meat of the paper is Figure 9.
A potential flaw of this method is that ZIRP/Section 174 may have disproportionately affected junior positions with high AI exposure, e.g. software engineers. This would not be accounted for in b and would thus be reflected in g. It would be interesting to repeat this analysis excluding software engineers and other employees subject to Section 174.
Yeah my company started stepping up outsourcing in 2023. We also started some AI projects. The AI projects haven't made much progress but the outsourcing is at an extremely advanced stage.
I personally sat in meetings in 2022 where we adjusted staffing projections in anticipation of AI efficiency. Sure, some of it was "overhiring," but the reality was that those staffing goals were pre-AI. Once they were updated, that's when the layoffs started, because management didn't want anyone without an AI or big-data background.
In addition to this detail, I can't remember the last time I had a customer service call with someone stateside. It's easy to point to AI when offshoring, driven by interest rates, is really the reason.
I think it's lipstick on a pig. We've seen tech companies collude before, and I'm guessing they're doing it again, trying to drive down the price of talent and make their employees less demanding.
I remember years of a no-backfill policy at Devon on promises of automation, since 2017 at least. The “desirable” job market for young people was challenging well before LLMs became popular. Want a dead-end entry-level job in food service earning $20/hour? No problem.
> SWE figures dropped in mid-2022 (almost magically in line with interest rate hikes), and LLM copilots weren't introduced for another year
It was pretty clear by late 2022 that AI assisted coding was going to transform how software development was done. I remember having conversations with colleagues at that time about how SWE might transform into an architecture and systems design role, with transformer models filling in implementations.
If it was clear to workers like us, it was pretty clear to the c-suite. Not that it was the only reason for mass layoffs, but it was a strong contributor to the rationale.
Many large companies were placing a bet that there were turbulent times ahead, and were lightening their load preemptively.
It is possible that multiple trends are coalescing:
1. Layoffs after the web3 hiring spree
2. End of ZIRP
However, I think now, in 2025, it is impossible to reasonably claim AI isn't making an impact on hiring. Those who disagree here seem insistent on the notion that AI has no benefits whatsoever and thus could never cause job loss.
Exactly this. I've said it before and will say it again: new technologies emerge in response to trends, often accelerating existing trends rather than creating them.
I see a few explanations for what you're saying, and those might be true, but I strongly believe part of it is that investment (particularly VC, less so PE) has hit diminishing returns in tech, which means less subsidized "disruption", which means less money to hire people. AI becoming hugely popular right when this was happening is not a coincidence. And it's not just startups: less investment in startups also means fewer clients for AWS and Azure. A16Z / Sand Hill switching to AI is not just chasing the latest trend; it's a bid to reduce spending on people, the most expensive part of a tech company, as the only way to extend their unicorn-focused investment strategy.
This reminds me of the part of The Book of Why by Judea Pearl discussing how the do-calculus and the causal revolution came about from the simple insight that causes come before effects; the do-calculus was invented to keep track of that in the math, rather than obscuring it with statistical relations that work in either direction.
Re SWE: I know of some workflow digitalisation projects that had been in the planning stages for a very long time, budgeted as multi-person-year efforts, that out of pandemic necessity were executed by a team of 3 over a long weekend in 2020. This did not go unnoticed, by customers or by SWE providers.
GPT-3 was 2020; even if the technology wasn't mature, the hype was there, informing investment and hiring decisions.
There were also other factors: COVID booms, COVID busts, overcorrections. Elon showed you can cut staff by 90% and still keep a product running (kind of), and with X taking the flak, other companies followed suit without being as loud. There is a fairly major war in Europe...
The drop started between mid-2022 and 2023, and there is a single cause: the freeze of Russian assets. This led to governments around the world moving their assets out of Western/Anglo-Saxon countries. This lack of liquidity put the West in a "hang in there" situation. Economically, it showed up as rising interest rates; politically, where things move more slowly, as the emergence of a new coalition.
It’s really as simple as that. But people would like to believe that Western GDP exceeds global-south GDP by such a margin that none of this could be possible.
If you want insight into their heads, there is a Biden speech after the assets freeze where he declares that the Russian economy/country will collapse within a few weeks under the measures. None of this materialized, their bet has failed, and that is why Trump is trying to pull the US out of the mess.
Of course all of this is my personal opinion. So take it from the grain in my bag of salt.
I made a stupid simple model where hiring in all age brackets rose slowly until 2021 and then fell slowly. That produces very similar looking graphs, because the many engineers that were hired at the peak move up the demographic curve over time. Normalizing the graph to 2022 levels, as the paper seems to do, hides the fact that the actual hiring ratios didn't change at all.
Wow, that's hilarious. So essentially hiring could be identical across all age groups, but due to a glitch in the analysis (young people don't stay young, who knew?), it appears that younger people are losing jobs more than the rest.
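That cohort-aging artifact is easy to demonstrate with a toy model (all numbers invented): hiring of 22-year-olds ramps up to 2021 and then declines, every cohort simply ages, and the 2022-normalized "young" bracket still appears to fall relative to older brackets.

```python
def hires(year):
    # Stylized hiring curve: rises 2010-2021, then falls. Purely illustrative.
    return 100 + 5 * (year - 2010) if year <= 2021 else 155 - 10 * (year - 2021)

def headcount(year, lo, hi):
    # Everyone is hired at 22 and stays employed; a worker aged `age` in `year`
    # was hired in `year - (age - 22)`.
    return sum(hires(year - (age - 22)) for age in range(lo, hi + 1))

years = range(2015, 2026)
young = {y: headcount(y, 22, 25) for y in years}
older = {y: headcount(y, 26, 30) for y in years}

# Normalize each bracket to its 2022 level, as the paper's figures do
young_n = {y: young[y] / young[2022] for y in years}
older_n = {y: older[y] / older[2022] for y in years}

# The young bracket "declines" while the older one keeps rising, even though
# the age mix of hiring never changed -- peak cohorts just aged out of it.
assert young_n[2025] < 1 < older_n[2025]
```

By 2025 the young bracket holds only the post-peak cohorts while the older bracket is still filling with the 2017-2021 peak cohorts, so normalized curves diverge with zero change in hiring behavior by age.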
I think not hiring juniors is a tragedy-of-the-commons situation. It started before the AI boom, during COVID. It's not tax-related, as people here claim, since the phenomenon is not US-only.
The ZIRP era made companies hire people as if there was no tomorrow, and companies started "poaching" engineers from others, including juniors. I saw some interns with 2 years of experience getting offers as seniors. I had friends being paid to attend boot camp.
Then everyone realized they were training junior engineers who would quickly get offers from other companies as "Senior" and leave. So companies stopped hiring them.
We need to reframe it. At this point, what we call "AI" is not a technology but a subscription company.
A technology is a tool you can adopt in your toolchain to perform a task, even if in this case the task is outsourcing cognitive load. With a subscription company, as long as the subscription is active, you get to outsource some of the cognitive load. When Anthropic's CEO says that white collar jobs will disappear, he means that he is selling Enterprise subscriptions and that companies will inevitably buy them.
Any economic data from between 2020 and 2025 should be tossed in the garbage. We will have no idea what effect AI has or hasn't had until AI has been available outside of the current, extremely confounded circumstances. Tell me how employment looks after the next recession, when the after-effects of the pandemic, rapid inflation, interest-rate unpredictability, and tariff whiplash are hopefully all behind us.
Lots of apparently unexplored alternative explanations. In times of uncertainty, you don't hire unless you need to. Another junior developer or customer service agent can be delayed, and youngsters in the labor force are most exposed to this. But if you need a home health aide, you probably don't have a lot of choice: somebody has to change grandma's diapers. Tariffs top my list of uncertainties for businesses, but interest rates are a close second.
I started university (in Australia) in 2004, not long after the dot com crash. CS enrolment rates were low, kids were getting scared off due to perceived lack of jobs. As a result, there was a shortage of grad talent (companies were already ramping up hiring again by 2004). I got a grad job just fine, in 2008, and I've never been short of work since.
So my advice to high school kids of 2025: right now is the perfect time to enrol in CS. 5 years from now, the AI hype will be over, and employers will be short on grads.
> 5 years from now, the AI hype will be over, and employers will be short on grads
Alternative view: the AI hype is real, AI takes over, and no one has any jobs anyway.
Also a thought: in 5 years the boomers will be retiring in droves, as will the first wave of Gen X, and the market in most fields should be opening up anyway.
This time is different. A fact right now is that software engineers can orchestrate LLMs and agents to write software. The role of the engineers who do this becomes quality control, compliance, software architecture, and some out-of-the-box thinking for when LLMs don't cut it. What makes you think advances in AI won't take care of the tasks LLMs don't do well today? My point is that once those tasks are taken care of, a CS graduate won't be doing the work they learned in their degree. What people need to learn is how to think about customers' needs in abstract terms, communicate them to an AI, and judge the output the way one judges a painting.
"AI" dives in and disrupts, and then it turns out the AI isn't all that much I. The disruption phase, where HR dumps staff based on dubious promises and directions from above, takes a few months. The gradual re-hiring takes far longer than the dumping phase and will not trigger thresholds.
I've spent quite a while with "AI". LLMs do have a use but dumping staff is not one of the best ideas I've seen. I get that a management team are looking for trimmings but AI isn't the I they are looking for.
In my opinion (MD of a small IT focused company) LLMs are a better slide rule. I have several slide rules and calculators and obviously a shit load of computers. Mind you my slide rules can't access the internet, on the other hand my slide rules always work, without internets or power.
Software engineering is in correction mode since 2022, right after the Covid highs. AI is just the facade for job cuts. Zuck has been doing “the year of efficiency” for years now.
There are other factors that drive employment of the young workforce which don't have much to do with knowledge or skills. These are, so to speak, the blue-collar jobs within IT.
There is a lot of low-level drudgery work that is currently outsourced or assigned to non-employees (mostly young) in most companies. For example, IT support and maintenance is mostly done outside the West. This work requires a lot of back and forth; I don't see AI taking over this area.
Also, some work is assigned to young workers to spread accountability and risk ownership. I'm not sure you can hold AI as accountable as humans.
Also, young workers are generally preferred for their agility, flexibility, and capacity for hard work. They were easier to exploit, and if they were unmarried, they wouldn't mind giving all of their time and attention to work for less pay. I used to work in teams that stayed overnight at the office to complete projects. Young people are also very cohesive and team up well.
So, looking at the situation purely from a knowledge perspective may not give the full picture.
Several teams around me stopped hiring juniors over the past couple of years. It’s not that the newcomers aren’t good, it’s just that no one has time to train them. AI showed up at just the right moment to offer a convenient excuse, and companies are happy to save on the cost of mentoring.
But long term, this feels like borrowing from the future. Without someone to train, there’s no one ready to step up later.
Skimming this, I'm not sure why it couldn't be explained by the layoffs we had a couple years ago, which were primarily at tech companies (which are indeed more exposed to LLMs) and probably hit junior devs more.
I don't get how interest rates get at best a cursory phrase when the end of the ZIRP regime is one of the biggest macro events of the past several decades. It deserves more of a spotlight.
Looking at the "overall" graph they provide, it's clear that jobs as a whole have flatlined since 2022, and even non-exposed jobs have seen their growth rate hugely reduced. If we disregard the existence of AI, it would be easy to assume this is just a regular economic slowdown impacting younger people more than older ones. I don't see any correlation work done to isolate the cause, especially as a lot of the "AI-exposed" work is also very much exposed to how much money is flowing in.
I doubt the whole narrative is true; it may just be hope that we don't need to hire because we'll be able to do it with AI. But reality outside of software and tech is very different. I work in an organisation that is heavily pushing AI, and even with that push, the typical employee is still not utilising it fully, is unprepared, and expects training from the employer. There is also a disconnect over connecting org data to these tools, and between developers and cyber teams.
My experience: we're hiring for an AI engineer position and for a frontend developer position (and yes, we posted our positions here on Hackernews).
We have a stream of cookie-cutter candidates, as if they are clones of each other; it's uncanny. They typically have a BS degree from some foreign university, then a CS Master's in the US, experience with robotics, then several years of experience at large companies.
And they completely fold during in-person coding tasks: not being able to explain the difference between DFS and BFS (depth-/breadth-first search), or to write a simple custom metric and train a network in PyTorch.
And a similar story for the frontend developer position.
We now literally have to add more filters to not get inundated by underqualified candidates. These filters will make it harder for beginners to even _get_ to the resume review stage.
No conclusions from me, but something's been broken in the CS jobs market for a while.
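For reference, the DFS/BFS distinction those candidates stumbled on comes down to one line of code: a LIFO stack versus a FIFO queue. A minimal sketch (the toy graph is made up):

```python
from collections import deque

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}

def traverse(start, breadth_first):
    # DFS uses a LIFO stack; BFS uses a FIFO queue. That is the whole difference.
    frontier, seen, order = deque([start]), {start}, []
    while frontier:
        node = frontier.popleft() if breadth_first else frontier.pop()
        order.append(node)
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return order

assert traverse("a", breadth_first=True) == ["a", "b", "c", "d"]   # level by level
assert traverse("a", breadth_first=False) == ["a", "c", "d", "b"]  # dives deep first
```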
The year is 2030: The junior engineers of the Adeptus Mechanicus stand in a circle around the holy monitor, chanting prayers to the Omnissiah. Their arcane books and lore guide the machine spirit, ensuring the correct holy words of blessing are entered into the holy prompt. Petitioning the machine spirit for its most truthful, secure holy answers.
Meh... just rehashing what he said before. The paper itself is fundamentally flawed, examining only a minuscule portion of the job market. If we step back and look at Europe's struggling economies over recent decades, we see that economic downturns disproportionately affect young people. Greece serves as the poster child, followed by Spain and Italy. In Germany alone, we've lost 50,000 jobs in manual labor heavy industries (mainly automotive) this past year. We're also seeing a 60% decline in apprenticeships for labor intensive roles at DAX companies that aren't even AI affected yet. AI has become a convenient scapegoat for a faltering economy driven by geopolitical tensions, protectionism and unqualified leadership in the world's largest economies. Roaring 20s indeed.
[0] https://digitaleconomy.stanford.edu/wp-content/uploads/2025/...
I sense some conflation of causation/correlation at hand.
https://docs.google.com/spreadsheets/d/1z0l0rNebCTVWLk77_7HA...
Oh wait, wrong dystopian future.
Education is an externality.