top | item 47054789


cmiles8 | 12 days ago

This is the elephant in the room nobody wants to talk about. Unless this is fixed, AI is dead in the water for the mass labor replacement that is supposedly coming.

Summarize some text while I supervise the AI = fine and a useful productivity improvement, but doesn’t replace my job.

Replace me with an AI making autonomous decisions out in the wild, and liability-ridden chaos ensues. No company in its right mind would do this.

The AI companies are now in an existential race to address that glaring issue before they run out of cash, with no clear way to solve the problem.

It’s increasingly looking like the current AI wave will disrupt traditional search and join the spell-checker as a very useful tool for day-to-day work… but the promised mass labor replacement won’t materialize. Most large companies are already starting to call BS on the "AI replacing humans en masse" storyline.


pvab3|12 days ago

Part of the problem is the word "replacement" kills nuanced thought and starts to create a strawman. No one will be replaced for a long time, but what happens will depend on the shape of the supply and demand curves of labor markets.

If 8 or 9 developers can do the work of 10, do companies choose to build 10% more stuff? Do they make their existing stuff 10% better? Or are they content to continue building the same amount with 10% fewer people?

In years past, I think they would have chosen to build more, but today I think that question has a more complex answer.
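A quick back-of-envelope makes the arithmetic concrete (a sketch in Python; the 10-dev figures are illustrative, taken from the example above): 9 devs matching the output of 10 is an ~11% per-head gain, and after a flat 10% gain you can hold output constant with ~9.1 heads.

```python
def productivity_gain(old_headcount: int, new_headcount: int) -> float:
    """Per-developer productivity gain implied when a smaller team
    matches the output of a larger one."""
    return old_headcount / new_headcount - 1

def headcount_for_same_output(old_headcount: int, gain: float) -> float:
    """Developers needed to keep output flat after a productivity gain."""
    return old_headcount / (1 + gain)

# 9 devs doing the work of 10 implies ~11.1% more output per dev
print(f"{productivity_gain(10, 9):.1%}")              # 11.1%
# A flat 10% gain lets ~9.1 devs produce what 10 did before
print(f"{headcount_for_same_output(10, 0.10):.2f}")   # 9.09
```

Whether that freed ~0.9 of a headcount becomes downsizing, slower backfill, or extra scope is exactly the supply-and-demand question being asked here.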

1PlayerOne|12 days ago

AI says:

1. The default outcome: fewer people, same output (at first)

When productivity jumps (e.g., 5–6 devs can now do what 10 used to), most companies do not immediately ship 10% more or make things 10% better. Instead, they usually:

- Freeze or slow hiring
- Backfill less when people leave
- Quietly reduce team size over time

This happens because:

- Output targets were already “good enough”
- Budgets are set annually, not dynamically
- Management rewards predictability more than ambition

So the first-order effect is cost savings, not reinvestment.

Productivity gains are initially absorbed as efficiency, not expansion.

2. The second-order effect: same headcount, more scope (but hidden)

In teams that don’t shrink, the extra capacity usually goes into things that were previously underfunded:

- Tech debt cleanup
- Reliability and on-call quality
- Better internal tooling
- Security, compliance, testing

From the outside, it looks like:

“They’re building the same amount.”

From the inside, it feels like:

“We’re finally doing things the right way.”

So yes, the product often becomes “better,” but in invisible ways.

3. Rare but real: more stuff, faster iteration

Some companies do choose to build more, but only when growth pressure is high. This is common when:

- The company is early-stage or mid-scale
- Market share matters more than margin
- Leadership is product- or founder-led
- There’s a clear backlog of revenue-linked features

In these cases, productivity gains translate into:

- Faster shipping cadence
- More experiments
- Shorter time-to-market

But this requires strong alignment. Without it, extra capacity just diffuses.

4. Why “10% more” almost never happens cleanly

The premise sounds linear, but software work isn’t. Reasons:

- Coordination, reviews, and decision-making still bottleneck
- Roadmaps are constrained by product strategy, not dev hours
- Sales, design, legal, and operations don’t scale at the same rate

So instead of:

“We build 10% more”

You get:

- “We missed fewer deadlines”
- “That migration finally happened”
- “The system breaks less often”

These matter—but they’re not headline-grabbing.

5. The long-run macro pattern

Over time, across the industry:

- Individual teams → shrink or hold steady
- Companies → maintain output with fewer engineers
- Industry as a whole → builds far more software than before

This is the classic productivity paradox:

- Local gains → cost control
- Global gains → explosion of software everywhere

Think:

- More apps, not bigger teams
- More features, not more people
- More companies, not fatter ones

6. The uncomfortable truth

If productivity improves and:

- Demand is flat
- Competition isn’t forcing differentiation
- Leadership incentives favor cost control

Then yes, companies are content to build the same amount with fewer people. Not because they’re lazy, but because:

- Efficiency is easier to measure than ambition
- Savings are safer than bets
- Headcount reductions show up cleanly on financials

alex43578|12 days ago

There’s a middle road where AI replaces half the juniors or entry level roles, the interns and the bottom rung of the org chart.

In marketing, an AI can effortlessly perform basic duties, write email copy, research, etc. Same goes for programming, graphic design, translation, etc.

The results will be looked over by a senior member, but it’s already clear that a role with 3 YOE or less could easily be substituted with an AI. It’ll be more disruptive than spell check, clearly, even if it doesn’t wipe out 50% of the labor market: even 10% would be hugely disruptive.

johnnienaked|12 days ago

I think you're really overstating things here. Entry-level positions are the tier from which replacements for senior positions eventually come. They don't do a lot, sure, but they are cheap and easily churnable. This is precisely NOT the place companies focus on for cutbacks or downsizing. AI being acceptable at replacing unskilled labor doesn't mean it WILL replace it. It has to make business sense to implement it.

cmiles8|12 days ago

Not really though:

1. Companies like savings, but they’re not dumb enough to just wipe out junior roles and shoot themselves in the foot for future generations of company leaders. Business leaders have been vocal on this point, calling it terrible thinking.

2. In the US and Europe the work most ripe for automation and AI was long since “offshored” to places like India. If AI does have an impact it will wipe out the India tech and BPO sector before it starts to have a major impact on roles in the US and Europe.

neuronic|12 days ago

And why would it materialize? Anyone who has used even modern models like Opus 4.6 in very long and extensive chats about concrete topics KNOWS that this LLM form of Artificial Intelligence is anything but intelligent.

You can see the cracks forming quite fast, actually, and you can almost feel how trained patterns are regurgitated with some variance, without actually contextualizing and connecting things. More guardrailing, like web sources or attachments, just narrows down the possible patterns, but you never get the feeling that the bot understands. Your own prompting can also significantly affect opinions and outcomes, no matter the factual reality.

gjk3|12 days ago

The great irony is this episode is exposing those who are truly intelligent and those who are not.

Folks feel free to screenshot this ;)

aidev19373913|12 days ago

It doesn’t have to replace us, just make us more productive.

Software is demand-constrained, not supply-constrained. Demand for novel software is down; we already have tons of useful software for anything you can think of. Most developers at Google, Microsoft, Meta, Amazon, etc. barely do anything. Productivity is approaching zero, which is why the corporations are already outsourcing.

The number of workers needed will go down.

gjk3|12 days ago

Well done sir, you seem to think with a clear mind.

Why do you think you are able to evade the noise, whilst others seem not to? I'm genuinely curious. I'm convinced it's down to the fact that the people 'who get it' have a particular way of thinking that others don't.

sesm|12 days ago

The narrative about AI replacing humans is just a way to say 'we became 2x more productive' instead of saying 'we cut 50% of jobs', which sounds better to investors. The real reason for the job cuts is COVID overhiring plus rising interest rates. If you remember, Twitter did its job cuts without any AI-related narrative.

zaphirplane|12 days ago

1. You are massively assuming less-than-linear improvement; even linear improvement over 5 years puts LLMs in a different category.

2. More efficiency means needing fewer people, which means redundancy, which means a cycle of low demand.

8n4vidtmkvmk|12 days ago

1. It has nothing to do with 'improvement'. You can improve it to be a little less susceptible to injection attacks, but that's not the same as solving the problem. If it wires all your money to a scammer only 0.1% of the time, are you going to be satisfied with that level of "improvement"?
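To put a rough number on that: even a tiny per-action failure rate compounds fast once an agent runs unsupervised at scale. Assuming independent actions (a simplification, and the 0.1% rate is just the figure from the comment above), a sketch in Python:

```python
def p_at_least_one_failure(per_action_rate: float, n_actions: int) -> float:
    """Probability of at least one failure across n independent actions."""
    return 1 - (1 - per_action_rate) ** n_actions

# At a 0.1% per-action failure rate, risk compounds quickly:
print(f"{p_at_least_one_failure(0.001, 100):.1%}")    # ~9.5%
print(f"{p_at_least_one_failure(0.001, 1000):.1%}")   # ~63.2%
```

So "0.1% of the time" becomes better-than-even odds of at least one bad transaction within a thousand autonomous actions, which is exactly the liability problem raised at the top of the thread.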

windexh8er|12 days ago

OK. Let's take what you've stated as a truth.

So where is the labor-force replacement option on Anthropic's website? Dario isn't shy about these enormous claims of replacing humans. He's made the claim yet shows zero proof. But if Anthropic could reliably replace anyone today, why would they let you or me take that revenue? I mean, they are the experts, right? The reality is these "improvement" metrics are built on sand. They mean nothing and are marketing. Show me any model replacing a receptionist today. Trivial, they say, yet they can't do it reliably. AND it costs more, even at these subsidized prices.

otabdeveloper4|12 days ago

LLMs haven't been improving for years.

Despite all the productizing and the benchmark gaming, fundamentally all we've gotten is some low-hanging performance improvements (MoE and such).

Applejinx|12 days ago

It sure did: I never thought I would abandon Google Search, but I have, and it's the AI elements that have fundamentally broken my trust in what I used to take very much for granted. All the marketing and skewing of results and Amazon-like lying for pay didn't do it, but the full-on dive into pure hallucination did.