top | item 43910545

sirstoke | 9 months ago

I’ve been thinking about the SWE employment conundrum in a post-LLM world for a while now, and since my livelihood (and that of my loved ones) depends on it, I’m obviously biased. Still, I would like to understand where my logic is flawed, if it is. (I.e., I’m trying to argue in good faith here.)

Isn’t software engineering a lot more than just writing code? And I mean like, A LOT more?

Informing product roadmaps, balancing tradeoffs, understanding relationships between teams, prioritizing between separate tasks, pushing back on tech debt, responding to incidents, it’s a feature and not a bug, …

I’m not saying LLMs will never be able to do this (who knows?), but I’m pretty sure SWEs won’t be the only role affected (or even the most affected) if it comes to this point.

Where am I wrong?

MR4D|9 months ago

I think a helpful analogy is that of a woodworker. Automation just allowed them to do more things in less time.

Power saws really reduced time, lathes even more so. Power drills changed drilling immensely, and even nail guns are used on roofing projects because manual nailing is way too slow.

All the jobs still exist, but their tools are way more capable.

babyshake|9 months ago

Automation allows one worker to do more things in less time, and allows an organization to have fewer workers doing those things. The result, it would seem, is more people out of work and reduced wages for those who still have it, while the owner class accrues all the benefits.

bentt|9 months ago

This is how I use LLMs to code. I am still architecting, and the code it writes I could write given enough time and care, but the speed with which I can try ideas and make changes fundamentally alters what I will even attempt. It is very much a table saw.

tmoravec|9 months ago

How many woodworkers were there as a proportion of the population in the 1800s, and how many now?

CooCooCaCha|9 months ago

I think you’re making a mistake in assuming AI is similar to past automation. Sure, in the short term it might be comparable, but in the long term AI is the ultimate automation.

naasking|9 months ago

> Informing product roadmaps, balancing tradeoffs, understanding relationships between teams, prioritizing between separate tasks, pushing back on tech debt, responding to incidents, it’s a feature and not a bug, …

Ask yourself how many of these things still matter if you can tell an AI to tweak something and it can rewrite your entire codebase in a few minutes. Why would you have to prioritize? Just tell the AI everything you have to change and it will do it all at once. Why would you have tech debt? That's something that accumulates because humans can only make changes of a limited scope at a mostly fixed rate. LLMs can already respond to feedback about bugs, features, and incidents, and can even take advice on balancing tradeoffs.

Many of the things you describe are organizational principles designed to compensate for human limitations.

dgroshev|9 months ago

Software engineering (and most professions) also have something that LLMs can't have: an ability to genuinely feel bad. I think [1] it's hugely important and is an irreducible advantage that most engineering-adjacent people ignore for mostly cultural reasons.

[1]: https://dgroshev.com/blog/feel-bad/

concats|9 months ago

The way I see it:

* The world increasingly runs on computers.

* Software/computer engineers are the only people who truly know how computers work.

Thus it seems highly unlikely to me that we won't have jobs.

What that job entails I do not know. Programming like we do today might not be something we spend a considerable amount of time doing in the future, just as most people today don't spend much time handling punched cards or replacing vacuum tubes. But there will still be other work to do; I don't doubt that.