I am constantly seeing this thing do most of my work (which is good, actually; I don't enjoy typing code), but it requires my constant supervision and frequent intervention, and it keeps trying to sneak in subtle bugs or weird architectural decisions that, I feel with every bone in my body, would bite me in the ass later. I see JS developers with little experience and zero CS or SWE education rave about how LLMs are so much better than us in every way, when the hardest thing they've ever written was bubble sort. I'm not even freaking out about my career; I'm freaking out about how much today's "almost good" LLMs can empower incompetence, and how much damage that could cause to systems that I either use or work on.
kilroy123|16 days ago
But _what if_ they work out all of that in the next 2 years and these tools stop needing constant supervision and intervention? Then what?
srcreigh|16 days ago
We can synthesize answers to questions more easily, yes. We can make better use of extensive test suites, yes. We cannot give 1000 different correct answers to the same prompt. We cannot read minds.
nprateem|16 days ago
I'm just not sure who will end up employed. The near-term state is obviously Jira-driven development, where agents just pick up tasks from Jira, etc. But will that mean the PMs go and we get a technical PM, or will we be the ones binned? Probably for most SMEs it'll just be maybe 1 PM and 2 or so technical PMs churning out tickets.
But whatever. It's the trajectory you should be looking at.
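For concreteness, the "agents just pick up tasks from Jira" loop described above could be sketched like this. This is a minimal illustration, not a real Jira integration: `Ticket`, `claim_next_ticket`, and the agent ID are all hypothetical stand-ins for a real Jira REST client and an LLM coding agent.

```python
# Minimal sketch of "Jira-driven development": an agent polls a ticket
# queue and claims the oldest unassigned ticket to work on.
# Ticket and claim_next_ticket are hypothetical placeholders, not a
# real Jira client API.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Ticket:
    key: str                        # e.g. "PROJ-123"
    created: int                    # creation timestamp (lower = older)
    assignee: Optional[str] = None  # None means unclaimed

def claim_next_ticket(tickets: list, agent_id: str) -> Optional[Ticket]:
    """Claim the oldest unassigned ticket; return None if none are open."""
    open_tickets = [t for t in tickets if t.assignee is None]
    if not open_tickets:
        return None
    oldest = min(open_tickets, key=lambda t: t.created)
    oldest.assignee = agent_id  # mark as claimed by this agent
    return oldest
```

In a real setup the queue would come from a JQL search against the Jira REST API and the claim would be an issue-assignment call, with the agent posting its work back as a pull request linked to the ticket.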
threethirtytwo|15 days ago
Right now you state the current problem is: "requiring my constant supervision and frequent intervention and always trying to sneak in subtle bugs or weird architectural decisions"
But in 2 years that could be gone too, given the objective, literal trendline. So I actually don't see how you can hold this opinion: "I'm not even freaking out about my career, I'm freaking out about how much today's 'almost good' LLMs can empower incompetence and how much damage that could cause to systems that I either use or work on." All logic points away from it.
We need to be worried; LLMs are only getting better.