top | item 44087539

timkam | 9 months ago

Is it really that LLM-based tools make developers so much more productive, or rather that organizations have figured out they can do with fewer -- and less privileged -- developers? What I don't really see, especially not inside big tech, are stories of teams that have become amazingly more productive. For now it feels like we get some minor productivity improvements that probably don't offset the investment and are barely enough to keep the narrative alive.


locococo|9 months ago

A lot of it is perception. Writing software was long considered somewhat difficult, something that required smart people. AI changes this perception, and coding starts to be perceived as a low-level task that anyone can do easily with augmentation from AI tools. I certainly agree that writing software is turning more into a factory job and is less intellectually rewarding now.

cmiles74|9 months ago

When I started working in the field (1996), I was told that I would receive detailed specs from an analyst that I would then "translate" into code. At that time this idea was already out of fashion: things worked this way for the core business team (COBOL on the AS/400), but in my group (internal tools, mostly Delphi) I would get only the vaguest requirements.

Eventually everyone was expected to understand a good deal of the code they were working on. The analyst and the coder became the same person.

I'm deeply skeptical that the kind of people who enjoy software development are the same kind of people who enjoy steering and proofreading LLM-generated code. Unlike the analyst and the coder, this strikes me as a very different skill set.

pjmlp|9 months ago

It has been a factory job for decades.

Not everyone gets to code the next ground breaking algorithm at some R&D department.

Most programming tasks are rather repetitive, and in many countries software developers are hardly looked up to; it is just another blue-collar job.

And in many cultures, if you don't go into management after about five years, it is usually seen as a failure to advance in your career.

billy99k|9 months ago

It's been like this for a while now. Aside from companies like Google and Facebook, most companies are building some CRUD web app where development consists of gluing together code for multiple third-party services and libraries.

It's these sorts of jobs that will be replaced by AI plus a vibe coder, who will cost much less because you don't need as much experience or expertise.

zkry|9 months ago

Even before AI, I always had the perception that writing software was intellectually more on the level of plumbing. AI just feels like having one of those fancy new tools that tradespeople use.

catigula|9 months ago

What you're describing doesn't sound like something that requires a lot of foreign laborers.

datavirtue|9 months ago

It's been like this for decades.

Nasrudith|9 months ago

Organizations have long had a preference for "deskilling": making work reliable through bureaucratic procedures, regardless of the side effects, even if it costs more because you need three people where one talented person could do it before. It is more dependable, even if it is dependably mediocre, though this technique may lead to their long-term doom and irrelevance.

xkjyeah|9 months ago

The number of organizations that continue to use tedious languages like Java 8 and Golang...

Like, they never realized they were turning humans into compilers for abstract concepts, yet now they are telling humans to get tf out of the way of AI.

gerdesj|9 months ago

Please give some worked examples.

I'm not sure what: "'deskilling' to something reliable through bureaucratic procedures" ... means.

I'm the Managing Director of a small company and I'm pretty sure you are digging at the likes of me (inter alia) -- so what am I doing wrong?

stock_toaster|9 months ago

I wonder about codebase maintainability over time.

I hypothesize that over time vibe-coding slowly "bit rots" a complex codebase with poor abstractions and subtle bugs, making it less robust, more difficult to maintain, and more difficult to add new features and functionality to.

So while companies may be seeing what appears to be increases in output _now_, they may be missing the increased drag on features and bugfixes _later_.

doug_durham|9 months ago

Up until now, large software systems required thousands of hours of work and the efforts of bright engineers. We treat established code as something to be preserved because it embeds so much knowledge and took so long to develop. If it rots, it takes too long to repair or never gets repaired.

Imagine a future where the prompts become the precious artifact. We regularly `rm -rf *` the entire code base and regenerate it from the original prompts, perhaps when a better model becomes available. We stop fretting about code structure or hygiene because the code won't be maintained by developers. Code is written for readability and auditability. So instead of finding the right abstractions that allow the problem to be elegantly implemented, the focus is on allowing people to read the code to audit that it does what it says it does. No DSLs, just plain readable code.

greyadept|9 months ago

I’m concerned that it might not be easy to vibecode a security fix for a complex codebase, especially when the flaw was introduced by vibecoding.

mrheosuper|9 months ago

I wonder whether we had the same conversation when the C compiler first came out.

People may have worried that the ASM codebase would bit-rot and that no one would be able to understand the compiler output or add new features to the ASM codebase.

aprilthird2021|9 months ago

Yes, inside big tech I also see a lot of desire for us to come up with some great AI achievements, but so far they are not achieving much more than existing automations, bots, and code generators can already do for us.

inadequatespace|9 months ago

Right. What the article unsurprisingly glosses over, per usual, is that just because AI is perceived (by higher-ups who don't actually do the work) to speed up coding work doesn't mean it actually does.

And to some extent, everyone involved (depending on how delusional they are) probably knows it's simply an excuse to do layoffs (replaced by offshoring) by artificially "raising the bar" to something unrealistic for most people.

add-sub-mul-div|9 months ago

For this narrative to make sense you would have to believe that Amazon management cares more about short-term profit than the long-term quality of their work.

timkam|9 months ago

The narrative reflects a broader cultural shift, from "we are all in this together" (pandemic) to "our organizations are bloated and people don't work hard enough" (already pre-LLM hype post-pandemic). The observation that less-skilled people can, with the help of LLMs, take the work of traditionally more-skilled people fits this narrative. In the end, it is about demoting some types of knowledge workers from the skilled class to the working class. Apparently, important people believe that this is a long-term sustainable narrative.

locococo|9 months ago

Management has different layers with different goals. A middle manager and a director certainly care a lot about accomplishing short term goals and are ok with tech debt to meet the goals.

Jeff_Brown|9 months ago

Caring is part of it. Having good measures is another. Older measures that worked might need updating to reflect the new, higher spaghetti risk. I expect Amazon to figure it out but I don't see why they necessarily already would have.

layer8|9 months ago

So it does make sense?

codr7|9 months ago

[flagged]

Pet_Ant|9 months ago

I see it more as replacing shitty code monkeys because it leaves the hard parts behind.

closewith|9 months ago

But you, of course, with your superior skills, are above that risk?