bananaflag | 1 day ago
All the other attempts failed because they were just mindless conversions of formal languages to formal languages. Basically glorified compilers. Either the formal language wasn't capable enough to express all situations, or it was capable and thus it was as complex as the one thing it was designed to replace.
AI is different. You tell it what you want in natural language, which can be ambiguous and need not cover all the bases. People are familiar with natural language. And the AI can fill in the missing details and disambiguate the rest.
This has been known to be possible for decades, since (simplifying a bit) a non-technical manager can tell an engineer, in natural, ambiguous language, what to do, and the engineer will do it. Now the AI takes the place of the engineer.
Also, I personally never believed before AI that programming would disappear, so the argument that "this has been hyped before" doesn't touch my soul.
I have no idea why this is so hard to understand. I'd like people to reply to me in addition to downvoting.
danhau|1 day ago
How far AI will succeed in replacing programmers remains to be seen. Personally I think many jobs will disappear, especially in the largest domains (web). But I think this will only be a fraction and not a majority. For now, AI is simply most useful when paired with a programmer.
aleph_minus_one|1 day ago
This is not the case:
- Before the 90s, programming was rather a job for people who were insanely passionate about technology, and working as a programmer was not that well-regarded (so no "growing opportunities").
- After the burst of the first dotcom bubble, a lot of programmers were unemployed.
- Every older programmer can tell you how quickly their skills can become, and have become, irrelevant.
Over the last few decades, the stability and opportunities for programmers have looked more like a series of boom-bust cycles.
cafebabbe|1 day ago
Experienced through old-school (pre-LLM) practice.
I don't clearly see a good endgame for this.
t_mahmood|1 day ago
Does he know about SQL injection? XSS?
Maybe he knows a little about security and asks the LLM to make a secure site with all the protections needed. But how does the manager know it works at all? If you only find out there's an issue in a critical part of your software after your users' data has been stolen, how bad is the fallout going to be?
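To make the SQL injection point concrete, here is a minimal sketch (using Python's built-in sqlite3 with a hypothetical `users` table) of the kind of bug a non-engineer reviewing LLM output would likely never spot: string interpolation lets user input rewrite the query, while a parameterized query treats the same input as plain data.

```python
import sqlite3

# In-memory demo database; schema and data are purely illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: the payload becomes part of the SQL and matches every row.
leaked = conn.execute(
    f"SELECT secret FROM users WHERE name = '{user_input}'"
).fetchall()
print(leaked)  # leaks all secrets

# Safe: a parameterized query binds the input as a literal string.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # no user is literally named "' OR '1'='1", so nothing matches
```

Both versions look superficially "working" on well-behaved input, which is exactly why you need someone who knows what to test for.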
How good a tool is also depends on who's using it. Managers are obviously not engineers, unless they were engineers before becoming managers; but you are saying engineers are not needed. So where is that engineer-turned-manager going to come from? I'm sure we're not growing them on engineering trees.
skydhash|1 day ago
In the real world, the materials are visible, so people have at least a partial understanding of how things get done. But most of the software world is invisible and has no material constraints other than the hardware (you can't use RAM that isn't there). If the hardware is like a blank canvas, a standard web framework is like a paint-by-numbers book (but one with lines drawn in pencil so you can erase them easily). Asking the user to code with an LLM is like asking a blind person to paint the Mona Lisa with a brick.
ajshahH|1 day ago
Are you suggesting “And Claude, make no mistakes” works?
Because otherwise you need an expert operating the thing. Yes, it can answer questions, but you need to know what exactly to ask.
> This has been known to be possible for decades, as (simplifying a bit) the (non-technical) manager can order the engineer in natural, ambiguous language what to do and they will do it
I have yet to see vibe coding work like this. Even expert devs using LLMs get incorrect output. The moment you have to correct your prompt, your argument fails.
mexicocitinluez|1 day ago
And while these tools can be invaluable in some cases, I still don't know how we get from "hazy requirements where the user doesn't even know what they want" to "production-ready apps built at the fingertips of the PM".
Another really important detail people keep missing is that we have to make thousands of micro-decisions along the way to build a cohesive experience for the user. LLMs haven't really shown they're great at keeping assumptions out of code. In fact, they're really bad at it.
Lastly, do people not realize how easy it is to convince an LLM of something that isn't true, or vice versa? I love these tools, but even I find myself steering them in the direction that makes sense to me, not the direction that makes sense generally.
mexicocitinluez|1 day ago
This is just categorically false.
No-code tools didn't fail because they were "mindless conversions of formal languages to formal languages". They failed because the people who were supposed to benefit the most (non-developers) neither had the time nor desire to build stuff in the first place.
medi8r|1 day ago
Paradoxically, this may mean there are more jobs for programmers and programmer-likes alike as new cottage industries are born. AI for dentists is coming.