DandyDev|1 year ago
I don't understand why so many people subscribe to this "prediction". It seems like unsubstantiated hyperbole to me.
There are a few reasons why I don't believe AI will replace programmers anytime soon:
1. The job of a developer/engineer entails so much more than writing code. Figuring out what the business wants, turning that into a good (system) design, etc. takes up more time than the actual coding itself.
Unless of course you take "programmer" literally, but I have not seen many companies that still hire programmers in the most narrow sense, that only focus on writing code.
2. Support and maintenance is a huge part of the job that I don't see AI doing. Theoretically you could let humans focus on that part, but I believe support and maintenance will become much more costly if the people doing that job have no familiarity with the code because they didn't write it.
3. As evidenced by many comments in the thread elsewhere on HN about the announcement of Claude Sonnet 3.7, AI still routinely makes mistakes that are super easy to spot and verify. As long as that remains the case, it's going to be detrimental to the success of your company if you give AI too much autonomy.
I know people will argue that AI is evolving so fast that the above will be solved soon. But I think all three aspects I mentioned are such fundamental roadblocks that they won't be solved soon.
What I do believe in is engineers becoming so much more productive as AI evolves.
truculent|1 year ago
I don’t understand why so many people are convinced that these newfangled automobiles will replace horses. It sounds like unsubstantiated hype to me.
There are a few reasons why I don’t believe cars will replace horses anytime soon:
1. Riding and caring for a horse is about much more than just transportation. Horses have been an integral part of life for centuries—they provide companionship, work the land, and serve in countless roles beyond simple travel. Even if you consider only their use for getting from place to place, riding is a skill that people take pride in, and I don’t see that disappearing overnight.
2. The maintenance and upkeep of these machines seem like a nightmare. A horse may need food and care, but it doesn’t require expensive parts, specialized fuel, or constant repairs from trained mechanics. If a carriage breaks, any competent craftsman can fix it—but if one of these new engines fails, who will know how to repair it?
3. From what I’ve seen, these automobiles are still prone to frequent breakdowns and failures. They get stuck in mud, they require smooth roads (which hardly exist outside cities), and they are unreliable compared to a well-trained horse. If a machine fails, you’re stranded—whereas a horse will always find its way home.
I know people will argue that these machines are improving rapidly and that soon they’ll overcome these issues. But I think these challenges are fundamental and won’t be solved anytime soon.
What I do believe, however, is that for certain tasks, automobiles may assist in making travel more efficient. But replace the horse entirely? I just don’t see it happening.
heisgone|1 year ago
I don't expect less demand for people mastering technology, considering AI will only increase the amount of it. What we should expect is that the balance between the number of people required to create a new product and the number of people required to maintain it will change dramatically. Successful startups are going to be composed of smaller teams. On the other end, legacy code is going to require armies of people to deal with it.
florbnit|1 year ago
> but I have not seen many companies that still hire programmers in the most narrow sense, that only focus on writing code.
So you yourself have already seen the demise of the programmer, so why are you arguing against it? Software development isn’t going away. But just like we no longer have tweeners in animation, we’ll soon no longer have programmers in software development. Soon thereafter we won’t have “front-enders” and “back-enders”, the term “full stack” will lose meaning, and in the end what we call a software developer will be more akin to what you today call a business analyst than a programmer.
colesantiago|1 year ago
If everyone is a programmer / coder since they have an AI software engineer on hand, I'm hoping that they would be comfortable with long-term maintenance.
As entropy marches on, with more AI-generated lines of code in the codebase, and as software, APIs, and tooling introduce breaking changes, will this new class of "vibe coder" / "creator coder" have the means and time to maintain their massive codebases?
I think AI is good for MVPs, but if we're talking 10-30M lines of code then it might not be the best tool for the job.
bob1029|1 year ago
The hard part is the customer, not the technology. Unless you are working on something very unusual, it should be straightforward to implement anything given perfect requirements.
Much (most?) of my time as a software engineer has been spent poking absurd holes in customer stories such that they are compelled to provide the actual requirements. This edge case probing is what LLMs are infamously bad at. They are too eager to please. There's not an inner asshole with an aggressive aesthetic preference that was built up over months of interchange with the client.
The constant here is "agency". LLMs inherently lack it. So, it has to come from somewhere. How many layers of abstraction do we need to put in between the will of the customer and the product they paid for?
I think a viable solution could be to use the LLM as a direct bridge between your product and the customer. Tool calling with these new reasoning models is a hell of a drug. It's not that difficult to just write this code. 99% of it is string interpolation. You don't need copilot for this.
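A minimal sketch of what that LLM-to-customer bridge could look like. All names here (`create_ticket`, `check_order_status`, the JSON tool-call shape) are hypothetical, and a hard-coded string stands in for the real model reply, but it illustrates the point that the scaffolding is mostly string interpolation plus a dispatch table:

```python
import json

# Hypothetical tools the customer-facing LLM may call.
# These names and signatures are illustrative, not a real product API.
def create_ticket(title: str, priority: str) -> str:
    return f"ticket created: {title} ({priority})"

def check_order_status(order_id: str) -> str:
    return f"order {order_id}: shipped"

TOOLS = {"create_ticket": create_ticket, "check_order_status": check_order_status}

def build_prompt(customer_message: str) -> str:
    # The "99% string interpolation" part: describe the tools, then the request.
    tool_list = ", ".join(TOOLS)
    return f"You may call one of: {tool_list}.\nCustomer says: {customer_message}"

def dispatch(tool_call_json: str) -> str:
    # The model replies with a JSON tool call; route it to the real function.
    call = json.loads(tool_call_json)
    return TOOLS[call["name"]](**call["args"])

print(build_prompt("where is my order?"))
# Simulated model output for that request:
print(dispatch('{"name": "check_order_status", "args": {"order_id": "A-17"}}'))
# → order A-17: shipped
```

In a real system the simulated reply would come from a reasoning model's tool-calling API, with a loop feeding tool results back into the conversation.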
falcor84|1 year ago
> The constant here is "agency". LLMs inherently lack it. So, it has to come from somewhere.
I don't understand your use of "inherently" here. Even if you define LLMs as not having agency, I don't see any inherent limitation against tacking agency on top of them. As you alluded to, even just a basic loop of `if (!goalAchieved()) {promptWithToolCalling()}` is arguably agency, no?
You actually suggested connecting the LLM directly between the product and the customer, such that the customer specifies the goal. What's stopping tech from going in this direction?
alsoforgotmypwd|1 year ago
This is all about suppressing wages, laying off American engineers, and rationalizing many tens of billions wasted on building AI infrastructure no one needed and no one will use.
ddmma|1 year ago
This guy should focus more on fixing the AI-generated plague that is currently sweeping his social media network, but instead he seems "not to care too much" as long as it keeps users busy.
niemandhier|1 year ago
The solution to every problem in programming is another layer of abstraction.
For me programming was always about expressing my intent.
I don’t think about the instructions the compiler generates. I also rarely think about the expanded form of a template expression.
If AI just acts as an intermediary between me and the compiler, adding yet another abstraction between me and the generated instructions, why should I care?
I will still have to somehow explain to the machine what it is that I want.
rrgok|1 year ago
What should I focus on from now on? If I want to change career paths, what will pay me as well as software engineering does, given that I’m 34 years old? Let’s say I can take a break of 4 years to get another degree — what would be the wisest choice?
I’m at a loss, honestly. If not 2025, it will be 2030 or 2040. I fucking love software engineering.
androiddrew|1 year ago
Brother, I am 39 and I am there with you. Increasingly the interview process in software has just become standardized LeetCode gatekeeping. Did I personally build a tool that processed 120M in sales annually? Sure. Can you balance a BST? No? Then fuck off.
Personally, I see robotics as something worth moving towards. It’s the intersection of software, mechanics, electronics, and math.
Maybe it’s just time to move into management…
jdlshore|1 year ago
Just keep getting better. The risk is that there are going to be many more software engineers using AI tools, which is going to lower wages, more so than that AI will make all software engineers obsolete.
fullshark|1 year ago
This nonsense is about recalibrating the SWE labor market and garnering hype for tech. The primary product the technology industry creates is company equities, and its primary customers are anxious CEOs and hedge/pension funds.
pipeline_peak|1 year ago
It feels like the only thing AI doesn’t have on us (yet) is the ability to drill into legacy code bases. Of course those code bases were written by humans during a time when coding was more expensive because we didn’t have AI to do it for us.
Because of that, I wonder if legacy code bases will be less common in the future.
The only prediction I’m confident in is that it’s a bleak future for devs whose skillset consists of languages rather than interests. I’m one of those devs.
StefanBatory|1 year ago
Seeing the progress in LLMs... I do believe it. One software engineer will do in the future what would take an entire team in the past.
Now what to do? I have just finished my undergrad in software engineering and got admitted to Masters, but I feel that's a mistake. At the same time, I never knew what else to do in my life but programming.
WheelsAtLarge|1 year ago
"However, this transition presents a paradox: who will oversee and correct AI-generated code? Even the most advanced AI models are prone to errors, necessitating human oversight to ensure reliability and security."
I see a new role for programmers. The ex-coders will oversee quality control and step in as needed in the future.
Programmers will probably have a few more years (less than 10), but long term their role will radically change.
pb060|1 year ago
Stupid question: how do you become a high-level programmer if entry- and mid-level roles disappear?
readyplayernull|1 year ago
Someone proposed this nice project that AI will fail at:
https://news.ycombinator.com/item?id=43120035
tw1231728|1 year ago
But how would Zuckerberg know? He has never written anything special.
pipeline_peak|1 year ago
That’s the whole point: why even bother if AI does it faster?
Do you buy hand-crafted furniture? Probably not, because even if it’s better, it’s way more expensive.