chriswait | 1 year ago
I guess we'd probably agree that whether "writing code is an irrelevant skill" all comes down to whether LLMs will improve enough to match humans at programming, and thus comprehensively remove the need to fix their work.
They currently don't, so at the time you claimed this it was incorrect. Maybe they will in the future, at which point it would be correct.
So, would it be responsible for me to bet my career on your advice today? Obviously not, which is why most people here disagree with your article.
You were prepared in advance to explain that criticism as people having a strong negative emotional reaction, so I'm not sure why you posted it here in the first place instead of LinkedIn where it might reach a more supportive audience.
jtlicardo | 1 year ago
What I pointed out in my post is a trend I've noticed: LLMs can do more and more of a developer's work. Nowhere did I claim LLMs can replace human developers today, but when a technology consistently reduces the need for manual programming while improving its capabilities, the trajectory is clear. You can disagree with the timeline, but the transformation is already underway.
I posted on HN precisely because I wanted rigorous technical discussion, not validation.
chriswait | 1 year ago
I do believe it's a binary thing: One day a model gets released which is sufficiently good at programming that I don't need to be able to debug or write code any more. That's the exact day my skills aren't relevant.
They aren't merely 50% relevant six months before that date, because during those six months I still have to fully maintain my code myself, so in practice that 50% is 100%.
Seeing it as a spectrum carries a specific risk: you let your skills atrophy before that point is actually reached, and end up relying on code you can't properly understand or debug.
I think if you wanted rigorous technical discussion, this is the sort of specificity your article would've needed.