gdubs|27 days ago
There's such a wide divergence of experience with these tools. Oftentimes people will say that anyone finding incredible value in them must not be very good. Or that they fall down when you get deep enough into a project.
I think the reality is that to really understand these tools, you need to open your mind to a different way of working than we've all become accustomed to. I say this as someone who's made a lot of software, for a long time now. (Quite successfully too!)
In some ways, while the ladder may be getting pulled up on junior developers, I think they're also poised to really utilize these tools in a way that those of us with older, more rigid ways of thinking about software development might miss.
bsoles|27 days ago
Using AI/LLMs, you will perhaps create more commercial value for yourself or your employer, but it will not make you a better learner, developer, creator, or person. Going back to the electronic calculator analogy that people like to invoke these days when discussing AI: I now think that, yes, electronic calculators actually made us worse at using our brains for complex things, which is the thing I value more than creating profits for some faceless corporation that happens to be my employer at the moment.
gdubs|27 days ago
Like Herbie Hancock once said, a computer is a tool, like an axe. It can be used for terrible things, or it can be used to build a house for your neighbor.
It's up to people how we choose to use these tools.
phicoh|27 days ago
When tools prove their worth, they get absorbed into the normal way software is produced. Older people start using them, because they see the benefit.
The key thing about software production is that it is a discussion among humans. The computer is there to help. During a review, nobody is going to look at what assembly a compiler produces (with some exceptions of course).
When new tools arrive, we have to be able to blindly trust them to be correct. They have to produce reproducible output. And when they do, the input to those tools can become part of the conversation among humans.
(I'm ignoring editors and IDEs here for the moment, because they don't have much effect on design; they just make coding a bit easier.)
In the past, some tools have been introduced, gotten hyped, and faded into obscurity again. Not all tools are successful; time will tell.
Three years ago the idea of measuring productivity in lines of code would have been ridiculous. After AI, it is the norm.
That said, I don't think this negates what TFA is trying to say. The difficulty with software has always been focusing on the details while still keeping the overall system in mind, and that's just a hard thing to do. AI may certainly make some steps go faster, but it doesn't change much about what makes software hard in the first place. For example, even before AI, I would get really frustrated with product managers a lot. Some rare gems were absolutely awesome and worth their weight in gold, but many of them just never were willing to go into the details and minutiae that are really necessary to get the product right. With software engineers, if you don't focus on the details the software often just flat out doesn't work, so it forces you to go to that level (and I find that non-detail-oriented programmers tend to leave the profession pretty quickly). But I've seen more than a few situations where product managers manage to skate by without getting to the depth necessary.
atmavatar|27 days ago
Unfortunately, since the tech industry still largely skews young, reticence to chase every new hype cycle also feeds into the perception of an inability to learn new things, even after many prove to be fads (e.g., blockchain).