SupremumLimit|8 months ago
What you get when it becomes easier to generate code/applications is a whole lot more code and a whole lot more noise to deal with. Sure, some of it is going to be well crafted – but a lot of it will not be.
It’s like the mobile app stores. Once these new platforms became available, everyone had a go at building an app. A small portion of them are great examples of craftsmanship – but there is an ocean of badly designed, badly implemented, trivial, and copycat apps out there as well. And once you have this kind of abundance, it creates a whole new class of problems for users, and potentially for developers too.
The other thing is, it really doesn’t align with the priorities of most companies. I’m extremely skeptical that any of them will suddenly go: “Right, enough of cutting corners and tech debt, we can really sort that out with AI.”
No, instead they will simply direct extra capacity towards new features, new products, and trying to get more market share. Complexity will spiral, all the cut corners and tech debt will still be there, and the end result will be that things will be even further down the hole.
scelerat|8 months ago
Same could be said of traditional desktop software development and the advent of web apps I suppose.
I guess I'm not that worried, other than being worried about personally finding myself in a technological or cultural eddy.
dgb23|8 months ago
I think the more pressing issues are costs: opportunity cost, sunk cost, signal to noise ratio.
namaria|8 months ago
Why on earth would people expect that attaching GPU farms to render characters into their codebase will not only not increase its entropy, but actually lower it?
lcnPylGDnU4H9OF|8 months ago
The article is making a normative argument. It is not saying what people "will" do but instead what they "should" do.