top | item 45528848

ealexhudson|4 months ago

I don't want to sound too dismissive, but all these arguments have been brought up time and again. The move from assembler to high level languages. The introduction of OOP. Component architecture / COM / CORBA / etc. The development of the web browser. The introduction of Java.

2018 isn't "the start of the decline", it's just another data point on a line that leads from, y'know, Elite 8-bit on a single tape in a few Kb through to MS Flight Simulator 2020 on a suite of several DVDs. If you plot the line it's probably still curving up and I'm not clear at which point (if ever) it would start bending the other way.

dkarl|4 months ago

We have always had, and always will have, the quality of software that people are willing to pay for.

leshow|4 months ago

That would be the case under market conditions where buyers are making rational decisions with perfect knowledge based on all available choices. Does that sound like the system we have? To me, reality seems more like a small set of oligopolies or effective monopolies, byzantine ownership structures and a pursuit of short term profits pushing future costs elsewhere as externalities.

jrm4|4 months ago

No. Not "willing," that implies that the options meaningfully exist. They don't.

"Willing AND ABLE" works here though.

fzeindl|4 months ago

> If you plot the line it's probably still curving up and I'm not clear at which point (if ever) it would start bending the other way.

I suspect when Moore's law ends and we cannot build substantially faster machines anymore.

rudedogg|4 months ago

One interesting thing that most non-systems programmers don't know is that memory and CPU performance have improved at completely different rates. That's a large part of why we have x times faster CPUs but software is still slow.

The systems people worry more about memory usage for this reason, and prefer manual memory management.
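The effect described above can be sketched in a few lines. This is a toy illustration only: it sums the same elements in sequential versus shuffled order, so the arithmetic work is identical and only the memory access pattern differs. Exact ratios depend heavily on the hardware, and CPython's interpreter overhead mutes the gap compared to what you'd see in C.

```python
import random
import time

N = 2_000_000
data = list(range(N))

sequential = list(range(N))
shuffled = list(range(N))
random.shuffle(shuffled)

def sum_by_index(values, order):
    # Same work either way; only the order we touch memory in differs.
    total = 0
    for i in order:
        total += values[i]
    return total

t0 = time.perf_counter()
s1 = sum_by_index(data, sequential)
t1 = time.perf_counter()
s2 = sum_by_index(data, shuffled)
t2 = time.perf_counter()

assert s1 == s2  # identical result, different access pattern
print(f"sequential: {t1 - t0:.3f}s, shuffled: {t2 - t1:.3f}s")
```

In a language closer to the metal the gap is often several-fold, because the shuffled order defeats the prefetcher and cache locality; that gap is exactly the memory wall the comment above is pointing at.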

potatolicious|4 months ago

Moore's Law has been dead for a long time. The doubling rate of transistors is now drastically below Moore's prediction.

We're adding transistors at ~18%/year. That's waaaaay below the ~41% needed to sustain Moore's law.

Even the "soft" version of Moore's law (a description of silicon performance vs. literally counting transistors) hasn't held up. We are absolutely not doubling performance every 24 months at this point.
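The arithmetic behind those percentages is worth spelling out. A back-of-the-envelope check, taking the ~18%/year figure above at face value:

```python
import math

# Doubling every 24 months requires an annual growth factor g with g**2 = 2.
required = 2 ** 0.5              # ~1.414, i.e. ~41% growth per year
observed = 1.18                  # the ~18%/year figure quoted above

# At 18%/year, how long does a doubling actually take?
years_to_double = math.log(2) / math.log(observed)

print(f"required: {required - 1:.0%}/yr")
print(f"doubling time at 18%/yr: {years_to_double:.1f} years")
```

So at 18%/year, transistor counts double roughly every 4.2 years rather than every 2, which is why the "soft" version of the law fails as well.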

ealexhudson|4 months ago

Moore's law has kind of ended already, and maybe has been over for a few years. Even if you can make a faster chip, there's a basic thermodynamics problem in running it at full tilt for any meaningful period of time. I would have expected that to have impacted software development, and I don't think it particularly has; there's also no obvious gain in e.g. compilers or other optimizations which would have countered the effect.

mikepurvis|4 months ago

But the machines aren't really "faster" in clock speed. For a long time now the gains have come from better and more local caching, plus parallelism at both the core and instruction level.

gjsman-1000|4 months ago

I think another part of this is that tech is perhaps the only industry that hasn't quite gotten over itself yet.

Writing code is artistic the same way plumbing is artistic.

Writing code is artistic the same way home wiring is artistic.

Writing code is artistic the same way HVAC is artistic.

Which is to say, yes, there is satisfaction to be had, but companies don't care as long as it gets the job done without too many long-term problems, and never will care beyond that. What we call tech debt, an electrician calls aluminum wiring. What we call tech debt, a plumber calls lead solder joints. And I strongly suspect that one day, when the dust settles on how to do things correctly (just like it did for electricity, plumbing, flying, haircutting, and every other trade eventually), we will become a licensed field. Every industry has had that wild experimentation phase in the beginning, and has had that phase end.

veqz|4 months ago

Perhaps. But put another way:

Writing code is artistic the same way writing text is.

Whether that is a function call, an ad, a screen script, a newspaper article, or a chapter in a paperback, the writer has to know what they want to communicate, who the audience/users will be, the flow of the text, and how understandable it will be.

Most professionally engaged writers get paid for their output, but many more simply write because they want to, and it gives them pleasure. While I'm sure the jobs can be both monetarily and intellectually rewarding, I have yet to see people who do plumbing or electrical work for fun?

eikenberry|4 months ago

> Writing code is artistic the same way home wiring is artistic.

Instead of home wiring, consider network wiring. We've all seen the examples of datacenter network wiring, with 'the good' being neat, labeled and easy to work with and 'the bad' being total chaos of wires, tangled, no labels, impossible to work with.

I.e. the people using the datacenter don't care as long as the packets flow. But the others working on the network cabling care about it A LOT. The artistry of it is for the other engineers, only indirectly for the customers.

bigfishrunning|4 months ago

> companies don't care as long as it gets the job done without too many long-term problems

Companies don't care as long as it gets the job done without too many VERY SHORT TERM problems. Long term problems are for next quarter, no reason to worry about them.

cmrdporcupine|4 months ago

I don't see working for most of my employers as "artistic."

I do see it as more of a craft than a typical trade. There are just too many ways to do things to compare it to, e.g., an electrician's work. Our industry does not have (for better or for worse) a "code" like the building trades, or even any mandated way of doing things, and any attempts to impose one (cough cough Ada, etc.) have been met with outright defiance and contempt, in fact.

When I'm working on my own projects -- it's a mix of both. It's a more creative endeavour.

imiric|4 months ago

If you haven't noticed a dramatic decline in average software quality, you're either not paying attention or willfully ignoring it. The article is right.

This is partly related to the explosion of new developers entering the industry, coupled with the classic "move fast and break things" mentality, and further exacerbated by the current "AI" wave. Junior developers don't have a clear path to becoming senior developers anymore. Most of them will rely too heavily on "AI" tools due to market pressure to deliver, stunting their growth. They will never learn how to troubleshoot, fix, and avoid introducing issues in the first place. They will never gain insight, instincts, understanding, and experience beyond what is acquired by running "AI" tools in a loop. Of course, some will use these tools to actually learn and become better developers, but I reckon that most won't.

So the downward trend in quality will only continue, until the public is so dissatisfied with the state of the industry that it causes another crash similar to the one in 1983. This might happen at the same time as the "AI" bubble pop, or they might be separate events.

0xWTF|4 months ago

Is this measurable? Like code readability scores on the GitHub corpus over time?
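One could at least sketch a crude proxy. The snippet below is a toy illustration only, using mean function length via Python's `ast` module as a stand-in metric; it is nothing like a validated readability score, and any serious study of a corpus would need far more robust measures.

```python
import ast

def avg_function_length(source: str) -> float:
    """Crude readability proxy: mean number of source lines per function."""
    tree = ast.parse(source)
    lengths = [
        node.end_lineno - node.lineno + 1
        for node in ast.walk(tree)
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))
    ]
    return sum(lengths) / len(lengths) if lengths else 0.0

sample = """
def short():
    return 1

def longer(x):
    y = x + 1
    z = y * 2
    return z
"""
print(avg_function_length(sample))  # short = 2 lines, longer = 4 lines -> 3.0
```

Run over snapshots of a repository's history, a metric like this would at least give a trend line, though whether it tracks "quality" is exactly the contested question.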

surgical_fire|4 months ago

Eh, after 20 years in the industry, I think that the overall quality of software is roughly the same. Matter of fact, my first job had by far the worst codebase I ever worked on. A masterclass in bad practices.

dlcarrier|4 months ago

I blame software updates. That's when software went from generally working on release to not working at all.

Agile management methods set up a non-existent release method called "waterfall" as a straw man, where software isn't released until it works, practically eliminating technical debt. I'm hoping someone fleshes it out into a real management method. I'm not convinced this wasn't the plan in the first place, considering that the author of Cunningham's law ("The best way to get the right answer on the Internet is not to ask a question; it's to post the wrong answer.") was a co-signer of the Agile Manifesto.

It'll take a lot of work at first, especially considering how industry-wide the technical debt is (see also: https://xkcd.com/2030/), but once done, having release-it-and-forget-it quality software would be a game changer.

marcosdumay|4 months ago

> a non-existent release method called "waterfall" as a straw man

The person who invented the name never saw it, but waterfall development is extremely common and remains the dominant way large companies outsource software development even today.

The only thing that has changed is that those companies now track the implementation of the waterfall requirements in scrum ceremonies. And yes, a few more places have actually adopted agile.

BrenBarn|4 months ago

> I blame software updates. That's when software went from generally working on release to not at all.

I agree. So much software these days treats users as testers and is essentially a giant test-in-production gaffe.

jrm4|4 months ago

Ha. I was tasked to teach (classic) Project Management without being super-familiar.

Then I had to get familiar with the new stuff: waterfall, agile, whatever.

They are all literally nothing but hacks that violate the basic tenets of actual project management (e.g., projects have a clear end).

morshu9001|4 months ago

I don't think software has gotten worse, quite the opposite, but Java and OOP were mistakes.

bowsamic|4 months ago

Every time someone says this I ask them “what is your solution for maintainable software architecture?” And they say “what is software architecture? I just write code”

quotemstr|4 months ago

One salient difference is that typically abstraction layers trade performance (usually less than polemicists like the article author think) for improvements in developer efficiency, safety, generality, and iteration speed.

Current tools seem to get us worse results on bug counts, safety, and by some measures even developer efficiency.

Maybe we'll end up incorporating these tools the same way we did during previous cycles of tool adoption, but it's a difference worth noting.