top | item 34737084

Why is there so much useless and unreliable software?

83 points| learningstud | 3 years ago | reply

Linear logic has been known since 1987. The first release of Coq (dependent types for functional programming and writing proofs) was in 1989. The HoTT Book came out in 2013. Ada/SPARK 2014 came out the same year as Java 8 did. We also witnessed the Software Foundations series, the CompCert C compiler, the seL4 microkernel, and the SPARKNaCl cryptographic library.

Instead of learning about those achievements and aiming for the same reliability, clarity, and sophistication, we see an abundance of software that cannot clearly describe its own behavior or misbehavior.

Instead of incorporating the full functionality of XML/HTML/CSS/SVG/JS/WebGL into the development experience and providing ways to control them at the fundamental level, we reinvent crude approximations like the various web frameworks.

YAML and JSON often trump XML/XSD until things get out of control, and even then people still don't learn the lesson. Protobuf, FlatBuffers, Cap'n Proto, and the like keep reinventing ASN.1.

Naive microservices partially reimplement Erlang's BEAM VM while ignoring all the hard parts that BEAM got right. Many people riding the microservice bandwagon have never even heard of Paxos, let alone TLA+.

Many programmers keep learning shiny new frameworks but are reluctant to learn the crucial fundamentals, e.g., Introduction to Parallel Algorithms and Architectures, or how to think clearly and unambiguously in the spirit of Coq/Agda/Lean.

No wonder ChatGPT exposes how shallow most programming is and how lacking most programmers are in actual understanding. Linear logic and dependent types are there to help us design and think with clarity at a high level, but people would rather fumble around with OOP class hierarchies (participating in pointless is-a/has-a arguments) and "architecture" design that only complicates things.

What is this madness? This doesn't sound like engineering.

144 comments

order
[+] pjc50|3 years ago|reply
> Why is there so much useless and unreliable software?

https://twitter.com/maxkreminski/status/887815522061926400

"a reminder: if inexperienced creators are using your tool to churn out loads of half-baked garbage, your tool is a phenomenal success"

Software is such a powerful concept - it basically imbues physical objects with magic - that even bad software is hugely empowering to its users and takes off very quickly. The demand is staggering.

I appreciate that when you see yourself as the most intelligent person in the world it becomes intolerable to be surrounded by unthinking muggles scratching in the dirt, but after a while you realise that life is more complicated than that, people usually have good reasons for doing the things they do, and that perfection is neither attainable nor necessary for most of that.

[+] zelphirkalt|3 years ago|reply
On one hand I want to agree with the quote you posted, but on the other hand, implying that the OP sees themselves as "the most intelligent person in the world" casts the question in the wrong light and seems disingenuous. The OP's question is valid, and one should be allowed to ask it.

Most tech choices are made on the basis of incomplete information and incomplete planning or even no planning ahead. Many are merely following hype, instead of truly looking at the options and making a wise choice. Hype creates awareness of products or software, which ultimately reaches the uninformed masses. People not in the business of making software themselves have usually vastly less information to base their choice on and often make questionable choices. This in turn generates more incentive to continue making software like the one they chose. This is what amounts to the quote you posted.

[+] learningstud|3 years ago|reply
It's never about perfection, but about lifting the acceptable lower bound. Around 150 years ago, Joseph Lister had to go to great lengths to persuade surgeons to apply antiseptics, which seems like a no-brainer nowadays. The methods I mentioned have been well established for decades.
[+] Gigachad|3 years ago|reply
You could say the same thing about most things. A "bad" chair from ikea empowers people to not sit on the floor. A "bad" meal from the local cafe is still nutritious and tasty even if it's bad compared to something 5x more expensive.
[+] s1k3s|3 years ago|reply
As a software agency owner I can tell you that barely anyone wants that. We maybe get 1-2 requests for quality software every year. Most of our clients want their apps built as fast as possible, with little to no consideration for technical quality. We have the choice of doing it that way or going out of business.

This also affects our people (usually our top engineers) - which is why I want to start developing our own products this year.

[+] shinycode|3 years ago|reply
I can relate. So often the people in charge don't care at all about software quality. They want things as fast as possible in order to make money from them. Software is just a means to an end, so rarely an end in itself. Try to build a car, or anything like it, half as fast and quality control will suffer; with software it's way easier to ship fast.
[+] patrickk|3 years ago|reply
Ding, ding, ding!

Covering your ass and ticking off items as launched are other pressures that lead to speed over quality every time.

The OP's post reads as very naive, without experience of real-world corporate politics. No one really gives a shit about most things on that list (unfortunately).

[+] anonzzzies|3 years ago|reply
Even with your own products it is often an issue: after years of working on something complex, you want or need to launch it, but it might still be quite bad compared to your vision of it; you simply run out of time, steam, money, etc., and launch. Tests succeed with OK coverage, and you even have some TLA+- and Coq-covered parts, but because you have to be somewhat practical, you stopped proving things (or even just dropped the more demanding practices) years ago.
[+] xupybd|3 years ago|reply
Yeah, and then when the half-baked thing you've rushed out falls over, it's your fault, despite the customer asking for exactly that level of quality. When bugs appear, it's because the engineer is an idiot, not because they wanted to write tests but no one would pay for them. Agency work is tough. Being an in-house dev is much better, IMHO.
[+] StreamBright|3 years ago|reply
Same here. Some projects I work on:

- please put this broken crap of 15 services written in 6 different programming languages on a 50 node cluster so we can read and write data with 300 kbit/s.

2 years later:

- could you do something about this cluster, it costs us too much.

[+] smilespray|3 years ago|reply
I think it boils down to the fact that many people are eager to start projects but can't be bothered to finish them.
[+] davidktr|3 years ago|reply
Is it just me or has HN's meme-fication exploded recently? Only seen stuff like this on Reddit before. I chuckled though.
[+] flemhans|3 years ago|reply
I think you are
[+] fer|3 years ago|reply
Engineering is about tradeoffs.

When you're shooting a multi-hundred million satellite into orbit it's worth the extra few million expense of formal verification because otherwise you might lose a gigantic investment and even kill people in the process.

When you're working on something without such extreme constraints, with vague SLAs, and limited to no business risk, then regular unit/integration/etc tests are good enough and exceedingly cheaper.

[+] pjc50|3 years ago|reply
> When you're shooting a multi-hundred million satellite into orbit it's worth the extra few million expense of formal verification

And even then the cost effects of that are so prohibitive that SpaceX transformed the industry with their "we cannot guarantee the hoverslam landing will work first time" approach.

Producing something imperfect quickly and then iterating beats upfront planning by such a large margin so often that it's not funny.

[+] LAC-Tech|3 years ago|reply
> Engineering is about tradeoffs.

People can give themselves all the fancy titles they want; I don't think 99% of software development is anything approaching Engineering.

(This isn't so much an attack on my fellow programmers as a recognition that this field is very young and still very immature.)

[+] stared|3 years ago|reply
For context: I come from a mathematical background. I didn't particularly appreciate that most programming languages use the word "function" for something that is not a mathematical function (e.g., has side effects). Well, then I got used to it; sometimes it is practical.

> This doesn't sound like engineering.

It is precisely engineering. As opposed to pure science and art. (I consider mathematics to be, above all, art.)

Based on the post itself, you come with a theoretical mindset. You may consider purity to be more important than practical applications. Yet, if people write software to be used, they focus on the latter. Sometimes it results in hacky code even within an already hacky language. There are no extra points for purity.

Purity itself is a double-edged sword. Sometimes it makes the code more reliable. Other times - it generates a lot of abstract nonsense, which makes it harder to reason about the piece of software or change it.

On the positive side, look at the Rust language (and community!). While it has lovely abstractions and safety guarantees, it is a practical language - performant for writing and execution.

[+] quickthrower2|3 years ago|reply
It is a function that maps global state to global state!!
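The quip can be made concrete. A minimal Python sketch (names are mine, not from any library) of modeling a side-effecting counter as a pure function from state to state:

```python
# Instead of mutating a global counter, thread the state through
# explicitly: the "function" takes the old state and returns both the
# result and the new state. This is the State-passing style that
# Haskell's State monad (or Coq's pure modeling of effects) formalizes.

def increment_counter(state):
    """Pure: no mutation; returns (result, new_state)."""
    new_state = {**state, "counter": state.get("counter", 0) + 1}
    return new_state["counter"], new_state

state = {"counter": 0}
value, state = increment_counter(state)
value, state = increment_counter(state)

assert value == 2
assert state == {"counter": 2}
```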
[+] rep_movsd|3 years ago|reply
Rust is pretty hard to write; it's not "practical" in any sense.
[+] davidktr|3 years ago|reply
Why? Because of the barely competent programmer, like myself. A DIY army.

Is XML better than JSON? For long-term stability, sure. For a quick config file? Nope. You see, to understand XML you kind of have to know how to work with trees. Bread and butter for compsci grads, a nuisance for all others.

Recently I needed to query a SOAP-based API. It took me three days, because I had no idea how namespaces in XML worked, because I wasn't sure which Python lib to use (lxml was the solution), and because their API had some quirks. I was reading forum posts from the late '00s; the documentation is really scarce.
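For anyone hitting the same wall, here is the core of the namespace issue with the stdlib's `xml.etree.ElementTree` (the document and namespace URIs below are made up for illustration): every tag in a namespaced document is internally qualified by its URI, which is why naive, prefix-free queries come back empty.

```python
import xml.etree.ElementTree as ET

# A hypothetical SOAP-ish response. The xmlns declarations are what make
# namespace-unaware queries like root.find("Body") return None.
doc = """<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
                     xmlns:api="http://example.com/api">
  <s:Body><api:Result>42</api:Result></s:Body>
</s:Envelope>"""

root = ET.fromstring(doc)

# A prefix-to-URI mapping lets us qualify every step of the path:
ns = {"s": "http://schemas.xmlsoap.org/soap/envelope/",
      "api": "http://example.com/api"}

assert root.find("Body") is None              # unqualified: no match
result = root.find("s:Body/api:Result", ns)   # qualified: found
assert result is not None and result.text == "42"
```

lxml's `find`/`xpath` accept the same kind of `namespaces` mapping, so the idea carries over.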

I would love to learn the fundamentals, I don't like half-assing things. But a) I'm not going back to uni for it, b) I'm not bright enough to be excellent in two fields. Learning the fundamentals takes a very, very long time. Where to start even? A complete math program? Programming fundamentals as such? Theoretical compsci?

At the end of the day, things need to work. If they fall apart next year I can tell my boss "ooohhh big problem give money". Using XML instead of JSON has no payoff today.

[+] Sankozi|3 years ago|reply
> Is XML better than JSON? For longterm stability sure. A quick config file? Nope.

I have the opposite opinion. JSON is a great solution for a protocol or serialization format, better than XML (JSON is faster, smaller, safer, easier to understand). But it is one of the worst possible choices for a config file format; even XML is better in that case. Lack of comments is a complete showstopper in any source-file syntax.
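A quick illustration of the showstopper: a strict JSON parser (Python's stdlib `json` here) rejects a config file the moment you try to annotate it.

```python
import json

# A config with a comment -- perfectly readable, but not valid JSON.
config = """{
    // maximum number of worker threads
    "workers": 4
}"""

try:
    json.loads(config)
    parsed = True
except json.JSONDecodeError:
    parsed = False

assert parsed is False  # strict JSON parsers reject comments outright
```

This is why people bolt on workarounds like `"_comment"` keys, or reach for JSON5/YAML/TOML for configs.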

[+] StreamBright|3 years ago|reply
This is exactly what the OP is talking about. You have XML instead of ASN.1. Imagine if we had only one way of doing data transfer between computer systems.
[+] jaredsohn|3 years ago|reply
> Is XML better than JSON? For longterm stability sure.

Don't get that.

>to understand XML you kind of have to know how to work with trees.

JSON is trees, too.
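Both claims are easy to check side by side; this sketch (toy data, stdlib parsers only) walks the same nested structure in each format.

```python
import json
import xml.etree.ElementTree as ET

# The same shape in both formats: a node "a" with two "b" children.
j = json.loads('{"a": {"b": [1, 2]}}')        # nodes are dicts and lists
x = ET.fromstring("<a><b>1</b><b>2</b></a>")  # nodes are Elements

# Walking either one is tree traversal; only the node types differ.
assert j["a"]["b"] == [1, 2]
assert [e.text for e in x.findall("b")] == ["1", "2"]
```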

[+] solus_factor|3 years ago|reply
Nice rant, but the answer is simple: incentives.

Seems like there's not that much demand for what you would like to see.

Crappy, barely working software now is better than a perfect product delayed by a year.

[+] seren|3 years ago|reply
I agree that the commercial incentives are not there; you must always ship earlier than the competition rather than wait a few months or years to have a more stable product.

However, it can probably hurt brand reputation in the long run if your quality level is below customer expectations.

[+] rTX5CMRXIfFG|3 years ago|reply
I’m not sure it’s as simple as that. I mean, isn’t capitalism supposed to drive innovation?

It seems to me that OP places a high level of usefulness or value of software at lower levels of abstraction, and I don’t disagree—it’s akin to how any manufacturing business is tremendously more valuable than the variety of consumer-facing products that can be made from it—but the cost of entry into such a business domain, and actually succeeding to make a profit, is often high.

[+] TrackerFF|3 years ago|reply
In construction, or most other fields of "traditional" engineering, projects will take years from start to finish. That's completely normal.

In the world of software, especially for startups, it is completely unacceptable to spend 5 years building an MVP. There are some protected sectors where things move slower (defense and aerospace, for example), but if you're in the consumer field, you just can't spend too much time. You want to push out an MVP as soon as possible and build on that.

And most software companies do not get penalized for shipping buggy software. Big game studios ship broken games and spend a couple of years patching them up to their final form. People bitch and moan, but still throw money at 'em.

If you don't want bloated, broken, slow, and unreliable software - use your pocketbook.

[+] mikewarot|3 years ago|reply
The madness is insisting that applications, instead of operating systems, enforce security. In the desktop era, the OS was designed to blindly trust applications to do the right thing, and everyone went along with it.

That was the seed of the madness.

Instead of supplying dialog boxes (open, save, etc.) for applications to use before opening files directly, the OS could have supplied handles (capabilities) from those dialog boxes for the applications to use. This would have left the user interface almost identical and required only a few lines of code to change in applications, in exchange for an environment almost immune to confused or rogue programs.
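A minimal sketch of the idea, with invented names (this mirrors no real OS API): the trusted dialog returns an already-scoped handle, so the application never needs, or gets, ambient file-system authority.

```python
import io

class ReadCapability:
    """An unforgeable token granting read access to exactly one file."""
    def __init__(self, handle):
        self._handle = handle

    def read(self):
        return self._handle.read()

def os_open_dialog(user_picked_contents):
    # Stand-in for the trusted OS file-open dialog: the *user's* choice,
    # not the application's request, determines which handle is granted.
    return ReadCapability(io.StringIO(user_picked_contents))

# The application receives a capability, never a path. With no path and
# no open() rights of its own, a rogue app can only touch what the user
# explicitly handed it.
cap = os_open_dialog("report contents")
assert cap.read() == "report contents"
```

macOS security-scoped access and Flatpak portals follow roughly this pattern today.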

Because expectations were so corrupted in the desktop days, the situation now is effectively hopeless.

Applications should NEVER be trusted, especially not by the operating system.

[+] pjc50|3 years ago|reply
It's interesting to see how very different all this is from the original DARPA papers looking at security, because those all presumed that applications would be maintained by the system administrator and it was users who should be treated with suspicion.

> Applications should NEVER be trusted, especially not by the operating system.

This was basically impossible to do in the early personal computer era, before memory protection was a thing. So MacOS, Windows pre-NT, AmigaOS etc were all built around the assumption of applications reading each other's memory.

[+] Someone|3 years ago|reply
> Instead of supplying dialog boxes(open, save, etc) for applications to use and then open files directly, the OS could have supplied handles (capabilities) from those dialog boxes for the applications to use. […] the situation now is effectively hopeless.

Not hopeless. MacOS, for example, added entitlements that get you there. Applications cannot list random directories or open files at will.

If an application opens a file browser, the operating system shows you your file system. When you pick a file, the operating system gives the application the right to read that file.

See https://developer.apple.com/documentation/bundleresources/en...

[+] michaelteter|3 years ago|reply
Many possible reasons, including:

- no engineering standards in computer science, especially in computer science education

- some corporate finance views which see technology as a cost center rather than a business enabler

- some corporate strategies where marketing decides what is possible and when it must be ready (tomorrow. or yesterday.)

- computing and software development as a "fun", creative endeavor - as opposed to a rigorous, formal process

I'm sure there are dozens more reasons.

[+] mathverse|3 years ago|reply
It's simple: hardly anyone wants to pay a premium price for quality software. Software is not a physical good; its real value comes from the things it can do, and it does not matter how it does them.
[+] valenterry|3 years ago|reply
Pareto principle: the last 20% takes most of the work. People are not willing to pay for it (or at least that's what sales/marketing believe), so the effort is not spent. It's as simple as that.
[+] ChrisMarshallNY|3 years ago|reply
I feel your pain, but you really aren’t gonna be able to change it.

As long as people are willing to pay for dross, people will continue to produce it. Just look at Hollywood, for a classic example.

For myself, I found a company that was interested in producing Quality, and spent most of my career there. It had its trade-offs.

It certainly didn’t get me much respect from today’s software development community. If anything, it gets me scorn and outright disrespect.

I had to retire and work on my own (for free), in order to finally get to write software the way that I want. I doubt it would be considered commercially viable.

In any case, "doing it right" is in the eye of the beholder. The fact that I'm not writing all my native Apple software in Haskell is "doing it wrong" to some folks. Don't even get me started on OOP or using UIKit/AppKit.

Can't please everyone.

I'm best off putting my values into practice, in my own context, and not being an old man yelling at the sky.

But I really want to. It just won’t do any good.

[+] silvestrov|3 years ago|reply
JSON is used because it is much simpler to generate and parse in browsers than XML.

It is also "good enough" for most use cases, so it is reasonable to start out with JSON and only upgrade to XML for complex documents/structures.

[+] Zanfa|3 years ago|reply
JSON is also on average much more human-readable than the average XML document. Sure, you can do awful things with JSON and generate beautiful XML, but for some reason all of the XML I've encountered in production systems is extremely convoluted.
[+] khalidx|3 years ago|reply
A different take on the problem: so much software is unreliable from a data perspective. We trust these software providers with GBs of our data, yet they all have different backup strategies, use different consistency and ACID guarantees, and are not at all experts at managing and backing up data.

If we could abstract away the storage so that it is user owned, or at least a "utility" like electrical power lines where everyone is playing by the same rules, we would be able to trust these platforms significantly more, and rest assured our data is safe even if the software isn't.

[+] Joel_Mckay|3 years ago|reply
"Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure."

( Melvin E. Conway, https://en.wikipedia.org/wiki/Conway%27s_law )

Thus, we may conclude the world has some people that choose to be "useless and unreliable". The software part is just an artifact of this ideology. =)

[+] pjc50|3 years ago|reply
> will produce a design whose structure is a copy of the organization's communication structure

This is the original Amazon memo and the rationale for microservices.

The bigger picture is that if you look at the totality of the software and hardware system, from app to browser to OS to hardware, you see that it replicates the company and market structure that produces it. Which is partly why the web is so chaotic. The web is a standards battleground over which companies fight.

[+] ss48|3 years ago|reply
Engineering is not always implementing the best solution, but implementing a solution that completely or partially solves the problem at hand. In engineering, the fact that a solution is suboptimal is immaterial compared to the outcome. It may have ramifications for future problems, but those are separate engineering problems that can be considered separately and decided upon accordingly.

Transferring ideas and software discovery are difficult problems. Expecting everyone to know that tech x is superior to tech y and singularly pursue development of tech x is unreasonable.

Most people prefer to work with a solution they are familiar with or one that caters to their particular technology stack. JSON is a first-class citizen in JavaScript, but can be secondary in other programming languages. Whatever costs the least effort in resolving, or minimizing the extent of, the problem at hand is all that is needed. There is almost never a case where point B must follow or supersede point A. Point A continues to solve the problems it was designed for, or turned out to be useful for by chance: the technology can continue to be used where it is known and regarded.

[+] GianFabien|3 years ago|reply
You don't have to use software, frameworks, tools that don't meet your standards.

I find that the learning curve to get on top of most abstractions is greater than what is required to understand the foundations. Knowing and applying foundational components tends to yield more performant and less bloated solutions. Debugging is easier because you don't need to unravel layers of abstractions which don't quite align with the domain you are working with.

[+] ricardobayes|3 years ago|reply
Most companies don't do real cost/income analysis on any work. Most internal work is just approved by a department leader or tech lead, so most companies tend to overhire and do menial busywork, like constantly refactoring robust, well-functioning code to catch up with the latest and greatest frameworks. The latest offender is Vite, offering minimal gains (faster dev builds) at a non-negligible dev cost (and as a source of new bugs).
[+] porcoda|3 years ago|reply
There isn’t any external pressure to give people an incentive to do better. Until that arrives, we’re stuck in this crap software local minimum.

I’m hoping such pressure arrives in the form of legal or regulatory stuff, even if it chills the industry and slows/shrinks it. Until then we’re still in the computing equivalent of the auto industry when people died in mild fender benders that today are just mild annoyances.