top | item 45876270

whism | 3 months ago

My guess is that this change has its roots in the move from physical media delivery of software to internet delivery.

My instinct is that there is some general principle that relates “friction” and “quality”, although I’m not sure I have the vocabulary to describe it.

I.e., where there is a barrier to entry, the quality of results tends to improve.

I also see this in the ease of publishing to social media, the bias toward “old music is better” (time has sorted the wheat from the chaff), and so on.

Perhaps there’s a well known description of this phenomenon somewhere already…

palmotea | 3 months ago

> My guess is that this change has its roots in the move from physical media delivery of software to internet delivery.

> My instinct is that there is some general principle that relates “friction” and “quality”, although I’m not sure I have the vocabulary to describe it.

I think the principle is: the greater the impact of a mistake, the more effort you'll put in (up front) to avoiding one. The more friction, the greater the impact.

When software was distributed on physical media and users had no internet, you basically had one (1) chance to get it right. Buggy software would be buggy effectively forever. So (for instance) video game companies did insane QA testing on every release. After QA, it'd get burned onto an expensive cartridge and it'd be done. People would pay the 2025 equivalent of $100+ for it, and they'd be unhappy (to say the least) if it didn't even work.

Once users had internet access and patching became possible, that slipped a little, then more. Eventually managers realized you could even get away with shipping an incomplete, non-working product. You'd just fix it later in a patch.

Now, with software being delivered over the internet, often as SaaS, everything is in a constant state of flux. Sure, they can fix bugs (if they choose to), but as they fix bugs they're constantly introducing new ones (and maybe even removing features you use).