
jdvh | 8 months ago

Rule of thumb: every 10% increase in complexity cuts your potential user base in half.

This is why people make backups by copy-pasting files. This is why Excel is so dominant. This is why systems like HyperCard and git are not mainstream and never will be.

There is a large universe of tools people would love if only they would bother to learn how they worked. If only. Most people will just stick to whatever tools they know.

For most people the ability to go back and forward in time (linear history) is something they grasp immediately. Being able to go back in time and make a copy also requires no explanation. But having a version tree, forking and merging, having to deal with multiple timelines and the graphs that represent them -- that's where you lose people.


wavemode | 8 months ago

I wouldn't frame it as "complexity", I would frame it as "cognitive load". You can lower cognitive load despite having high complexity. For example, you could (and many companies have done so) build a user-friendly version management system and UI on top of git, which on its surface is just "version 1", "version 2", "version 2 (final) (actually)" but under the hood is using commits and branches. You can have submenus expose advanced features to advanced users while the happy path remains easy to use.
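The idea can be sketched in a few lines (a hypothetical illustration in plain Python standing in for the git plumbing, not any existing product): the user-facing API is a flat list of numbered versions, while the hidden model is a commit graph with parent pointers, so branching and merging remain possible under the hood without ever appearing in the happy path.

```python
class Commit:
    """Hidden model: a node in a commit DAG, the way a VCS stores history."""
    def __init__(self, content, parent=None):
        self.content = content
        self.parent = parent  # branches/merges are possible here, just never exposed

class SimpleVersions:
    """User-facing model: a flat, linear list of numbered versions."""
    def __init__(self):
        self._head = None
        self._history = []  # commits in save order

    def save(self, content):
        # each save is a real commit chained to the previous one
        self._head = Commit(content, parent=self._head)
        self._history.append(self._head)
        return f"Version {len(self._history)}"

    def restore(self, label):
        # "Version 3" -> content of the third saved commit
        n = int(label.split()[1])
        return self._history[n - 1].content

doc = SimpleVersions()
doc.save("draft")
doc.save("draft, revised")
label = doc.save("final (actually)")
print(label)                     # Version 3
print(doc.restore("Version 1"))  # draft
```

Advanced submenus could then operate on the same `Commit` graph directly, exactly as the comment suggests: the complexity is there, but the cognitive load of the default path stays at "version 1, version 2".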

alphazard | 8 months ago

> Rule of thumb: every 10% increase in complexity cuts your potential user base in half.

I agree this is an accurate rule of thumb. However if the complexity lets users achieve more, then the complexity can earn its keep. Using version control is so beneficial that software engineers deal with the complexity. The ability to maintain a more complicated model in one's head and use it to produce more value is not something that all users are able to do. More sophisticated users can afford to use more complicated tools.

However, the sophisticated users are reined in by network effects. If you want to work with people then everyone needs to be able to deal with the complexity. Programmers are more sophisticated than most office workers, which is why we ubiquitously version codebases, and not so much spreadsheets.

> This is why systems like hypercard and git are not mainstream and never will be.

We are moving towards a world where fewer humans are needed, and the humans that are needed are the most sophisticated operators in their respective domains. This means weaker network effects and less drag from unsophisticated users holding back the tooling. The worst operators drop off, and the population average increases.

I would not be surprised to see an understanding of version control and other sophisticated concepts become commonplace among the humans that still do knowledge work in the next few years.

esafak | 8 months ago

Google has a version history for documents. Microsoft has it too now. I don't know who introduced it first.

jancsika | 8 months ago

> But having a version tree, forking and merging, having to deal multiple timelines and the graphs that represent them -- that's where you lose people.

There's no good reason why that should be the case. E.g., one could imagine the guts of the "copy-pasting files" UI being a VCS. That would keep the original 100% of the userbase plus allow whatever percentage to level up if/when the need arises (or go "back in time" in the event of a major screw-up).
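One way to picture that "copy-pasting files, but the guts are a VCS" idea (a hypothetical sketch with made-up names, not any real tool): the "make a copy" button writes into a content-addressed store, much like git stores blobs keyed by their hash, so the same data could later back a full VCS if the user levels up.

```python
import hashlib

class CopyPasteStore:
    """Looks like 'duplicate the file'; stores like a VCS object database."""
    def __init__(self):
        self._objects = {}    # sha1 digest -> content, as in git's blob store
        self._snapshots = []  # ordered history of saved copies

    def make_a_copy(self, content):
        # content-addressed: identical copies are stored only once
        digest = hashlib.sha1(content.encode()).hexdigest()
        self._objects[digest] = content
        self._snapshots.append(digest)

    def back_in_time(self, steps=1):
        # recover an earlier copy after a major screw-up
        return self._objects[self._snapshots[-1 - steps]]

store = CopyPasteStore()
store.make_a_copy("thesis")
store.make_a_copy("thesis, ruined by a bad find-and-replace")
print(store.back_in_time())  # thesis
```

The user's mental model never has to grow beyond "I made a copy", yet the underlying history is already the kind of structure forking and merging could be layered onto.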

It's just that software UX in 2025 is typically very bad. The real axiom: the longer you run an application, the more likely it will do the opposite of its intended purpose.

Oops, the word "stash" in git has an idiosyncratic meaning. That content has been removed from the history I was trying to keep. Fuck.

Oops, "Start" in Windows pauses interactivity and animation until ads are ready to be displayed in the upcoming dialog. Fuck!

Especially in the latter case, I don't think users are deterred by the cognitive load required to interact with the interface. It's probably more a case of them being deterred because the goddamned stupid thing isn't doing what it's supposed to.

jdvh | 8 months ago

In theory you can have these "zero-cost abstractions" but in practice I don't think so. The user manual gets thicker. Concepts like 'delete permanently' and backup/restore get more complicated. Users will get confronted by scary "advanced users only" warnings in the interface. Some enthusiast blogger or youtuber will create content highlighting those advanced features and then regular users will get themselves in trouble. Customer support gets way more complicated because you always have to consider the possibility that the user has (unknowingly) used these advanced features. If you put buttons in the interface users will press those buttons. That's just a fact of life. Advanced features always come at a cost. Sometimes that cost is worth it, but only sometimes.

scrubs | 8 months ago

That's an answer to the OP's post which, while compelling in spirit, is also generic.

A first class model of a supply chain for assembly manufacturing and an (even bitemporal) accounting database are just wildly different domains.

To stop open-ended rhetoric we need to hold something fixed/constant.

kamaal | 8 months ago

This is part of the reason why so many people are disillusioned by AI. We are attempting to tame complexity that shouldn't exist in the first place.

I'm guessing lots of the code that was getting written was verbose boilerplate; automating all that doesn't move the productivity needle much, because it shouldn't have existed at all to start with.

foundart | 8 months ago

I think the author's ideas are likely too complex for a wide audience, but they could be a game changer for those who can handle that kind of complexity.