jdvh|8 months ago
This is why people make backups by copy-pasting files. This is why Excel is so dominant. This is why systems like hypercard and git are not mainstream and never will be.
There is a large universe of tools people would love if only they would bother to learn how they worked. If only. Most people will just stick to whatever tools they know.
For most people the ability to go back and forward in time (linear history) is something they grasp immediately. Being able to go back in time and make a copy also requires no explanation. But having a version tree, forking and merging, having to deal with multiple timelines and the graphs that represent them -- that's where you lose people.
alphazard|8 months ago
I agree this is an accurate rule of thumb. However if the complexity lets users achieve more, then the complexity can earn its keep. Using version control is so beneficial that software engineers deal with the complexity. The ability to maintain a more complicated model in one's head and use it to produce more value is not something that all users are able to do. More sophisticated users can afford to use more complicated tools.
However the sophisticated users are reined in by network effects. If you want to work with people then everyone needs to be able to deal with the complexity. Programmers are more sophisticated than most office workers, which is why we ubiquitously version codebases, and not so much spreadsheets.
> This is why systems like hypercard and git are not mainstream and never will be.
We are moving towards a world where fewer humans are needed, and the humans that are needed are the most sophisticated operators in their respective domains. This means weaker network effects and less drag from unsophisticated users holding back the tooling. The worst operators drop off, and the population average increases.
I would not be surprised to see an understanding of version control and other sophisticated concepts become commonplace among the humans that still do knowledge work in the next few years.
jancsika|8 months ago
There's no good reason why that should be the case. E.g., one could imagine the guts of the "copy-pasting files" UI being a VCS. That would keep the original 100% of the userbase plus allow whatever percentage to level up if/when the need arises (or go "back in time" in the event of a major screw-up).
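The idea is that a plain "save a copy" button could quietly keep linear history underneath. A minimal sketch (hypothetical names and layout, not any real tool): each save snapshots the old version into a hidden per-file history, so the user gets "back in time" without ever learning a VCS.

```python
import shutil
from pathlib import Path

HISTORY_DIR = ".history"  # hypothetical hidden snapshot store


def save(path: str, content: str) -> None:
    """Overwrite the file, but first snapshot the old version into linear history."""
    p = Path(path)
    hist = p.parent / HISTORY_DIR / p.name
    hist.mkdir(parents=True, exist_ok=True)
    if p.exists():
        n = len(list(hist.iterdir()))
        shutil.copy2(p, hist / f"{n:06d}")  # monotonically numbered snapshots
    p.write_text(content)


def undo(path: str) -> None:
    """Restore the most recent snapshot -- 'go back in time' one step."""
    p = Path(path)
    hist = p.parent / HISTORY_DIR / p.name
    snaps = sorted(hist.iterdir())
    if snaps:
        shutil.copy2(snaps[-1], p)
        snaps[-1].unlink()
```

The point isn't this particular implementation -- a real one could use git under the hood -- just that the linear-history UI most people already understand can front a proper version store.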
It's just that software UX in 2025 is typically very bad. The real axiom: the longer you run an application, the more likely it will do the opposite of its intended purpose.
Oops, the word "stash" in git has an idiosyncratic meaning. That content has been removed from the history I was trying to keep. Fuck.
Oops, "Start" in Windows pauses interactivity and animation until ads are ready to be displayed in the upcoming dialog. Fuck!
Especially in the latter case, I don't think users are deterred by the cognitive load required to interact with the interface. It's probably more a case of them being deterred because the goddamned stupid thing isn't doing what it's supposed to.
scrubs|8 months ago
A first class model of a supply chain for assembly manufacturing and an (even bitemporal) accounting database are just wildly different domains.
To stop open-ended rhetoric we need to hold something fixed/constant.
kamaal|8 months ago
I'm guessing lots of the code that was getting written was verbose boilerplate, so automating it doesn't move the productivity needle all that much. That code shouldn't have existed in the first place.
foundart|8 months ago