Multicomp | 3 years ago
I find the vimmaxxing message common enough, but I disagree with it.
Not because of vim vs. nano vs. emacs, etc. It's that plain text is not enough.
I want to be able to encode more of my memory and context to my notes with a minimum of fuss, and therefore want to embed images (gasp!), tables, hyperlinks, and even file or sound embeds in my documents.
Yes, open simple file formats are better than closed complex formats (and the text vs. binary format question has something to be said on it as well).
No, I don't like that OneNote's .ONE file specification is super complex and therefore only semi-open.
But I'm not about to lock myself into a DEC dumb terminal's 80x24 limitations, where I have to learn some greybeard's key bindings for everything, all for the privilege of losing my ability to embed multimedia, while said gatekeepers tut-tut me that I should just make a link in my terminal to the local image file, never mind that you can't copy/paste an image into a bash terminal and have the file saved to that directory (OK, something for that probably exists, but it's some obscure thing).
So I watch Xournal++ very closely, specifically [this issue](https://github.com/xournalpp/xournalpp/issues/937), because it will give the total package: multimedia embeds, ink support, text support, file embeds, all in one file that I can sync with Syncthing wherever else I like.
But until then? OneNote desktop, sorry but not that sorry.
pxc | 3 years ago
> I want to be able to encode more of my memory and context to my notes with a minimum of fuss, and therefore want to embed images (gasp!), tables, hyperlinks, and even file or sound embeds in my documents.
I do this stuff with my notes in Emacs and it's really nice. Emacs is a graphical application, and its notetaking facilities are really outstanding. It even supports syntax-highlighted, executable, exportable code snippets, as well as all the basic text formatting, hyperlinks, tables of contents, tables, etc. It's a great tool for developers and sysadmins to use for notetaking.
If you know Emacs isn't for you, that's totally fine, but I wanted to throw this out there for bystanders: when someone suggests building your notetaking workflow around a great text editor, the proposition is not at all to somehow confine oneself to a 1980s terminal.
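(For the curious, the Emacs feature being described here is Org mode. A minimal sketch of an Org note, using standard Org syntax rather than any particular setup, looks something like this:

    * Server notes
    The [[https://orgmode.org][Org manual]] is a hyperlink; below are a table and a runnable snippet.
    | host   | role  |
    |--------+-------|
    | web-01 | nginx |
    #+begin_src python :results output
    print("executable, syntax-highlighted, exportable")
    #+end_src

Pressing C-c C-c inside the source block executes it and inserts the output right into the note.)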
jjoonathan | 3 years ago
Agreed 100%. My core workflow is "snap a picture, draw on it." I deal with hardware, and the bandwidth with which this lets me get physical information into the computer is so far beyond any Markdown-based workflow that I can't go back. I suspect this result would hold for many non-software professions.
For now, I've settled on GoodNotes because it's good enough at PDF backup, OCR search, and drawing. I have given solid tries to Evernote, ZoomNotes, Concepts, and Notion, and each of them is amazingly better than GoodNotes in at least one major respect, sometimes several, but none of them are good enough on those zero-compromise requirements, and GoodNotes is. FWIW.
EDIT: Oh, while I'm flamebaiting: I also use my iPad camera to take pictures of screens. Life is too short to schlep screenshots between computers, especially the ancient ones that run scientific equipment (lots of 68000, XP on some recent ones). Printer emulators and Java applets can get bent. Runs everywhere, my ass. Moore's Law has given us gigahertz and megapixels, and if I can use them to eliminate painful, asinine, pointless busywork, that's my god-given right and nobody is going to convince me otherwise. Sometimes I even use my iPad camera to take a screenshot of something on my PC and toss it into the mix. Sue me.
momojo | 3 years ago
> I want to be able to encode more of my memory and context to my notes with a minimum of fuss, and therefore want to embed images (gasp!), tables, hyperlinks, and even file or sound embeds in my documents.
I swear I'm not shilling for Google Keep, but I've found myself using it more and more over the last couple of years; it hits a sweet spot between "limited enough that I don't get distracted" and "advanced enough that I can still express myself".
The biggest praise is its ubiquity. It's just there when I need it. Interesting article + my in-the-moment thoughts? Google Keep-ed. Want to add a reflection at the end of the day with a picture? Google Keep-ed. Need to remember an address in the car? Toss it in the Keep and open it on my phone.
DysodiumRecut | 3 years ago
I totally agree with you. OneNote is fantastic and I would like to use it, but not being able to open the notes in Linux makes me avoid it. Do you know if there is a way of opening OneNote notes in Linux without using the webapp? (I don't even need to edit them!)
In the meantime, I'm using Stylus Labs Write [0], which uses svgz and is a good replacement for OneNote, and a compromise I'm willing to make to avoid lock-in, but tbh the OneNote app is just so much better.
[0] http://styluslabs.com/
usrbinbash | 3 years ago
And exactly which one of them can't I do in AsciiDoc?
joe8756438 | 3 years ago
I wrote a library to parse plain text notes, sowhat [0], which delivers a lot of semantic and structural information. Notes for me solve lots of problems in my routine that wouldn't work with a Markdown implementation: tasks, budgets, time management, reading lists, work logs, simple calculations that operate on the global notes environment, etc.
Sowhat handles transactional data well; you could implement double-entry bookkeeping, for example, with relative ease. Other things include links, events, quotes, tasks, formulas, and a few organizational elements. I combine these elements in different ways depending on the problem.
[0] https://github.com/tatatap-com/sowhat
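To give bystanders a feel for the idea (an illustrative sketch only, not sowhat's actual grammar or API), extracting semantic tokens from plain-text notes can start as a handful of regular expressions:

    import re

    # Hypothetical token grammar for plain-text notes: tasks, signed
    # dollar amounts, tags, and links. sowhat's real grammar is richer.
    TOKEN_PATTERNS = [
        ("task",   re.compile(r"^\s*\[( |x)\]\s+(?P<text>.+)$")),
        ("amount", re.compile(r"(?P<sign>[+-])\$(?P<value>\d+(?:\.\d{2})?)")),
        ("tag",    re.compile(r"#(?P<name>\w+)")),
        ("link",   re.compile(r"(?P<url>https?://\S+)")),
    ]

    def parse_note(text):
        """Return a flat list of semantic tokens found in a note."""
        tokens = []
        for lineno, line in enumerate(text.splitlines(), 1):
            for kind, pattern in TOKEN_PATTERNS:
                for match in pattern.finditer(line):
                    tokens.append({"kind": kind, "line": lineno, **match.groupdict()})
        return tokens

    note = "[ ] pay rent #home\ncoffee -$4.50 #budget\nhttps://example.com/article"
    for token in parse_note(note):
        print(token)

Once notes are tokens like these, budgets and work logs fall out of simple aggregations over the token stream.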
metadat | 3 years ago
https://wiki.c2.com/?CommonUserAccess
Etheryte | 3 years ago
The author makes a common statistical error in interpreting the Lindy effect. The Lindy effect proposes, simplified, that the longer something has been around, the longer it will probably stay around still. The author then makes a quick jump and posits that the opposite is also true, which it is not. Just because something has been around for a short time does not mean that its expected lifespan is somehow short. In other words, A implies B does not mean B implies A as well. All things that have been around for a long time had at one point only been around for a short time.
feral | 3 years ago
Disagree: even in your formulation, no 'quick jump' is needed.
Your statement A:"The longer something has been around, the longer it will be around" is not the opposite of B:"the shorter something has been around, the shorter it will be around".
Rather they mean the same thing. 'longer' and 'shorter' here are just English language ways of referring to the same time t that an object has been around.
If someone tells you "the longer a distance is, the more time it takes to walk it", that is exactly the same as "the shorter a distance is, the less time it takes to walk it"; there's no logical leap there.
I could conceive of a rule that says "archaeological artefacts are likely to be around for a long time", and it'd be a mistake to conclude that this means non-archaeological artefacts will only be around for a short time.
But that doesn't seem to be how the Lindy effect is formulated, either on Wikipedia or in your post, so there doesn't seem to be an error in applying it to new things.
jackpirate | 3 years ago
I believe you are incorrect. According to Wikipedia:
> The Lindy effect is a theorized phenomenon by which the future life expectancy of some non-perishable things, like a technology or an idea, is proportional to their current age.
This implies that things that have been around for a short period of time do in fact have a short expected lifespan. You're correct that "A implies B does not mean B implies A as well", but that assumption is not needed.
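(To make that concrete: one standard way to model the Lindy effect is a Pareto lifetime distribution; under that assumption the proportionality holds at every age, small or large. With minimum age t_0 and tail index \alpha > 1:

    S(t) = P(T > t) = (t_0/t)^\alpha, \quad t \ge t_0

    E[T - t \mid T > t] = \frac{\int_t^\infty S(u)\,du}{S(t)}
                        = \frac{t_0^\alpha\, t^{1-\alpha}/(\alpha-1)}{t_0^\alpha\, t^{-\alpha}}
                        = \frac{t}{\alpha - 1}

Expected remaining lifetime is proportional to current age t, so under this model young things really do have proportionally short expected futures.)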
szundi | 3 years ago
These statements of "longer" and "shorter" actually refer to probabilities, so it makes sense to say that when something is not "longer", it is "shorter". So I disagree; the OP's statements about the Lindy effect make sense.
Also, someone commented about COBOL. Of course, whether COBOL "is around" is also a matter of usage statistics. COBOL is declining and almost no one uses it anymore, so it is safe to say it is almost not around.
Eddy_Viscosity2 | 3 years ago
I think the Lindy effect can work in reverse. If you had a collection of 1000 things that were all just invented, the likelihood is that only a small percentage of them will be around and in use in 50 years. Compare that with a collection of 1000 things that have all been in use for 100 years: the percentage of those that will still be around in 50 years will be much greater.
Barrin92 | 3 years ago
That's not right. The Lindy argument holds in that case as well; it's just a different version of the doomsday argument.
The basis for this kind of reasoning is essentially that, if you can assume you are an 'average user' (and you have no reason to believe you're especially late or early), your prediction about the longevity of the project is most likely to be correct if you predict between 1/3x and 3x [1] its current lifespan.
That is because if, say, you predicted VS Code would exist a hundred times longer than it currently has, that prediction is only true if you are indeed among the first 1% of its users. 99% of VS Code users making that prediction will be wrong.
[1]https://cdn.vox-cdn.com/thumbor/2VfpAbtj-yOq5gHhYIdgrAIdBuw=...
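(The 1/3x-3x interval is easy to sanity-check with a simulation, assuming only that you observe a project at a uniformly random point in its life:

    import random

    # Gott's "delta t" argument: at a uniformly random moment in a
    # project's life, how often is the future between 1/3x and 3x the age?
    trials = 1_000_000
    hits = 0
    for _ in range(trials):
        total = random.expovariate(1.0)  # total lifespan; any distribution works
        age = random.random() * total    # uniformly random observation point
        future = total - age
        if age / 3 <= future <= 3 * age:
            hits += 1
    print(hits / trials)  # ~0.5, independent of the lifespan distribution

The condition reduces to "the elapsed fraction is between 1/4 and 3/4", which happens exactly half the time.)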
samatman | 3 years ago
This is correct; let me take a crack at explaining why.
The Lindy effect is named after a restaurant, Lindy's, which ironically closed recently. If we were to take it as invertible, we would have to conclude that a brand-new restaurant, which opened an hour ago, is most likely to shut down one hour from now.
This is an obvious absurdity: if we wish to speculate on the longevity of new things, we can't use the Lindy effect to do so. It's a good heuristic for betting that something will continue, it's a bad heuristic for betting something won't.
yarg | 3 years ago
That includes the expectation that something young will be (on average) half way through its lifespan.
zwieback | 3 years ago
Very true, but I think what's also implied in the article is that new tools (VS Code) that look very different from old tools (vi/emacs) will probably not last. So, if A is very old and B is very different from A, then B is less likely to succeed.
I don't believe that, btw, just guessing at the mindset of the author.
madrox | 3 years ago
I don't think there's anything in the wording of the Lindy effect that means it's only a one-way inference. It says the life expectancy of a thing is proportional to its age. That makes sense to me. If a product is going to die, it's likely to die fast. If a product sticks around five years, it's more likely it'll be around for ten more. It's why startups go under all the time, but you don't hear of a lot of 50-year-old companies going out of business.
I think the error is interpreting these things literally. If we did, then in 1990 we'd have said Windows wouldn't see the next century. Probabilities are attached.
blowski | 3 years ago
By the author’s logic, COBOL programs still in use today will long outlive Linux. That could even be true.
Aeolun | 3 years ago
There are 1000 things that have been around for 1 year.
There are 10 things that have been around for 10 years.
I think it’s safe to say that there is less chance of any individual thing that’s 1 year old being around in 1 year than the things that have already survived for 10.
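(This selection effect is easy to demonstrate too, assuming a population with mixed lifetime distributions, here exponential with different means:

    import random

    # Items come in durable and fragile varieties; surviving to age 10
    # selects the durable ones, so survivors have longer expected futures.
    random.seed(0)
    lifetimes = [random.expovariate(1 / random.choice([1, 5, 50]))
                 for _ in range(100_000)]

    def mean_remaining_life(age):
        remaining = [t - age for t in lifetimes if t > age]
        return sum(remaining) / len(remaining)

    print(mean_remaining_life(1))   # survivors at age 1
    print(mean_remaining_life(10))  # survivors at age 10 last much longer on average

No individual item ages in reverse here; the population average shifts because the fragile items have already died off.)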
avindroth | 3 years ago
> The Lindy effect proposes, simplified, that the longer something has been around, the longer it will probably stay around still.
I think there is a linguistic ambiguity here. What is meant by "the longer something has been around"? Longer with respect to what?
If you mean to say if A has been around longer than B, then A will stay around longer than B (in expectation), then this law is commutative with respect to A and B and what you specify as "opposite" holds.
If you mean to say if something has been around for t then it will stay around for some f(t) where f is an increasing function, then again what you specify as "opposite" holds.
randal_handle | 3 years ago
The converse of "the longer something has been around, the longer it will probably stay around still" is "the longer something will probably stay around (in the future), the longer it will have already been around (from the past)." The change from longer to shorter is not a logical converse.
ascar | 3 years ago
> In other words, A implies B does not mean B implies A as well.
I think you wanted to write: A implies B does not mean "not A" implies "not B" as well. "B implies A" would be: the longer something will probably stay around, the longer it already has been around.
dwighttk | 3 years ago
But a lot more things have only ever been around for a short time. Only a few made it through that "having been around a short time", stayed around longer than expected, and became exceptions to the rule.
sclangdon | 3 years ago
I don't really understand this idea of never taking your hands off the keyboard. Maybe people program differently to me, but most of the time I'm not typing anything. Most of my time is spent thinking. When my thoughts are clear and the problem is solved, then I type. And when I do, it's usually no more than a dozen lines at a time.
I get the impression from these people that they are constantly typing things. In fact, they're typing so much that they can't possibly waste valuable seconds using a mouse. I must be misunderstanding what they mean because that just can't be right.
And what's with the "you can achieve the same thing faster, without breaking your concentration" in regard to using a spell-checker or a calculator or whatever. Are you being serious? I can achieve the same thing faster? I mean how long do you think it takes to check the spelling of a word? Even if I must look it up in a physical dictionary, how long are we talking here?
Guys, seriously, slow down. You're going to burn out. I don't want to judge because I don't know you. Maybe you're a rockstar, but I'd guess that if you're really going this fast, the quality of your code is suffering.
PaulHoule | 3 years ago
It's a running annoyance of Windows that they are always making small changes to the UI that don't really come across as an improvement or a deterioration, but that force you to relearn things.
It really drove me crazy when I went from being a Linux partisan to being responsible for quite a few different Windows machines. On a given day I could be working with anything from Win 98 to Win ME to Win NT to various editions of Win 2000 and XP, and if you had to find something in the UI it would be slightly different in all of them, which was a cognitive load. Contrast that to Linux, where I did it all on the command line and it stayed the same over that time frame.
jchw | 3 years ago
Don't fear the Wayland. Wait patiently for the pain points to be worked out, and then reap the reward of a smoother and more robust desktop.
I’ve been running SwayWM for multiple years and it’s been great. There’s not anything I’m aware of that I can’t do in wlroots compositors; I even have Zoom screenshare working on my work computer under SwayWM. Most stuff, like WebRTC, doesn’t even require manual tweaks; just need the right packages installed. And you get the typical Wayland benefits, like great support for heterogeneous DPI, reduced jank, and potentially better robustness. (Depends on compositor for now; but there is a path towards compositor crash recovery, which should make things far better.)
A Wayland compositor comparable to XMonad will likely arise as a good successor in the future.
superkuh | 3 years ago
My strategy is to keep the software for a desktop computer stable and of its era forever. If it gets so old that I'm having trouble compiling things because my glib and gcc are too old, then I'll build an entirely new desktop with an up-to-date OS and software, set it up, and use it until it can't do new software again. This happens every 5 to 10 years. I never lose ability. I only gain it.
There are many things my 2010 era Core2Duo running Ubuntu 10.04 can do that my fancy new Ryzen desktop with Debian 11 can't and won't ever be able to do. Things I cannot give up because they're too important for my daily life.
>And when Wayland finally happens? Well. I guess I’ll have no choice but to stop using computers forever ¯\_(ツ)_/
Wayland isn't going to happen. https://dudemanguy.github.io/blog/posts/2022-06-10-wayland-x...
mgraczyk | 3 years ago
Do yourself a favor
njharman | 3 years ago
This is not an argument for unchanging tools. It's an argument for becoming a power user. Pick good, powerful, customizable tools. When they were made is immaterial. Learn them to expert level.
The "innovations" toolmakers try to push on you will almost never be worth the cost of losing all your knowledge, customization to your uses, and muscle memory.
rbanffy | 3 years ago
OTOH, ed is 3 years older (Wow! 3 years between ed and Emacs!), and I'm very happy I don't use it today.
limpbizkitfan | 3 years ago
There are plenty of old tools in consistent/present use that are cumbersome wrecks, too. Curating good things and calling them some buzzword is silly.
shadowofneptune | 3 years ago
Interesting that the author singles out using a search engine as a calculator as a bad idea. I do use proper calculators sometimes, but what makes Google attractive as one is that it can handle both unit conversions and natural language. "30 milliliters * 50 in ounces" is an example. It's surprisingly flexible in what it recognizes.
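(For the record, the same conversion offline is one constant away, assuming US fluid ounces:

    ML_PER_US_FLOZ = 29.5735
    print(30 * 50 / ML_PER_US_FLOZ)  # 1500 mL is about 50.7 fl oz

The convenience of the search-engine version is not having to remember the constant.)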
eternityforest | 3 years ago
The Lindy effect is a predictive tool. Whether something is good or not is much more complicated.
For one thing, switching GUI tools has almost no cost, if that tool doesn't have a significant amount of non-ephemeral user content.
It may be different for people with a strong muscle memory, but I can switch calculators or dictionary apps at any time. Basic GUI apps aren't skills you learn, they're things you get vaguely familiar with, the discoverable UIs guide you even if you don't know what you're doing. The learning time is minutes to days at most.
If I have to learn a new app in 3 years, that's fine. It won't take me much more effort than it probably would to maintain the config for enough Vim plugins to get it to act somewhat like I want it to.
I could probably even switch away from something as big as LibreOffice without trouble, if they used the same file formats and actually gave a reason I might want to switch.
Plus, Android itself is still new, and for most things, mobile is what really matters to me. Note taking is worthless if I can't access or write down the notes when I think of them or want to check them.
Perhaps if I were doing more advanced programming, more of my notes would be taken at a keyboard?
These simple old tools seem really use-case-specific. Speed of text editing is less critical if most of what you do is interact with modern frameworks, where things might change too fast and the projects might be too big to memorize, you're relying much more on IDE features to help you, and you're spending twice as much time researching as actually coding.
j7ake | 3 years ago
Diagrams, photos, images, equations.
How do you vim your way out of that?
If they are going to use Lindy arguments on note taking, then it should be pen and paper.
If you still want to use technology, then use pen and paper to take notes, then take a photo of the page afterwards.
NonNefarious | 3 years ago
This article proposes that the user set up (and remember) a bunch of alternative utilities that are no better, or at least no less laborious, than a Web search.
"I’ve combined XMonad and Chrome to get little floating web apps all over my desktop"
WTF, that is the last thing I'd want. The world (even the Mac world) has finally moved away from the asinine floating-window fad.
dangus | 3 years ago
Long-lived, stable tools are great things. However, it's also not great to be stuck in your ways and unwilling to adapt.
> The problem with most notetaking apps is editing text outside Vim breaks my brain.
I see this as an unwillingness to learn. I felt like the tone of the article was of the sort where I was just there to be told that I'm inferior for using a mouse.
Microsoft Word is actually an older program than vim (not older than vi), so obviously the author should switch to Word to take notes instead of vim.
Zababa | 3 years ago
The corollary to that is that unless you go out of your way to create a "cool" desktop for yourself, the desktop you use will change every few years, usually breaking your habits and becoming less and less usable. For example, I've been using Pop!_OS for a while. It has the terrible GNOME flaw that you can't see thumbnails in the file picker; you can only see a preview of the image you currently have selected. Or you could, a few months ago. Ever since a relatively recent update, I can't even see the preview anymore. They made something that was bad for years even worse. Same thing with Windows: I can't find my way around the new options or settings or whatever that is.
I'm trying to slowly move towards more stable software. I'm still relatively young, but I don't want to spend my whole life adjusting my habits to random new changes.
Animats | 3 years ago
Counterexamples: technologies that lasted a long time, then hit a hard dead end.
- NTSC / PAL video
- Audio on magnetic tape
- Video on magnetic tape
- Manual transmissions in cars
- Daily newspapers
- Mimeograph machines
- Asbestos
AdmiralAsshat | 3 years ago
The KDE enthusiast would probably note that KDE was originally called the "Kool Desktop Environment". But even the most ardent KDE enthusiast wouldn't argue that it does not change.
sbf501 | 3 years ago
I've yet to see a GUI (FVWM, KDE, Gnome) that offers something X/Athena/Motif didn't nail 30+ years ago.
dredmorbius | 3 years ago
Look, in the case of all other software, I believe strongly in "release early, release often". Hell, I damned near invented it. But I think history has proven that UI is different than software. The Firefox crew believe otherwise. Good for them, and we'll see.
HN-safe archive link: https://web.archive.org/web/20120511115213/https://www.jwz.o...
Software performance improvements tend to come from hardware (Moore's Law, still-ish), and software algorithm (in the old-school sense of how information is actually processed) improvements. Leaning on the UI for massive performance enhancement is a bit like expecting order-of-magnitude income improvements by increasing your working hours. There's only so much time in a day, and there is only a limited rate at which humans can interact with digitised information --- generally text, images, video, audio, and data.
The Mother of All Demos hit its fiftieth anniversary four years ago:
https://news.ycombinator.com/item?id=31676445
And yet, it incorporated very nearly all the basic human-computer interface principles still used today.
Apple's Macintosh has seen two principal variants of its desktop UI in the 38 years of its existence. And the second, OS X / Aqua, is now eight years older than Mac Classic was when OS X was introduced. Apple are highly conservative in UI changes.
I'm not principally an Apple user, or fan. But for my desktop I use Window Maker, an environment inspired by the modern Mac's predecessor, NeXT. There's been very little development in years, but the product is stable, and it still works even on retina-class displays. The fact that I don't have to go hunting down new interactions every few months or years is a tremendous advantage. And if you want, twm is still a serviceable window manager.
My own tools collection strongly resembles Cipriani's. Applications and tools learned decades ago still see regular use. I can do what I intend when I want, without being buffeted by constant winds of change and shifting fashions. And quite frankly, it's awesome and a bit of a superpower.