hugryhoop | 2 years ago
Now we do the same, but we look at the text editors of 1995, which used 4 MB of RAM, as incredibly efficient and well made, paragons of craftsmanship.
Things never change, the old generation fights the new one and calls it stupid.
nox101|2 years ago
I'm not saying old editors weren't more efficient, but the stuff editors handle today has gotten more complex. LSP servers do far more analysis than any 1995 editor, and they do it in an editor-agnostic way. It costs more memory, but it also lets us all jump into the future faster, rather than every editor having to implement its own support for every language.
amszmidt|2 years ago
Remote editing back in the 1980s was such a common thing on the Smalltalk and Lisp Machines that all system code lived on another machine; more often than not you wouldn't even notice that it was a remote file!
One could do "emoji" just fine as well, and files would have a WYSIWYG-like look to them using "fat strings" -- that is 1980s technology. There was a dungeon crawler that used that feature to render its map as graphics; it is also how you would implement chess pieces or other "picture"-like stuff.
Auto-completion was already standard, as were "who calls" / "who uses" lookups to figure out where things are used, online documentation, etc etc etc...
So all this was perfectly possible, and already used and abused in 1995 -- VSCode isn't doing anything new in that regard.
vidarh|2 years ago
A color dialog was tens of KB of code in the 1980s.
My own editor handles Unicode well enough for most users in a few dozen lines of code. RTL would take a bit more, but not much. LSP servers, if anything, reduce the need for the editor's own resource use to grow.
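A minimal sketch of that kind of lightweight Unicode handling, in Python using only the standard library (the function name is illustrative, not from any particular editor):

```python
import unicodedata

def display_width(text: str) -> int:
    """Rough terminal-column width of a string: wide CJK characters
    count as 2 cells, combining marks as 0, everything else as 1."""
    width = 0
    for ch in text:
        if unicodedata.combining(ch):
            continue  # combining marks overlay the previous cell
        if unicodedata.east_asian_width(ch) in ("W", "F"):
            width += 2  # full-width characters occupy two columns
        else:
            width += 1
    return width
```

It ignores plenty of edge cases (grapheme clusters, zero-width joiners), but it covers what most users type day to day, in far fewer lines than you might expect.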
It's not that these justify no extra resource use -- they do -- but they don't need to significantly increase it.
A lot of apps get away with huge resource use simply because people aren't used to paying attention to it any more: each app in isolation affects them little enough that, when it does matter, addressing the resource use of any one app hardly makes a difference.
dale_glass|2 years ago
A Unicode font is easily 15 MB in size, and you'll likely have several of those. Add the code and memory it takes to do all the magic of rendering it, hinting, and subpixel antialiasing.
Then there's the fact that a 4K framebuffer is 32 MB in size (3840 × 2160 pixels at 4 bytes each).
Smooth compositing requires every running program to have a buffer it draws into. So there go a couple hundred MB more, just to make sure you don't see the screen repaint like in Windows 3.1.
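The arithmetic behind those numbers, as a quick Python sketch (assuming 4 bytes per pixel and, arbitrarily, ten composited full-screen windows):

```python
def buffer_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Size of one uncompressed RGBA buffer."""
    return width * height * bytes_per_pixel

fb_4k = buffer_bytes(3840, 2160)
print(fb_4k / 2**20)        # one 4K framebuffer: ~31.6 MiB
print(10 * fb_4k / 2**20)   # ten composited full-screen windows: ~316 MiB
```

So a compositor with a dozen windows open really does commit hundreds of MB to pixels alone, before any application logic runs.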
Yeah, you can have compact software if your only requirement is ASCII at 80x25 and nothing fancier than editing text.
wolvesechoes|2 years ago
Behold! The peak of technological prowess! So many poor souls of the past died in misery and 4 MB of RAM. They could not taste those sweet fruits of progress.
vidarh|2 years ago
As an example of "waste" that is fine, I'd point to AmigaE's choice to read the entire source file into memory, instead of doing IO character by character, or line by line from small buffers. It was a recognition that there was no compelling reason not to sacrifice that small amount of RAM for simplicity and speed. What you gain can differ, but as long as the benefit is proportionally good enough relative to the cost, that's fine.
But so much modern software pulls in huge dependencies for very little benefit, or tries to be far too much, instead of being a focused tool that interoperates well.
It's not so much that the new generation is stupid, as that a lot of people (of any generation) always choose the easy option instead of stopping to think. Sometimes that's the right tradeoff, often it's not.
And hardware advances mean you can get away with more and more. Sometimes that justifies more extravagant resource use. Often it doesn't.
zozbot234|2 years ago
This is only an issue if your OS doesn't have virtual memory and mmap. Modern OSes automatically use free RAM to cache and prefetch files ("free RAM is wasted RAM"), so slurping a file doesn't even necessarily cost extra IO. I think newer versions of AmigaOS were supposed to get virtual memory support at some point, too.
glenstein|2 years ago
I was with you until here, which I think is the wrong take; in fact it gets things exactly backwards. It's not just that every generation gets upset at the next one, so let's all shrug and move on; it's that this is really a thing unfolding from one generation to the next.
It seems like the reflex of "oh well, the previous generation said it too, so let's ignore it" comes up a lot, to the point that I have a go-to example I use every time it does. I'm a baseball fan. One thing you used to hear in the '80s about a guy like, say, Rob Deer or Steve Balboni was that they tried too hard to hit home runs and struck out too much. Then you heard that in the '90s as well. Then you heard it in the 2000s, especially with Moneyball and guys like Jack Cust. Then it just kept getting even more extreme with guys like Carlos Pena and now Joey Gallo.
So one thing you could say is, well, every generation says there were fewer strikeouts in their day. But there's actually data on this and... it's true! Almost every decade, from the 1800s through every decade of the 1900s through now, strikeouts really have been going up year to year. So that intergenerational commentary is describing a real thing that really is happening.
The same can be said of other things, like people saying they remember the environment being better, or people saying attention spans are getting shorter. But they are.
The instinct here, I think, is to dismiss these claims because every generation makes them. But the conclusion should be the opposite: these are real things unfolding on a multi-generational level. So if you see it happening with software, maybe that's because there's really something to it.
dist-epoch|2 years ago
Tell a kid learning to program today "you should program in assembly because it's efficient, like I did back in my days".
Kid looks around and sees it would take him 3 days to implement a hello world in assembly, but only 3 seconds in Python. He has a 16-core computer with 64 GB of RAM. Both hello worlds run instantly. So how does that advice make sense? Kid calls you a crazy old man out of touch with the times, then goes on to run a 50 GB LLM locally to generate a hello world, and feels very excited about the future of programming.
eropple|2 years ago
I agree with the factual observations in your post, but there's an additional bit here, which is that there's qualitative value being assigned to what The Youths don't mind and The Olds protest. In baseball, the guys who strike out a lot but hit a ton of home runs create more runs, and therefore more wins, than most base-hit machines (obvious outliers exist, but you get the idea). On my computer, VS Code does more things that benefit me than vim does (and the outlier here, I guess, would be "a lovingly crafted vim monstrosity that uses all the LSPs etc. designed for VS Code et al in the first place" -- doable but not the happy path, etc.).
There's also (and IMO this is more true of code than of baseball) some kind of bizarre moral valence assigned, one that I don't even pretend to understand, but that's a different story.
Qwertious|2 years ago
That's because text editors don't exist in a vacuum: the 4 MB-RAM text editor would be slow on a 1995 computer but blazingly fast on a 2024 computer.
VS Code is slow and annoying to use, and RAM is just a more measurable symptom of that.
thejosh|2 years ago
JetBrains (IntelliJ IDEA, PyCharm, etc.) put a lot of effort into making their IDEs low latency, as it was getting to the point of being almost ridiculous. The editor is built in Java, and they ship their own runtime with many hacks and tweaks to make it work well as a desktop app (font rendering, etc.).
Pavel Fatin has a [great article](https://pavelfatin.com/typing-with-pleasure/) about typing latency and his work around implementing this in IntelliJ, well worth a read.
vidarh|2 years ago
Running a heavily scriptable editor with a GUI was entirely viable on a 7.16MHz Amiga with 1MB RAM, and more responsive than many modern editors.
Integration with other tooling, like RCS, compilers, or linkers was a given for Amiga apps at that time.
(FrexxEd was also notable for exposing internal buffers as files in a special filesystem, so you could e.g. lint, compile, or call your revision control tools, limited as they were, directly on the buffers without any custom integration.)
ben_w|2 years ago