waboremo|2 years ago
Windows is especially bad at this due to so much legacy reliance, which is also kind of why people still bother with Windows. Not to claim that Linux or MacOS don't have similar problems (ahem, Catalyst) but it's not as overt.
A lot of the blame gets placed on easy to see things like an Electron app, but really the problem is so substantial that even native apps perform slower, use more resources, and aren't doing a whole lot more than they used to. Windows Terminal is a great example of this.
Combine this with the fact that most teams aren't given the space to actually maintain (because maintaining doesn't result in direct profits), and you've got a winning combination!
blincoln|2 years ago
I think this blame is fair. Electron is the most obvious example, but in general desktop software that essentially embeds a full browser instance because it makes development slightly easier is the culprit in almost every case I've experienced.
I use a Windows 10 laptop for work.[1] The app that has the most lag and worst performance impact for as long as I've used the laptop is Microsoft Teams. Historically, chat/conferencing apps would be pretty lightweight, but Teams is an Electron app, so it spawns eight processes, over 200 threads, and consumes about 1GB of memory while idle.
Slack is a similar situation. Six processes, over 100 threads, ~750MB RAM while idle. For a chat app!
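To make the arithmetic concrete, here's a tiny sketch of how those per-app totals add up. The per-process numbers below are made up for illustration (loosely modeled on the figures above: 8 processes, ~1GB total); on a real machine you'd feed this from Task Manager or a library like psutil instead of a hardcoded list.

```python
from collections import defaultdict

# Hypothetical per-process resident set sizes (MB) for an idle Electron
# chat app. These numbers are illustrative, not measured.
processes = [
    ("Teams.exe", 310), ("Teams.exe", 180), ("Teams.exe", 150),
    ("Teams.exe", 120), ("Teams.exe", 90), ("Teams.exe", 80),
    ("Teams.exe", 60), ("Teams.exe", 40),
]

def totals(procs):
    """Aggregate (name, rss_mb) pairs into name -> (process count, total MB)."""
    agg = defaultdict(lambda: [0, 0])
    for name, rss in procs:
        agg[name][0] += 1
        agg[name][1] += rss
    return {name: tuple(v) for name, v in agg.items()}

print(totals(processes))  # {'Teams.exe': (8, 1030)}
```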
Microsoft recently added embedded Edge browser controls into the entire Office 365 suite (basically embraced-and-extended Electron), and sure enough, Office is now super laggy too. For example, accepting changes in a Word doc with change tracking enabled now takes anywhere from 5-20 seconds per change, where it was almost instantaneous before. Eight msedgewebview2.exe processes, ~150 threads, but at least it's only consuming about 250MB of RAM.
Meanwhile, I can run native code, .NET, Java, etc. with reasonable performance as long as the Electron apps aren't also running. I can run multiple Linux VMs simultaneously on this laptop with good response times, or I can run 1-2 Electron apps. It's pretty silly.
[1] Core i5, 16GB RAM, SSD storage. Not top of the line, but typical issue for a business environment.
joshstrange|2 years ago
That "slightly" is doing a massive amount of heavy lifting in that sentence.
I run a company on the side that produces software for events, which requires a website and mobile apps for iOS (iPhone and iPad) and Android. I cannot imagine doing all of this on my own without being able to share a codebase (mobile apps built via Capacitor) across all of them. Would native apps be faster? Almost certainly, but I'm not going to learn Kotlin and Swift and triple the number of codebases I have to work in. It's completely infeasible for me; maybe some of you are able to do that, but I'm not. There aren't enough hours in the day.
I fully understand the cruft/baggage that methods like this bring but I also see first-hand what they allow a single developer to build on their own. I'll take that trade. I'm a little less forgiving of large companies but Discord and Slack (and other Electron apps) work fine for me, I don't see the issues people complain about.
tracker1|2 years ago
It's not an easy task, and it's not something that anyone has really done. There are plenty of single platform examples, and Flutter is about as close as you can get in terms of cross platform.
There are also alternatives that can use the engine of an installed OS browser. Tauri is a decent example for Rust. Also, Electron isn't to blame for the issues with Teams. VS Code pretty much proves you can create a relatively responsive application in a browser interface.
kaba0|2 years ago
By requiring more than that, we had to increase the essential complexity. I believe this tradeoff in itself is well worth it (and hopefully we can all agree on that going back to us-ascii-only locale is not a forward direction).
The problem I see is that the layers you also mention, each expose leaky abstractions (note that abstractions are not the problem, no person on Earth could implement anything remotely useful without abstractions — that’s our only tool to fight against complexity, of which a significant amount is essential, that is not reducible). Let’s also add a “definition” I read in a HN comment on what constitutes an ‘expert’: “knowing at least 2 layers beneath the one one is working with” (not sure if it was 1 or 2).
Given that not many people are experts and a tendency of cheaping out on devs, people indeed are only scratching that top layer (often not even understanding that single one!), but the problem might also be in how we organize these layers? When an abstraction works well it can really be a breeze and a huge (or only significant, see Brooks) productivity boost to just add a library and be done with it — so maybe the primitives we use for these layers are inadequate?
citrin_ru|2 years ago
Not my area, but AFAIK modern (Electron-based) desktop apps are less accessible than classic win32 apps from the win2k - win7 era.
leidenfrost|2 years ago
It's the fact that manpower can't keep up with the exploding amount of complexity and use cases that happened to computing in the last decades.
We went from CLI commands and a few graphical tools for the few who actually wanted to engage with computers, to an entire entertainment ecosystem where everyone in the world wants 'puters to predict what they might want to see or buy next.
To maintain the same code efficiency we had in the '90s-2000s, we would need to instantly jump the seniority of every developer in the world, straight from junior to senior+. Yes, you can recruit and train developers, but how many Tanenbaums and Torvalds can you train per year?
The cruft didn't only go to dark patterns and to features like animations and rendering that some people regard as "useless" (which is debatable, at minimum). The layers also went to improving "developer experience".
And I'm not talking about NodeJS only. I'm talking about languages like Python, Lua, or even the JVM.
There's a whole universe of hoops and safeguards built so that the not-so-genius developer doesn't shoot themselves in the foot so easily.
I'm sure you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or counting calories on a smartwatch.
TeMPOraL|2 years ago
This doesn't strike me as something "everyone in the world wants", but rather something a small group of leeches is pushing on the rest of the population to enrich themselves at everyone else's expense. I've yet to meet a person who would tell me they actually want computers to tell them what to see or buy. And if I met such a person, I bet they'd backtrack once they learned how those systems work.
Exercise for the reader: name one recommendation system that doesn't suck. They all do, and it's not because recommendations are hard. Rather, it's because those systems aren't tuned to recommend what the users would like - they're optimized to recommend what maximizes vendor's revenue. This leads to well-known absurdities like Netflix recommendations being effectively random, and the whole UX being optimized to mask how small their catalogue is; or Spotify recommendations pushing podcasts whether you want them or not; or how you buy a thing and then get spammed for weeks by ads for the same thing, because as stupid as it is, it seems to maximize effectiveness at scale. Etc.
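The mechanism is simple enough to show in a toy sketch: rank the same catalogue once by predicted user interest, and once by interest weighted with the vendor's margin. All titles and numbers below are invented for illustration; real systems are vastly more complicated, but the misaligned objective works the same way.

```python
# Toy catalogue: (title, predicted user interest 0-1, vendor margin 0-1).
# Everything here is made up for illustration.
catalogue = [
    ("indie film you'd love", 0.9, 0.1),
    ("licensed blockbuster", 0.6, 0.3),
    ("in-house original", 0.5, 0.9),
]

# What the user would want: rank by predicted interest alone.
by_interest = sorted(catalogue, key=lambda x: -x[1])

# What maximizes vendor revenue: rank by interest * margin.
by_revenue = sorted(catalogue, key=lambda x: -(x[1] * x[2]))

print(by_interest[0][0])  # indie film you'd love
print(by_revenue[0][0])   # in-house original
```

Same catalogue, same "relevance" model; swapping the objective function is all it takes to push the house content to the top.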
> I'm sure that you can delete all of that, only leave languages like Rust, C and C++ and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie in Netflix or counting calories on a smartwatch.
I'll say the same thing I say to people when they claim banning ads would annihilate 90% of the content on the Internet: good. riddance.
Netflix would still be there. So would smartwatches and calorie counting apps. We're now drowning in deluge of shitty software, a lot of which is actually malware in disguise; "annihilating 90% of the software development workforce" would vastly improve SNR.
sigotirandolas|2 years ago
The early 00’s “open standard” of web forum + eMule + VLC would still be light years ahead of Netflix & co. if it weren’t for how hard it’s been gutted by governments, copyright lobbies, ISPs, and device/platform vendors over the years. Heck, the modern equivalent often still is (despite all the extra hoops), unless you’re trying to watch the latest popular show in English.
enterprise_cog|2 years ago
You are looking back with rose tinted glasses if you think all software was blazing fast back then. There was a reason putting your cursor on a progress bar to track whether it was moving was a thing.
wongarsu|2 years ago
Part of the "problem" with Windows is also lack of legacy reliance. As in: MacOS and Linux are at heart Unix systems, with a kernel architecture meant for 1970s hardware. The Windows NT kernel family is a clean-sheet design from the 1990s, a time where compute resources were much more plentiful.
For example, on Linux file system access has (by default) very basic permissions, and uses a closely coupled file system driver and memory system in the kernel. On Windows there is a very rich permission system, and every request goes through a whole stack of Filesystem Filter Drivers and other indirections that can log, verify, or modify it. This is great from a functionality standpoint: virus scanners get a chance to scan files as you open them and deny you access if they find something, logging or transparent encryption is trivial to implement, tools like DropBox have an easy time downloading a file as you access it without implementing a whole file system, the complex permission system suits enterprise needs, etc. But on the other hand all these steps make the system a lot slower than the lean Linux implementation. And similar resource-intensive things are happening all over the kernel API in Windows, simply because those APIs were conceived at a time when these tradeoffs had become acceptable.
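The shape of that filter stack is easy to model. This is a toy sketch of the idea, not the real NT minifilter API: every open request passes through each filter in order, and any filter can log it or veto it before the filesystem ever sees it, which is exactly why each access pays the cost of the whole chain.

```python
# Toy model of a filesystem filter-driver stack: every open() flows
# through an ordered chain of filters before reaching the filesystem.
# Names and structure are illustrative, not the actual NT driver model.

class AccessDenied(Exception):
    pass

def antivirus_filter(request, next_filter):
    # A scanner gets first look and may deny the access outright.
    if request["path"].endswith(".virus"):
        raise AccessDenied(request["path"])
    return next_filter(request)

def audit_filter(request, next_filter):
    # A logging filter records the access, then passes it along.
    request.setdefault("log", []).append(f"open {request['path']}")
    return next_filter(request)

def filesystem(request):
    # The real filesystem driver sits at the bottom of the stack.
    return f"handle:{request['path']}"

def build_stack(filters, base):
    """Chain the filters so each one calls the next, ending at `base`."""
    def call(request, chain=tuple(filters)):
        if not chain:
            return base(request)
        head, rest = chain[0], chain[1:]
        return head(request, lambda req: call(req, rest))
    return call

open_file = build_stack([antivirus_filter, audit_filter], filesystem)
print(open_file({"path": "report.docx"}))  # handle:report.docx
```

Each filter adds a small, justifiable cost; the slowdown comes from every single file operation paying all of them, every time.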
c00lio|2 years ago
Yes, but it still seems to be useless to implementers, because practically every virus scanner implements braindead stuff like DLL injection for on-access scanning.
sillywalk|2 years ago
I thought the NT kernel was heavily based on VMS, from when Dave Cutler, DEC's chief OS architect/guru, left for Microsoft and took a bunch of engineers with him. FTA:
"Why the Fastest Chip Didn't Win" (Business Week, April 28, 1997) states that when Digital engineers noticed the similarities between VMS and NT, they brought their observations to senior management. Rather than suing, Digital cut a deal with Microsoft. In the summer of 1995, Digital announced Affinity for OpenVMS, a program that required Microsoft to help train Digital NT technicians, help promote NT and Open-VMS as two pieces of a three-tiered client/server networking solution, and promise to maintain NT support for the Alpha processor. Microsoft also paid Digital between 65 million and 100 million dollars."
[0] https://www.itprotoday.com/windows-client/windows-nt-and-vms...
kaba0|2 years ago
It later turned out to be due to some Unicode handling built into a Windows API they were using, while the developer's version was also not completely feature-complete. But both sides were sort of right.
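The thread doesn't say which API or what the bug was, but a classic example of this class of Unicode pitfall is normalization: two strings that render identically can be stored as different codepoint sequences, so naive byte-wise comparison disagrees with what the user sees.

```python
import unicodedata

# "é" can be stored as one composed codepoint (NFC, U+00E9) or as
# "e" plus a combining accent (NFD, U+0065 U+0301). Both render the
# same, but compare unequal unless one side normalizes first.
nfc = "caf\u00e9"                        # composed form
nfd = unicodedata.normalize("NFD", nfc)  # decomposed form

print(nfc == nfd)                                # False
print(unicodedata.normalize("NFC", nfd) == nfc)  # True
print(len(nfc), len(nfd))                        # 4 5
```

Any layer that normalizes behind your back (or fails to) can make "the same" filename or string mysteriously mismatch across APIs.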
ksec|2 years ago
You just describe modern Web Development.