Indeed. Much of a modern Linux desktop runs inside one of several not-very-well-optimized JS engines: GNOME uses JS for various desktop interactions, and all major desktops run a separate JS engine as a different user to evaluate polkit authorizations (so exactly zero RAM can be shared between those engines, even if they were identical, which they aren't). Then half your interactions with GUI tools happen inside browser engines, either directly in a browser or indirectly via Electron. (And typically each Electron tool bundles its own slightly different version of Electron, so even if they all run under the same user, each is fully independent.)

Or you can ignore all that nonsense and run openbox and native tools.
torginus|1 month ago
They did hack around this with heuristics, but they never really solved the issue.

They should've stuck with a scripting language designed for embedding, like Lua. (Lua isn't reference-counted -- it uses an incremental garbage collector -- but it is built from the ground up to be embedded.)
lproven|1 month ago
A couple of years ago I saw a talk by Sophie Wilson, the designer of the ARM chip. She had been amused by someone saying there was an ARM inside every iPhone: she pointed out that there were 6-8 asymmetric ARM cores in the CPU section of the SoC, some big and fast, some small and power-frugal, plus an ARM core in the Bluetooth controller, another in the Wi-Fi controller, several in the GSM/mobile controller, at least one in the memory controller, several in the flash-memory controller...
It wasn't "an ARM chip". It was half a dozen ARMs in early iPhones, and maybe dozens in modern ones. More in anything with an SD card slot, as SD cards typically contain an ARM core or a few of them to manage the blocks of storage, and other ARMs in the interface talk to those ARMs.
Wheels within wheels: multiple very similar cores, running different OSes and RTOSes and chunks of embedded firmware, all cooperatively running user-facing OSes with a load of duplication -- like a shell running in one JavaScript engine launching Firefox, which contains a copy of a different version of the same JavaScript engine, plus another in Thunderbird, another embedded in Slack, and another copy embedded in VSCode.
Insanity. Make a resource cheap and it is human nature to squander it.
FrostViper8|1 month ago
I have plenty of complaints about GNOME (not being able to set a solid colour as a background is really dumb IMO), but it seems to work quite well IME.
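(For what it's worth, the Settings GUI may lack the option, but a solid-colour background can apparently still be set from the command line via gsettings. An untested sketch -- the key names are from the org.gnome.desktop.background schema on recent GNOME versions and may differ on yours:)

```shell
# Untested sketch: clear the wallpaper and fall back to a solid colour.
# Keys assume a recent org.gnome.desktop.background schema.
gsettings set org.gnome.desktop.background picture-uri ''
gsettings set org.gnome.desktop.background picture-uri-dark ''
gsettings set org.gnome.desktop.background color-shading-type 'solid'
gsettings set org.gnome.desktop.background primary-color '#204a87'
```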
> Or you can ignore all that nonsense and run openbox and native tools.
I remember mucking about with OpenBox and similar WMs back in the early 2000s and I wouldn't want to go back to using them. I find Gnome tends to expose me to less nonsense.
There is nothing specifically wrong with Wayland either. I am running it on Debian 13 with a triple-monitor setup without issues. Display scaling works properly on Wayland (it doesn't on X11).
lproven|1 month ago
I find the reverse. GNOME feels like a phone/tablet interface. It's bigger and uses way more disk and memory, but it gives me less UI, less control, and less customisation than Xfce, which takes about a quarter of the resources.
Example: I have 2 screens. One landscape on the left, one portrait on the right. That big mirrored L-shape is my desktop. I wanted the virtual-desktop switcher on the right of the right screen, and the dock thing on the left of the left screen.
GNOME can't do that. Panels must be on your primary display, and if that's a little laptop screen but there is a nice big spacious second screen, I want to move some things there -- but I am not allowed to.
If I have 1 screen, keep them on 1 screen. If I have 2, that pair is my desktop, so put one panel on the left of my desktop and one on the right, even if those are different screens -- and remember this so it happens automatically when I connect that screen.
This is the logic I'd expect. It is not how GNOME folks think, though, so I can't have it. I do not understand how they think.