top | item 39018607

abxytg|2 years ago

I hate it so much. So arbitrary and capricious. I would say this is currently the number one blocker for the web as a serious platform. And they're doing it on purpose.

whatshisface|2 years ago

I guess the policy is that tabs can use 100% of the available resources on low end devices, but only 10% of the available resources on high end devices.

skybrian|2 years ago

I think the desktop policy might be better. On the tablets I've used, tabs sometimes get killed when I switch away and visit another website with a lot of ads. It's an annoying way to lose work in an unsubmitted form. That doesn't seem to happen on desktop.

vicktorium|2 years ago

What apps can't run on 4GB?

Games?

3D?

Editing?

Have you tried forking Chrome and increasing this limit?

paulgb|2 years ago

Video editors are a big one. I've heard of people crashing a browser tab with Figma as well.

For data exploration tools it's very easy to want 4GB+ of memory. I found the limit cumbersome while working on financial tools. It usually comes up in internal tools, where you reliably have a fast internet connection; it's harder to hit in public-facing tools, because there the slowness of sending 4GB+ to the browser becomes the limiting factor first.

The annoying part isn't just that the limit exists, but that you can't really handle it gracefully as the developer -- when the browser decides you've hit the limit, it may simply replace the page with an error message.
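One partial workaround -- a sketch under the assumption that the app manages its own WebAssembly linear memory, which nothing in the comment says these tools do: `WebAssembly.Memory.grow` throws a `RangeError` when the reservation fails, so an app can probe for headroom and back off instead of letting the tab die.

```typescript
// Sketch: probe WebAssembly memory growth so the app can degrade
// gracefully instead of crashing the tab. The page size is fixed
// at 64 KiB by the wasm spec.
const PAGE_BYTES = 64 * 1024;

function tryGrow(memory: WebAssembly.Memory, pages: number): boolean {
  try {
    memory.grow(pages); // throws RangeError if the reservation fails
    return true;
  } catch (e) {
    if (e instanceof RangeError) return false; // hit the limit: back off
    throw e;
  }
}

// Start small; `maximum` caps this sketch at 4096 pages (256 MiB).
const mem = new WebAssembly.Memory({ initial: 16, maximum: 4096 });
const ok = tryGrow(mem, 16); // ask for another 1 MiB -> succeeds
const failed = tryGrow(mem, 1_000_000); // absurd request -> returns false
console.log(ok, failed, mem.buffer.byteLength / PAGE_BYTES);
```

This only helps when the allocation failure is surfaced to script; it can't catch the browser unilaterally killing the tab under system-wide memory pressure.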

jjcm|2 years ago

It's one of our big barriers over at Figma. Creative tools in general hit this limit pretty quickly. For context, I was a very heavy user of Photoshop back in the day. Even a decade ago I remember hitting 20GB of active memory use for Photoshop.

Things get really big really quickly, especially when you're storing uncompressed versions of raster elements in memory. To frame it another way: 4GB is about 22 seconds of 1080p video if you're loading the raw frames into memory.
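The arithmetic behind that figure checks out if you assume RGBA frames (4 bytes/pixel) at 24 fps -- both assumptions mine, not stated in the comment:

```typescript
// Back-of-the-envelope check of the "4 GB ~ 22 s of raw 1080p" figure.
// Assumes RGBA (4 bytes/pixel) and 24 fps.
const bytesPerFrame = 1920 * 1080 * 4; // ~8.29 MB per uncompressed frame
const budget = 4 * 2 ** 30;            // 4 GiB
const frames = budget / bytesPerFrame; // ~518 frames fit in the budget
const seconds = frames / 24;           // ~21.6 s, i.e. roughly 22 s
console.log(seconds.toFixed(1));
```

At 30 fps the same budget lasts only about 17 seconds, so the exact number depends on the frame rate, but the order of magnitude holds.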

KeplerBoy|2 years ago

Some AI apps. You can't really load a capable LLM in 4GB. Or does this limit not apply when dealing with WASM and WebGPU?

amelius|2 years ago

4GB ought to be enough for anybody.