Tell HN: RAM Requirements are going through the roof
102 points | babebridou | 14 years ago | reply
Then there's the desktop. My desktop PC, for instance. It's a great 6 months old working computer, with 8 cores and 16GB of RAM. "This ought to be enough for all my needs", I thought. The Bill Gates in me was right so far, but unfortunately this won't be enough for long.
Yesterday I took a look at my memory usage, out of curiosity: 11.4GB. Woah. I had a look at the breakdown.
- Chrome : ~3GB
- Firefox : ~1.5GB
- Java (eclipse) : ~1.2GB
- Rest in tons of various work-related apps.
I'm of course responsible for letting this accumulate over several days, but still, a third of my RAM taken up by web browsing tabs? Chrome on its own clogging more than twice as much as Eclipse? What worries me now is how hard we are getting hit by RAM-hogging web pages. Since I began writing this post, my freshly restarted Chrome browser with 9 open tabs (Hacker News (x2), Coding Horror, Google Search for "Mac osX Lion review 6 months", MoPub Ad Service monitoring, Google Analytics Visitors Overview, Android Developer Console, Twitter, Gmail) is taking up roughly 500MB of RAM. That's insane. On my 2010 Macbook Pro with 4GB of RAM, this means 20% of my overall capacity is taken up by my core web needs. Needless to say, I can't use my macbook anymore.
Last summer I was working on a little java experiment - a cross-platform 3D labyrinth. I wanted the overall memory and data footprint to be as low as possible so it could be played on a low-end android phone with really fast loading, and yet to keep the game space as big as I could, so I designed my own dedicated data structures overnight. I made the following calculation: over 500MB of raw, uncompressed data, I could store the description of an area roughly equivalent to a map of Europe with a resolution of 2.5 meters per pixel.
I'm not claiming any feat here, just stating the obvious: web programmers are doing something that's definitely not cool for our current RAM budget. We've lost any sense of measure. It's unnecessary for a Twitter feed following 175 people for an hour or so to claim a 70MB RAM footprint. You're not the only nor the worst offender, Twitter. Then I had 26 new tweets to display, I clicked, and the footprint suddenly grew to 76MB. 26 tweets = 6MB. We're talking about 140-character tweets; let's be generous and multiply that amount by 100 to account for the tweeters' profiles (which are all 10,000-character essays, as we all know), and we get a total of 364,000 new characters, which end up claiming more than 6 million bytes in RAM. Which means the RAM impact of adding 26 new tweets to a webpage is at the very least 10 times higher than it could be, and probably more like a thousand times too high.
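That back-of-the-envelope math checks out. A quick sketch (the 100x multiplier for profile data is the comment's own deliberately generous assumption, and 6MB is the observed growth, not a measured value):

```python
# Back-of-envelope check of the tweet-footprint math above.
TWEETS = 26
TWEET_CHARS = 140
PROFILE_MULTIPLIER = 100        # generous allowance for profile data
OBSERVED_GROWTH = 6_000_000     # the ~6 million bytes the tab grew by

raw_chars = TWEETS * TWEET_CHARS * PROFILE_MULTIPLIER
print(raw_chars)                # prints 364000

# Bytes of RAM consumed per character of actual new content:
overhead = OBSERVED_GROWTH / raw_chars
print(round(overhead, 1))       # prints 16.5
```

So even with the generous 100x allowance, each character of content costs about 16 bytes of RAM; without it, closer to 1,600.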
Like I said, I'm only using Twitter to make a point. I mean no offense to Twitter's web devs - actually, I'm myself a very poor web developer - and I'm pretty sure the blame could also be put on Google Chrome instead. But I remember a day in 2002 when my Internet Explorer was trying to load a 4MB webpage from the hard drive, causing a RAM footprint above 40MB and a failure after 20 minutes of waiting in front of a white screen. Back then I was merely a junior consultant working on QA, and I was the one to tell the devs that they were definitely doing something not cool at all for the user's computer. True, we were showing rather complex and impressive amounts of "data" in our webapp with much simpler but oh so wrongly implemented "UI" (in that case, hundreds of unnecessary nested table anchors), but I can't help thinking back to those times when it was simply impossible to ship a product that was not cool for the user's computer, because the computer would refuse to run it at all. We're way past that line today.
My Twitter tab has now garbage collected some data. It's back to a 73MB RAM footprint. There are 32 new tweets to show. I click them, and the footprint bumps back up to 78MB. Meanwhile, my overall Chrome footprint is now showing 550MB private memory. That's 50MB for two clicks on Twitter and ~4500 characters in a Hacker News submit form.
Moore's Law nowadays affects the computing power requirements of software rather than the computing performance of hardware, and this is killing us.
[+] [-] Lewton|14 years ago|reply
What's the point of having a tonne of ram only to let it sit idle?
I "only" have 4GB of RAM and Chrome is taking up 1.2GB of that, even though I have 12 windows open with ~5 tabs in each.
[+] [-] kmm|14 years ago|reply
The problem is that an extension like "1-Up for Google+" doesn't need 20 MiB of RAM just to play a sound and display a green mushroom, for a total of about 150 KiB of assets and maybe a few KiB of code.
[+] [-] cinch|14 years ago|reply
https://wiki.archlinux.org/index.php/Chromium_Tips_and_Tweak...
[+] [-] betterth|14 years ago|reply
The operating system plays a huge role in memory usage; many modern ones cache indiscriminately, preload the most-used applications, etc.
Comparing RAM usage is almost apples and oranges these days, unless we're talking similar setups.
[+] [-] BCM43|14 years ago|reply
Edit: Debian minimal install using the Awesome window manager.
[+] [-] jiggy2011|14 years ago|reply
Having said that, many of the languages we are using, such as JavaScript, are not exactly storage efficient, combined with the fact that many people who would probably not have considered themselves "programmer" enough to write native apps in C++ back in the day are writing things in JavaScript and putting them online.
I often see people "benchmark" things like operating systems or browsers based on RAM usage (people coming to the conclusion that 32-bit Win XP is more "efficient" than Win7, for example), but I think it is far more important to measure swap usage, as this is the thing that actually hurts performance. AFAIK there is no performance hit in overwriting a flip-flop in memory that already has a value stored, but there is a huge penalty for hitting a disk. So I'd be quite happy to have all my RAM utilized all the time if I have zero swapping.
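Measuring swap rather than raw RAM usage is easy to do on Linux by reading /proc/meminfo (a Linux-only sketch; the field names assume a reasonably modern kernel):

```python
# Report swap usage from /proc/meminfo (Linux only).
def meminfo():
    """Parse /proc/meminfo into a dict of {field: value-in-kB}."""
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            fields[key] = int(rest.split()[0])  # values are in kB
    return fields

info = meminfo()
swap_used_kb = info["SwapTotal"] - info["SwapFree"]
print(f"Swap in use: {swap_used_kb} kB")  # this, not RAM %, is what hurts
```

If that number stays at zero, fully-utilized RAM is a non-issue, which is exactly the parent's point.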
First of all, if you are using a 64-bit platform, your pointers are going to be twice as big as they used to be, but you get to address so much more memory that this issue solves itself.
Also IIRC Windows 7 does things like arrange your frequently used data into a contiguous area of the disk and then reads all that stuff into VM at bootup. This will result in higher RAM usage (due to many pages being loaded into physical RAM) but better overall performance since if other stuff needs the RAM more urgently it can either swap the cache back to disk or overwrite the cache entirely.
Let's say you have 8GB of RAM and are watching an HD video which is 7GB uncompressed, and the rest of your system only really needs 1GB of your RAM. Surely it makes sense that, whilst you are watching the video, you can use spare CPU cores to uncompress the rest of the video and put it into RAM.
I think it's more the case that computing will always expand to fill the amount of memory available. Bearing in mind that even the cost of swapping is dramatically reducing with SSDs etc.
[+] [-] rbanffy|14 years ago|reply
It makes a ton of sense for all those people (and other species, from different time/space geometries) who can comprehend a movie by experiencing its time dimension as a single event. Ordinary humans may find it less useful, albeit having no delay when skipping back and forth may have some utility for those of us who experience time in a more traditional way ;-)
[+] [-] randallsquared|14 years ago|reply
That's not clear to me. Why do something ahead of time that can be done faster than needed on the fly anyway? You're basically assuming that when system resources are needed for something else halfway through your video, processing power will be more useful than space.
[+] [-] Klinky|14 years ago|reply
The only time this would make sense is in a content creation environment where quick access to uncompressed frames makes a difference. However, those environments are already CPU-starved. Typical consumers would not see a benefit from this scenario. More benefit will be had with the transition to SSDs; they will eventually be much, much faster and require less power than current hard drives.
[+] [-] cperciva|14 years ago|reply
On the laptop I'm writing this from, I've got a Chrome process with 827 MB of VM size -- but only 62 MB of pages actually in use. The rest is large memory-mapped regions consisting mostly of untouched pages -- which makes sense, because on a 64-bit machine, virtual address space is practically free.
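cperciva's distinction - huge VM size, tiny set of pages actually in use - is easy to demonstrate: on a 64-bit Unix, mapping a large anonymous region inflates VM size instantly while physical pages are only allocated when first touched (a Linux/macOS sketch; Windows commits anonymous maps differently):

```python
import mmap

# Reserve 1 GB of anonymous virtual memory. On a 64-bit Unix this
# succeeds instantly and consumes almost no physical RAM, because
# pages are zero-filled lazily, on first touch.
GB = 1024**3
region = mmap.mmap(-1, GB)

# Touching one byte per 4 kB page forces physical allocation;
# writing only the first 16 MB leaves the other ~1008 MB untouched.
for offset in range(0, 16 * 1024**2, 4096):
    region[offset] = 1

# VM size now includes the full 1 GB, but the resident set only
# grew by roughly the 16 MB we actually touched.
```

So a task manager reporting the 1 GB mapping wildly overstates what the process is really costing you, which is exactly what Chrome's 827 MB figure does.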
[+] [-] babebridou|14 years ago|reply
Chrome says: "This is the best indicator of browser memory resource usage"
[+] [-] kmm|14 years ago|reply
I wouldn't say your macbook is useless though. I have only one computer and it's a 2008 laptop with 2 GB RAM. And I cope very well, although I experience the occasional lockup (I don't use swap).
[+] [-] babebridou|14 years ago|reply
Are the underlying systems running web applications (aka browsers) doing it wrong? Or are the webapp devs not doing what needs to be done in general? Or am I raising a false alarm and is everything alright?
As for my macbook, it has an unrelated problem, that's for sure, but all it does is magnify the memory issues for me.
[+] [-] zdw|14 years ago|reply
8GB of DDR2/3 from a good vendor has been less than $50 for a while now.
[+] [-] whiskers|14 years ago|reply
If you ran the same set of apps on a 2GB machine you'd get different numbers.
[+] [-] babebridou|14 years ago|reply
Besides, I'll have to take your word on that, since my 4GB macbook pro has become completely unusable due to constant swapping, but it could be completely unrelated.
[+] [-] jacques_chester|14 years ago|reply
Which is annoying, because it leads to double-buffering.
[+] [-] joestringer|14 years ago|reply
Unfortunately, the flipside is that if you're using a computer more than a few years old, then anything more than your set of 'core' sites tends to slow your computer to a halt as your browser causes everything to be constantly swapping in and out of memory.
[+] [-] brador|14 years ago|reply
I'm on a 2GB Laptop with XP typing this and the fan is silent. SILENT. The second I turn noscript off it overloads, then of course, crashes. Some scripts are running constantly in the background, with click tracking and even mouse motion tracking becoming more commonplace.
[+] [-] slowpoke|14 years ago|reply
I've been doing this for years and I seriously do not understand when people complain about the extreme RAM usage of their browser. I have 4GB of RAM and often sit at 50-100+ tabs for weeks and don't really have any issues - Firefox Nightly is currently hogging ~16% of my memory according to htop - which is perfectly okay.
[+] [-] Tichy|14 years ago|reply
That said, there might be the occasional rogue app or web site, which should be uninstalled or avoided.
[+] [-] hollerith|14 years ago|reply
How would I know if a web site is "rogue"?
[+] [-] pm24601|14 years ago|reply
1) RAM is a finite resource. 2) Managing excessive RAM usage (even without swapping) requires OS effort. 3) Indifferent memory management is a sign of sloppy coding and usually means a memory leak. 4) Casual memory usage does not fly in the power-constrained mobile world that is now upon us.
#1: Ram is finite
A computer with 1 terabyte of memory, with all but 1GB consumed by a memory-indifferent program, is going to behave poorly. The OS ends up spending a lot of time trying to find free space every time the program or the OS needs new memory for any reason.
#2 Managing excessive RAM usage is not free:
The OS is constantly shuffling stuff around in memory, updating pointers, etc. If the OS has more active memory to manage, the CPU spends more time on this overhead work - time that is not available to actually run the program.
#3 Excessive memory usage = sloppy programming (and programmers)
Whenever I have found code that is memory-indifferent, I have found bugs beyond memory usage: 1) massive HTTP session size - on a web server, if each session takes "only" 2 megabytes, that means one high-end server can handle about 4,000 users (8GB of session data) before bogging down; 2) memory leaks - temporary data created but never released to the OS.
#4 : mobile
Memory uses power. Remember this is a key reason why Flash is dead on mobile. Adobe could never reduce the memory/power hungriness of Flash.
As a Java developer, I routinely constrain the JavaVM's memory during testing to just above the target footprint. I want to know if there is a memory issue before it blows up in my face on a busy day.
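On the JVM that constraint is just the `-Xmx` flag (e.g. `java -Xmx256m ...`). The same discipline works in other runtimes too; here is the equivalent idea in Python on Unix via `resource.setrlimit` (a sketch - the 2 GB cap is an arbitrary example value, and Linux enforces `RLIMIT_AS` most reliably):

```python
import resource

# Cap this process's total address space at 2 GB (Unix only).
# Allocations beyond the cap fail fast with MemoryError instead of
# silently pushing the machine into swap on a busy day.
LIMIT = 2 * 1024**3
resource.setrlimit(resource.RLIMIT_AS, (LIMIT, LIMIT))

try:
    hog = bytearray(4 * 1024**3)   # deliberately over the cap
except MemoryError:
    print("caught the memory issue in testing, not in production")
```

Running the test suite under such a cap surfaces leaks and bloated caches long before they blow up on a busy day.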
[+] [-] sirclueless|14 years ago|reply
1) Freeing memory is expensive. In order to do it properly in a running process, you need to check that each piece of memory is not in use, and that typically requires traversing large graphs. It's only something you want to do when absolutely necessary. (Hint: it's not necessary when you have 16 GB of RAM.)
2) RAM is really fast. Things in memory can be read in a few CPU clock cycles. On disk, you need to wait for a plate of metal moving at 50 mph to swing by.
3) It's totally free! It's not like unused RAM can enter power-saving mode or anything. If your RAM usage isn't at 100% you are wasting valuable space. This isn't a checkout lane where lower usage is a win for everyone. It's like an everlasting candy fountain: it all rots if you leave it, so get the whole neighborhood in for a piece.
So honestly, what are you complaining about? So chrome wants to cache hour-old pictures just in case you hit the back button 45 times. What's the big deal? YOU HAVE 6,600,000,000 FREAKIN' BYTES LEFT! You could probably fit Project Gutenberg in there. If and when you run out of memory, and pages take 15 seconds to load because your hard drive is swapping like crazy, then you have a problem. But you won't have that problem. And just because chrome is taking up 3 GB at the moment doesn't mean it won't be a nice citizen when the operating system starts to cry about low memory, which it won't do because YOU HAVE 6 MORE GIGABYTES.
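Point 1 is the heart of tracing garbage collection: to know a chunk is dead, the collector has to walk the graph of live references first. A toy mark phase makes the traversal cost concrete (illustrative only - the `Node` class and recursive walk are invented for the sketch, not how any real collector is implemented):

```python
class Node:
    """A toy heap object holding references to other objects."""
    def __init__(self, *children):
        self.children = list(children)

def mark(root, reachable=None):
    """Collect the ids of every object reachable from `root`."""
    if reachable is None:
        reachable = set()
    if id(root) in reachable:
        return reachable          # already visited (handles cycles)
    reachable.add(id(root))
    for child in root.children:
        mark(child, reachable)
    return reachable

leaf = Node()
root = Node(Node(leaf), leaf)     # shared reference to `leaf`
orphan = Node()                   # allocated, but unreachable from root

live = mark(root)
# A sweep phase would now free anything whose id is not in `live` --
# and that full-graph walk is exactly what makes freeing expensive.
```

The walk touches every live object, so with gigabytes of live data it is genuinely work you want to postpone until memory pressure demands it.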
[+] [-] viraptor|14 years ago|reply
There's the whole JavaScript framework with its own allocator and memory pools which will stay around for a long time. Some of the browser elements use the same kind of VM to actually display the UI to you, adding some more memory usage. Then there's the whole page, which had to be read, parsed into a tree, and processed to create the appropriate model for display (all stages are preserved in an editable way). This has to be a live model the whole time, since JavaScript might need to interact with it - you can't just flatten it into a screenshot and display that. Now the images - they also need to be loaded and decompressed for display. The text isn't that simple either - there's a whole engine behind loading fonts and analysing each letter / pair / triple / ... for special spacing rules to be applied. There are multiple glyphs, font variants, sizes, etc. to rasterise. That text needs to be distributed properly inside the DOM model generated above. That text flow actually affects the model again - they're linked, so they get processed again (text metrics need to be recomputed on each page resize, which may make some elements longer/shorter). Then the theme goes on top of this - since the application displays itself, the needed libraries / theme elements will be loaded too. But of course not all browsers use the OS theme for displaying form elements, so another layer needs to be loaded here. ... I could go on forever about the elements needed for rendering a page.
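The "parsed into a tree" step alone multiplies the footprint: a page's bytes become many small linked objects that must stay live and editable. A rough illustration with Python's stdlib parser (the node counting is a simplification - real DOM nodes carry far more state than a counter suggests):

```python
from html.parser import HTMLParser

class NodeCounter(HTMLParser):
    """Count the elements and text chunks produced by parsing."""
    def __init__(self):
        super().__init__()
        self.nodes = 0
    def handle_starttag(self, tag, attrs):
        self.nodes += 1               # one live element object each
    def handle_data(self, data):
        if data.strip():
            self.nodes += 1           # plus one per text node

html = "<ul>" + "<li><a href='#'>item</a></li>" * 100 + "</ul>"
counter = NodeCounter()
counter.feed(html)

print(len(html))       # prints 2909 -- raw bytes of markup
print(counter.nodes)   # prints 301 -- objects a browser must keep live
```

Under 3 KB of markup becomes hundreds of heap objects, each with its own headers, pointers, and computed style - before scripts, images, or fonts even enter the picture.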
So yes - lots of that could be much lighter. Lots of that could be made into modules or pulled some levels higher to make them more general. Lots of that might just be the result of sloppy coding no one ever got around to fixing.
But just remember that what you're actually using is a system on top of a system on top of a system on top of... If you used gopher on a real 80x25 terminal you'd use kilobytes. If you used the first web browser, with no image support and no scripting, you'd be under a few megabytes. But you don't - you're looking at a realtime rendering of 10 different models coming together and interacting with each other with next to no lag. And it all probably renders with hardware support to give you a very smooth scrolling experience. So no, web developers themselves have almost nothing to do with this whole mess. Unless they made a silly mistake and are leaking memory like crazy... they're not the ones to look at first.
(I simplified in many places, not all facts are relevant to every OS/browser combination, etc. just trying to make a point here)
[+] [-] babebridou|14 years ago|reply
You and others just filled that gap, thank you, I feel less stupid now.
[+] [-] willvarfar|14 years ago|reply
That aside, I recently hired the cheapest VPS I could find: 128MB of RAM. I could not run apt-get; I had to upgrade to 256MB.
[+] [-] ulvund|14 years ago|reply
It is a disappointment for me to see the link get so many points in this community, where you would expect the baseline of technology knowledge to be above average.
[+] [-] nyellin|14 years ago|reply
Chrome leaks memory. Check memory usage again after force-quitting Chrome and relaunching it. I have to restart Chrome daily to regain 3-5 gigs of swap.
[+] [-] nextweek|14 years ago|reply
Browsers have sub-second back buttons; how much pre-rendered RAM space do you think that takes up?
[+] [-] babebridou|14 years ago|reply
I'm sure the OS & browser devs are all good and great. I'm sure the web devs are all fine and accurate; the subject is a complex one and we're already all doing the best we can.
But that's not my point: I'm worried that one day we'll run out of RAM because we're designing our webapps as if client RAM were no longer a finite resource; we simply forgot it even existed in the first place.