item 3521982

Tell HN: RAM Requirements are going through the roof

102 points | babebridou | 14 years ago | reply

There have been a few talks about how Moore's law is coming to an end, about how you can now build servers with close to a TB of RAM, how you can now scale clusters of servers and achieve great metrics when it comes to raw data storage or memcaching, how incredible feats in both parallelization and miniaturization will reverse the downhill trend for high-end computing.

Then there's the desktop. My desktop PC, for instance: a great six-month-old work machine with 8 cores and 16GB of RAM. "This ought to be enough for all my needs", I thought. The Bill Gates in me has been right so far, but unfortunately it won't be for long.

Yesterday I took a look at my memory usage, out of curiosity: 11.4GB. Woah. I had a look at the breakdown.

  - Chrome : ~3GB
  - Firefox : ~1.5GB
  - Java (eclipse) : ~1.2GB
  - The rest: tons of various work-related apps.
I'm of course responsible for letting this accumulate over several days, but still: a third of my RAM taken up by web browsing tabs? Chrome on its own hogging more than twice as much as Eclipse?

What worries me now is how hard we are getting hit by RAM-hogging web pages. Since I began writing this post, my freshly restarted Chrome browser with 9 open tabs (Hacker News (x2), Coding Horror, a Google search for "Mac osX Lion review 6 months", MoPub ad service monitoring, Google Analytics Visitors Overview, the Android Developer Console, Twitter, Gmail) is taking up roughly 500MB of RAM. That's insane. On my 2010 Macbook Pro with 4GB of RAM, it means 20% of my memory would be taken up by my core web needs alone. Needless to say, I can't use my macbook anymore.

Last summer I was working on a little java experiment - a cross-platform 3D labyrinth. I wanted the overall memory and data footprint to be as low as possible so it could be played on a low-end android phone with really fast loading, and yet to keep the game space as big as I could, so I designed my own dedicated data structures overnight. I made the following calculation: over 500MB of raw, uncompressed data, I could store the description of an area roughly equivalent to a map of Europe with a resolution of 2.5 meters per pixel.

I'm not claiming any feat here, just stating the obvious: web programmers are doing something that's definitely not cool for our current RAM budget. We've lost all sense of proportion. It's unnecessary for a Twitter feed following 175 people for an hour or so to claim a 70MB RAM footprint. You're neither the only nor the worst offender, Twitter. Then I had 26 new tweets to display; I clicked, and the footprint grew to 76MB. 26 tweets = 6MB. We're talking about 140-character tweets. Let's be generous and multiply that amount by 100 to take into account each tweeter's profile (which are all 10,000-character essays, as we all know), and we get a total of 364,000 new characters, which end up claiming more than 6 million bytes of RAM. Which means the RAM impact of adding 26 new tweets to a webpage is at the very least 10 times higher than it needs to be, and probably more like a thousand times too high.
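The arithmetic above is easy to sanity-check. A quick sketch (all figures are the ones quoted in the post; the 100x profile multiplier is the post's own deliberately generous assumption):

```python
# All figures are taken from the post above; the 100x multiplier for
# each tweeter's profile is the post's own (deliberately generous) guess.
new_tweets = 26
chars_per_tweet = 140 * 100                 # tweet text plus profile "essay"
total_chars = new_tweets * chars_per_tweet  # 364,000 characters
observed_bytes = 6_000_000                  # observed RAM growth in Chrome

overhead = observed_bytes / total_chars     # RAM bytes per character shown
print(f"{total_chars} chars -> {overhead:.1f} bytes of RAM each")
```

Even with the generous estimate, each displayed character costs roughly 16 bytes of RAM, which is where the "at the very least 10 times higher" figure comes from.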

Like I said, I'm only using Twitter to make a point. I mean no disrespect to Twitter's web devs - I'm a very poor web developer myself - and I'm pretty sure the blame could just as well be put on Google Chrome. But I remember a day in 2002 when my Internet Explorer was trying to load a 4MB webpage from the hard drive, producing a RAM footprint above 40MB and failing after 20 minutes of waiting in front of a white screen. Back then I was merely a junior QA consultant, and I was the one who had to tell the devs that they were doing something not cool at all for the user's computer. True, we were showing rather complex and impressive amounts of "data" in our webapp with a much simpler but oh-so-wrongly implemented "UI" (in that case, hundreds of unnecessary nested table anchors). But I can't help thinking back to those times when it was simply impossible to ship a product that was not cool for the user's computer, because the computer would refuse to run it at all. We're way past that line today.

My Twitter tab has now garbage collected some data. It's back to a 73MB footprint. There are 32 new tweets to show. I click, and the footprint bumps back up to 78MB. Meanwhile, my overall Chrome footprint is showing 550MB of private memory. That's 50MB for two clicks on Twitter and ~4500 characters in a Hacker News submit form.

Moore's Law nowadays describes the growth of software's requirements rather than the growth of hardware's performance, and it is killing us.

89 comments

[+] Lewton|14 years ago|reply
Chrome is designed to be memory hungry. It doesn't NEED the memory, but if it's there it sure as hell will use it to try and speed up your internet experience.

What's the point of having a tonne of ram only to let it sit idle?

I "only" have 4 GB of RAM, and Chrome is taking up 1.2 GB of it, even though I have 12 windows open with ~5 tabs in each.

[+] kmm|14 years ago|reply
"Idle" memory is used by the operating system as cache. On my computer, 800 MiB of my RAM is in use by programs, but 1.7 GiB of my 2 GiB RAM is used in total, by programs, buffers and cache.

The problem is that an extension like "1-Up for Google+" doesn't need 20 MiB of RAM just to play a sound and display a green mushroom, for a total of about 150 KiB of assets and maybe a few KiB of code.

[+] mgkimsal|14 years ago|reply
Why is the default to let any app take as much as it can? Or, more to the point, why can't we force apps to live within X megs of RAM, regardless of what we have available (for whatever reason we choose)? If it's possible, it's well hidden from everyone.
[+] keeperofdakeys|14 years ago|reply
The operating system can use this to cache files, which greatly improves read and write performance.
[+] husted|14 years ago|reply
Exactly, free memory is wasted memory.
[+] betterth|14 years ago|reply
RAM is allocated and managed by the operating system. You may use 11/16 GB at once, but someone could browse the internet with the same tabs open at 3 or 4 GB and be fine. Hell, they could at 2GB and be fine.

The operating system plays a huge role in memory usage, many modern ones cache indiscriminately, preload most used applications, etc etc

Comparing RAM usage is almost apples and oranges these days, unless we're talking similar setups.

[+] daed|14 years ago|reply
To add to this, I think it's important that OP knows this is a good thing. I built my computer just over 3 years ago when DDR2 prices bottomed out, so I have 8GB, and this thing is still more computer than I need. I boot up, load all my applications into RAM, and everything is super snappy - because RAM is fast. Prior to this rig I found myself compelled to upgrade at least every 3 years because things would get slow. Times change.
[+] BCM43|14 years ago|reply
I browse with 1.5 gigs, with both firefox and chrome open. Firefox will have up to 30 tabs, and chrome up to 20. I can still browse fine.

Edit: Debian minimal install using the Awesome window manager.

[+] nsmartt|14 years ago|reply
I routinely open more than 30 tabs on Chrome on both Windows and Ubuntu. I have 4 GB. I have had problems on Ubuntu, but never on Windows.
[+] jiggy2011|14 years ago|reply
I would say that it's not so much that RAM Requirements are going through the roof as it is that the benefit you can get from having more RAM is going through the roof.

Having said that, many of the languages we are using, such as JavaScript, are not exactly storage efficient, combined with the fact that many people who would probably not have considered themselves "programmer" enough to write native apps in C++ back in the day are now writing things in JavaScript and putting them online.

I often see people "benchmark" things like operating systems or browsers based on RAM usage (people coming to the conclusion that 32-bit Win XP is more "efficient" than Win7, for example), but I think it is far more important to measure swap usage, as that is what actually hurts performance. Afaik there is no performance hit in overwriting a memory cell that already has a value stored, but there is a huge penalty for hitting a disk. So I'd be quite happy to have all my RAM utilized all the time if I have zero swapping.

First of all if you are using a 64bit platform your pointers are going to be twice as big as they used to be but you get to access so much more memory that this issue solves itself.

Also, IIRC Windows 7 does things like arranging your frequently used data into a contiguous area of the disk and then reading it all into RAM at bootup. This results in higher measured RAM usage (due to many pages being preloaded into physical RAM) but better overall performance, since if other stuff needs the RAM more urgently the cache pages can simply be dropped or overwritten.

Let's say you have 8GB of RAM and are watching a HD video which is 7GB uncompressed and the rest of your system only really needs 1GB of your RAM, surely it makes sense that whilst you are watching the video you can use spare CPU cores to decompress the rest of the video and put it into RAM.

I think it's more the case that computing will always expand to fill the amount of memory available. Bearing in mind that even the cost of swapping is dramatically reducing with SSDs etc.

[+] rbanffy|14 years ago|reply
> Let's say you have 8GB of RAM and are watching a HD video which is 7GB uncompressed and the rest of your system only really needs 1GB of your RAM, surely it makes sense that whilst you are watching the video you can use spare CPU cores to decompress the rest of the video and put it into RAM

It makes a ton of sense for all those people (and other species, from different time/space geometries) who can comprehend a movie by experiencing its time dimension as a single event. Ordinary humans may find it less useful, albeit having no delay when skipping back and forth may have some utility for those of us who experience time in a more traditional way ;-)

[+] randallsquared|14 years ago|reply
> Let's say you have 8GB of RAM and are watching a HD video which is 7GB uncompressed and the rest of your system only really needs 1GB of your RAM, surely it makes sense that whilst you are watching the video you can use spare CPU cores to decompress the rest of the video and put it into RAM.

That's not clear to me. Why do something ahead of time that can be done faster than needed on the fly anyway? You're basically assuming that when system resources are needed for something else halfway through your video, processing power will be more useful than space.

[+] Klinky|14 years ago|reply
You can only store about 45 seconds' worth of decompressed RGB24 1080p video (at 24 fps) in 7GB of RAM. There really is no point in doing this.
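A back-of-the-envelope check on decompressed video in RAM (the 24 fps frame rate and GB = 10^9 bytes are my assumptions):

```python
# 1080p RGB24: 3 bytes per pixel; frame rate assumed to be 24 fps.
frame_bytes = 1920 * 1080 * 3            # ~6.2 MB per uncompressed frame
bytes_per_second = frame_bytes * 24      # ~149 MB of RAM per second of video
seconds_in_7gb = 7e9 / bytes_per_second  # well under a minute
print(f"{seconds_in_7gb:.0f} seconds of video fit in 7 GB")
```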

The only time this would make sense is in a content-creation environment where quick access to uncompressed frames makes a difference. However, those environments are already CPU-starved. Typical consumers would not see a benefit from this scenario. More benefit will come from the transition to SSDs, which will eventually be much, much faster and require less power than current hard drives.

[+] cperciva|14 years ago|reply
Are your numbers the VM size, or the RSS?

On the laptop I'm writing this from, I've got a Chrome process with 827 MB of VM size -- but only 62 MB of pages actually in use. The rest is large memory-mapped regions consisting mostly of untouched pages -- which makes sense, because on a 64-bit machine, virtual address space is practically free.
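The distinction is easy to demonstrate: an anonymous mapping consumes address space immediately, but physical pages only as they are touched. A minimal sketch (Linux-flavoured; note ru_maxrss is reported in KiB on Linux and bytes on macOS):

```python
import mmap
import resource

PAGE = 4096

# Reserve 256 MiB of anonymous virtual address space. VM size grows
# immediately; demand-zero pages only become resident once written to.
region = mmap.mmap(-1, 256 * 1024 * 1024)

rss_before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
region[0:PAGE] = b"\x01" * PAGE  # touch one page -> roughly one page resident
rss_after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# The RSS delta is tiny compared to the 256 MiB added to the VM size.
print(rss_before, rss_after)
region.close()
```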

[+] babebridou|14 years ago|reply
Figures are taken from the "private" column of chrome://memory-redirect/

Chrome says: "This is the best indicator of browser memory resource usage"

[+] kmm|14 years ago|reply
I agree that this is completely getting out of hand. Memory was (and still is) very cheap, but we are reaching a limit, and the problem is that there is little we can do about it. I doubt that Mozilla and Google are unaware that their browsers consume disproportionate amounts of memory; but developing a web browser isn't easy, and I assume a huge memory footprint is unavoidable. I've often wondered whether the problem lies not with the web developers nor with the browser programmers but in the expressiveness of the HTML+CSS+JavaScript combination. It allows an enormous diversity of websites, but storing an exhaustive set of metrics for each element on a page can add up to quite a lot of data.

I wouldn't say your macbook is useless though. I have only one computer and it's a 2008 laptop with 2 GB RAM. And I cope very well, although I experience the occasional lockup (I don't use swap).

[+] babebridou|14 years ago|reply
I'm wondering where the memory usage really comes from. I come from a dev culture where, if I do things right, I can trust the underlying system I build on to run fast enough and use as few resources as it can, so most of the performance and optimization work falls to me.

Are the underlying systems running web applications (aka browsers) doing it wrong? Or are the webapp devs not doing what needs to be done in general? Or am I raising a false alarm and is everything alright?

As for my macbook, it has an unrelated problem, that's for sure, but all it does is magnify the memory issues for me.

[+] zdw|14 years ago|reply
With memory prices as low as they are right now, you should upgrade if possible.

8GB of DDR2/3 from a good vendor has been less than $50 for a while now.

[+] hobin|14 years ago|reply
Just out of curiosity, but why don't you have a swap partition?
[+] whiskers|14 years ago|reply
A lot of apps will take more memory than they need so they can keep more data "live" and be more responsive when they need to perform actions.

If you ran the same set of apps on a 2GB machine you'd get different numbers.

[+] babebridou|14 years ago|reply
Still I fail to see why this "live" data takes so much space.

Besides, I'll have to take your word on that, since my 4GB macbook pro has become completely unusable due to constant swapping - but that could be completely unrelated.

[+] jacques_chester|14 years ago|reply
> A lot of apps will take more memory than they need so they can keep more data "live" and be more responsive when they need to perform actions.

Which is annoying, because it leads to double-buffering.

[+] joestringer|14 years ago|reply
For those with the RAM to spare, this seems like sensible behaviour -- If there are ways to speed up your web browsing by using otherwise idle RAM, then great!

Unfortunately, the flipside is that if you're using a computer more than a few years old, then anything more than your set of 'core' sites tends to slow your computer to a halt as your browser causes everything to be constantly swapping in and out of memory.

[+] brador|14 years ago|reply
PROTIP: Use noscript to take RAM requirements back to 2007.

I'm on a 2GB Laptop with XP typing this and the fan is silent. SILENT. The second I turn noscript off it overloads, then of course, crashes. Some scripts are running constantly in the background, with click tracking and even mouse motion tracking becoming more commonplace.

[+] chokma|14 years ago|reply
Also using RequestPolicy addon on Firefox will dramatically cut down on stuff loaded from 3rd-party websites (including: Google analytics, Facebook like button, ad- and tracking services of all kind)
[+] slowpoke|14 years ago|reply
And as an added bonus, you're reclaiming a lot of your online privacy - unless that was your main reason to install all these addons in the first place (as it is for me). It also tends to speed up page load times, often by orders of magnitude, because of all the crap some sites want to load.

I've been doing this for years and I seriously do not understand when people complain about the extreme RAM usage of their browser. I have 4GB of RAM, often sit at 50-100+ tabs for weeks, and don't really have any issues - Firefox Nightly is currently hogging ~16% of my memory according to htop - which is perfectly okay.

[+] keenerd|14 years ago|reply
Turn off flash and javascript. That should reduce your browser's memory use by 90%. And use Opera. I can easily fit 200 tabs in under a GB.
[+] IgorPartola|14 years ago|reply
Or turn off your computer completely to have 100% free RAM. This advice is not as helpful as it seems if you are, say, a JavaScript or Flash developer, or if you want to use anything more advanced than a blog.
[+] notatoad|14 years ago|reply
Your operating system uses the RAM available to it. You see 11.6GB of RAM usage because you have 16GB of RAM. My laptop has 4GB, and short of running virtual machines I've never wanted for more.
[+] Tichy|14 years ago|reply
The apps simply try to make the best use of the memory available, for example they could use it for caching. It would be bad if they weren't using up all the memory available.

That said, there might be the occasional rogue app or web site, which should be uninstalled or avoided.

[+] hollerith|14 years ago|reply
>there might be the occasional rogue app or web site, which should be uninstalled or avoided.

How would I know if a web site is "rogue"?

[+] pm24601|14 years ago|reply
You are correct to be calling out applications that are indifferent to memory usage. The commenters who answer "ram is cheap", "your time is worth more than the ram",etc. are missing critical points:

1) RAM is a finite resource.
2) Managing excessive RAM usage (even without swapping) requires OS effort.
3) Indifferent memory management is a sign of sloppy coding, and usually of a memory leak.
4) Casual memory usage does not fly in the mobile, power-constrained world that is now upon us.

#1: Ram is finite

A computer with 1 terabyte of memory, all but 1 GB of it consumed by a memory-indifferent program, is going to behave poorly. The OS ends up spending a lot of time trying to find free space every time the program or the OS itself needs new memory for any reason.

#2 Managing excessive Ram usage is not free:

The OS is constantly shuffling stuff around in memory, updating pointers, etc. If the OS has more active memory to manage then the CPU is spending more time to do this overhead work that is not available to actually run the program.

#3 Excessive memory usage = sloppy programming (and sloppy programmers)

Whenever I have found code that is memory-indifferent, I have found bugs beyond memory usage:

1) Massive HTTP-session size. On a web server, if each session takes "only" 2 megabytes, one high-end server can handle about 4000 users (8GB of session data) before bogging down.

2) Memory leaks - temporary data created but never released to the OS.

#4 : mobile

Memory uses power. Remember this is a key reason why Flash is dead on mobile. Adobe could never reduce the memory/power hungriness of Flash.

As a Java developer, I routinely constrain the JavaVM's memory during testing to just above the target footprint. I want to know if there is a memory issue before it blows up in my face on a busy day.
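The same discipline translates to other runtimes. A rough Python analogue of capping the JVM heap with -Xmx during testing, using the stdlib tracemalloc (the budget and build_session are invented placeholders):

```python
import tracemalloc

# Hypothetical footprint budget for the code under test; pick your own.
SESSION_BUDGET = 1 * 1024 * 1024  # 1 MiB

def build_session(n_items):
    # Stand-in for whatever populates a user session in the real app.
    return [{"id": i, "seen": False} for i in range(n_items)]

tracemalloc.start()
session = build_session(1_000)
_, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

# Fail loudly in CI the moment the footprint creeps past the budget,
# instead of discovering it on a busy production day.
assert peak < SESSION_BUDGET, f"peak {peak} B exceeds budget {SESSION_BUDGET} B"
```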

[+] AndrewDucker|14 years ago|reply
I have 25 tabs open in Firefox right now, and that's taking up 600MB. So, about 24MB/page. Seems a bit high, but considering the number of images and amount of JS, not so high that I'm panicking about it.
[+] sirclueless|14 years ago|reply
Try to think about this from an optimizer's standpoint. Here are some excellent reasons why giant memory footprints are a good thing:

1) Freeing memory is expensive. In order to do it properly in a running process, you need to check that each piece of memory is not in use, and that typically requires traversing large graphs. It's only something you want to do when absolutely necessary. (Hint: it's not necessary when you have 16 GB of RAM.)

2) RAM is really fast. Things in memory can be read in a few CPU clock cycles. On disk, you need to wait for a plate of metal moving at 50 mph to swing by.

3) It's totally free! It's not like unused RAM can enter power-saving mode or anything. If your RAM usage isn't at 100% you are wasting valuable space. This isn't a checkout lane where lower usage is a win for everyone. It's like an everlasting candy fountain: it all rots if you leave it, so get the whole neighborhood in for a piece.

So honestly, what are you complaining about? So chrome wants to cache hour-old pictures just in case you hit the back button 45 times. What's the big deal? YOU HAVE 6,600,000,000 FREAKIN' BYTES LEFT! You could probably fit Project Gutenberg in there. If and when you run out of memory, and pages take 15 seconds to load because your hard drive is swapping like crazy, then you have a problem. But you won't have that problem. And just because chrome is taking up 3 GB at the moment doesn't mean it won't be a nice citizen when the operating system starts to cry about low memory, which it won't do because YOU HAVE 6 MORE GIGABYTES.

[+] inconditus|14 years ago|reply
Certainly all true points. However, for the people without the luxury of 16GB (I'm running on an old machine, with 3gb of RAM), Chrome consistently freezes the computer due to the resource hogging. I've learned my lesson and switched to Firefox.
[+] whimsy|14 years ago|reply
Doesn't 1) contradict 3)? If freeing memory is expensive, and memory is full, then there is certainly a cost of having full memory.
[+] hythloday|14 years ago|reply
Instead of complaining that your RAM is 75% allocated, you should be complaining that 25% of it is unused. (Imagine buying a processor and noting that it never went above 75% utilization.) Memory deallocation isn't free (it's often not even cheap), and it's completely wasted work if the application is closed. In addition, for every allocation that's not unnecessarily deallocated, you don't waste cycles redrawing bitmaps, reparsing text, redecompressing images, and so on.
[+] viraptor|14 years ago|reply
While I agree in general, I think you're pointing at the wrong target here. It's not that "web programmers are doing something that's definitely not cool for our current RAM budget". The JavaScript is loaded and probably JITted into a smaller space than its source text (depending on the number of code paths, etc. - I'm ignoring lots of detail here), and the actual text passing over the wire is also pretty small. So where's the memory going? All the supporting stuff...

There's the whole JavaScript runtime with its own allocator and memory pools, which will stay around for a long time. Some of the browser's UI elements use the same kind of VM to display themselves, adding more memory usage. Then there's the page itself, which had to be read, parsed into a tree, and processed into an appropriate model for display (all stages preserved in an editable way). This has to stay a live model the whole time, since JavaScript might need to interact with it - you can't just flatten it into a screenshot and display that. Then the images - they also need to be loaded and decompressed for display. And text isn't simple either - there's a whole engine behind loading fonts and analysing each letter / pair / triple / ... for special spacing rules to be applied. There are multiple glyphs, font variants, sizes, etc. to rasterise. That text needs to be distributed properly inside the DOM model generated above, and the text flow actually affects the model in turn - they're linked, so they get processed again (text metrics need to be recomputed on each page resize, which may make some elements longer or shorter). Then the theme goes on top of this - since the application draws itself, the needed libraries and theme elements get loaded too. And of course not all browsers use the OS theme for form elements, so another layer gets loaded there. ... I could go on forever about the pieces needed to render a page.

So yes - lots of that could be much lighter. Lots of it could be made into modules or pulled up a few levels to make it more general. Lots of it might just be the result of sloppy coding no one ever got around to fixing.

But just remember that what you're actually using is a system on top of a system on top of a system on top of... If you used gopher on a real 80x25 terminal you'd use kilobytes. If you used the first web browser, with no image support and no scripting, you'd stay under a few megabytes. But you don't - you're looking at a realtime rendering of 10 different models coming together and interacting with each other with next to no lag. And it all probably renders with hardware support to give you a very smooth scrolling experience. So no, web developers themselves have almost nothing to do with this whole mess. Unless they made a silly mistake and are leaking memory like crazy... they're not the ones to look at first.

(I simplified in many places, not all facts are relevant to every OS/browser combination, etc. just trying to make a point here)
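A toy illustration of the parse-tree point: even a bare-bones DOM-like structure dwarfs the source text it came from. Sketch using the stdlib html.parser; the Node class is invented for illustration and carries far less state than a real browser node, so this understates the blow-up:

```python
import sys
from html.parser import HTMLParser

class Node:
    # Hypothetical minimal DOM node; real browser nodes also carry style,
    # layout boxes, event hooks, etc.
    def __init__(self, tag, attrs):
        self.tag, self.attrs = tag, dict(attrs)
        self.children, self.text = [], []

class TreeBuilder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.root = Node("#document", [])
        self.stack = [self.root]

    def handle_starttag(self, tag, attrs):
        node = Node(tag, attrs)
        self.stack[-1].children.append(node)
        self.stack.append(node)

    def handle_endtag(self, tag):
        if len(self.stack) > 1:
            self.stack.pop()

    def handle_data(self, data):
        self.stack[-1].text.append(data)

def tree_bytes(node):
    # Rough shallow-size walk; ignores string contents and interpreter
    # overhead shared between objects, so it undercounts.
    total = sys.getsizeof(node) + sys.getsizeof(node.__dict__)
    for child in node.children:
        total += tree_bytes(child)
    return total

html = "<ul>" + "".join(f"<li class='r'>item {i}</li>" for i in range(100)) + "</ul>"
builder = TreeBuilder()
builder.feed(html)
print(len(html), tree_bytes(builder.root))  # tree is several times the source
```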

[+] nirvdrum|14 years ago|reply
Shouldn't it be the responsibility of the Web dev to realize this and work to minimize it? In a way, it's not much different than building on top of any VM. And here it definitely impacts the UX. I've certainly closed down tabs from sites that I know are bogging my system down.
[+] babebridou|14 years ago|reply
I implied in another comment that my FUD about browser memory came from being wrong by an order of magnitude in the wrong direction when applying my notion of memory footprint to web browsing.

You and others just filled that gap, thank you, I feel less stupid now.

[+] willvarfar|14 years ago|reply
Your reasoning is technically flawed.

That aside, I recently rented the cheapest VPS I could find: 128MB of RAM. I could not run apt-get; I had to upgrade to 256MB.

[+] babebridou|14 years ago|reply
Like I said I'm in no way a browser or web expert. I only have a notion of what could potentially fit in a certain amount of memory, and when I look at webpages I'm at least an order of magnitude off, in the wrong direction. This is why I'm worried. I sure hope I'm completely wrong!
[+] silasb|14 years ago|reply
To get better performance, the operating system will use as much of the RAM as possible. Why use only 4GB when you have 16GB? Cache as much as possible. Remember locality of reference and how it relates to caching; that is exactly why your OS does this.
[+] ulvund|14 years ago|reply
Exactly, OP does not understand the role RAM plays in computers.

It is a disappointment for me to see this get so many points in a community where you would expect the baseline knowledge of technology to be above average.

[+] nyellin|14 years ago|reply
It isn't a problem with Twitter or Gmail, it is a bug in Chrome.

Chrome leaks memory. Check memory usage again after force-quitting Chrome and relaunching it. I have to restart Chrome daily to regain 3-5 gigs of swap.

[+] nextweek|14 years ago|reply
I hate these kinds of questions about RAM usage. OS developers and browser developers are smarter than you give them credit for. They will do garbage collection when memory becomes tight. You only slow yourself down by clearing RAM you might still need while you still have RAM free.

Browsers have sub second back buttons, how much pre-rendered ram space do you think that takes up?

[+] babebridou|14 years ago|reply
> Browsers have sub second back buttons, how much pre-rendered ram space do you think that takes up?

I'm sure the os & browser devs are all good and great. I'm sure the web devs are all fine and accurate; that the subject is a complex one and we're already all doing the best we can.

But that's not my point: I'm worried that one day we'll run out of RAM because we're designing our webapps as if client RAM were no longer a finite resource - we've simply forgotten it even existed in the first place.