For those of you interested in parallelism in browsers, I suggest you keep an eye on Mozilla's experimental new browser engine, Servo.[1] The goal is to make use of a variety of concurrency strategies (such as a Chrome-esque process-per-tab design[2]) and Rust's built-in support for memory-safe concurrency abstractions (fork/join, lightweight tasks, SIMD, etc.) to produce a ludicrously parallel[3] web browser. And even if Servo itself never happens to make its way into production, one of its purposes is to enable Mozilla to explore effective parallelism strategies to pursue with Gecko.
[1] https://github.com/mozilla/servo/
[2] I wasn't exactly thrilled about this myself, but as long as Servo has to interact with C++ code (notably, SpiderMonkey) this was judged critical for security. Fortunately, pcwalton seems to believe that Servo's tab processes will occupy less memory than Chrome's.
[3] https://github.com/mozilla/servo/wiki/Design
If you'd like to try this out on Windows without affecting your main Firefox profile, we just released portable packages of the Firefox Nightly and Aurora builds at PortableApps.com yesterday: http://portableapps.com/news/2013-12-04--firefox-aurora-27-a...
They run self-contained in their own directory so you can quickly extract them to your Desktop or portable device. The installer downloads the latest build as you install it and configures it for standalone use. When you're done testing, you can just delete the FirefoxPortableNightly directory.
Bonus: The Nightly branch also has the new Australis UI redesign that they've been working on, which is worth checking out.
I remember the days when multiprocessing was the only option and multithreading was available on only a few systems.
Now, with the security exploits many plugins have exposed and the way a misbehaving thread can bring the whole application down, we are moving back to the multiprocess model as a better sandbox model. Old becomes new, as they say.
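The isolation benefit is easy to demonstrate. Here is a minimal Python sketch (purely illustrative — real browsers use their own process machinery and IPC): a "tab" worker that crashes hard takes down only its own process, and the parent carries on.

```python
import multiprocessing as mp
import os
import signal

def tab_worker(url, conn):
    # Stand-in for a content process; a "bad" page triggers a hard native crash.
    if url == "http://crashy.example":
        os.kill(os.getpid(), signal.SIGSEGV)
    conn.send("rendered " + url)
    conn.close()

def render(url):
    # The "browser" process spawns one process per tab and survives its death.
    parent, child = mp.Pipe()
    proc = mp.Process(target=tab_worker, args=(url, child))
    proc.start()
    proc.join()
    if proc.exitcode != 0:
        return "tab crashed; browser still alive"
    return parent.recv()

if __name__ == "__main__":
    print(render("http://example.com"))
    print(render("http://crashy.example"))
```

In a single-process design the equivalent of that SIGSEGV kills everything, including every other tab.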
> All IPC happens using the Chromium IPC libraries
Interesting that they chose to share code with Chrome. Since the two are competitors, I would have thought that they'd use completely separate implementations. It's neat that open source makes this kind of sharing possible.
> Interesting that they chose to share code with Chrome. Since the two are competitors, I would have thought that they'd use completely separate implementations.

The options were:

- pridefully write your own
- use theirs
That's what open source is about: collaboration instead of competition. Why compete when you can pool your resources?
I'm very excited about this. I usually drive Firefox Beta without any complaints, but I installed Nightly just to try this out live.
With my list of extensions[1] this doesn't seem to be particularly stable. It fails to bring up my tabs from last time. That would be OK for experimentation, had it not been for the fact that it also crashes regularly.
The two combined are a test-stopper for me.
Note: I'm not complaining. I'm very pleased this is being worked on. I'm just reporting first-hand experience of the state of things, so that others can make up their minds about whether to give it a go as well.
[1] Installed extensions: Adblock Edge, Duckduckgo search, Firebug, Flashblock, Norwegian dictionary.
Are the crashes listed in Firefox's about:crashes page? Filing bug reports with those crash IDs (which reference stack traces on crash-stats.mozilla.com) would be a big help.
Firebug might be a problem because it is tightly coupled to Firefox's internal debugging APIs.
Firebug is known not to work (and to cause stability problems) at the moment, for the reasons mentioned above about its tight coupling to internal debugging APIs. We're working with add-on developers to improve this situation.
From a code-maintenance point of view, how do you manage to keep this 'branch' in sync with the main one?
I mean, every patch made to the real Firefox has to be carefully reviewed and backported to this multiprocess branch.
Is that a manual process? Or can it be automated like this:
1) Check if a new commit arrived on 'head'
2) Auto-backport it to the multiprocess branch
3) Try a build + run the tests. Everything looks good? Keep it
4) Not good? Send an email to the multiprocess maintainer so that he can have a look
Is there another way to do it?
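For what it's worth, the loop described above is simple to sketch in Python. The function names below are made up for illustration, not Mozilla's actual tooling; a real setup would shell out to hg/git and a CI harness.

```python
def auto_backport(new_commits, backport, build_and_test, notify):
    """Run the 4-step loop above for each new commit on 'head'.

    backport/build_and_test/notify are stand-ins for real VCS and CI calls.
    """
    kept, escalated = [], []
    for commit in new_commits:       # 1) a new commit arrived on 'head'
        backport(commit)             # 2) auto-backport it to the e10s branch
        if build_and_test(commit):   # 3) build + tests green? keep it
            kept.append(commit)
        else:                        # 4) red? email the maintainer
            notify(commit)
            escalated.append(commit)
    return kept, escalated

if __name__ == "__main__":
    result = auto_backport(
        ["abc123", "bad456"],
        backport=lambda c: None,
        build_and_test=lambda c: c != "bad456",
        notify=lambda c: print("mailing maintainer about", c),
    )
    print(result)
```

The hard part in practice isn't the loop, it's step 2: backports that don't apply cleanly still need a human.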
Since multiprocess is in the regular Nightly, the code is in the main line of development (mozilla-central). Based on the preference, it decides at runtime how to handle the content/chrome interface.
But when Mozilla does branch off separate trees, VCS merges and lots of automated tests are largely sufficient.
To try Electrolysis (multiprocess) in Firefox Nightly: in about:config, toggle the browser.tabs.remote pref and restart (still a work in progress; don't expect a fully working browser).
Edit: you will lose your current session.
I welcome this just so we can determine which tabs are using the CPU persistently. I had to switch back to Chromium (after a good few weeks really giving FF another go) because I was sick of this issue. Firefox is smoother and more memory friendly than Chromium these days, a pleasure to use, but in Chrome I can kill hoggy tabs... so that's where I'm staying for the moment.
Now, this isn't meant to be snarky, but do you really run into issues like that, where it's noticeably bad in a particular tab, enough to need to kill it? What sorts of sites, and what processor?
Very nice indeed, I crash Firefox a few times a day from hitting the memory limit.
Granted this is due to AB+ and Reddit Enhancement Suite. Although imo.im leaking memory over time doesn't help! (I am somewhat annoyed that an IM client takes up 500MB of memory, I miss Meebo!)
Right now FF is at a fairly svelte 1.8GB. Heh.
The other problem is that performance degrades dramatically as the number of open tabs increases. Once I hit 50 or so tabs scrolling becomes horribly jerky. From the sounds of it, this change may very well fix that as well.
(For reference I am on an insanely fast home built machine!)
My biggest problem with Firefox is its startup time. firefox.exe takes much longer to start than IE and Chrome, which are both very fast.
I am using Win7 on an i7-3960X, an Intel SSD, and 16 GB of RAM.
If I install any plugin, it gets much worse. (For this reason I am not using any plugins, which is a big loss.)
It is weird that this issue is seldom mentioned; I think it is much more important than other performance benchmarks such as JavaScript performance.
I don't have cold-start problems in either browser with no (or few) tabs open.
... that said, try closing Chromium with 20 odd tabs open and then reopening it. It'll take bloody ages to reload all those tabs. Firefox lazy tab loading saves a metric buttload of time in this scenario.
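The lazy-restore idea reduces to something like this Python sketch (an assumption-level model, not Firefox's actual session-restore code): restore creates placeholder tabs instantly, and only the active one pays the load cost up front.

```python
class LazyTab:
    """Placeholder tab: remembers its URL, loads content only on activation."""
    def __init__(self, url):
        self.url = url
        self.loaded = False

    def activate(self):
        if not self.loaded:
            # A real browser would fetch and render the page here.
            self.loaded = True
        return self.url

def restore_session(urls, active_index=0):
    # Recreate every tab immediately, but load just the one being looked at.
    tabs = [LazyTab(u) for u in urls]
    tabs[active_index].activate()
    return tabs

if __name__ == "__main__":
    tabs = restore_session(["news.example", "mail.example", "docs.example"])
    print([t.loaded for t in tabs])  # [True, False, False]
```

Reloading 20 tabs eagerly at startup is exactly the cost this avoids: the other 19 load only when you switch to them.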
Interestingly, that complaint no longer applies for me: Firefox really does start instantly. Which is unfortunate, as it's the first program on my task bar and I'm still used to pressing Win+Shift+1 to start a new instance. If Firefox isn't already running, that results in it asking whether I want to start in safe mode, because I was holding Shift.
Tom's Hardware Guide's "Web Browser Grand Prix" measured Firefox's startup times as much faster than Chrome's, for both cold and warm starts and for single and multiple tabs: http://www.tomshardware.com/reviews/chrome-27-firefox-21-ope...
I just hope Mozilla won't go extreme, and won't use a separate process for each tab like Chrome does. It produces memory bloat if you have many tabs open. While they say they'll mitigate memory issues, this should be balanced.
It's unfortunate how far behind the curve Mozilla is on this. There's no denying it was a huge undertaking, but the length of time it's taken has obviously been detrimental to Firefox usage; it's the only real reason I still use Chrome as my primary browser. That said, I'll give Electrolysis a shot with Firefox Nightly and see how it works out.
Huh? I don't get this. Mozilla is switching from a single-process model to a multiprocess one. Chrome was built that way from day one. I hope you see that moving between models costs more time than picking one at the start and supporting it forever.
The article explains this. We built the infrastructure, and we used it for out-of-process plugins on desktop, and for Firefox OS, but we went in search of lower-hanging fruit on desktop. There was also (IIRC) some worry about breaking extension compatibility, which would make it a nonstarter. I think there are some planned mitigations for that now, and we've fixed a lot of the easy wins for responsiveness, so we're pursuing this again.
This is more about responsiveness. It's not just about slow scripts: if you open a very large text file in a tab, the entire Firefox UI stalls. If you have some moderately heavy operation happening in a different tab, scrolling gets choppy across the board, switching tabs is janky, and so on.
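A toy Python model of that failure mode and the fix (illustrative only, nothing like Gecko's real architecture): run the heavy content work in a separate process, and the "UI loop" can keep ticking instead of stalling until the work finishes.

```python
from concurrent.futures import ProcessPoolExecutor
import time

def heavy_layout(n):
    # Stand-in for parsing/laying out a huge page. In a single process this
    # would block the UI thread's event loop for its whole duration.
    return sum(i * i for i in range(n))

def responsive_ui(n):
    ticks = 0
    with ProcessPoolExecutor(max_workers=1) as pool:
        fut = pool.submit(heavy_layout, n)  # content work off the UI process
        while not fut.done():               # UI loop keeps "painting"
            ticks += 1
            time.sleep(0.001)
        return fut.result(), ticks

if __name__ == "__main__":
    result, ticks = responsive_ui(2_000_000)
    print("layout result:", result, "- UI ticks while waiting:", ticks)
```

Threads can give the same effect for pure computation, but process separation also covers the cases where the content side crashes or is compromised.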
Testing it out now; so far so good! I might make this my default profile. I would love not having rogue tabs freeze the entire system. It's very nearly the one remaining thing I prefer Chrome for; Firefox has really improved lately.
As a dedicated FF user, I've been waiting for this for so long. I may finally be able to isolate which tab is grinding my computer to a halt! As usual, this is amazing work.
This is a terrible idea. There is no justifiable reason to do it. The reasons given are weak; this is just more over-engineering that will add an enormous amount of complexity and no real value.
Let's look at the reasons given for why they want to do this.
>Performance. Most performance work at Mozilla over the last two years has focused on responsiveness of the browser. The goal is to reduce "jank"—those times when the browser seems to briefly freeze when loading a big page, typing in a form, or scrolling.
You can do all of this with proper threading and task delegation. Putting things in separate processes will not magically make things better. The answer to "jank" is proper coding, not over-engineering. Last time I checked, there was the same "jank" in IE and Chrome even though they use multiple processes.
>Security. Technically, sandboxing doesn’t require multiple processes. However, a sandbox that covered the current (single) Firefox process wouldn’t be very useful. Sandboxes are only able to prevent processes from performing actions that a well-behaved process would never do. Unfortunately, a well-behaved Firefox process (especially one with add-ons installed) needs access to much of the network and file system.
This is BS. You could have three processes and have Firefox sandboxed completely. The main process runs in low-integrity mode, which limits its resource access to a single directory. The second process is a download delegate (it takes a file after it is downloaded and moves it to the requested location, promoting its integrity) running in normal-integrity mode. The third is a network communication delegate/proxy running in normal (or possibly even low) integrity. These two delegate processes will still be needed for the multiprocess Firefox, so it is no more work to create them.
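The delegate idea sketched above boils down to a broker enforcing policy for a sandboxed requester. A hedged Python illustration (integrity levels are a Windows mechanism; here the "policy" is reduced to a filename check, and the actual file write is stubbed out):

```python
import multiprocessing as mp

def safe_name(name):
    # Broker policy: accept a bare filename only, so a sandboxed process
    # can't escape the download directory with "../" tricks or dotfiles.
    return "/" not in name and "\\" not in name and not name.startswith(".")

def download_broker(conn):
    # Trusted, higher-integrity process: performs writes on request.
    while True:
        msg = conn.recv()
        if msg is None:
            break
        name, data = msg
        if not safe_name(name):
            conn.send("denied")
            continue
        # (A real broker would write the file under the download dir here.)
        conn.send("saved %s (%d bytes)" % (name, len(data)))

if __name__ == "__main__":
    parent, child = mp.Pipe()
    broker = mp.Process(target=download_broker, args=(child,))
    broker.start()
    parent.send(("page.html", b"<html>"))
    print(parent.recv())
    parent.send(("../etc/passwd", b"x"))
    print(parent.recv())
    parent.send(None)
    broker.join()
```

Note the tension with the article's point, though: this only sandboxes the browser as a whole, so one compromised page still sees every other tab's data.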
>Stability
This is the only true benefit, but it is of very little value. Firefox almost never crashes, and when it does, session restore brings you back to where you left off in seconds.
Cons? More complexity means more bugs. This is a workaround rather than really fixing Firefox. I am going to have 150+ extra processes in my task manager now. More memory use. More context switches in the operating system, eating up resources and causing more system latency and overall slowdown (context switches at the kernel level, which affect the whole OS).
Given how cheap memory is, and the fact that almost all new devices are multicore, I see this as a big positive.
I guess everything is stable, if you just avoid all the features everyone uses.
It has taken far too long for e10s.