item 12805656

A Quantum Leap for the Web

495 points | Manishearth | 9 years ago | medium.com

132 comments

[+] dherman|9 years ago|reply
[disclaimer: I co-founded Mozilla Research, which sponsors Servo]

It's awesome to see the Gecko team continue to tackle big, ambitious projects now that electrolysis is rolling out. And I'm so excited that they're betting big on Servo and Rust. Servo has really been taking advantage of one of Rust's promises: that you can reach for more aggressive parallelism and actually maintain it. I believe recent numbers showed that effectively all of Firefox's users have at least two cores, and about half have at least 4. The more we fully utilize those cores, the smoother we should be able to make the whole web.

Over the last year, all three communities have been laying groundwork to be able to land Rust components in Firefox and share components between Gecko and Servo, and now it looks like that's teed the Gecko team up to commit to making use of some big pieces of Servo in the coming year. Some of the initial builds of Firefox with Stylo that Bobby Holley has showed me look really amazing, and WebRender could be a game-changer.

And the Servo project is just getting warmed up. ;) If you're interested in what they're up to next, check out Jack Moffitt's recent presentation from a browser developer workshop last month:

https://www.youtube.com/watch?list=PL4sEzdAGvRgCYXot-o5cVKOo...

[+] tomdale|9 years ago|reply
I have to say, the work being done on Servo is really exciting—the first tectonic shift in browser engines to come along in years.

pcwalton's talk about WebRender earlier this year[1] was one of those rare technical presentations that left my jaw on the floor. In particular, the insight that modern browsers are just AAA game engines with a security model, so they should be architected similarly, changed the way I think about browsers. That game developers and Mozilla are both so excited by Rust's ability to safely write parallel systems at scale makes a lot of sense.

[1]: https://air.mozilla.org/bay-area-rust-meetup-february-2016/#... Previous HN discussion: https://news.ycombinator.com/item?id=11175258

[+] the8472|9 years ago|reply
> The more we fully utilize those cores, the smoother we should be able to make the whole web.

I wonder why the GC/CC are not multithreaded, though. They seem like fairly isolated components, considering that the entire application gets suspended so they can do their job, i.e. prime candidates for parallelism.

When forcing a collection on a large Firefox instance, it can easily spend 20+ seconds collecting on a single thread, while a Java VM can churn through something like 1 gigabyte per second per core.

In other words, from the outside it looks like a low-hanging fruit that has not been plucked.
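For a sense of why a collection pass is a natural fit for worker threads, here is a minimal Rust sketch of a chunked "mark" scan. Everything in it (the flat arena, treating non-zero words as live, the function names) is invented for illustration and is not SpiderMonkey's or Gecko's actual GC:

```rust
use std::thread;

// Toy "mark" pass over a flat arena of words, split across worker threads
// the way a parallel collector could split heap scanning. Non-zero words
// stand in for "live" objects; all names here are invented.
fn parallel_mark(arena: &[u64], workers: usize) -> u64 {
    let chunk = (arena.len() + workers.max(1) - 1) / workers.max(1);
    thread::scope(|s| {
        let handles: Vec<_> = arena
            .chunks(chunk.max(1))
            .map(|slice| {
                // Each worker scans a disjoint chunk independently; no
                // shared mutable state, so the scan itself needs no locks.
                s.spawn(move || slice.iter().filter(|&&w| w != 0).count() as u64)
            })
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let arena: Vec<u64> = (0..1_000).map(|i| i % 3).collect();
    println!("live words: {}", parallel_mark(&arena, 4));
}
```

A real collector is far harder than this, of course: marking chases pointers across the whole heap and must coordinate with the mutator, which is part of why the low-hanging fruit hasn't been plucked.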

[+] xorcist|9 years ago|reply
> all of Firefox's users have at least two cores,

As an end user, it concerns me slightly that the visible change will be that Firefox pegs two cores instead of just one.

An absolute explosion in JavaScript usage on the web, much of it low quality and haphazardly put together at runtime, together with the convenience of tabs, makes this a problem. Mozilla may do everything right, but that doesn't necessarily help the end user.

Is there anything I could easily do to remedy this, without bothering with whitelisting JavaScript? And are there any activities at Mozilla concerning this, perhaps identifying the most trivial cases of scripts spinning without doing useful work? Maybe pausing DOM changes for documents that aren't visible?

[+] oldmanjay|9 years ago|reply
I would personally not be so eager to disclaim such impressive credentials, but I suppose you have your reasons.
[+] armitron|9 years ago|reply
It's telling that not one comment in this entire thread mentions security and exploitability, two areas where Firefox is not just terrible, but the worst choice not only amongst Chrome, Safari, and Firefox (1), but also compared to Edge and IE 11. Everyone is focusing on performance, as if that's the BIG issue these days. Talk about having your priorities screwed up.

Reading your comment, and the linked post, I'm left with the impression that this will not change in the foreseeable future. Every exploit shop considers Spidermonkey a security clusterfuck, yet it's still in use. Multiple processes do absolutely nothing for security unless combined with sandboxing à la Chrome. Continuing to use C++ rather than fully embracing Rust (or something even better than Rust) also does nothing for security. Iteratively improving things on top of a Javascript engine that's a security disaster and a C++ core will not give us a secure browser, as one cannot build castles on top of sand.

At some point people need to realize that one has to scrap the pile of mud and start again, on solid foundations. Alas I feel that these lessons escape the Mozilla folks and thus their browser will remain low hanging fruit for adversaries.

In an age where Nation States can MITM __the entire planet on demand__ [QUANTUM] and the FBI delivers Firefox 0day through TOR exit nodes, this blatant disregard for security should be entirely unacceptable. I don't really blame Mozilla, but those who use Firefox and give Mozilla their share in the browser market. If we don't demand better, we shall never have it.

(1) http://cyber-itl.org/blog-1/2016/9/12/a-closer-look-at-the-o...

[+] mtgx|9 years ago|reply
> A first version of our new engine will ship on Android, Windows, Mac, and Linux. Someday we hope to offer this new engine for iOS, too.

More people need to put pressure on Apple to allow third-party browser engines on iOS. Fortunately, they're already getting sued over this, but just in case that doesn't succeed, there should also be bigger public pressure on Apple to allow them.

http://www.recode.net/2016/10/7/13201832/apple-sued-ios-brow...

[+] tedmielczarek|9 years ago|reply
I've had a Gecko port stood up and running on iPhone hardware several times in the past 6 years but we've never sorted out a real path to shipping. The most recent incarnation felt a lot nicer than Safari with our async pan/zoom architecture. Maybe I should just get Servo running and we can ship it as a tech demo. :-P
[+] bobajeff|9 years ago|reply
This sounds to me like Mozilla is getting impatient with Servo. Servo was more than just a parallel browser engine; it was the only new web engine not based on a decades-old codebase.

It was a statement that it's feasible to hold off monoculture, because compatibility isn't impossible to achieve in new engines.

[+] Manishearth|9 years ago|reply
> This sounds to me like Mozilla is getting impatient with Servo.

I mean, yes. In a way. I think this was always part of the plan for Servo -- if things go well start uplifting ideas and/or code to Firefox. This isn't exactly impatience; it makes perfect sense to do this.

This isn't a change in gear for Servo, though. Servo is still chugging along. There are still no concrete plans (that I know of) for a Servo product; however, Servo is working on the things it needs to be a product (i.e. it's not just focused on trying out researchy ideas to be used by Gecko), like proper SSL security. So there's nothing stopping Servo from being a product in the future, and Quantum doesn't affect this.

Quantum affects Servo in a couple of ways:

- There are now paid Gecko engineers hacking on bits of Servo (yay!). On the flipside, paid Servo engineers (e.g. me) are working on Quantum, but for the most part this involves improving Servo itself.

- Rough edges are being polished, and web compatibility is being addressed for the Quantum components. These were things that were always being worked on, but they weren't always a priority for a given component. For example, there have always been optimizations that we knew we could do, but we hadn't done them so far since there were other researchy things to focus on. Now these are getting implemented.

- The build/CI system will probably have major changes to make it possible to smoothly integrate with Gecko. This doesn't really affect goals, just the day-to-day Servo developer experience.

It doesn't affect Servo's goals or de-emphasize Servo itself. It just gets some of the advances that Servo has made to the public faster.

[+] pcwalton|9 years ago|reply
Servo is still being developed with the same manpower. Quantum is not de-emphasizing Servo. It would make little sense to do so, since the lack of legacy in Servo is part of what has given us the freedom to experiment with things like parallel restyling in the first place. And Quantum helps Servo, too—by giving us real-world Web compatibility experience with portions of Servo's codebase sooner, it helps us shake out bugs faster than we can with Servo alone.

(Disclaimer: As always, I speak for myself, not for my employer.)

[+] larsberg|9 years ago|reply
Project Quantum is an opportunity to ship some of the Servo components in Firefox and gain millions of users, real-world tests of the technology, and hopefully dramatically expand our contributor base. As we mention in a post to the Servo mailing list (https://groups.google.com/forum/#!topic/mozilla.dev.servo/3b... ), Servo has a lot coming in 2017 unrelated to Project Quantum or Firefox.
[+] dherman|9 years ago|reply
I can tell you from where I sit at Mozilla, everyone is very excited about Servo. It just takes time to build things! :) It wouldn't make sense to wait for Servo to reach full web compatibility before starting to integrate it into production uses. This is really just a natural next step in the progression of adopting Servo at Mozilla.
[+] bzbarsky|9 years ago|reply
Compatibility in new engines is ... hard. Servo is basically going to need to spoof the WebKit UA and duplicate a bunch of WebKit bugs (the Edge approach) or spoof the Gecko UA and duplicate a bunch of Gecko bugs. Some specs now have an explicit "does your UA say you are Gecko, or does it say you are WebKit" switch with behavior specified for both branches. :(
[+] NoGravitas|9 years ago|reply
Isn't this announcement basically just saying, though, that Servo is going to replace Gecko piece by piece, rather than all at once?
[+] dblohm7|9 years ago|reply
Servo has always been and still is a research project. It was never intended to go into production as a fully-fledged Mozilla product. Quantum is the initiative to bring a lot of those technologies into Gecko.
[+] aantix|9 years ago|reply
They should use Yahoo's front page as their performance baseline.

Whenever I load it, the favicon starts to flicker, multiple movies (ads) start playing, and I can't tell whether scrolling has been badly hijacked by some rogue js plugin or if the performance of their video playback is just that bad.

[+] mgalka|9 years ago|reply
Yahoo's home page explains so much about the company -- unable to maintain even the simplest and most basic features of their site. I'm still on Yahoo mail, and there were a few months this year where the basic search functionality didn't work.

Yahoo would probably make a great case study in corporate culture gone wrong.

[+] anigbrowl|9 years ago|reply
There are so many major brand websites that are so awful that I want to grab anyone who admits to working there and yell 'have you tried using your own damn product lately because it sucks' but I'd rather not be returned to prison for my unique form of UI feedback.
[+] yalooze|9 years ago|reply
Does anyone know how this compares to the current implementations of competing browsers? I.e., is Firefox still playing catch-up in some respects, or is this leaps ahead of the competition too?
[+] metajack|9 years ago|reply
I think the performance of the style systems in Blink and Firefox is similar. Since Servo's style system is linearly scalable, we expect most users to get a ~4x speed improvement on styling.

Users don't care specifically about style performance, though, but about things like interactivity. We think Servo's style system will improve those things, but we don't have numbers for that on hand.

To give a concrete example, it takes Firefox ~1.2 seconds to restyle the single-page HTML5 spec with its current non-parallel style system. Firefox with Stylo can do this in ~300ms. That is close to a full second off of initial page load.
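The ~4x figure above follows from styling being largely independent per node. A hedged Rust sketch of that shape of work (restyle_one and all other names here are invented stand-ins, not Stylo code):

```rust
use std::thread;

// Invented stand-in for per-node selector matching / cascade work.
fn restyle_one(node: u32) -> u32 {
    node.wrapping_mul(2_654_435_761) ^ 0x5bd1_e995
}

// Because each node's result depends only on its own input, the node list
// can be chunked across threads and the partial results concatenated.
fn restyle_parallel(nodes: &[u32], threads: usize) -> Vec<u32> {
    let chunk = (nodes.len() + threads.max(1) - 1) / threads.max(1);
    thread::scope(|s| {
        nodes
            .chunks(chunk.max(1))
            .map(|c| s.spawn(move || c.iter().copied().map(restyle_one).collect::<Vec<_>>()))
            .collect::<Vec<_>>() // spawn all workers before joining any
            .into_iter()
            .flat_map(|h| h.join().unwrap())
            .collect()
    })
}

fn main() {
    let nodes: Vec<u32> = (0..10_000).collect();
    let styled = restyle_parallel(&nodes, 4);
    println!("restyled {} nodes", styled.len());
}
```

Real restyling has parent-to-child dependencies, so Servo actually parallelizes over subtrees with work stealing rather than flat chunks; the sketch only conveys why the workload scales with cores.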

[+] c-smile|9 years ago|reply
"But nowadays we browse the web on ... that have much more sophisticated processors, often with two, four or even more cores."

Having batched GPU rendering / rasterization makes real sense, yes. When shown, the browser is the largest screen-space consumer.

4K displays (~300 ppi) have increased the number of pixels that need to be painted by nine times. Thus CPU rendering / rasterization is not an option anymore, yes.

But the browser is not the only process competing for those cores.

2 or even 4 cores... You have more concurrently running applications than that these days. Some of them are invisible but still CPU-intensive.

To get significant benefits from parallelism in browsers, the number of cores will need to be measured in the tens at least, I think, if not more than that: enough to run things in parallel like a bunch of Cassowary solvers, one for each BFC container.

I suspect that the main bottleneck at the moment is the existence of methods [1] that force synchronous layout / reflow of the content. These are the things that kill parallel execution. The DOM API will have to change towards batched updates or other more parallelism-friendly means.

[1] https://gist.github.com/paulirish/5d52fb081b3570c81e3a
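The forced-reflow problem in [1] can be modeled with a toy. This is a hedged Rust sketch with invented types (Doc, set_width, offset_width), not a real DOM API: interleaving reads with writes forces a layout per read, while batching writes amortizes it to one:

```rust
// Toy model: a "document" that queues style writes and only runs layout
// when a read forces it. Everything here is invented for illustration.
struct Doc {
    width: u32,
    pending_writes: Vec<u32>,
    layout_count: u32,
}

impl Doc {
    fn new() -> Self {
        Doc { width: 0, pending_writes: Vec::new(), layout_count: 0 }
    }

    fn set_width(&mut self, w: u32) {
        self.pending_writes.push(w); // cheap: just records the change
    }

    // Reading a layout-dependent value (think offsetWidth) must flush
    // pending writes first: a forced synchronous layout.
    fn offset_width(&mut self) -> u32 {
        if let Some(&w) = self.pending_writes.last() {
            self.width = w;
            self.layout_count += 1;
        }
        self.pending_writes.clear();
        self.width
    }
}

fn main() {
    // Interleaved write/read: one forced layout per iteration.
    let mut a = Doc::new();
    for i in 1..=10 {
        a.set_width(i);
        a.offset_width();
    }

    // Batched: all writes first, one layout at the single read.
    let mut b = Doc::new();
    for i in 1..=10 {
        b.set_width(i);
    }
    b.offset_width();

    println!("interleaved: {} layouts, batched: {} layout", a.layout_count, b.layout_count);
}
```

With ten interleaved write/read pairs the toy runs layout ten times; batching the same ten writes runs it once.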

[+] TeeWEE|9 years ago|reply
You can make a super fast web browser, but that doesn't solve the fundamental issue: the web is not designed for performant applications. Resource loading, JavaScript, rendering... Solve that first before you build a fast engine...

Of course a fast engine is good. But don't forget the root problems with the web.

[+] amelius|9 years ago|reply
Yes. The article speaks of "zero latency", but that is simply not achievable with a network connection that has latency. My guess is that in current browsers, on average 90% of the latency is in the network (as opposed to rendering). So even if the render step was perfect, you would only get a measly 10% performance gain.
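That estimate can be sanity-checked with Amdahl's law; a quick sketch (the 90/10 network/render split is the guess above, not measured data):

```rust
// Amdahl's law: if a fraction `f` of total time is eliminated entirely,
// the best possible overall speedup is 1 / (1 - f).
fn max_speedup_if_fraction_free(f: f64) -> f64 {
    1.0 / (1.0 - f)
}

fn main() {
    // If rendering is 10% of page latency and became instantaneous,
    // total latency only drops from 1.0 to 0.9: about a 1.11x speedup.
    println!("{:.3}", max_speedup_if_fraction_free(0.10));
}
```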
[+] TekMol|9 years ago|reply

    we’ll be rolling out the first stage of Electrolysis to
    100% of Firefox desktop users over the next few months.
From my experiments with it, this still does not fix the problem that the javascript in all windows shares one core. A script running in one browser window still slows down the other windows.

A problem that Chrome solved years ago. So I think this is not really a leap for the web. Just Firefox catching up a bit.

Firefox is my main browser. The way I deal with this is that I start multiple instances as different users, so each runs inside its own process. This way I can have a resource-hungry page open in one window (for example, a dashboard with realtime data visualization) and still work smoothly in another.

[+] bholley|9 years ago|reply
> So I think this is not really a leap for the web. Just FireFox catching up a bit.

To be clear, Project Quantum is the next phase of architecture, post-Electrolysis. We're also simultaneously working on multiple content processes (which is how Chrome often avoids inter-window jank), but not under the Quantum umbrella.

We think we can do better though, which is where Quantum comes in. The Quantum DOM project is designed to solve exactly the problem you're describing, while using fewer resources and scaling to more tabs. Stay tuned!

[+] bzbarsky|9 years ago|reply
> this still does not fix the problem that the javascript in all windows shares one core

Yes, that's what makes it the "first stage".

The "leap" part is not Electrolysis; as you note that's just table stakes. The "leap" part is what we can work on now that the Electrolysis first stage, which was very labor-intensive, is done.

[+] pcwalton|9 years ago|reply
Servo has a solution for that: it runs all origins in separate threads. (Note that this goes further than Chrome does; Servo runs cross origin iframes off the main thread too, while Chrome does not.)

Gecko is solving this too, with Electrolysis and Quantum DOM. But because the architectures are so different at the DOM level (direct bindings vs. an XPCOM/direct binding hybrid, tracing vs. reference counting, off main thread layout vs. main thread layout, Rust serialization vs. IPDL, etc.) the Servo solution doesn't directly apply to Gecko. So Gecko and Servo are working on their own solutions mostly independently, rather than sharing code as in most of Quantum.
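A hedged Rust sketch of the per-origin threading idea, using one channel-fed worker per origin. All names and types are invented for illustration; this is not Servo's actual event-loop machinery:

```rust
use std::collections::HashMap;
use std::sync::mpsc;
use std::thread;

// Each origin gets its own worker thread fed by a channel, so a busy
// origin cannot block script for the others.
fn run_per_origin(tasks: &[(&'static str, &'static str)]) -> Vec<String> {
    let mut senders: HashMap<&'static str, mpsc::Sender<&'static str>> = HashMap::new();
    let mut workers = Vec::new();

    // Spawn one worker per distinct origin, in order of first appearance.
    for &(origin, _) in tasks {
        if !senders.contains_key(origin) {
            let (tx, rx) = mpsc::channel::<&'static str>();
            senders.insert(origin, tx);
            workers.push(thread::spawn(move || {
                // The worker drains its own queue independently.
                rx.iter()
                    .map(|task| format!("{origin}: ran {task}"))
                    .collect::<Vec<String>>()
            }));
        }
    }

    for &(origin, task) in tasks {
        senders[origin].send(task).unwrap();
    }
    drop(senders); // close the channels so each worker's loop ends

    workers.into_iter().flat_map(|w| w.join().unwrap()).collect()
}

fn main() {
    let log = run_per_origin(&[
        ("https://example.com", "onload handler"),
        ("https://example.org", "timer callback"),
    ]);
    for line in log {
        println!("{line}");
    }
}
```

Because each origin drains its own queue, a long-running task delays only that origin's later tasks, which is the isolation property being described above.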

[+] metajack|9 years ago|reply
Parts of Quantum are addressing this, and Electrolysis will probably also grow to fix this. You can already turn on multiple content processes in Firefox to get Chrome-like behavior (at the cost of some rough edges; there's a reason it's not currently the default). I think the setting is dom.ipc.process-count in about:config. I have mine set to 10.
[+] aikah|9 years ago|reply
> A problem that Chrome has solved years ago.

The trade-off is the crazy amount of resources Chrome uses, even on a multicore machine. 20 processes spawned and shit like this can bring down any computer. Chrome's resource usage is excessive.

[+] cheiVia0|9 years ago|reply
You can alleviate this problem by increasing dom.min_background_timeout_value.
[+] runeks|9 years ago|reply
I don't understand what the difference is between Quantum and Servo. To me it sounds like a new name for the same thing. I recall Servo being promoted this way three years ago.
[+] bandrami|9 years ago|reply
Pet peeve of mine: a "quantum leap" is literally the smallest change of state that is physically possible, but it's come to mean the opposite in popular use.
[+] TheRealPomax|9 years ago|reply
Time to shelve that pet peeve, because any physicist should be able to tell you that this claim is one of those hilarious "people keep pretending this is true" ideas that has nothing to do with what a quantum leap actually is.

In physics, a "quantum leap" is a synonym for atomic electron state changes. These don't take place as a continuous gradient change, they jump from one state to the next, releasing or absorbing photons as they do so: the changes are quantized, and any change from one state to another termed a quantum leap. And it really is "any change": they're all quantum leaps.

There is also no stipulation on direction; state changes go both up and down. So while there certainly is a lower bound to the values we will find in the set of all possible quantum leaps in atomic physics (because you can't go "half distances"; the whole point of quanta is that it's all or nothing), the idea that a quantum leap is "one value" is patent nonsense; different state transitions have different values, can be anywhere between tiny and huge jumps, and can release massively energetic photons in the process.

And if the term has been given a new meaning in everyday common language, then cool. That's how language works. It doesn't need to be correct with respect to its origin; people just need to agree on what the terms mean. In everyday language, understanding that "a quantum leap" means "a fundamental change in how we do things" (which is implied to always be a big deal, even if the change itself is small) demonstrates a basic understanding of conversational English. So that's cool too.

[+] white-flame|9 years ago|reply
It's a barrier that can't be smoothly crossed, so it implies revolution over evolution.

I think you're accidentally mixing in some Planck with your quantum.

[+] k26dr|9 years ago|reply
Doesn't Chrome/Chromium already run as multiple processes?
[+] andrewstuart|9 years ago|reply
I wish I could see how much processing power/memory was being taken by each tab.
[+] mxuribe|9 years ago|reply
Looking forward to this!
[+] ifhs|9 years ago|reply
Sorry but posting this on medium is a quantum leap backwards. That too from Mozilla.