item 8559519

Principles of Rich Web Applications

534 points | rafaelc | 11 years ago | rauchg.com

148 comments

[+] tshaddox|11 years ago|reply
This is a great article. It's extremely thorough, and touches upon most of the difficulties I've encountered in my (limited) experience coding JS on the web, as well as several I hadn't even considered.

My only complaint, if it can be considered a complaint, is that the author doesn't address the real-life costs of implementing all his principles. The main one is the complexity of the server- and client-side architecture required to implement these principles, even for a minimal application like TodoMVC [0].

I agree that user experience is extremely important, and perceived speed is fundamental, but I certainly don't think it's important enough to justify the cost of figuring out how to implement all these principles, especially for a startup or other small team of developers.

Of course, the hope is that tooling will quickly progress to the point that these principles come essentially for free just by following the best practices of whatever libraries/frameworks/architectures you're using. There was probably a time where the basic principles of traditional static web apps (resourceful URLs, correct caching, etc.) also looked daunting for small teams, but that's quite manageable now with Rails/Django/etc. (and maybe earlier frameworks).

[0] http://todomvc.com/

[+] enjalot|11 years ago|reply
There is one such framework: http://derbyjs.com/ It's been in development for about 3 years, so it certainly has a cost to implement. It is being used in production by Lever (YC) to ship very usable enterprise software.

There is even a TodoMVC example being submitted for Derby: https://github.com/derbyjs/todomvc/tree/master/examples/derb...

Another nice thing is that the templating engine can be used client-side only if you want: https://github.com/derbyjs/todomvc/tree/master/examples/derb...

[+] fourpoints|11 years ago|reply
Client-server frameworks (on the JVM: http://vaadin.com/ or in JS: https://www.meteor.com/) are something that helps (or should help) overcome the complexity of all this, mostly taking care of the UX, communication, and optimization parts. I guess they are not for those who want to build everything from scratch, but they speed up teams quite a lot.
[+] jwblackwell|11 years ago|reply
Very important point, especially for the HN audience. There are some really good principles here, but in reality you don't need to worry about most of these at an early startup. If you use modern frameworks then a lot is taken care of already.

Things like stale code on the client are simply not an issue until you've got a lot of people using your app. Don't worry about it.

[+] discreteevent|11 years ago|reply
Fair point. One should be particularly wary of the lure of 'liveness' and reactivity. They are nice things to have but don't underestimate the expense in terms of performance, stability and complexity (events are spaghetti).
[+] brianberns|11 years ago|reply
JavaScript "is the language of choice for introducing computer science concepts by prestigious universities"!? God help us all.

I see the link to the Stanford course, but I hope it's still in the minority. JS is not a language that I would want to teach to newbies, especially if those newbies are on a path towards a CS degree.

[+] PavlovsCat|11 years ago|reply
I can't stand how the Facebook feed updates in realtime. I read a bit, leave the tab, and when I come back to it, it has updated, so I have to find my place again (which I don't do, I just go "screw facebook" and close the tab ^^). The same goes for forums: if I want to see new or changed posts, I'll hit F5 - and when I don't press F5, it's not because I forgot, but because I don't want to. Pressing it for me is a great way to make me go elsewhere; or in the case of facebook, to stay and resent you.

I don't need to know in realtime how many people are on my website. I need to know how many there were last week, and compare that with the week before. Likewise, I don't really need to see a post the instant it was made. At least for me, the internet is great partly because people choose what they do, how, and at what pace, because it's more like a book and less like TV, and making it more like TV is not an improvement.

This is not against the article per se, which I found very very interesting and thorough, just something I have to get off my chest in general. Though I really disagree with the article when it comes to the facebook feed, I think that should serve as an example for what not to do.

Please, think twice, and never be too proud to get rid of a shiny gimmick when it turns out it doesn't actually improve anything. Let's not sleepwalk to a baseline of stuff we just do because everybody does it and because it's technically more complex. As Einstein said, anyone can make stuff more complex :P

[+] benjoffe|11 years ago|reply
I would argue that in the case of Facebook's updating feed, the idea is fine; the implementation is buggy.

What should happen is that the new content is loaded above and the document scroll position is instantaneously adjusted to preserve the previous position. This allows new content to come in without disrupting your experience.
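A minimal sketch of that fix, assuming the feed scrolls inside a container element (the names `feedEl` and `newItems` are hypothetical; the arithmetic is the whole trick):

```javascript
// Preserve the reader's position when new items are prepended to a
// feed: the content above grows by `insertedHeight` pixels, so the
// scroll offset must grow by the same amount to keep the viewport
// anchored on the same item.
function adjustedScrollTop(prevScrollTop, insertedHeight) {
  return prevScrollTop + insertedHeight;
}

// In a browser you would apply it around the DOM mutation, e.g.:
//   const before = feedEl.scrollTop;
//   feedEl.prepend(newItems);
//   feedEl.scrollTop = adjustedScrollTop(before, newItems.offsetHeight);
```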

[+] spenuke|11 years ago|reply
"The internet is great partly because people choose what they do, how, and at what pace, because it's more like a book and less like TV, and making it more like TV is not an improvement."

Well worth thinking about.

[+] andrewstuart2|11 years ago|reply
A single page app is just this: a web page that doesn't ever reload the page or reload scripts. It doesn't matter how much content the initial page had on it, and it certainly doesn't mean that you send an empty body tag. The history APIs even let the back button and URL bar behave exactly as the user expects them to, but without a single round trip if you already have the data and resources.
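A rough sketch of that navigation flow. The `historyApi` and `render` parameters are injected here so the logic can run outside a browser; in a real app they would be `window.history` and your view layer, with `onPopState` wired to the `popstate` event:

```javascript
// Minimal History-API-style navigation without round trips: the URL
// bar and back button behave normally, but data already fetched this
// session is re-rendered from an in-memory cache instead of reloaded.
function createRouter(historyApi, render) {
  const cache = {}; // data already fetched this session, keyed by path
  return {
    navigate(path, data) {
      if (data !== undefined) cache[path] = data;
      historyApi.pushState({ path }, '', path); // URL updates, no reload
      render(path, cache[path]);
    },
    // Wire this to 'popstate' so Back/Forward re-render from cache.
    onPopState(state) {
      render(state.path, cache[state.path]);
    },
  };
}
```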

While it's certainly true that the first page may load slower, and you'll load a few scripts as well, you never need to reload those again. Frameworks like Angular encourage you to use a "service" mindset that capitalizes on this property.

The longer you use a single page app, the fewer round trips you will have. If you ask me, your communication should only be for raw materials (scripts, templates) that you won't need to validate or request again during the current session, and raw data (json). This is more loosely coupled, more cacheable at all the different levels, and more scalable in large part due to the decoupling.

Once the initial view loads, I totally agree that you should intelligently precache all your resources and data asynchronously in the background to usher in the era of near zero-latency user interactions. Preferably, you do this in an order based off of historical behavior/navigation profiling to best use that time/bandwidth you have before the next click.

I get the impression from reading articles similar to this one that there was once a similar mindset surrounding mainframes and dumb terminals. The future is decentralized, web included.

[+] Offler|11 years ago|reply
I think people have different use cases that they aggregate under "web application". If you are building a desktop-replacement application then I can see that initial load might not be such a big deal, but if you are building a less complex application like the Twitter UI then server-side rendering makes more sense. It's all context specific in the end. Too hard to make general claims.
[+] romaniv|11 years ago|reply
> The future is decentralized, web included.

Decentralized work is a bad thing. Instead of Mozilla/W3C solving some problem for everyone, we get every website coming up with their own clever solutions for nearly everything. (Also, thousands of clients re-doing the work that could be done once and served from cache.)

Also, single-page apps have increased the reliance of the entire web on a handful of CDNs and framework providers.

[+] protonfish|11 years ago|reply
This is not a great article and is resistant to criticism by virtue of its excessive length. Still, I'll try to point out some major flaws.

1. Single page apps have many drawbacks that are conveniently not mentioned: Slow load time, memory bloat, persistence of state on reload, corruption of state and others. SPAs are just another misguided attempt at making web apps more like desktop apps. Web apps are network applications - if you remove the network communications portion, what is the point of them?

2. JS is a great tool to make pages more responsive. This has been the case for years, and I am at a loss as to why the author writes on and on about it without any pointed observations or facts.

3. Using push (web sockets) is a valuable tool for accomplishing particular features. This does not mean that more is better and we should start using it for everything. Server pull is a strong feature of the web and is arguably a key to much of its success.

4. Ajax is great, no argument.

5. Saving state as a hash value in the URL not only puts JS actions into history, but makes them visible and bookmarkable as well. Push state is a quagmire.

6. The need to push code updates is one of the problems caused by SPAs that is not needed in normal web apps. Even so, this could be solved with a decent, as yet unimplemented, application cache.

7. Predicting actions is overkill. If you focused on doing everything else well, there is no need to add significant amounts of complication. More code = poorer performance and decreased maintainability.
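For what it's worth, point 5 needs very little machinery. A sketch, kept as pure functions here; in a browser you would assign the result to `location.hash` and parse `location.hash` on load:

```javascript
// Serialize view state into a URL hash fragment, so JS-driven state
// is visible, bookmarkable, and lands in browser history for free.
function stateToHash(state) {
  const params = new URLSearchParams(state);
  return '#' + params.toString();
}

// Parse the hash back into a plain state object on page load.
function hashToState(hash) {
  const params = new URLSearchParams(hash.replace(/^#/, ''));
  return Object.fromEntries(params.entries());
}
```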

[+] darkmarmot|11 years ago|reply
I agree with all of your points but number 1. I think a good SPA should be a network-reliant application whose complexity is demanded by the use case. And a good framework for one should provide near-instant load times: a small core library, with additional resources and logic loaded only on demand.
[+] rictic|11 years ago|reply
With a little work you can make a single page app that keeps the state preserved in the URL. The reason you try to remove all blocking on network communication is the obvious one: performance.
[+] mwcampbell|11 years ago|reply
Is there any web application framework, presumably encompassing both client-side and server-side code, that implements these principles? I'm guessing that Meteor comes closest.
[+] pothibo|11 years ago|reply
Here's a few thoughts:

I can see that in terms of bandwidth, an SPA can be more efficient than a normal HTML page. But this makes a few assumptions. First, that your JS package never changes. As soon as 1 character changes in your package, the cache is invalidated and the whole package needs to be downloaded. Like you said, it's application specific. But if your app has ~3 pageviews per session, it becomes very hard to justify the use of an SPA.

As for acting as soon as there's a user input, this can be done with SPA or not. One thing to mention though, is that Pull-to-refresh is something that is gradually falling out of favour.

Besides those 2 things, insightful post.

[+] quaunaut|11 years ago|reply
> I can see that in terms of bandwidth, an SPA can be more efficient than a normal HTML page. But this makes a few assumptions. First, that your JS package never changes. As soon as 1 character changes in your package, the cache is invalidated and the whole package needs to be downloaded.

Sure, but there are strategies against this, right? Generally, vendor and 3rd-party code doesn't change often, so minify and stick all that together. Then you've got your core application code, which you attempt to keep as small and fast as possible.

I will say, I'm not as experienced as the author of this piece, but at the end of the day I feel like the author is making blanket statements that honestly don't hold up to the reality of what users actually want. I think it also makes assumptions about your stack and your resources - yes, if you've got incredibly fast, top-of-the-line servers, server-side rendered pages are probably a better idea, as the time difference between a JSON payload and the page being rendered by the server is much smaller.

On the other hand, even a cheap (i.e. slow) Rails server with a CDN handing off the client code can shove some JSON out no problem, and it can do it very fast - even the worst-off users usually see only ~300ms total receive time, which is generally 100-200ms slower than your average server's render time of the page alone.

Furthermore, it lets you offload who is delivering said content - if a CDN is serving up all that JavaScript, then the initial render times may actually not be that much slower than if it was server rendered.

----

I also get the feeling a lot of people are making the mistake right now of assuming that because there's been a lot of evolution in the frontend framework world in the past 2 years, we're hitting peak performance, which couldn't be further from the truth. Angular apparently (I'm going off of what I've seen many say about 1.0 in the post-2.0 world) completely bungled performance the first time around, but it'll be better next year. Ember is already well on its way to being fast, and by summer of next year is going to be blazing quick with all of the HTMLBars innovations.

I think we're barely getting started figuring out frontend frameworks. Even if right now it may not be the best idea for your personal use case, I'd check back once a year until the evolution slows down to make sure you don't end up regretting not jumping in.

[+] adamnemecek|11 years ago|reply
If you are using AMD, individual modules are cached separately, so when your application changes, only the corresponding module has to be re-downloaded.
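A toy registry illustrating why that per-module caching holds. Real AMD loaders like RequireJS add asynchronous script loading on top; this only shows the caching of evaluated modules:

```javascript
// Minimal AMD-style module registry: each module's factory runs at
// most once, and its exports are served from cache thereafter, so a
// changed module doesn't force re-evaluating the others.
const registry = {}; // name -> { deps, factory }
const cache = {};    // name -> evaluated exports

function define(name, deps, factory) {
  registry[name] = { deps, factory };
}

function requireModule(name) {
  if (!(name in cache)) {
    const { deps, factory } = registry[name];
    cache[name] = factory(...deps.map(requireModule));
  }
  return cache[name]; // cached: the factory is never run again
}
```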
[+] Illniyar|11 years ago|reply
There are a few issues with these:

1) "Server-side rendering can be faster" - the information in this part quietly ignores the fact that:

  * even if you have server-side rendering, you are still going to load external javascript/css files

  * browsers optimize multiple resource loading by opening multiple concurrent connections and reusing connections

  * you can and should use cdn (hence actually lowering the 'theoretical' minimum time)

  * browsers cache excessively - and you can make them cache even for longer

  * rendering on the server side takes a lot of CPU, and hence increases response time dramatically as more requests are made

6) While reloading the page when the code changes is a good idea, hot-updating JavaScript is a really bad idea - beyond the fact that it's terribly hard, will most likely result in memory leaks, and as far as I know no one is doing it, it would be extremely hard to maintain or debug.

The rest of the principles are quite true, informative, and should be practiced more often (assuming you actually have the time to engage in these kinds of improvements as opposed to making more features).

[+] valisystem|11 years ago|reply
Just to nuance your point 1) about server-side rendering, with some points in its favor:

• you only need HTML and CSS loaded to show content to your user, and the JS loads while the user is reading the content, so the JS has time to be ready by the first interaction.

• client-side rendering still feels slower than showing stuff with only HTML+CSS.

• for page content that changes a lot, if you rely on a CDN for HTML pages, you need to update content with JS on page load, and you either end up with a wait-while-we-are-loading splash or a blinking Christmas tree.

• if your HTML is small enough, the cache-checking round trip is not that much faster than loading the content itself, while JS rendering needs the cache round trip AND a data-loading round trip. You can eliminate some HTML round trips with cache expiration, but at the expense of reliable deployments.

• still, JS rendering/update can be slower than server side CPU, especially on mobile devices.

[+] abdullin|11 years ago|reply
This is not entirely true, I believe:

1. With server-side rendering browser can get HTML and display it before other resources are downloaded, parsed and executed (for JS).

2. In pre HTTP/2 world resource requests can be expensive, since browsers limit the number of outstanding requests.

3. Some users still use slow phones with slow and laggy network connection. Server-side rendering can improve experience for them a lot.

[+] Offler|11 years ago|reply
You can send a pre-rendered HTML shell with text content quite quickly when using server-side rendering with something like React. I think this gives users a very nice experience, especially compared to staring at a blank page!
[+] glifchits|11 years ago|reply
Wow, the views counter... Haven't read the article yet, just astonished at the rate of increase...
[+] beenpoor|11 years ago|reply
Do you know how the counter works? I am a JS noob. I see at the bottom of the script that he's updating the counter, but who is calling the update?
[+] aliakhtar|11 years ago|reply
> Server rendered pages are not optional

> Consider the additional roundtrips to get scripts, styles, and subsequent API requests

If you're using a framework like GWT, it compiles all of the relevant CSS files, JavaScript, and UI template files into one .html file. Then there are only one or two HTTP requests to download this HTML file, and the server only has to handle requests for fetching data, updating or adding stuff, etc. You can also gzip + cache this .html file, to make it even smaller.

It runs lightning fast, too.

[+] sanderjd|11 years ago|reply
Another problem I had with that point is that there is no theoretical limit to the number of requests that can be made in parallel. If it takes 100 requests to get all your data, and the browser can handle 100 requests in parallel, you can get all your data within that same theoretical 50ms latency floor. This is a pedantic point, because as far as I know all browsers currently limit parallel requests to a fairly low number, but it isn't true that more requests require more wall-clock latency in a theoretical sense.
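The point is easy to demonstrate with simulated requests: issued in parallel, total latency tracks the slowest single request rather than the sum. The delays here are stand-ins for network round trips:

```javascript
// A fake request that resolves after a fixed latency.
function fakeRequest(latencyMs) {
  return new Promise((resolve) => setTimeout(resolve, latencyMs));
}

// N requests in flight at once: elapsed time is ~one latency.
async function parallelFetch(count, latencyMs) {
  const start = Date.now();
  await Promise.all(Array.from({ length: count }, () => fakeRequest(latencyMs)));
  return Date.now() - start;
}

// The same N requests one after another: latencies add up.
async function serialFetch(count, latencyMs) {
  const start = Date.now();
  for (let i = 0; i < count; i++) await fakeRequest(latencyMs);
  return Date.now() - start;
}
```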
[+] megaman821|11 years ago|reply
This isn't as awesome as it sounds. When inlining everything there is no control over the prioritization of resources. Large files like images can block the rendering of your layout giving the appearance of being slow even if the overall download time is less. HTTP2 has stream prioritization to solve this problem and is much more cache friendly.
[+] nekitamo|11 years ago|reply
This might sound good on paper, but have you actually tried using a large GWT application daily? I use Google Adwords every day, and it's one of the slowest and most frustrating web experiences ever. And if Google can't get a GWT application right, who's to say you can?
[+] derengel|11 years ago|reply
But Google hasn't been pushing GWT for a long time now; they don't want more adoption of the framework. Instead, they want you to start using Dart, so beware of GWT's future.
[+] cbsmith|11 years ago|reply
Are you suggesting that we now need a sophisticated framework in order to concatenate files together?
[+] Cyranix|11 years ago|reply
Seems like a well-reasoned set of opinions at first blush. I'll have to give it more time to sink in for the most part, but the one bit that elicited immediate disagreement from me was the particular illustration of predictive behavior. There is unquestionably value in some predictive behaviors (e.g. making the "expected path" easy) but breaking with the universal expectations of dropdown behavior doesn't seem like a strong example to follow.
[+] EGreg|11 years ago|reply
Funny enough I've had to deal with many of these when implementing http://platform.qbix.com

I pretty much agree with everything except #1. Rendering things on the server has the disadvantage of re-sending the same thing for every window. I am a big fan of caching and patterns like this: http://platform.qbix.com/guide/patterns#getter

You can do caching and batching on the client side, and get a really nice consistent API. If you're worried about the first load, then concatenate all your js and css, or take advantage of app bundles by intercepting stuff in phonegap. Give the platform I built a try, it does all that stuff for you, including code updates when your codebase changes (check out https://github.com/EGreg/Q/blob/master/platform/scripts/urls... which automagically makes it possible)

I would say design for "offline first" and other stuff should fall into place.

[+] quarterwave|11 years ago|reply
For real-time updates in response to user actions, which is a bigger concern: average latency, or its variance?

Example: Server generates a sine wave which gets displayed as a rolling chart waveform on the client. As client spins a knob to control the amplitude, the server-generated stream should change (sine wave is a trivial example, representative of more complex server-side computation).

[+] lambeosaurus|11 years ago|reply
The real-time updates he's talking about don't require server-side processing - the google homepage switching immediately to the search view for instance - that processing can be contained within the Javascript application, and state is simply maintained against the server (and then by extension across other instances of the application).

I don't imagine he's suggesting we try the same approach where server-side processing is required.

If I have misunderstood you then I apologise.

[+] einrealist|11 years ago|reply
I really like the simplistic principles http://roca-style.org defines for web applications.

I find single page applications way too complex. The amount of code duplication is horrific. So everyone ends up building platforms like GWT or Dart in order to hide that overhead. But that does not mean that things get simple.

(Maybe I'm getting old.)

[+] pluma|11 years ago|reply
I can see where you're coming from but I find that React (with node on the server and a RESTful database) eliminates a lot of the code duplication because I can run the same view rendering logic on the client and the server.

ROCA is an appealing idea, but my concern is that in order for the API-is-the-web-client approach (which ROCA as I understand it seems to advocate) to work, you end up mixing two entirely separate levels of abstraction: what may be a good abstraction at the API level may not be a good abstraction at the UI level. It's sufficient if your web app is just an API explorer, but not every app lends itself to that.

You could say that then we shouldn't be building those apps, but that's simply not realistic.

[+] daigoba66|11 years ago|reply
A neat example is github.com. When browsing a repository it refreshes only the relevant part of the page. But the URL changes and can be used to navigate to a specific resource.

But as the article points out is often the case, at github.com the HTML loaded does not include the already rendered resource; it must be pulled in via a separate request.

[+] lucaspiller|11 years ago|reply
The way GitHub works is pretty decent, but also pretty basic. It uses PJAX, so the HTML is still rendered on the server but the body content is updated.

It still has a few issues though. I work on flaky connections now and again and sometimes it just gets stuck - it would be nice if the request were retried automatically after a few seconds.
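That retry behavior is simple to layer on. A sketch (`doRequest` stands in for whatever actually performs the fetch, and the retry counts and delays here are assumptions, not anything GitHub does):

```javascript
// Re-attempt a failed request a few times with a growing delay
// before giving up and surfacing the error to the user.
async function withRetry(doRequest, { retries = 3, baseDelayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await doRequest();
    } catch (err) {
      if (attempt >= retries) throw err; // out of attempts: give up
      // exponential backoff: 1x, 2x, 4x the base delay, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
}
```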

[+] jamesbrewer|11 years ago|reply
Great example! Github needs to give their code some TLC though. It seems like new comments on a pull request don't show up until after I push a new commit.
[+] jfroma|11 years ago|reply
> "Server rendered pages are not optional"

I don't get this; in my opinion they are optional. You can show the iOS PNG placeholder (shown in the next item), which is very static and cacheable content, while fetching your highly dynamic data from a database or somewhere else.

It feels like the first principle contradicts 2, 3 and 4.

[+] jamesbrewer|11 years ago|reply
The solution you offer is exactly what server rendering is meant to stop. You should load content as fast as possible and that means rendering it on the server.

Please stop putting loading icons and spinners where your content should be.

[+] darkmarmot|11 years ago|reply
Just one thing to point out: it seems as if a lot of your SPA arguments are predicated on the idea that apps don't chunk and/or stream their logic. While the front-end SPA framework I use is currently pretty bad for SEO, almost none of the download or latency issues are applicable...
[+] dllthomas|11 years ago|reply
I think these are great ideals to strive for, but they seem lower priority than a couple things that they can get in the way of if you're not careful.

First, in your quest to show me the latest info, please please please don't introduce race conditions into my interface. I don't want to go to hit a button or a key or type a command, and have what that means change as I'm trying to do it.

Second, it's often important to me what has happened locally vs. what is reflected on the server (especially if that's public). Please do update the interface optimistically in response to my actions rather than sitting and spinning, but please also give me some indication of when my action is complete.
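A sketch of the pattern being asked for, with hypothetical names (`saveToServer` stands in for the real network call): apply the change immediately, but keep a pending flag the UI can surface until the server confirms.

```javascript
// Optimistic update with a visible pending state: the item appears
// in the list immediately, and the flag flips only once the server
// has acknowledged the write (or records a failure honestly).
function createStore(saveToServer) {
  const items = [];
  return {
    items,
    add(text) {
      const item = { text, pending: true }; // shown at once, marked pending
      items.push(item);
      return saveToServer(text)
        .then(() => { item.pending = false; }) // confirmed: clear indicator
        .catch(() => { item.error = true; });  // failed: surface it, don't lie
    },
  };
}
```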

[+] MrBra|11 years ago|reply
"A slightly more advanced method is to monitor mouse movement and analyze its trajectory to detect “collisions” with actionable elements like buttons."

Is this a joke?

[+] derengel|11 years ago|reply
When you are developing a web application for phone, tablet, and desktop, is it a good principle to use the same HTML for all three and separate CSS for each device? Is there a case where this would cause problems?
[+] bliti|11 years ago|reply
It depends on how this is set up. If you are using a template engine then I don't see why this would be a big deal, as long as it's a technical decision followed throughout the project. If you are not using a template engine and are instead using JavaScript to throw things around (like a bunch of jQuery piled on top of itself) then it becomes an issue.

Are you using any kind of server side framework (Like Django/Rails)?

Are you using any kind of client side framework (like Angular)?

Are you using any kind of layout framework (like Bootstrap)?

[+] dmak|11 years ago|reply
After a certain scale, it's better to just decouple mobile and tablet from web.