This is what I tell potential customers who are evaluating our product, which creates single-page web applications where the JS dynamically creates all content: if you want search engines to crawl the content, then you're not using the right product. It's a surprisingly easy decision when viewed in those terms. Most SaaS products that would likely be implemented as SPAs don't care about public search engine access, and are only used by B2B customers, where you can dictate a certain minimum standard in terms of browser versions (e.g. which versions of IE you support).
Everything else should be a traditional static web site with "decorative" JS, or "progressive enhancement".
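To make "progressive enhancement" concrete, here is a minimal sketch (the `enhanceForm` helper and its wiring are illustrative assumptions, not from the comment): a plain HTML form keeps working without JavaScript, and script only layers an AJAX upgrade on top when the needed APIs actually exist.

```javascript
// Progressive enhancement in miniature: the plain HTML form submits
// normally on its own; this helper only upgrades it when a fetch
// implementation is actually available. (Illustrative sketch.)
function enhanceForm(form, fetchFn) {
  if (!form || typeof fetchFn !== "function") {
    return false; // no enhancement applied; baseline behaviour remains
  }
  form.onsubmit = (event) => {
    event.preventDefault(); // enhancement only; never required for posting
    fetchFn(form.action, { method: "POST" });
  };
  return true;
}

// Feature-detect before enhancing, so old browsers keep the baseline:
const commentForm = { action: "/comments" };
console.log(enhanceForm(commentForm, () => {}));  // true  (upgraded to AJAX)
console.log(enhanceForm(commentForm, undefined)); // false (plain HTML fallback)
```

The point is that the script is additive: if it never runs, nothing breaks.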
The "holy grail" architecture, whereby the first request sends a fully formed HTML response which is then immediately hydrated with a JS frontend app that takes over routing and data gathering, is now pretty technically feasible. It's the best of all worlds: fast static content for the first load, and fast frontend routing thereafter. If you steer your clients away from this path you're doing them a disservice. With this stack JS web apps no longer need to be app-like; they're well suited to content-heavy sites too. They even work with JS turned off :)
Can't upvote hard enough. Applications on the web are amazing, and developers must embrace JavaScript heavily.
That said, the apps do need to remain accessible to those with screen readers, vision issues, motor impairments, and cognitive issues. They need to be accessible to those with slow or spotty connections. But search engine optimization is about content sites, not necessarily apps.
Static pages are perfect for those cases, and I am thrilled to see more people coming around to that.
Why does it have to be one or the other, though? You can have static, pre-rendered landing pages for search engines to crawl and index, and your actual application can be an SPA. Right?
I generally agree with the article, but the examples of javascript web apps that were given seem weak:
> Most web apps I work with daily have highly sophisticated in-browser interactions that are built with JavaScript and can only be built with JavaScript: Flickr, YouTube, Facebook, Twitter, GMail etc
The web apps that I use the most day-to-day are Trello, Slack and Gitter. IMHO, those are better examples of js bringing actual value to the table that progressive enhancement simply cannot.
With that being said, the issue of overusing SPA technology when it doesn't fit the need is definitely real.
Part of the issue comes from people wanting to pad their resumes w/ experience in "hot" technology, or people who do have a genuine interest in improving their skills but are not very skilled at identifying the pros and cons of whatever "new hotness" or "best practice" they read on their favorite news aggregators. By extension, this creates an industry of grunt work to maintain/refactor/rewrite ever-changing codebases/frameworks. Coupled with the general tendency of people to favor new shrink-wrapped libraries over doing good ol' painstaking research, it's difficult to reverse that trend.
I am a JS dev (after a lot of experience in statically typed languages) and I'm just amazed to see the comments in this thread. At one end are experts (or near-experts), and at the other end are people who probably did some $(...).show() once and call it JavaScript.
No wonder it's hard to hire good JavaScript developers. People just don't appreciate how wonderful this language is.
>> ignorants who probably did some $(...).show() and call it javascript
So was that a dig against jQuery?
>> People just don't appreciate how wonderful this language is.
Ever try adding a string and an integer in JavaScript?
I think developers who think JavaScript is great love it because it's forgiving. Do what you want, the JavaScript siren says; it might work, it might not, but at least you won't get an ugly hard stop. I'll try to make it do something, anything, sings the siren.
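For anyone who hasn't hit the string-plus-integer question above: JavaScript silently coerces rather than erroring, which is exactly the forgiving-to-a-fault behaviour being described.

```javascript
// Implicit coercion: + with a string concatenates, - always goes numeric.
console.log(1 + "2");          // "12"  (number coerced to string)
console.log("3" - 1);          // 2     (string coerced to number)
console.log("5" + 1 - 1);      // 50    ("5" + 1 -> "51", then "51" - 1 -> 50)
console.log(Number("5") + 1);  // 6     (explicit conversion avoids the trap)
```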
It's hard to hire good JavaScript developers because knowing JavaScript doesn't imply that you know how to be a good programmer. Nowadays you can string together 500 node modules and make a demo that's impressive on the surface. The moment that dev needs to dig deep and debug something, or create something from scratch, they can't.
I agree that Javascript web applications have a place. The problem is that the pendulum has swung too far, and folks are using Javascript everywhere. Static content generation on the client side, rendering non-interactive content, on the backend, in the build process, as a generic scripting language... the list goes on.
We're watching programming and design concepts which played out back with Windows 95 being hailed as "the future of programming". We're being told "worse is better" as Javascript is being used to turn our computers against their users by serving as a vector for malware, ads, and bots.
It's a mess, and it's going to take pushes from the far side of the pendulum's swing to bring it back to a sane place. If the pendulum was actually stalled too far towards "server side only", I'd agree that "server side only" articles would be too much - but we're not there.
> We can only improve the user experience with the front-end technologies we have.
No, we are more than capable of degrading the user experience with the front end technologies we have. There's a lot of proof of that on the web today. Delayed full page ads, pop-overs for every third word, hijacked scrollbars, malware installs...
JavaScript has grown up; it's got one of the best package managers anywhere, rapidly improving syntax and semantics, a large passionate community, loads of great tooling, and a well-maintained cross-platform binary in the shape of Node. It's no less deserving than any other language to be run on a server, as part of a build, in generic scripts, whatever. Browser security issues arise because it's hard to reconcile total user security across the public web with highly interactive, scriptable web pages; it's got nothing to do with the JavaScript language per se. It's fair enough to argue against the level of control scripts happen to have over the browser, but web page scripting could be implemented in <insert some other language here> and the same issues would be present. Your comment comes across as ignorant and prejudiced, I'm afraid.
I am a huge fan of JavaScript myself, mostly because I find it very fun to work with and because at the moment you can do anything with it (websites, games, native mobile apps, smart TV apps, programming Raspberry Pis and much more). About single-page apps: don't build a SPA if your website is not really an interactive application. Do you have a game, a photo editor, P2P video chat? If yes, go ahead; a SPA is really great for those things. But for God's sake do not create a single-page app when your website actually has multiple pages with different content and no interactivity. Why try to emulate how a browser works (loading, rendering, etc.) inside the browser itself?
> Most web apps I work with daily have highly sophisticated in-browser interactions that are built with JavaScript and can only be built with JavaScript: Flickr, YouTube, Facebook, Twitter, GMail etc.
There is truly nothing about any of those that couldn't be done with a simple page refresh. Especially YouTube. Generally I find most of what JavaScript adds just irritating.
I feel like Gmail reached a peak of useful JavaScript just slightly past what the basic-HTML version offers, and everything since has just slowed it down and made it eat more RAM for little benefit. Sometime between then and now I'm sure they added some "helpful" frameworks that have contributed to the problem. Inbox is even worse.
Our children won't believe we ever browsed the web in a graphical multitasking OS with 64MB of RAM and a single processor core. "Don't be silly, that wouldn't even be a sixth of what you need to load Google Inbox!"
An unanswered question in this debate is: what is a web app and what is a web site? Sometimes the boundaries are blurry and sometimes they are clear.
For example, should a blog rendered in the browser be treated as a web app? Or should it be rendered principally as HTML and CSS before it's sent to the browser, rather than rendered in the browser using JavaScript? Here's an example: over 300K of JavaScript to render a plain text page: http://elm-lang.org/blog/new-adventures-for-elm
My impression is that some developers are beginning to treat anything rendered in the browser as a web app - even plain web pages with no dynamic elements.
Why do developers do this? Is it because it makes their life easier using a single tool or language for both dynamic and static content? Or is it because they want to learn a new framework that's popular or interesting? Whatever the reason, one has to ask if giving users a good and fast site experience comes anywhere into the equation.
Each time yet another hipster posts his unique-snowflake insights about why we absolutely need yet another piece of bloatware to build websites, on this very site (which is just static HTML pages over FS storage, in less than a megabyte of code, including a dialect of Lisp and related DSLs), a kitten dies somewhere.
JavaScript web apps are fine, as long as they are "isomorphic", i.e. the front-end code is also run on the server and the resulting HTML is sent to the client, and as long as it's possible to at least navigate to and read all the content without having JavaScript enabled.
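A minimal sketch of that "isomorphic" idea (hand-rolled here for illustration; real setups use a framework's renderToString-style API): the same view function produces the HTML on the server for the first response, and runs again in the browser after hydration.

```javascript
// One view function, shared by server and browser. On the server it
// produces the fully formed HTML for the first response, so the content
// is readable (and crawlable) even with JavaScript disabled.
function renderArticle(article) {
  return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
}

// Server side: embed the rendered markup in the response body.
const html = renderArticle({
  title: "JS web apps",
  body: "Readable without JavaScript.",
});
console.log(html);

// In the browser, the same renderArticle is reused once the client-side
// router takes over navigation after hydration.
```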
Am I the only one wondering what the fuck "progressive enhancement" is? Even the "enhanceconf" website doesn't explain.
>We're convinced progressive enhancement remains an important aspect in pushing the boundaries of the web while still providing a robust experience for every user.
What?
EDIT:
>Your HTML is not more accessible or more semantic when it’s rendered on the server.
IDIOT DETECTOR TO FULL POWER. It is by definition more accessible if it doesn't require Javascript to render.
> Most web apps I work with daily have highly sophisticated in-browser interactions that are built with JavaScript and can only be built with JavaScript: Flickr, YouTube, Facebook, Twitter, GMail etc.
Good thing GMail is created with GWT, so it doesn't even use JavaScript.
Most people are so afraid of JavaScript that they create hella crazy abstractions (AngularJS [Google], ReactJS [Facebook]), and even then Microsoft and Google try to avoid JavaScript more and more with TypeScript or Dart: they transpile it.
JavaScript may be valuable on the web, but not in its current form. It has so many quirks that you either spend your time using a library or fixing browser incompatibilities.
A lot of people here have said that transpiling will be used a lot in the upcoming years, and I think so too. People will do it more and more; they want to use Python/Scala/Java/whatever on both sides.
Also, it will reduce complexity by a lot.
Currently your typical webapp has a server-side technology (even if you are using Node), plus a web frontend which uses at least npm; mostly, though, you end up with npm, bower, gulp/grunt, webpack and whatever else.
> Client-server architectures instead of monoliths
What about small teams? If they split their monolith, it becomes unmanageable: once there are two apps, a frontend and a backend, it's a mess to manage both projects with a small team.
> We need to stop excluding JavaScript apps from “the web as it was intended”. JavaScript apps are “of the web”, not just second-class citizens “on the web”.
In my eyes JavaScript should be used where it is needed, but not used in stuff which never should touch it.
JavaScript is still a mess.
Gmail is not built with GWT, it's built with Closure Compiler.
Inbox is built with GWT and Closure Compiler together, as a hybrid app, with the UI done in Closure/JS and the business logic done in Java so it can be shared with Android and iOS.
Over the past 2 years, the GWT team has been honing a new JsInterop spec that removes the majority of the impedance mismatch between Java and JS for hybrid apps, which will be part of the 2.8 release.
One of my favorite features is automatic Java 8 lambda <=> JS function conversion. Any single-method interface in Java, when marked with @JsFunction, can be passed to JS, where it can be called as a regular JS function; likewise, any JS function can be passed back to any Java method accepting a single-method interface, and it will pretend to implement the interface.
You can now (in GWT 2.8) write code like $("ul > li").click(e -> Window.alert("You clicked me")), with only a few lines of code to interface with jQuery, for example.
I have to disagree. Adding another layer of abstraction cannot reduce complexity, although it might make certain tasks easier. Consider assembly vs C as an example: C can make lots of programming tasks way easier, but the complexity of the system is not reduced.
With C, this becomes apparent when the program segfaults. A garbage-collected language or Rust can prevent segfaults using a subsystem to manage and/or enforce memory ownership, again making lots of tasks easier, while adding even more complexity.
In the case of transpile-to-JS languages, these can fix many of the shortcomings of JS and make lots of tasks easier, but they are a more complex system which can cause additional work if the generated code fails at run time, and the browser debugger brings up something completely different from your source code.
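That source-versus-generated-code mismatch is easy to picture. The ES2015 you write and the ES5-style output you end up stepping through in the debugger (the second function is hand-written here to mimic typical transpiler output) behave identically but look quite different:

```javascript
// What you wrote (ES2015): a concise arrow function.
const double = (x) => x * 2;

// What the browser debugger may show you instead (typical ES5 output):
var doubleES5 = function (x) { return x * 2; };

// Same behaviour, different shape; source maps exist to bridge this gap.
console.log(double(21), doubleES5(21)); // 42 42
```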
Your point that we will see more transpiled languages in the future makes sense.
"Most people are so afraid of JavaScript that they create hella crazy abstractions (Angularjs [Google], ReactJS [Facebook])"
Much as I dislike JavaScript, every other language uses frameworks or "abstractions" to be more efficient. Why the hell would I reimplement all that crap myself in Python when Flask or Django have done 90% of my app for me.
> Most web apps I work with daily have highly sophisticated in-browser interactions that are built with JavaScript and can only be built with JavaScript: Flickr, YouTube, Facebook, Twitter, GMail etc.
Highly sophisticated browser interactions? You mean adding a comment? Which you actually just did on Hacker News using zero JavaScript.
None of those sites need or require a lot of JavaScript. In fact, when I use Gmail I have the basic HTML version set as my default. It loads much faster, and time-to-inbox is shorter.
> The web apps that I use the most day-to-day are Trello, Slack and Gitter.

Never used them; are they important? What do they do? I've used all the ones cited as bad examples.
> Is it because it makes their life easier using a single tool or language for both dynamic and static content?

Mostly this. As soon as you add a bit of dynamism, it becomes easier to do everything in one language and workflow.
You can see a deeper dive into JsInterop here: https://drive.google.com/a/google.com/file/d/0BwVGJUurq6uVR1...
> Most people are so afraid of JavaScript that they create hella crazy abstractions (Angularjs [Google], ReactJS [Facebook])

Angular and React are not abstractions of JavaScript the language. They are abstractions of the DOM API.