
js;dr = JavaScript required; Didn’t Read.

230 points | ezist | 5 years ago | tantek.com

215 comments

[+] mistersys|5 years ago|reply
Some people don't seem to understand what the whole JS SPA thing is about, and it's quite strange to me.

It's not popular because it's a fad, it's not about replacing good old static websites with fancy over-engineered JS code.

It's about making desktop-class applications more accessible via the web. Desktop-class apps have lower latency requirements than server-rendered frameworks are capable of delivering, plain and simple. You could certainly build Facebook as a fully server-rendered PHP app, but that would hurt Facebook's business because its servers would need to do more work and its users would have to wait longer for content.

Fully server-rendered frameworks are not capable of delivering low-latency desktop-class applications. If your app doesn't require low-latency updates, then you can certainly use a classic PHP or Ruby on Rails stack with no problem.

Sure, you can use jQuery-style code to make your PHP app more interactive, but you're probably just going to end up with a messy, hard-to-understand JS codebase eventually if you don't have some sort of low-latency client-side declarative templating framework.

There are certainly some companies that are caught up in the hype and build an SPA when they would probably be better served by a simple PHP site. But on every project I've worked on that used React, across a few companies (interactive apps for, e.g., cropping and manipulating images, building visualizations, and reporting on data), not using an SPA framework would have slowed down development dramatically or left us with a really poor product.

[+] kortilla|5 years ago|reply
> It's about making desktop-class applications more accessible via the web.

Everyone understands this is what devs are trying to do. The complaint is that my local newspaper doesn’t need a desktop class application. Nor does my bank, nor does Reddit for that matter.

There are vanishingly few websites that need a “desktop application” performance profile. Most websites are just viewing documents. The size of the SPA frameworks is frequently higher than the actual content being viewed.

Finally, if matching desktop app performance is truly the goal, the majority of SPAs fail horribly. Poor request patterns and transitions make the pages just as slow as server side rendered html. I would rather wait one second to get a JS-free response than look at another damn spinning circle for 5 seconds as the page sputters new elements in.

[+] pkage|5 years ago|reply
HN gets frothing mad about javascript, and to some extent I agree: sites that are fundamentally about content (news websites, blogs, etc.) should not be locked behind JS gates. The amount of code bloat on those pages tends to be related to tracking and advertising, so not running JS on those pages makes sense.

On the other hand, web applications that provide full-featured experiences are only possible because of the full spectrum of web technologies. Choosing not to run JS and claiming that Google Docs should work without JS is ridiculous.

[+] 4bpp|5 years ago|reply
> There's certainly some companies that are caught in the hype and build a SPA app when they would probably be better served with a simple PHP site

Yeah, like your example, Facebook. Ever since their redesign, I've switched to only using the mobile+noscript site (on desktop), because the SPA version is resource-hungry to the point that it regularly DoSes whatever browser thread it gets assigned to and has UX that, ironically, seems to be terrible as anything other than a mobile app (they've replaced text with abstract square "touchable" buttons and introduced airy spacing everywhere that allows you to see maybe about half a post per screen).

It's as if its designers have been trying to ram their mobile app down my throat for years (by nagging screens at first, and then by outright removing the ability to view private messages from the mobile page - except sometimes by refreshing sufficiently many times you could trigger a bug and still drop through to the old messenger view, adding insult to injury), and when I still didn't bite, they decided to replace the whole service (which up until then had been one of the last remaining decent mainstream websites) with a facsimile of one.

[+] aboringusername|5 years ago|reply
> It's about making desktop-class applications more accessible via the web.

I am not sure what "web" you're using, but as someone who uses noscript and has been enabling scripts for over a year now, I can firmly say JS is NOT used for making "desktop-class applications more accessible".

It's used for ads. And spying. Lots and lots of spying.

Seriously.

Googletagmanager/analytics is everywhere, it doesn't deserve to be, it doesn't need to be. That domain needs to die a painful, horrible death.

facebook, twitter, sessioncam, and many others are used to bloat pages, increase my energy usage, decrease my battery quicker and contribute to the wastage of energy on an unprecedented scale.

Just ask yourself: how much money, battery life, and bandwidth is spent every year on downloading useless scripts that, as far as I can see, offer no value whatsoever? By selectively deciding which scripts to enable I get the following results:

1: pages are lighter, less bloated, and STILL WORK

2: I download fewer scripts, use less battery and save bandwidth and energy for myself and all humanity.

3: There is less spying, as fetch() requests are blocked, and there can be hundreds of them across a single web-page session (watch a really bloated page for 10 minutes and there can easily be hundreds of requests).

Test: load the following two pages, study their usage, test each with JS enabled/disabled:

1: https://old.reddit.com (with JS: 2.69 MB, without: 2.34 MB)

2: https://reddit.com (with JS: 8.58 MB, without: 8.10 kB, but the page is broken...)

Bearing in mind old.reddit works fine with JS disabled, this shows JS is not needed for a site like reddit to work at all.

Yet "web developers" use them as if it's nothing. Pages that take seconds to execute, possibly megabytes of data, and all the additional requests that are made to enrich the likes of FB/Google et al.

So no, SPAs are a scam and the web is worse today than I remember back in the 2000s; at least we didn't have cookie pop-up boxes because people can't help but abuse JS.

Most of the GDPR violations I've found thus far are from scripts that have no place or purpose, that slurp up user data without remorse, that, if disabled, don't impact the functionality of the page, and that enable the great surveillance capitalism and data-raping we are seeing today.

[+] ori_b|5 years ago|reply
> It's not popular because it's a fad, it's not about replacing good old static websites with fancy over-engineered JS code.

Yet, in practice, this is what happens.

When turning JS off is an option, that's usually one of the biggest improvements I can make to my page surfing experience. Things load faster, ads don't expand over content, the page reflows less often as things are injected, shit doesn't autoplay, and spinners and animations don't distract. I can just read the content.

It's possible to use JS well, but I don't see it happen very often. And the unpleasantness from its abuse usually outweighs the benefits of its good use.

[+] h_anna_h|5 years ago|reply
> its servers would need to do more work

So far I have seen no evidence that this would be the case (rather, I have reasons to believe the contrary).

> and its users would have to wait longer for content.

They would actually have to wait less. Recently two of the sites that I used to use switched to React. I used to be able to have literally hundreds of tabs open with them, each of them loading almost instantly (excluding media). Now I can barely hold 2 open, and they load slowly (even if we ignore the time it takes to load the media).

[+] pmlnr|5 years ago|reply
> It's about making desktop-class applications more accessible via the web.

...

Go and take a look at a "desktop class application" from 10 years ago - say Photoshop 6. Compare the speed, the UI, the native look & feel.

Go to a SPA today.

Cry.

[+] austincheney|5 years ago|reply
No, it’s a fad. Most corporate JavaScript developers cannot actually write JavaScript of their own. They need giant frameworks to do the heavy lifting and the output of such is a SPA. If you really wanted accessibility you wouldn’t complicate your life with screen reader compliance via a SPA.

To be clear, an SPA is generally a front end for user input, such as a form (or series of forms) cobbled together so that a page is loaded fewer times. This is a traditional web path that exchanges page traversal for interaction, often to maintain state locally. Conversely, a browser application is an application that executes in the browser without regard for server data and state, such as Photoshop or a spreadsheet in the browser, which isn't concerned with any server application.

[+] klyrs|5 years ago|reply
> It's about making desktop-class applications more accessible via the web.

The problem I see is folks unnecessarily turning their websites into desktop-class applications. It's especially popular on ecommerce sites -- today I tried to look at some lumber prices and the website had input latency measured in seconds before my phone's browser just crashed.

[+] mlang23|5 years ago|reply
> It's about making desktop-class applications more accessible via the web.

If we, for a second, stick to the meaning accessibility used to have, namely usability by people with disabilities, the reality is quite the contrary. The SPA trend of "let's just move every app into the web" fuels the digital divide like nothing else. It has become harder and harder to actually use the modern web, and a lot of why that is comes down to SPAs and JS.

[+] dsego|5 years ago|reply
The current fb/messenger interface is a disaster. Transitions are slow, and they still haven't gotten the local state right or coordinated the different parts of the UI. I've stopped using messenger because of this; I managed to send messages to the wrong recipient more than once because of latency in the UI.
[+] kilburn|5 years ago|reply
> Some people don't seem to understand what the whole JS SPA thing is about, and it's quite strange to me.

Most people understand the promises of SPAs, but there are several forces that play against everything you said:

- Some websites don't need a desktop experience yet they go the SPA route.

- SPAs are -in my experience- significantly harder to get right than server-rendered apps. For sites that are a good fit for "the regular old web" we are speaking about at least an order of magnitude here.

- Oftentimes it is companies/products with significant resources that embark on the SPA ordeal. This usually means that they also have several teams demanding product analytics, A/B testing and what not, and hence their sites end up loaded with random shit (gtm, analytics, optimizely, facebook pixel and the kitchen-sink).

For all these reasons, it takes an extraordinary (i.e: significantly better than average) team, from the developers all the way to management, to deliver on the SPA promise.

As a result, most SPAs suck, and hence a lot of people cultivated an aversion to them. It really is that simple.

[+] mattl|5 years ago|reply
I don’t want to run your proprietary JavaScript on my computer.
[+] cosmotic|5 years ago|reply
All the popular SPAs I use have awful latency. It may have been a design goal, but huge failures all around.
[+] drinchev|5 years ago|reply
> You could certainly build Facebook as a fully server-rendered PHP app, but that would hurt Facebook's business because its servers would need to do more work and its users would have to wait longer for content.

I think they do that mostly for the users, who would otherwise have to download the header 100 times, once for each action they take. I'm not really sure that all the benefits of an SPA are hidden in company costs; mostly they are in the modern tech approach.

Writing this, I still agree with the article: an SPA is needed when you're behind authentication, but publicly you can live with a progressive web approach.

[+] cellar_door|5 years ago|reply
Also, SPA and server side rendering are not mutually exclusive. Both make sense as performance optimizations depending on the context.
[+] asddubs|5 years ago|reply
IMO you can have your cake and eat it too - just offer a fallback of some sort; it's good for SEO anyway. I haven't done anything in React, but it's my understanding that this is actually possible: have React deliver an initial pre-rendered page and then still have your fancy SPA on top of it.
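A framework-free sketch of that idea (names are illustrative; in React itself this is `renderToString` from react-dom/server plus client-side hydration):

```javascript
// Sketch of server-side pre-rendering with a client-side takeover.
// The same render function runs on the server (to a string) and could
// re-run in the browser to "hydrate" the markup with interactivity.
function renderPostList(posts) {
  const items = posts.map((p) => `<li>${p.title}</li>`).join("");
  return `<ul id="posts">${items}</ul>`;
}

// Server: ship readable, crawlable HTML before any script runs.
function renderPage(posts) {
  return [
    "<!doctype html>",
    "<html><body>",
    `<div id="root">${renderPostList(posts)}</div>`,
    '<script src="/app.js"></script>', // the SPA attaches on top of this markup
    "</body></html>",
  ].join("\n");
}

const html = renderPage([{ title: "Hello" }, { title: "World" }]);
```

A no-JS visitor (or a crawler) gets the full list; the SPA only adds interactivity on top.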
[+] shadowgovt|5 years ago|reply
To a first approximation, you can do all your heavy-cpu computation on the server and have the client be a close-to-dumb terminal or you can have the server do some preprocessing and the client do some postprocessing.

One spreads the load over CPUs better.

[+] wyager|5 years ago|reply
> Fully server-rendered frameworks are not capable of delivering low-latency desktop-class applications

Evidently, neither is JavaScript.

[+] bonoboTP|5 years ago|reply
People don't care about 10 years from now or edge case users. It matters how it looks today and that it drives engagement and profit now. People aren't making their websites for eternity. It's like lamenting that the billboards on the street are not archived or the menu card of the kebab shop is not available as it was 10 years ago. The web is there to provide functionality and satisfy business needs.

It's usually not the hobbyists who pack their sites with all the modern fancy js.

[+] paxys|5 years ago|reply
> Because in 10 years nothing you built today that depends on JS for the content will be available, visible, or archived anywhere on the web.

What does that even mean? Saying browsers won't support JS in 10 years is an idiotic claim. Even more so considering it was written 5 years ago.

Beyond that, I don't know why people think that the web should just be either simple pages for serving content or massive desktop-class applications. There is an entire world between these two that is perfectly valid. Yes I want to host a blog but I also want it full of widgets and other fancy JS. I want to use new frameworks and rewrite it every few months. It may be messy, it may be sometimes inaccessible, and yes it may not be available a few years from now. But the web is and always has been about creativity and expressing yourself in any way you want. Heck my personal site ran on Flash once upon a time.

People here are the kind who would complain about geocities or myspace pages back in the day ("why is it all so flashy? Why can't it just be a simple page of text?")

[+] giantrobot|5 years ago|reply
If your "web page" is essentially just a script tag and relies on content through first and third party APIs it's going to be pretty difficult to keep that running long term.

If an API shuts down, an incompatible change is pushed, or a JS CDN goes defunct, that JavaScript-only site breaks. JavaScript-only sites are pretty fragile.

If the same content was just a normal HTML document it could at least be easily archived. It's also trivial to keep online. Even if some CDN dies and CSS or JS doesn't load it's still readable.

JavaScript developers have never been great about progressive enhancement but it seems of late they've gotten worse. If the sole purpose of a site is to be an app then tons of JavaScript is necessary. But that's a minority of sites. A blog post or news article doesn't need to dynamically build the DOM with JavaScript and short circuit all of a browser's loading and rendering logic.

[+] leonixyz|5 years ago|reply
It has become ridiculous how many pages simply render blank without JavaScript, and many are just simple text-and-image articles without any functionality at all. I agree 100% with the author of this article. The whole concept of throwing Angular/Vue/React at a mere blog doesn't make any sense in the first place.
[+] osrec|5 years ago|reply
Some people seem unable to embrace the fact that the web is as much an app delivery platform as it is a content delivery platform.

You simply can't get away from this fact, even if you don't like it.

[+] aboringusername|5 years ago|reply
What staggers me is when sites like Twitter decide that JS is now required to read a tweet... Like, it's literally some text, like what you read on HN, but it needs damn JS... Why?! The entire website could work as a simple .txt file for all I care; there's no reason it should be so bloated.

All sites like FB, Twitter should by law be required to work without JS at a minimum due to how important they are in our society today.

[+] progval|5 years ago|reply
Twitter's goal is explicitly to have a single codebase that works everywhere: desktop website, mobile website, and mobile apps. (Sorry I can't find a source for this, any search with the "twitter" keyword is flooded with irrelevant results...)

It makes sense in that context; but I agree it's a shame they won't keep a small alternative frontend just to view tweets.

[+] shamas|5 years ago|reply
Why? Why should you be in a position to dictate to twitter how to operate. It's a multi-million dollar company with many smart people working away on the product. Who are you? If you just want walls of text and a completely unengaging product, stick to hackernews.
[+] hexo|5 years ago|reply
This. When I get a blank page instead of content, I usually ignore the site completely. I do have first-party JS enabled, so if the site is actually a JS-built SPA and it serves its own JS, then it renders normally without me noticing anything. But when I have to figure out which of the fifty third-party script-serving servers I need to enable, I gladly skip the whole site. Happy to see I'm not alone here.
[+] shamas|5 years ago|reply
Why would you browse the internet this way? Why not just enable the scripts and close the site if it's annoying?
[+] no_wizard|5 years ago|reply
Strong opinions follow

Flatly, this is my observation and I hadn’t as of the time of this post really seen it mentioned:

JS is overused not just because of tracking and ads, though that is a big part of it on all the popular websites I've visited in the last 30 days (side note: thanks, uBlock Origin!).

It’s also in large part because, I strongly believe, frontend developers are reinforced to think this way. I see a lot of blogs, community meetups, and conferences organized around leveraging JS and specifically the large frameworks, which is okay! However, it only reinforces the idea that JavaScript-first solutions are implicitly better, rather than emphasizing leveraging the whole tech stack correctly. I have friends I respect very much who have largely gotten by with just a passing knowledge of CSS and HTML and haven’t yet gained a deep understanding of when it’s more appropriate to leverage those technologies over JS, let alone the trend of pushing so much work to the client (such as not even bothering to scope APIs correctly: how many times have you had to sort the results of an API request because it is not sent over the wire sorted for its use case, even though your team controls the API?). The industry does not enforce holistic thinking when it comes to this. That is the real problem to me.
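To make the API-scoping point concrete, a minimal sketch (the endpoint and field names are hypothetical): sort once on the server, in the shape the client actually needs, instead of making every client re-sort the payload.

```javascript
// Hypothetical endpoint handler: the UI wants "newest first", so the
// server sorts and trims the rows before they go over the wire.
const rows = [
  { title: "old post", ts: 1610000000 },
  { title: "new post", ts: 1620000000 },
  { title: "mid post", ts: 1615000000 },
];

function listPostsResponse(allRows) {
  return allRows
    .slice() // avoid mutating the source data
    .sort((a, b) => b.ts - a.ts) // newest first, matching the use case
    .map(({ title, ts }) => ({ title, ts })); // only the fields the UI needs
}

const body = listPostsResponse(rows);
// body[0] is now the newest post, so no client ever has to re-sort.
```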

Web components are somewhat of an exception to this; as API considerations go, they do attempt to strike a balance between server-side rendered content and dynamic client-side content. Our industry just isn’t heading in a direction where that balance is struck.

[+] aboringusername|5 years ago|reply
I mean, all it takes is one script failing to load or render, or a script not doing what it is supposed to, and your page is dead. Like a regex that doesn't accept a .xxx gTLD: that one line of code is now broken as the web changes.

I am not sure websites are designed to be anything more than "it's here today, in this society, in this context, and may break in a few hours to a few years". Much of the "early" internet is now gone, just as FB, YT and others will also one day be "gone".

It's entirely possible the concept of "data storage" may also one day be gone, we can't be sure the technology of today will be there in 5 years or 500 years, we'll all be long dead by that point anyway.

This message will likely be read by humans alive today, right now, and never seen again for the rest of time. Not everything needs to be archived and remembered.

[+] kubanczyk|5 years ago|reply
"My name is Ozymandias, King of the Kings/Look on my posts, ye mighty, and despair!"
[+] ehutch79|5 years ago|reply
But...

We need our app to be accessible on essentially 4 different platforms. Even if we had the resources to code and maintain four native applications, getting users to update is like pulling teeth. Full page loads are out, because we don't want to re-render everything for things like reordering a list. There's a whole bunch of UX quality-of-life things that go away if you remove JavaScript...

Honestly, I'd prefer that our internal business app DIDN'T appear in google...

[+] colejohnson66|5 years ago|reply
I wish there were a better alternative to Electron’s method of giving each app its own copy of the browser. It makes sense for compatibility reasons (your code may only be tested on one version of Chrome and later ones may break it), but most apps don’t need that. Using the built-in Chrome copy in a “frameless” mode would be nice.

JavaScript (through TypeScript) is a very nice programming language, but it gets a horrible rap from bad or lazy programmers. It’s kind of like how PHP 5 was. Everyone still rags on JS for things that ES6 fixed just like how people rag on PHP for things that have been fixed since PHP 7.

[+] battles|5 years ago|reply
I usually navigate the web with JavaScript disabled and encounter this a lot with articles that require scripts just to display words on a page. A really weird example of this is Engadget articles. They display an empty page if you have JavaScript disabled. The only reason is that their CSS sets the html tag to display: none. If you uncheck that CSS prop in the dev tools, the content appears normally. Weird choice on their part.
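A reconstruction of the pattern described (not Engadget's actual code): the stylesheet hides everything up front and a script un-hides it, so no-JS visitors get a blank page even though the content is sitting right there in the HTML.

```html
<!-- Anti-pattern sketch: the content exists in the markup but is
     hidden until JavaScript runs. -->
<style>
  html { display: none; }
</style>
<script>
  /* Only script-enabled visitors ever see the page. */
  document.documentElement.style.display = "block";
</script>
<p>Article text that no-JS readers never see.</p>
```

Unchecking the `display: none` rule in dev tools, as described above, reveals the content because nothing else was actually missing.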
[+] lcall|5 years ago|reply
Some are helpfully pointing out here that the web is now used for both A) content delivery and B) app delivery, and the B parts are hard to do w/o JS.

I tend to browse with JS disabled as a default (images too, that's another conversation maybe), and for sites that are really important to me, I enable it in browser settings, just for that site (sometimes temporarily, depending). I leave a couple of tabs open that I can get to very quickly for that purpose, but fortunately most of the time I can ignore sites that require JS.

Reasons include speed, convenience, and security. (I also do most browsing in separate user accounts, depending on the security level of what I am doing, and what other data that account handles.)

Edit: Sometimes for those same reasons, or for automation purposes of some tasks (like checking a particular page for a certain fact that I want to act on when it changes, such as some security updates), it's nice to be able to use text browsers like links (or wget & curl) too, and have the info available w/o requiring JS for it.

[+] corobo|5 years ago|reply
"I might not be able to read this 10 years from now, so I'm not going to read it now."

Bizarre argument. Seems like something you don't need to worry about but whatever floats your boat

[+] mrkeen|5 years ago|reply
It's the 'voting with your feet' argument.

It's a pre-emptive response to "my site is JS; if you don't like it, leave it!"

[+] approxim8ion|5 years ago|reply
I went out of my way to exclude JS from my blog, partly for the fun of it, to see how far I could go before I needed it, and partly for the reasons mentioned here, plus optimizing for speed.

One of the things I enjoyed doing without JS was using a comma-separated tag system for posts and then filtering the post list by tag.
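Roughly, a build-time sketch of one way to do that tag filtering without JS (illustrative names, not necessarily my exact code): split each post's comma-separated tags and emit one static listing per tag, so "filtering" is just following a link.

```javascript
// Build-time sketch: turn comma-separated tag strings into a
// tag -> [post slugs] index, then write one static page per tag.
const posts = [
  { slug: "intro", tags: "meta, webdev" },
  { slug: "css-tricks", tags: "css,webdev" },
];

function indexByTag(allPosts) {
  const index = new Map();
  for (const post of allPosts) {
    for (const raw of post.tags.split(",")) {
      const tag = raw.trim();
      if (!index.has(tag)) index.set(tag, []);
      index.get(tag).push(post.slug);
    }
  }
  return index; // e.g. render index.get("webdev") out as /tags/webdev.html
}

const tagIndex = indexByTag(posts);
```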

[+] dgb23|5 years ago|reply
Did you use checkbox labels with it? I like the technique, but am concerned about accessibility.
[+] azangru|5 years ago|reply
> Because in 10 years nothing you built today that depends on JS for the content will be available, visible, or archived anywhere on the web.

What? That doesn't even make sense. JavaScript and browser vendors are going out of their way to be backwards compatible (to the point of rejecting the perfect Array.prototype.flatten in favor of the ungrammatical Array.prototype.flat so as not to break websites that depend on MooTools). Is the author implying that in 10 years the JavaScript of today will no longer be executable by browsers? Why?

[+] jw14|5 years ago|reply
> Because in 10 years nothing you built today that depends on JS for the content will be available, visible, or archived anywhere on the web.

> All your fancy front-end-JS-required frameworks are dead to history,

Pretending to know anything 10 years in advance is foolish.

There’s a degree of contempt here. That makes me evaluate this as a rant.

If my target audience is developers, I’ll consider using server side rendering. For normal people I doubt it matters either way.

[+] ian-g|5 years ago|reply
This makes sense to me for articles and the like. If the point of the page is information, it makes sense to me that the page should load content first.

OTOH, twitter, slack, and package tracking are consistently changing. It makes more sense for something like that to load content via JS, since there's no guarantee the page looks the same between two loads

[+] Jugurtha|5 years ago|reply
Consider using the JustRead extension[0]

Yes, it requires JavaScript, but consider it "good" JavaScript usage to combat "bad" JavaScript usage. You can then hit Ctrl+Shift+L in Chrome, and it will get rid of all the noise on the page and display the article at a 70-ish-character line width. The experience without the crap is really cool.

- [0]: https://chrome.google.com/webstore/detail/just-read/dgmanlpm...