top | item 32650996

JavaScript hydration is a workaround, not a solution

172 points | fagnerbrack | 3 years ago | thenewstack.io

231 comments

[+] onion2k|3 years ago|reply
...get a PageSpeed score of 100/100

I'm slowly coming around to the idea that PageSpeed (or Lighthouse, or Core Web Vitals, or whatever Google has invented this week) is what drives a lot of the complexity in web app dev. People refuse to throw out what they've learned, so every time there's a new metric to chase (lest you lose SERP ranking for being slow!), devs heap another layer of complexity onto the Webpack bonfire.

Hydration is an example of this. People chased 'first contentful paint' and 'cumulative layout shift' timings because that's what Google told everyone they needed to optimize for. That meant reducing the amount of upfront work done in JS, pushing some barebones HTML and CSS to the client for those sweet, sweet metrics, and then running a massive bundle of deferred JS to make it do anything. Google is pulling that rug with Time to Interactive, First Input Delay and (coming soon) Interaction to Next Paint, so now devs are trying to write the same website but have the server strip out the code they wrote (eg Remix.run).

Everyone wants a fast website. No one wants a heap of fragile, complex build tooling. The answer to the first problem is to stop trying to solve the second problem with MORE TECH. Go back to fundamentals. Just make something that works with HTML and CSS alone, and enhance it with JS. You don't need to be clever about it, especially if the level of interactivity on your website amounts to basically a form.
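A minimal sketch of that "HTML and CSS first, enhance with JS" approach (the form attributes and function names here are illustrative, not from any framework). The form still submits and validates server-side if the script never loads:

```javascript
// Pure validation logic, kept separate from the DOM so it is easy to test.
function validateEmail(value) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value.trim());
}

// Enhancement layer: only runs in a browser.
if (typeof document !== 'undefined') {
  const form = document.querySelector('form[data-enhance]');
  if (form) {
    form.addEventListener('submit', (event) => {
      const email = form.querySelector('input[name="email"]');
      if (email && !validateEmail(email.value)) {
        event.preventDefault(); // otherwise fall through to server validation
        email.setAttribute('aria-invalid', 'true');
      }
    });
  }
}
```

No bundler, no hydration step; the server-rendered page is the source of truth and the script is optional.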

[+] dmix|3 years ago|reply
The recent evolution of JS frameworks has been really nice. Performance is basically getting identical to desktop.

The three recent developments I've noticed:

- "Islands" in Deno https://fresh.deno.dev/ and https://remix.run/ where only small isolated parts get hydrated, instead of the whole page

- Using http://linear.app style data-flows ala Replicache (https://replicache.dev/) where JSON data is preloaded for all sub-links and stored offline-first like a desktop app, so clicking on anything loads immediately w/o waiting for network requests to finish

- Now with 'resumability' where the server-side framework was built with client hydration in mind and delivers the bare minimum event/DOM data necessary to make the page interactive (instead of just being a glorified HTML cache load before the usual JS activates)

For people not following JS these might all seem like constantly reinventing past lessons, but there is a logical evolution happening here towards desktop-style performance and interactivity on the web. Or pure server-side performance but with full JS interactivity.

The next set of frameworks is going to be as big an evolution as Angular/Backbone->React/Vue was ~8yrs ago. But it's going to require new backend server frameworks, not just a new client framework. There's probably a big opportunity for the project that can combine this stuff properly.

[+] BeefWellington|3 years ago|reply
> The recent evolution of JS frameworks has been really nice. Performance is basically getting identical to desktop.

It's getting much much better but performance is only "identical to desktop" if you ignore anything about its resource usage or speed increases in processors over the past decades.

> For people not following JS these might all seem like constantly reinventing past lessons, but there is a logical evolution happening here towards desktop-style performance and interactivity on the web. Or pure server-side performance but with full JS interactivity.

For people following JS these are examples of constantly relearning past lessons. I'm not sure how anyone could reliably expect 100+ms round-trip time (on a good connection) to offer the same experience as something local, but I think what it actually means is that the people writing JS-based software haven't used a native desktop app for years and have mostly done web-based things.

You could be forgiven for it, since HTML/JS as a user interface design language appears to have taken over completely, to the point where even the most popular code editors are now web-browser-based.

Seriously though, go load up any natively compiled app on your OS of choice and compare the speed of it doing any given task to what you get out of web-based versions, electron versions, etc. There isn't a comparison.

My griping aside, I recognize JS as a language is here to stay and it's important to stay on top of its developments and improvements.

[+] doix|3 years ago|reply
The things you have listed minimize the impact of network latency, they don't affect the rendering performance which is still a big deal. Apps that need to render large amounts of data still kind of suck, you'll see many apps "virtualize" things. So rather than having 10,000 elements, you have however many fit in your viewport + N and as you scroll they get reused. The tearing hurts my soul.
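The virtualization described above boils down to a window calculation; a sketch (names are illustrative) that keeps only the rows that fit the viewport, plus an overscan buffer, mounted in the DOM:

```javascript
// Pure window calculation: which slice of `total` rows should be mounted?
function visibleRange(scrollTop, viewportHeight, rowHeight, total, overscan = 5) {
  const first = Math.floor(scrollTop / rowHeight);
  const visible = Math.ceil(viewportHeight / rowHeight);
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(total, first + visible + overscan), // end is exclusive
  };
}

// In the DOM, each mounted row would be absolutely positioned at
// `index * rowHeight` inside a spacer of height `total * rowHeight`,
// so the scrollbar still reflects all 10,000 rows.
```

Recycling the row elements on scroll is what causes the tearing complained about here: rows are repainted as they are repositioned, instead of already existing like in a native list view.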

Compare the scrolling in Excel 97 + Windows NT to Google Sheets/Office 365. It's night and day. The webapps that render everything with WebGL do perform better, but then you have non-native widgets.

I hope this problem gets solved one day.

[+] thethirdone|3 years ago|reply
> The recent evolution of JS frameworks has been really nice. Performance is basically getting identical to desktop.

Web browsers in general are not able to match applications on the desktop. Additionally, typical JS frameworks come with at least a 2x performance penalty compared to hand-optimized vanilla JS (Not a commonly done thing).

Being excited about getting reasonable performance with a great development environment is fine, but deluding yourself into thinking that it's great performance is not.

[+] cxr|3 years ago|reply
> For people not following JS these might all seem like constantly reinventing past lessons, but there is a logical evolution happening here towards desktop-style performance

Funny, my desktop itself is already written in JS, and supports/integrates with apps written that way, too (and also ones that aren't), and that's been the case for a while now. And the same has been true of apps from the Mozilla platform lineage for even longer; Firefox has been an Electron-style app for 100+% of its lifetime, for example. Talk to any self-styled JS programmer for any length of time, though, and it's like these things don't even exist—like the latter was actually invented by the Electron folks, and the only thing that made JS generally viable as a serious applications programming language is the trio of NPM+Webpack+React/React-alikes.

It's overall not worth taking their opinions at face value. They tend to be the ones who are "not following JS". They're worshipping some weird toolchain cult and calling it JS. Indeed, judging by the compulsion to try to work around ostensible deficiencies in the language and deal in cargo cult advice (esp. e.g. concerning things like `==`/`===` and `this`, and insisting on NodeJS-isms like its non-standard `require`) it's evident that they actually hate JS, despite what they're likely to say to the contrary.

[+] Existenceblinks|3 years ago|reply
Ruby, Go, Rust, Python, PHP, Elixir folks are rolling their eyes. The next web framework will not be limited to only one language and this coupled paradigm.
[+] silasb|3 years ago|reply
You did a good job summarizing the frontend framework evolution, but I'm curious where you think the evolution in backend frameworks is going? I was thinking LiveView, but I also think WASM could come into play as well.
[+] lelanthran|3 years ago|reply
> The recent evolution of JS frameworks has been really nice. Performance is basically getting identical to desktop.

They're not even within orders of magnitude. What makes you say that the performance is nearly identical?

[+] samwillis|3 years ago|reply
While I think the "resumability" that Builder have developed for Qwik is very clever, I increasingly prefer the approach taken by HTMX and Alpine.js. Move back from JSON APIs and render your HTML fragments on the server (you need to anyway!). It removes so much duplication of logic, simplifies your tool stack, and reduces your risk of vulnerabilities.

I would even be tempted to say, both "Hydration" and "Resumability" are workarounds.

Attaching your event handlers to the DOM via HTML attributes rendered on the server à la HTMX, Alpine.js or god forbid even 'on{event}=""' attributes removes so much of this complexity.

There will always be places for client-side HTML rendering though; the closer your webpage gets to being an "App", the more likely you are to need client-side rendering, especially if you want to work offline.

[+] graboid|3 years ago|reply
I just had to do a site which involved a fair amount of interactivity. A sidebar with different types of filters, a search bar at the top, and a complex boolean filter builder inside a modal (think like infinitely-nestable AND/OR filters, where each filter involved 5 dropdowns interacting with each other). Then some area to collect the results and do some operations on them. I was feeling brave and decided to skip Vue which we otherwise use for complex UIs and do the whole page with HTMX and nothing else. And it worked.

It took me longer, that's for sure, but also because I had to get a feeling for how to approach many things I would have just routinely plumbed together with Vue. And the feeling at the end was really satisfying: a fast-loading, complex interactive page, powered by ~10kb of Javascript, with all filtering logic, validation etc. completely server-side (we use ASP.NET). And I have confidence it will work exactly as it is years from now. With our Javascript tooling, I expect things to break by that time.

That said, I certainly pushed the library in parts, and the jury is still out on how easy it will be to understand the flow of interaction once some time has passed.

[+] AngeloR|3 years ago|reply
I haven't dug into Alpine.js, but I've been messing with HTMX on some personal projects for a few months now and am really pleased with it. It's honestly made it a pleasure for me to return to front-end work.
[+] ricardobeat|3 years ago|reply
This is a very confusing, and then disappointing, article.

It starts by describing what is, essentially, plain javascript or how you'd use jQuery in the old days. Attaching events to the DOM at startup is not hydration.

Then it starts describing what seems to be React's particular flavour of hydration, where it re-renders the entire app before applying its diff/reconciliation algorithm. But in a very roundabout way, it's not clear what they are even talking about. Not every framework works like that. And, it's definitely not the heaviest part, diffing the DOM can be extremely fast. The slowness comes from, erm, everything else and just the sheer amount of code these frameworks run.

By the third section, I had the feeling this would be an advertisement for Qwik, because of the 'closure around app state' concept, and how they appear to believe that lazy loading everything is the best idea since sliced bread. Bingo :)

[+] RestlessAPI|3 years ago|reply
Guys, multiple things can be true at the same time.

1. Webapps are largely overloaded and don't need to be as huge/complicated as they are.

2. Webapps are an objective computing miracle: they bring full, platform-agnostic app functionality to tech-illiterate people everywhere on the planet, and thus most of their complexity is justified.

3. Performance for performance sake is never a hill to die on as it always leads to increasing complexity.

4. Performance when performance matters, is invaluable, and you should be prepared to make dramatic concessions for it.

5. Chasing metrics whose value you don't readily understand is usually not worth it.

6. Chasing metrics whose value is obvious (for example, PageSpeed) is worth it.

When websites should be static, they should be static. When webapps should be webapps, they should be webapps.

Software is a tool. A means to an end. Getting hung up on things like this isn't worth it. Just focus on delivering value to your users and making it a good experience. Sometimes that's best achieved through a static website, sometimes it's through a webapp.

[+] jozzy-james|3 years ago|reply
and when websites are kind of apps, but not really, so you need the SEO benefits of regular websites yet the interaction of apps... then you get into the fun bits
[+] simonbarker87|3 years ago|reply
I started in web dev 17 years ago, but didn’t get my first professional dev job until 3 years ago.

I had never had to learn angular, react, separated backend and frontend when writing code for my own stuff and my own company.

I would regularly process as much as I could on the server and ship the HTML and a JSON object to the browser and then just use native JS or more SSR from there.

My sites and apps were complex, just as complex as the stuff we were making in my first two dev jobs but the dev ex was so much nicer.

Shipping a heap of JS to the browser and getting that to do the heavy lifting of making the HTML etc just felt like an anti pattern but I went with it because “that’s the way we do it now”

Seeing Remix, Laravel, RoR still going, Astro etc is starting to convince me that perhaps separated FE and API isn’t the one true way and the old way might have been better.

[+] rglover|3 years ago|reply
The overhead is negligible if done properly. I did this in Joystick [1] and was shocked at how overcomplicated folks make it.

You're literally just saying "render to static HTML on the server, and on the client, have a way to render a root component to screen and attach event handlers." Without any serious thoughts about optimization (practically none yet), a no-cache refresh/mount takes 227ms to DOMContentLoaded and 696ms to a full load.

Here's the SSR I do:

https://github.com/cheatcode/joystick/blob/development/node/...

Here's the mount ("hydration"):

https://github.com/cheatcode/joystick/blob/development/ui/sr...

The only "magic" is that I embed the hydration logic into the built JS for the current page and it fires automatically on load (no need to add manual hydration code).

[1] https://github.com/cheatcode/joystick

[+] jstanley|3 years ago|reply
Are you saying it takes 469ms to attach event handlers? That's well over a billion clock cycles. That doesn't sound efficient.
[+] efields|3 years ago|reply
Incoming rant: I’ve had to do more hands-on hydration work as I explore static site generators and I’m just deeply unhappy with the state of front end tooling. I now have a taste for

* directory based routing and opinionated defaults that give you the basics to string together html pages with reusable partials and be production ready in minutes (think rails)

* postcss (css tooling)

* reusable and composable components

* hmr-style dx

* serverside generated pages, as inlined as possible — fastest experience for end user, avoids js if not needed

* not having to think how and where css or images get compiled

Vitejs looked headed that way but I recently dove into it and… it ain't it yet. Getting something to render on a server then "turn on" with React in the browser is not straightforward.

Remember how much sense $(document).ready() made? Hydration should be that easy and it is not.
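For comparison, the vanilla equivalent of `$(document).ready()` is still only a few lines; a sketch (the `onReady` name and the injectable `doc` parameter are mine, added so the logic is testable outside a browser):

```javascript
// onReady: run a callback once the DOM is parsed, like $(document).ready().
// In a real page you would just call onReady(fn); the `doc` parameter exists
// only so the branching can be exercised without a browser.
function onReady(fn, doc = typeof document !== 'undefined' ? document : null) {
  if (!doc) return; // no DOM at all (e.g. running on a server)
  if (doc.readyState === 'loading') {
    // DOM still parsing: wait for it.
    doc.addEventListener('DOMContentLoaded', fn, { once: true });
  } else {
    // DOM already parsed (deferred or late-injected script): run immediately.
    fn();
  }
}
```

That's the bar hydration is being measured against here: one predictable "the page is ready, wire it up" moment.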

[+] bern4444|3 years ago|reply
Take a look at remix.run. It has most of what you list and the rest can be easily added on.
[+] the__alchemist|3 years ago|reply
> In web development, Hydration is a technique to add interactivity to server-rendered HTML. It’s a technique in which client-side JavaScript converts a static HTML web page into a dynamic web page by attaching event handlers to the HTML elements.

This is how I use JS. I may be misunderstanding something, but this is a nice way to use JS to add targeted interactivity to a page while keeping load times and interaction-latency low. Modern JS is good enough for this purpose without the webpack/VDOM/bundling/dependencies that have made many websites sluggish.
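A sketch of that targeted-interactivity pattern: one delegated listener plus a plain map from server-rendered `data-action` attributes to handlers. The action names and selectors here are made up for illustration:

```javascript
// Map server-rendered data-action attributes to client-side handlers.
const actions = {
  'toggle-details': (el) => el.closest('[data-widget]')?.classList.toggle('open'),
  'dismiss': (el) => el.closest('[data-widget]')?.remove(),
};

// Pure lookup, separated out so the routing is testable without a DOM.
function resolveAction(name) {
  return actions[name] ?? null;
}

// One delegated listener covers every current and future matching element,
// so no per-element "hydration" pass is needed at all.
if (typeof document !== 'undefined') {
  document.addEventListener('click', (event) => {
    const el = event.target.closest('[data-action]');
    const handler = el && resolveAction(el.dataset.action);
    if (handler) handler(el);
  });
}
```

Because the listener sits on `document`, HTML fragments added later (e.g. from a server response) are interactive the moment they land in the DOM.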

[+] commandlinefan|3 years ago|reply
That's how everybody used Javascript when it first came out, coming up on 30 years ago. This is how it was designed to be used in the first place.

This seems to happen with everything: somebody solves a problem; somebody else gains a very partial understanding of the solution and adds unnecessary hacks on top of it that they thought they needed, because they didn't spend any time understanding how the solution actually worked; and then somebody else comes along and adds back, on top of the hacks, the things that were already in the original solution, so that it works the way it always worked, but much slower and in a way that will break in surprising ways when you least expect it.

[+] robertoandred|3 years ago|reply
Their definition is wrong. Hydration takes static HTML and replaces it with client-rendered HTML.
[+] Existenceblinks|3 years ago|reply
Yep, some duplicated templates where they need to be is dead simple and cheap. I've been thinking about this for many years and still think that is by far the best.
[+] pwdisswordfish0|3 years ago|reply
I thought that was called "progressive enhancement". I like that too
[+] fwip|3 years ago|reply
I'm not sure if this is reflective of a normal Qwik project, but in the TODO app, javascript fragments are not even downloaded until the event fires, which adds a very noticeable latency.

e.g: on first click (to check off a TODO item), my browser went and fetched q-6642ef59.js, which has the below contents and in turn caused the fetch of two other javascript fragments, which in turn caused a fetch of two more.

    import{d as o}from"./q-dd8cb722.js";import{t}from"./q-3d9b01e7.js";const s=o((({item:o,todos:s})=>t(s,o)));export{s as Item_onRender_on_click};
Looking at the network tab, this took a full quarter-second to finish fetching on my Macbook on fast office internet. They cite "50 ms to ready for interaction" for this demo app, but it's really 300ms until the interaction begins.

Perhaps judicious use of preloading javascript and/or CSS animations could hide this from a user, but it seems icky, especially if you're targeting mobile with much higher latencies.
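One generic way to do that preloading (nothing Qwik-specific; the element, module path, and `run` export below are assumptions for illustration): memoize the lazy loader and warm it on an intent signal like hover, so the eventual click reuses the already-in-flight promise instead of paying the fetch latency:

```javascript
// memoLoader: wrap a promise-returning loader so it runs at most once and
// every caller shares the same promise.
function memoLoader(loader) {
  let promise;
  return () => (promise ??= loader());
}

// Usage sketch in a browser (names are illustrative):
// const loadHandler = memoLoader(() => import('./todo-item-click.js'));
// button.addEventListener('pointerover', loadHandler);           // warm on intent
// button.addEventListener('click', async () => (await loadHandler()).run());
```

On desktop the hover usually buys you a couple hundred milliseconds of head start; on mobile there is no hover, which is exactly the high-latency case where the trick helps least.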

[+] lucideer|3 years ago|reply
While the points against hydration are valid, the proposed solution sounds to me like it would have other (opposite) performance problems:

i.e. hydration has an inherent startup overhead, and possibly a continuing memory overhead (though the latter doesn't seem inherent), whereas resumable sounds like it may start fast, run slow (and I can't see where they solve the mentioned memory overhead problem).

I hadn't tested this myself, so since they have some Qwik demos linked I gave them a try. Anecdotally, startup seems very slow in them. I thought that was meant to be the problem being solved. Maybe the component download is fast but the framework download/parse is the slow part?

Either way, none of the examples are complex or interactive enough to properly test ui latency so I don't know.

[+] beebeepka|3 years ago|reply
I've been trying to stay away from doing public facing web apps for a decade, so, from my point of view, hydration doesn't make any sense.

Authorization is in the browser, meaning I get to serve the login page "real fast".

That said, I get the reasons people do it, but frankly, it doesn't sit well with me. It's guaranteed that the whole process is unreasonably complicated as opposed to serving a few static files that can be cached and everything.

In theory, it could benefit people browsing without JS, but I'm not sure things work out that way. Does Next.js require JS on the client to display whatever was "rendered" on the server?

[+] Existenceblinks|3 years ago|reply
A symptom of everything-JS: the desire to write templates once, in one language, ignoring that the roles of frontend and backend are different, as the business logic usually is too. At the end of the day, some duplicated code is the simpler and better solution.
[+] cutler|3 years ago|reply
Of all the terms in tech "hydration" makes me cringe the most.
[+] firasd|3 years ago|reply
Anyone tried Phoenix Liveview recently? Seems like a solution to opt out of this kind of situation entirely.. I'll be exploring the framework
[+] pier25|3 years ago|reply
Misko is right of course, but it remains to be seen if Qwik will be everything it promises to be.
[+] togaen|3 years ago|reply
Burn it all down. Just stop using JavaScript.
[+] pier25|3 years ago|reply
Good luck using even a barebones site like HN without JavaScript.
[+] aliqot|3 years ago|reply
Maybe I'm just getting old, but Javascript jumped the shark at some point. Hydration, lazy loading, managing flashes of unstyled content, a lot of this is built to address things that wouldn't be problems if we treated the browser as the dojo that it is and not be so dang wasteful.

I'm sure someone with shinier boots than mine will pop in and tell me how I'm wrong, and perhaps you're right, but the web was a much better place without all the toolchains and shenanigans. We must return to fundamentals.

[+] erokar|3 years ago|reply
I agree with you from a technical standpoint — something created for displaying and navigating between documents is being misused for distributing apps.

At the same time, though, no other model affords the same distribution: write once, run everywhere. No other distribution platform can compete. And users have become accustomed to the web being apps as well as documents. In fact, many have become accustomed to more or less treating their web browser as an OS (or, more realistically, to not giving the distinction any thought).

Maybe an interesting way forward could be a web model specifically for apps, which of course many would say would be WebAssembly. The HTML/CSS/JS combo has been remarkably successful and resilient though, so I'm not convinced it will be dethroned easily.

[+] swatcoder|3 years ago|reply
My boots are pretty dusty too, and a web browser isn't usually how I want to engage with a tool or toy.

But the systems architecture model of infinitely beefy backends with simply-adequate thin clients predates both of us. Browsers, the cloud, and all this obnoxious javascript tooling are the contemporary implementation of that and not without reason. They do it pretty well!

Like with any new tool, people get carried away and start using it for things that really don’t need it. I’m with you that most websites don’t need all this stuff and get caught up in it anyway.

But that’ll burn off, and in the meantime, we’ll end up with a rich, mature thin client system for the solutions that need it and that system will be around for decades. No sharks jumped.

[+] sopooneo|3 years ago|reply
I remember when HTML5 and CSS3 were just in the RFC phase and many web developers thought they would make their jobs simpler. Because the things they were being asked to do, for which they were currently creating tortured workarounds, could be done directly with the new standards.

But of course the job did not become simpler. The fixed value turned out not to be what clients would demand, but the quantity of tortured workarounds devs could be enticed to endure.

[+] phailhaus|3 years ago|reply
"Am I out of touch? No, it is the children who are wrong."

This is an old, tired take that boils down to "we should be ashamed for wanting nice things." If only everyone would just accept simple websites like HackerNews! This is backwards, it is user-blaming. It turns out that the web is an incredible platform that has revolutionized the world, and thousands of people have worked hard to build tools that make it easier to develop on. "Web fundamentals" can't give you Google Docs.

[+] waboremo|3 years ago|reply
Do you have any proposed solutions for the problems that exist? Going back to fundamentals doesn't eradicate the problems, it will only recreate them in a slightly altered state.

One of the most commonly proposed solutions is we keep the web to only static content. Well now we've just shifted all the media-heavy interactive content onto a dedicated app instead of the browser, shifting all the same exact problems onto a new platform instead.

[+] qsort|3 years ago|reply
You're not wrong, the websites I enjoy the most are basically all static.

I think the problem is that the "web fundamentals" aren't that good to begin with. A web application is very often the least bad solution, but you're not going to have "rich" web applications without tons of JS. Show the average user HN and they won't like the interface.

[+] scrollaway|3 years ago|reply
> Hydration, lazy loading, managing flashes of unstyled content, a lot of this is built to address things that wouldn't be problems if we treated the browser as the dojo that it is and not be so dang wasteful.

What's your point here, exactly? That if random wordpress blogs and recipe websites were less wasteful, the problems these solutions are addressing would not exist?

I can make you an extremely non-wasteful webapp which still needs to display a hundred images on a page (because reasons), so lazy-loading the images is still important.

I can find you a very well-optimized website that is only a few kilobytes, but still loads slow as shit because their network is bad and I'm on a terrible 3G link. FOUC would still be an issue.

[+] SketchySeaBeast|3 years ago|reply
> if we treated the browser as the dojo that it is

I have no idea what this means. Isn't a dojo a place for learning or meditation? I don't understand how that fits the browser, the internet, or web development.

[+] guipsp|3 years ago|reply
I thoroughly agree, society was much better before computers came along.
[+] wruza|3 years ago|reply
> We must return to fundamentals.

These are not fundamentals. The Overton window has shifted completely out of its initial position, but browsers ignored it for two decades and offloaded that to webapp developers. Web 2.0 is not a browser; it is what became possible with everything people have built outside of it, on top of a "take it or leave it" attitude. Web 1 is an archaic network of winword-level documents that is a huge step backwards in UI, UX, and common sense.

We must not return to anything, browsers must get their ass up and running towards what other people achieved through hard work despite all the obstacles.

[+] pier25|3 years ago|reply
The core issue is that browsers were not made to provide the sophisticated features we require these days. Because of this fundamental problem, you can't build something like Gmail or Spotify without increasing the complexity of development exponentially.

The "SSR + sprinkled JS" paradigm is still totally valid though for many use cases.

[+] intothemild|3 years ago|reply
Developers, just coding their websites, not a toolchain in sight.
[+] cxr|3 years ago|reply
> but Javascript jumped the shark at some point

Contemporary frontend, in-browser app development, you mean. JS is a programming language. By conflating a language with a particular culture of software development, you implicitly transfer more power to that culture, the people in it, and their practices, even though your message expresses a clear desire for the opposite.