
JavaScript-heavy approaches are not compatible with long-term performance goals

178 points | luu | 15 days ago | sgom.es

234 comments


carshodev|14 days ago

This title is very misleading; it should be "Why React is not compatible with long-term performance goals".

And I do agree generally. React uses an outdated rendering method that has now been surpassed by many better frameworks. Svelte/Sveltekit, Vue, and Qwik are the best examples.

People relying on bloated React packages is obviously not great, but that has nothing to do with JavaScript itself.

The JS engines are all relatively fast now. And the document model of the current web provides major accessibility to both humans and machines (SEO and GEO). JS is not my favorite language; I would rather the web were based on a statically typed language with better error-handling practices, like Go. But that will most likely not happen any time soon, as it would require every level of the ecosystem to adapt: browsers, frameworks, developers, etc.

Google would have to take the lead and implement this in chrome then enough developers would have to build sites using it and force safari and firefox to comply. It just isn't feasible.

If you want faster webapps, just switch to SvelteKit, Vue, or Qwik. But often the ones choosing the framework for a project haven't written much code in years. They know React is a safe option used by everyone else, so they follow along; when it gets slow, it's a "bug" causing it, since the apps they built before were "good enough".

socalgal2|14 days ago

> Google would have to take the lead and implement this in chrome then enough developers would have to build sites using it and force safari and firefox to comply. It just isn't feasible.

They already tried. It was called Dart and for a while there was an experimental flag to enable it directly in Chrome. It was cancelled and Dart was relegated to transpiling to JS/WASM.

hinkley|14 days ago

The reason I'm a backend dev at the moment is that I looked at the React model and decided I didn't want anything to do with this insanity.

I've been appalled by how long and how broadly the mass hysteria lasted.

netdevphoenix|14 days ago

> Google would have to take the lead and implement this in chrome then enough developers would have to build sites using it and force safari and firefox to comply. It just isn't feasible.

This is not something you really want to happen for the health of the web tech ecosystem. I am surprised to see actual developers nonchalantly suggesting this. A type system for the web is not worth an IE 2.0

cyberax|14 days ago

> React uses an outdated rendering method that has now been surpassed by many better frameworks. Svelte/Sveltekit, Vue, and Qwik are the best examples.

I strongly disagree with this. Svelte/Solid/Vue all become a twisted mess eventually. And by "eventually" I mean "very very soon".

The idea of using proxies and automatic dependency discovery looks good from the outside, but it easily leads to models with random hidden interdependencies.
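A minimal sketch of that implicit dependency discovery (illustrative only, not any framework's actual implementation):

```javascript
// Toy sketch of proxy-based automatic dependency discovery (Vue/Solid style).
let activeEffect = null;
const effectLog = [];

function reactive(target) {
  const deps = new Map(); // property key -> set of effects that read it
  return new Proxy(target, {
    get(obj, key) {
      // A dependency is recorded implicitly, just because a property was read.
      if (activeEffect) {
        if (!deps.has(key)) deps.set(key, new Set());
        deps.get(key).add(activeEffect);
      }
      return obj[key];
    },
    set(obj, key, value) {
      obj[key] = value;
      (deps.get(key) || []).forEach((fn) => fn());
      return true;
    },
  });
}

function effect(fn) {
  activeEffect = fn;
  fn(); // the first run discovers dependencies as a side effect of reading
  activeEffect = null;
}

const state = reactive({ count: 0 });
effect(() => effectLog.push(state.count));
state.count = 1; // re-runs the effect, although nothing ever declared the dependency
```

Note that the effect never declared that it depends on `count`; the proxy inferred it at read time. At scale, those implicit edges are exactly the hidden interdependencies in question.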

React's rendering model is simplistic ("just render everything and then diff the nodes"), but it's comprehensible and magic-less. Everything is explicit, with only contexts/providers introducing any "spooky action at a distance".

And the recent React versions with Suspense neatly add the missing parts for async query/effects integration.

> If you want faster webapps just switch to sveltekit or vue or qwik.

If you want even worse webapps then switch to Vue and forgo being able to ever maintain them.

LtWorf|14 days ago

Saying that Go has good error handling is… bold.

Letting google implement a new language for the web would probably result in something even worse than javascript.

ragall|14 days ago

> The JS engines are all relatively fast now.

That's only if you're limiting yourself to laptops. In the smartphone space, the median device (even in the US) is an Android running 2-4 MediaTek cores with a 10-year-old design. On those devices React is unusable: it can take 5-10 seconds just for the initial parse of all the React code.

kumarvvr|14 days ago

Do you consider Angular to have a better rendering system? Or is it similar to React?

Asking because I use Angular and want to learn other frameworks in case Angular is just as bad for long term.

icyJoseph|14 days ago

Didn't the React team present an exploration of performance?

- https://www.youtube.com/watch?v=uAmRtE52mYk

It showed that React is fast already, faster with the compiler, and could even be way faster than signal-based approaches, and faster still if it dropped some of the side effects it performs. I'm pulling from memory here.

criticalfault|14 days ago

> I would rather the web was based on a statically typed language that had better error handling practices like Go.

So, dart?

Back in the day Google had plans to put a Dart VM in Chrome, but the powers that be didn't like the idea.

So here we are. It's 2026 and the web is still limited to one language. Dare I say it's the only area where we don't have alternatives.

owenpalmer|13 days ago

> People relying on bloated React packages is obviously not great but that is nothing to do with javascript itself.

JS is slow. If React were written in C++, performance wouldn't be an issue. Then again, React is trying to solve DOM issues, not JS issues.

pjmlp|14 days ago

I've kept enjoying SSR with Java and .NET frameworks as much as possible since the 2000s; no need for Go alone.

React is alright when packaged as part of Next.js, which basically looks like React while in practice being SSR with JavaScript.

fud101|14 days ago

I'm tired of HN complaining about React for dumb reasons. To be clear, performance and bundle size matter for some things; if that's your complaint, whatever, I have no issue. My issue is that there is a whole host of use cases for which React is perfect: it has a functional, declarative style which favours human understanding and correctness. This is a good thing, and it's what the core of modern React gives you with functional components and hooks. You can do a lot with just the core these days. Svelte and Vue are inferior for all the reasons React is functional and those other choices are imperative. I could go on, but stop trying to make Svelte a thing; no one is buying.

ericmcer|14 days ago

Why is Vue, Svelte, Qwik etc. faster?

React done poorly can be slow: bad code can force the virtual DOM to diff unnecessarily thousands of times, and your app will still mostly work fine. That said, if you are intelligent about when you force React to reconcile state changes with the DOM, it is very efficient.

All of these frameworks' purpose is to synchronize the state of your app with the state of the DOM. There is no getting around that; it is why they exist. Vue uses a virtual DOM with diffing as well. I am not sure how Svelte/Qwik do it, but I imagine they have a similar mechanism? You can't get around the fact that, at the end of the day, all of these frameworks exist to answer: "The data in the user's app updated, what DOM updates do we need to make to reflect that?"
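That "what DOM updates do we need" step can be sketched as a diff over two snapshots. A toy flat-props version (real virtual-DOM diffing handles whole node trees, keys, children, and reordering; this is the idea only):

```javascript
// Given the previous and next snapshot of an element's props,
// compute the minimal list of DOM updates.
function diffProps(prev, next) {
  const patches = [];
  for (const key of Object.keys(prev)) {
    if (!(key in next)) patches.push({ op: 'remove', key });
  }
  for (const key of Object.keys(next)) {
    if (prev[key] !== next[key]) patches.push({ op: 'set', key, value: next[key] });
  }
  return patches;
}

// "The data in the user's app updated, what DOM updates do we need?"
const patches = diffProps(
  { className: 'btn', disabled: true },
  { className: 'btn', disabled: false }
);
// Only the changed prop produces a patch; className is untouched.
```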

Anyway, that was a tangent, but I think React is perceived as "slow" or buggy because the vDOM and reconciler are so flexible that they allow people to do extremely dumb things while coding and it will still work.

slopinthebag|14 days ago

React's model is fine, it's just the wrong language. React implemented in Rust for example can be much faster, although you pay the cost differently for wasm <-> js communication.

rorylaitila|14 days ago

I've skipped the whole 'modern' web stack. I've stayed SSR-first: hrefs/forms/URL as the only routing primitives, progressive enhancement, vanilla-JS islands only where absolutely necessary. It's great. Apps never randomly break. No build hell. UX is easy to debug (just look at the HTML). No random performance degradations. No client has ever said it feels dated. Just finished my first app-like PWA, also SSR, and I'm getting great compliments on the UI and slick interactions using just native browser transitions. The vanilla web stack gets better and better! Honestly don't know what people think they are gaining with a heavy frontend.

codr7|14 days ago

Likewise, I never liked JS much, nor the frontend dev experience.

I started out with the Seaside framework, but I've done several variations on that theme in different languages along the way.

It goes something like this: a typed server-side DOM with support for native callbacks generates HTML and hooks up the callbacks. Changes are submitted to the server similarly to traditional HTML forms, but as JSON. Changes to the DOM generate JS that's returned from the submit.

One headache with this approach is that documents need to stick around or callbacks will fail, and you need to hit the same server to get the document.

It should be doable to put a serialized version of the DOM on the client and pass it to callbacks to be rebuilt on the server.
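A rough sketch of that Seaside-style round trip: a server-side component registers callbacks, emits HTML that references them by id, and a JSON "submit" dispatches back to the handler. All names here are hypothetical:

```javascript
// Server-side callback registry for the generated page.
const callbacks = new Map();
let nextId = 0;

function button(label, onClick) {
  const id = `cb-${nextId++}`;
  callbacks.set(id, onClick);
  // The generated HTML carries only the callback id, never the code.
  return `<button data-callback="${id}">${label}</button>`;
}

// The client submits JSON like { callback: "cb-0" }; the server runs the
// handler and returns the JS/DOM changes for the client to apply.
function handleSubmit(payload) {
  const fn = callbacks.get(payload.callback);
  return fn ? fn() : { error: 'stale callback' };
}
```

The `stale callback` branch is exactly the headache mentioned above: the document (and its callback table) has to stick around on the same server, or the ids stop resolving.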

exceptione|14 days ago

> Honestly don't know what people think they are gaining with a heavy frontend.

True, but (I repeat myself here) it depends on what kind of website we are talking about. For instance, a data-heavy SPA that workers use the whole day (like a CRM) is at least perceptually faster and more user-friendly than the same thing with traditional whole-page reloads.

prewett|14 days ago

A cogent article. But I think the biggest problem is that the DOM was built for documents, not apps. We know how to build a performant UI architecture: Qt, Java/Swing, Cocoa all have pretty similar architectures and they all ran fine on much poorer hardware than a modern browser on an M1. But unless you use WebAssembly, you can't actually use them on the browser.

When the industry shoehorns something into a tool designed for something else, yeah, performance suffers and you get a lot of framework churn with people trying to figure out how to elegantly cut steaks with spoons.

carshodev|14 days ago

But most apps are documents; they are built to render data and text fields in a nice way for the consumer to use.

You most certainly shouldn't be building graphs with table elements but JS has canvas and svg which make vectors pretty efficient to render.

The document model provides good accessibility and the ability for things like SEO and GEO to exist.

If you are making a racing simulator, then using HTML in no way makes sense, but for the apps that most of us use documents make sense.

It would be nice if browsers implemented a new interpreted, statically typed language with direct canvas/viewport rendering that was more efficient than JavaScript. But Chrome would need to adopt it, and then developers would need to actually build things with it. It seems like it would have to come from within the Chrome team directly; they are the only ones who can control something like this.

saidinesh5|14 days ago

While I don't have React's performance-bottleneck numbers, I don't think it's about JavaScript vs. WASM here.

I've seen/built some large Qt/QML applications with plenty of JavaScript, and they all performed much better than your average React webapp, even though V8 and the other browser JavaScript engines have a JIT while the QML engine didn't.

Comparing QtQuick/QML + JS to HTML + JS, both GPU-accelerated scene graphs, you should get similar performance in both. But in reality that's rarely the case. I suspect the culprit is the document-oriented text layout and CSS rules, along with React's virtual DOM and the many other dependencies layered on as abstraction.

I'd love to know more about this from someone who did an in depth profiling of the same/similar apps on something like QtQuick vs. React.

austin-cheney|14 days ago

You can easily test whether the DOM is a performance bottleneck. DOM performance means one of two things: lookup/access speed and render speed. Render speed is only a DOM concern with regard to the quantity of nodes and node layering, though, since the actual painting to the display is a GPU concern.

To test DOM access speed, simply compare processing speed during test automation of a large single-page app with all DOM references cached to variables versus the same application with no such caching. I have done this, and there is a performance difference, but it cannot be noticed until other areas of the application are very well optimized for performance.
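The cached-vs-uncached comparison can be illustrated with a stub in place of the real DOM (an actual measurement would run against the browser DOM, as described above):

```javascript
// Stub "document" that counts how many times we pay for a lookup.
let lookups = 0;
const stubDocument = {
  querySelector() {
    lookups++;
    return { textContent: '' };
  },
};

// Uncached: pays for a lookup on every single update.
function updateUncached(n) {
  for (let i = 0; i < n; i++) {
    stubDocument.querySelector('#status').textContent = String(i);
  }
}

// Cached: pays for one lookup, then reuses the reference.
function updateCached(n) {
  const el = stubDocument.querySelector('#status');
  for (let i = 0; i < n; i++) el.textContent = String(i);
}
```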

I have also tested the performance of node layering, and it's also not what most people think. To do this I used an application like a desktop UI, with many windows that can be dragged around, each with its own internal content. I found things slowed down considerably when the page had over 10,000 nodes displayed across a hundred or so windows. This was slower than equivalent desktop environments outside the browser, but not by much.

Most people seem to form unmeasured opinions of DOM performance that do not hold up under tests. Likewise they also fail to optimize when they have the opportunity to do so. In many cases there is active hostility against optimizations that challenge a favorite framework or code pattern.

crazygringo|14 days ago

> the biggest problem is that the DOM was built for documents, not apps

I don't see the difference. They're both text and graphics laid out in variable-sized nested containers.

And apps today make use of all the same fancy stuff documents do: fonts, vector icons, graphics, rounded corners, multilingual text including RTL, drop shadows, layers, transparency, and so forth.

Maybe you think they shouldn't. But they do. Of all the problems with apps in web pages, the DOM feels like the least of it.

wasmperson|14 days ago

> I think the biggest problem is that the DOM was built for documents, not apps.

The world wide web was invented in 1989, Javascript was released in 1995, and the term "web application" was coined in 1999. In other words: the web has been an application platform for most of its existence. It's wrong to say at this point that any part of it was primarily designed to serve documents, unless you completely ignore all of the design work that has happened for the past 25 years.

Now, whether it was designed well is another issue...

wiseowise|14 days ago

> Java/Swing

> performant UI architecture

Not sure if it’s a joke or something.

nine_k|14 days ago

It's very possible to make lightning-fast React web UIs. The DOM sucks, but modern computers are insanely fast, and browsers insanely optimized. It is also very possible to make sluggish-feeling Qt or Swing applications; I've seen a number.

It mostly takes some thinking about immediate reaction, about "negligibly short" operations introducing non-negligible, noticeable delays. Anything not related to rendering should be made async, and even that should be made as fast as possible. This is to say nothing of avoiding reflows, repeated redraws, etc.

In short, sloppy GUI code feels sluggish, no matter what tools you use.

jakub_g|14 days ago

To me, the main problem is that inevitably, any SPA with dozens of contributors will grow into a multi-megabyte-bundle mess.

Preventing it is extremely hard, because the typical way of developing code is to write a ton of code, add a ton of dependencies, and let the bundler figure it out.

Visualizing the entire codebase in terms of "what imports what" is impossible with tens of thousands of files.

Even when you do splitting via async `import()`, all it takes is one PR that imports something in a bad way, and the bundle bloats by hundreds of kilobytes or megabytes: something the bundler had outsourced to an async chunk suddenly becomes mandatory in the main bundle because a static import crossed the boundary.

The OP mentions it here:

> It’s often much easier to add things on a top-level component’s context and reuse them throughout the app, than it is to add them only where needed. But doing this means you’re paying the cost before (and whether) you need it.

> It’s much simpler to add something as a synchronous top-level import and force it to be present on every code path, than it is to load it conditionally and deal with the resulting asynchronicity.

> Setting up a bundler to produce one monolithic bundle is trivial, whereas splitting things up per route with shared bundles for the common bits often involves understanding and writing some complex configuration.

You can prevent that by having strong mandatory budgets on every PR, with a check that the bundle size did not grow by more than X kB.
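The budget gate itself can be tiny. A sketch (the paths, numbers, and where the baseline comes from are all assumptions; real setups diff against the main branch's build):

```javascript
// Minimal per-PR bundle-budget check for CI.
function withinBudget(sizeBytes, baselineBytes, maxGrowthKb) {
  const growthKb = (sizeBytes - baselineBytes) / 1024;
  return growthKb <= maxGrowthKb;
}

// e.g. in a CI script:
//   const size = require('fs').statSync('dist/main.js').size;
//   if (!withinBudget(size, baselineFromMainBranch, 10)) process.exit(1);
```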

But even then, the accumulation of hundreds or thousands of PRs, each adding 1 kB, eventually bloats the bundle enough to become noticeable.

Perf work is thankless work, and having one perf team trying to keep things at bay while there are dozens of teams shipping features at a fast pace is not gonna cut it.

pas|14 days ago

It makes sense to have separate entry points for landing pages, no need for fancy providers and heavy imports.

In practice people are more than willing to wait for an SPA to load if it works well (figma, gmail/gdocs, discord's browser version used to be pretty good too, and then of course there are the horrible counter-examples AWS/GCP control panels, and so on).

g947o|14 days ago

Exactly. To avoid dependency explosion, you need at least one of 1) most libraries built internally 2) great discipline across teams 3) significant performance investment/mandate/enforcement (which likely comes from business requirement eg page load time). I have rarely seen that in my limited experience.

slopinthebag|14 days ago

> I’ll focus on React and Redux in some of my examples since that is what I have the most experience with, but much of this applies to other frameworks and to JS-heavy approaches in general.

That's not a fair assumption. Frameworks like Svelte, Solid, Vue etc have smaller bundle sizes and rendering speeds that approach the baseline vanilla-js cost.

I'm all for criticising Javascript, but moving everything to the server isn't a real solution either. Instead of slow React renders (50ms?), every interaction is a client-server round trip. The user pays the cost of the paradigm on each interaction instead of upfront with an initial JS payload. Etc.

carshodev|14 days ago

Yeah, this article is only about React. But it makes sense that someone would think this way, because many devs think JS web apps == React.

The problem is that React is "good enough" for most cases, and the performance degradations happen slowly enough that the devs/project leads don't see it until it's too late: they are already overly invested in their project, and switching would be too complicated/costly.

Svelte/SvelteKit and properly optimized packages solve almost all of the "problems" this article brings up.

mosdl|14 days ago

Plus, Redux is horrible for performance; it slows things down and overcomplicates everything.

wasmperson|14 days ago

> Instead of slow React renders (50ms?), every interaction is a client-server round trip.

This is true only if you use zero JavaScript, which isn't what the article is advocating (and even with zero JavaScript there's quite a bit of client-side interactivity built into CSS and HTML). Besides, in practice most interactions in an SPA also involve network round trips, in addition to that slow React render.

torginus|14 days ago

My 2 cents: I am not an experienced React dev, but the React Compiler came out recently with React 19, and it's supposed to do the same thing as Svelte's compiler: eliminate unnecessary DOM modifications by explicitly tracking which components rely on what state, thus making useMemo() unnecessary.

Since the article still references useMemo(), I wonder how up-to-date the rest of the article is.

littlecranky67|14 days ago

React has always tracked which component relies on what state, independent of the compiler. That is one of the reasons the rules of hooks have to exist: React tracks when a component calls useState() and thus knows the corresponding setState function that manipulates that particular state.

Idk why people claim React is bloat, especially since you can usually switch to Preact (4 kB) without changes if file size is an issue for you.

bryanrasmussen|14 days ago

useMemo is for caching calculations between renders, renders generally caused by state changes. React always tries to track state changes, but complicated states (represented by deep objects) can be recalculated too often. Not sure if React 19 improves this; I don't think so from reading the documentation.

On edit: useMemo is often used by devs to cover up mistakes in rendering architecture. I believe React 19 improves things so that you wouldn't write useMemo yourself, but I'm not sure it makes useMemo not at all useful.
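What a deps-array memo does can be sketched in plain JS (conceptual only, not React's actual implementation): recompute only when some dependency is no longer `Object.is`-equal to last render's value.

```javascript
// One memo slot, as a closure; React keeps one of these per useMemo call site.
function createMemo() {
  let lastDeps = null;
  let lastValue;
  return function memo(compute, deps) {
    const changed =
      lastDeps === null ||
      deps.length !== lastDeps.length ||
      deps.some((d, i) => !Object.is(d, lastDeps[i]));
    if (changed) {
      lastValue = compute();
      lastDeps = deps;
    }
    return lastValue;
  };
}
```

A deep object rebuilt on every render is never `Object.is`-equal to the previous one, which is exactly the too-frequent recalculation described above.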

TonyAlicea10|14 days ago

React compiler adds useMemo everywhere, even to your returned templates. It makes useMemo the most common hook in your codebase, and thus very necessary. Just not as necessary to write manually.

gatane|14 days ago

Github was usable and fast, now it is slow. Guess what changed...

ahartmetz|14 days ago

It's crazy slow. But it's also a closed source, Microsoft platform for Open Source, so it belongs in the trash anyway.

lenkite|14 days ago

It was rewritten in React and performance was annihilated. The React Virus annexed another victim, and we have one more zombie website.

gloosx|14 days ago

Acquisitionbombed by Microsoft! Instead of usability and performance it has AI now!

azangru|14 days ago

I think digressions about React dilute the message. Ok, we get it, react bad; but what is the actionable message here? What are the robust alternatives? There is a section on how changes to react (the introduction of hooks and of the concurrent mode) necessitated significant code changes almost amounting to rewrites; but which alternatives, other than vanilla web platform, can promise long-term stability? Which alternatives are better suited for which kinds of applications (roughly, an e-commerce site vs excalidraw)?

robertoandred|14 days ago

Eh, this argument falls apart for many reasons:

- His main example of bloated client-side dependencies is moment.js, which has been deprecated for five years in favor of smaller libraries and native APIs, and whose principal functionality (the manipulation and display of the user's date/time) isn't possible on the server anyway.

- There's an underlying assumption that server-side code is inherently good, performant, and well crafted. There are footguns in every single language and framework and library ever (he works for WordPress, he should know).

- He's right to point out the pain of React memoization, but the Compiler now does this for you, and better than you ever could manually.

- Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes, especially if poorly optimized, which easily and immediately dwarf bundle downloads; and slow database queries, which affect server-side code just as much as browser-side code.

afavour|14 days ago

> There's an underlying assumption that server-side code is inherently good, performant, and well crafted

To me it’s an assumption that server side code is going to be running on a server. Which is a known quantity and can be profiled to the nth degree. It’s extremely difficult to profile every possible device your site will run on, which is crucial with low powered mobile devices.

> Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes

Not really, no. Large bundle sizes prevent the initialisation of the app, which means the user can't do anything. By comparison, images and videos download asynchronously and get processed on a separate thread. JS bundles also need to be parsed after being downloaded; if you pair a crappy Android phone with a 5G connection, the parsing can literally take longer than the download.

youngtaff|14 days ago

> Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes, especially if poorly optimized, which easily and immediately dwarf bundle downloads; and slow database queries, which affect server-side code just as much as browser-side code.

In network terms, JS tends to be downloaded at a higher priority than both images and video, so it has a larger impact on when content appears.

JS is also the primary main-thread blocker for most apps/pages; profile any Next.js app and you'll see what a horrendous impact it has on the main thread and the visitor's experience.

gr4vityWall|14 days ago

> There's an underlying assumption that server-side code is inherently good, performant, and well crafted.

I didn't read it that way. I believe the underlying assumption is that the server-side code won't run in a power-constrained computer, thus having more performance headroom.

wackget|14 days ago

I am so grateful to the author for writing this article. For years I've been fighting a series of small battles with my peers who seem hell-bent on "upgrading" our e-commerce websites by rewriting them in React or another modern framework.

I've held the line, firm in my belief that there is truly no compelling reason for a shopping website to be turned into an SPA.

It's been difficult at times. The hype of new and shiny tools is real. Like the article mentions, a lot of devs don't even know that there is another way to build things for the web. They don't understand that it's not normal to push megabytes of JavaScript to users' browsers, or that displaying some text on a page doesn't have to start with `<React><App/></React>`.

That's terrifying to me.

Articles like this give me hope that no, I'm not losing my mind. Once the current framework fads eventually die out - as they always do - the core web technologies will remain.

hinkley|14 days ago

I think shopping gets this in spades too because not all shopping sites are meant to be particularly sticky.

It's one thing to browse the catalog at my leisure on gigabit networking, a 5k display and 16 CPU cores. It's another thing when I'm standing in Macy's or Home Depot and they don't quite have the thing I thought they have and I'm on my phone trying to figure out if I can drive half a mile to your store and get it. If you want to poach that sale your site better be fast, rather than sticky.

wmf|14 days ago

Let them have React and SSR everything? Didn't the NYT do this?

nchmy|13 days ago

He links to infrequently.org a few times in the article. You should read every article there; it's a revelation.

Then check out Datastar

flohofwoe|14 days ago

This seems to be more about React than JavaScript (or, indirectly, about the DOM). React sitting on top of a browser-style DOM would be slow in any language, while JavaScript itself can be surprisingly fast, especially when sticking to the right subset. It "only" needs a big and complex JS engine to get that kind of performance.

tpoacher|14 days ago

As an aside, I like their use of blockquotes+details/summary blocks for inserting "afterthoughts/addendums".

It's a nice touch, and it works pretty well. Partly because, as a design choice, it forces you to add the afterthought between paragraphs, so the interruption in reading flow is minimal.

exceptione|14 days ago

I am the first to admit that SSR has worked fine since PHP. But there are two points I'd like to make:

1) choose boring technology; 2) it is the library, stupid

First question should be: what type of web application are we talking about? If your blog does not render or your e-commerce site stops working without javascript, shame on you indeed.

If we are talking about SPAs, i.e. those replacing traditional desktop applications, and you want to throw out React, then we have something to discuss. Because how are you going to replace MUI/MUI(X)? I keep hearing about the coolness of alternatives, and I am certainly open to them, but in the end they can't just be better in some aspects or even fundamentals while completely lacking in practical matters. A simple benchmark I keep going back to is whether $hotness is able to replace [1], which shows just one component of many in a cohesive system.

I don't dispute the principles in the article, but I do dispute conclusions like "ripping out React would be the solution".

1. https://mui.com/x/react-data-grid/tree-data/

kylecazar|14 days ago

"Now’s a good time to figure out whether your client-side application should have a server-side aspect to it, to speed up initial renders."

My how the tables have turned!

verdverm|14 days ago

everything old is new again

BrenBarn|13 days ago

I'm not a big JS fan, but to be fair this article seems more like "bloated JS frameworks have poor performance". And like, yeah. But you can use JS without using a bloated framework.

It's funny though that the article is circling back to Web 2.0-era server-side stuff. It's an idea whose time has come!

allreduce|14 days ago

For websites I use regularly, I can identify those which render server side because I trust that when they load on a slow connection my state is still there. When I see a loading icon on heavy client side JS "apps" I anticipate something breaking and me having to reload and re-enter whatever I was working on.

LAC-Tech|14 days ago

Wonderful article. I do maintenance programming, and many of the problems you mention with typical React apps are also code-maintenance nightmares. Managing a large number of fast-moving third-party dependencies will destroy your developer budget, but devs can't see it because they're "Best Practices".

Devasta|14 days ago

Why would performance be a goal?

If a native desktop app crashes, your users will gnash their teeth and curse your name. A web app? They shrug and refresh the page. They have been trained to do this for decades now, why would anyone believe this isn't intended behavior?

No one has a reasonable expectation of quality when it comes to the web.

zahlman|14 days ago

> No one has a reasonable expectation of quality when it comes to the web.

And that's exactly why I want native desktop apps to make a resurgence.

flatcoke|13 days ago

Agree with the general point, but the nuance matters. Static-first approaches like static exports solve most of this — you get the DX of a JS framework without shipping a runtime to the client. The problem isn't JavaScript itself, it's shipping too much of it.

SauntSolaire|13 days ago

Why do people bother creating new accounts just to post these generic AI comments? It's like people want to ruin HN as fast as (in)humanly possible.

suralind|14 days ago

I've been very happy using SvelteKit for some side projects. At this point I wouldn't label myself as a frontend developer anymore, but when done right, it's probably the closest thing to modern, performant, and reactive that I know of. It also still works really well if the JS just breaks.

jadbox|14 days ago

I've gone all SSR (server-side render) with JSX using Astro or Elysia. If I need frontend logic, I just sprinkle in a little Htmx or roll my own inline js function. It makes debugging 1000% easier and the pagerank scores are usually amazing out of the box.

cyberax|14 days ago

Our webapp is nearly instant, and it's built on raw React with some sprinkling of Tanstack (their Local Collection DB is a masterpiece).

And our stack is intentionally client-heavy. We proactively synchronize all the relevant data and keep it in IndexedDB, with cross-tab coordination. The server is used only for mutating operations and for some actions that necessarily require server-side logic.

The issue with dependencies and package size is valid, but it's also really not a big deal for performance unless you go WAAAY overboard. Or use crappy dependencies (Clerk, I'm looking at you: 2.5 MB for an auth library!?!?).

As for hydration, I found that it often slows down things. It's a valid approach when you're doing query-based apps, but when you have all the data locally, it just makes little sense. It also adds a ton of complexity, because now your code has to automagically run in multiple environments. I don't think it can really ever work reliably and safely.

tommek4077|14 days ago

In other news: water is wet. I genuinely don't understand how anyone is still pretending otherwise. Server-side rendering is so much easier to deliver in a performant way, yet it feels like it's being increasingly forgotten — or worse, actively dismissed as outdated. Out of convenience, more and more developers keep pushing logic and rendering onto the client, as if the browser were an infinitely capable runtime. The result is exactly what this article describes: bloated bundles, fragile performance, and an endless cycle of optimization that never quite sticks.

tjpnz|14 days ago

Server-rendered HTML, HTML-fragment endpoints, and jQuery's .load() were always the sweet spot for me - McMaster-Carr[0] does the same thing behind the scenes and utterly destroys every "modern" webapp in existence today. Why did everything have to become so hard?

0: https://www.mcmaster.com/
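The fragment-endpoint pattern above can be sketched with jQuery's `.load()` (the URL and ids are hypothetical): the endpoint returns a server-rendered HTML fragment, and jQuery injects it directly into the page.

```html
<table id="orders">
  <tbody><!-- server-rendered rows --></tbody>
</table>
<script>
  // .load() fetches the URL and sets the matched element's HTML to the
  // response body, so the "template" lives entirely on the server.
  $('#orders tbody').load('/orders/rows?page=2');
</script>
```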

holoduke|14 days ago

Pure client-side rendering is the only way to get max speed with the lowest latency possible. With SSR you always have bigger payloads or extra network round trips.

Meneth|14 days ago

JavaScript isn't good for performance? Could've told you that 20 years ago.

monster_truck|14 days ago

This seems like it's more about React. JavaScript is fine; React is garbage.

prinny_|14 days ago

People in this thread hating on React seem to miss the crucial point that 2016 React was a godsend compared to just about every other option available. Vue only picked up steam quite a bit later, as React became more and more bloated or had its development steered by Vercel down a certain road. Angular was THE framework to avoid working on, and people hated having to define multiple files for a simple component. The timing for React was just right back in the day.

Given how frontends are rewritten every 7-10 years, there is ample room for other frameworks to knock React off its throne. But before asking "but why not X" you also have to consider that organizations by now have almost a decade of experience building React apps, and this plays a major role when deciding on which UI framework to rely on.

dsiegel2275|14 days ago

Lost me at the "React is a framework" assertion. The key difference between a "framework" and a "library" is the inversion of control that exists in a framework.

React is a library - your app still maintains control of application state and drives the main workings of the application. It simply uses the React library to render that application state.

gloosx|14 days ago

Fully agree. Inversion of control alone doesn't make something a framework. Along the architectural axis, React is unopinionated, so it has library-level scope but framework-like control semantics. If framework = anything that uses inversion of control, then a lot of things suddenly become frameworks, including things nobody calls frameworks. One can call React a "rendering framework", but calling it a "web-application framework" is not factually correct.

shevy-java|14 days ago

Wasn't WebAssembly going to change all this?

Somehow that never really came to pass.

mcraiha|14 days ago

DOM access is not in Wasm.

Jgoauh|13 days ago

Do you think using WebAssembly (.wasm) for non-UI-related code might help with client-side performance, for apps where server-side work is less applicable?

piyh|14 days ago

Only a single passing mention of web components?

vaylian|14 days ago

> You can use isolated JS scripts, or other approaches like progressively-enhanced web components

How would one use "progressively enhanced" web components? Maybe I misunderstand the intention behind this statement, but web components are either supported or not. There doesn't seem to be some kind of progression.
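One reading of "progressively enhanced" here: the server ships plain, working HTML, and the custom element only layers behavior on top when JS and `customElements` are available. A minimal sketch (the element name and markup are invented for illustration):

```javascript
// Fallback base class so the sketch also loads outside a browser.
const Base = globalThis.HTMLElement ?? class {};

class CopyText extends Base {
  connectedCallback() {
    const button = this.querySelector('button');
    if (!button) return; // markup missing or stripped: page still works
    button.addEventListener('click', () =>
      navigator.clipboard.writeText(this.getAttribute('value') ?? '')
    );
  }
}

// Feature-detect: in an old browser (or with JS disabled entirely), the
// server-rendered content inside <copy-text> simply stays inert but readable.
if (globalThis.customElements) {
  customElements.define('copy-text', CopyText);
}
```

The "progression" is that the baseline markup is usable on its own; registering the element upgrades it in place where the platform supports it.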

jongjong|14 days ago

I was never a big React fan myself. As someone who has used a lot of different JavaScript frameworks over many years, I can say confidently that it's not the best JS framework; especially nowadays due to bloat.

Yet it's better than anything available in any other programming language, on any other platform in existence.

Never bet against JavaScript. People have done this over and over and over again since it was invented. So many haters. It's like every junior dev was born a JavaScript hater and spends most of their career slowly working themselves to a state of 'tolerating JavaScript'.

JS was designed to be future-proof and it kept improving over the years; what happened is it improved much faster than people were able to adjust their emotions about it... These people even preached the JS hate to a whole generation of juniors; some of whom never even experienced JavaScript first-hand. It's been cool to hate JavaScript since I can remember.

JavaScript does have some bad parts, as does any other programming language, but the good parts of JS are better than anything else in existence. People keep trying to build abstractions on top (e.g. TypeScript) to try to distance people from JavaScript, but they keep coming back over and over again... And these people will never admit to themselves that maybe the reason they keep coming back to JavaScript is because it's pretty darn great. Not perfect, but great nonetheless.

It's hilarious that now we have WebAssembly, meaning you can compile any language to run in the browser... But almost nobody is doing that. Do people realize how much work was required to bring WebAssembly to the browser? Everyone knows it exists, it's there for you to use, but nobody is using it! What does that say? Oh, it's because of the bundle size? Come on! Look at React: React bundles are so bloated, but they are heavily used! The excuse that JavaScript's success is a result of its browser monopoly is gone!

Enough is enough! The fact is; you probably love JavaScript but you're just too weak to admit it! I think the problem is that non-JS developers have got a big mouth and small hands...

Const-me|14 days ago

> but nobody is using it! What does that say?

It’s impossible to replace JS with WebAssembly because all state-mutating functions (DOM tree manipulation and events, WebGL rendering, all other IO) are unavailable to WebAssembly. They expect people to do all that through JavaScript glue.

Pretty sure if WebAssembly were designed to replace JS instead of merely supplementing it, we would have little JS left on the web.

someone_19|14 days ago

> but the good parts of JS are better than anything else in existence

What are you talking about?! I can't think of a single thing in JS that I could say is good.

Okay, two big corporations have invested a lot of money and effort into making V8 and TypeScript, and now it's useful. But I don't consider that exactly part of JS.

TonyStr|14 days ago

You lost me at TypeScript. TypeScript is great not because it abstracts away any JavaScript functionality (it doesn't), but because it allows IDE integrations (mainly via the LSP) to better understand your code, enabling go-to-definition, hover docs, autocomplete, semantic highlighting, code actions, inline error messages, etc.

But I agree many people are jumping on the javascript hate train without really understanding the modern web landscape.

gorgolo|14 days ago

Do people really hate JavaScript, or do they just hate the design choices and results that it seems to be correlated with?

At the end of the day I’m using SaaS tools that are apparently written in React, and I am astounded by how slow and heavy they are. If you are editing a page on our company's cloud-based wiki, I’ve seen my Chrome RAM balloon from 3 GB to 16 GB. A mistake was made somewhere, that I know.

rglover|14 days ago

Interop has entered the chat.

exabrial|14 days ago

I’m still baffled why this language is the universal browser standard. There should be hundreds competing.

phantomathkg|14 days ago

I'm still baffled why you don't see that JavaScript itself is fast; how the page is rendered is the issue.