> If the major consumers of the most heavily used frameworks all used the same CDN for delivery, and smaller websites that want to use those same frameworks followed suit, this would greatly reduce wasted traffic across the web and make everyone faster.
Except that browsers no longer cache across origins (caches are partitioned per top-level site), so this doesn't work. There are also just a huge number of libraries and versions, so even if browsers did share the cache, it's unlikely to result in much benefit.
Also, that massively centralizes the Internet in a way that's not great. If one of those CDNs gets hacked or just goes down for an hour, it breaks every website that uses it.
The other problem is that JS bundlers will (attempt to) strip out unused code to reduce the final bundle size. That's not possible if the library isn't part of the build.
And many sites now use their own CDN anyway. Maybe the move away from centralized CDNs is wasteful in the aggregate, but it's not responsible for any perceived reduction in site performance.
I wanted to write the same comment. The premise of the article is wrong: "if all websites link to the same resources, that resource will not be downloaded multiple times across domains".
I agree, and I'm trying to do something about it with two no-build libraries:
https://htmx.org - an 11k library that pushes HTML forward as a hypermedia, making things like AJAX, SSE and CSS transitions available directly in HTML
https://hyperscript.org - a 20k library that replaces jquery/javascript with an inline, HyperTalk inspired syntax
Both can be included on a site without any build infrastructure, and are attempts at bringing the joy of hypermedia and the joy of scripting, respectively, back to the web.
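For anyone curious what "no build infrastructure" means in practice, a minimal sketch of including htmx (the unpkg URL and version pin here are just for illustration):

```html
<!-- drop-in include: no bundler, no npm, no build step -->
<script src="https://unpkg.com/htmx.org@1.9.12"></script>

<!-- clicking issues an AJAX GET and swaps the returned HTML in place -->
<button hx-get="/clicked" hx-swap="outerHTML">Click me</button>
```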
They're both server-render oriented (e.g. Django), am I right?
SPAs make things complicated, but server rendering doesn't handle all use cases either. Is there a middle ground where I can still manipulate the DOM on the client side, without going all the way in like an SPA? That is, something like jQuery (the best/simplest API ever, even after ES6), but with a virtual DOM. Not sure if Mithril etc. fits the bill.
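For what it's worth, Mithril's pitch is roughly that middle ground: a jQuery-sized API over a virtual DOM. A sketch just to show the shape of it:

```html
<!-- Mithril: a small virtual-DOM library usable without a build step -->
<script src="https://unpkg.com/mithril/mithril.js"></script>
<div id="app"></div>
<script>
  let count = 0;
  // m() builds virtual DOM nodes; Mithril diffs them against the real DOM
  // and auto-redraws after event handlers fire
  const Counter = {
    view: () => m("button", { onclick: () => count++ }, "clicks: " + count),
  };
  m.mount(document.getElementById("app"), Counter);
</script>
```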
IMO the biggest problem is that the industry became all about developer experience and neglected user experience.
Doing an analysis of the JS bundled with many major sites is an utter horror show: React is big and bloated, alternatives like Preact exist but no-one cares. React provides The Best Developer Experience so that is what we shall use. GraphQL is the cool thing to use these days so we’ll throw in a giant client library along with all our query text. Sure, browsers handle REST out of the box but that’s not what works best for our developers. We’ll use CSS in JS despite the performance being worse, it’s what our team likes. We’ll bundle a giant CoreJS shim to all browsers even though the vast majority won’t need it. Running two builds is too annoying to set up in our workflow. The site loads fine on my MacBook Pro so why stress what Lighthouse says?
So on and so forth. For years I’ve heard that this isn’t a problem because all the extra developer productivity means they can move so much faster. It’s been long enough now that I’m comfortable dismissing that argument as nonsense. And many users are essentially trapped: for example, I bank with Chase and their web site is absolutely atrocious. I talk to friends and they say the same. But the UX of online banking is very rarely a reason for anyone to change bank so the product folks aren’t going to push for performance. It’s something developers have to proactively care about and as an industry we just don’t.
I hope eventually we see the industry change but for now it really feels like no-one cares. We’re prioritising the speed of the shovelling while ignoring the quantity of shit.
> all about developer experience and neglected user experience
I think this is a mix of good old cargo culting and how hot the dev market is. Small companies don't have the leverage to tell their engineers "no" when they do a bit of resume driven development.
React makes sense for many sites at larger tech companies paying big salaries because many of them are delivering a richer "application" experience. Some engineers blindly follow what "better" (massive air quotes) companies are doing. Others who want to work at these companies want the skills that will make them more hireable and so choose React. Devs who fall into neither camp make up a small enough group that they don't generally get their way.
I’d agree with you except for the fact that it’s really hard to say that developer experience today is better than what it was before. Especially once you adjust for the fact that browsers are so much better and standardized.
React and its ilk don't even deliver on those goals. They encourage bad programming, make terrible software practices easy, and separate code just enough that the piles of unmaintainable garbage don't smash into each other too harshly.
It doesn't actually empower anything - it acts as a scolding nanny prohibiting people from making things explode.
It does this by imposing a narrow opinion about things that's constantly changing with each release as it herds the programmer through very specific and alarmingly complicated ways of doing things.
It's fundamentally a tool of oppression, not one of expression; one of coercion, not liberation. I hate it so much.
>> IMO the biggest problem is that the industry became all about developer experience and neglected user experience.
I recently transitioned from being a developer to being an accessibility engineer. I've been able to see first hand how the user experience has been neglected, especially for people with disabilities.
A lot of the JS frameworks are getting better at A11Y, but it's still a major work in progress.
There is always a time vs end user experience tradeoff, even with physical goods. I think it's part of a bigger trend towards (more prolific) crappy stuff generally.
I do think there is space for tools that improve the user experience whilst not regressing on ease of development.
What is the alternative to catering to developer experience though?
If developers struggle with performant technologies / methods / frameworks, I'm sure they're going to end up producing crap apps in the end still.
Rather than complain, we should be focusing on making good performance the happy path. In fact, I'd say that's most of what the latest frameworks that are popular focus on, so I don't really understand the issue.
If you build the developer experience, they will come. If you don't, I hate to say it, but everyone has mouths to feed.
Hmm, I don’t think this is actually true. There was plenty to complain about at any point in the development of the web, and complaining about bloated frameworks is simply the complaint du jour.
In practice I load every webpage once (even very complicated applications can be loaded in under 2MB), and then go on my merry way.
It's been bleak for a while, but I think we are coming round the bend.
For example Next does a pretty good job of code splitting and offers rendering solutions from 100% client side to 100% static, as in no React runtime at all. Building a performant Next site that only pulls in the JS it needs is quite doable now.
React server side components seem at least interesting here. I still have no experience with them but they do at least claim to tackle some of these issues. As I understand it other frameworks like Svelte and Vue have been exploring these ideas too.
I think the bleakness will continue and there will be a long tail of it. But I really do think better websites are coming and in many cases are already here.
I think it's much simpler -- it's about the focus on design over usefulness. I really don't care if the u/x looks like someone's weekend VB project as long as it does what I want in an obvious, discoverable way. Some of the worst things in web (and desktop) u/x seem to come from the design-side of the house and are more about some 1950's design ethic of what would look good in a magazine layout, rather than treating the computer as a tool that gets shit done. (Of course, some sites _are_ magazines, and they get a pass...)
I feel the same about iOS apps -- they are all adventure games.
And in-place editing on a phone? Just fill the whole freaking screen with a big editor window in a giant font where I can target the cursor with my fingers, and let me submit it to the field when I'm done.
My company does complex cloud and on-prem web-based apps, JS front and back.
These inefficient frameworks make our developers more productive and allow us to design, deliver, and fix fast. Storage is cheaper, bandwidth is cheaper, processing cycles are cheaper, all are cheaper than developer time.
And these powerful frameworks give us great speed in iterating new features, which makes our customers happy.
The article is too focused on efficient use of computing resources at the expense of ignoring developer costs and customer satisfaction.
> My company does complex cloud and on-prem web-based apps, JS front and back.
Then you've certainly run into power users.
Chances are browsers can load and display a 2MB server-generated HTML table faster than the DI system in your 5 MB angular app can start all the required services to be ready to launch your pagination component.
However once it is ready it may be quickly apparent that users would be much more productive using that giant table and CTRL+F than a laggy angular app that would struggle even handling that much data at once in an idiomatic way.
Of course I am making this up to illustrate a point. However I use "one giant HTML table" as a sort of bar to meet when developing in modern frameworks. If with this insanely complicated and large array of tools I can't even do better than that (faster to implement!) competing solution, I might as well not bother.
That "a bunch of large server-rendered HTML tables" just so often happen to be the previous solution that was in place, and power users are likely to complain if we do worse, is just the cherry on top.
To do better we often have to avoid certain things that would be considered modern. Like not putting lots of empty space in our layout, no server-side search that would introduce too much (pointless) latency compared to client-side, and some un-idiomatic stuff to actually handle large[1] amounts of data without change-detection/updating in our framework of choice eating user's CPUs for breakfast.
[1]: The amount of data/elements at which change detection in idiomatic React and Angular becomes slow wouldn't have been considered 'large' 15 years ago. Though obviously people then had to put thought into this stuff, and it certainly wasn't automatic.
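That bar is low in the best way: a server can emit the "giant HTML table" in a handful of lines. A toy sketch (illustration only; real code would HTML-escape untrusted values):

```javascript
// toy server-side renderer for the "one giant HTML table" baseline
// (real code must HTML-escape values before interpolating them)
function renderTable(rows) {
  const body = rows
    .map((r) => `<tr><td>${r.id}</td><td>${r.name}</td></tr>`)
    .join("");
  return `<table><thead><tr><th>ID</th><th>Name</th></tr></thead>` +
         `<tbody>${body}</tbody></table>`;
}

const html = renderTable([{ id: 1, name: "Ada" }, { id: 2, name: "Grace" }]);
console.log(html.includes("<td>Ada</td>")); // true
```

Every row is plain markup the browser can stream, render, and CTRL+F through with no script running at all.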
> great speed in iterating new features […] makes our customers happy
For me this is a leap. I can't think of many examples of software which I use where new features have actually made me happy. Normally it's just change which forces me to learn something new while I'm in the middle of trying to accomplish something actually productive.
I think our industry over-estimates the value of "new features". But in my experience 90% of new features provide neutral or negative value. If—instead of a new feature—the software I used released a performance improvement, then that would actually help make me more productive.
As a JavaScript developer who has been doing this work since before jQuery became popular I sadly agree. It appears there is an arms race to the bottom, as in how low can we lower the bar of acceptability for prospective employment candidates.
I can remember back in the day at Travelocity they couldn't hire JavaScript developers to save their business. A $100 million business and the darling of the travel industry could not hire competent front-end developers at all. It's how I became a developer: involuntary reassignment from designer.
I can also remember Travelocity trying to solve this problem with GWT (Google Web Toolkit). This mistake of an application compiled Java to JavaScript for people who probably should not have been writing either. It's all good so long as the code never needs changes or updates, at which point the Java developers wanted JavaScript developers to fix their obfuscated code.
How hard could writing this rinky dink language possibly be? Back in those days there were some actual challenges, but they were an inch deep and a mile wide. There were many technical areas to own, but only two separated the grossly incompetent from the barely employable:
* Cross browser compatibility (IE made this more challenging).
* The DOM. Then, just as now, tree models broke the souls of many software developers. I have never figured this out, because elementary schools are able to explain abstract concepts as tree models to children, but many adults cannot and will never understand the idea on even the most primitive level.
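To the commenter's point, the concept really is elementary: a DOM-shaped tree and a recursive walk fit in a few lines (plain objects here, standing in for real nodes):

```javascript
// a toy tree in the shape of a DOM fragment
const tree = {
  tag: "ul",
  children: [
    { tag: "li", children: [] },
    { tag: "li", children: [{ tag: "a", children: [] }] },
  ],
};

// count nodes with a given tag by recursing through children
function countTags(node, tag) {
  const self = node.tag === tag ? 1 : 0;
  return self + node.children.reduce((n, c) => n + countTags(c, tag), 0);
}

console.log(countTags(tree, "li")); // 2
```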
jQuery came and saved the day. No more walking tree models. Everything is now a chained method or a CSS selector through a very slow Sizzle engine. Now everybody and their dog could add events and text to a web page so long as they could read the jQuery API.
In hindsight none of this is surprising in the least. The surprising part is the compensation. Back in the day when the work was hard front-end developers were paid next to nothing. You were much better off being a Java developer where you could barely have a pulse and earn 2-3x as much. As the problems got easier and the number of available candidates blossomed the compensation surprisingly went up... slowly.
I remember my management trying to get me to take over a GWT-based spreadsheet-like app for managing FMEAs. The developer only used it because he did not know JavaScript and had no desire to learn. It was developed with zero thought given to debugging. Needless to say, I punted that app over to another developer to handle, as I saw it was going to fail miserably after the first demonstration and my initial questions to the developer. That project was an abject failure after the company spent hundreds of thousands of dollars on it.
The author is correct but not for that farcical reason. Here's some real ones. It encourages the following:
A URL that is now less Universal than ever, pointing to an application more than a piece of content
Content that is not self contained in a single document so it is hard to archive for the future
Content that is not served with the page nor containing a universal reference so it is hard to scrape and catalog
A profound inability to inspect websites because they are now transpiled via frameworks and emit something obtuse and only machine, not human readable
Content that is now in a non-generalized method of access so instrumenting it to fit it for different purposes (the "Free" part of "information is meant to be free") is now fragile and dubious
And more controversial:
The careful separation of concerns of the Javascript (Controller), HTML (Model), and CSS (View) gets tossed together and entangled in a single stream of hybrid languages where they no longer are separated. Instead people rely on less rigorously made tools to achieve the same ends with increased difficulty, less reliability, and less functionality - all cost and no benefit.
As someone working on an actual webapp ... eh it's much smaller than an equivalent iOS or Android app would be. Total transfer size around 5MB.
Links to individual pages would be useless to archive because they're not documents. Kinda like "Why would you bookmark an individual app window?"
You can scrape us with a browser, but why would you?
You can right click inspect or use react devtools, which is nice if you're trying to understand things. That part is way more accessible than an equivalent iOS, Android, or Desktop app would be.
Oh and we don't need to build an iOS, Android, and Desktop app! It's a single webapp that works in any client of your choosing.
Plus you get fresh updates and bugfixes with every page load, no delayed update cycle. This may be controversial if you dislike a UX change we make.
> A profound inability to inspect websites because they are now transpiled via frameworks and emit something obtuse and only machine, not human readable
IMO, the setting where open source communities are likely to grow is the right place to share code. This is also the setting where code is most likely to be understood, as understanding a codebase is very hard. Sharing code is not just a matter of dumping original source code to the public, and I think the use case around ordinary users examining source code is very, very niche.
And how should compiled-language communities like Rust handle the issue of sharing code? Should they distribute the original source code with the expectation that their users shall compile on their end? Is that the right way to share code, or is that just dumping code to the public?
In my opinion the solution was always progressive enhancement, but things went crazy with JavaScript: The Good Parts and especially Node.js, which IMO is a huge hack.
I built https://mkws.sh as a solution to at least static site generation. It can be enhanced with a Makefile but for now it's simple enough. I don't believe static site generation is a JavaScript job, and to be honest, I don't believe JavaScript belongs on the server side.
This doesn't line up with my experience. IME these are the major drivers of web "slowness" for different types of sites:
1) Client-side-rendered apps that haven't embraced hydration of pre-rendered HTML suffer for it. Not because the JS is necessarily too large, but because it blocks the content from being visible. This would be bad even with a small bundle. Luckily this is an easy problem to solve today, but not every site has done so.
2) Many public-facing sites are afflicted with marketing scripts. I've seen sites with (I kid you not) several times the amount of marketing JS (by weight) as they have actual functionality-giving JS. The actual devs often don't even have control over these scripts (thanks to Google Tag Manager), and even if they do they probably don't have a say in what can be removed.
3) Complex sites/web-based tools are a separate beast. You're not optimizing for first-visit, you're optimizing for responsive interactions. This has even less to do with bundle size, and often not even much to do with JS itself. In my experience the limiting factor on these sites is usually reflow (DOM/style changes).
This general topic comes up often, and the top reasons usually end up being:
- JS development has a low barrier to entry, so there is a wide/diverse range of developer skills and knowledge
- Most (webpack) tutorials don't cover build optimization, like tree-shaking, especially entry level, which is normally what people find to "just get the job done."
- There are too many options for frameworks for people to make _good_ decisions, so they often fallback to _popular_ decisions
This isn't even a recent problem. I have a few clients that wanted to use old-school template packages, like metronic, and that is a massive bloated build, and yet it's basically jquery+bootstrap with a handful of libraries.
The biggest thing is, as professional web developers, we have a responsibility to the end user to make a web page load fast and be usable quickly, and I think we often overlook that. I don't care if your framework is a no-build thing or uses webpack/rollup/etc.; what matters is that a user doesn't have to download 50MB of JavaScript for a single page, especially if the end result of that page is considerably less than 50MB worth of text, styles, and images.
The flip side is, most clients just want their project done quickly so they can make money. There's nothing wrong with that, but it can't be at the expense of their users, and developers should be able to give good consultation and advice to their clients about the trade-offs of libraries/frameworks/build tools/etc. and the impact they will have on the end-users.
Personally, I like building without extra build tools. `script type="module"` is great, and smaller frameworks (my go-to is hyperapp, but there are several others) are often not awful to grab a local es6 copy and toss it in your public web directory, import it, and go to town.
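A sketch of that workflow (the path and helper module here are hypothetical):

```html
<!-- the browser resolves ES module imports natively: no bundler required -->
<script type="module">
  // hypothetical helper, copied into the public web directory
  import { formatDate } from "/js/utils.js";
  document.getElementById("updated").textContent = formatDate(new Date());
</script>
<p>Last updated: <span id="updated"></span></p>
```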
My hope is that at some point the web “standard library” provided by the browser will be rich enough that you won’t need frameworks and would only need very few libraries at all. I really want a batteries-included standard browser library. Does anyone anywhere (WHATWG, W3C, etc.) share this goal?
> The organic evolution of the web today has made the web worse not better and caused incredible amounts of waste
I think this is a very subjective take. There is also a strong argument that the web is much better today because of modern frameworks and technologies. The web has shifted from being a place for basic content and documents to a full-blown application platform. 10-15 years ago, if a certain application was only built for (say) macOS, well, tough luck! You won’t be able to use it. Today, many popular applications can be used in the browser on any device and any operating system. (Examples like full office suites including MS office and Google docs, or photo editing apps, or music streaming platforms, or video conferencing.) That’s a pretty incredible advantage and improvement to the general state of software, and it’s a very strong improvement for users who no longer need to worry about incompatible software or installation processes.
I would argue that this advancement would not have happened without frameworks. Or at the very least, it brought about frameworks. Since the web platform itself is not designed for applications, any large application will probably end up writing some sort of abstraction on DOM rendering and state management, just to make it halfway maintainable. So if we want web applications, which are very beneficial in many circumstances, frameworks and tooling are a part of that.
Of course bundle size is still a problem to worry about. It’s a problem inherent to the benefit of the platform. Typical software can make the bundle as big as they want, and you download and install it once. The web platform forces you to care about the size in a way that just isn’t necessary for other platforms.
“Modern JS” is really not what is making the web worse. What’s making the web worse is ads, tracking, platform-lock-in, big monopolies, censorship, spam, poor content safety practices, etc.
Like, of course you don’t need a big framework for your blog or basic content site. It’s all about choosing the tool for the job. Half the time, the blogger probably just wanted to play around with a new technology. It really doesn’t matter that much for small little projects that are just people playing around.
Bundle size is important when you have a big enough user base and really have a need to optimize for poor internet connection and slower devices — but it’s just bundle size. You’d get even bigger bundle sizes without using modern tools like webpack. The bundle size comes from the larger types of apps we’re building now that we didn’t used to build, not from web frameworks specifically. IMO, that makes bundle sizes an important, but relatively small aspect of “problems related to the web.”
Modern JavaScript was created by megacorporations and is entirely designed for their use cases. The W3C hasn't really been involved since 2018, so WHATWG, and therefore the DOM, is corporate controlled. ECMAScript itself even more so.
There's an easy solution though. Just don't use JavaScript to make your personal websites. The corporate web can wallow in its own filth and decay for all I care. If the humans on the web make websites without JS, the web can be, and is, still great.
I’ve always assumed it went something like: the web became popular, lots of programmers from other fields moved to web programming since that is where the jobs were at, they brought with them all their cruft and complexity and forced them onto web development (regardless of whether it was appropriate or not), and thus we have the insane complex mess web development has become.
* JS's history is all about "quick and dirty". jQuery in older times was meant to paper over the quick and dirty and give developers more convenience, and old JS without JQuery was pretty awful
* we are asking JS to do things it was never meant to do; in particular, the desire for "app-like" experiences and SPAs.
* JS is very backward compatible, which means old webpages can be archived, but also means that all the warts are still around.
* JS interpretation depends on browsers, which means if you want to use some modern features you will have to use some kind of transpiling, polyfill, and or framework to get some consistent behavior on all of them (jQuery in older times, babel today), and the differences in browsers means that these adapters have to be mind-bogglingly complex a lot of times
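On that last point, a concrete flavor of what transpilers bridge: the same null-safe lookup in modern syntax versus (roughly) the compatible output a tool like Babel emits for older engines:

```javascript
const user = { profile: { name: "Ada" } };

// modern syntax: optional chaining + nullish coalescing
const city = user.address?.city ?? "unknown";

// roughly the equivalent a transpiler generates for engines without ?. and ??
var _a = user.address;
var cityCompat = _a == null ? undefined : _a.city;
if (cityCompat === null || cityCompat === undefined) cityCompat = "unknown";

console.log(city, cityCompat); // both are "unknown"
```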
This is a solved problem for many of us. Svelte, for example, has arguably unparalleled DX as a framework AND compiles to hyper-performant, tiny bundles with server-rendered pages that load faster than you can blink! I think we're past a lot of this now that we have frameworks like Astro and SvelteKit, and WASM for more heavy lifting.
He says Modern Javascript but what he really means is Modern JS Frameworks (React and others). Don't use a framework if you don't need it, or use a framework and pre-render.
Frameworks being large and taking a long time to parse isn't the fault of JavaScript or NPM.
Web development leverages a terrible developer experience to create a terrible user experience.
And get off my lawn! :)
[+] [-] PeterWhittaker|4 years ago|reply
How? Why? Efficiency?
My company does complex cloud and on-prem web-based apps, JS front and back.
These inefficient frameworks make our developers more productive and allow us to design, deliver, and fix fast. Storage is cheaper, bandwidth is cheaper, processing cycles are cheaper, all are cheaper than developer time.
And these powerful frameworks give us great speed in iterating new features, which makes our customers happy.
The article is too focused on efficient use of computing resources at the expense of ignoring developer costs and customer satisfaction.
[+] [-] chmod775|4 years ago|reply
Then you've certainly run into power users.
Chances are browsers can load and display a 2MB server-generated HTML table faster than the DI system in your 5MB Angular app can start all the required services just to launch your pagination component.
However, once it is ready, it may quickly become apparent that users would be much more productive with that giant table and Ctrl+F than with a laggy Angular app that would struggle even to handle that much data at once in an idiomatic way.
Of course I am making this up to illustrate a point. However I use "one giant HTML table" as a sort of bar to meet when developing in modern frameworks. If with this insanely complicated and large array of tools I can't even do better than that (faster to implement!) competing solution, I might as well not bother.
That "a bunch of large server-rendered HTML tables" so often happens to be the previous solution that was in place, and that power users are likely to complain if we do worse, is just the cherry on top.
To do better we often have to avoid certain things that would be considered modern. Like not putting lots of empty space in our layout, no server-side search that would introduce too much (pointless) latency compared to client-side, and some un-idiomatic stuff to actually handle large[1] amounts of data without change-detection/updating in our framework of choice eating user's CPUs for breakfast.
[1]: The amount of data/elements at which change detection in idiomatic React and Angular becomes slow wouldn't have been considered 'large' 15 years ago. Though obviously people then had to put thought into this stuff, and it certainly wasn't automatic.
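A rough sketch of the "one giant HTML table" baseline the comment uses as its bar: build the markup as a single string, hand it to the browser in one parse, and leave search to native Ctrl+F. The `renderTable` helper and row shape are illustrative:

```javascript
// Minimal sketch: render a big dataset as one HTML string so the
// browser parses and lays it out in a single pass, instead of creating
// tens of thousands of elements through a framework's change detection.
function renderTable(rows) {
  const body = rows
    .map((r) => `<tr><td>${r.id}</td><td>${r.name}</td></tr>`)
    .join("");
  return `<table><tbody>${body}</tbody></table>`;
}

// In the browser you would assign the result once:
//   document.getElementById("report").innerHTML = renderTable(rows);
// Searching is then the browser's native find-in-page, with no
// framework-driven pagination or server round-trips.
```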
[+] [-] blakehaswell|4 years ago|reply
For me this is a leap. I can't think of many examples of software which I use where new features have actually made me happy. Normally it's just change which forces me to learn something new while I'm in the middle of trying to accomplish something actually productive.
I think our industry over-estimates the value of "new features". But in my experience 90% of new features provide neutral or negative value. If—instead of a new feature—the software I used released a performance improvement, then that would actually help make me more productive.
[+] [-] kreeben|4 years ago|reply
[deleted]
[+] [-] austincheney|4 years ago|reply
I can remember back in the day at Travelocity they couldn't hire JavaScript developers to save their business. A $100 million business and the darling of the travel industry could not hire competent front-end developers at all. It's how I became a developer: involuntary reassignment from designer.
I can also remember Travelocity trying to solve this problem with GWT (Google Web Toolkit). This mistake of an application compiled Java to JavaScript for people who probably should not have been writing either. It's all good so long as the code never needs changes or updates, at which point the Java developers wanted JavaScript developers to fix their obfuscated code.
How hard could writing this rinky-dink language possibly be? Back in those days there were some actual challenges, but they were an inch deep and a mile wide. There were many technical areas to own, but only two separated the grossly incompetent from the barely employable:
* Cross browser compatibility (IE made this more challenging).
* The DOM. Then, just as now, tree models broke the souls of many software developers. I have never figured this out, because elementary schools are able to explain abstract concepts as tree models to children, but many adults cannot and will never understand the idea on even the most primitive level.
jQuery came and saved the day. No more walking tree models. Everything is now a chained method or a CSS selector through a very slow Sizzle engine. Now everybody and their dog could add events and text to a web page so long as they could read the jQuery API.
In hindsight none of this is surprising in the least. The surprising part is the compensation. Back in the day, when the work was hard, front-end developers were paid next to nothing. You were much better off being a Java developer, where you could barely have a pulse and earn 2-3x as much. As the problems got easier and the number of available candidates blossomed, the compensation surprisingly went up... slowly.
[+] [-] kristopolous|4 years ago|reply
A URL that is now less Universal than ever, pointing to an application more than a piece of content
Content that is not self-contained in a single document, so it is hard to archive for the future
Content that is neither served with the page nor given a universal reference, so it is hard to scrape and catalog
A profound inability to inspect websites, because they are now transpiled via frameworks and emit something obtuse that only machines, not humans, can read
Content that is now in a non-generalized method of access so instrumenting it to fit it for different purposes (the "Free" part of "information is meant to be free") is now fragile and dubious
And more controversial:
The careful separation of concerns between JavaScript (Controller), HTML (Model), and CSS (View) gets tossed aside and entangled in a single stream of hybrid languages where they are no longer separated. Instead people rely on less rigorously made tools to achieve the same ends with increased difficulty, less reliability, and less functionality - all cost and no benefit.
[+] [-] Swizec|4 years ago|reply
Links to individual pages would be useless to archive because they're not documents. Kinda like "Why would you bookmark an individual app window?"
You can scrape us with a browser, but why would you?
You can right click inspect or use react devtools, which is nice if you're trying to understand things. That part is way more accessible than an equivalent iOS, Android, or Desktop app would be.
Oh and we don't need to build an iOS, Android, and Desktop app! It's a single webapp that works in any client of your choosing.
Plus you get fresh updates and bugfixes with every page load, no delayed update cycle. This may be controversial if you dislike a UX change we make.
[+] [-] threatofrain|4 years ago|reply
IMO, the setting where open source communities are likely to grow is the right place to share code. This is also the setting where code is most likely to be understood, as understanding a codebase is very hard. Sharing code is not just a matter of dumping original source code to the public, and I think the use case around ordinary users examining source code is very, very niche.
And how should compiled-language communities like Rust handle the issue of sharing code? Should they distribute the original source code with the expectation that their users shall compile on their end? Is that the right way to share code, or is that just dumping code to the public?
[+] [-] unknown|4 years ago|reply
[deleted]
[+] [-] adriangrigore|4 years ago|reply
I built https://mkws.sh as a solution to at least static site generation. It can be enhanced with a Makefile but for now it's simple enough. I don't believe static site generation is a JavaScript job, and to be honest, I don't believe JavaScript belongs on the server side.
[+] [-] brundolf|4 years ago|reply
1) Client-side-rendered apps that haven't embraced hydration of pre-rendered HTML suffer for it. Not because the JS is necessarily too large, but because it blocks the content from being visible. This would be bad even with a small bundle. Luckily this is an easy problem to solve today, but not every site has done so.
2) Many public-facing sites are afflicted with marketing scripts. I've seen sites with (I kid you not) several times the amount of marketing JS (by weight) as they have actual functionality-giving JS. The actual devs often don't even have control over these scripts (thanks to Google Tag Manager), and even if they do they probably don't have a say in what can be removed.
3) Complex sites/web-based tools are a separate beast. You're not optimizing for first-visit, you're optimizing for responsive interactions. This has even less to do with bundle size, and often not even much to do with JS itself. In my experience the limiting factor on these sites is usually reflow (DOM/style changes).
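The hydration mentioned in (1) can be sketched framework-free: the server ships visible HTML immediately, and the client script only attaches behavior instead of re-rendering the content from data. The function names and data attributes below are illustrative, not any particular framework's API:

```javascript
// Runs on the server: produce ready-to-display HTML so the content is
// visible before any client JS loads or executes.
function renderCounter(count) {
  return `<button data-counter data-count="${count}">Clicked ${count}</button>`;
}

// Runs in the browser: find the pre-rendered node and wire up behavior.
// No re-render of the markup happens here — that is the point of
// hydration versus pure client-side rendering.
function hydrateCounter(root) {
  const btn = root.querySelector("[data-counter]");
  btn.addEventListener("click", () => {
    const next = Number(btn.dataset.count) + 1;
    btn.dataset.count = String(next);
    btn.textContent = `Clicked ${next}`;
  });
}
```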
[+] [-] mrozbarry|4 years ago|reply
- JS development has a low barrier to entry, so there is a wide/diverse range of developer skills and knowledge
- Most (webpack) tutorials don't cover build optimization, like tree-shaking, especially the entry-level ones, which are normally what people find when they want to "just get the job done"
- There are too many options for frameworks for people to make _good_ decisions, so they often fallback to _popular_ decisions
This isn't even a recent problem. I have a few clients that wanted to use old-school template packages, like Metronic, and that is a massive, bloated build, yet it's basically jQuery+Bootstrap with a handful of libraries.
The biggest thing is, as professional web developers, we have a responsibility to the end user to make a web page load fast and be usable quickly, and I think we often overlook that. I don't care if your framework is a no-build thing or uses webpack/rollup/etc.; what really matters is that a user shouldn't have to download 50MB of JavaScript for a single page, especially if the end result of that page is considerably less than 50MB worth of text, styles, and images.
The flip side is, most clients just want their project done quickly so they can make money. There's nothing wrong with that, but it can't be at the expense of their users, and developers should be able to give good consultation and advice to their clients about the trade-offs of libraries/frameworks/build tools/etc. and the impact they will have on the end-users.
Personally, I like building without extra build tools. `script type="module"` is great, and with smaller frameworks (my go-to is hyperapp, but there are several others) it's often painless to grab a local ES6 copy, toss it in your public web directory, import it, and go to town.
[+] [-] noahtallen|4 years ago|reply
I think this is a very subjective take. There is also a strong argument that the web is much better today because of modern frameworks and technologies. The web has shifted from being a place for basic content and documents to a full-blown application platform. 10-15 years ago, if a certain application was only built for (say) macOS, well, tough luck! You won’t be able to use it. Today, many popular applications can be used in the browser on any device and any operating system. (Examples like full office suites including MS office and Google docs, or photo editing apps, or music streaming platforms, or video conferencing.) That’s a pretty incredible advantage and improvement to the general state of software, and it’s a very strong improvement for users who no longer need to worry about incompatible software or installation processes.
I would argue that this advancement would not have happened without frameworks. Or at the very least, it brought about frameworks. Since the web platform itself is not designed for applications, any large application will probably end up writing some sort of abstraction on DOM rendering and state management, just to make it halfway maintainable. So if we want web applications, which are very beneficial in many circumstances, frameworks and tooling are a part of that.
Of course bundle size is still a problem to worry about. It’s a problem inherent to the benefit of the platform. Typical software can make the bundle as big as they want, and you download and install it once. The web platform forces you to care about the size in a way that just isn’t necessary for other platforms.
“Modern JS” is really not what is making the web worse. What’s making the web worse is ads, tracking, platform-lock-in, big monopolies, censorship, spam, poor content safety practices, etc.
Like, of course you don’t need a big framework for your blog or basic content site. It’s all about choosing the tool for the job. Half the time, the blogger probably just wanted to play around with a new technology. It really doesn’t matter that much for small little projects that are just people playing around.
Bundle size is important when you have a big enough user base and really have a need to optimize for poor internet connection and slower devices — but it’s just bundle size. You’d get even bigger bundle sizes without using modern tools like webpack. The bundle size comes from the larger types of apps we’re building now that we didn’t used to build, not from web frameworks specifically. IMO, that makes bundle sizes an important, but relatively small aspect of “problems related to the web.”
[+] [-] superkuh|4 years ago|reply
There's an easy solution though. Just don't use JavaScript to make your personal websites. The corporate web can wallow in its own filth and decay for all I care. If the humans on the web make websites without JS, the web can be, and is, still great.
[+] [-] bobthepanda|4 years ago|reply
* JS's history is all about "quick and dirty". jQuery in older times was meant to paper over the quick and dirty and give developers more convenience, and old JS without jQuery was pretty awful
* we are asking JS to do things it was never meant to do; in particular, the desire for "app-like" experiences and SPAs.
* JS is very backward compatible, which means old webpages can be archived, but also means that all the warts are still around.
* JS interpretation depends on browsers, which means if you want to use some modern features you will have to use some kind of transpiling, polyfill, and/or framework to get consistent behavior on all of them (jQuery in older times, Babel today), and the differences between browsers mean that these adapters have to be mind-bogglingly complex a lot of times
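The polyfill approach in the last bullet, sketched by hand for one missing feature: implement it in plain JS, then install it only where the native version is absent. `Array.prototype.flat` shipped in ES2019, so older engines lack it; the `flatten` helper name is illustrative:

```javascript
// Plain-JS implementation of the missing feature.
function flatten(arr, depth = 1) {
  if (depth < 1) return arr.slice();
  return arr.reduce(
    (acc, v) => acc.concat(Array.isArray(v) ? flatten(v, depth - 1) : v),
    []
  );
}

// Install only where needed — this feature check is what makes the
// polyfill safe to ship to every browser, old and new alike.
if (!Array.prototype.flat) {
  Array.prototype.flat = function (depth = 1) {
    return flatten(this, depth);
  };
}
```

Tools like Babel automate exactly this pattern at scale, which is why their output (and the adapters the comment mentions) ends up so complex.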
[+] [-] vjancik|4 years ago|reply
Frameworks being large and taking a long time to parse isn't the fault of JavaScript or NPM.