dingosity | 1 year ago

I wonder how much of Ferris' latency is down to net speed and the rendering speed / computational horsepower of their local machine. I've seen several evaluation regimes that try to objectively measure website "responsiveness" independent of network and rendering speed. Maybe the OP could use one of them to show how different sites are using techniques that unduly hurt their perceived performance.

Think about what happens if you're on a site that loads hundreds of images and has keep-alive explicitly turned off, so every image needs its own connection. Or if you're doing a TLS handshake across a high-latency network (TLS, for all its benefits, requires at least one round trip to set up the secure transport before any "content" is sent). Or you have a weird / inefficient dynamic loading process. Each of these adds perceived latency, but each has a different solution.
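
As a rough illustration of that handshake cost, here's a small Python sketch (assuming Python 3, with example.com as a stand-in host) that times a bare TCP connect against a TCP connect plus TLS handshake to the same server; the difference is roughly the extra round trip(s) the handshake adds, and it grows with distance to the host.

    import socket
    import ssl
    import time

    HOST = "example.com"  # placeholder host; substitute any HTTPS site

    def tcp_connect_time(host, port=443):
        # Time to open a bare TCP connection (roughly one round trip).
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=5):
            pass
        return time.monotonic() - start

    def tls_handshake_time(host, port=443):
        # Time to open TCP and then complete the TLS handshake on top of it.
        ctx = ssl.create_default_context()
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                pass
        return time.monotonic() - start

    print(f"TCP connect:         {tcp_connect_time(HOST) * 1000:6.1f} ms")
    print(f"TCP + TLS handshake: {tls_handshake_time(HOST) * 1000:6.1f} ms")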

I tend to agree with the OP; sites I use on a regular basis certainly seem laggy. But we may need a more objective evaluation framework than "ugh, the web is slow." And who knows, maybe most of the problem could be solved by getting the OP a better ISP and a faster machine.

toast0 | 1 year ago

This rant doesn't have any specifics, but... I've got 1G fiber again, and tons of web pages are still slow as heck.

A 1-2 second load time for most users is not hard to hit if you care, as long as most of your users aren't on 2G across the world from your hosting, at least for the pages people are likely to enter the site on.

The rant points to PageSpeed, which is a good start. If you serve your HTML in 200ms or less (measured on your server), have a reasonable implementation of TLS 1.2 or 1.3, and address the easy fixes PageSpeed flags, you'll probably have a faster-than-average site.
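
If you want a quick client-side number to compare against that 200ms target, a minimal sketch (assuming Python 3's standard library, with example.com and "/" as placeholders) that measures time to first byte might look like this; note it includes network time on top of whatever your server spends.

    import http.client
    import time

    HOST = "example.com"  # placeholder host
    PATH = "/"            # placeholder path

    conn = http.client.HTTPSConnection(HOST, timeout=10)
    start = time.monotonic()
    conn.request("GET", PATH)
    resp = conn.getresponse()          # blocks until status line + headers arrive
    ttfb_ms = (time.monotonic() - start) * 1000
    resp.read()                        # drain the body
    conn.close()

    print(f"Status {resp.status}, time to first byte: {ttfb_ms:.0f} ms")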

OhMeadhbh | 1 year ago

Sure, but think about the past thirty years of web browser development. Every time software developers make a faster browser with cool new features, content developers make content that uses all that new capability. It's sort of the content equivalent of Wirth's Law (software is getting slower more rapidly than hardware is becoming faster).

Developers almost always have reasonably beefy hardware setups (because the software they use requires plenty of memory and compute). Does the OP's observation imply there's a wider range of hardware out there than developers assume? Maybe the people constructing the pages they're complaining about assume everyone is on a kick-ass machine with the best GPU money can buy and a low-latency / high-bandwidth network. Maybe it's just that too many web developers don't consider consumers in more mundane circumstances.

Also... I use Lynx and EWW a lot. The web seems pretty zippy when you're ignoring the images and JavaScript. But yeah, that's not a general solution; too many sites require JavaScript to function.

pdimitar | 1 year ago

It's not JS that's the problem per se; it's which JS is being loaded, and very often that's a metric ton of marketing crap, i.e. stuff for retargeting, tracking, and many other modern horrors.
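
A quick-and-dirty way to see that is to list which hosts a page pulls its script tags from. Here's a rough sketch (assuming Python 3's standard library only, with example.com as a placeholder, and a crude regex that misses dynamically injected scripts):

    import re
    import urllib.request
    from collections import Counter
    from urllib.parse import urlparse

    URL = "https://example.com/"  # placeholder page to audit

    html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", "replace")

    # Grab the src attribute of every <script> tag in the served HTML.
    srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)', html, re.IGNORECASE)

    # Count scripts per host; relative URLs count as same-origin.
    hosts = Counter(urlparse(src).netloc or "(same origin)" for src in srcs)
    for host, count in hosts.most_common():
        print(f"{count:3d}  {host}")

On a lot of commercial sites, the same-origin entry is a small minority next to the tag managers and tracking domains.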