top | item 13601451

The web sucks if you have a slow connection

1269 points | philbo | 9 years ago | danluu.com | reply

598 comments

[+] ikeboy|9 years ago|reply
>When I was at Google, someone told me a story about a time that “they” completed a big optimization push only to find that measured page load times increased. When they dug into the data, they found that the reason load times had increased was that they got a lot more traffic from Africa after doing the optimizations. The team’s product went from being unusable for people with slow connections to usable, which caused so many users with slow connections to start using the product that load times actually increased.
[+] samuell|9 years ago|reply
I've tried using Google products from Africa (Ethiopia ... last time this January), and generally, it is outright unusable. JS-heavy apps like GMail never load properly at all.

This is while the connection itself is not THAT bad. I usually use a 3G/4G mobile connection, and it generally works excellently, with pretty quick load times, for everything other than JavaScript-heavy web apps.

I have a hard time understanding why this issue is not paid more attention. Ethiopia alone has some 99 million inhabitants, with smart phone usage growing by the hour. Some sources say "the country could have some 103 million mobile subscribers by 2020, as well as 56 million internet subscribers" [1].

[1] https://www.budde.com.au/Research/Ethiopia-Telecoms-Mobile-a...

[+] Namrog84|9 years ago|reply
This is a perfect example of why "average" metrics for such values aren't that great and are often overused as vanity metrics.

A chart showing how many users fall into each load-time bucket would be far more useful: one where you could easily change the bucket size from 0.1 ms to 1 second, so this kind of digging wouldn't even be a second thought.
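To make that concrete, here is a minimal sketch of such bucketing (the `bucketize` helper is hypothetical; `loadTimesMs` stands in for real telemetry):

```javascript
// Count users per load-time bucket; bucketMs is the adjustable bucket size.
// A histogram like this surfaces the long tail that an average hides.
function bucketize(loadTimesMs, bucketMs) {
  const buckets = new Map();
  for (const t of loadTimesMs) {
    const lo = Math.floor(t / bucketMs) * bucketMs; // bucket's lower bound
    buckets.set(lo, (buckets.get(lo) || 0) + 1);
  }
  return buckets;
}
```

Re-bucketing from 0.1 ms to 1 s is then just a change of the `bucketMs` argument over the same raw data.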

[+] dba7dba|9 years ago|reply
Funny anecdote about the freeway system of southern California.

When they were initially planning the system in the 1930s and '40s, they intended it to serve the region for the next 100 years. So they built oversized roads (like a 10-lane freeway, with no traffic lights, running straight THROUGH the center of a major city).

When the system proved so car-friendly, more and more people moved in and bought cars. Within a short period of time (much shorter than 100 years), the system was completely jammed.

Always look for unintended consequences...

[+] markatkinson|9 years ago|reply
As someone who lives in Africa, hoorah! More of this please. For me the best feeling is visiting a web page that is almost entirely text based. It loads in a few seconds, which is quite rare these days.
[+] inian|9 years ago|reply
This was in relation to YouTube's Project Feather: the YouTube site didn't even load for those users before, and once it did, they started watching more videos even though the page took more than 20 seconds to load!
[+] sogen|9 years ago|reply
Read somewhere it was YouTube
[+] gabemart|9 years ago|reply
Something I have had at the back of my mind for a long time: in 2017, what's the correct way to present optional resources that will improve the experience of users on fast/uncapped connections, but that user agents on slow/capped connections can safely ignore? Like hi-res hero images, or video backgrounds, etc.

Every time a similar question is posed on HN, someone says "If the assets aren't needed, don't serve them in the first place", but this is i) unrealistic, and ii) ignores the fact that while the typical HN user may like sparsely designed, text-orientated pages with few images, this is not at all true of users in different demographics. And in those demos, it's often not acceptable to degrade the experience of users on fast connections to accommodate users on slow connections.

So -- if I write a web page, and I want to include a large asset, but I want to indicate to user agents on slow/capped connections that they don't _need_ to download it, what approach should I take?

[+] curun1r|9 years ago|reply
This seems like the thing that we'd want cooperation with the browser vendors rather than everyone hacking together some JS to make it happen. If browsers could expose the available bandwidth as a media query, it would be trivial to have different resources for different connections.

This would also handle the situation where the available bandwidth isn't indicative of whether the user wants the high-bandwidth experience. For example, if you're on a non-unlimited mobile plan, it doesn't take that long to load a 10mb image over 4G, but those 10mb chunks add up to overage charges pretty quickly, so the user may want to set his browser to report a lower bandwidth amount.

[+] neuland|9 years ago|reply
One hacky way to do it would be to load it via JavaScript. For example, see this Stack Overflow answer [0]. Obviously not a great solution, but it works if you're dying for something.

I bet people w/ slow connections are much more likely to disable javascript, though.

    let loadTime = window.performance.timing.domContentLoadedEventEnd
        - window.performance.timing.navigationStart;
    if (loadTime > someArbitraryNumber) {
        // Disable loading heavy things
    }
[0] http://stackoverflow.com/questions/14341156/calculating-page...
[+] tommorris|9 years ago|reply
There is a proposed API for that.

https://wicg.github.io/netinfo/

And like most such APIs, it has been kicked around for a long time and has only been adopted by Chromium on Android, ChromeOS, and iOS. It'd be great if it were more widely adopted...
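As a sketch of how that API could be used where it is implemented (the `shouldLoadHeavyAssets` helper is hypothetical; `navigator.connection` with `effectiveType` and `saveData` comes from the draft spec):

```javascript
// Gate optional heavy assets on the draft Network Information API.
// When the API is absent, fall back to loading everything as before.
function shouldLoadHeavyAssets(connection) {
  if (!connection) return true;          // API unavailable: assume fast
  if (connection.saveData) return false; // user opted into reduced data
  // effectiveType is one of 'slow-2g', '2g', '3g', '4g'
  return connection.effectiveType === '4g';
}

// In a browser this would be called as:
//   shouldLoadHeavyAssets(navigator.connection)
```

The `saveData` check also covers the capped-but-fast case mentioned above, where raw bandwidth isn't what the user actually wants to spend.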

[+] gkya|9 years ago|reply
> Every time a similar question is posed on HN, someone says "If the assets aren't needed, don't serve them in the first place", but this is i) unrealistic, and ii) ignores the fact that while the typical HN user may like sparsely designed, text-orientated pages with few images, this is not at all true of users in different demographics. And in those demos, it's often not acceptable to degrade the experience of users on fast connections to accommodate users on slow connections.

This is prejudice. People use Craigslist, for example. If the thing is useful, people will use it. If there's a product being sold, and if it's useful to the potential clientele, they'll buy it. Without regard to the UI.

In the past ten years while my connection speed increased, the speed at which I can browse decreased. As my bandwidth increased, all the major websites madly inflated.

> So -- if I write a web page, and I want to include a large asset, but I want to indicate to user agents on slow/capped connections that they don't _need_ to download it, what approach should I take?

Put a link to it with (optionally) a thumbnail.

[+] gnud|9 years ago|reply
Random idea: Get the current time in a JS block in the head, before you load any CSS and JS, and compare it to the time when the dom ready event fires. If there's no real difference, load hi-res backgrounds and so on. If there is a real time difference, don't.
[+] avhon1|9 years ago|reply
In the ideal future, FLIF [0] would become a standard, universally supported image and animation format. Almost any subset of a FLIF file is a valid, lower-resolution FLIF file. This would allow the browser - or the user - to determine how much data could be downloaded, and to display the best-quality images possible with that data. If more bandwidth or time became available, more of the image could be downloaded. The server would only have one asset per image. Nice and simple.

[0] http://flif.info/

[+] Someone1234|9 years ago|reply
I found this out the hard way.

T-Mobile used to offer 2G internet speeds internationally in 100+ countries, included in Simple Choice subscriptions. 2G is limited to 50 kbit/s, which is slower than a 56K modem.

While this is absolutely fine for background processes (e.g. notifications) and even checking your email, most websites never loaded at these speeds. Resources would time out, and the adverts alone could easily exceed a few megabytes. A few websites even blocked me for my "ad blocker" because the adverts didn't load quickly enough.

It makes me feel for people in places like rural India that are still at 2G or similar speeds. It's great for some things, but no longer really usable for general-purpose web browsing.

PS - T-Mobile now offers 3G speeds internationally; this was just the freebie at the time.

[+] Jakob|9 years ago|reply
Disable JavaScript. You’ll be surprised at how most of the web still works and is much faster. Longer battery life on mobile, too.
[+] freehunter|9 years ago|reply
Yeah when I used to run over my T-Mobile data allotment (in the US) and they dropped me to whatever speed they throttle you to when your "high speed" data is gone, Google Maps wouldn't load, Facebook wouldn't load, YouTube wouldn't load. I remember using all of those things back in the days when a 3G connection was a luxury, back when Windows was the best smartphone platform. What happened between then and now that suddenly nothing works?
[+] CoolGuySteve|9 years ago|reply
I was in rural China with an EDGE connection on Google Fi last month.

Hacker News was pretty much the only site I visit that could reliably load quickly. m.facebook.com had a slight wait but was still bearable. I had to leave my phone for 10 or 15 minutes to get Google News.

WeChat and email worked well.

Everything else was horrible, especially ad networks that would ping pong several requests or load large images.

Opera has a compression proxy mode that helped a bit when it worked but it was still painful.

For search results, Stack Overflow, and YouTube, it was easier to ssh into an AWS node and use elinks/youtube-dl.

Using SSH as a SOCKS proxy with compression was insanely slow due to something with the Great Firewall.

[+] _delirium|9 years ago|reply
> PS - T-Mobile now offers 3G speeds internationally; this was just the freebie at the time.

I don't think this has changed, at least not in general. The included roaming package is still free international 2G roaming everywhere except Mexico and Canada (which get free 4G), with "high-speed data pass" upgrades available for a daily or weekly fee if you want faster. They did have a promotion for the 2nd half of 2016 (initially for the summer, then extended through the end of the year), where international 3G, and in a few areas 4G/LTE, was free without buying the upgrade passes for most of Europe and South America [1]. But that's now over, and I believe it's back to free 2G internationally now.

[1] https://newsroom.t-mobile.com/news-and-blogs/t-mobiles-endle...

[+] Figs|9 years ago|reply
I use T-Mobile as my ISP because the only landline choice in my apartment building is AT&T and I absolutely refuse to do business with them. I regularly hit the monthly bandwidth cap on my plan and get booted down to 2G.

I live in California -- this is not just something people internationally are dealing with.

Annoyingly, T-Mobile's own website doesn't work properly when you're throttled to 2G speed. Found that out the hard way when I ran out of minutes on Thanksgiving and couldn't talk to my family, and couldn't load their website to add more minutes.

[+] r00fus|9 years ago|reply
I mainly used it for things like slack, skype and emails, and mapping.

With iOS9+ content blockers and things like Google AMP, I think the web is a lot more usable.

Apps tend to be less bloated in terms of bandwidth as well, since they usually don't load as many assets on request.

[+] megablast|9 years ago|reply
You have just discovered why apps are so good, they can download content in small amounts.
[+] ClassyJacket|9 years ago|reply
My 35Mbit cable got shaped down to 0.25 Mbit/s yesterday because we went over our download limit. It was like having no connection. I just gave up using it.

I hate the all-or-nothing approach to shaping. At least give me 5Mbit or something!

[+] stuckagain|9 years ago|reply
Having used both, I'll take the 2G mobile over the 56k modem every time.
[+] archagon|9 years ago|reply
FYI, I had the same connection and I'm pretty sure T-Mobile simulates 2G by switching 3G on and off to get the correct speed on average. Breaks a lot of stuff. Almost unusable!
[+] 23443463453|9 years ago|reply
It's what makes me wish designers and developers would work with artificial constraints. Sure, it's easy to design and develop without really thinking of bandwidth constraints, but reality is you are and will always be a better developer and designer by setting artificial bandwidth constraints in your mind and choices.

Seeking out or imagining bandwidth constraints can push you to find better solutions and thereby make your services better. The West, and the tech centers in particular, are rather blinded by gluttonous bandwidth use that keeps eating up greater and greater amounts of data with only marginal improvements in outcome or user experience.

[+] geforce|9 years ago|reply
Sad thing is that most of the web sucks on rather fast connections too. Pages weigh almost 5 MB, making dozens of requests for libraries and ads. Ads update in the background, consuming ever more data.

I don't notice it much on my PC, since I've got a FTTH connection, but on LTE and 3G, it's very noticeable. Enough that I avoid certain websites. And that's nowhere near slow by his standards.

I do agree that everyone would benefit from slimmer websites.

[+] Terr_|9 years ago|reply
I have Javascript off-by-default, and about 80% of the time it simply makes everything better.

Oh, sure, a few sites need JS (and get whitelisted) and some just have minor layout quirks... But I can actually scroll down and read the text of a news article rather than suffering through waiting times and input-latency as Javascript churns.

[+] coldpie|9 years ago|reply
Firefox on Android supports uBlock Origin.
[+] lucb1e|9 years ago|reply
Figure out a few interesting/useful websites that work fine without Javascript. Try browsing those for half an hour, then switch Javascript back on and browse your usual websites. You'll probably notice it's so much slower, even with FTTH, because of network load but also CPU (and marginally RAM, though modern browsers are mostly to blame for that).
[+] ploxiln|9 years ago|reply
I notice it in my browser's memory usage.
[+] etatoby|9 years ago|reply
I design and write my company's framework, that other devs use to write websites and webapps.

I base my work on existing technologies (lately Laravel, which means Symfony, Gulp, and hundreds of other great libraries) but I always strive to:

1. Reduce the number of requests per page, ideally down to 1 combined and compressed CSS, 1 JS that contains all dependencies, 1 custom font with all the icons. Everything except HTML and AJAX should be cacheable forever and use versioned file naming.

2. Make the JS as optional as possible. I will go out of my way to make interface elements work with CSS only (including the button to slide the mobile menu, various kinds of tooltips, form widget styling, and so on.) Whenever something needs JS to work (such as picture cropping or JS popups) I'll make sure the website is usable and pretty, maybe with reduced functionality or a higher number of page loads, even if the JS fails to load or is turned off. Also, the single JS file should be loaded at the end of the body.

2b. As a corollary, the website should be usable and look good both when JS is turned off, and when it's turned on but still being loaded. This can be achieved with careful use of inline styles, short inline scripts, noscript tags, and so on.

3. Make the CSS dependency somewhat optional too. As a basic rule, the site should work in w3m, as pointed out above. Sections of HTML that make sense only when positioned by CSS should be placed at the end of the body.

I consider all of this common sense, but unfortunately not all devs seem to have the knowledge, skill, and/or time allowance to care about these things, because admittedly they only matter for < 1% of most websites' viewers.

[+] SwellJoe|9 years ago|reply
I travel fulltime and my primary internet is 4G LTE. But, even though I spend $250 per month on data, I still run out, and end up throttled to 128kbps for the last couple days of the data cycle. The internet is pretty much unusable at that rate. I can leave my email downloading in Thunderbird for a couple of hours and that's usable (gmail, however is not very usable), and I can read Hacker News (but not the articles linked, in most cases). Reddit kinda works at those speeds. But nearly everything else on the web is too slow to even bother with. When I hit that rate cap, I usually consider it a forced break and take a walk, cook something elaborate, and watch a movie (on DVD) or play a game.

So, yeah, the internet has gotten really fat. A lot of it seems gratuitous...but, I'm guilty of it, too. If I need graphs or something, I reach for whatever library does everything I need and drop it in. Likewise, I start with a framework like Bootstrap, and some JavaScript stuff, and by the time all is said and done, I'm pulling a couple MB down just to draw the page. Even as browsers bring more stuff into core (making things we used to need libs for unnecessary) folks keep pushing forward and we keep throwing more libraries at the problem. And, well, that's probably necessary growing pains.

Maybe someday the bandwidth will catch up with the apps. I do wish more people building the web tested at slower speeds, though. Could probably save users on mobile networks a lot of time, even if we accept that dial-up just can't meaningfully participate in the modern web.

[+] nommm-nommm|9 years ago|reply
What really has baffled me lately is Chase's new website. They did a redesign, maybe 6 months ago, to make it "more modern" or something, I guess.

Now the thing just loads and loads and loads and loads. And all I want to do is either view my statement/transactions or pay my bill! Or sometimes update my address or use rewards points. That's not complicated stuff. I open it up in a background tab and do other stuff in-between clicks to avoid excessively staring at a loading screen.

I just tried it out, going to chase.com with an empty cache took a full 16 seconds to load on my work computer and issued 96 requests to load 11MB. Why!?

I then login. The next page (account overview) takes a full 32 seconds to load. Yep, half a minute to see my recent transactions and account balances. And I have two credit cards with zero recent transactions.

I am just baffled as to who signed off on it!! "This takes 30 seconds to load on a high speed connection, looks good, ship it."

[+] iLoch|9 years ago|reply
> Why shouldn’t the web work with dialup or a dialup-like connection?

Because we have the capability to work beyond that capacity now in most cases. That's like asking "why shouldn't we allow horses on our highways?"

> Pretty much everything I consume online is plain text, even if it happens to be styled with images and fancy javascript.

No doubt, pretty much everyone who has worked on web apps for long enough understands that it's total madness. The cost, however, of supporting people so far behind that you can only serve them text is quite frankly unmanageable. The web has grown dramatically over the past 20 years, both in terms of physical scale and supported media types.

The web is becoming a platform delivery service for complex applications. Some people like to think of the web as just hyper text, and everything on it should be human parse-able. For me, as someone who has come late to the game, it has never seemed that way. The web is where I go to do things: work, learn, consume, watch, play. It's a tool that allows me to access the interfaces I use in my daily life. I think there's a ton of value in this, perhaps more than as a platform for simple reading news and blogs.

I look forward to WebAssembly and other advancements that allow us to treat the web as we once treated desktop environments, at the expense of human readability. It doesn't mean we need to abandon older + simpler protocols, because they too serve a purpose. But to stop technological advancement in order to appease the lowest common denominator seems silly to me.

[+] diggan|9 years ago|reply
Something that sticks out looking at the table: how can some sites simply FAIL to load? There is something inherently wrong with our web today if, when my internet is very slow and _could_ load a page in 80 seconds were I to just leave it, the server has its timeout configured to 60 seconds, so I can never load the page?!

The assumption here is that both ends of the connection are on Earth. With these hard timeout limits, how will anything even remotely work when we are an interplanetary species, or even just in orbit around Earth?

[+] whiddershins|9 years ago|reply
After spending a month in Mexico, including regions with spotty/inconsistent service from one minute to the next, I think the problem goes deeper.

Browsers are IMO terrible at mitigating intermittent and very slow connections. Nothing I browse seems to be effectively cached other than Hacker News. Browsers just give up when a connection disappears, rather than holding on to what they have and trying again in a little while.

The only thing I used which kept working was Dropbox. Dropbox never gives up; it just keeps trying to sync, and eventually it will succeed if there is any possibility of doing so.

I understand the assumptions of the web are different than an app like Dropbox, but I think it might be a good idea to reexamine those assumptions.

[+] 20years|9 years ago|reply
Most of the web really sucks on fast internet connections too. Thanks to so many web developers thinking every dang thing needs to be a single page app using a heavy JavaScript framework. Add animation, badly optimized images and of course ads and it becomes really unbearable.

We keep repeating our same mistakes but just in a different way.

[+] E6300|9 years ago|reply
> The main table in this post is almost 50kB of HTML

Just for fun, I just took a screenshot of that table and made a PNG with indexed colors: 21243 bytes.

[+] Filligree|9 years ago|reply
Not related to the contents of the article, but please add a max-width styling to your paragraphs. 40em or so is good.
[+] fenwick67|9 years ago|reply
By far the worst site I regularly use, from a page loading perspective, is my local newspaper.

It takes about 10 seconds before it loads to a usable state on a T1 connection.

If I pop open an inspector, requests go on for about 30 seconds before they die down. It's about 8MB.

http://www.telegraphherald.com/

[+] tetha|9 years ago|reply
I might need a reality check here because this is feeling weird.

I'm currently building a web-based application to store JVM threaddumps. This includes a JS-based frontend to efficiently sort and filter sets of JVM threads (for example based on thread names, or classes included in thread traces), and the ability to visualize locking structures with d3, so you can see that a specific class is a bottleneck because it has many locks and many threads waiting for it.

I'm doing that in a Ruby/Vue application because those choices make the app easy. You can upload a threaddump via curl and share it with everyone via links. You can share sorted and filtered thread sets, and you can share visualizations with a mostly readable link. This is good because it's easy to - automatically - collect and upload thread dumps, and it's easy to collaborate on a problematic locking situation.

So, I'd call that a fairly heavy web-based application. I'm relying on JS, because JS makes my user experience better. JS can fetch a threaddump, cache it in the browser, and execute filters based on the cached data pretty much as fast as a native application would. Except you can share and link it easily, so it's better than visualvm or TDA.

But with all that heavyweight, fast-moving web bollocks... isn't it natural to think about web latency? To me it's the only sensible thing to webpack/gulp-concat/whatever my entire app so all that heavy JS is one big GET. It's the only sensible thing to fetch all information about a threaddump in one GET, just to cache it and have it available. It's the only right thing to do, or else network latency eats you alive.

Am I that estranged by now, having worked on a low-latency, high-throughput application? To avoid confusion, the threaddump storage is neither low-latency nor high-throughput. I'm talking Java with 100k+ events/s and < 1 ms in-server latency there.

[+] Tade0|9 years ago|reply
Kudos to the author for making the post readable using a 32kbps connection.

My apartment does not have a landline, not to mention any other form of wired communication, so my internet connection is relegated to a Wi-Fi router that's separated by two walls (friendly neighbour) and a GSM modem that, after using up the paltry 14 GB of transfer it provides, falls back to a 32 kbps connection.

Things that work in these circumstances:

- Mobile Facebook (Can't say I'm not surprised here).

- Google Hangouts.

- HN (obviously).

- A few other videoconferencing solutions (naturally in audio-only mode).

Things that don't work, or barely work:

- Gmail.

- Slack (OK, this one sort of works, but is not consistent).

- Most Android apps.

- Github.

EDIT: added newlines.

[+] Entangled|9 years ago|reply
Can't browsers provide a service like

txt://example.com

that shows web content in plain text, no images, no javascript, nothing, something like readability but directly without loading the whole page first?

It would also be good for mobile connections.

* Wikipedia should be the first site to offer that txt: protocol, Google second.

* Btw, hacker news is the perfect example of a text only site.
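A crude sketch of that idea (a naive tag-stripper, nothing like a real readability implementation; the regexes are illustrative only and would break on pathological markup):

```javascript
// Naive illustration of a "text-only" view: drop scripts, styles, and tags,
// then collapse whitespace. A real readability engine does far more
// (content scoring, boilerplate removal, etc.).
function textOnly(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop script bodies
    .replace(/<style[\s\S]*?<\/style>/gi, '')   // drop style bodies
    .replace(/<[^>]+>/g, ' ')                   // drop remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}
```

Of course, a real txt: scheme would want the server to do this before sending anything, which is the whole point: the savings come from never shipping the scripts and images in the first place.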

[+] franciscop|9 years ago|reply
I totally agree. I used to have a really bad mobile connection up until a few years ago (in Spain), and still, when I use up all my mobile data it reverts to 2G.

So I know the pain and decided I wouldn't do the same to my users as a web developer. I created these projects from that:

- Picnic CSS: http://picnicss.com/

- Umbrella JS (right now website in maintenance): http://github.com/franciscop/umbrella

Also I wrote an article on the topic:

- https://medium.com/@fpresencia/understanding-gzip-size-836c7...

Finally, I also own the domain http://100kb.org/ and intended to do something with it, but then I moved out of the country, and after returning things got much better; now I have decent internet, so I lost interest. If you want to do anything with that domain, like a small website competition, just drop me a line and I'll give you access.