rcheu | 5 years ago
Anyway, this matches my expectations: people tend to be overly negative about the present and only remember the good parts of the past. The mobile web as a whole has gotten faster due to network speed and CPU improvements.
It is worth noting that pages are doing more after load now than they used to, though. This won't show up in onload or first meaningful paint, etc. So the first paint is fast, but if you try to scroll immediately afterwards you'll probably hit some jankiness while the rest of the page loads asynchronously (but only kind of asynchronously, since there's a single main thread).
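One common mitigation for that main-thread jank is to split post-load work into small chunks and yield back to the event loop between them, so scrolling and input handling get a turn. A minimal sketch, assuming a hypothetical `processInChunks` helper (not a browser API; all names here are illustrative):

```typescript
// Sketch: post-load work competes with scrolling on the one main thread.
// Splitting it into bounded chunks and yielding between them reduces jank.
function processInChunks<T>(
  items: T[],
  handle: (item: T) => void,
  chunkSize = 50,
  // In a browser you'd schedule with setTimeout or requestIdleCallback;
  // the scheduler is injectable here so the sketch stays testable.
  schedule: (cb: () => void) => void = (cb) => setTimeout(cb, 0),
): void {
  let i = 0;
  const step = (): void => {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) handle(items[i]); // do one bounded slice of work
    if (i < items.length) schedule(step);  // yield back to the event loop
  };
  step();
}
```

The scheduler parameter exists only so the sketch can be exercised synchronously; a real page would let the browser interleave the chunks with user input.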
Some other things that could cause the regression are that more people own a budget Android phone now than before. People may not realize how slow these phones are. The single core performance of the top budget phone, the Samsung A50, is comparable to an iPhone 6 which came out in 2015.
philwelch|5 years ago
The question is whether those pages are doing more for me, or whether they are doing more to me. When I load a page that would have been a normal hypertext document 10 years ago, instead I get a clown show filled with "we have cookies" pop-ups, tracking scripts, ads, ad-blocker-blockers, and more.
> Some other things that could cause the regression are that more people own a budget Android phone now than before. People may not realize how slow these phones are. The single core performance of the top budget phone, the Samsung A50, is comparable to an iPhone 6 which came out in 2015.
If I owned an iPhone 6 in 2015 and own a Samsung A50 today, and the web is slower and jankier today than it was five years ago, then isn't it fair to say that the web got slower?
Yhippa|5 years ago
The best part is that I'm paying for that privilege when I'm not on WiFi.
jakemal|5 years ago
Yes. But what do you suggest as an alternative? Refuse to take advantage of technological progress to cater to the small portion of the population who haven't updated their phones in half a decade?
konjin|5 years ago
Network speeds have increased, network latency has decreased, the hardware has gotten faster, and at best we're stuck in the same place we were in 2010.
>Page weight has increased over time, but so has bandwidth. Round-trip latency has also gone down.
>Downloading a file the size of the median mobile website would have taken 1.7s in 2013. If your connection hasn't improved since then downloading this much data would now take 4.4s. But with an average connection today it would only take 0.9s.
The problem today is that the average website sends a couple dozen to a couple hundred requests to complete a load. The average website 10 years ago sent a couple to a couple dozen requests for the same thing.
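To make the request-count point concrete, here is a rough back-of-the-envelope model: total load time approximated as serialized round trips plus raw transfer time. All figures below are illustrative assumptions, not measurements from the article:

```typescript
// Crude model: total load time ≈ serialized round trips + transfer time.
// Real browsers pipeline, cache, and prioritize, so treat this as a sketch.
function loadTimeSeconds(
  pageBytes: number,
  requests: number,
  rttSeconds: number,
  bandwidthBitsPerSec: number,
  parallelism = 6, // typical per-host connection limit in older browsers
): number {
  const roundTrips = Math.ceil(requests / parallelism) * rttSeconds;
  const transfer = (pageBytes * 8) / bandwidthBitsPerSec;
  return roundTrips + transfer;
}

// 2010-ish page (assumed): 500 KB, 20 requests, 100 ms RTT, 5 Mbps → ≈ 1.2 s
const before = loadTimeSeconds(500_000, 20, 0.1, 5_000_000);
// Modern page (assumed): 2 MB, 150 requests, 50 ms RTT, 25 Mbps → ≈ 1.9 s
const after = loadTimeSeconds(2_000_000, 150, 0.05, 25_000_000);
```

Even with five times the bandwidth and half the latency, the heavier, chattier page comes out slower in this model, because round trips rather than raw throughput dominate.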
So after 10 years of constantly improving technology, and after spending $5000 on phones to keep up with the latest CPUs, the performance is pretty much the same.
Imagine if you had to buy a new car every 5 years to drive at the speed limit. That is the situation we are in.
kristopolous|5 years ago
It's not about "the web"; it's about various schools of web building.
This is most noticeable when a property fundamentally changes its approach (Reddit), or when Twitter did it a while back and then (sensibly) retreated.
A better, more sensible approach than, say, graphing CPU clock speeds would be to fragment web development into these various schools, give them names, and then characterize them accordingly.
There's really only two ways to talk about this problem: one is hopelessly divisive and factional and the other is irrelevant and useless.
That sounds unpleasant? Correct! That's why it's still a problem and getting worse.
When the "make things better" axe falls on the fingers of the "mostly harmless" it's the passions of the axe wielder that get the focus and the blame. So instead we all slide into mediocrity together. It's the path of human institutions and the web isn't immune from the pattern.
FridgeSeal|5 years ago
Making something faster by throwing more hardware at it doesn't meaningfully count as making it faster, IMO; you can make the most inefficient piece of software "fast" by throwing the biggest CPU and network you can find at it.
The real issue is why should an otherwise capable CPU from 5 years ago struggle to render the average website today, when it really shouldn't be that hard. Scrolling through someone's marketing website _should_ be a painless experience on even a low-end budget phone.
ooobit2|5 years ago
I stopped programming around 8 years ago because I hate the current MVC model most software is created and maintained with. What got me interested in dipping back in recently was a video on branchless programming. I love the idea of unit testing at the machine-code level for efficiency, and then figuring out how to trick the compiler or runtime and the chipset into making quick, predictive outputs: reducing idling on branches, or taking 15 steps for something doable in as little as 4.
That feels like a completely opposing direction to take given the current priorities of engineers across almost all industries, even old-school ones like gaming.
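The branchless idea mentioned above can be sketched at a high level. This is a conceptual illustration only (the function names are made up, and the real gains appear at the machine-code level, where a compiler can emit a conditional move instead of a jump):

```typescript
// Two ways to pick the larger of two numbers.
// The first uses a data-dependent branch the CPU has to predict;
// the second selects the result with pure arithmetic.
function maxBranchy(a: number, b: number): number {
  if (a > b) return a; // branch: mispredictions stall the pipeline
  return b;
}

function maxBranchless(a: number, b: number): number {
  const m = Number(a > b);    // 1 if a > b, otherwise 0
  return a * m + b * (1 - m); // select without branching
}
```

In a language like C the same transformation often compiles down to a `cmov` or a SIMD select; here it only shows the shape of the technique.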
arvinsim|5 years ago
Because marketing/product/design decides to add bells and whistles. Optimization is also not a zero-cost effort. The business has to pay for it, and I would assume that most businesses don't think it is worth it.
uCantCauseUCant|5 years ago
[deleted]
ksec|5 years ago
That depends on what counts as a capable CPU. I would say even the iPhone didn't have a capable CPU 5 years ago. (That was the iPhone 6; the iPhone 6s hadn't been released yet.)
And it is even worse on Android, and the current state of things isn't much better. Hopefully ARM will catch up in the next 5 years, but that means it will take another 5 years to filter down to the market.
i.e., not looking good.
masswerk|5 years ago
There are actually cases where faster infrastructure has slowed down a system significantly. E.g., British railways (in various organisational forms over the years) ran rolling post office trains, which, starting in 1838, grabbed mail bags on the go, sorted the mail, and dropped it off again without any halts. This played quite a role in the evolution of fast delivery of national newspapers, up to 8 mail deliveries per day in urban centers, etc. By the 1960s the procedure had become too dangerous at the increased speed of trains (several firemen lost their heads in accidents involving the scaffolds used for handing over the mail bags), and the last Travelling Post Office ceased operations in 1971. Moral: by speeding up the network by a few miles per hour, mail delivery slowed down by a day.
Similarly, as mobile network speeds increased, expectations of what could be done with them rose faster than the actual speed of the infrastructure. Add high-res resources with previously unheard-of page weights and you've established a system of ever-increasing expectations and visions, which will always be bound to significantly outclass the real-life capabilities of the infrastructure. As long as we stick to this paradigm, increasing network speed will always result in a slower web, due to the Wirth's-law factor involved. I'm afraid this will be even more true for any further significant speed-ups, like those promised by a fully operational 5G network. (Also, visions and concepts apt to exploit and even challenge the capabilities of 5G will probably pose a new challenge to the hardware at the endpoints, which may eventually prove financially challenging for the average user, thereby introducing yet another significant gap and a respective drop in average real-world performance.)
knaq|5 years ago
That budget Android phone blows away the hardware that I was using. A quick search for the Samsung A50 tells me: "on Verizon's network in downtown Manhattan [...] average data speeds of 57.4Mbps down and 64.8Mbps up". That is 38 to 506 times faster. It has an absurdly fast 8-core CPU running at 2300 MHz. Ignoring the fact that MHz is a terrible benchmark, that is a factor of 138 faster. The RAM is bigger by a factor of 128 or 192. There isn't really any hard drive latency on the phone.
Yes, the web is slow.
The trouble is that browsers make no attempt to stop web sites from using infinite resources. The assumptions are that web sites will politely cooperate to share my computing resources, that of course I couldn't possibly want to actually use tabbed browsing to access lots of web sites, and that we all discard our hardware as electronic waste after just a few years.
noodlesoups|5 years ago
You can actually play hardware-accelerated Doom 3 in a browser today, no problem. No add-ons, nothing needed.
zwaps|5 years ago
In fact, despite new tech, it is a relatively recent phenomenon that I feel my phone being slowed down by a freaking website.
Furthermore, phones may be faster, but websites load slower and are janky due to unnecessary async loading.
So what’s the point of faster phones? The point is that webdev is terrible.
The article goes into all sorts of hoopla to claim that the web isn't slower. Thing is, I can still open those websites of yore on my phone right now! And to no one's surprise, the new web tech is indeed slower, and imo also worse in user experience. So yeah.
ssalka|5 years ago
This point is agreeable, though if you browse the web on the same phone for 5+ years (my dad still uses his iPhone 6S), you may notice a difference over time.
Also, it's not a good sign if consumers are forced to go along with planned obsolescence just to keep their Internet browsing experience from becoming increasingly slow. The principle of progressive enhancement means that websites built today should work well on phones made years ago.
codetrotter|5 years ago
Ideally (well, really ideally, companies themselves would provide APIs for accessing the content, but unfortunately that makes it difficult for them to make money, both directly through loss of ad revenue when clients don't show their ads and indirectly by making it easier to pirate stuff), we'd have a joint open effort to do this on a massive scale. For now I am doing it on a small scale on my own, writing tools for my own use only.
In addition to this I have also started work on retrieving the content that is walled off in apps that I use. For example, there are some magazines that I used to subscribe to for a while, and I’d rather be able to keep my access to the content indefinitely than to see it disappear whenever the publisher decides that the magazine has run its course and subsequently stops updating the app and then shuts down the servers that host the data.
On top of this, I have for a long period of time (years now) been using, for example, Facebook as little as possible; I mostly only use it for Messenger and for upcoming events. Meanwhile, Instagram (also owned by Facebook) is worth it for me to continue using much more actively for now. But I am also slowly working on something of my own to host content that I myself produce, with the intent of continuing to consume content shared on Instagram but cutting back on posting there and instead posting my stuff on my own server. It's not like any of my stuff gets much attention anyway, so for me it will not be a big difference in terms of engagement. Mostly, the way I use Instagram for the content I post myself is that I post pictures and videos I have made that I think are worth sharing, and then, when in conversation with friends and acquaintances, I sometimes pull up my phone to show them in person something I did or made recently. A self-hosted service could serve the same purpose.
As for the increasingly slow experience of browsing the web, I've come to realize that this might in fact contribute to what the parent to yours said about people on HN not reading the linked article. At least for myself, I find that I often don't click through to the linked articles, and I think the experience of slow-scrolling, megabytes-heavy pages contributes to this. I try, however, not to comment on the story itself unless I have read it first. Meanwhile, HN itself is lightweight and comfortable to be on. And often the comments will encourage one to click through to the linked story if it is worth reading, either by directly stating that it is worth reading or indirectly by quoting something good from the page or talking about some good data points or novel information from it. (Novel to me, I should note.)
etripe|5 years ago
What's distinctly lacking in that assessment is web applications getting more resource efficient, or more conservative with storage. I'd argue that hardware and CPU improvements are enabling bad tech stacks, like a friend might enable an alcoholic. Sure, you can minify, tree-shake, etc but with sufficient hardware, you don't strictly have to.
I also don't see TFA as an actual rebuttal of the hypothesis, since it focuses on the US. Only half the planet has an uplink at all, so you're gonna end up with skewed results if you focus only on the top end of the technology distribution. While a rural connection in the US might be just as bad as a connection in rural India, I'd wager the median and average connection speeds and latency are still way better in the US. The Internet is for everyone, not just Silicon Valley engineers on a MacBook Pro connected through fibre optics.
xondono|5 years ago
"Still, I don't think the mobile web – as experienced by users – has become slower overall."
To me it's hardly a positive. People are paying for faster speeds, faster phones, faster CPUs, and yet they get nothing in return.
The fact that hardware is faster is no excuse for the very real web bloat, especially when most of this bloat is due to stuff that adds zero value for customers, like tracking scripts, annoying pop-ups, overly complex and intentionally confusing tracking disclaimers, etc.
unknown|5 years ago
[deleted]
ksec|5 years ago
Yes. And also:
The single-core performance of the entry-level iPhone, the iPhone SE, is faster than that of flagship Android phones.
And that is not counting system and software efficiency.
ksec|5 years ago
It is an unpopular statement. But you can't deny it.