
The average top 1,000 web page has grown 150% in three years

45 points | sebg | 12 years ago | webperformancetoday.com

55 comments

[+] daviddaviddavid|12 years ago|reply
The explosion in use of web fonts is something I find a bit mystifying. There's nothing more jarring than watching the font of a web page change at load time. It's hard for me to imagine that the positives outweigh this one potential negative. It's like watching someone put their wig on in the morning.

I'm sure that there are tons of sites using webfonts in a way such that load time isn't visually compromised. However, I've seen the "font flip" at sites such as nytimes.com, which I assume pays great attention to performance.

[+] nucleardog|12 years ago|reply
I used to find it pretty confusing too. I mean, for most of the shit I worked on I was quite happy with Arial or some monospace font.

Since then, I've worked with some truly excellent designers. They're people who are great at design, but know very little about the web, so they tend to bring some of the more 'traditional' design concepts with them.

I've seen that the difference in feeling/perception a font can lend your message, when used well, is much more significant than I ever would have presumed.

Simply put, there is an aspect of your message that cannot be conveyed with fonts that are available across every device and operating system. (What... Arial, Verdana, Georgia and Times New Roman?).

I still don't necessarily grasp the difference between, say, "Proxima Nova" and "Montserrat" (Google Web Font), but at this point I'm a little more willing to trust the designer.

[+] lignuist|12 years ago|reply
Using web fonts is one of the most annoying web design trends. My browser sometimes loads the wrong font for some reason, which for example leads to news rendered with an unreadable baroque typeface.

I would prefer to have them disabled completely, but didn't find an easy way to achieve this.

[+] lucian1900|12 years ago|reply
That's precisely why WebKit chooses instead not to render any text that depends on the font. I find that even more annoying, as I often get to see images first, then wait a lot longer to see text.

Afaik only Firefox chooses to change fonts later.

[+] kevincennis|12 years ago|reply
It would be interesting to know what the median has done.

The average doesn't really tell much of a story, since there's a lower bound at 0MB but no real upper bound. My guess would be that a certain subset of sites have gotten a lot bigger and pulled the average up, while the remainder have seen more modest growth.
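The mean/median point can be sketched with made-up numbers: a small subset of very heavy pages pulls the mean far above the median, while the "typical" page barely moves.

```python
# Toy illustration (invented page weights): a handful of very heavy
# pages drags the mean upward while the median stays representative.
import statistics

# Hypothetical page weights in KB: most sites are modest, a few are enormous.
page_weights_kb = [300, 350, 400, 450, 500, 550, 600, 5000, 8000, 12000]

mean_kb = statistics.mean(page_weights_kb)
median_kb = statistics.median(page_weights_kb)

print(f"mean:   {mean_kb:.0f} KB")   # dominated by the outliers
print(f"median: {median_kb:.0f} KB") # the 'typical' page
```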

Anecdotally, it definitely seems like news sites and blogs have seen a pretty staggering increase in page weight in the past few years. It's not uncommon these days to see a ridiculous number of resources loaded from 30+ domains on a single site (CNN.com, I'm looking at you).

[+] ams6110|12 years ago|reply
Advertising and linking to every social network under the sun, as well as a lot of gratuitous JavaScript dancing baloney.
[+] onion2k|12 years ago|reply
Is 'page size' alone enough of a reason to worry about this sort of thing? We don't download all the assets used every time we load a page.

Just a thought, but the total size of assets on the page ignores the use of caches and CDNs for common assets - if I visit 100 websites that all use Google's CDN to deliver jQuery, I'll download a few hundred KB and do 99 HTTP requests that return just a "nothing changed" header. The fact that those 100 websites are all a few hundred KB bigger than they were a few years ago means very little.

If I visit the same website every day for 100 days straight I'll probably only download jQuery once even if it's hosting the file itself because of my browser cache.

Obviously it's preferable if a site optimises things where it can, but I don't think a 150% increase in total page size equates directly to a 150% increase in data usage. It might for archive.org, but they're pretty atypical web users.

tl;dr Overall page size is less important than setting caching headers properly.
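A toy sketch (invented content, not a real server) of the conditional-request logic behind those "nothing changed" responses: the browser replays the ETag it cached, and a match gets a body-less 304.

```python
# Minimal sketch of HTTP conditional requests: derive an ETag from the
# content; if the client's If-None-Match still matches, answer 304 with
# an empty body so nothing is re-downloaded.
import hashlib

def make_etag(body):
    # A strong ETag derived from the content (one common approach).
    return '"%s"' % hashlib.sha256(body).hexdigest()[:16]

def respond(body, if_none_match=None):
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b"", {"ETag": etag}  # headers only, no payload
    return 200, body, {"ETag": etag, "Cache-Control": "public, max-age=86400"}

jquery = b"/* stand-in for ~90 KB of minified jQuery */"

status, payload, headers = respond(jquery)              # first visit: full download
status2, payload2, _ = respond(jquery, headers["ETag"]) # revisit: 304, empty body
print(status, len(payload), status2, len(payload2))
```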

[+] pwnna|12 years ago|reply
I don't usually like to host my JavaScript elsewhere, such as on the Google CDN. Are there any web standards in the works where you can specify a hash of a file to be cached regardless of where it is?
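The hash-verification half of this can be sketched in a few lines. The format below mirrors what Subresource Integrity uses (a base64-encoded sha384 digest); whether browsers would ever share caches keyed by hash is a separate question.

```python
# Sketch of content-hash verification: identify a script by a digest of
# its bytes, so the same file can be trusted no matter which host served it.
import base64
import hashlib

def integrity_value(body):
    # "sha384-<base64 digest>", the format Subresource Integrity uses.
    digest = hashlib.sha384(body).digest()
    return "sha384-" + base64.b64encode(digest).decode()

def verify(body, expected):
    return integrity_value(body) == expected

script = b"console.log('hello');"
tag_hash = integrity_value(script)
print(verify(script, tag_hash))               # True
print(verify(script + b"tampered", tag_hash)) # False
```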
[+] coldtea|12 years ago|reply
>Is 'page size' alone enough to worry about this sort of thing? We don't download all the assets used every time we load a page.

Actually, we do download most of them. Browser caches are mostly useless for most of the stuff, including widely used JS frameworks and fonts. There were a few widely circulated articles on this.

[+] jol|12 years ago|reply
OK, but the article says there are more images (and other media too): images make up 50% of a web page now, and image growth contributes ~400KB out of ~900KB. Images tend to change, given that the top 1,000 pages are dynamic, so no cache will help there. Also, I find it interesting that stylesheets are growing, given that CSS3 is more powerful now and IE6 is likely to be retired for most of these sites. What's interesting is what makes up this "other" part: is it just web fonts, or some media (also very cross-site cache friendly)?
[+] ams6110|12 years ago|reply
99 HTTP requests still take some time, even if the response is a 304.
[+] Joeri|12 years ago|reply
I wonder how much of this growth is due to retina images. A consequence of the page bloat is that the iPad 1 has become almost unusable for browsing the web because it has only 256 MB of RAM. If the current page doesn't fit in RAM, Safari closes down. I remember at one point that browsing Slashdot and Amazon on a machine with 128 MB of RAM worked just fine. There's no reason for the current bloat aside from simply not caring about efficiency.
[+] Ensorceled|12 years ago|reply
I'm guessing this is due to the current trend towards "infinite" scrolling pages with lots of images instead of the old "everything above the fold" school of design.

Much of this is being driven by mobile devices forcing users to get used to scrolling anyway; sites may as well take advantage of that.

For mobile surfing, I'm much happier with one large slow page than trying to navigate a bunch of smaller sub pages, each of which is also slow and hard to get to.

[+] crayola|12 years ago|reply
Two remarks:

- Median would be nice to have as well, as it is more robust to outliers.

- I would like to know whether this is driven by big new pages receiving many visits (changes in user behaviour), or by existing pages becoming bigger over time (changes in web practices).

[+] jayhuang|12 years ago|reply
I think many of us expected this. In previous years, much of the discussion and knowledge-sharing among web developers has been about ways to minimize page-loads, and about flash dying.

Nowadays, with the proliferation of JS MV* frameworks, Node on the server side, and HTML5 games in JS, much of the focus and attention has been on how to find better performance in our JS code, how to deal more efficiently with the DOM and repaints, and how to get JS to a state where we can have much more complex games inside the browser. It also has to do with pushing everything to the client side and the increased number of libraries being used (many of which are probably in the user's browser cache already).

That said, I'm not sure if it's the change in focus alone, or if it's also because many of the newer web developers weren't yet working on the web back when the discussion was about page loads and when people cared more about supporting legacy-legacy-legacy browsers (I'm glad we're slowly letting that go...).

Regarding web fonts, well I guess it has to do with our obsession with pretty apps and thus, pretty fonts too.

[+] TimPC|12 years ago|reply
I think the problem is in part due to looking at raw data. In economics, any serious data set needs to be adjusted for inflation; when talking about bandwidth usage we need to do the same: how do the increases in data compare relative to the increases in bandwidth? Perhaps several graphs would help (one for mobile and one for desktop infrastructure). I suspect the growth in inflation-adjusted terms is probably still significant, but less so if we're looking at a time period where many users have gone from 3G to 4G on mobile.

I suspect the custom font behaviour is driven in part by how easy/inexpensive custom fonts are on native platforms, and by the fact that brands now try to associate a font with their identity in technology. We saw this with Helvetica and other fonts as an expression of corporate identity in the real world in a period when conformity was more valued; now the culture is everyone trying to be unique, so it's not surprising to see resources spent on different custom fonts.
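The "inflation adjustment" idea can be sketched with invented figures: divide page weight by typical bandwidth to compare transfer burden across years rather than raw bytes.

```python
# Back-of-envelope "inflation adjustment" with made-up numbers:
# a page that grew 150% can still transfer faster if bandwidth
# grew more than the page did.
def transfer_seconds(page_kb, bandwidth_mbps):
    # KB -> kilobits, divided by megabits/s expressed in kilobits/s.
    return (page_kb * 8) / (bandwidth_mbps * 1000)

# Hypothetical figures, for illustration only.
then = transfer_seconds(page_kb=600, bandwidth_mbps=5)    # 3 years ago
now = transfer_seconds(page_kb=1500, bandwidth_mbps=20)   # today

print(f"then: {then:.2f}s, now: {now:.2f}s")
```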
[+] gravedave|12 years ago|reply
While the initial chart is telling, I find the subsequent analysis disappointing.

* It's not all about raw data amounts. Sure, images are the biggest share of traffic, but their size "only" doubled in the last 3y (according to my eyeing the article's chart).

* On the other hand, scripts seem to have tripled or quadrupled in size.

* The "Other" content also looks significant enough to warrant a deeper look, since it currently seems to be bigger than Flash, HTML and CSS combined, and has also grown most significantly. What's this "Other" content, and in what amounts? Web fonts? XML? JSON?

* The pie chart under #2 of the article is horrible.

* How do CDNs and caching factor into all this? How much of the shown amounts must really be downloaded every time?

[+] tux|12 years ago|reply
.. and yet I hardly see any speed difference on cable, because many websites use a CDN/cache. Also, many users have switched from heavy websites to lean sites that are very light and fast. Using an alternative DNS like OpenDNS helps ^_^
[+] paaaaaaaaaa|12 years ago|reply
I think a chart of average load times would be a lot more interesting.

If they are getting slower, then we web developers are doing it wrong.

[+] mavhc|12 years ago|reply
How does this affect un-upgradable computers like ultrabooks? You can't even add more RAM to them.
[+] coldtea|12 years ago|reply
Not much.

In the current market, you're supposed to upgrade your laptop every 3-4 years anyway.

[+] iaskwhy|12 years ago|reply
I would find it interesting to know how the average height changed over time. It seems now it's much more acceptable to have really tall pages unlike the above-the-fold way of thinking of the previous years.
[+] adventured|12 years ago|reply
The title is very misleading, I'd argue that it's link-bait.

The average is for only the top 1,000 sites.

I expected a large scale study of the Web. The top 1,000 sites are a very poor representation for the wider Web.

[+] netrus|12 years ago|reply
Yet they represent an overwhelming part of the page loads users experience (certainly >50%, at least for people in big countries).
[+] mtkd|12 years ago|reply
What's the benefit of using src="data:image..." for the images?
[+] pastr|12 years ago|reply
You save an HTTP request.
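For context, a sketch of the trade-off (in Python, with stand-in bytes rather than a real image): inlining removes a request but adds base64 overhead to the HTML itself.

```python
# Inlining a small image as a data: URI removes an HTTP request, at the
# cost of ~33% base64 overhead inside the HTML (and of bypassing the
# image's own cache entry).
import base64

# Stand-in bytes for a tiny image (not a real, renderable GIF).
image_bytes = b"GIF89a" + b"\x00" * 30

data_uri = "data:image/gif;base64," + base64.b64encode(image_bytes).decode()
print(len(image_bytes), len(data_uri))  # raw size vs inlined size
```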
[+] GigabyteCoin|12 years ago|reply
Am I the only one here who thinks a 151% increase in 3 years is modest at best?

If I am not mistaken, the cost of wholesale bandwidth continues to drop at a near exponential rate year after year.

[+] ubercow13|12 years ago|reply
What about the speed of home connections?
[+] purephase|12 years ago|reply
In my experience, design/UX needs will always trump dev/ops. I don't see this changing in the near future. I think we need to figure out ways to make browsers and/or delivery improve instead.
[+] youngtaff|12 years ago|reply
Performance is all about User Experience
[+] asdasf|12 years ago|reply
>In my experience, design/UX needs will always trump dev/ops

What does that have to do with anything? Things are not getting bigger because of UX needs, they are getting bigger contrary to UX needs. They are getting bigger because marketing weasels have always wanted things to be this bloated and shitty, and now they think everyone has high bandwidth, low latency connectivity so they can force designers to do it now.

[+] eonil|12 years ago|reply
I believe most of that Flash is ads or YouTube embeds.