item 1253783

It’s Official: Google Now Counts Site Speed As A Ranking Factor

107 points | ilamont | 16 years ago | searchengineland.com | reply

53 comments

[+] tokenadult|16 years ago|reply
This is a very user-friendly policy.

P.S. Something to think about: the first time I tried to post this comment, there was a problem loading the page. HN is actually one of the slowest sites I put up with--its great quality is worth the wait, unlike the quality of most websites.

[+] ShardPhoenix|16 years ago|reply
It usually seems pretty fast to me, but maybe that's because I mostly browse outside of US peak hours.
[+] noodle|16 years ago|reply
agreed, specifically about HN. it seems to have gotten much worse as of late. probably attracting a larger crowd nowadays, since more people are migrating here.
[+] g89|16 years ago|reply
Agreed about the policy, but what's that about HN being slow? It loads incredibly fast for me, much faster than other news aggregators (Reddit, etc.)
[+] swombat|16 years ago|reply
Interestingly, they appear to count loading times of third party stuff (which doesn't affect the display of the main body of content, e.g. the blog post body) as part of the site load time.

This means that if you use Disqus and a few badges from Reddit and the like, even if you set them up so that they don't slow down your main content, Google will hold it against you. That's a little... not great.

[+] snprbob86|16 years ago|reply
How do you know that? Are you going by the webmasters tool performance page? I'd imagine that view is a very simplified perspective on the actual metrics that search uses for ranking.
[+] callmeed|16 years ago|reply
What about Flash movies loaded with JS (like swfobject)?
[+] petesalty|16 years ago|reply
Wouldn't this also have an impact on ad heavy sites, even if the ads load after the content?
[+] vaksel|16 years ago|reply
I have two concerns here (well, 3, but swombat covered that one).

1. If you are going to penalize sites for being "slow", then how about telling us what slow is? Is a site that loads in 10 seconds "slow"? Is a site with a 5MB index page that loads in 15 seconds "slow"? How about some metrics so that we can optimize properly?

And what will that do to a lot of content rich sites? If you have a lot of images/flash/javascript it sounds like you are going to get screwed for trying to make a better looking user experience.

2. Of course Google's search results experiment showed that speed affects user satisfaction: on the results page, you're scanning and doing lots of searches. BUT when you click through to the result you want, I think most people would be willing to wait 2 extra seconds for more relevant information.

Sounds like this is yet another attempt at boosting big sites, where large sites like eHow and Mahalo get preference in results just because they can afford faster servers.

[+] nostrademons|16 years ago|reply
Some of the very worst offenders are big sites, e.g. cnn.com takes about 2.5 seconds to load close to 100 external resources. Mahalo takes about 2 seconds for about 40 external resources.

As for metrics - as a user, I can say that 10 seconds is too slow. So is a 5M index page that loads in 15 seconds (what the hell do you need 5 megs of data on your index page for? That's like a 5 minute YouTube video). I want to see results within 1-2 seconds of clicking on a page; otherwise, you've broken my train of thought and I need to mentally context-switch each time I visit a page.

[+] pavs|16 years ago|reply
You do need a faster server to serve more people, but you don't (necessarily) need a faster server to make a faster site. There are some exceptions. If you have a site that takes 10 seconds to load even with low traffic, something is wrong with your configuration.
[+] brandnewlow|16 years ago|reply
Not pleased. It can cost a lot of money to have a fast site.

If you're a neighborhood blog trying to make a go of it, you can't afford a developer to optimize your site and cache the crap out of it. Meanwhile, the local newspaper site, running a tag archive page for your neighborhood powered by Outside.in or some other McLocal scraper app, can do that. You lose every time on the speed front, despite having original content.

[+] eli|16 years ago|reply
Not disagreeing, but I think it's often a matter of setting up or tweaking your caching strategy, not throwing hardware at the problem.
[+] sokoloff|16 years ago|reply
If your end users want a fast site (I would imagine most do), then the faster sites are going to win anyway, with everything else being equal.

At worst, Google is just reflecting that reality (in ~1% of searches where it changes the results).

Personally, I would prefer to compete in a world where everything relevant is "up for competition". You can compete on content, speed, price, or a myriad of other factors. If your complaint is that you want to compete on all of those things EXCEPT speed because that metric is unfair in your opinion, I don't have sympathy for that point of view.

[+] postfuturist|16 years ago|reply
Many blog engines, even WordPress with the right plugin, serve your pages as static files. Plenty of speed that way.
[+] ComputerGuru|16 years ago|reply
Google is saying my site takes 7 seconds to load... Well, it doesn't.

http://neosmart.net/dl.php?id=1 is one of the slowest pages... according to them.

Browse it and see for yourself. It's super fast.

TribalFusion and PubMatic take some time, as does the user-tracking JS, but (a) not 7 seconds and (b) they do not affect the actual content.

[+] dkubb|16 years ago|reply
FWIW, I just visited your link and with a cold cache it took ~5 seconds for the status bar to report that every asset was downloaded. Definitely not a scientific measurement, but I thought you might like to know.

I also ran the site through http://www.webpageanalyzer.com/ (one of many such services), and it said that on a T1 it would take approximately 6.5 seconds to load. It also suggests a number of improvements to cut down the page size and improve rendering speed.
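Ad-hoc stopwatch measurements like the ones in this thread are easy to script. A minimal Python sketch (this times only the single HTTP response for the document itself, not the assets the page goes on to load, so it will understate the numbers people are reporting here):

```python
import time
import urllib.request

def time_download(url, timeout=10):
    """Stopwatch for one resource: returns (seconds elapsed, bytes received).

    Note this measures a single request only -- it says nothing about
    the images, scripts, or ads the page loads afterwards.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)

# e.g.: elapsed, size = time_download("http://neosmart.net/dl.php?id=1")
```

A browser's "done" indicator waits for every asset, which is why figures from a script like this and from the status bar rarely agree.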

[+] ShardPhoenix|16 years ago|reply
That page took several seconds to fully load for me (no more spinning thing) the first time, though the main content came up quicker.
[+] cullenking|16 years ago|reply
Definitely took > 5 seconds for me (measured by guesstimate, until last asset loaded).
[+] davidmurphy|16 years ago|reply
Could be faster in my experience. Not great.
[+] lwhi|16 years ago|reply
You run a popular site, with little cash.

How can you afford to keep your access time down?

Host advertising, perhaps?

[+] chaosmachine|16 years ago|reply
Don't worry about it.

Unless your site is taking 10 seconds to respond, it probably won't affect you. Google has said this change will affect only 1% of queries, so there's a 99% chance you're fine.

[+] pavs|16 years ago|reply
iframe ads.

Most popular sites use iframes either to asynchronously load JavaScript ads or to load them on a separate page, so that they don't affect your initial site speed. Most popular ad platforms also offer iframe-specific ad codes; you just have to ask for them (I know Adify does). If they don't offer iframe codes, ask whether it's against their policy to load their codes in an iframe; they might make exceptions for high-traffic sites (Ars Technica loads all of its ads in iframes).

For general optimization, Yahoo has an excellent resource page: http://developer.yahoo.com/performance/rules.html I was able to bring my site from ~8-9s loading time down to ~2-3s, running on a not-too-powerful server.

Three optimizations that worked great for me:

- CDN for static files (MaxCDN has a cheap introductory offer of 1TB for $9.99 and supports pull zones)

- Minify and gzip CSS and JS files, then serve them from the CDN.

- PHP opcode cache (APC, eAccelerator, or XCache).

I am trying to get below the 2-second mark now.
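The minify-and-gzip step makes a big difference for text assets. A quick illustration (the CSS string is a made-up stand-in, but real stylesheets and JS, being highly repetitive, compress similarly):

```python
import gzip

# Toy stand-in for a site's CSS bundle; repetitive text, like real CSS,
# compresses very well.
css = ".header { color: #333; margin: 0; padding: 0 1em; }\n" * 200

compressed = gzip.compress(css.encode("utf-8"))
print(f"raw: {len(css)} bytes, gzipped: {len(compressed)} bytes "
      f"({len(compressed) / len(css):.0%} of original)")
```

In practice you let the web server (or the CDN edge) do this via `Content-Encoding: gzip` rather than compressing by hand; the point is only that text assets shrink dramatically.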

[+] kadavy|16 years ago|reply
“Quality should still be the first and foremost concern [for site owners],” Cutts says. “This change affects outliers; we estimate that fewer than 1% of queries will be impacted. If you’re the best resource, you’ll probably still come up.”
[+] kpanghmc|16 years ago|reply
What if the page you're looking for just happens to contain a lot of content / images / etc?

Adding speed to Google's ranking algorithms is only useful for searches where there are several equally good results (in which case the fastest is the one you'd want). But when you're actually searching for a lot of information, having fast (but less informative) sites float to the top would be detrimental.

[+] donaldc|16 years ago|reply
Speed is only one factor in their ranking. If a site is notably better than other sites for a query, they said in the article that it will still rank first for that query.
[+] yanw|16 years ago|reply
As mentioned, relevance is still king:

“Google also cautions web site owners not to sacrifice relevance in the name of faster web pages, and even says this new ranking factor will impact very few queries.”

It will only factor into less than 1% of queries.

[+] jbyers|16 years ago|reply
If it is true that non-HTML resources are a factor based on Google Toolbar reporting, that's scary. On high confidence sites -- more than 1000 datapoints, in Google Webmaster terms -- average speed seems stable and correlated with other performance measures. This is not true in my experience with medium confidence sites, 100 to 1000 Toolbar datapoints.

I'm hopeful that Googlebot is the primary signal.

[+] Raphael|16 years ago|reply
IIRC, this has been a factor on AdWords for some time.
[+] nfriedly|16 years ago|reply
I like this move. I hate it when I click a link and it ends up taking 20+ seconds to load.
[+] luckyland|16 years ago|reply
Yet another standard of measurement that is not directly related to the number one reason anyone uses Google: finding the most appropriate content.
[+] metamemetics|16 years ago|reply
How would googlebot measure page-load speed accurately? Wouldn't it ignore a slow site's cruft-filled JavaScript and stylesheets?
[+] nostrademons|16 years ago|reply
The GoogleBot's aware of your cruft-filled JavaScript and stylesheets...
[+] eli|16 years ago|reply
The Google toolbar
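On the question of what a crawler can even see: a fetcher that only downloads HTML can still enumerate the external assets a browser would have to request afterwards. A rough sketch using Python's stdlib parser (the sample page is invented for illustration):

```python
from html.parser import HTMLParser

class ExternalResources(HTMLParser):
    """Collect the extra requests a browser makes after the HTML arrives:
    external scripts, stylesheets, and images."""

    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.urls.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.urls.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.urls.append(attrs["src"])

# Invented sample page: one stylesheet, one third-party ad script, one image.
sample = """<html><head>
<link rel="stylesheet" href="/site.css">
<script src="http://ads.example.com/tag.js"></script>
</head><body><img src="/logo.png"></body></html>"""

parser = ExternalResources()
parser.feed(sample)
print(parser.urls)  # ['/site.css', 'http://ads.example.com/tag.js', '/logo.png']
```

Counting (or timing) these extra requests gives a crude load-time estimate, but it still misses resources that JavaScript injects at runtime, which is presumably why toolbar data from real browsers is the more accurate signal.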