P.S. Food for thought: the first time I tried to post this comment, there was a problem loading the page. HN is actually one of the slowest sites I put up with--great quality is worth the wait, though the quality of most websites isn't.
Agreed, specifically about HN. It seems to have gotten much worse of late--probably because it's attracting a larger crowd now that more people are migrating here.
Agreed about the policy, but what's that about HN being slow? It loads incredibly fast for me, much faster than other news aggregators (Reddit, etc.).
Interestingly, they appear to count the loading time of third-party resources (which don't affect the display of the main body of content, e.g. the blog post itself) as part of the site's load time.
This means that if you use Disqus and a few badges from Reddit and the like, even if you set them up so that they don't slow down your main content, Google will hold it against you. That's a little... not great.
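(For illustration, a minimal sketch of that kind of deferred setup, where the widget script is injected only after the page's own content has loaded so it can't hold up rendering. The embed URL and element id here are hypothetical, not any real service's API.)

    <div id="comments">Loading comments...</div>
    <script type="text/javascript">
    // Inject the third-party script only after the page's own content
    // has finished loading; until then, nothing blocks the render.
    window.onload = function () {
      var s = document.createElement('script');
      s.src = 'http://widgets.example.com/embed.js'; // hypothetical embed URL
      s.async = true;
      document.getElementsByTagName('head')[0].appendChild(s);
    };
    </script>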
How do you know that? Are you going by the Webmaster Tools performance page? I'd imagine that view is a very simplified perspective on the actual metrics that search uses for ranking.
http://disqus.theresumator.com/apply/RwUhQj/
I have two concerns here (well, three, but swombat covered that one).
1. If you are going to penalize sites for being "slow," then how about telling us what slow is? Is a site loading in 10 seconds "slow"? Is a site with a 5 MB index page that loads in 15 seconds "slow"? How about some metrics so that we can optimize properly?
And what will that do to a lot of content-rich sites? If you have a lot of images/Flash/JavaScript, it sounds like you are going to get screwed for trying to make a better-looking user experience.
2. Of course Google's search-results experiment would show an effect on user satisfaction: when you're searching, you want to run a lot of queries quickly. BUT when you click through to the result you like, I think most people would be willing to wait an extra 2 seconds to load the more relevant information.
Sounds like this is yet another attempt at boosting big sites, where large sites like eHow and Mahalo get preference in results just because they can afford faster servers.
Some of the very worst offenders are big sites; e.g., cnn.com takes about 2.5 seconds to load close to 100 external resources, and Mahalo takes about 2 seconds for about 40.
As for metrics: as a user, I can say that 10 seconds is too slow. So is a 5 MB index page that loads in 15 seconds (what the hell do you need 5 megs of data on your index page for? That's like a 5-minute YouTube video). I want to see results within 1-2 seconds of clicking on a page; otherwise, you've broken my train of thought and I have to mentally context-switch each time I visit a page.
You do need a faster server to serve more people, but you don't (necessarily) need a faster server to make a faster site. There are some exceptions, but if your site takes 10 seconds to load even under low traffic, something is wrong with your configuration.
Not pleased. It can cost a lot of money to have a fast site.
If you're a neighborhood blog trying to make a go of it, you can't afford a developer to optimize your site and cache the crap out of it. Meanwhile, the local newspaper site, running a tag archive page for your neighborhood powered by Outside.in or some other McLocal scraper app, can do that. You lose every time on the speed front, despite having original content.
If your end users want a fast site (I imagine most do), then the faster sites are going to win anyway, all else being equal.
At worst, Google is just reflecting that reality (in ~1% of searches where it changes the results).
Personally, I would prefer to compete in a world where everything relevant is "up for competition". You can compete on content, speed, price, or a myriad of other factors. If your complaint is that you want to compete on all of those things EXCEPT speed because that metric is unfair in your opinion, I don't have sympathy for that point of view.
http://neosmart.net/dl.php?id=1 is one of the slowest pages... according to them.
Browse it and see for yourself. It's super fast.
TribalFusion and PubMatic take some time, as does the user-tracking JS, but (a) not 7 seconds and (b) they do not affect the actual content.
FWIW, I just visited your link, and with a cold cache it took ~5 seconds for the status bar to report that every asset was downloaded. Definitely not a scientific measurement by any means, but I thought you might like to know.
I also ran the site through http://www.webpageanalyzer.com/ (one of many such services), and it said that on a T1 it would take approximately 6.5 seconds to load. It also suggests a number of improvements to cut down the page size and improve rendering speed.
http://tools.pingdom.com/fpt/?url=http://neosmart.net/dl.php...
How can you afford to keep your access time down?
Host advertising, perhaps?
Unless your site is taking 10 seconds to respond, it probably won't affect you. Google has said this change will affect only 1% of queries, so there's a 99% chance you're fine.
Most popular sites use iframes to load JavaScript ads asynchronously, or load them on a separate page, so that they don't affect your initial site speed. Most popular ad platforms also offer iframe-specific tags; you just have to ask for them (I know Adify does). If they don't offer iframe tags, ask whether it is against their policy to load their code in an iframe; they might make exceptions for high-traffic sites (Ars Technica loads all its ads in iframes).
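(For illustration, a minimal sketch of the iframe approach, assuming a hypothetical /ad-slot.html page on your own domain that contains nothing but the ad network's script tag. Because the ad runs in its own document, a slow ad server can't block the parent page's rendering.)

    <!-- The ad network's JavaScript executes inside the iframe's own
         document, so the parent page renders without waiting for it. -->
    <iframe src="/ad-slot.html" width="300" height="250"
            frameborder="0" scrolling="no"></iframe>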
For general optimization, Yahoo has an excellent resource page: http://developer.yahoo.com/performance/rules.html I was able to bring my site from ~8-9 s load time down to ~2-3 s, running on a not-too-powerful server.
Three optimizations that worked well for me:
- A CDN for static files (MaxCDN has a cheap introductory offer of 1 TB for $9.99 and supports pull zones).
- Minify and gzip CSS and JS files, then serve them from the CDN (sketched below).
- A PHP opcode cache (APC, eAccelerator, or XCache).
I am trying to get below the 2-second mark now.
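(For illustration, a hypothetical before/after for the minify-plus-CDN step; cdn.example.com stands in for a real pull-zone hostname.)

    <!-- Before: full-size assets served from the origin server -->
    <link rel="stylesheet" type="text/css" href="/css/site.css" />
    <script type="text/javascript" src="/js/site.js"></script>

    <!-- After: minified, gzipped copies pulled from the CDN -->
    <link rel="stylesheet" type="text/css"
          href="http://cdn.example.com/css/site.min.css" />
    <script type="text/javascript"
            src="http://cdn.example.com/js/site.min.js"></script>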
“Quality should still be the first and foremost concern [for site owners],” Cutts says. “This change affects outliers; we estimate that fewer than 1% of queries will be impacted. If you’re the best resource, you’ll probably still come up.”
What if the page you're looking for just happens to contain a lot of content / images / etc?
Adding speed to Google's ranking algorithms is only useful for searches with several equally good results (in which case the fastest is the one you'd want). But when you're actually searching for a lot of information, having fast (but less informative) sites rise to the top would be detrimental.
Speed is only one factor in their ranking. If a site is notably better than other sites for a query, they said in the article that it will still rank first for that query.
“Google also cautions web site owners not to sacrifice relevance in the name of faster web pages, and even says this new ranking factor will impact very few queries.”
It will only factor into less than 1% of queries.
If it is true that non-HTML resources are a factor based on Google Toolbar reporting, that's scary. On high-confidence sites--more than 1,000 data points, in Google Webmaster terms--average speed seems stable and correlated with other performance measures. In my experience this is not true for medium-confidence sites (100 to 1,000 Toolbar data points). I'm hopeful that Googlebot is the primary signal.