Seems like strong competition to Cloudflare (https://www.cloudflare.com/). I always thought that was their greatest risk: someone with a huge, worldwide, already-paid-for CDN stepping in and offering the same service.
They still have the statistics and security features going for them, though...
Cloudflare does a LOT more than this. In fact, the area where they cross over is the area that interests me the least; I am quite capable of making a website balls-to-the-wall fast without either of their products, but Cloudflare leverage their size to accomplish things I literally could not accomplish without them. I initially wrote Cloudflare off because of this; they don't do a great job of telling you about the coolest things they do.
Google have strong statistics/reporting products in other areas, and I can't see it being long until they add those into the mix for Page Speed (if they haven't already).
It's pretty interesting what direction Google is pushing in.
Since all the requests are handled by Google (you have to change the DNS servers to point to Google), this will give them complete website profiles: what surfers visit, and how they can rank websites better. They could even adapt their crawling based on that; no need to crawl websites that no human has ever visited.
I would also consider that they might evaluate the logs from their open DNS servers to rank websites and decide what's important and what's not.
It's like having a cookie in AdSense or Analytics, just without anybody noticing it.
> Page Speed Service fetches content from your servers, rewrites your pages by applying web performance best practices, and serves them to end users via Google's servers across the globe
I don't get it. Given that most websites have some dynamic content, and that static assets can be updated, edited or otherwise altered at any time, Google must be fetching content on a request-by-request basis, i.e. acting as a proxy. How can this be faster? You've got the existing latency of fetching your content, PLUS the extra latency of Google zipping and concatenating assets and forwarding the result on to the client. Doesn't make any sense.
Didn't make sense to me either. Now I realize it optimizes static content (CSS, JavaScript and images) on the first run, then serves the optimized versions until you update them. There is some latency in fetching dynamic content, and that's the reason optimized sites are slower in this test.
The cache-and-optimize-offline strategy that Google Page Speed is using is fraught with problems for dynamic sites. We tried this at first with some CloudFlare optimizations and it was a disaster. We rewrote the system from scratch to do on-the-fly optimization so it works even with dynamic sites. Here's a blog post about it from some time ago:
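For contrast, a rough sketch of what on-the-fly rewriting means (hypothetical code, not the actual pipeline from the linked post): the transform runs on every response, so it doesn't matter whether the HTML is dynamic.

```python
import re

def rewrite_html(html):
    """Hypothetical per-request rewrite: collapse whitespace between tags.
    A real rewriter would also minify, combine and re-reference
    sub-resources, but the key property is that it operates on the
    response itself rather than on a cached offline copy."""
    return re.sub(r">\s+<", "><", html)
```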
If anyone on HN thinks their site might be sub-optimal on the front-end, contact me on [email protected] and I'll take a look, and let you know what you can do to speed things up :)
If anyone could give me feedback on why the above is being downvoted, I'd appreciate it. Ta!
Edit: I'd actually like feedback please... it went down to 0, then up to 1, then down to 0 again.. and then back up to 2 after I posted this reply. I wasn't asking for upvotes, I want to know what it is that is making some people downvote this. Is it not in keeping with the etiquette here, is there some unspoken rule I'm violating?
It is not technically possible, because to use the service a web site has to change its hosting provider to Google. There are no longer any servers outside Google serving the content, if I understand the bit about the DNS change correctly.
Absolutely great. What more do people want? It's a free service that helps improve your site. I ran the Google Page Speed test plus WebPageTest and received a list of errors in my project http://www.webdesign-angebote.com
Images and JS in particular caused problems.
And the suggestions are okay, really helpful: {background-color: expression(this.runtimeStyle.backgroundColor = 'none')}
and "Compress Images", but that's been my fault.
I adopted them immediately and hope for improved user and ranking results.
I really don't see the point of passing it through them. Instead I'd rather they just tell me how to optimize it so I can apply the improvements myself.
Good stuff, but I'm not sure if optimizing a website consisting of static pages this way really makes sense. If the website was done by an idiot then yes, but most users probably won't even notice any change in speed if the website was developed correctly in the first place.
Also I probably wouldn't trust some other company enough to proxy my stuff to my users for the price of a slight optimization.
Testing from within their own datacenters would be biased _very_ heavily toward their own service, don't you think? These tests are only meaningful if performed on an external ISP. Note also that if you do the test it says "US East (Virginia) - IE 8 - DSL", suggesting that it's on a regular DSL connection, not an EC2 box.
WebPageTest originally came out of AOL as an open source project. Patrick Meenan was the lead on that. He recently took a job at Google, but WebPageTest is still open source and has contributions from lots of companies who care about performance. For example, my company Torbit sponsors a WebPageTest instance in Ireland.
AWS regions and what Google is using are common to many service providers. These are typical peering locations. Google does have the benefit of having caches local to various ISPs.
+1 useful. On a related note, does anyone know of something similar that can diagnose and optimize a browser's connection to the internet (the other side of end-to-end performance)?
Just another example of Google trying to improve the world.
I would personally be a bit worried about the level of control this would mean giving up. But I just like to tinker. I know plenty of people for whom this is perfect.
Seems hard to believe their motivation for seeing a ton of consumer web traffic on third party sites is to improve the world. I mean, obviously google analytics isn't some altruistic attempt to help people track their web usage better.
dave1010uk:
The dashboard docs page [3] shows that there's some analytics but not a great deal.
[1] https://code.google.com/speed/pss/docs/settings.html
[2] http://www.pssdemos.com/
[3] https://code.google.com/speed/pss/docs/dashboard.html
damoncloudflare:
We also do far more than just act as a CDN.
jbk:
I must be missing something obvious...
niyazpk:
To be fair, we have done a lot of optimization on the site, but it still feels like I may be missing something obvious.
eastdakota:
http://blog.cloudflare.com/an-all-new-and-improved-autominif...
ck2:
It should be technically possible but I doubt they will offer that.
crc321:
http://blog.craigrcannon.com/post/8170801499/my-site-tested-...
rkalla:
I was assuming webpagetest was a Google property and figured they would use their own data centers.
devmach:
- original: load time 5 seconds *
- optimized: infinity...
I think they have to optimize their optimizer...
* all JS, CSS & image files total about 900 KB, and the connection is 1 Mbit ADSL.