
Page Speed Service

192 points | handraiser | 14 years ago | googlecode.blogspot.com

61 comments

[+] dave1010uk|14 years ago|reply
The settings docs page [1] shows what this service does:

  Combine CSS
  Combine JavaScript
  Image Optimize
  Image Resize
  JavaScript Optimize
  Move CSS to head
  Proxy CSS
  Proxy Images
  Proxy JavaScript
You can see demos of the Page Speed Service doing each of these things at [2].

The dashboard docs page [3] shows that there's some analytics but not a great deal.

[1] https://code.google.com/speed/pss/docs/settings.html

[2] http://www.pssdemos.com/

[3] https://code.google.com/speed/pss/docs/dashboard.html
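To make the "Combine CSS" rewrite concrete, here is a minimal sketch of the kind of transformation an optimizing proxy like this performs. It is not Google's implementation; the `combine_css` function and the `/combined.css` path are hypothetical, and a real rewriter would also fetch and concatenate the referenced files.

```python
import re

def combine_css(html, combined_href="/combined.css"):
    """Replace multiple stylesheet <link> tags with a single one,
    in the style of an optimizing proxy's "Combine CSS" rewrite."""
    link_re = re.compile(r'<link[^>]*rel="stylesheet"[^>]*>')
    links = link_re.findall(html)
    if len(links) <= 1:
        return html  # nothing to combine
    first = True
    def repl(match):
        nonlocal first
        if first:
            first = False
            # Keep one tag, pointing at the combined stylesheet.
            return f'<link rel="stylesheet" href="{combined_href}">'
        return ''    # drop the remaining tags
    return link_re.sub(repl, html)

page = ('<head><link rel="stylesheet" href="/a.css">'
        '<link rel="stylesheet" href="/b.css"></head>')
print(combine_css(page))
```

Fewer stylesheet references means fewer HTTP requests, which is most of the win on pages with many small assets.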

[+] bobfunk|14 years ago|reply
Seems like strong competition for Cloudflare (https://www.cloudflare.com/). I always thought that was their greatest risk: someone with a huge, worldwide, already-paid-for CDN stepping in and offering the same service.

They still have the statistics and security features going for them, though...

[+] ZoFreX|14 years ago|reply
Cloudflare does a LOT more than this. In fact, the area where they overlap is the area that interests me the least: I am quite capable of making a website balls-to-the-wall fast without either product, but Cloudflare leverages its size to accomplish things I literally could not accomplish without them. I initially wrote Cloudflare off because of this; they don't do a great job of telling you about the coolest things they do.
[+] nextparadigms|14 years ago|reply
Does this mean that by going with Google, you'll also mitigate the risk of DDoS attacks? Because I know that was one of Cloudflare's advantages, too.
[+] robtoo|14 years ago|reply
Google have strong statistics/reporting products in other areas, and I can't see it being long before they add those into the mix for Page Speed (if they haven't already).
[+] bluelu|14 years ago|reply
It's pretty interesting what direction Google is pushing in.

Since all requests are handled by Google (you have to change your DNS servers to point to Google), this will give them complete website profiles: what surfers visit, and how to rank websites better. They could even adapt their crawling based on that: no need to crawl websites that no human has ever visited.

I would also expect that they might evaluate the logs from their open DNS servers to rank websites and decide what's important and what's not.

It's like having a cookie in AdSense or Analytics, just with nobody noticing it.

[+] jbk|14 years ago|reply
I think I just don't get it... The optimized version is actually slower than the normal one... http://www.webpagetest.org/result/110728_KN_1bd8e23bea97037c...

I must be missing something obvious...

[+] retube|14 years ago|reply
> Page Speed Service fetches content from your servers, rewrites your pages by applying web performance best practices, and serves them to end users via Google's servers across the globe

I don't get it. Given that most websites have some dynamic content, and that static assets can be updated, edited, or otherwise altered at any time, Google must be fetching content on a request-by-request basis, i.e. acting as a proxy. How can this be faster? You've got the existing latency of fetching your content, PLUS the extra latency of Google zipping and concatenating assets and forwarding the result on to the client.

Doesn't make any sense.

[+] zokiboy|14 years ago|reply
Didn't make sense to me either. Now I realize it optimizes static content (CSS, JavaScript, and images) on the first run, then serves the optimized version until you update the files. There is added latency when fetching dynamic content, and that's why the optimized sites are slower in this test.
[+] eastdakota|14 years ago|reply
The cache-and-optimize-offline strategy that Google Page Speed is using is fraught with problems for dynamic sites. We tried this at first with some CloudFlare optimizations and it was a disaster. We rewrote the system from scratch to do on-the-fly optimization so it works even with dynamic sites. Here's a blog post about it from some time ago:

http://blog.cloudflare.com/an-all-new-and-improved-autominif...

[+] dholowiski|14 years ago|reply
Many business sites still consist mainly of static content.
[+] ZoFreX|14 years ago|reply
If anyone on HN thinks their site might be sub-optimal on the front-end, contact me on [email protected] and I'll take a look, and let you know what you can do to speed things up :)
[+] ZoFreX|14 years ago|reply
If anyone could give me feedback on why the above is being downvoted, I'd appreciate it. Ta!

Edit: I'd actually like feedback please... it went down to 0, then up to 1, then down to 0 again.. and then back up to 2 after I posted this reply. I wasn't asking for upvotes, I want to know what it is that is making some people downvote this. Is it not in keeping with the etiquette here, is there some unspoken rule I'm violating?

[+] ErikD|14 years ago|reply
People should realize that by using this service, all sensitive data posted to your website will be readable by google.
[+] ck2|14 years ago|reply
Can end-users somehow opt-out of the pages they are viewing passing through Google?

It should be technically possible but I doubt they will offer that.

[+] hollerith|14 years ago|reply
It is not technically possible, because to use the service a website has to change its hosting setup to point at Google. There are no longer any servers outside Google serving the content, if I understand the bit about the DNS change correctly.
[+] RyanKearney|14 years ago|reply
Why would you want to?
[+] scytale|14 years ago|reply
Absolutely great. What are people waiting for? It's a free service that helps improve your site. I ran the Google Page Speed test plus WebPageTest and received a list of errors for my project http://www.webdesign-angebote.com; especially the images and JS caused problems.

And the suggestions are okay and really helpful, e.g. `{background-color: expression(this.runtimeStyle.backgroundColor = 'none')}` and "Compress Images", but that was my fault. I adapted them immediately and hope for improved user experience and ranking results.

[+] bcl|14 years ago|reply
I really don't see the point of passing it through them. Instead I'd rather they just tell me how to optimize it so I can apply the improvements myself.
[+] pawelwentpawel|14 years ago|reply
Good stuff, but I'm not sure if optimizing a website consisting of static pages this way really makes sense. If the website was done by an idiot, then yes, but most users probably won't even notice any change in speed if the website was developed correctly in the first place.

Also I probably wouldn't trust some other company enough to proxy my stuff to my users for the price of a slight optimization.

[+] rkalla|14 years ago|reply
Anyone notice all the testing locations are AWS Regions?

I was assuming webpagetest was a Google property and figured they would use their own data centers.

[+] bdonlan|14 years ago|reply
Testing from within their own datacenters would be biased _very_ heavily toward their own service, don't you think? These tests are only meaningful if performed from an external ISP. Note also that if you run the test it says "US East (Virginia) - IE 8 - DSL", suggesting that it's on a regular DSL connection, not an EC2 box.
[+] joshfraser|14 years ago|reply
WebPageTest originally came out of AOL as an open source project. Patrick Meenan was the lead on that. He recently took a job at Google, but WebPageTest is still open source and has contributions from lots of companies who care about performance. For example, my company Torbit sponsors a WebPageTest instance in Ireland.
[+] jackwagon|14 years ago|reply
The AWS regions and the locations Google uses are common to many service providers; these are typical peering locations. Google does have the benefit of having caches local to various ISPs.
[+] brown9-2|14 years ago|reply
That doesn't prove much though, does it? "Northern Virginia" is a pretty broad region.
[+] smtroan|14 years ago|reply
+1 useful. On a related note, does anyone know of something similar that can diagnose and optimize a browser's connection to the internet (the other side of end-to-end performance)?
[+] zokiboy|14 years ago|reply
Does this work with dynamic web pages, like Facebook or news.ycombinator.com?
[+] eli|14 years ago|reply
I haven't checked, but I assume it works fine if you've got all your Cache and Vary headers set properly.
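For reference, here is a hypothetical example of the kind of headers eli means: a response that an intermediary cache can store and revalidate safely. The specific values are illustrative, not a recommendation from the service's docs.

```python
# Response headers that let a shared cache (such as an optimizing proxy)
# store a page and keep variants straight:
headers = {
    "Cache-Control": "public, max-age=3600",  # shared caches may store for 1h
    "Vary": "Accept-Encoding",                # keep gzip/plain copies separate
    "ETag": '"v1-abc123"',                    # cheap revalidation token
}

for name, value in headers.items():
    print(f"{name}: {value}")
```

Without `Vary`, a proxy could serve a gzipped copy to a client that can't decompress it; without `Cache-Control`, it would have to treat every response as uncacheable dynamic content.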
[+] devmach|14 years ago|reply
Just out of curiosity, I checked one of my web-based applications:

- original: load time 5 seconds *

- optimized: infinity...

I think they have to optimize their optimizer...

* : all JS, CSS, and image files total about 900 KB, and the connection is 1 Mbit ADSL.

[+] pavpanchekha|14 years ago|reply
Just another example of Google trying to improve the world.

I would personally be a bit worried about the level of control this would mean giving up. But I just like to tinker. I know plenty of people for whom this is perfect.

[+] trotsky|14 years ago|reply
It seems hard to believe that their motivation for seeing a ton of consumer web traffic on third-party sites is to improve the world. I mean, obviously Google Analytics isn't some altruistic attempt to help people track their web usage better.
[+] tybris|14 years ago|reply
Deprecated in 5, 4, 3, ...