I'm not surprised he got these results. While the author raises some valid points, there are a lot of flaws in this test. He doesn't say what the times are for. Is it the time to DOM ready? To the page completely loaded and rendered? Did he load each site at least once before running the tests to make sure CF's caches were primed?
CloudFlare can make your site faster or slower depending on how you use it and what you're serving through it. Since it's a proxy, response times will be higher for dynamic content it needs to fetch (the page itself, unless it's cacheable). Static content with proper cache headers will be served through its CDN, which will almost always be faster, again assuming CF's cache is primed.
The author's findings may be accurate for a base-level install of these platforms with no performance improvements. I believe that without proper caching headers CF is going to have to query your server for everything to make sure nothing has changed. Sending proper cache headers with your static files will eliminate this issue and improve your performance. AFAIK none of these platforms use proper caching headers out of the box.
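To illustrate the point about cache headers, here's a minimal sketch (hypothetical, nowhere near the full RFC 7234 rules) of the freshness check a caching proxy applies; without a max-age from the origin, every request falls through to revalidation:

```python
def is_fresh(response_headers, age_seconds):
    """Simplified proxy-side freshness check: may a cached copy be served
    without contacting the origin? (Real caches follow RFC 7234.)"""
    cache_control = response_headers.get("Cache-Control", "")
    for directive in (d.strip() for d in cache_control.split(",")):
        if directive.startswith("max-age="):
            try:
                return age_seconds < int(directive.split("=", 1)[1])
            except ValueError:
                return False
    # No max-age: the proxy has to ask the origin whether anything changed,
    # which is exactly the per-request round trip described above.
    return False
```

With `Cache-Control: max-age=604800` a week-old asset can be served straight from the edge; with no header at all, nothing can be.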
One last nitpick. Why the hell is this article spread across 6 pages? That's incredibly useless, annoying, and takes away from the user experience.
Edit: Originally I called out the author for claiming his site has a PageSpeed score of 97. I did so because the site issues 57 requests, including 8 CSS and 20 JS files, some of which aren't even minified. I assumed there was no way his score could be 97 with these glaring issues. Turns out I was wrong. I tested it[1] and it does have a score of 97, which just goes to show that PageSpeed and YSlow have plenty of their own issues. IMO his site shouldn't have over a 90 based on these obvious and easy to fix flaws.
1: http://www.webpagetest.org/result/130105_G4_DEA/
I don't really understand CloudFlare's problems with latency. When using full CDN + optimizations, the entire site's HTML and assets should be cached and served super fast via CloudFlare. That is the best-case scenario for them to speed up load times.
But based on my testing, CloudFlare adds more latency to response times even in a best-case scenario of full caching. However, when I do the exact same type of optimizations with the Google PageSpeed Service I see great results.
The most interesting part about CloudFlare's problems is that they weren't there 2 years ago when I first started using it. I blogged about how great it was. Here are the articles I've written about CloudFlare in the past 2 years: http://www.x-pose.org/category/cloudflare/
> Since it's a proxy, response times will be higher for dynamic content it needs to fetch (the page itself, unless it's cacheable).
This is actually not true; I mean, it might very well be true for CloudFlare (as they may be doing something exceedingly stupid at the edge, might have bad connectivity for their servers far from the trunk, etc.), but it does not follow in general: due to how TCP works, there are numerous advantages to both throughput and latency from adding an intelligent middle-man. For reference:
http://news.ycombinator.com/item?id=2823268
http://news.ycombinator.com/item?id=4203371
This article/site appears to be down. Cannot help but wonder if it wouldn't be if they had used CloudFlare...
Jokes aside, I never really expected that CloudFlare would increase the speed of your average site. I mean there is limited caching going on but in general that isn't the benefit of a CDN.
The benefit of a CDN is: consistent speed across geographical zones (Europe, Americas, Asia, Russia, etc), better handling of load variations (Slashdotted, etc), and also some level of DDOS protection (just due to the virtue of more availability).
Any increase in speed supplied by CloudFlare depends a great deal on how the underlying site delivers content and how well cache-control is done. For example, if you move all of your static content onto a static host (e.g. static.example.com) and then set the cache to a week, CloudFlare is going to do a lot more for you than if your app supplies most content dynamically with no-cache set.
A CDN doesn't help at all if it is based on caching only (like CloudFlare). In some cases HTTP headers prevent all the benefits of having a CDN, like pages served with no-store, no-cache, post-check=0/pre-check=0. In these cases a CDN might only add latency.
I have seen sites hosting images with those headers too (like Google sites). If you're using a slower network connection it becomes painfully apparent that all images are always completely redownloaded. With these parameters images must be downloaded from the source; otherwise a caching CDN would break things. Other CDN networks like Coral Cache clearly state that they always cache content, whatever the headers say... but that isn't acceptable for all sites.
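A rough sketch of the distinction being described (simplified; strictly speaking `no-cache` allows storing but forces revalidation on every use, which for a caching-only CDN amounts to the same per-request round trip):

```python
def shared_cache_may_reuse(cache_control: str) -> bool:
    """Rough check: can a shared cache (a CDN edge) usefully reuse a response
    carrying this Cache-Control value without going back to the origin?"""
    directives = {d.strip().split("=")[0].lower()
                  for d in cache_control.split(",") if d.strip()}
    # no-store: may not be stored at all; private: not in shared caches;
    # no-cache: stored, but must be revalidated on every use.
    return not ({"no-store", "no-cache", "private"} & directives)
```

So a response with `no-store, no-cache, post-check=0, pre-check=0` gives the CDN nothing to work with, while `public, max-age=86400` lets it serve a day's worth of hits itself.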
The unlabeled referral link here at the end praising his host bothers me a bit -- shouting praises for a good service is fine, but at least note that you are getting a kickback.
> Because I know I will get an angry email or two, just let me say this. I know that Cloudflare touts other features such as security, load balancing, and keeping your site up if your server is offline. I did not test or take any of these features into account. Their biggest sector is to people that want to speed their website up, so I took them to the task on just that claim.
If this is true and people really think that CDN = faster site, then it's a misunderstanding, perhaps perpetuated by malicious marketing by the CDNs themselves.
But putting a proxy in front of an unloaded site (hosted on a relatively fast server) is of course unnecessary.
Re-run the test using `ab2` instead of simple 3-hit tests, and that's when the CDN becomes more useful. Or perhaps host the site on an oversold bargain shared server and see how it fares.
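A toy version of what an `ab`-style run adds over a 3-hit test: many concurrent requests and latency percentiles instead of a single anecdote. Here `fetch` is a stand-in for whatever actually performs the HTTP request; this is a sketch, not a benchmark tool:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def load_test(fetch, total_requests=100, concurrency=10):
    """Fire `total_requests` calls to `fetch` across `concurrency` workers
    and summarize the latency distribution, roughly what ab reports."""
    def timed(_):
        start = time.perf_counter()
        fetch()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed, range(total_requests)))
    return {
        "n": len(latencies),
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
        "mean": statistics.mean(latencies),
    }
```

Under concurrency like this, a cached edge and an unloaded origin start to look very different, which a 3-hit test can't show.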
This is a small test[1] that shows the absence of miracles. My interpretation is:
• Location, location, location. – Sometimes CloudFlare will have better positioning than your server. In this test the Amsterdam clients benefit significantly compared to his Atlanta, Georgia-based server.
• It is not magic. – For the content he is using (base installs of popular packages and a highly web-optimized site), CloudFlare's compression and optimizations are not helping much (at all?). My next path for further study would be to see whether these base installs are highly optimized already. It seems reasonable that they would be, but it would need to be looked at. Then consider whether you are planning to have unoptimized content (3rd-party generated), or could benefit from skipping the optimization task and living with what CloudFlare does.
• Do you only have one customer? – One of the attractive features for me is I can share my "worst case"[2] customer load with someone who won't even notice the blip. Tests during a deluge would be interesting.
• Does no one hate you? – Haters aren't necessarily sane[3]. Performance tests during a DDOS would be interesting.
I'm evaluating CloudFlare for a site. I don't expect it to out-optimize me. I do expect it to help European and Asian load times, and I have high hopes that it will kick in during bad times and make them less bad. To that end, does anyone know of a friendly DDOS service? I'd like to be able to schedule an X Gbps DDOS for Y seconds with traffic of form Z for testing purposes.
␄
[1] 26 numbers, each made from three samples, five minutes apart, averaged together. No standard deviations.
[2] Which is the best case in the big picture. Just hard for the computers.
[3] I ended up retiring an IP address out of our C block because someone hated it. It was the "friends and family" email and hosting machine back before Facebook and free web mail services. Maybe someone got offended by something, and it drew a persistent DDOS attack that would saturate our incoming IP links. Six months later, when I tried to reuse the IP, the attack came back immediately. I just marked it "unusable" in our DNS files.
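On footnote [1]: with so few samples, reporting the spread matters as much as the mean. A two-line sketch with made-up load times:

```python
import statistics

samples = [2.87, 3.10, 2.95]  # hypothetical load times in seconds, three samples
mean = statistics.mean(samples)
stdev = statistics.stdev(samples)  # the error bar the test doesn't report
```

If the standard deviation is a sizable fraction of the difference between two setups, a three-sample average can't distinguish them at all.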
off-topic: Since you obviously are interested in Unicode (you used • and even ␄), you could continue the trend by using ¹, ² and ³ instead of [1], [2] and [3]. For the corresponding explanation at the bottom, use ①, ② and ③. Also, instead of "worst case", write “worst case” (typographically correct quotation marks) and instead of ', write ’ (typographically correct apostrophe).
In case anyone wonders how to enter them efficiently, I use neo-layout.org which provides ¹ on Mod3+1, ① with Multi_key + ( + 1 + ), “ with Shift+9, ” with Shift+0 and ’ with Mod3+0.
Could it be that your denial of service attack is a misconfiguration somewhere? I had a similar issue at one point, a typo in someone's DNS meant that my private server got absolutely wiped out by junk requests.
I've heard of people using simple load testers like http://loadimpact.com/ to test dynamic pages, though I've not had a serious use for one yet. That's probably as far as you can get without hiring a botnet yourself.
For load testing, you could use apache bench with a fast bandwidth connection, beeswithmachineguns (EC2), locust.io (simulate users), blitz.io (cloud service)
I stopped using CloudFlare because of their "protection service". They would block certain visitors, seemingly at random, and present them with a captcha. The page presenting the captcha had big ads on it.
There's no way that it's acceptable to show ads and captchas to potential customers, before they can even see my website.
Can't you disable any of the features you don't want? I agree that it's tacky to have ads on this page, but what would you expect if you're on the free service?
Yeah, at one stage about 2 years ago they blocked the whole of South Africa, because most web traffic comes from a small number of transparent proxies from each ISP (bandwidth is pretty scarce here). Incredibly irritating
If you are fronting a Heroku-based app on their Cedar platform and have Rails (or whatever) serving up static resources, CloudFlare is a no-brainer: you free up your dynos to handle the dynamic stuff, and it's a few config clicks to get it going, with no crazy asset deployment steps.
Additionally, for the cost of just Heroku's SSL endpoint per month, CloudFlare will effectively issue you a wild-card SSL cert (hundreds of dollars a year) and provide SSL service.
Add on top of that the CDN and the DDOS-mitigation features... Well, suffice to say, I love CloudFlare.
My experience with CloudFlare couldn't be more to the contrary.
We rolled it out for a good few months and gave them quite a while to get their act straight. Ultimately, we had to do an emergency switch to another CDN because the performance was SOOOOOO bad and we had an important event occurring the following day (not the ideal time to be playing with DNS on a production website).
The theory behind CloudFlare makes sense, right? They'll protect you from DDOS by getting everyone onto their network, so the network gets so big no one can take it down, and they have specialized equipment and techniques. Well, maybe that makes sense if you have a problem with DDOS, but if you don't, why join a network that is obviously being DDOSed every day? That doesn't make much sense to me. I assume they were being DDOSed because every time they went down, taking us with them, that's what they said on Twitter.
The worst part was response times. With them, individual assets were taking around 500ms to 800ms to load. Once we switched to another service provider, we were seeing around 20ms-30ms. And if it's not already obvious, dynamic pages served off Heroku are faster if they're not stuck behind CloudFlare. Our total cold page load time went from 5s-6s down to 2s with this switch.
Also, all the asset rewriting and page optimization magic is so silly IMHO. Just use a good framework like Rails with the Asset Pipeline and write good code and you won't have that problem. For a small site it shouldn't be much of a problem, and a big site should have competent programmers and adequate resources.
Also, the SSL cert they give you sucks. It will have a bunch of other companies' names on it, and aside from allowing you to roll out SSL very quickly and easily, it doesn't do much in the way of validating your identity.
I wish CloudFlare luck, and hopefully they will fix their issues. Until then, I'm staying away from them.
Carson, I'm in the exact same boat as you and it's served us really well. I'm using CloudFlare only to cache my static assets for a Node.js app and I've seen performance in Heroku (and NodeJitsu) go up as a result. I've also added some subdomains to CloudFlare so I'm sharding my static assets (cdn1.mysitecdn.com, cdn2.mysitecdn.com). Add in that all of the assets are Gzipped and SSL secured and I'm a very happy camper. My app on 3 dynos can handle 1000+ concurrent users/sec no problem.
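Sharding like this only keeps browser caches warm if each asset maps to the same hostname every time. A sketch of the deterministic mapping (hostnames taken from the comment above; any stable hash works):

```python
import hashlib

SHARDS = ["cdn1.mysitecdn.com", "cdn2.mysitecdn.com"]

def asset_host(path: str) -> str:
    """Map an asset path to a fixed shard so every page references the same
    hostname for it; random assignment would defeat the browser cache."""
    digest = hashlib.md5(path.encode("utf-8")).digest()
    return SHARDS[digest[0] % len(SHARDS)]
```

Picking the host at render time with `random.choice` would make the same image alternate between hostnames across pages, forcing redundant downloads; hashing the path avoids that.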
If you are going to make these tests, you need to write how you tested, what you tested with, how many times you tested and exactly what you are measuring...
There is basically no information there except some load times, which we are supposed to just take your word for. Even the load times are unclear about what is included in that time, and if they are aggregated values or not.
Also, don't paginate it over so many pages. You aren't even running any ads, so there's no point.
There are a ton of variables that affect website performance --- location, connection speed, browser used, etc. There is simply no possible way to get an accurate measurement of how a CDN will impact the performance of your site without using RUM (JavaScript instrumentation of the page to record the actual experience for every visitor). Every website has a unique audience with unique characteristics. We offer a free RUM tool at Torbit for anyone interested in getting an accurate measurement of how Cloudflare (or any CDN) is performing.
Check out how Wayfair used Torbit to measure how much of a difference Akamai was making to their site: http://torbit.com/blog/2012/07/23/wayfair-uses-real-user-mea...
If you're interested, I'd love to help you do a similar test for your site.
CDN's are not magic, neither are proxy services. Cloudflare is a proxy service which is able to do the following things for you:
Distribute load around the world, since if you're not a massively large site like Facebook you will only have one, maybe two points of presence. Protect you from DDoS attacks if someone decides they don't like you. And as long as you set proper caching rules for ALL your static content (keyword: ALL), there is a decent chance that your bandwidth usage will drop by half or more... the list goes on.
I can't comment on this particular test as the page is down (guess they probably should be using a proxy service or CDN...), but if it's anything like the previous report of CloudFlare slowness, it was light on science and heavy on personal opinion.
Edit: It appears that this page was using a shared hosting provider called Netfirms? Probably shouldn't do that and then post to HN front-page...
CloudFlare doesn't shine on single page loads. CloudFlare shines when your site hits Reddit's front page, you have 1000 people online at the same time, and your servers are already pumping 30 megabits per second of content to visitors.
Source: I hit Reddit's front page a couple of times a month. Before I used CloudFlare, my servers would die. Now, they idle leisurely at loads 1.0 to 2.0 when that happens.
Cheers, Rick
It's unclear if the tools used (Pingdom/Neustar) put cache-busting headers on their requests. It's likely that they do: such probes are more comparable/reproducible between runs, and more likely to give guidance to the app designer/web designer (as opposed to the cache operator).
If so, of course CloudFlare would be slower in such tests: its largest benefit, caching unchanged resources for subsequent reloads, has been disabled.
Also, it's hard to take performance hints from someone who splits such a short, simple blog post across 6 (!) pages. Six discretionary click-requiring page-loads is always worse than one, and is the easiest thing to fix if you're respecting my reading-time.
I have to wonder about the testing results that were obtained. Testing the impact of optimizations needs to be done carefully; you need to make sure you capture sufficient data to draw conclusions.
#1 -> You need to capture enough data samples for each location and browser
#2 -> You need to capture data from a set of global locations
#3 -> You need to capture data from the commonly used web browsers.
You can see a test run from a single location and browser using one sample: 2.873 seconds for the time to interact.
http://www.websitetest.com/ui/tests/50e89376479876092f000012
But when you run the test over 17 locations with 5 samples for each location: 6.4 seconds for the time to interact.
http://www.websitetest.com/ui/tests/50e893d7479876092f000016
There is a big difference between the one location and the multiple locations with 5 samples. Looking at just Washington with 5 samples, the time to interact is 4.1 seconds.
(Disclosure: I work at yottaa.com, the provider of websitetest.com.) For those looking to verify that optimizations are working (automated or hand-tuned), you should use websitetest.com to simplify the testing process. It makes running tests (multiple locations, multiple browsers, multiple connectivities) possible with one click, and the results make it easy to draw conclusions.
All test data for the information in this comment is available through these links:
Tests by browsers in Washington, DC: http://www.websitetest.com/ui/tests/50e89598479876124100000e
I can barely read this article, but from what I can see so far it's missing the point. This author is not even doing the basics like combining CSS and JS, much less minifying them, and the site cannot take a traffic spike.
This page requires 42 requests for me, many of which are to dh42.com ... it may not be a big difference when you throw 25 requests at it but if you're serving any real traffic (as this HN traffic spike is clearly demonstrating), you're just sabotaging yourself. On some sites I've consulted with, just turning on KeepAlive and combining resources where possible is enough to get them to the next level, without resorting to a CDN. The difference between 40 requests to your box per page and 5 is pretty significant with any real load.
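The arithmetic behind the 40-vs-5 point, as a back-of-envelope sketch (the numbers are illustrative only):

```python
def origin_requests_per_minute(pageviews, requests_per_page, cache_hit_ratio):
    """Requests that actually reach the origin server per minute."""
    return pageviews * requests_per_page * (1 - cache_hit_ratio)

# At 100 pageviews/minute with nothing cached, combining 40 requests per
# page down to 5 cuts origin load from 4000 to 500 requests per minute,
# before a CDN enters the picture at all.
before = origin_requests_per_minute(100, 40, 0.0)
after = origin_requests_per_minute(100, 5, 0.0)
```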
In addition to this, when I tried out CloudFlare their caching feature never worked for me. I also strongly disliked having to setup all my DNS in their system. I've switched to S3+CloudFront and have been very happy.
What's the point of this test? Yes, if your webserver's purpose in life is to serve the Wordpress default template 3 times total in its lifespan, neither Cloudflare nor any other such service will be useful. In fact, if anything, I'm a little unclear on what took .5 seconds at all.
I have neither connection to nor interest in Cloudflare, but I would intellectually be interested in a useful comparison test. This looks less like an interesting comparison test and more like somebody constructing a benchmark to say what they wanted it to say in advance.
Two annoying usability flaws with this website (for me):
1. The article is split over 6 pages, and each page is too small; 1-3 pages would be preferable. There are no next/previous page links, which would provide a larger area to click. There are no print or 'all pages' options. A keyboard shortcut to go to the next page would be nice.
2. The Olark chat box is positioned right over the main text, which makes it very distracting. There is no option to close it. It would be preferable though if I didn't have to close it, and it was in an area of the page where I could just ignore it. This is a problem with many websites these days, where the user has to find and click a close button in order to use the website.