
Fallback from CDN to local jQuery

130 points | shawndumas | 13 years ago | hanselman.com

77 comments

[+] romaniv|13 years ago|reply
I still find the idea of CDNs repugnant. No matter how you slice it, you rely on an external resource for important parts of your application. "What if it goes down?" is one question. But you should also be asking yourself what will happen if it gets hacked. There are also user privacy issues, which get completely overlooked in the chase to shave several milliseconds off request time.

A much better architecture would be to serve JavaScript from your server by default, but allow for distributed content-based caching. For example, your script tag could look like this:

<script src="some.js" hash="ha3ee938h8eh38a9h49ha094h" />

The hash would be calculated based on the content of the file. The browser then could fetch it from whatever source it wants. Users could cache stuff locally (across websites), without needing to dial into a CDN every time. You could even use a torrent-like network for distributed delivery of popular script libraries.

[+] olegp|13 years ago|reply
It's not just a few milliseconds though. For example at https://starthq.com, we are based in Finland, but host on Amazon in US East. A round trip to the US is 200ms+ whereas with CloudFront it's 8ms. Before we used a CDN our page took a few seconds to load - now it takes around 200ms.

I should also mention that all this happens only on first load. We embed etags in the URLs and use far-future cache expiry dates, so subsequent page loads get the JS and CSS from the browser's cache.

[+] baddox|13 years ago|reply
Isn't the concern of downtime and hacking only relevant if you have reason to believe that they are more likely to happen with the CDN than with your own servers?
[+] bigiain|13 years ago|reply
While I mostly agree with what you're saying... The truth is, why worry about Google getting hacked and having the attacker modify their CDN version of jQuery (or leaking user behaviour and identity), when I'm already letting them load unknown (to me) code in ga.js?

To me, there's a trade-off I've chosen, and while I'm not 100% comfortable with having the availability of my site depend on Google - "repugnant" is _way_ too strong a word to describe the downside to the pragmatic choice I've made.

[+] lmm|13 years ago|reply
Would mega's "load signed scripts" approach work here?
[+] gwgarry|13 years ago|reply
That's a great idea. The src should still be the CDN, but the browser should download the file into something like local storage and make it available in the future.
[+] wyuenho|13 years ago|reply
This only works if the CDN actually returns 4xx or 5xx codes. This still won't work if the CDN is getting DDOS'ed, as in taking forever to return anything.
[+] gavinpc|13 years ago|reply
Along the same lines, Chrome (maybe just webkit generally) does not fire DOMContentLoaded until external script requests have resolved or timed out, even if they are async.

Also, I don't understand why people feel so strongly about reposting this "document.write" method everywhere, which I found in some cases made my page disappear at load time. You can do the same thing using regular DOM methods, and you get more control over the process.
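The "regular DOM methods" approach mentioned above might look something like this (a sketch; the URLs and callback shape are illustrative):

```javascript
// Inject the CDN <script> element, and only if it errors inject the local
// copy. Unlike document.write, this gives you onload/onerror hooks and
// never wipes the page if it runs after parsing has finished.
function loadScriptWithFallback(primarySrc, fallbackSrc, done) {
  var s = document.createElement('script');
  s.async = true; // load without blocking parsing
  s.src = primarySrc;
  s.onload = function () { done(null, primarySrc); };
  s.onerror = function () {
    var f = document.createElement('script');
    f.src = fallbackSrc;
    f.onload = function () { done(null, fallbackSrc); };
    f.onerror = function () { done(new Error('both script sources failed')); };
    document.head.appendChild(f);
  };
  document.head.appendChild(s);
}
```

Note the caveat from wyuenho's comment still applies: onerror fires on failed requests, not on requests that merely hang.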

[+] zalew|13 years ago|reply
Came here to say this.

Besides, in bigger projects jQuery is only a small part of the whole JS stack. I've been using these jQuery CDNs for years; now I just pack it together with everything else, uglified. There are so many jQuery versions in use out there right now that I don't feel I gain anything by caring about the chance that this particular 30kb will be cached by some percentage of users.

[+] isalmon|13 years ago|reply
I personally decided to use a local file after all.

Pros:

+ I can cache it for a very long time, so all my returning visitors don't have to re-download it. I was very surprised to see that the CDN's jQuery had a very short 'Expires' header

+ If my server is up and users can open a web page - there's a very high chance that the .js file will load as well.

+ I can combine different jquery libraries/plugins into one file, so my page can load MUCH faster

Cons:

- It might load a little more slowly, because it's not on CDN.

Am I missing something?

[+] martin-adams|13 years ago|reply
You may also have to pay the bandwidth cost.
[+] MatthewPhillips|13 years ago|reply
It saddens me that ES6 modules don't have fallbacks built in. You do:

    import 'http://developer.yahoo.com/modules/yui3.js' as YUI;
I wish it were:

    import ['http://developer.yahoo.com/modules/yui3.js', '/libs/yui3'] as YUI;
[+] SoftwareMaven|13 years ago|reply
I think the notion of "fallbacks" (and the sibling comment's "timeouts") is extremely specific to the web. However, there is some prior art on this subject in the Python world:

  try:
      import simplejson as json
  except ImportError:
      import json
So it might be worth contacting the committee and expressing that.
[+] numbsafari|13 years ago|reply
Based on comments above, it might also be helpful to configure a timeout of some type. So if, for example, the CDN is suffering from a DDOS or other latency-inducing issue, you can force a fallback to your own server.
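A timeout of this kind could be sketched with a promise race (illustrative only; `loadFrom` stands in for whatever actually injects the script tag and resolves when it loads):

```javascript
// Race the CDN load against a timer. If the CDN is merely slow rather
// than down, the timer wins and we fall back to the local copy.
function loadWithTimeout(loadFrom, primary, fallback, ms) {
  const timeout = new Promise(function (_, reject) {
    setTimeout(function () { reject(new Error('CDN timed out')); }, ms);
  });
  return Promise.race([loadFrom(primary), timeout])
    .catch(function () { return loadFrom(fallback); });
}
```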
[+] TazeTSchnitzel|13 years ago|reply
Send the committee an email suggesting this improvement. They might not yet have considered it.
[+] mrharrison|13 years ago|reply
Take a look at HTML5 Boilerplate (http://html5boilerplate.com/). It has redundancy built in, so if the CDN fails, it loads your local copy.
[+] gertef|13 years ago|reply
Is html5boilerplate a "competitor" to Bootstrap?

Is there a good compendium of modern popular toolkits?

[+] mrharrison|13 years ago|reply
Also, you shouldn't be serving your content from a cdn anyway. All of your js files should be compressed into one file, browser cached, and gzipped.
[+] 3825|13 years ago|reply
Will we see (a few very) popular frameworks like jQuery built into web browsers with the server just declaring what version to use? (I have a feeling that, although I have good intentions, this is a bad idea.) Thoughts?
[+] emehrkay|13 years ago|reply
I hope not. Especially at the pace that these things change (just about every jQuery release is followed up by a point update a day later)
[+] ChrisLTD|13 years ago|reply
I've wanted this for a while, although I suppose that aggressive caching achieves roughly the same result.
[+] gwgarry|13 years ago|reply
I personally think it would be very useful to build a list of common frameworks like that, store them in the browser, and let developers include a framework by something like its MD5 hash.

Like 4ea5b90bdb6f54a9b050cfd8dd19083d.js

[+] jrochkind1|13 years ago|reply
The document.write method makes it impossible to do async script loading, which you could ordinarily do here to improve perceived page load time. No?

I mean, for instance, you couldn't load that FIRST CDN jQuery as async, because you need the browser to block on it so your NEXT script tag (which also can't be loaded async, naturally, because it has a document.write in it) can check whether it was loaded.

[+] xkcdfanboy|13 years ago|reply
Yes, that first method is hideous. Async is a necessary speedup and the `if (jQuery) ` slaughters that optimization.
[+] esalman|13 years ago|reply
This is built into Bootstrap.
[+] jrochkind1|13 years ago|reply
What are you talking about? How? What? Are you sure you mean 'bootstrap'?

Maybe it's just some feature of bootstrap I don't know about, but I'm not even sure what to go looking for because I'm not sure what you're suggesting is built into bootstrap. Built into the Javascript parts of Bootstrap somehow? What is, exactly?

[+] feralmoan|13 years ago|reply
How/Where? I've had exactly this problem and have had to shim (requirejs) bootstrap with jquery cdn + fallback paths... ? It certainly doesn't resolve its own dependencies under AMD
[+] kmfrk|13 years ago|reply
The main reason you should do this is not so much the CDN fall-back, but to save users from re-downloading files they have already retrieved on other sites.

Also remember to always use the https URL for the assets, whenever possible.

[+] imjared|13 years ago|reply
Is there a reason to use https over a protocol-relative url? My go-to is <script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js"></script>
[+] dmbass|13 years ago|reply
Isn't using already cached assets one of the reasons to use a CDN for popular scripts?
[+] gwgarry|13 years ago|reply
I have always thought that all the stuff in common CDNs should be available in local storage by default. Common stuff like jQuery and the like. Firefox should have this as a feature where it downloads those scripts once and stores them in local storage. That way you're not leaking privacy to Google et al. everywhere you go.
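The scheme described above could be sketched like this (illustrative only; `fetchSource` and the key format are assumptions, and real localStorage is scoped per origin, so sharing across sites would need explicit browser support):

```javascript
// Keep a library's source in client-side storage, keyed by a content
// hash, and hit the network only on a cache miss.
function getLibrary(hash, fetchSource, storage) {
  const key = 'lib:' + hash;
  const cached = storage.getItem(key);
  if (cached !== null) return Promise.resolve(cached);
  return fetchSource().then(function (src) {
    storage.setItem(key, src);
    return src;
  });
}
```

Keying by content hash rather than URL also sidesteps the privacy leak: the browser never has to tell the CDN which site asked for the file.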
[+] addandsubtract|13 years ago|reply
That would be ideal, but currently local storage doesn't work across domains.