Pete Freitag

Quick Google CDN jQuery Tip

Published on January 13, 2011
By Pete Freitag
web

Here's a quick tip if you're using Google's Content Delivery Network (CDN) to serve jQuery (or other scripts).

If you want jQuery 1.4.4, you can use a URL like this:

http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js

Now suppose you just want the latest 1.4.x release. You can use this (caution, keep reading):

http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js

And if you just want the latest 1.x.x release, you can use this (caution, keep reading):

http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js

You might think that using a less specific version number (like 1.4 or just 1) would result in a higher cache hit ratio for your visitors. One drawback is that when a new version comes out, your users will start using it before you've had a chance to do any testing.
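In a page, the two approaches look like this (a sketch; only the version segment of the URL differs):

```html
<!-- Pinned to an exact release: predictable behavior, long cache lifetime -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>

<!-- Version shortcut: always the latest 1.4.x, but see the caching caveat below -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
```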

Update: However, if you take a look at the comments, one of my readers points out that there is a big difference in the Expires HTTP header when using the full version vs. a version shortcut. Here's what the HTTP headers look like on a request for 1.4.4:

Date: Thu, 13 Jan 2011 16:41:02 GMT
Expires: Fri, 13 Jan 2012 16:41:02 GMT
Cache-Control: public, max-age=31536000

Now here's what they look like when you request just 1.4:

Date: Fri, 14 Jan 2011 16:45:21 GMT
Expires: Fri, 14 Jan 2011 15:45:21 GMT
Cache-Control: public, must-revalidate, proxy-revalidate, max-age=3600

When requesting 1.4.4 we get a file with an expiration date of next year, and the Cache-Control with a max-age of one year.

Now when you look at the request for 1.4, you will notice that the Expires date is prior to the Date header, and the max-age specified in the Cache-Control header is only one hour.
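You can see the practical effect by computing the freshness lifetime from the headers quoted above. A minimal JavaScript sketch (the `freshnessLifetime` helper is my own, for illustration):

```javascript
// Compute the effective freshness lifetime, in seconds, from response headers.
// Per HTTP caching rules, Cache-Control: max-age takes precedence over Expires
// when both are present.
function freshnessLifetime(headers) {
  const match = /max-age=(\d+)/.exec(headers["Cache-Control"] || "");
  if (match) return parseInt(match[1], 10);
  // Fall back to Expires minus Date; an Expires in the past yields <= 0.
  return (Date.parse(headers["Expires"]) - Date.parse(headers["Date"])) / 1000;
}

// Headers quoted above for the fully versioned 1.4.4 URL:
const full = {
  "Date": "Thu, 13 Jan 2011 16:41:02 GMT",
  "Expires": "Fri, 13 Jan 2012 16:41:02 GMT",
  "Cache-Control": "public, max-age=31536000"
};

// Headers quoted above for the 1.4 shortcut URL:
const shortcut = {
  "Date": "Fri, 14 Jan 2011 16:45:21 GMT",
  "Expires": "Fri, 14 Jan 2011 15:45:21 GMT",
  "Cache-Control": "public, must-revalidate, proxy-revalidate, max-age=3600"
};

console.log(freshnessLifetime(full));     // 31536000 seconds = one year
console.log(freshnessLifetime(shortcut)); // 3600 seconds = one hour
```

So a browser can keep the pinned 1.4.4 file for a year, but must recheck the shortcut URL every hour.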

What are the pros and cons of using Google's CDN to serve up jQuery?

  • Pros
    • Google's CDN has very low latency; it can serve a resource faster than your web server can.
    • There is a good chance that the user already has a cached copy of jQuery from Google's CDN, so they won't need to download it again.
    • Chances are good that the DNS lookup for ajax.googleapis.com is cached by the user as well.
    • Because the script is on a separate domain, modern browsers will download the script in parallel with scripts on your domain.
    • Google's CDN also works over HTTPS, so you can offload the SSL processing to Google's servers.
  • Cons
    • Security - if someone hacks Google's CDN, they can run JavaScript on your site.
    • An additional DNS request may be required, but I'm betting the pros make this negligible in most cases.
    • It may be slower if your site is internal (an intranet, for example).

Do you have any other pros or cons to using Google's CDN for common JavaScript files?




Comments

Parallel script loading is nice... unless your website uses scripts that depend on certain core libraries (e.g., jQuery) being loaded first. If you are using Google's CDN, you may want to consider using LABjs, as it will allow you to asynchronously load several resources from your own domain and external domains without blocking, while retaining chained dependencies.
http://LABjs.com/
[NOTE: The LABjs website appears to be down right now and you may need to use a search for more info.]
by James Moberg on 01/13/2011 at 2:30:15 PM UTC
The biggest Con in my eyes is that it creates another point of failure for your site. If there is a problem with the CDN, it affects your site.

In theory, this should be very rare, but I have had problems serving the Dojo Toolkit from the Google CDN. In fairness, if you don't create a custom build of Dojo you can end up having to download a huge number of files. In my case it seemed like any latency on the CDN caused my site to slow to a crawl. This problem would probably not be that noticeable if you are only including the single jQuery file, but it's worth noting that latency problems can happen on the CDN.
by David Hammond on 01/13/2011 at 2:39:59 PM UTC
I did not realize that you could leave out the specific versions like you mentioned to get the latest. Interesting tip, thanks for sharing it.
by John Sieber on 01/13/2011 at 5:15:10 PM UTC
The various Page Speed tools (Yslow etc) will ping a warning to say you're not using long cache expiry dates if you don't opt for the very specific versions.

@David - you can always fall back to a local copy of jQuery if the CDN is down. Have a look at the HTML5 Boilerplate code (http://html5boilerplate.com/) for example.
by Geoff on 01/13/2011 at 5:45:03 PM UTC
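The local fallback Geoff describes is commonly written like this (a sketch of the HTML5 Boilerplate pattern; the local path is hypothetical):

```html
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
<script>
  // If the CDN copy failed to load, window.jQuery is undefined,
  // so write a script tag pointing at a copy hosted on your own server.
  window.jQuery || document.write('<script src="/js/jquery-1.4.4.min.js"><\/script>');
</script>
```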
Head JS (headjs.com) is a quite useful script for loading scripts in parallel while executing them in order. It helped reduce load time even when some of the scripts were still loaded from Google's CDN (just using Head JS as the loader). There are quite a few other features in it too, so I certainly recommend taking a look at it.
by Daniel Schildt on 01/13/2011 at 6:18:29 PM UTC
@John:

I'd highly recommend *always* pointing to the specific version of jQuery you've tested your site against.

There are just too many changes between jQuery point releases that could impact your code in a way that's difficult to track down. There's nothing worse than users reporting your site's not working when you know you haven't made any changes, only to eventually track down the issue to a change in some externally served file you can't control. While this could happen if the CDN is down as well, at least then you'll see obvious JS errors instead of the more subtle problems that can occur when behavior changes.

I've been using jQuery long enough (since around jQuery 1.1) and seen enough changes in point releases that forced me to update code to realize it's a bad idea to automatically point to the latest version of jQuery in a production environment.

The only time I would think about using this technique is for a development version of a jQuery plug-in or for a test environment, where you just want to make sure your code works w/the latest stable version.
by Dan G. Switzer, II on 01/14/2011 at 9:55:36 AM UTC
@Geoff, that's an interesting technique. I think that would work if the CDN is unavailable but wouldn't help if it is just slow. So it looks like a great idea if you're going to use a CDN, but wouldn't have solved my dojo problem.
by David Hammond on 01/14/2011 at 10:17:03 AM UTC
Thanks for all the various comments on this, great feedback - I'll update the entry to direct the reader to the comments, and also update my recommendation: based on this, you should use the full version number.

@Geoff, I took a look at the expires headers on the short version number files and they have an expiration date prior to the current date!

@Dan - That is a good point too. In my experience I haven't had much, if any, issue between minor versions, but I've only been using it since 1.3.
by Pete Freitag on 01/14/2011 at 12:04:43 PM UTC
Browsers such as Chrome or Firefox should ship with precached versions of all the most popular JS libraries! All websites use jQuery or MooTools or Prototype, so it would be a big, big improvement!
by eBuildy on 01/26/2011 at 6:05:11 AM UTC
Although Google's servers are very fast (and in many cases your user may already have it loaded), we took the approach of generating a single, combined and minimized JS file for our site. We take jQuery, Google Analytics, AddThis and Mr. Switzer's qForms and bundle it into a core.js file that we dynamically name and push to our S3 bucket. A single, minimized and gzipped asset will probably trump multiple connections even if you have a fast link to somewhere like Google.
by Brian G on 02/05/2011 at 6:28:41 PM UTC
@Brian G

Are you sure you are serving your single JS file gzipped on S3? gzip speeds up page load times significantly, but S3 doesn't have standard support for gzip, and trying to force it on all browsers can be problematic.

Instead, I would suggest checking out Google's new Google Storage for Developers service. It is feature-compatible with S3, but unlike S3 it automatically gzips CSS/JS when possible and even sets correct Content-Type headers. I've even found load times to be better than S3's.

I'd also recommend Google's Closure Compiler service for minifying and combining your source; it has the best compression rate and can even optimize the code for faster performance.

Cheers
by Jordan on 02/17/2011 at 3:18:34 AM UTC
About the Cons, point 1: I think I'll have bigger things to worry about than some malicious script running on my site if Google's servers get hacked :)
by Doobyrocks on 07/12/2011 at 2:32:58 AM UTC