Quick Google CDN jQuery Tip


Here's a quick tip if you're using Google's Content Delivery Network (CDN) to serve jQuery (or other scripts).

If you want jQuery 1.4.4, you can use a URL like this:

http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js

If you just want the latest 1.4.x release, you can use this (caution: keep reading):

http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js

And if you just want the latest 1.x.x release, you can use this (caution: keep reading):

http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js
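In a page, you'd reference these from standard script tags. A sketch showing all three forms (you'd normally pick just one; the comments are mine):

```html
<!-- Fully versioned URL: long cache lifetime, behavior locked to 1.4.4 -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>

<!-- Version shortcuts: always the latest matching release (see the caveats below) -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
```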

I would think that using a less specific version number (like 1.4, or just 1) would result in a higher cache hit ratio for your visitors. One drawback is that when a new version comes out, your users will start using it before you've had a chance to do any testing.

Update: As one of my readers points out in the comments, there is a big difference in the Expires HTTP header when you use the full version number versus a version shortcut. Here's what the HTTP headers look like on a request for 1.4.4:

Date: Thu, 13 Jan 2011 16:41:02 GMT
Expires: Fri, 13 Jan 2012 16:41:02 GMT
Cache-Control: public, max-age=31536000

Now here's what they look like when you request just 1.4:

Date: Fri, 14 Jan 2011 16:45:21 GMT
Expires: Fri, 14 Jan 2011 15:45:21 GMT
Cache-Control: public, must-revalidate, proxy-revalidate, max-age=3600

When requesting 1.4.4, we get a file with an expiration date a year out, and a Cache-Control max-age of one year.

When you look at the request for 1.4, you will notice that the Expires date is prior to the Date header, and the max-age in the Cache-Control header is only one hour.
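The practical difference is easiest to see by pulling the max-age out of those two Cache-Control values. A minimal sketch in plain JavaScript (the maxAge helper is hypothetical, not part of any library):

```javascript
// Extract the max-age value (in seconds) from a Cache-Control header string.
// Returns null when no max-age directive is present.
function maxAge(cacheControl) {
  var m = /max-age=(\d+)/.exec(cacheControl);
  return m ? parseInt(m[1], 10) : null;
}

// Headers from the fully versioned 1.4.4 URL: cacheable for a full year.
maxAge("public, max-age=31536000"); // 31536000 seconds = 365 days

// Headers from the 1.4 shortcut URL: must be revalidated after one hour.
maxAge("public, must-revalidate, proxy-revalidate, max-age=3600"); // 3600 seconds = 1 hour
```

So with the shortcut URLs, every visitor's browser re-checks the file hourly, largely defeating the point of the shared cache.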

What are the Pros and Cons of using Google's CDN to serve up jQuery?

  • Pros
    • Google's CDN has very low latency and can likely serve a resource faster than your web server can.
    • There is a good chance the user already has a cached copy of jQuery from Google's CDN, so they won't need to download it again.
    • Chances are good that the DNS lookup for ajax.googleapis.com is cached by the user as well.
    • Because the script is on a separate domain, modern browsers will download it in parallel with scripts on your domain.
    • Google's CDN also works with HTTPS, so you can offload the SSL processing to Google's servers.
  • Cons
    • Security: if someone compromises Google's CDN, they can run JavaScript on your site.
    • An additional DNS lookup may be required, though the Pros likely outweigh this in most cases.
    • It may be slower if your site is internal (an intranet, for example).

Do you have any other Pros or Cons to using Google's CDN for common JavaScript files?
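One common mitigation for the single-point-of-failure Con is to fall back to a self-hosted copy when the CDN copy fails to load. A sketch of that pattern (the local path is a hypothetical example):

```javascript
// Build the fallback <script> tag for a self-hosted jQuery copy.
// (The "/js/jquery-1.4.4.min.js" path is a hypothetical example.)
function fallbackTag(localSrc) {
  // "<\/script>" avoids prematurely closing the surrounding script block.
  return '<script src="' + localSrc + '"><\/script>';
}

// In the page, placed right after the CDN <script> tag:
// if (!window.jQuery) { document.write(fallbackTag("/js/jquery-1.4.4.min.js")); }
```

The check runs synchronously after the CDN tag, so if jQuery failed to load, the local copy is requested before any dependent scripts execute.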




Comments

On 01/13/2011 at 4:30:15 PM EST James Moberg wrote:
Parallel script loading is nice... unless your website uses scripts that depend on certain core libraries (i.e., jQuery) being loaded first. If you are using Google's CDN, you may want to consider LABjs, as it allows you to asynchronously load several resources from your own domain and external domains without blocking, while retaining chained dependencies. http://LABjs.com/ [NOTE: The LABjs website appears to be down right now and you may need to use a search for more info.]

On 01/13/2011 at 4:39:59 PM EST David Hammond wrote:
The biggest Con in my eyes is that it creates another point of failure for your site. If there is a problem with the CDN, it affects your site.

In theory, this should be very rare, but I have had problems serving the Dojo Toolkit from the Google CDN. In fairness, if you don't create a custom build of Dojo you can end up having to download a huge number of files. In my case it seemed like any latency on the CDN caused my site to slow to a crawl. This problem would probably not be that noticeable if you are only including the single jQuery file, but it's worth noting that latency problems can happen on the CDN.

On 01/13/2011 at 7:15:10 PM EST John Sieber wrote:
I did not realize that you could leave out the specific versions like you mentioned to get the latest. Interesting tip, thanks for sharing it.

On 01/13/2011 at 7:45:03 PM EST Geoff wrote:
The various page speed tools (YSlow etc.) will ping a warning to say you're not using long cache expiry dates if you don't opt for the very specific versions.

@David - you can always fall back to a local copy of jQuery if the CDN is down. Have a look at the HTML5 Boilerplate code (http://html5boilerplate.com/) for example.

On 01/13/2011 at 8:18:29 PM EST Daniel Schildt wrote:
Head JS (headjs.com) is a quite useful script for loading scripts in parallel but executing them in order. It helped reduce load time even when some of the scripts were still loaded from Google's CDN (using Head JS just as the loader). It has quite a few other features too, so I definitely recommend taking a look at it.

On 01/14/2011 at 11:55:36 AM EST Dan G. Switzer, II wrote:
@John:

I'd highly recommend *always* pointing to the specific version of jQuery you've tested your site against.

There are just too many changes between jQuery point releases that could impact your code in a way that's difficult to track down. There's nothing worse than users reporting your site's not working when you know you haven't made any changes, only to eventually track the issue down to a change in some externally served file you can't control. While this could happen if the CDN is down as well, at least then you'll see obvious JS errors instead of the more subtle issues that can occur when behavior changes.

I've been using jQuery long enough (since around jQuery 1.1) to have seen enough changes in point releases that required code updates to realize it's a bad idea to automatically point to the latest version of jQuery in a production environment.

The only time I would consider this technique is for a development version of a jQuery plug-in, or for a test environment where you just want to make sure your code works with the latest stable version.

On 01/14/2011 at 12:17:03 PM EST David Hammond wrote:
@Geoff, that's an interesting technique. I think it would work if the CDN is unavailable, but it wouldn't help if the CDN is just slow. So it looks like a great idea if you're going to use a CDN, but it wouldn't have solved my Dojo problem.

On 01/14/2011 at 2:04:43 PM EST Pete Freitag wrote:
Thanks for all the comments on this - great feedback. I'll update the entry to direct readers to the comments, and based on this I'll also update my recommendation to use the full version number.

@Geoff, I took a look at the Expires headers on the short-version-number files, and they have an expiration date prior to the current date!

@Dan - That's a good point too. In my experience I haven't had many (if any) issues between minor versions, but I've only been using it since 1.3.

On 01/16/2011 at 12:47:38 PM EST confinedspace wrote:
Con: when Google went down in the UK, it took a massive number of sites offline. It even took jquery.com down, since they were using Google's CDN too.

But Google going down is extremely rare.

We always use Google's CDN, but we now have something in our config files to switch between cloud and local files. If it happened again, all we'd need to do is change a flag in the config to switch over (not just jQuery, but jQuery UI, all the associated CSS files, etc.).

As with anything, just make sure you have a backup plan.

On 01/26/2011 at 8:05:11 AM EST eBuildy wrote:
Browsers such as Chrome or Firefox should ship with a precached version of the most famous JS libraries! So many sites use jQuery, MooTools, or Prototype that it would be a big, big improvement!

On 02/05/2011 at 8:28:41 PM EST Brian G wrote:
Although Google's servers are very fast (and in many cases your user may already have it loaded), we took the approach of generating a single, combined and minified JS file for our site. We take jQuery, Google Analytics, AddThis, and Mr. Switzer's qForms and bundle them into a core.js file that we dynamically name and push to our S3 bucket. A single minified and gzipped asset will probably trump multiple connections, even if you have a fast link to somewhere like Google.

On 02/17/2011 at 5:18:34 AM EST Jordan wrote:
@Brian G

Are you sure you are serving your single JS file gzipped on S3? gzip speeds up page load times significantly, but S3 doesn't have standard support for gzip, and trying to force it on all browsers can be problematic.

Instead I would suggest checking out Google's new Google Storage for Developers service. It is feature-compatible with S3, but unlike S3 it automatically gzips CSS/JS when possible and even sets correct Content-Type headers. I've even found load times to be better than S3.

I'd also recommend Google's Closure Compiler service for minifying and combining your source; it has the best compression rate and can even optimize the code for faster performance.

Cheers

On 07/12/2011 at 4:32:58 AM EDT Doobyrocks wrote:
About the first Con: I think I'll have bigger things to worry about than some malicious script running on my site if Google's servers get hacked :)

