Optimising Your Pages With YSlow And PageSpeed

09:13 on Mon, 27 Sep 2010 | Development | 3 Comments

There’s simply no debate when it comes to the speed and efficiency of web pages: faster is better. Almost all of us are guilty of relying on the increasing speed of broadband and the increasing power of computers to take up the slack in our coding practices. Computers and Internet connections have been more than happy to rise to that challenge, but the ever-evolving practice of Search Engine Optimisation has taken an interest in the performance of our pages. In fact, the speed at which your pages load is now a ranking factor within search engine algorithms.

Google like fast pages. They say so themselves, and they even provide a tool in the form of Google PageSpeed, a Firefox plugin, to guide us through the various optimisations and rate our websites for speed. Yahoo do the same with their YSlow plugin for Firefox. These utilities afford us a reasonably clear picture of the optimisations both of these search outlets expect, and break them down into logical steps which can be followed to improve overall page speed.

From here on in, things will get a little technical. If this isn’t your cup of tea then you may wish to stop reading now and forward this on to your technical staff or web-build agency.

In this article I will run through the quick and easy changes you can make to increase speed, and how to make them in IIS7. There is rarely a single change that will dramatically increase a score on its own, but a number of small changes soon add up to a greater whole.

Download Parallelisation

Most of us have reasonably decent Internet connections these days, and surf at speeds the people who created HTTP could never have imagined. According to the HTTP/1.1 specification, a browser should allow a maximum of two connections to any one hostname, letting it download only two things simultaneously. Modern browsers often break this specified limit, but we can’t rely on that always being true. Irrespective of the actual limit, any resources over and above it will be queued and downloaded only once the others have completed.

Serving images and scripts from sub-domains artificially boosts the number of hostnames from which resources are being served, bringing us up to a theoretical total of 6 simultaneous transfers and, thus, a potentially shorter overall load time.

There are two methods of setting up these sub-domains, depending on your setup and any additional optimisations you may want to perform:

  1. The easy way – Simply add scripts.yourdomain.com and media.yourdomain.com to the bindings of your site in IIS7, setting these up as necessary with your domain provider. You can then access the entire structure of your site under these additional domains, and can simply rewrite any reference to /images/yourimage.jpg as media.yourdomain.com/images/yourimage.jpg.

    You may already see a problem with this technique. You’ve created multiple references to the same site and resources, breaking what might otherwise be carefully set up canonical URLs.

    You can rewrite any request outside of images/ to the original domain, but this isn’t the most elegant setup.

  2. The right way – Create two new sites within IIS7 that point at the subfolders containing your images and scripts. There are multiple benefits to this method. First and foremost, your resources will sit right at the root of each domain and your web pages will not, so yourdomain.com/yourpage.html will not be accessible through scripts.yourdomain.com/yourpage.html.

    Similarly, yourdomain.com/scripts/yourscript.js would become scripts.yourdomain.com/yourscript.js. Logical and tidy.

    Having discrete sites for these resources also allows you to experiment with further optimisations, such as disabling any scripting support, adding 48-hour-plus cache expiry times and enabling compression. A command-line sketch of creating these sites follows this list.
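
If you’d rather script this than click through IIS Manager, IIS7’s appcmd utility can create the two sites for you. This is a minimal sketch only; the site names and physical paths are illustrative and will need adjusting to your own layout:

    rem Create a dedicated site for each resource subdomain (paths are illustrative)
    %windir%\system32\inetsrv\appcmd add site /name:"media.yourdomain.com" ^
        /bindings:http/*:80:media.yourdomain.com /physicalPath:"C:\inetpub\yoursite\media"
    %windir%\system32\inetsrv\appcmd add site /name:"scripts.yourdomain.com" ^
        /bindings:http/*:80:scripts.yourdomain.com /physicalPath:"C:\inetpub\yoursite\scripts"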

This leads me on to a clever trick. Assuming the images for your site live within /media/images and the CSS lives within /media/css, with relative URLs (e.g. ../images/yourimage.gif), you can rely on the relativity of those CSS URLs and parallelise all of the images referenced in your CSS simply by changing your stylesheet link from /media/css/yourcss.css to media.yourdomain.com/css/yourcss.css.

This means that when yourcss.css looks for ../images/yourimage.gif it ends up finding it at media.yourdomain.com/images/yourimage.gif, so you don’t have to fiddle with individual URLs for all the non-layout images on your site and should, hopefully, not have to modify your CSS at all.
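
To make the mechanics concrete, here is a minimal sketch (the file names are illustrative). The stylesheet itself is untouched; only the link to it changes:

    /* /media/css/yourcss.css -- unchanged; its relative URL resolves
       against whichever host served the stylesheet */
    body {
        background: url(../images/yourimage.gif);
    }

    <!-- Before: stylesheet (and therefore its images) served from the main domain -->
    <link rel="stylesheet" type="text/css" href="/media/css/yourcss.css" />

    <!-- After: served from the media subdomain, so every ../images/ reference
         inside it now loads from media.yourdomain.com as well -->
    <link rel="stylesheet" type="text/css" href="http://media.yourdomain.com/css/yourcss.css" />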

Minify And Combine Your Resources

Another way to achieve more simultaneous downloads is to combine the contents of multiple compatible files into a single file. This typically applies to JavaScript, but can apply to CSS and even images, although I’m not a big fan of the latter unless it makes logical sense to combine those images: the multiple states of a button graphic, for example, can be combined into one image and switched between using the background-position property for quicker load times.
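
As a sketch of that button example (the class names and pixel offsets here are invented), both states ship in a single image and a single HTTP request:

    /* buttons.png stacks the normal and hover states vertically */
    .button {
        width: 120px;
        height: 40px;
        background: url(/images/buttons.png) no-repeat 0 0;
    }
    .button:hover {
        /* slide the background up to reveal the second state */
        background-position: 0 -40px;
    }
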
Similarly, reducing the size of resources also helps matters, and you’ll be surprised at Google PageSpeed’s suggestions in this respect. For images it will often offer a PNG that looks identical to the one you are using, but with much better compression (I've seen reductions from 54kb to 4kb!). These can be saved to your computer, uploaded and included with nothing more than a simple change of extension (from .jpg or .gif to .png) in your CSS.

JavaScript and CSS can be “compressed” using a process called minification. When you minify JavaScript, all comments and white space are removed, and variable and function names can even be shortened. Minifying CSS will typically remove white space and comments. Both Google PageSpeed and Yahoo YSlow include tools to provide you with these minified files. But beware: minified JavaScript is much more difficult to work with, so you should copy your /js directory to /js-src or similar, so that you always have the original files to edit and re-minify.
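
To give a feel for what minification does, here is an illustrative before and after; this is hand-written for the example rather than the output of any particular tool:

    // Before: readable source, kept safe in /js-src for future editing
    function calculateTotal(prices) {
        var total = 0; // running sum of all prices
        for (var i = 0; i < prices.length; i++) {
            total += prices[i];
        }
        return total;
    }

    // After: comments and white space stripped, local names shortened
    function calculateTotal(a){var b=0;for(var c=0;c<a.length;c++){b+=a[c];}return b;}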

Combining JavaScript can sometimes be tricky, but it is often literally just a case of pasting everything into one single text file. Usually these disparate classes and functions will play nicely together and result in a single, sleek file.

If you use jQuery, or any of its common UI extensions, you should be including these scripts from Google’s servers. For example, the URL http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js references the minified version of jQuery 1.4. This increases your download parallelisation, and also means you’re serving users a static file from a “Content Delivery Network”, which they may well already have cached and ready to use. YSlow likes to advocate the use of Content Delivery Networks, but will penalise you unnecessarily when they are not used for every resource on your site.
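
Including it is a one-line change; the URL below is the one mentioned above:

    <!-- Served from Google's CDN: another parallel download, and quite
         possibly already sitting in the visitor's cache -->
    <script type="text/javascript"
            src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>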

Compression And Caching

If you set up a couple of sites for media.yourdomain.com and scripts.yourdomain.com, you’ll be able to add caching and compression to them. Static compression is generally turned on by default in IIS7; dynamic compression is not. Whilst this may not apply to your static resources, there are often stylesheets served by handlers (WebResource.axd) that are dynamic and will not normally be compressed.

To enable dynamic compression you first need to install it; there’s an easy-to-follow guide for installing it at IIS.net. Use dynamic compression sparingly, however, as it adds load to your server and should definitely not be used if you’re already running with a high CPU load. When it is used, it will speed up the delivery of your actual pages, and strike another point from Google PageSpeed’s list of suggestions.
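
Once the module is installed, it can be switched on per site in web.config. A minimal sketch, assuming the standard IIS7 schema:

    <configuration>
      <system.webServer>
        <!-- Static compression is usually on already; this adds dynamic -->
        <urlCompression doStaticCompression="true" doDynamicCompression="true" />
      </system.webServer>
    </configuration>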

Caching is a different beast. Google PageSpeed and Yahoo YSlow will look for at least two days of cacheability on your images and scripts. You can set this up by selecting your scripts and/or media sites and clicking “HTTP Response Headers” under the “IIS” heading. On the right-hand side you will find a link entitled “Set Common Headers…”; click this and you’ll see a dialogue where you can check “Expire Web Content” and enter a number of days.
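
The same setting can live in the resource site’s web.config if you prefer configuration files over dialogues. A sketch, assuming IIS7’s staticContent schema; the max-age format is d.hh:mm:ss:

    <configuration>
      <system.webServer>
        <staticContent>
          <!-- Two days, matching what PageSpeed and YSlow look for -->
          <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="2.00:00:00" />
        </staticContent>
      </system.webServer>
    </configuration>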

Be careful with caching, however, as it can be a real gotcha. Set it up after you’ve finished merging and minifying your JavaScript, and be aware that when you make future changes to the site, users will not receive the latest version of your scripts or CSS unless you force them to.
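
One common way to force the latest version on users is a version marker in the query string. The ?v= parameter below is just a convention for this sketch, not anything IIS itself understands; bumping it makes browsers treat the script as a brand new resource:

    <!-- Increment v whenever the combined script changes -->
    <script type="text/javascript"
            src="http://scripts.yourdomain.com/combined.min.js?v=2"></script>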

Google Analytics Thwarts "Serve static content from a cookieless domain"

This is a niggling but easy-to-solve point that will haunt you even after you've created subdomains for your images and scripts. Google Analytics is typically configured to set cookies for all subdomains of a website, but you will normally only use www. for any trackable content. Requests to yoursite.com should be redirected (using a 301) to www.yoursite.com to canonicalise on one domain, and similarly you should use pageTracker._setDomainName("www.yoursite.com") to prevent Google Analytics setting cookies for your static content subdomains.
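
In the classic ga.js snippet, that call sits immediately after the tracker is created. A sketch with a placeholder account ID:

    var pageTracker = _gat._getTracker("UA-XXXXXXX-1"); // your own account ID here
    // Scope the cookie to www. only, keeping media. and scripts. cookie-free
    pageTracker._setDomainName("www.yoursite.com");
    pageTracker._trackPageview();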

Getting Sane Results From Google PageSpeed

PageSpeed is odd: it doesn’t always seem to check the resources on your server for the conditions it desires, and will often falsely report a lack of caching or compression. To get a more accurate rating from PageSpeed on these points you need to use the developer toolbar to “Disable Cache,” clear your cache, or refresh your heart out. Beware “Disable Cache,” however, as it can omit some otherwise important optimisations from your results, such as “Optimise Images”. Make sure you address those problems first.

Getting Sane Results From YSlow

YSlow includes a handful of rulesets, which perform different tests depending on the nature of your site and which speed optimisations are relevant. Running the stock YSlow(V2) ruleset is not advised, as it can lead to unnecessary penalisation regarding both the use of Content Delivery Networks (or lack thereof) and the use of Entity Tags, which you almost certainly won’t need to worry about on any single-server, average website.

To create a new ruleset, simply hit the Edit button and uncheck “Use a Content Delivery Network (CDN)” and “Configure entity tags (ETags)”. This can inflate your rating by a whole grade, and should help you focus on the things that really matter and let you know when to stop. Simply put, a “C” rating earned only because you’re being checked on those two points will tempt you to keep tinkering elsewhere; don’t worry. Turn them off and get the “B” you deserve. It’s not cheating, honest!

Thanks for reading, and I sincerely hope you've learned something. Optimising pages for speed, whether or not you're doing it for the sake of Google rankings, is a noble cause that benefits your users. We have SEO to thank for championing many such causes, from accessibility to readability. As Google gets smarter, our websites, whether intentionally or not, get better.

Comments & Discussion



Daniel Smith • Years ago

Another way of increasing page speed is to use CSS sprites. This lowers the number of HTTP Requests to the server, thus improving the speed. One technique I always use is to create a structure image, then map around it using the CSS background-position property. Works a treat!

