Website Optimisation Techniques
Thursday 2nd December 2010
Having recently worked on a high-profile, relatively heavy-traffic website, I found myself becoming obsessively interested in reducing page load time and improving efficiency. With a site that gets over 16,000 views in a month, every kilobyte counts: the savings on overheads and the improvements in performance would be huge.
There are a number of factors that affect the efficiency of a website, pretty much all of which are covered in the best practice sets of two of the web's optimisation tools: Google Page Speed and Yahoo's YSlow.
Both these tools give varying levels of importance to what they see as the main factors to be considered when optimising a site, and you can test your site with them to get a result and advice on how to improve its efficiency.
Another great tool that I recently discovered is GTMetrix, which uses both Page Speed and YSlow to rate your site; it also gives detailed information on load time and headers, and can be used to track the improvements in your site's performance, or even to compare sites.
From my optimisation work so far, here are a few of the techniques I'd consider most important for optimising your site:
Fewer HTTP Requests
Every time an HTTP request is made, there is a small but significant overhead involved in making the request and retrieving the data. By decreasing the number of components on a page (external files and images), you can reduce the number of HTTP requests required to render it, which in turn results in faster page loads. There are a number of ways to do this; in short, use as few external scripts and images as possible:
- Combine multiple scripts into one script
- Combine multiple CSS files into one style sheet
- Use CSS sprites and image maps
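As a sketch of the sprite technique, assuming a hypothetical sprites/icons.png containing two 16x16 icons stacked vertically (the file name and icon positions here are illustrative):

```css
/* All icons share one downloaded image: one HTTP request instead of one per icon.
   background-position shifts the sprite so only the wanted icon shows. */
.icon {
    background-image: url("sprites/icons.png"); /* hypothetical sprite sheet */
    background-repeat: no-repeat;
    display: inline-block;
    width: 16px;
    height: 16px;
}
.icon-home   { background-position: 0 0; }     /* first icon in the sheet */
.icon-search { background-position: 0 -16px; } /* second icon, 16px down */
```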
GZip Compression
Compressing text resources with gzip or deflate before they are sent over the wire can significantly reduce the amount of data transferred. A lot of web hosting providers only enable the gzip or deflate modules on their dedicated servers, which means that server-level compression configuration probably won't work if you're on a shared platform. Not to worry though: GZip compression can still be achieved by other means.
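On hosts where the mod_deflate module is available, compression can be switched on from .htaccess. A minimal sketch (the IfModule wrapper means it fails silently where the module is missing):

```apache
# Compress text-based responses before sending them to the browser.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```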
Minification (OK, so that might not actually be a word)
Minifying your scripts and style sheets (stripping comments, whitespace and other unnecessary characters) shrinks file sizes without changing how the code behaves, and tools such as YUI Compressor can do it for you automatically.
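As an illustration, here is the same CSS rule before and after minification; both render identically, but the minified form is a fraction of the size:

```css
/* Before: readable, but full of whitespace and a comment */
.header {
    color: #ffffff;
    margin: 0px 0px 0px 0px;
}

/* After: the same rule minified */
.header{color:#fff;margin:0}
```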
Correct Use of Browser Caching
By setting an expiry date or a maximum age in the HTTP headers for static resources used on your site, you can force the browser to load previously downloaded resources from local disk rather than requesting them over the internet again. This technique ensures that repeat visitors don't waste time and bandwidth downloading files that they already have in their cache, unless the file has been updated. An example would be to set the caching headers for the CSS on your site using the following code in .htaccess:
<IfModule mod_headers.c>
    <FilesMatch "\.(css)$">
        Header set Cache-Control "public"
        Header set Expires "Thu, 15 Apr 2011 20:00:00 GMT"
        Header set Last-Modified "Wed, 24 Nov 2010 16:30:00 GMT"
    </FilesMatch>
</IfModule>
If your Apache setup has the mod_headers module enabled (unfortunately for me, mine doesn't), this sets the Expires date of the CSS files to a time in the future (15th April 2011 in this case), and specifies when they were last changed (24th November 2010). When the browser accesses the site, it receives these headers first; it can then check the versions of the files in its cache and determine whether or not it needs to request those resources again. It is important, though, to update these dates every time you change the files in question, to keep your repeat visitors' caches up to date.
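If the mod_expires module happens to be available instead, a relative expiry saves you from editing dates by hand: the header is recalculated from each request time. A sketch, assuming mod_expires is loaded:

```apache
# Relative expiry: "one month from whenever the file was requested".
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
</IfModule>
```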
Of course, there are a number of other basic best-practice techniques that can help optimise your pages, including avoiding bad requests, optimising images, using image sprites, putting scripts at the bottom of the page, or even using a content delivery network to serve your assets. Between them, YSlow and Page Speed have them all covered, so if you want to optimise your site I'd say the first step is to run a quick test and see what recommendations come back. You'll be surprised at how quickly you can increase your score and decrease load time.