How to Make Your Website Faster
Website speed is directly linked to both search engine rankings and conversions. Slow-loading websites alienate visitors and reduce engagement.
A study carried out by KISSmetrics found that a 1-second delay in page response can result in a 7% reduction in conversions. For an e-commerce store making £100,000 a day, that is £7,000 in lost sales every day, or roughly £2.5 million worth of sales every year, from a 1-second page delay.
With those scary statistics in mind, website speed becomes critical to business performance. It becomes even more critical when you consider that site speed is now a ranking factor for Google, so if your online business relies on organic traffic for leads and sales, it's essential that you tune your site to load quickly.
So, how can you make your website faster?
Analysis
Analysis is the first step toward creating a fast website. Without detailed insight into what's holding your website back, you will be making improvements blindfolded.
There are plenty of free online tools for analysing your website speed. Our favourite is GTmetrix. This tool serves up detailed information about your page speed across a list of 27 standard elements, each graded from 0 to 100. It also checks your website against YSlow standards. Unlike many tools, GTmetrix fully analyses a page and gives accurate data on exactly what needs to be addressed.
For example, a GTmetrix test on Epic New Media reveals plenty of room for improvement: our page speed grade was a D, at 64%.
Another useful analysis tool for webmasters is Page Speed Insights, by Google.
The basics
It's important to understand how websites work before you attempt to make your own faster.
For the website visitor, around 80% of a page's load time is spent on the front end, downloading components such as images, stylesheets, and scripts, each of which is a separate HTTP request, according to research published by Yahoo's performance team. As such, the first step toward creating a faster website is to minimise those requests.
There are a few main ways you can do that: simplify your pages so that they consist of text and a few images, combine all your scripts into a single file, and combine all your CSS into a single stylesheet, reducing the number of requests made to your host or server.
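As a sketch, the combining step can be done with nothing more than shell concatenation; the stylesheet names below are hypothetical examples:

```shell
# Create three small example stylesheets (hypothetical names and contents).
printf 'body{margin:0}\n'     > reset.css
printf '.wrap{width:960px}\n' > layout.css
printf 'a{color:#006}\n'      > theme.css

# Concatenate them, in load order, into a single stylesheet:
# one HTTP request instead of three.
cat reset.css layout.css theme.css > combined.css

wc -l < combined.css   # 3 rules, one per source file
```

In practice a build tool would automate this and minify the result at the same time, but the principle is the same: fewer files, fewer requests.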
Combine images with CSS sprites
This is a tip recommended by many marketing professionals. The concept is simple: the more images on a page, the more files the browser has to fetch, and each fetch is another round trip to the server. That, of course, slows the page down.
Sprites combine the images on a page into a single image, thus reducing server requests. CSS-Tricks has an extremely detailed guide to CSS sprites, which is well worth reading.
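The CSS side of a sprite looks something like the snippet below, written out here via a shell here-doc; sprite.png and the class names are hypothetical, assuming two 16x16 icons sitting side by side in one image:

```shell
# Write example sprite CSS; the file and class names are hypothetical.
cat > sprite.css <<'EOF'
/* One image, one HTTP request, for every icon on the page. */
.icon        { background: url(sprite.png) no-repeat; width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }       /* left icon  */
.icon-search { background-position: -16px 0; }   /* right icon */
EOF

grep -c 'background-position' sprite.css   # prints 2
```

Each class shows a different region of the same file by shifting the background, so adding more icons never adds more requests.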
Leverage browser caching
Setting an expiry date or a maximum age on HTTP headers for static resources reduces load time.
Strangely, though, this tactic is not used by most websites, despite being one of the easiest to implement.
If your website is powered by WordPress, there are a variety of free plugins you can install that will automate this process for you. Our favourite is W3 Total Cache.
If your website is a custom build, your Expires headers should be set one year into the future on static files. On Apache servers, enable mod_expires and use the ExpiresDefault directive to specify "access plus 1 year".
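As a sketch, an .htaccess fragment along those lines might look like this (assuming mod_expires is available on your server; adjust the MIME types to suit your site):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png              "access plus 1 year"
  ExpiresByType image/jpeg             "access plus 1 year"
  ExpiresByType text/css               "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresDefault                       "access plus 1 year"
</IfModule>
```

Remember to rename static files (or add a version string) when they change, otherwise returning visitors will keep the cached copy for up to a year.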
Enable gzip compression
Gzipping reduces the size of HTTP responses, which cuts response time and increases page speed, making it one of the easiest wins available. It can be achieved through a WordPress plugin, such as Gzip Ninja for Apache servers, or simply by adding the following code to your .htaccess file:
# compress text, html, javascript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
# Or, compress certain file types by extension:
<FilesMatch "\.(html|css|js|xml)$">
SetOutputFilter DEFLATE
</FilesMatch>
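To see how much difference compression makes, you can run the command-line gzip tool, which uses the same DEFLATE algorithm as the server module, over a sample page; the page here is generated as a stand-in:

```shell
# Generate a stand-in HTML page: markup is repetitive, so it compresses well.
yes '<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p>' \
  | head -n 200 > page.html

# Compress a copy, much as mod_deflate would before sending the response.
gzip -c page.html > page.html.gz

echo "original:   $(wc -c < page.html) bytes"
echo "compressed: $(wc -c < page.html.gz) bytes"
```

HTML, CSS, and JavaScript routinely shrink by well over half, which is why gzipping text responses is such an easy win.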
Alternatively, if your pages are served through PHP, output compression can be switched on at the top of the PHP file:
<?php ob_start("ob_gzhandler"); ?>
Avoid bad requests
Broken links result in error responses, such as 404 or 410 pages. These are what marketers class as bad requests, and Google hates them. Although they won't be detrimental to website speed per se, they put unnecessary load on your server and can alienate visitors.
You can track down any broken links by using an online tool, such as Broken Link Check.
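If you have access to your server logs, you can also spot bad requests there; this sketch runs over fabricated example lines in Apache's common log format:

```shell
# Fabricated example log lines (common log format).
cat > access.log <<'EOF'
10.0.0.1 - - [01/Jan/2015:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 5120
10.0.0.2 - - [01/Jan/2015:10:01:00 +0000] "GET /old-page HTTP/1.1" 404 512
10.0.0.3 - - [01/Jan/2015:10:02:00 +0000] "GET /retired HTTP/1.1" 410 298
EOF

# Field 9 is the status code, field 7 the requested path:
# list every 404/410 hit so the links pointing at it can be fixed or redirected.
awk '$9 == 404 || $9 == 410 { print $9, $7 }' access.log
# prints: 404 /old-page
#         410 /retired
```

Each path that turns up here is a link somewhere, on your site or someone else's, that should be corrected or redirected to a live page.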