
Website Performance and SEO (Search Engine Optimization) Tips

By Shawn Purdy on September 22, 2014


This article goes into some detail about website performance, which can also indirectly affect your SEO, since Google in particular uses page speed as part of its ranking algorithm.

So what must you do to begin digging into your website and figure out whether you need to speed it up or not? In some cases this may be very obvious, as you may find the website slow yourself. The first thing you should do is know what software your website runs on. If you're using WordPress or some other CMS, you may find plugins that already exist to help you get started. For example, WordPress has W3 Total Cache and WP Super Cache, both useful plugins for WordPress performance.

The next most important step is to install Mozilla Firefox and grab both the YSlow and Google PageSpeed extensions. While you are at it, also get the Firebug plugin. All three are very useful tools for any web developer, and they are great for optimizing your website's performance. Another handy option is GTmetrix, which offers an online performance test that combines the YSlow and Google PageSpeed tests.

First I will cover some general examples of what to look for. The number one most common problem is a website with a lot of HTTP requests. To put it in simple terms, every image, JavaScript file, or CSS (Cascading Style Sheet) file your website loads causes one HTTP request. So if your homepage has 50 images, 15 JavaScript files, and 10 CSS files, you have 76 HTTP requests (one is the loading of the homepage itself). While 76 is quite a bit, I've seen worse. So what would be a good number? As low as possible; there is no single right answer. If you could do it in one HTTP request that would be awesome, but that is very impractical with today's websites. In this example I would aim for 40 or under, but it really depends on how heavy the site's content is. If the site is already fairly light and sitting at 40, I might aim for 20.

So how do we lower HTTP requests?
Using our example above, if we have 50 images being loaded, we could use lazy loading. Plugins like jQuery Lazy Load let you load only the images that are within the viewport, so more images load as the user scrolls. Let's say 40 of the 50 images start off screen: with this method you save 40 HTTP requests on page load. A minimal sketch is below.
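
This sketch follows the jQuery Lazy Load plugin's documented usage; the file names and image paths are placeholders, and the attribute names may vary between plugin versions.

<script src="/js/jquery.min.js"></script>
<script src="/js/jquery.lazyload.min.js"></script>

<!-- The real image URL goes in data-original; src holds a tiny placeholder -->
<img class="lazy" src="/img/placeholder.gif" data-original="/img/photo1.jpg" width="600" height="400" alt="Photo 1">
<img class="lazy" src="/img/placeholder.gif" data-original="/img/photo2.jpg" width="600" height="400" alt="Photo 2">

<script>
// Only images scrolled into the viewport trigger an HTTP request.
$(function () {
    $("img.lazy").lazyload();
});
</script>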

Besides lazy loading, is there another way?
Base64 loading of images is another way you could handle small images like thumbnails, say 150x150 pixels or under. You do this server side, for example in PHP, so the server does the work instead of the browser (client side). This has practical uses, but it won't always be the right solution, as it increases the document size; going too far with it will eventually stop improving performance. Depending on your hosting provider you may also consume more CPU resources, which could slow down your page. A small sketch follows.
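
A rough sketch in PHP; the helper name and file path here are made up for illustration:

<?php
// Inline a small thumbnail as a base64 data URI so it costs no extra
// HTTP request. Base64 grows the data by roughly a third, so only do
// this for small images.
function inline_image($path) {
    $ext  = strtolower(pathinfo($path, PATHINFO_EXTENSION));
    $mime = ($ext === 'jpg') ? 'jpeg' : $ext;   // jpg files use the image/jpeg MIME type
    $data = base64_encode(file_get_contents($path));
    return 'data:image/' . $mime . ';base64,' . $data;
}

echo '<img src="' . inline_image('thumbs/avatar-150.png') . '" width="150" height="150" alt="Avatar">';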

What if a lot of my images are CSS images?
CSS sprites can be used to combine several images into a single image. However, I only suggest doing this for small images, say under 75x75 pixels. The reason not to do this with larger images is that the combined image becomes quite large, and the time it takes to download it would negate any performance improvement. Plus it's usually not that practical. A small sketch is below.
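
A minimal sketch, assuming a hypothetical icons.png sprite sheet holding three 50x50 icons side by side:

/* One sprite sheet replaces three separate image requests.
   background-position shifts the sheet so only one icon shows. */
.icon {
    width: 50px;
    height: 50px;
    background: url("images/icons.png") no-repeat;
}
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -50px 0; }
.icon-mail   { background-position: -100px 0; }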

What about JavaScript and CSS files?
Combining JS files into a single JS file, and CSS files into a single CSS file, is the best solution. When you do this for JS files, make sure they appear in the combined file, top to bottom, in the same order they were loaded on the page; this avoids JavaScript errors from scripts running before their dependencies. A quick example follows.
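
For example, on the command line (the file names are placeholders; the point is that jquery.js must come before any plugin that depends on it):

# Concatenate in the same order the pages loaded the files.
cat jquery.js jquery.lazyload.js site.js > combined.js
cat reset.css layout.css theme.css > combined.css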

Still have too many HTTP requests?
Another trick to speed up your website, even after you have gotten rid of as many HTTP requests as possible, is to load files from different domains or URLs. This can be done in many different ways: using a CDN, or simply creating a subdomain. For example, you can have images.mydomain.com load all the images for your website. In most cases this requires editing your website's code. The reason it helps performance is that browsers are limited in how many requests they can make at once to a single host, and the limit varies between browsers. For example, if the browser is limited to 8 requests at once and you are loading 24 images, you have to wait for the first 8 to finish; by using a different URL you allow another 8 in parallel. Sometimes it can be useful to have 2 or 3 different subdomains to load these resources, as in the snippet below.
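
A small sketch; the subdomains are hypothetical and would all point at the same server:

<!-- Spreading resources across subdomains lets the browser open
     extra parallel connections. -->
<img src="http://images1.mydomain.com/products/shoe.jpg" alt="Shoe">
<img src="http://images2.mydomain.com/products/hat.jpg" alt="Hat">
<script src="http://static.mydomain.com/js/combined.js"></script>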

GZip Compression
Gzip compression can make a huge difference in the loading time of your website, and it's very easy to deploy. However, some shared hosting providers do not provide the required modules on the server. If yours does, some simple code in your .htaccess file deploys gzip compression using Apache's mod_deflate. I also suggest setting an expire time and allowing the browser to cache resources; this additionally requires mod_expires (and mod_headers for the Cache-Control lines below).

Below is an example .htaccess file that caches resources and gzips them.

# Requires mod_expires, mod_headers and mod_deflate
ExpiresActive On

# 1 YEAR
<FilesMatch "\.(otf|ico|pdf|flv)$">
Header set Cache-Control "max-age=31536000, public"
ExpiresDefault "access plus 1 year"
Header unset Last-Modified
Header unset ETag
SetOutputFilter DEFLATE
</FilesMatch>

# 1 MONTH (no DEFLATE here: JPEG/PNG/GIF/SWF are already compressed,
# so gzipping them wastes CPU for little or no gain)
<FilesMatch "\.(jpg|jpeg|png|gif|swf)$">
Header set Cache-Control "max-age=2592000, public"
ExpiresDefault "access plus 1 month"
</FilesMatch>

# 1 WEEK
<FilesMatch "\.(xml|txt|css|js)$">
Header set Cache-Control "max-age=604800, public"
ExpiresDefault "access plus 1 week"
SetOutputFilter DEFLATE
</FilesMatch>

# 30 MIN
<FilesMatch "\.(html|htm|php)$">
Header set Cache-Control "max-age=1800, public"
SetOutputFilter DEFLATE
</FilesMatch>

Images
Image size, both in bytes and in resolution, is another common issue: the JPG or PNG is bigger than it really needs to be. Instead of 400KB it could be 90KB, for example, and that alone can make a huge difference in your page's performance. While many CMS systems do optimize images to some extent, there are cases where a user uploads a giant image without resizing it; this should be avoided. You should always try to load images at their native sizes where possible, even when taking responsive design into account. If you know your image is going to be 300 pixels wide, you can set its width attribute to 300 and let CSS scale it down on narrow screens; responsive design still functions normally, and at the same time you tell the browser the actual size of the image.
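
A minimal sketch (the file name is a placeholder, and the inline CSS shown is one common way to keep the image responsive):

<!-- Serve the image at its native 300px width; max-width lets it
     shrink on narrow screens while height: auto keeps the aspect ratio. -->
<img src="images/team-300.jpg" width="300" height="200"
     style="max-width: 100%; height: auto;" alt="Our team">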

Minify!
This means removing unneeded bytes from your code: HTML, CSS, and JS for starters. There are online tools that will compress your code for you by stripping whitespace and other characters the browser does not need.
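
A tiny before-and-after sketch in CSS:

/* Before minification */
body {
    margin: 0;
    font-family: Arial, sans-serif;
}

/* After minification: the same rule in far fewer bytes */
body{margin:0;font-family:Arial,sans-serif}
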
Avoid 404 errors
This can happen if your site has changed a lot over a long period of time and you forgot to remove a reference to an old image file, for example. Missing resources can slow down your website, and they also use server CPU for no benefit. It's a good idea to clean up your code every so often to make sure nothing is missing. One quick way to spot them is shown below.
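
For example, on a server with standard Apache logs (the log path is an assumption; adjust it to your host):

# List the most-requested missing resources from the access log.
# $7 is the request path in the common/combined log format.
grep ' 404 ' /var/log/apache2/access.log | awk '{print $7}' | sort | uniq -c | sort -rn | head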

Server-side opcode caching
An opcode cache is an important factor in the speed of your website, and a lot of servers support one today. EAccelerator and XCache are two examples (Varnish, which also comes up in this context, is an HTTP accelerator rather than an opcode cache). Find a host that supports these, and don't be afraid to ask whether the cache lives in memory or on disk, since these applications can be set up to use disk space instead of memory. In my experience disk-based caching usually provides very little benefit and causes a big increase in disk I/O. Memory is cheap these days; make sure your hosting provider is not too cheap to provide good hosting. You can also check what is loaded from PHP, as in the sketch below.
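
A small sketch; extension_loaded() is a standard PHP function and is case-insensitive about the names, which cover caches common at the time of writing:

<?php
// Report which common opcode/data caches are loaded on this host.
foreach (array('eAccelerator', 'XCache', 'apc', 'Zend OPcache') as $ext) {
    echo $ext . ': ' . (extension_loaded($ext) ? 'loaded' : 'not loaded') . "\n";
}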

Server-side PHP wrapper
Which wrapper is your hosting provider using? Are they running mod_php, suPHP, fcgid, fast-cgi, php-fpm, mod_ruid2, or lsphp? Find out from your provider which they use.
Why?
  • mod_php = very insecure; should be avoided at all costs
  • suPHP = secure, but very slow
  • fcgid = moderate security, decent speed
  • fast-cgi = moderate security, pretty good speed
  • php-fpm = very fast, but complex to configure
  • mod_ruid2 = fast, but not so secure
  • lsphp = with CageFS, very fast and very secure

A lot of shared providers use suPHP or fcgid. Many hosts running CloudLinux use lsphp, which is what we use ourselves and highly recommend. You can also check from PHP itself, as in the sketch below.
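
A one-liner sketch using PHP's built-in php_sapi_name(), which reports how PHP is being run (for example, apache2handler for mod_php, cgi-fcgi for CGI/FastCGI-style handlers such as suPHP and fcgid, fpm-fcgi for php-fpm, and litespeed for lsphp):

<?php
// Print the SAPI name to identify the PHP handler in use.
echo php_sapi_name();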

That wraps up this article. We hope you found it useful in your quest to improve your website's performance!