Where Do We Begin?
The speed of your website (also known as load time) depends on 4 things:
- The server hosting the website
- The internet connection of the user downloading the website
- The DNS configuration of the domain for the website
- The file size and file structure of the website
The first three items on this list we can assume are either already optimized or simply not within your control. That leaves optimizing the website itself, which nine times out of ten is what actually needs to be “fixed”. Please also note that each page is unique and must be diagnosed separately. How do we diagnose a page? Head on over to http://tools.pingdom.com and enter the URL of any page on your website. At a glance, this tool will calculate the total file size of all the images, scripts and other downloads required to bring up your page. That number should never exceed 2 MB if you expect the page to load within 2 seconds on a high-speed internet connection.
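If you want a rough, back-of-the-envelope version of that check without leaving the terminal, a short script can do it: fetch the page, find the resources it references, and add up their sizes. This is only a sketch, and it assumes the third-party `requests` and `beautifulsoup4` packages; the URL is a placeholder for one of your own pages.

```python
# Rough page-weight check: fetch a page, find its resources, sum their sizes.
# Assumes the third-party "requests" and "beautifulsoup4" packages.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/"  # placeholder: any page on your site

html = requests.get(PAGE_URL, timeout=10)
soup = BeautifulSoup(html.text, "html.parser")

# Collect the URLs of images, scripts, and stylesheets on the page.
resources = []
resources += [img["src"] for img in soup.find_all("img", src=True)]
resources += [s["src"] for s in soup.find_all("script", src=True)]
resources += [l["href"] for l in soup.find_all("link", rel="stylesheet", href=True)]

total = len(html.content)  # start with the HTML document itself
for res in resources:
    url = urljoin(PAGE_URL, res)
    head = requests.head(url, allow_redirects=True, timeout=10)
    # Servers that omit Content-Length are counted as 0, so this under-reports.
    total += int(head.headers.get("Content-Length", 0))

print(f"{len(resources) + 1} requests, {total / 1_000_000:.2f} MB total")
print("Over the 2 MB guideline!" if total > 2_000_000 else "Under 2 MB, good.")
```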
I have seen e-commerce pages with dozens of products total 18 MB and take more than 20 seconds to load. Gruesome, I know, but sadly it’s true. Often it’s as simple as a couple of high-res photographs set to display at 90 px wide: the actual file is 9,000 px wide and merely squeezed down to 90 px for display, so you are still downloading the full 3 MB of data for that 9,000 px photo (the one that looks like it’s only 90 px wide on your web page).
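The fix is to resize the image to its actual display size before uploading it. Here is a minimal sketch using the third-party Pillow library; the file names are hypothetical.

```python
# Shrink an oversized photo to its real display size before upload.
# Assumes the third-party Pillow package; file names are hypothetical.
from PIL import Image

img = Image.open("product-photo.jpg")   # e.g. 9,000 px wide, ~3 MB on disk
img.thumbnail((90, 90))                 # shrink to the actual display size
img.save("product-photo-90px.jpg", quality=80, optimize=True)
# The resized file weighs a few KB instead of 3 MB and looks
# identical in a 90 px slot on the page.
```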
Another common issue I see is too many “HTTP requests”. An HTTP request is counted every time a file loads. That could be your CSS file, your image files, your HTML file, a CGI script, an XML file, and the list goes on and on. Every single file counts: every image, every script, everything. Even if they are all tiny file sizes, a new connection needs to be made for each one of them. There are ways of merging files so that fewer connections, i.e. HTTP requests, are required, as shown in the sketch below.
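For example, merging several stylesheets into one file turns five requests into one. This is a simple sketch of the idea; the file names are hypothetical, and build tools can do the same job automatically.

```python
# Merge several CSS files into one so the browser makes a single
# HTTP request instead of five. File names are hypothetical.
css_files = ["reset.css", "layout.css", "typography.css",
             "buttons.css", "theme.css"]

with open("site.min.css", "w") as merged:
    for name in css_files:
        with open(name) as f:
            merged.write(f"/* {name} */\n")  # keep a source marker for debugging
            merged.write(f.read() + "\n")

# Then replace the five <link> tags in your HTML with one:
# <link rel="stylesheet" href="site.min.css">
```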
That’s your meat and potatoes for optimizing the speed performance of a web page. Each page has its own set of HTTP requests (connections) and its own set of images, videos, etc. (file size). The recommended limit for a web page (a unique URL) is under 2 MB and fewer than 100 HTTP requests if you want it to be “lightning fast”. Otherwise, even the fastest server with the fastest internet connection is going to be slow to load your website. That being said, some of the first few things I would look at are:
- Compressing Image Files
- Disabling unused plugins
- Merging CSS files
- Script compression
- Caching systems (server and browser; see the sketch after this list)
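To make the browser-caching item concrete, here is a toy illustration using only Python’s standard library. The `Cache-Control` header tells the browser it may reuse a file for a set time instead of re-requesting it on every visit. In production you would normally set this in your server configuration (Apache, Nginx, etc.) rather than in Python; this sketch just shows the header at work.

```python
# Toy static-file server that tells browsers to cache files for 24 hours.
# Standard library only; in production, set Cache-Control in your
# web server config instead.
from http.server import HTTPServer, SimpleHTTPRequestHandler

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Let browsers keep static files for 24 hours (86,400 seconds).
        self.send_header("Cache-Control", "public, max-age=86400")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CachingHandler).serve_forever()
```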
If you ever notice your website is dreadfully slow, retrace your steps; more often than not, something has been deleted but is still being requested. Perhaps an image was deleted via FTP but not removed from the HTML code, so every browser that loads the page gets stuck on a missing image that is no longer on the server. A missing file like that can stall the page by a good 2 to 4 seconds, because the browser keeps waiting on that request (and may retry) before it gives up and moves on to the next one. Once the code is cleaned up and the dead reference removed, the request for the missing file never happens, and you’ve just shaved 4 seconds off your load time and are back down to 1.64 seconds 😉
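You can catch this “deleted on the server but still in the HTML” case with a short script that requests every resource a page references and reports anything missing. As with the earlier sketch, this assumes `requests` and `beautifulsoup4`, and the URL is a placeholder.

```python
# Report resources a page references that no longer exist (4xx/5xx).
# Assumes the third-party "requests" and "beautifulsoup4" packages.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/"  # placeholder: any page on your site

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")
refs = [t.get("src") or t.get("href")
        for t in soup.find_all(["img", "script", "link"])
        if t.get("src") or t.get("href")]

for ref in refs:
    status = requests.head(urljoin(PAGE_URL, ref),
                           allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"MISSING ({status}): {ref}  <- remove or fix this reference")
```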