
How to speed up your website

How To, Usability | May 7, 2012

Site speed is one of the most important considerations when creating web content and web applications.

In fact, analysts at Google have noted that people rarely sit through the first 30 seconds of a video, much less the first 15, so it would be wise to get the content of your site loaded as fast as you can, so that people can make a judgement and decide whether or not to use it.

It may be a bit superficial that people judge sites that quickly, but it is often the case, and we shouldn't ignore it.

Optimizing your site should be a top priority. When you are using WordPress or another engine there are often nice plugins that help, but I will assume you may not be using WordPress, since plenty of people don't, and give you a few of the best ways to optimize any site, regardless of where it's hosted.

 

Images

Image optimization can be a tough topic, but one with a lot of aspects to choose from. There are file formats, image optimization tools, and code/CSS best practices to follow to make sure you are saving and serving images in the best way possible.

To show why this is important, let's take a recent example. It has become apparent to various iOS developers and app creators that apps using Retina-ready images take up two to four times as much space on a user's phone as their previous versions, causing phones to simply run out of space from a handful of app downloads.

This isn't as directly relevant to us web developers and designers, but it does show just how important it is to handle your images properly on any platform. The following are a few of what I think are the most important topics to remember when optimizing images for the web.

Image formats

The formatting of images is a heated topic, seemingly because everyone believes a different format will increase speed, but there is a prevalent school of thought that we can use as a de facto standard: JPEGs are for photographs, GIFs are for low-color/flat-color images, and PNGs are for everything else. Most web designers and developers I know prefer to use PNGs for just about everything, unless they have, say, a button with one or two colors, where they find GIFs work great.

Now, of course you can play with those specifications, but always remember that these are standards for what will save smaller and lighter versus bigger and heavier. If you are building a photography site, it will load slowly compared to other sites regardless, so try some of the following methods to improve your image optimization overall.
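For illustration only, that rule of thumb could be written down as a tiny helper. The function and its input fields are invented for this sketch, not any real API:

```javascript
// Rule-of-thumb format chooser: JPEGs for photographs, GIFs for
// flat/low-color images, PNGs for everything else. Purely illustrative.
function chooseFormat(image) {
  if (image.isPhotograph) return 'jpeg';
  if (image.flatColor && image.colorCount <= 16) return 'gif';
  return 'png';
}

chooseFormat({ isPhotograph: true });                              // 'jpeg'
chooseFormat({ isPhotograph: false, flatColor: true, colorCount: 2 }); // 'gif'
```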

Image code

One of the worst things we can do for load time is let the browser do the image sizing for us. Well, that could be said for anything along the lines of 'letting the code do ____ for us'. The common saying is, "If you can do it, then do it", and it is a darn good one. Using attributes like width="50" height="30" to shrink a larger image means the full-size file is downloaded and then scaled down by the visitor's browser, work that could have been done by the creator. So resize your images yourself, to the dimensions at which they will actually be displayed. (Specifying those final dimensions in the markup is still worthwhile, since it lets the browser reserve space and lay out the page before the images arrive.)
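To make the "do the resizing yourself" point concrete, here is a hand-rolled sketch that flags an image whose file is bigger than the size it is displayed at. The sizes are passed in by hand here; a real tool would read them from the files and the markup:

```javascript
// True when the browser is being asked to scale the image down, i.e.
// the file's natural pixel size exceeds its displayed size. Flagged
// images should be resized by their creator instead.
function isBrowserScaled(naturalW, naturalH, displayW, displayH) {
  return naturalW > displayW || naturalH > displayH;
}

isBrowserScaled(200, 200, 100, 100); // true: resize this one yourself
isBrowserScaled(100, 100, 100, 100); // false: already the right size
```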

Image optimization tools

Tools are always helpful. Well, most of the time. Sometimes they are a burden and a distraction, but in this case they are often very useful. If you know a great image optimization tool, first of all, link it in the comments, because we are all on the hunt; a few of my favorites follow. I love ImageOptim for Mac, and RIOT for Windows. These two tools are very different, but perform a similar task.

You put images in, and the tool works out a method to optimize them, does so, and spits back out the final result, all while keeping the format you sent them in. They are really quite nice, and there are tons more out there. In fact, there are several that will analyze an image's bitmap and tell you which format is best. These are some of the most useful things in a web designer's toolkit, other than a text editor and a design program, and rightfully so.

Image-based server optimization

I'm not an expert when it comes to setting up servers, but I certainly have enough small-scale background to give this advice: don't keep massive image loads stored locally. That is, don't leave a large store of images on the same servers you are serving the other site files from. Take note of technologies such as Amazon S3 or Flickr's servers, and use those to serve your files.

I've recently implemented an Amazon S3 bucket to serve our files, and it was actually quite easy, so feel free to try it. It is a great method. The main reason is that you don't want a bottleneck to happen when you are serving multiple kinds of load, because it can be a nightmare to diagnose. It's good practice to store separate files on different servers (if under massive load), unless of course it is just a simple general-purpose string-storage database or something similar.

 

CSS and JavaScript Optimization

CSS and JavaScript are really important languages when it comes to web design, and especially when it comes to creating dynamic content. People often forget that their dynamic content, their JavaScript, and their CSS can all be optimized too. These aren't the most significant savings for smaller sites, but for larger sites, especially design-heavy ones, they really matter. Let's step through a few of the "CSS and JavaScript rules" that are pretty standard when it comes to creating web applications.

First rule of CSS and JavaScript

If you can do it in CSS, then do it

Often we forget that we have amazing tools right in front of us, and I'd say CSS qualifies as one of the most amazing tools web designers have. I'd also say that designers jump into Photoshop too quickly by nature (but it is their job, so who can blame them). Keep in mind, though, that as you design you have something in your browser that can do quick mock-ups too: CSS3. Take advantage of it! Having a place to do quick mock-ups really helps, and it will lead you away from hacking things together in HTML later on. Instead of an &nbsp; wedged into your markup, I am sure you can find a way to add that space in CSS, so do it!

Second rule of CSS and JavaScript

Minify, minify, minify!

The minification of code is perhaps one of the best and easiest things you can do to speed up your site. Keep in mind, we are talking milliseconds, but it still has a noticeable effect, especially if you are using something like a jQuery library. Remember that if you are adding JavaScript/CSS plugins and you are given the option to download the minified version (and don't need to edit it), do so. Some of my favorite tools for this are Code Minifier for Mac, Minify for Windows, and JSCompress/CSSCompressor for those of you who want browser-based, cross-platform solutions. Happy minifying!
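To show what minification actually does, here is a toy CSS minifier. Real tools handle many more cases; this only sketches the idea of stripping comments and whitespace that the browser never needed:

```javascript
// A toy CSS minifier: drops comments, collapses whitespace, and removes
// spaces around punctuation. Illustrative only; use a real tool.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // drop /* comments */
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1')  // no spaces around punctuation
    .trim();
}

const source = `
/* header styles */
h1 {
  color: red;
  margin: 0 auto;
}
`;
// minifyCss(source) -> 'h1{color:red;margin:0 auto;}'
```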

Third rule of CSS and JavaScript

In-line is a no-no

It is bad practice to use in-line CSS or JavaScript, but especially in-line CSS. The reason is not only legacy issues: CSS left inside the HTML (especially in-line) cannot be cached or reused across pages, and the document reads as HTML/CSS/HTML/CSS/HTML/CSS instead of a simple HTML => CSS. That is bad for load times, and bad for the maintainability of most web applications should there be a designer who refuses to put it in a separate file. It certainly won't cause your site to crash, but it will cause another employee to go through and extract it; it is that important. So always be the one who extracts it, not the one who leaves it for others to extract.

Fourth rule of CSS and JavaScript

Move it down

If you have to put your JavaScript in the page with the HTML itself, and have no way around it, then put it at the bottom of the HTML document. This speeds up the site's load time as well, because all of those functions and other JavaScript goodies can run after the page itself has loaded. It also decreases the likelihood of a bug squashing the performance of the entire site, because when there is a bug in a site's JavaScript it will often eat memory like there's no tomorrow. So it is good practice to make sure your site isn't doing that, and to guard against future cases in which it might; none of us want people to visit our site and have their browsers crash.

Fifth rule of CSS and JavaScript

DOM optimization

Reduce DOM work if you can. Suppose, for instance, that you are using a lot of jQuery that points to various DOM elements, or reads through the whole DOM to find something; it can slow your site down quite a bit. There is a little saying I have always loved that fits here: "If you are doing things because it is the only way you know how, then there are probably better ways to do it." You could also say, "If you are doing things because it is the only way you know how, then you are doing it wrong," but that version is a bit harsher.

In such a case, do some research and find those better ways. If you are keeping a div in your HTML just because you need it for one little thing, and it is the only way you know how to do that thing, it may not be the best way. Of course, using div tags because you need them for your CSS is entirely understandable, but perhaps you can remove a few and find a broader way of handling that style issue.
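The jQuery point can be sketched without a browser at all. Below, a stand-in function plays the role of something like $('.item'), and a counter shows how many times the "DOM" gets walked; all the names here are illustrative:

```javascript
// Simulating the cost of repeated DOM lookups. findItems() stands in
// for a jQuery-style selector; each call walks the whole fake "DOM".
let lookups = 0;
function findItems(dom) {
  lookups++;
  return dom.filter(el => el.cls === 'item');
}

const dom = [{ cls: 'item' }, { cls: 'item' }, { cls: 'other' }];

// Wasteful: re-query on every iteration of the loop
lookups = 0;
for (let i = 0; i < 5; i++) findItems(dom);
const wasteful = lookups; // 5 walks of the DOM

// Better: query once, reuse the cached result
lookups = 0;
const items = findItems(dom);
for (let i = 0; i < 5; i++) items.length;
const cached = lookups;   // 1 walk of the DOM
```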

I did this myself just recently, while working through a Ruby on Rails project. Earlier in the week I nested roughly five divs inside each other, in HAML of all things, just to get something I wanted (a box in a box in a box inside of something else, in this case). I looked at it, knew it was crap, but didn't know a better way, so I scrapped it all and re-did it. Having to re-do it was harder, but it forced me to learn a new way to handle the issue. In the end I learned a lot from it, and I would recommend the approach to anyone. Go ahead and grab one of those knowledge nuggets for yourself! They are certainly low-hanging fruit.

 

General optimizations

These are more of the broad topics that really don’t fit in anywhere else, but that I still feel deserve some attention. In fact, some of these may be the most important things you can do to speed up a web application or site.

Slashes on links

This is noticeably important. When a user follows a link to a directory without a slash at the very end, the server has to figure out that the address is a directory rather than a file, and then redirect to the version with the slash. If you add the slash yourself, you save that round trip, and those milliseconds all add up, I promise. Often designers in particular assume their unoptimized markup won't burden anything, but it does in the end. If you save quarters for 10 years you will certainly have a lot of money, and the same concept applies here, just on a smaller or larger scale depending on your site's traffic.
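The habit is easy to automate; here is a sketch of a helper you might run over your links. The "looks like a file" check is a rough heuristic assumed for this example:

```javascript
// Add the trailing slash yourself so the server doesn't have to answer
// with a redirect first. Paths that end in a filename extension (a
// rough heuristic) are left alone, as are query strings.
function addTrailingSlash(url) {
  const [path, query = ''] = url.split('?');
  if (path.endsWith('/') || /\.[a-z0-9]+$/i.test(path)) return url;
  return query ? path + '/?' + query : path + '/';
}

addTrailingSlash('http://example.com/blog');     // 'http://example.com/blog/'
addTrailingSlash('http://example.com/logo.png'); // unchanged: it's a file
```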

Favicons

Browsers always request a favicon.ico file from the root of your server, so you may as well go ahead and include one. Even something temporary is good to have. If you don't, the browser will get a 404 and cache that 404 for the site's favicon, and we all know that reducing 404s speeds up load time.

Reduce cookie size

This one may not apply to all of us, but if you are developing web applications, then reducing cookie size is really important. For instance, in what I am familiar with, Ruby on Rails applications, you can use cookies (or other methods) for authentication from session to session, and often people prefer the other methods because they can decrease user load times.

Now, a cookie is cached on the visitor's computer, so you might think it would help load time, but the whole cookie is sent back to the server with every request, and typically all cookies are good for is authenticating user sessions or tracking you around the web (as Google and Facebook have been accused of). If you have to work with cookies, make sure you keep the size low and use your best judgement. If you have to, set a shorter expiration date to decrease the load.
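Since the whole cookie travels with every request, it is worth measuring what yours weigh. A small Node sketch (the function name is mine; Buffer.byteLength is Node's standard byte-counting API):

```javascript
// Measure how many bytes a cookie adds to every single request the
// browser makes to your domain.
function cookieSize(name, value) {
  return Buffer.byteLength(name + '=' + encodeURIComponent(value), 'utf8');
}

cookieSize('session_id', 'abc123');    // 17 bytes per request
cookieSize('prefs', 'x'.repeat(3000)); // roughly 3 KB per request!
```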

Cache

This is a massive topic, and one I am not an expert on, but caching is a pretty simple concept: storing files (typically HTML/CSS) from sites you frequently visit on your computer, so that you don't have to download them every time you visit.

It is really an incredibly useful technology, and one that a lot of web applications have started to employ over the past few years. There have been a number of database caching solutions, and probably the most notable is Memcached. What it does is keep a copy of frequently requested data in memory on the server side (not in your browser) while a web application runs. So, for instance, if there are various profiles that get visited often, it may hold those profile records in memory, and the beauty of Memcached comes in the next phase. In your code, before you pull from the database, you can ask the Memcached servers whether a cached version of the data exists. If not, the application pulls it from the database and, of course, adds it to the cache to save time on the next request. This is a beautiful example of caching on a large scale, and it has helped tons and tons of companies speed up their servers and databases over the past couple of years.
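That flow is the classic cache-aside pattern, and it can be sketched with a plain Map standing in for a Memcached client. A real client has similar get/set calls, but everything named here is illustrative:

```javascript
// Cache-aside: try the cache first, fall back to the database on a
// miss, and populate the cache for next time. The Map stands in for
// Memcached; queryDatabase() stands in for a slow SQL query.
const cache = new Map();
let dbQueries = 0;

function queryDatabase(key) {
  dbQueries++; // pretend this round trip is expensive
  return 'profile-data-for-' + key;
}

function fetchProfile(key) {
  if (cache.has(key)) return cache.get(key);  // 1. try the cache
  const result = queryDatabase(key);          // 2. miss: hit the DB
  cache.set(key, result);                     // 3. store for next time
  return result;
}

fetchProfile('dain'); // miss: queries the database
fetchProfile('dain'); // hit: served from cache, no DB query
```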

And that just about sums it up. Those aren't all of the ways to speed up your site, of course, but they should pique your curiosity and get you looking for all the great things out there that will.


I hope you embark on this journey with all of us, and while you do, be sure to share your opinions on what you think the best solutions are! You can do so on a blog, on Twitter, or, even better, in the comments below, so we can all stay in touch on what the best solutions are. I know some people even go as far as minifying their HTML files. Do you do that? If so, have you noticed a performance boost? Let's get the sharing started!

Dain Miller is a Sr. Engineer at a Chicago-based start-up, though he is still accepting freelance projects. He has a passion for responsive design and Ruby on Rails. You can follow him on Twitter at @_dain, or find him on about.me.

Comments
  • Anonymous

    A huge amount of great advice – I will be passing it on to a few of my front-end friends who could use a bit more discipline in optimization (as well as applying a few of the tips myself). Thanks!

    • Anonymous

      No problem!  

  • http://bikramkawan.com.np Bikram Kawan

    Using a CDN for a WordPress site can make your website load much faster.
    Anyway, nice article. I have never tried the RIOT tool. I will try it once.

  • http://twitter.com/xtcsh Sebastian

    Didn’t quite get the point about 
    Image code
    A few bytes that specify the image size will not increase load time; they will make the page load faster, as the browser doesn't have to fetch all images before being able to render the page.

    • Anonymous

      If it doesn't have to do the computation on its end, then it will save milliseconds – from the past experience and research I did.  The browser will have to fetch the images regardless.  Could you elaborate on that point?  I would love to know how that actually wastes browser time/resources, because that is what this is all about: sharing back and forth.  :)

      • http://twitter.com/xtcsh Sebastian

        If every image has its pixel dimensions specified, the browser software can start *rendering* the page even before requesting the first image from the server. This will make a page get its form/layout faster and will hence appear to load faster. Browser caches aside, each request for an image requires a network roundtrip plus the download time.

        If the dimensions aren’t specified, there’s some guessing and waiting time for the browser (while trying to figure out what to do with what it has so far). In general, web servers couldn’t care less about what they’re serving and don’t “parse” anything. Of course, the output (HTML and media) can be cached at desired intervals instead of being pulled from a database or similar at every request. Was this elaboration :)

      • Anonymous

        Thank you Sebastian, yes that was very well put!  The one question I have though is as follows:

        The impression I was under was that say you had an image that was 200px by 200px and you were trying to size it down to 100px by 100px.  My impression was that to hard-code that scale in HTML/CSS would take longer to render than actually scaling it yourself in photoshop.  But, what I think you are saying (and please let me know if it’s incorrect) is that if you have a 100px by 100px image that you scaled yourself AND include the hardcoded values of width=”100px” height=”100px”, it will actually save time as the browser doesn’t have to guess.  And if so, does that apply only to that scenario?  Or is it faster even if the browser is scaling a massive image down to a smaller one using the hard-coded values, itself?

      • http://twitter.com/xtcsh Sebastian

        Naturally, a physical 200×200 will always mean a larger file size than 100×100, regardless of what size it will be scaled down to with HTML/CSS.

      • Anonymous

        Ah, interesting.

  • Anonymous

    Hey There Ben,

    First off, thanks for the compliment at the top, but I don’t think that a lot of the tech behind this entire article is wrong just because I misjudged the effects of one or two parts. I will agree with you though on all the other parts, and I am not a server admin and a lot of this is a bit over my head though I do feel I am aware of what is happening and just stated it wrong.  I say that because when I read your statement, “It’s the client’s browser that has to render the image at the correct size”, I knew it was something I am aware of, but seemed to not make clear in the article.  So I apologize for not being clear enough, or by being misleading with regards to what I meant there by phrasing it in a poor manner.  

    On to the memcached topic.  You’re very correct there, and I even mentioned it in the article, I am really not a server admin nor someone who is really experienced with caching in general or memcached at all – so it was a bit difficult to understand, though I was pretty sure I had it down very well before writing.  The reason I say that is because what you said here, “the server first checks to see if the result of the query is in memcached – if so, it grabs it and returns what it needs to, bypassing talking to the database, saving a lot of time.”  As that is pretty much what I said in the article (referencing: “you can actually call (before you pull from the DB) from the Memcached servers and see if you can pull a cached version of the file(s”).  So I feel I understood a good bit of this topic before writing, but you did make a lot of great points here.  My statement regarding the copy of database files on the browser was really inaccurate, or rather strangely vague to write.  What I should have said was that it stores that cached version on the memcached servers, correct?  I think I was confusing regular caching and memcached in that statement.  

    Again, sorry for the confusion therein, but that is a new topic for me and was just trying to elaborate on it in a way that I learned it to help others who may be confused. Though I am glad there is a good comment area here for them to read though that will help elaborate on the topics on the article quite a bit, provided by you (Ben) and a few others!  :) 

    Thanks for pointing all this out, really appreciate the clarification.  

  • Anonymous

    Some great concepts here! A few I wasn’t actually aware of either from an impact perspective or even to try it for optimization.

    One question/dilemma I’ve been curious about… with the rise of responsive sites, we have to strip out the hardcoded width and height on inline images. I was aware it was more optimal for a width and height to be specified – from what I understood it had to do with browser rendering time when not knowing the amount of space for an image. Obviously responsive is a relatively new concept and there are a lot of aspects that need further refinement, especially when it comes to images. 

    Any thoughts specifically on the impact of not specifying a width and height to allow for responsive images?

    • Anonymous

      Hey Brian, 

      Well, as far as I was concerned, responsive images were best done by making various sizes yourself and then using media queries to tell the browser when to start using the new image (or not).  Not sure how hardcoded styles come into play there.  Though Sebastian and I are exchanging a bit of info in the comment above, because he has much more knowledge on this than I do.  Actually, I just asked him something related to this question, as I was under the impression it was best done the way I stated above – but it may indeed not be!
      So let me get back to you in just a few, should Sebastian have the time to reply.  :)

      Thanks!

      • Anonymous

        Sebastian put it much more eloquently than I did… though I think there are two points there. Yes, it's my understanding that specifying the width and height of an inline image is better than not, because the browser doesn't need to guess what native size it should be displayed at – you are telling it, and it has space reserved on the page for that size. Second, yes… it requires more processing for the browser to scale an image for you, as opposed to loading an image that is already sized to what you ultimately want it to display at.

        My follow-up dilemma has to do with the idea that with responsive concepts for delivering fluid layouts you can't specify a width and height for an inline image, otherwise it won't scale. You simply put a CSS width of 100% in conjunction with a max-width.

        Am I right in thinking this is a limitation when it comes to optimizing page content based on what you mentioned above?

      • Anonymous

        I just typed up this massive comment in reply and my browser refreshed and deleted it, so I will just sum it up.  I hate it when that happens, always such a bummer.

        In summation, no, I don't think it's a limitation.  The trade-off between using media queries versus hard-coded styles is easier to me nowadays, as I would now say it is an obvious choice to use the queries, because they are just so impressive every time I see them being used.  Also, I think that using a width of 100% is somehow similar, yet empirically different, to using hard-coded sizes.  Sorry though man for the small comment, I will try to get the other comments I had written out by the morning.

        Thanks for the follow-up, though.  Oh, that was something I was going to say: you really helped clear that up with your first paragraph!  I really appreciate that.

        Thanks,
        – Dain

      • Anonymous

        Poor comment lost to the ether… been there before. I mourn its loss with you. Haha

        Was wondering about your thoughts on if percentage widths provided the same means of optimization. Truthfully I’ve never done the research, but I would think it helps to have something specified in conjunction with using a visual editor to properly size the image to the max size you’d want it displayed at anyway.

        Thanks again for the post! Always love talking through the options!

        Brian

  • http://twitter.com/getsolidstate getsolidstate

    Good effort by the OP, but these really won't make much difference in speed, at least in terms of perception, where it matters most. The only advice here I'd agree with is the CDN, but even then, it's not really noticeable to users unless they live in rural locations that benefit from geographically distributed servers.

    There is a lot of talk about WPO (website performance optimization), which is seemingly becoming more of a front-end discipline, and I think it's time to let out some of the studies and experiences I have gathered.

    I'll tell you that the only "true" optimization that actually lets you see a "speed improvement" is having faster hardware, more specifically faster I/O. For most websites and web servers, the hard drive is probably the biggest difference maker in "perceived" performance, where your site just feels snappy.

    I can tell you that you'll see a difference from SATA to SAS, and an even bigger difference from SAS to SSD. Most of you are probably hosted on SATA, and let me tell you, no matter how many tweaks/YSlow/WPO passes you do, it probably won't make much of a difference.

    That is because the real perception of speed comes from the work that has to be processed by the backend (PHP/Ruby/any other server-side language), which determines the "time to first byte".
    This involves the timing of the database and other server-side code, typically in any dynamic website, such as WordPress, Drupal, Joomla, and Magento.

    For instance, Magento is a very heavy app and it can take anywhere from 3 – 10 seconds (depending on hardware) before your server can start to render the front-end (HTML) content. Cutting down this rendering time is what will make your site seem to be fast and instantaneous. And never mind the browser-caching, as most sites are probably dynamic and visitors will be hitting new pages. This timing while your visitors wait between pages is what makes visitors want to either stay or leave your site, so having a faster I/O is probably where you should put most of your investment in.

    When browsing many sites on the internet, you can generally tell if they are on faster I/O or not. But I’ll save that for another day.

    • Anonymous

      Wow.  Thanks for this response.  That was a brilliant read.  You clearly are very versed in these topics, and I totally agree with you with regards to the sentiment stating that Front-end developers are typically limited by what they know.  I can say so because I was a front-end guy for years by choice, and it can indeed limit your perspective a lot.  Now though, I mostly work with Ruby (Rails) and back-end work, as I am a lone developer on a massive project and I actually have been tasked with server maintenance as well. So that is perhaps why these topics are so important to me, and why I want to learn so much about them.  I’d also love to help teach people what I learned one day, but yeah these are really heavy bits that can be hard to understand at first.  Though, after reading something like what you wrote I feel like it gave me a better understanding on server speed when comparing the types (SSD’s, etc).  

      One thing to note though is, when writing this article in particular I tried to talk about front-end or user facing technologies (or as close as I could), because a lot of the people that read web designer depot aren’t server admins or sys admins in general.  I feel though that what you brought up is a lot more relevant than the user-facing features when it comes to real raw core speed.  As a result now I want to think about researching these server-based changes further for trying to help people in the future.  

      Thanks man, really loved that comment.

      • http://twitter.com/getsolidstate getsolidstate

        Sure, no problem. One thing I'd have liked your article to point out (as I understand it was targeted towards front-end devs) is lazy-loading JavaScript; perhaps this can be one of the topics you write about in future articles. When you lazy-load, you not only save on initial load time, but you can also load scripts, stylesheets, and even images in parallel. I think this is one of the biggest optimizations you can do, at least in terms of front-end work. Cheers. :)

      • Anonymous

        Thanks man, and yeah I hear ya there.  Lazy-loading is a big topic.  I will be jotting that down for the future for sure.  I’d like to do follow up pieces on a lot of my articles just because you get so much great feedback from the community.

  • Anonymous

    No problem!  Thanks for reading.

  • Anonymous

    Interesting.  Great feedback guys, I am going to look into this more.  Thanks for the two perspectives though this helps to see how much out there in this area can be researched more thoroughly by all of us.  Really great stuff.

  • http://twitter.com/Blogerko Blogerko

    Great post this can be very useful

    • Anonymous

      Thanks!

  • Anonymous

    Awesome!  No problem, glad to hear it helped.  :D

  • http://www.webstudio55.com/ Shahzad Anjum

    Here are some other ways to reduce page load time and save some bandwidth:
    using PHP gzip compression or Apache mod_deflate to deliver compressed content to the visitor's browser.
    Anyway, good article.

  • http://tinywp.in/ Pothi Kalimuthu

    Great insights on front-end optimization, Dain. At the same time, there are lots of things happening behind the scenes on the server side too (as already stated by @twitter-545303076:disqus regarding solid-state drives). Let me provide another example: moving to Nginx, php-fpm and Varnish can change a lot about how dynamic content is cached. Of course, Memcached can be added as an additional caching layer.

  • http://www.coderblog.de/ Simon Sprankel

    Maybe you want to link CSS Minifier http://cssminifier.com/ instead of CSS Compressor… CSS Compressor breaks media queries – CSS Minifier does not…