Posted on 22nd May, 2014
Over the past few days I have been playing around with the performance of my blog. In this post I’ll outline what I did to enhance its speed and performance. Hopefully you’ll be able to take what I did and apply it to your own sites.
At the time of writing the site hasn’t visually changed much. A few minor tweaks aside (some fonts and icons are absent, and images have been resized), everything remains the same.
Here are a few stats about the old site (based on the homepage):
For a site as simple as mine, that is pretty damn wasteful. In fact, it was just plain awful.
Furthermore, I added gzip compression, which further reduces the size of these files (with the exception of Google Analytics, although that is served gzipped anyway). Gzipping the content reduces the file size, and the browser then decompresses it in order to render the page. I don’t currently have the size stats including gzipping (I blame the differences I’m seeing between Firefox and Chrome for these stats).
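To see why gzip helps so much, here’s a minimal Python sketch (the HTML payload is made up) showing how well gzip compresses repetitive markup:

```python
import gzip

# A made-up, repetitive HTML payload; real pages compress similarly well
# because markup repeats the same tags and class names over and over.
html = b"<html><body>" + b'<p class="post">Hello, world!</p>' * 200 + b"</body></html>"

compressed = gzip.compress(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

The more repetitive the markup, the better the ratio, which is why HTML, CSS, and JS are such good candidates for compression.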
First and foremost, I originally planned on moving my site over to Amazon S3, in order to reduce costs slightly. Whilst reading up on the pricing per request and data transferred I figured I’d take a look at the size of my site.
So, the first thing I did was transfer my site over to Amazon S3. This increased page speed slightly. Here are the resources I used for transferring my Jekyll site to S3:
Running s3_website push will push your built site!
The first thing I did was create a list of what I saw as wasteful. My list looked a little like this:
As I mentioned before, I now serve all content gzipped, which is then decompressed in the browser. It’ll reduce the size of requests and speed up your site. As my site is Jekyll based and uses s3_website (link above), this is the configuration I set in my s3_website.yml:
```yaml
max_age:
  "assets/*": 604800
  "images/*": 604800
  "*": 86400
gzip:
  - .html
  - .css
  - .md
  - .js
gzip_zopfli: true
```
What this does is set the maximum cache lifetime of all my images and assets (JS/CSS) to 1 week (the number is in seconds), and everything else (.html files) to 1 day. The gzip section specifies which file types should be served gzip compressed.
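As a quick sanity check on those numbers (my own arithmetic, not from the s3_website docs):

```python
# max_age values are specified in seconds
week = 7 * 24 * 60 * 60   # assets and images
day = 24 * 60 * 60        # everything else (.html)
print(week, day)  # 604800 86400
```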
gzip_zopfli: true instructs s3_website to use the zopfli compression algorithm, which, while around 100 times slower, actually compresses better. View the project here: https://code.google.com/p/zopfli/
Once I started using just Amazon S3, I kind of wanted more, so I went ahead and set up a CDN using Amazon CloudFront. It’s dead simple to set up: you point it at your S3 bucket, then change the A record in your DNS to point to the CDN.
A CDN is a Content Delivery Network. Essentially, your content is cached on and shared between a network of servers. The network then figures out the server closest to the end user and serves the cached assets from there.
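The routing idea can be illustrated with a toy sketch; the edge locations and latencies below are entirely made up:

```python
# Hypothetical edge locations with made-up round-trip latencies in ms.
edge_latency_ms = {"us-east": 120, "eu-west": 25, "ap-south": 210}

# A CDN effectively serves each request from the lowest-latency edge.
nearest = min(edge_latency_ms, key=edge_latency_ms.get)
print(nearest)  # eu-west
```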
There is also another benefit to using CloudFront: as my AWS account is new, I’m on their free tier. You get 50 GB of data transfer and 2 million HTTP/HTTPS requests for free. My blog is tiny, so I’ll barely touch that limit. This will save money on requests to S3 (storage costs still apply, but those are marginal).
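To put that limit in perspective, here’s a rough back-of-the-envelope calculation; the 500 KB average page weight is purely my assumption, not a figure from this post:

```python
free_transfer_kb = 50 * 1024 * 1024   # 50 GB free-tier transfer, in KB
page_weight_kb = 500                  # assumed average page weight

page_views = free_transfer_kb // page_weight_kb
print(page_views)  # roughly 100,000 page views within the free tier
```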
One of the most wasteful items on that list: using a theme without checking what it actually does.
I'm Ash, a freelance Magento 2 Developer, keen cyclist and aspiring triathlete. With over eight years Magento experience, I have a wealth of experience with developing Magento stores.