Hey people, hello again, this time we're going to talk about SEO 📈📊
In this article I'll walk you through the path I followed to improve the visibility of the blog you're reading right now.
This is a Gatsby + Ghost blog and, as you may know, it's a static site. Gatsby generates the bundle and we keep our posts on Ghost. Gatsby accesses these posts using the awesome Ghost Source Plugin, and we deploy the bundle to AWS S3. From there, we distribute the content everywhere using AWS Cloudfront, and here is where our journey begins...
Alright, so our blog is on the air, what could go wrong?
Well, this went wrong. This was our mobile ranking according to Google's https://developers.google.com/speed/pagespeed/insights/?hl=pt-br tool (punch in my face). Our desktop ranking wasn't that bad, around 65 if I remember correctly. So, this Google tool is a good starting point if you want to check how to improve your website, and I'm still improving this one here (laughs).
I started fixing the small problems...
The first problem I found is that our blog was just slow, very slow. Every request took too long to respond.
And here I was, compressing all of our blog images (there were almost twenty). To do so, I used https://tinyjpg.com/ ; it's a good tool and I recommend it. After that, I got a better ranking, but still very far from the result I was seeking. The site was a bit faster, but not up to our company's standards (and absolutely not up to mine) 😅🤓. So leeeet's move on...
I don't know if you're familiar with AWS Cloudfront, but we do many nice things there. One of these things is compression on the fly and, believe me, this comes in handy when you want to boost a static site's performance.
I activated the compression on Cloudfront for both gzip and brotli (click here if you don't know what brotli is). But, to my surprise, the blog still wasn't serving compressed content 🤔 and I was wondering why. Then I realized that Cloudfront needs to know the content-length of my files in order to apply the compression.
To let Cloudfront know the content length of any file, you must enable this in your S3 bucket configuration. It's very simple indeed: I went to the CORS configuration inside my bucket and exposed the content-length header there. To check the configuration options of S3 CORS, take a look right here: https://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html
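For reference, an S3 CORS rule that exposes the Content-Length header looks roughly like this (the modern S3 console takes CORS config as JSON; treat this as an illustrative sketch, not this blog's exact configuration):

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": ["Content-Length"]
  }
]
```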
And that's it! Now, whenever the browser calls the blog and sends an accept-encoding header, Cloudfront compresses on the fly and delivers the smallest asset. If the browser doesn't send this header, it delivers the original files with no compression. Nice.
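The negotiation can be sketched as a tiny function (my own illustration, not CloudFront's code): read the accept-encoding header, strip quality values, and prefer Brotli over gzip when both are accepted, falling back to the uncompressed original otherwise.

```javascript
// Illustrative sketch of the content negotiation described above.
// CloudFront prefers Brotli over gzip when the browser accepts both.
function pickEncoding(acceptEncoding) {
  const accepted = (acceptEncoding || "")
    .split(",")
    .map((part) => part.trim().split(";")[0]); // drop quality values like ";q=0.8"
  if (accepted.includes("br")) return "br";
  if (accepted.includes("gzip")) return "gzip";
  return "identity"; // no usable Accept-Encoding → serve the original file
}
```

So a modern browser sending `Accept-Encoding: gzip, deflate, br` gets the Brotli variant, while an old client with no header gets the plain file.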
Now the blog is faster (a bit), and I can see that the files are delivered compressed. But guess what, we can make this even better if we enable caching. There are two cache strategies we can use: the first is controlled by AWS and the other is controlled by us. Since this is just a simple blog with static files, I chose to take control of the cache settings myself and bypass the AWS-managed configuration.
To do so, I changed the S3 configuration again: I added a cache of 2 hours (7200 seconds) to all files and told Cloudfront to use the legacy cache settings (this is where you choose between AWS controlling the cache or you doing it yourself; you can even build a request-based cache system, which is pretty nice). It worked like a charm.
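One common way to attach that 2-hour lifetime is to set the Cache-Control header as object metadata when the files are uploaded to S3. Here's a minimal sketch; the function name is mine, not from the post:

```javascript
// Sketch: build PutObject params that give a file a 2-hour cache lifetime
// (7200 seconds, matching the value in the post). Function name is illustrative.
function cachedUploadParams(bucket, key, body) {
  return {
    Bucket: bucket,
    Key: key,
    Body: body,
    CacheControl: "public, max-age=7200", // 2 hours
  };
}

// With AWS SDK v3 this would be used roughly as:
//   const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
//   await new S3Client({}).send(new PutObjectCommand(cachedUploadParams("my-bucket", "index.html", html)));
```

CloudFront then respects that header (with the legacy cache settings honoring origin headers) and browsers keep the file for two hours before re-fetching.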
Faster than fast, quicker than quick, I'm lightning! But no, not yet haha
Alright, now things are really faster than before, but there are still many things we must improve. Let's walk through the simple ones first.
A good way to increase your site's score is lazy loading the images that are outside the viewport. If you have no idea what I'm talking about, please read this article.
The HTML img tag has a loading attribute, but some browsers just ignore it, so you'll need a helpful JS script to handle that.
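Conceptually, the fallback looks like this. It's a minimal sketch: the `data-src` convention and selector are my own illustration, not the exact script this blog uses. Images carry their real URL in `data-src`, and an IntersectionObserver swaps it into `src` once the image scrolls into view:

```javascript
// Native hint (supported browsers defer off-screen images by themselves):
//   <img src="photo.jpg" loading="lazy" alt="...">

// Feature check: does this environment support the native `loading` attribute?
function supportsNativeLazy(imgPrototype) {
  return "loading" in imgPrototype;
}

// Browser-only fallback: images start with a placeholder src and the real URL
// in data-src; when one becomes visible, we load the real image and stop watching it.
if (typeof document !== "undefined" && !supportsNativeLazy(HTMLImageElement.prototype)) {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src; // trigger the real download
        obs.unobserve(entry.target);
      }
    }
  });
  document.querySelectorAll("img[data-src]").forEach((img) => observer.observe(img));
}
```

This way, browsers that understand `loading="lazy"` do the work natively and everyone else gets the observer-based fallback.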
Now, the second problem was visibility: Google must somehow know that our blog exists, and every new post must be indexed. To do that, we need to add some tools and get our site covered by Google (or any other search engine you want).
Think about it for a moment: even if you create a site and deploy it, Google will know nothing about it unless you somehow tell Google that your site is there.
A sitemap, as the name says, is a simple XML file with all of your site's routes. The sitemap of this blog can be found at https://xnv.io/sitemap.xml. Go ahead, take a look.
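If you've never seen one, a minimal sitemap looks like this (the URL and date below are illustrative, not taken from this blog's real sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://xnv.io/some-post/</loc>
    <lastmod>2021-03-01</lastmod>
  </url>
</urlset>
```

Each `<url>` entry is one route the crawler should visit, optionally with the date it last changed.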
To create this sitemap I used another Gatsby plugin, the Gatsby Plugin Advanced Sitemap (https://www.gatsbyjs.com/plugins/gatsby-plugin-advanced-sitemap/). It generated the sitemap you're seeing there. Very easy, right?
Now we just need to add the sitemap to Google Webmasters, right? No, before we do that we must add another file to our site: the robots.txt.
This is the shiny file that tells any web crawler which pages it can (or cannot) crawl. You basically open your site to the crawlers, telling them which pages (or directories) are available to be indexed by Google.
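A typical robots.txt for a blog like this is just a few lines (illustrative; check the link below for this blog's actual file):

```
User-agent: *
Allow: /
Sitemap: https://xnv.io/sitemap.xml
```

It opens the whole site to every crawler and points them straight at the sitemap.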
Again, using Gatsby this is very easy to do: all you need is to search the Gatsby plugins repository and find a good one that fits your needs. In this case I used the Gatsby Plugin Robots TXT (https://www.gatsbyjs.com/plugins/gatsby-plugin-robots-txt). Pretty simple, and we can check the result here: https://xnv.io/robots.txt
Now we're ready to enter the webmasters tool.
Google Webmasters is the coverage paradise: here you can see how many pages of your blog were indexed and which pages were not. You can even request a new indexing of any new page. Note that, to get access, you need to verify that you own the domain.
Here I added all of my sitemap.xml files in the Sitemaps section, and I also started indexing some of the pages one by one. After that, I could see some results on Google when I typed site:xnv.io
With this query, you can see which pages of your site Google knows about. It won't show all of the indexed and crawled pages at first, since Google does the indexing from time to time. If you're trying to improve your site's performance or visibility, go to Google and just type site:yourdomain.com and it will show all the pages it knows there.
Doing all of this, I reached a better ranking for mobile, which increased from 43 to 72 (sometimes 75). On desktop the results were even better: we went from 65 to over 95.
Here are my results while I'm writing this post.
I know, it was a bit of work, but c'mon, nothing you cannot do in one day of work and a few cups of coffee. With these improvements we are also seeing better results on Google Analytics and more clicks on our blog.
Thanks, folks! I hope you enjoyed the post, and if you have any questions about anything here, please let me know at https://www.linkedin.com/in/rhuankarlus/