Launching a Supersonic Site: Some things I learned

Chris Baume | 12 Jan 2024 | 5 mins

With the start of a new year, and some more time to dedicate to this little venture, it’s time for a new website. I hope you’re enjoying it! I’ve written this post to explain how I went about creating it – partly to make notes for my future self, but also in the hope that it might be useful to someone else.

It’s worth saying upfront that if you need a website, the best thing to do in almost all cases is to use a service such as Squarespace, Wix, or WordPress. They make it easy and convenient, allowing you to focus on the important aspects of building a website, such as having a clear message and high-quality content.

But where’s the fun in that? If you’re anything like me, then you’d much rather build something from scratch and spend most of your time tweaking the layout, adding new features, optimising it to load faster, or streamlining your deployment system. And that’s exactly where I focused my efforts, which hopefully doesn’t show. The upside is that I learned quite a lot from doing it all, so here are some of the neat things I discovered along the way.


Going into this I had some high-level aims which, with the benefit of hindsight, are:

  • Modern - responsive site that looks and feels like a website from this century
  • Inexpensive - costs me very little to build and run (I’m excluding my time here!)
  • Efficient - loads quickly and doesn’t require lots of JavaScript or bandwidth
  • Privacy-focused - respects the privacy of its visitors and doesn’t need a cookie banner
  • Low maintenance - requires me to do very little once up, and is easy to update


The content of this site is fairly fixed and doesn’t change much day-to-day, so to build it I used a static site generator called Jekyll. If you haven’t come across SSGs before, they allow you to generate HTML programmatically using templates. Jekyll was one of the first and remains very popular, partly because you can use it to publish sites for free on GitHub Pages.
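To give a flavour of how this works, a Jekyll page is just HTML with some front matter and Liquid templating – here’s a minimal sketch (the layout name and post limit are illustrative):

```html
---
layout: default
title: Recent posts
---
<!-- Liquid fills in values from the front matter and site data -->
<h1>{{ page.title }}</h1>
<ul>
  {% for post in site.posts limit: 5 %}
    <li><a href="{{ post.url }}">{{ post.title }}</a></li>
  {% endfor %}
</ul>
```

At build time, Jekyll renders this into a plain static HTML file that can be served from anywhere.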

There is an ecosystem of plugins you can use to add various features to Jekyll. Some of the plugins I used include jekyll-feed to generate an Atom feed, jekyll-seo-tag to add meta descriptors, and jekyll-target-blank to open external links in a new tab.
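For reference, plugins like these are enabled via the site configuration (they also need to be listed in the Gemfile’s `:jekyll_plugins` group) – a sketch, with versions omitted:

```yaml
# _config.yml — enable the plugins used on this site
plugins:
  - jekyll-feed
  - jekyll-seo-tag
  - jekyll-target-blank
```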


For the UI, I used a CSS framework called Bulma. This was my first experience with it, but I was super impressed by the design of all the components, the quality of documentation, and the styling it produced. I’d definitely recommend it to others and use it again.

The look-and-feel was vastly improved by customising the fonts. I used Google Fonts, which has a huge selection of openly-licenced fonts with an impressively easy and efficient setup. In this case, I used Lexend for titles and headers, and Poppins Light for the body text.
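For anyone curious, the Google Fonts setup amounts to a few lines in the page’s `<head>` – the exact URL is generated for you by the Google Fonts site, so the weights here are illustrative:

```html
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Lexend&family=Poppins:wght@300&display=swap"
      rel="stylesheet">
```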

With Bulma, the primary downside I found was that it contains too much CSS. Even when you only import modules that you need, it still produces >200kB files. This is a problem because the browser has to download and parse all of the CSS before it starts rendering the page, which can delay loading by several seconds on mobile. However, I found a solution to this…
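To give a sense of the per-module imports, Bulma lets you pull in individual pieces from SASS rather than the whole framework – a sketch (paths as in Bulma 0.9.x; newer releases restructure these, so check your installed version):

```scss
// main.scss — import only the Bulma modules the site actually uses
@import "bulma/sass/utilities/_all";
@import "bulma/sass/base/_all";
@import "bulma/sass/elements/button";
@import "bulma/sass/layout/section";
```

Even this selective approach still left a lot of unused CSS, hence the need for the purging step below.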


Initially the site scored pretty poorly on Lighthouse, mostly due to the high render delay from having so much CSS. This wasn’t caused just by Bulma, but also by FontAwesome which, by default, includes all of the icons. After some research, I discovered PurgeCSS and the accompanying plugin jekyll-purge-css. It analyses the site’s HTML to see which bits of the CSS are actually used, and discards the rest. In my case, this reduced the total CSS from 388kB to 53kB – an 86% reduction! Setting up SASS in the Jekyll config to minify the CSS – which, for some reason, is not on by default – brought it down further to 45kB.
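The minification setting is a one-liner in the Jekyll config:

```yaml
# _config.yml — compress the generated CSS (not the default)
sass:
  style: compressed
```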

The other major slowdown was caused by having poorly compressed images that are too big for the viewport. This can be helped by using the <picture> element to provide images in a variety of resolutions for differently-sized viewports. However, it’s a faff to have to create multiple versions of each image and configure which to display. Fortunately I discovered the amazing plugin jekyll-picture-tag. Not only does it automatically configure the picture tag for different sizes, but it generates all of the images for you, and uses modern compression algorithms like WebP. It makes a huge difference to the amount of bandwidth required, and I’d definitely recommend using it.
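Usage is a single Liquid tag in place of an `<img>` – something like the sketch below, where the preset name and path are illustrative (presets are defined in the plugin’s `_data/picture.yml` config file):

```liquid
{% picture thumbnail assets/photo.jpg --alt A description of the photo %}
```

From that one line, the plugin emits a full `<picture>` element with `srcset` variants at multiple resolutions and formats.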

With the big problems out of the way, the last piece of the puzzle is using resource hints to load files in the most efficient order. Adding rel=preload to critical stylesheets and images means they are downloaded in advance of being requested by the rendering engine. Adding async to non-critical scripts allows the parser to continue without having to wait to process the code. And rel=preconnect hints that a connection will need to be made to a certain domain, to fetch fonts for example, and so these are set up in advance of being needed.
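Put together, the hints look something like this in the page’s `<head>` (file names illustrative):

```html
<!-- fetch critical resources before the parser discovers them -->
<link rel="preload" href="/assets/main.css" as="style">
<link rel="preload" href="/assets/hero.webp" as="image">
<!-- warm up the connection for third-party fonts -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<!-- don't block HTML parsing on non-critical scripts -->
<script async src="/assets/widget.js"></script>
```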


I could have used GitHub Pages to deploy and serve the website, but unfortunately it doesn’t allow commercial sites or natively support many of the Jekyll plugins. I had heard good things about Netlify, so decided to give it a try, especially as it can handle any plugin. I was super impressed by how easy it made everything – I just had to connect my GitHub account and it does the rest. It automatically redeploys when changes are pushed, and creates staging websites for any pull requests. Everything is served on their CDN, which includes Brotli compression and HTTP/2 for an extra speed boost. And so long as I stay below 100GB of bandwidth a month, it should be free to operate.
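For the record, the build can also be pinned down in a `netlify.toml` at the repository root rather than relying on the web UI – a sketch with illustrative values:

```toml
# netlify.toml — tell Netlify how to build and where the output lives
[build]
  command = "jekyll build"
  publish = "_site"
```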


Clearly, I’d like to see how many people visit this website so I can get some sense of what’s happening. But I’d like to do that in a way that preserves people’s privacy and doesn’t require an obnoxious cookie banner. I had read good things about Plausible and I really like their philosophy, but for small websites the pricing is a bit steep. I considered self-hosting it, but it would require me to run a server 24/7 with at least 4GB of RAM, which feels a bit wasteful.

The solution I landed on was Umami – an open-source self-hosted analytics library that doesn’t use cookies or track users. Crucially, as it’s built using Next.js it can be deployed using serverless functions on Netlify, so doesn’t require a dedicated server to run. It does need a database though, and for this I went with the Heroku Postgres database-as-a-service. You can spin up an EU-based database for only $5/month which is great value, especially if you use it for multiple services.
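Once deployed, Umami just needs its tracking snippet in the site’s `<head>` – the domain and website ID below are placeholders for your own deployment:

```html
<script async src="https://stats.example.com/script.js"
        data-website-id="00000000-0000-0000-0000-000000000000"></script>
```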

I hope you found some of that useful. If you learned something new, or think I missed anything, then I’d love to hear from you 😊

