Setting up blog deployments
Posted on November 2023 in Web development

Prehistory
After deciding how the blog will be built, it's time to figure out how to deploy it. A few years ago I lived in a slightly different place, where I had optic fiber and an almost static IP address reachable from the internet. When you have that and a couple of old laptops, you basically have a server. I used to play with Linux servers: first Apache, then nginx, and later I even wanted to set up my own bare metal k8s cluster. And all the fun with VPNs and firewalls (I especially liked port knocking to connect to the SSH server). It's all fun and games, because if you lock yourself out, you just open the laptop lid and fix whatever magic you implemented. But then I had to move, and I don't have that luxury anymore. So here I am today, looking into the cheapest ways to host my blog. Not because I want (or even have) anything interesting to say, but because I just want it to be finally up.
Hosting options
A quick search on the internet shows that there are quite a lot of options, and I felt very out of touch with the current state of things. Since the search results showed a bunch of names I'd never heard of, I reevaluated my requirements. I want:
- to have the option to start using k8s in the future (most likely not, but still) without switching to another provider
- not to go with AWS (I've heard quite a few stories about unexpectedly high bills, and I still have to pay my mortgage)
- to avoid Google and Azure as well, since they seemed similar to AWS in terms of pricing
That left me basically with DigitalOcean and Linode. I've heard of both of them, but the reviews I found were more favorable towards DigitalOcean, especially its self-service. Plus they have a free tier and prepaid plans. So I went with them; nothing much to think about here. After looking around my new account, I found that it's not all that great for deploying my blog: I can't just upload my static files and be done. DigitalOcean has a bunch of services, but none of them is a simple file upload. At least they offer free static site hosting. Going with what they offer, though, I can't deploy a Pelican site without building it first. Bummer. So it seems I'll have to build it somewhere else and then trigger DigitalOcean to fetch it: whether that means committing built files or running a CI/CD pipeline, it will have to happen outside of DigitalOcean.
That made me look into GitHub Pages. I'd heard about it but never used it. If I didn't have my own domain, I could just use GitHub Pages and be done with it. But I really want https://blog.ramoska.eu, not https://ramoska.github.io/blog. Another thing I didn't like about GitHub is that I'd need a Pro account to have GitHub Pages for private repositories, and I'm not sure I want that. On the other hand, I don't really need a private repo for content I'll be publishing anyway. For now I'll go with the GitHub Pages setup but leave my repo private, and leverage that setup to trigger DigitalOcean to fetch the built files and deploy them. That way I won't block myself from switching between GitHub Pages and DigitalOcean static site hosting.
GitHub Pages and DigitalOcean setup
OK, to set up GitHub Pages I'll make the repo public (at least for now), add a gh-pages branch, and unignore the output directory, ideally for that branch and that branch only. Let's start by publishing the gh-pages branch with the current main branch content. And I instantly ran into limitations: you can only use the root or /docs dir as the source for GitHub Pages (a limitation of Jekyll, which GitHub uses to power Pages?). I guess I'll go with the root dir, mainly because I don't want a /docs dir (sounds weird), and also because I won't duplicate content too much: only the built HTML files will live in that branch. It still feels somewhat weird not to be able to provide a custom source dir, and creating a separate branch that is totally different from main feels even weirder. Anyway, moving on to the next steps. At least DigitalOcean is fine with an output dir to deploy from. Since I'm elbows deep in Python, let's try the ghp-import package and see if it makes my life easier.
```shell
ghp-import -b gh-pages -p output
```
And what it did is recreate the gh-pages branch with the content of output in the root dir and push it to the repo. That's exactly what I wanted. Now let's fix the DigitalOcean deployment to fetch content from the output dir. For https://ramoska.github.io/blog/ it is broken: styles are not loaded. That's because I'm using relative URLs for now, but Pelican renders them starting with /, which would be OK if not for the fact that GitHub Pages serves the content from the /blog dir. I could fix that with the SITEURL setting, but I won't: I'll be using DigitalOcean static site hosting, and I know how to fail over to GitHub Pages if I need to.
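If I ever did need the GitHub Pages fallback to render correctly under /blog, the fix would be a sketch along these lines in publishconf.py (SITEURL and RELATIVE_URLS are real Pelican settings; using the project-pages URL here is my assumption):

```python
# publishconf.py -- hypothetical fallback settings for serving the site
# from the /blog subdirectory on GitHub Pages.
SITEURL = "https://ramoska.github.io/blog"  # assumed project-pages URL
RELATIVE_URLS = False  # emit absolute URLs rooted at SITEURL instead of "/"
```

With RELATIVE_URLS disabled, Pelican prefixes generated links with SITEURL, so stylesheets resolve under /blog instead of the domain root.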
With that covered, I add a CNAME record in Cloudflare and I'm done with the blog setup: it will basically be the GitHub Pages setup, except the repo will be private and I'll be using DigitalOcean static site hosting.
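For reference, the record would look roughly like this in zone-file notation (the DigitalOcean app hostname below is a made-up placeholder, not my real target):

```
blog.ramoska.eu.  300  IN  CNAME  my-blog-app.ondigitalocean.app.
```

In Cloudflare's UI this is just a CNAME entry for the blog subdomain pointing at whatever hostname the hosting provider assigns.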
Automatic deployments
The only thing left is automatic deployments. I don't really want to build the blog locally and then run ghp-import to push it to the gh-pages branch. I could do that, but I don't really want to. That leaves me with GitHub Actions. Looking into it, I think the 2000 free minutes per month will be enough for me. If not, I can always make the repo public, which, as I understand it, gives unlimited build minutes. Pelican has a tip on how to deploy to GitHub Pages, and I think it's almost what I need. Looking into the workflow file, there doesn't seem to be any magic involved. Let's create a similar GitHub Actions config file and see what happens.
Looking into the Actions docs, I see I need to create a .github/workflows dir and put my build/deploy config there. Let's do that and define the jobs:
```yaml
name: Build blog
on:
  push:
    branches: [ "main" ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.11"
      - name: Install requirements
        run: pip install pelican
      - name: Build Pelican site
        run: |
          pelican \
            --settings "publishconf.py" \
            --output "output"
      - name: Reset gh-pages branch
        run: |
          git checkout --orphan gh-pages
          git --work-tree output add --all
          git --work-tree output commit -m "GitHub Pages deployment"
          git push origin HEAD:gh-pages --force
```
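One caveat I'm not sure applies to my repo: GitHub can configure the workflow's default GITHUB_TOKEN as read-only, in which case the push would fail with a 403 and the workflow would need explicit write permission, something like this at the top level:

```yaml
# Only needed if the default GITHUB_TOKEN is read-only for the repo;
# grants the workflow permission to push the gh-pages branch.
permissions:
  contents: write
```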
And that fails because git on the pipeline doesn't have user info to write commits. Prepending a few lines to the Reset gh-pages branch step fixes that:
```yaml
...
      - name: Reset gh-pages branch
        run: |
          git config user.name github-actions
          git config user.email [email protected]
          git checkout --orphan gh-pages
          ...
```
Now I have automatic builds: every push to the main branch triggers a build, which in turn triggers DigitalOcean to deploy the new content. I'm happy with that and consider this task done.
This is part 4 of the Resurrecting blog series.
- Previous posts:
- Stick to Pelican or switch to Writerside
- Create blog site with Pelican
- Create site with Writerside
Tags: Pelican, CI/CD, Deployments, GitHub Pages