Easy CI/CD with Gitlab Pipelines

Blog post updates published automatically using GitLab's pipelines feature


Blogging is easy with a platform like Medium or Dev.to, but it’s more fun with a more DIY approach. In an earlier blog post I explained why I settled on Jekyll as the framework for my blog; this post explains in more detail how the whole thing works.

Configure Your Dev Environment

Jekyll is built with Ruby, so you will need some Ruby tooling installed. This is straightforward on macOS and Linux, and following the quickstart guide will get you up and running with a basic site. The Jekyll installation instructions state that it’s not officially supported on Windows, but it is possible to get it working: when working with Jekyll on Windows I use Ubuntu via WSL and edit the source using VSCode, which works well.
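Assuming Ruby and RubyGems are already installed, the quickstart boils down to a handful of commands; my-blog is just a placeholder name here.

gem install bundler jekyll # install Bundler and the Jekyll gem
jekyll new my-blog # scaffold a new site into ./my-blog
cd my-blog # the generated Gemfile, _config.yml and _posts directory live here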

During development, running

bundle exec jekyll serve 

will spin up a local development server that watches for changes to files and recompiles automatically.
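Two optional flags are worth knowing about: --drafts includes unpublished posts from the _drafts directory, and --livereload (available in recent Jekyll versions) refreshes the browser automatically whenever a file changes.

bundle exec jekyll serve --drafts --livereload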

Authoring content

Creating a new blog post is as simple as adding a new markdown file to the _posts directory with a filename that matches the pattern YEAR-MONTH-DAY-title.md.

The post must start with the assignment of some basic values known as front matter; these usually include the title, description and image for the post. This is then followed by the content of your post, written as markdown.

---
layout: post
title: "Easy CI/CD with Gitlab Pipelines"
description: Blog post updates published automatically using GitLab's pipelines feature
image: '../../assets/img/gitlab-pipelines.jpg'
category: 'devops'
tags:
- devops
- gitlab
- ci/cd
- jekyll
- docker
twitter_text: Blog post updates published automatically using GitLab's pipelines feature
introduction: I walk through the simple setup I use to deploy updates to this blog site
---

The development workflow of editing and saving a post in VSCode, then refreshing the browser window, has for the most part been fairly streamlined. Occasionally, though, one tiny mistake in the front matter has caused the whole site to stop compiling, with fairly unhelpful error messages. Sometimes it has taken quite some time to slowly work back through my commit history to find the change that caused the failure. Frustrating, but not enough to put me off using the platform.
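One trick that fits this situation is git bisect, which can do the commit-history detective work automatically. This is only a rough sketch; the commit hash is a placeholder for a commit that is known to build cleanly.

git bisect start
git bisect bad # the current commit fails to build
git bisect good 1a2b3c4 # a commit known to build (placeholder hash)
git bisect run bundle exec jekyll build # git tests each candidate commit and reports the first bad one
git bisect reset # return to the original branch when finished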

Publishing content

My blog site's Jekyll project is a git repo that lives on GitLab. Even though I am the only contributor on this project, pushing changes to GitLab is a great way of 1) keeping an offsite backup of the work and 2) providing an easy way to look back through the change history when trying to identify when a bug was introduced. I host my personal projects on cheap Ubuntu VMs with Linode.

Once I have finished a new blog post and pushed the changes to the GitLab repo, I need to deploy the updates to my web server. Rather than this being a separate step that must be performed manually (connecting over SSH to copy files to a web server by hand is no way to live), we can leverage pipelines in the GitLab repo to handle it automatically.

To enable this feature we create a file called .gitlab-ci.yml in the root of the repo. This YAML file declares the steps that we would like performed each time we push changes to GitLab.

The steps we need GitLab to perform are:

  1. Run the production build process to generate the final static assets (HTML, JS, CSS, etc.) from the various markdown files
  2. Take the resulting files and copy them to the appropriate directory on the production web server

The first step uses a Docker image that comes ready prepared with all the tools required for performing a Jekyll build. The second step only needs to connect to the remote server and copy the assets across. Because the requirement is so simple, I’m using the standard Alpine image, which is popular because of its tiny size; the downside is that it comes with very little functionality out of the box, so we need to install rsync and openssh ourselves. Installing these two packages is time consuming and must happen every time the job runs, so ideally I should find an image that comes preconfigured with rsync and openssh, or publish my own Docker image with these included. Something for me to revisit eventually.
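If I do end up publishing my own image, the Dockerfile would be tiny; this is only a sketch of the idea, not something the pipeline below actually uses.

FROM alpine:latest
# bake rsync and the OpenSSH client into the image so the deploy job doesn't reinstall them on every run
RUN apk add --no-cache rsync openssh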

The .gitlab-ci.yml file, with comments for explanation:

build site: # name of the first job, the Jekyll build process generates the static site from source
  image: jekyll/jekyll:latest # uses the latest Jekyll image from hub.docker.com
  stage: build # I'm not leveraging stages, can be used to run some jobs in parallel
  script: # these are the commands that are executed to do the work
    - bundle install # install any dependencies that are specified in the Gemfile
    - bundle exec jekyll build # run Jekyll's production build process, output goes in _site
  artifacts: # the artifact is the output from this job that we want to use in subsequent jobs
    expire_in: 1 week # in this case it's regenerated each time so expiry doesn't really matter
    paths:
      - _site # we take the contents of our build directory _site and make it available to subsequent jobs

deploy:
  image: alpine # Docker image that's very lean and missing openssh and rsync, not ideal
  stage: deploy # as mentioned, not making use of this
  script: # these are the commands that are executed to do the work
    - apk add --no-cache rsync openssh # install the missing packages in the docker container :(
    - mkdir -p ~/.ssh # create a directory for SSH keys in the home directory
    - echo "$SSH_PRIVATE_KEY" >> ~/.ssh/id_dsa # SSH key is stored as a variable in Gitlab, take the variable and echo it into a file, keeps SSH key out of source code.
    - chmod 600 ~/.ssh/id_dsa # update permissions on the SSH key file
    - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config # set some configuration on SSH
    - rsync -rav --delete _site/ user@remote-server:/var/www/blog.ae.id.au/ # rsync the files across to the server
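The deploy job assumes a private key is available in the $SSH_PRIVATE_KEY variable. Roughly, and with placeholder file names, setting that up means generating a dedicated key pair, authorising the public half on the web server, and storing the private half as a CI/CD variable named SSH_PRIVATE_KEY in the GitLab project settings.

ssh-keygen -t rsa -b 4096 -f deploy_key -N "" # generate a dedicated deploy key pair with no passphrase
ssh-copy-id -i deploy_key.pub user@remote-server # authorise the public key on the web server (or append it to ~/.ssh/authorized_keys by hand)
# the contents of the private key file deploy_key go into the SSH_PRIVATE_KEY variable in GitLab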

Now, when I push updates to the repo, the build and deploy steps run automatically; I can log into GitLab to view progress on the job, and if an error occurs I am notified via email. Leveraging pre-built Docker images within GitLab pipelines really cuts down on the amount of configuration required to assemble a complete CI/CD process. If your repo has automated tests it’s trivial to add an additional step to run them and stop the deployment if any of them fail, as sketched below.
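For illustration, a test job slotted in between the two existing jobs might look something like this; the html-proofer check is only an example and assumes the html-proofer gem is in the Gemfile, so substitute whatever test command your project actually uses.

test site: # hypothetical job, runs after "build site" and before "deploy"
  image: jekyll/jekyll:latest
  stage: test # build, test and deploy are the default GitLab stages
  script:
    - bundle install
    - bundle exec htmlproofer ./_site # check the generated site for broken links; a failure stops the pipeline before deploy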

Check out the GitLab docs for more details on configuring CI/CD pipelines.