Continuous Deployment of Blog Posts

In my first and second posts on this topic I talked about setting up my static blog. This post is about deploying changes in a continuous fashion. Let’s face it: one of the less pleasant things about running a static blog is not being able to create and edit content from any computer. There are several ways to continuously deploy changes, but as always, I chose the most frugal path: a simple Wercker build script.

Desktop to the cloud

I chose GitHub as my remote storage for raw content. Combined with S3 and CloudFront, my overall deployment flow looks something like this:

I have covered everything outside the big box on the left in previous posts, so here I will touch on the components inside the box. GitHub and git are fairly common, and there is nothing special about my setup. Here’s my GitHub repository if you are interested in taking a quick peek. I will jump right in to the Wercker build file.

Set Wercker up

Wercker builds are driven by YAML config files. This is how my config looks:

box: python
build:
  steps:
    - arjen/hugo-build:
        version: "0.36.1"
        theme: kiss
        flags: --buildDrafts=true
        dev_flags: -D -F -v
        config: config.toml
deploy:
  steps:
    - pip-install:
        requirements_file: ""
        packages_list: "certifi"
    - s3sync:
        key-id: $AWS_ACCESS_KEY
        key-secret: $AWS_SECRET_ACCESS_KEY
        source-dir: $SOURCE_PATH
        bucket-url: $DESTINATION_BUCKET
        opts: --acl-public
        delete-removed: false

There are two broad concepts in Wercker: pipelines and steps. As you have probably guessed, a pipeline is a deployment target, and it consists of one or more steps, each performing an atomic task. In my case I have two pipelines, one dependent on the other: a build pipeline that builds my static site and a deploy pipeline that pushes the artifacts to S3.

The Build pipeline

My build pipeline consists of only one step: arjen/hugo-build. Wercker lets pipelines consume steps created by other developers and shared through its marketplace. This particular step accepts a handful of arguments, compiles the files from my git repository, and stores the generated artifacts.
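To make the mapping between the step’s properties and a Hugo invocation concrete, here is a small Python sketch. This is purely illustrative: the real arjen/hugo-build step is not written this way, and the function name is my own.

```python
def hugo_command(theme=None, flags="", config="config.toml"):
    """Hypothetical sketch: build a hugo command line from step properties.

    Mirrors how the properties in my wercker.yml (theme, flags, config)
    could end up as flags to the hugo binary.
    """
    cmd = ["hugo", "--config", config]
    if theme:
        cmd += ["--theme", theme]
    cmd += flags.split()  # e.g. "--buildDrafts=true" from the config above
    return cmd

# With the values from my config:
cmd = hugo_command(theme="kiss", flags="--buildDrafts=true")
# cmd == ["hugo", "--config", "config.toml", "--theme", "kiss", "--buildDrafts=true"]
```

The version property, by contrast, controls which Hugo binary the step downloads rather than a flag it passes.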

The Deploy pipeline

My deploy pipeline consists of two steps: a pip-install step that installs the certifi library, which provides the root certificates required by the next step, and an s3sync step that syncs the artifacts produced by the build pipeline to my S3 bucket. Both pip-install and s3sync are available in the Wercker marketplace.
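One option worth calling out is delete-removed: false. It means a file removed from the generated site is left untouched in the bucket rather than deleted. A tiny pure-Python sketch of that one-way sync semantics (my own illustration, not the step’s actual implementation):

```python
def plan_sync(local_files, bucket_keys, delete_removed=False):
    """Plan a one-way sync from the generated site to the bucket.

    Returns (uploads, deletions). With delete_removed=False, as in my
    config, nothing is ever deleted from the bucket.
    """
    uploads = set(local_files)  # everything present locally gets pushed
    deletions = set(bucket_keys) - set(local_files) if delete_removed else set()
    return uploads, deletions

# A post removed from the repo ("old-post/index.html") stays in the bucket:
uploads, deletions = plan_sync(
    {"index.html", "about/index.html"},
    {"index.html", "old-post/index.html"},
)
```

Leaving deletion off is a deliberately safe default for a blog: a botched build that produces too few files cannot wipe published pages.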

And that’s all I need to continuously deploy my static site. If you are curious about how my builds look, here they are.