Automatically building a Docker Container using Gitlab CI

Many developers have hosted their projects on Github for quite some time. While Github is a truly amazing service, Gitlab seems to be gaining ground with some pretty amazing features of its own. Two features in particular are wonderful when you work a lot with Docker: Gitlab CI and the Gitlab private Container Registry.

Previously I had to set up my own Jenkins CI instance and a private registry on my VPS, which meant maintaining a lot of fragile infrastructure myself. The hosted version of Gitlab reduces that maintenance by providing these tools out of the box.

To make use of this you need to take three steps:

  1. Set up a new project on Gitlab
  2. Have Gitlab CI and the Gitlab Private Docker registry enabled for your project and
  3. Create a Dockerfile and a Gitlab CI configuration file (.gitlab-ci.yml)

Steps 1 and 2 are well documented on the Gitlab site and are a matter of filling in the right fields and making sure the Gitlab CI and Container Registry options are enabled; as such I won’t be describing these steps. If you have questions about them, please leave a comment.

To create a Dockerfile in your own repository you can take inspiration from this example, which sets up a basic PHP application running on Apache:

# Official PHP 5.6 image with Apache preconfigured
FROM php:5.6-apache

# Copy the application code into Apache's document root
COPY . /var/www/html

# Install the mysql extension and enable mod_rewrite
RUN docker-php-ext-install -j$(nproc) mysql && a2enmod rewrite

The only thing the above does is use the official PHP base image that runs on Apache, copy the contents of the current working directory into the document root, and ensure that the mysql extension is installed and mod_rewrite is enabled.

When using this technique, remember that you can use a file called .dockerignore to exclude specific files from your Docker image. This file works the same way as a .gitignore file and is a handy way to keep development resources out of your production image.
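For example, a minimal .dockerignore for a project like this might look as follows (the entries here are only illustrative; list whatever your own project should leave out):

```
.git
.gitlab-ci.yml
.dockerignore
tests/
docs/
```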

Next up: making sure that Gitlab CI creates a Docker image and uploads it to the private registry. This is a matter of creating a file named .gitlab-ci.yml in the root of your project that instructs Gitlab how to do that, and pushing that together with the rest of your changes.

Here is an example .gitlab-ci.yml that I have used for this blog.

build_image:
    image: docker:git
    services:
        - docker:dind
    script:
        - docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN registry.gitlab.com
        - docker build -t registry.gitlab.com/[YOUR ORG]/[YOUR REPO] .
        - docker push registry.gitlab.com/[YOUR ORG]/[YOUR REPO]:latest
    only:
        - master

What this configuration instructs Gitlab to do is use the docker:git image as the basis for building the project and to load the Docker-in-Docker service (docker:dind). Docker-in-Docker is a service needed to build Docker images from within a Docker container, since Gitlab CI itself runs each build inside a container.

By the time your script runs, Gitlab will already have a checkout of your code present inside the container. Inspecting the three lines in the script section of the .gitlab-ci.yml, you can see that it logs in to the Gitlab private registry, builds a Docker image using the Dockerfile in your repository and then pushes that image to your own private registry.

Don’t forget to replace the placeholders [YOUR ORG] and [YOUR REPO] with the names of your organisation and repository on Gitlab. The text $CI_BUILD_TOKEN should not be replaced; it is an environment variable, provided by Gitlab CI, containing the token used to sign in to your own private registry.

After committing and pushing your new Dockerfile and .gitlab-ci.yml, Gitlab will immediately start running your build, and the resulting Docker image will be available in your private registry as registry.gitlab.com/[YOUR ORG]/[YOUR REPO]:latest.
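Once the build has finished you can pull and run the image anywhere Docker is installed. A quick sketch, using the hypothetical names your-org and your-repo in place of the placeholders:

```shell
# Hypothetical image name; substitute your own organisation
# and repository for "your-org" and "your-repo".
IMAGE=registry.gitlab.com/your-org/your-repo:latest

# Sign in with your Gitlab credentials, pull the freshly
# built image and run it, exposing Apache on host port 8080.
docker login registry.gitlab.com
docker pull "$IMAGE"
docker run -d -p 8080:80 "$IMAGE"
```

After this, the PHP application should answer on http://localhost:8080.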

Using this technique you can even go as far as setting up Continuous Deployment by adding another build step to the .gitlab-ci.yml that will provision your cluster to upgrade this service. In a future blog post I will describe how to do that using the Docker Orchestration tool Rancher.
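As a rough sketch of what such a deployment step could look like, you could add a second job to the .gitlab-ci.yml (the deploy.sh helper here is hypothetical; the actual Rancher integration is the subject of that future post):

```yaml
deploy:
    image: docker:git
    script:
        # Hypothetical helper script that tells your cluster to
        # pull and restart the service with the new image
        - ./deploy.sh registry.gitlab.com/[YOUR ORG]/[YOUR REPO]:latest
    only:
        - master
```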
