My Multi-Repository CI/CD Pipeline
Two repositories to run my site. Two.
I really hate myself, I guess.
Read on to see how I can commit to either repository and have it compile and deploy to my VPS in less than a minute.
My setup
Here are the facts about my setup:
- The site is hosted on a Linux VPS
- I have two repositories: one for the structure and one for the content
- I am using the SaaS version of GitLab (gitlab.com)
- The site is compiled using Hugo
The only Docker involved is the containers GitLab spins up to build and deploy the site.
Goal
The goal of this project was to keep my content separated from structure and increase the ease of building and deploying through automation.
Steps
Get a simple build working
First we start with the `.gitlab-ci.yml` for the structure repository. To make sure this is working, we'll add steps to download the content repository and run the `hugo` command.
stages:
  - build

build:
  stage: build
  image: dettmering/hugo-build
  before_script:
    - mkdir content
    - "wget -O content.tar.gz https://gitlab.com/some-user/content/-/archive/master/content-master.tar.gz?private_token=${GITLAB_PRIVATE_TOKEN}"
    - tar --strip-components=1 -zxf content.tar.gz -C content
  script:
    - "hugo"
  artifacts:
    paths:
      - public
  only:
    - master
This looks a bit busy, so here is a breakdown of what is occurring:
- Stages are defined. We are going to start with one.
- The `build` stage uses the Docker image `dettmering/hugo-build`.
- In the `before_script` we do the following:
  - Make a `content` directory.
  - Download your content repository. You will need a private token saved as a pipeline variable; you can get the download link from the UI.
  - The downloaded archive has a top-level directory named after your repository (`${repository-name}-${branch}`); the `--strip-components=1` flag removes that first level when extracting (a runnable sketch of this follows the list).
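If you want to sanity-check that download-and-extract dance outside of CI, here is a minimal local sketch. It assumes `GITLAB_PRIVATE_TOKEN` is exported in your shell and reuses the placeholder project path from the example above; swap in your own.

  # Grab the content repository archive, same as the before_script does.
  mkdir content
  wget -O content.tar.gz "https://gitlab.com/some-user/content/-/archive/master/content-master.tar.gz?private_token=${GITLAB_PRIVATE_TOKEN}"

  # Every path in the archive is prefixed with content-master/ ...
  tar -ztf content.tar.gz | head

  # ... so --strip-components=1 drops that prefix and the files land
  # directly inside content/, where Hugo expects to find them.
  tar --strip-components=1 -zxf content.tar.gz -C content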
Add in a deployment step
Now it is time to update the YAML file with a deploy step. First, add a new stage to the `stages` section.
stages:
  - build
  - deploy
Next, add the deploy step below the build step.
deploy:
  stage: deploy
  image: alpine:3.7
  dependencies:
    - build
  before_script:
    - apk update
    - apk add rsync
    - apk add openssh-client
    - mkdir "${HOME}/.ssh"
    - echo "$VPS_SSH_KEY" > "${HOME}/.ssh/id_rsa"
    - chmod 600 "${HOME}/.ssh/id_rsa"
    - eval "$(ssh-agent -s)"
    - ssh-add "${HOME}/.ssh/id_rsa"
    - echo $VPS_HOST >> "${HOME}/.ssh/known_hosts"
  script:
    - rsync -hrvz --delete --exclude=_ public/ some-user@192.168.1.1:/var/www/my-site/
The important bits:
- Install the tools we need: `rsync` and `openssh-client`.
- On your VPS, create a new SSH key pair. Copy the private key into your project's pipeline variables (`VPS_SSH_KEY` here).
- On your local machine, open your `known_hosts` file (on Linux/Mac: `~/.ssh/known_hosts`). Grab the line for your server, save it to a pipeline variable (`VPS_HOST` here), and it will be appended to the pipeline container's `known_hosts` file. A sketch of both of these steps follows this list.
- Finally, sync the files to your server. Since the `public` folder is an artifact of the build stage, it is automatically inserted into this container.
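Here is a rough sketch of gathering those two pipeline variables. The key file name `gitlab_deploy` is just an example, and the `grep` reuses the placeholder server address from the rsync command above; adjust to your own setup.

  # On the VPS: create a key pair for the deploy user (no passphrase, so CI
  # can use it non-interactively) and authorize it for logins.
  ssh-keygen -t ed25519 -N "" -f ~/.ssh/gitlab_deploy
  cat ~/.ssh/gitlab_deploy.pub >> ~/.ssh/authorized_keys

  # The private half is what goes into the VPS_SSH_KEY pipeline variable.
  cat ~/.ssh/gitlab_deploy

  # On your local machine: pull the server's line out of known_hosts
  # (works as long as your entries are not hashed) and paste it into VPS_HOST.
  grep 192.168.1.1 ~/.ssh/known_hosts

If your local `known_hosts` entries are hashed, `ssh-keyscan` against the server will produce an equivalent line you can use instead.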
Trigger from another repository
Now, we want changes in the content repository to trigger the site's pipeline. The best way to do this is to use the snippet that GitLab provides when you create a new trigger.
- Go to the site structure repository.
- Go to Settings > CI / CD and expand Pipeline Triggers.
- Create a new token with any name you want.
- Copy the code below the token; it will look something like this:

  trigger_build:
    stage: deploy
    image: alpine:3.7
    before_script:
      - apk add --update curl
    script:
      - "curl -X POST -F token=TOKEN -F ref=REF https://gitlab.com/api/v4/projects/123456/trigger/pipeline"

- Change `REF` to whatever branch you want (probably `master`).
- Change `TOKEN` to the token you just created (you can also fire the trigger by hand first; see the sketch after this list).
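The `trigger_build` job itself belongs in the content repository's `.gitlab-ci.yml`, so that a push to content kicks off the site's pipeline. Before committing it, you can fire the trigger by hand to confirm the wiring; `TOKEN` and the project ID `123456` below are the same placeholders as in the snippet above.

  # Manually fire the site pipeline, exactly as the trigger_build job will.
  # Replace TOKEN with the trigger token you created and point ref at the
  # branch you want to build (master here).
  curl -X POST \
    -F token=TOKEN \
    -F ref=master \
    "https://gitlab.com/api/v4/projects/123456/trigger/pipeline"

If the call succeeds, a new pipeline should appear under the structure repository's CI / CD > Pipelines page.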
Conclusion
As you can see, this is a tad complicated because I went the route of having two repositories for my site. I also wanted things to deploy directly to my VPS instead of GitLab Pages or the like. In Part Two, I will cover how to use multiple environments and create a manual gating process.
Resources
Links that helped me get here:
- Alpine Linux: How to get regular stuff working
- Continuous Integration and Deployment with Gitlab, Docker-compose, and DigitalOcean
- How can I pass artifacts to another stage?
- Extracting .tar.gz files in current directory...
Sample gitlab-ci files: