Azure DevOps agent with Docker Compose

I’ve dealt in the past with using Docker for your Azure DevOps Linux build agent in a post called Configure a VSTS Linux agent with docker in minutes, and I’ve also blogged about how you can use Docker inside a build definition to provide prerequisites for testing (like MongoDB and SQL Server). Now it is time to move a little step further and leverage Docker Compose.

Using Docker commands in a pipeline definition is nice, but it has some drawbacks. First of all, this approach suffers in speed of execution, because the containers must start each time you run a build (and should be stopped at the end of the build). It is true that if the Docker image is already present on the agent machine the startup time is not high, but some images, like SQL Server, are not immediately operative, so you need to wait for them to be ready for every build. The alternative is to leave them running even after the build has finished, but this could lead to resource exhaustion.
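To make the scenario concrete, “Docker commands in the pipeline” means something like the following rough sketch (not the exact steps of my older post): a script step that starts a MongoDB container before the integration tests and tears it down afterwards.

steps:
- script: docker run -d --name mongo -p 27017:27017 mongo
  displayName: 'Start MongoDB before integration tests'

# ... build and test steps go here ...

- script: docker rm -f mongo
  displayName: 'Remove MongoDB container'
  condition: always()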

Another problem is the dependency on the Docker engine. If I include Docker commands in the build definition, I can build only on a machine that has Docker installed. If most of my projects use MongoDB, SQL Server and Redis, I can simply install all three on my build machine, maybe using a fast SSD as storage. In that scenario I expect to use my physical instances, not to wait for Docker to spin up new containers.

Including Docker commands in the pipeline definition is nice, but it ties the pipeline to Docker and can impose a penalty on execution speed

What I’d like to do is leverage Docker to spin up an agent and all needed dependencies once, then use that agent with a standard build that does not require Docker. This gives me the flexibility of setting up a build machine with everything preinstalled, or of simply using Docker to spin up, in seconds, an agent that can build my code. Removing the Docker dependency from my pipeline definition gives the user the most flexibility.

For my first experiment I also want to use Docker on Windows Server 2019 to leverage Windows Containers.

First of all, you can read the nice MSDN article about how to create a Windows Docker image that downloads, installs and runs an agent inside a Windows Server machine with Docker for Windows. This allows you to spin up a new Docker agent based on a Windows image in minutes (just the time to download and configure the agent).

Thanks to Windows Containers, running an Azure DevOps agent based on Windows is a simple Docker Run command.

Now I need that agent to be able to use MongoDB and SQL Server to run integration tests. Clearly I could install both database engines on my host machine and let the Docker agent use them, but since my agent is already in Docker I’d like the dependencies to run in Docker too; so… welcome Docker Compose.

Thanks to Docker Compose I can define a YAML file with a list of images that are part of a single scenario, so I specified an agent image followed by a SQL Server and a MongoDB image. The beauty of Docker Compose is the ability to refer to the other containers by name. Let’s look at an example: here is my complete Docker Compose YML file.

version: '2.4'

services:
  agent:
    image: dockeragent:latest
    environment:
      - AZP_TOKEN=efk5g3j344xfizar12duju65r34llyw4n7707r17h1$36o6pxsa4q
      - AZP_AGENT_NAME=mydockeragent
      - AZP_URL=https://dev.azure.com/gianmariaricci
      - NSTORE_MONGODB=mongodb://mongo/nstoretest
      - NSTORE_MSSQL=Server=mssql;user id=sa;password=sqlPw3$secure

  mssql:
    image: mssqlon2019:latest
    environment:
      - sa_password=sqlPw3$secure
      - ACCEPT_EULA=Y

    ports:
      - "1433:1433"

  mongo:
    platform: linux
    image: mongo
    ports:
      - "27017:27017"

To simplify everything, all of my integration tests that need a connection string to SQL Server or MongoDB grab the connection string from an environment variable. This is convenient because each developer can use the database instances of their choice, but this technique also makes it super easy to configure a Docker agent by specifying database connection strings as environment variables, as seen in Figure 1. I can specify in environment variables the connection strings to use for testing, and I can simply use the other Docker service names directly in the connection strings.


Figure 1: Environment variables used to specify connection strings.

As you can see (1), the connection strings refer to the other containers by name; nothing could be easier.

The real advantage of using Docker Compose is the ability to include the Docker Compose file (as well as the Dockerfiles for any custom images you may need) inside your source code. With this approach you can keep in the repository both the build pipelines, leveraging Azure DevOps YAML builds, and the configuration of the agent with all its dependencies.

Since you can configure as many agents as you want for Azure DevOps (you actually pay for the number of concurrently executing pipelines), thanks to Docker Compose you can set up an agent suitable for your project in less than one minute. But this is optional: if you do not want to use Docker Compose, you can simply set up an agent manually, just as you did before.

Including a Docker Compose file in your source code allows consumers of the code to start a compatible agent with a single command.

Thanks to Docker Compose, you pay the price of pulling the images only once, and you also pay only once the time needed for each image to become operative (like SQL Server or other databases that need a little time before being able to serve requests). After everything is up and running, your agent is operative and can immediately run your standard builds: no Docker reference inside your YAML build file, no time wasted waiting for your images to become operational.

Thanks to an experimental feature of Windows Server 2019, I was able to specify a docker-compose file that contains not only Windows images, but also Linux images. The only problem I had is that I did not find a Windows Server 2019 image for SQL Server. I started getting errors using the standard SQL Server images (built for Windows Server 2016), so I decided to download the official Dockerfile, change the reference image and rebuild the image, and everything worked like a charm.


Figure 2: Small modification to the SQL Server Windows Dockerfile, targeting Windows Server 2019.

Since it is used only for testing, I’m pretty confident that it should work, and indeed my build runs just fine.


Figure 3: My build result, run on the Docker agent.

Actually, my agent created with Docker Compose is absolutely equivalent to all other agents; from the point of view of Azure DevOps there is nothing different about it, but I started it with a single docker-compose command.


Figure 4: The agent running in Docker is treated as a standard agent.

That’s all: with a little effort I’m able to include in my source code both the YAML build definition and the YAML Docker Compose file that specifies the agent with all the prerequisites to run the build. This is especially useful for open source projects, where you want to fork a project and then activate CI with no effort.

Gian Maria.

Multiline PowerShell on YAML pipeline

Sometimes having a few lines of PowerShell in your pipeline is the only thing you need to quickly customize a build, without writing a custom task or keeping a PowerShell file in source code. A typical situation is writing a file whose content needs to be determined by a PowerShell script; in my case I need to create a configuration file based on some build variables.

Since using the standard graphical editor to configure a PowerShell task and then grabbing the YAML with the “View YAML” button is the quickest way to do this, you need to be warned, because you can run into the following error.

can not read a block mapping entry; a multiline key may not be an implicit key

This error happens when you put multiline text inside a YAML file with bad indentation of a multiline string. The inline PowerShell task comes in really handy, but you need to pay special attention, because the “View YAML” button in the UI sometimes generates bad YAML.

In Figure 1 you can verify what happens when I copy a task with the “View YAML” button of the standard graphical editor and paste it into a YAML build. In this situation the editor immediately shows me that the syntax is wrong. The real problem here is that Visual Studio Code with the Azure Pipelines extension did not catch the error, and you end up with a failing build.


Figure 1: Wrong YAML syntax due to multiline PowerShell command line

It turns out that the View YAML button of the classic graphical editor misses the extra indentation level needed by the content of the PowerShell script; the above task should be fixed in this way:


Figure 2: Correct syntax to include a multiline script

If you want to include an inline PowerShell script, most of the time you do not want to limit yourself to a single line, so you need the multiline string syntax: a pipe character (|) followed by a multiline string, where each newline is preserved as a regular \n. The important rule is that the string must be indented one extra level with respect to the line that introduces it, as highlighted in Figure 2. The indentation is important because the YAML parser considers the string finished when it encounters a new line with less indentation than the multiline string.

The pipe symbol at the end of a line indicates that any indented text that follows is a single multiline string. See the YAML spec – Literal styles.
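As a concrete reference, here is a minimal sketch of a correctly indented inline PowerShell step that writes a small configuration file from a build variable (the BuildConfiguration variable and the file path are just illustrative, not taken from my build):

steps:
- powershell: |
    # every line of the script body is indented one level deeper
    # than the "powershell" key that introduces it
    $config = @{ environment = "$(BuildConfiguration)" } | ConvertTo-Json
    Set-Content -Path "$(Build.ArtifactStagingDirectory)\config.json" -Value $config
  displayName: 'Write configuration file'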

This is another reason to use the online editor for YAML builds: as you can see in Figure 1, it is able to immediately spot syntax errors.

Gian Maria

Azure DevOps gems, YAML Pipeline and Templates

If you read my blog you already know that I’m a great fan of YAML pipelines instead of the graphical editor in the web UI. There are lots of reasons why you should use YAML, one for all the ability to branch the pipeline definition together with the code, but there is another really important feature: templates.

There is really detailed documentation on MSDN on how to use this feature, but I want to give you a complete walkthrough on how to start using templates effectively. Thanks to templates you can define a standard build, with steps or entire jobs, in a template file, then reference that file from the real build, just adding parameters.

The ability to capture a sequence of steps in a common template file and reuse it over and over again in real pipelines is probably one of the top reasons for moving to YAML templates.

One of the most common scenarios for me is an account with lots of utility projects (multitargeted for full framework and .NET Standard), each one with its own Git repository and the need for a standard CI definition to:

1) Build the solution
2) Run tests
3) Pack a NuGet package with semantic versioning
4) Publish the NuGet package to an Azure DevOps private package feed

If you work on a big project you usually have lots of these small projects: utilities for Castle, Serilog, security, general purpose, etc. In this scenario it is really annoying to define a pipeline for each project with the graphical editor, so moving to YAML is pretty natural. You can start from a standard file, copy it into the repository and then adapt it to the specific project, but when a task is updated you need to re-update all the projects to fix all the references. The main problem with this approach is that after some time the builds are no longer in sync and each project starts to behave differently.

So I define my template once, in a dedicated repository, then I reuse it in every project. When the template changes, I want to be able to manually update all pipelines to reference the new version or, even better, decide which projects will be updated automatically.

Let’s start with the real build file, the one included in the actual repository, and check how to reference a template stored in another repository. The only limit is that the repository must be in the same organization or on GitHub. Here is the full content of the file.

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

resources:
  repositories:
    - repository: templatesRepository
      type: git
      name: Jarvis/BuildScripts
      ref: refs/heads/hotfix/0.1.1

jobs:

- template: 'NetStandardTestAndNuget.yaml@templatesRepository'
  
  parameters:
    buildName: 'JarvisAuthCi'
    solution: 'src/Jarvis.Auth.sln'
    nugetProject: 'src/Jarvis.Auth.Client/Jarvis.Auth.Client.csproj'
    nugetProjectDir: 'src/Jarvis.Auth.Client'

The file is really simple: it starts with the triggers (as for a standard YAML build), then comes a resources section, which allows you to reference objects that live outside the pipeline. In this specific example I’m declaring that this pipeline uses a resource called templatesRepository, an external Git repository (in the same organization) called BuildScripts and contained in the Team Project called Jarvis; finally, the ref property allows me to choose the branch or tag to use, with the standard Git refs syntax (refs/heads/master, refs/heads/develop, refs/tags/xxxx, etc.). In this specific example I’m pinning the build scripts to the hotfix/0.1.1 ref: even if the BuildScripts repository is upgraded, this build will keep referencing that version. This implies that, if I change the BuildScripts repository, I need to manually update this build to reference the newer version. If I want this definition to automatically use new versions, I can simply reference the master or develop branch.

The real advantage of having the template versioned in another repository is that that repository can use GitFlow, so every pipeline that uses the template can choose a specific version, the latest stable or even the latest development version.

Finally I start to define jobs, but instead of defining them inside this YAML file I’m declaring that this pipeline will use a template called NetStandardTestAndNuget.yaml contained in the templatesRepository resource. Following the template reference, I specify all the parameters needed by the template to run. In this specific example I have four parameters:

buildName: the name of the build; I use a custom task based on GitVersion that renames each build using this parameter followed by the semantic version.
solution: path of the solution file to build.
nugetProject: path of the csproj file that contains the package to be published.
nugetProjectDir: directory of the csproj to publish.

The last parameter could be derived from the third, but I want to keep the YAML simple, so I require the user of the template to explicitly pass the directory of the project, which will be used as the workingDirectory parameter for the dotnet pack command.

Now the real fun starts: let’s examine the template file contained in the other repository. Usually a template file starts with a parameters section where it declares the parameters it is expecting.

parameters:
  buildName: 'Specify name'
  solution: ''
  buildPlatform: 'ANY CPU'
  buildConfiguration: 'Release'
  nugetProject: ''
  nugetProjectDir: ''
  dotNetCoreVersion: '2.2.301'

As you can see the syntax is really simple: just specify the name of the parameter followed by its default value. In this example I really need only the four parameters described in the previous part; the others have sensible default values.

After the parameters section, a template file can specify steps or even entire jobs; in this example I want to define two distinct jobs, one to build and run tests and the other for NuGet packaging and publishing.

jobs:

- job: 'Build_and_Test'
  pool:
    name: Default

  steps:
  - task: DotNetCoreInstaller@0
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

As you can see, I’m simply writing a standard jobs section that starts with the job Build_and_Test, which will run on the Default pool. The job starts with a DotNetCoreInstaller step, where you can see that to reference a parameter you need the special syntax ${{parameters.parametername}}. The beautiful aspect of templates is that they are written exactly like a standard pipeline definition; just use the ${{}} syntax to reference parameters.

The Build_and_Test job continues with standard build and test tasks, and it determines (with GitVersion) the semantic version for the package.
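The post does not show those intermediate steps, but as a rough sketch, assuming plain DotNetCoreCLI tasks (not the exact content of my template, which also runs a custom GitVersion-based task), they could look like this:

  # illustrative build and test steps, continuing the steps list above
  - task: DotNetCoreCLI@2
    displayName: 'dotnet build ${{parameters.solution}}'
    inputs:
      command: 'build'
      projects: '${{parameters.solution}}'
      arguments: '--configuration ${{parameters.buildConfiguration}}'

  - task: DotNetCoreCLI@2
    displayName: 'dotnet test'
    inputs:
      command: 'test'
      projects: '${{parameters.solution}}'
      arguments: '--configuration ${{parameters.buildConfiguration}}'

Since the computed version will be used in other jobs, I need to make it available with a specific PowerShell task.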

  - powershell: echo "##vso[task.setvariable variable=NugetVersion;isOutput=true]$(NugetVersion)"
    name: 'SetNugetVersion'

This task simply echoes the value of the $(NugetVersion) variable, setting it again as the NugetVersion variable but with isOutput=true, to make it available to the other jobs in the pipeline. Now I can define the other job of the template, which packs and publishes the NuGet package.

- job: 'Pack_Nuget'
  dependsOn: 'Build_and_Test'

  pool:
    name: Default

  variables:
    NugetVersion: $[ dependencies.Build_and_Test.outputs['SetNugetVersion.NugetVersion'] ]

The only difference from the previous job is the declaration of the variable NugetVersion, with a special syntax that allows it to reference an output variable of a previous job.
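The Pack_Nuget job then continues with the actual pack and publish steps, which the post does not show in full. A minimal sketch, assuming a plain dotnet pack invocation (using nugetProjectDir as working directory, as described earlier) and the standard NuGetCommand task with an illustrative feed name, could be:

  steps:
  # pack the project using the semantic version computed by the previous job
  - script: dotnet pack --configuration ${{parameters.buildConfiguration}} /p:PackageVersion=$(NugetVersion) --output $(Build.ArtifactStagingDirectory)
    displayName: 'dotnet pack'
    workingDirectory: '${{parameters.nugetProjectDir}}'

  # push the package to the private Azure DevOps feed
  - task: NuGetCommand@2
    displayName: 'Publish package to the private feed'
    inputs:
      command: 'push'
      packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'
      nuGetFeedType: 'internal'
      publishVstsFeed: 'MyPrivateFeed'   # illustrative feed name

Now I simply trigger the build from the original project and everything runs just fine.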


Figure 1: Standard build for a library project, where the whole definition lives in a template file.

As you can see, thanks to templates, the real pipeline definition for my project is 23 lines long, and I can simply copy and paste it to every utility repository, change four lines (the template parameters) and everything runs just fine.

Using templates lowers the barrier for continuous integration: every member of the team can start a new utility project and set up a standard pipeline, even if he or she is not an expert.

Using templates brings a lot of advantages to the team, on top of the standard advantages of using plain YAML syntax.

First: you can create standards for all pipeline definitions; instead of having a different pipeline structure for each project, templates allow you to define a set of standard pipelines and reuse them across multiple projects.

Second: you get automatic updates; thanks to the ability to reference templates from another repository, it is possible to just update the template and have all the pipelines that reference it automatically use the new version (by referencing a branch). You keep the ability to pin a specific version if needed (by referencing a tag or a specific commit).

Third: you lower the barrier to creating pipelines for all team members that do not have a good knowledge of Azure Pipelines; they can simply copy the build, change the parameters and they are ready to go.

If you still have pipelines defined with the graphical editor, it is time to start moving to YAML syntax right now.

Happy Azure DevOps.