Windows Docker Container for Azure DevOps Build Agent

Thanks to Docker Compose, I can spin up an agent for Azure DevOps in mere seconds (once I have all the images). All I need to do is insert the address of my account and a valid token, and an agent is ready.

With .NET Core everything is simple, because we have a nice build task that automatically installs the .NET Core SDK on the agent, and the very same is true for Node.js. This approach is really nice because it does not require preinstalling too much stuff on your agent; everything is downloaded and installed on the fly when a build needs that specific tooling.

.NET Core, Node.js and other SDKs can be installed directly from the build definition, without any requirement on the machine where the agent is running.
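As a reference, this is roughly how those installer tasks look in a YAML build definition (the versions shown are just examples):

steps:
- task: UseDotNet@2
  displayName: 'Install .NET Core SDK on the agent'
  inputs:
    packageType: 'sdk'
    version: '2.2.x'
- task: NodeTool@0
  displayName: 'Install Node.js on the agent'
  inputs:
    versionSpec: '10.x'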

In my projects I also need to build solutions based on the full .NET Framework SDK, and clearly I want to use the very same Docker approach for agents. All I need is a machine running Windows Server 2019 with Docker enabled, and to look on the internet for the needed Dockerfiles or pre-built images on a public registry.

I started with a nice article that explains how to install Build Tools inside a container. The article gives you a Dockerfile that basically does everything: just run docker build and the image is ready to be used.

The only real modification I need is excluding most of the packages I’m not using in my projects; I’ve also changed the base image to use .NET Framework 4.8 instead of 4.7.2.


Figure 1: RUN instruction to install build tools inside a Docker Image.
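Since Figure 1 is a screenshot, here is a sketch of what that part of the Dockerfile can look like, assuming the .NET Framework 4.8 SDK base image and the Visual Studio 2019 Build Tools bootstrapper (the workload list must be trimmed to what your projects actually need):

# escape=`
FROM mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019

# Download the Visual Studio 2019 Build Tools bootstrapper
ADD https://aka.ms/vs/16/release/vs_buildtools.exe C:\TEMP\vs_buildtools.exe

# Install only the workloads needed by my projects; exit code 3010 simply means
# "reboot required" and can be treated as success inside a container
RUN C:\TEMP\vs_buildtools.exe --quiet --wait --norestart --nocache `
    --installPath C:\BuildTools `
    --add Microsoft.VisualStudio.Workload.MSBuildTools `
    --add Microsoft.VisualStudio.Workload.NetCoreBuildTools `
 || IF "%ERRORLEVEL%"=="3010" EXIT 0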

Be prepared to wait a little bit for the image to be built, because it downloads all the packages from the network and then installs them into the image. To build the container image from the Dockerfile you can simply type:

docker build -t buildtools2019:1.0 -m 4GB .

Now I need to modify my Azure DevOps agent Dockerfile, built following the MSDN instructions, and change the base image to FROM buildtools2019:1.0. This bases the agent image on the new image that contains the full framework .NET SDK and all build tools, so with this image the agent can build Full Framework projects.
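A minimal sketch of how the agent Dockerfile can look after that change, assuming the start.ps1 script from the Microsoft instructions sits next to it:

FROM buildtools2019:1.0

WORKDIR /azp
COPY start.ps1 .
CMD powershell .\start.ps1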

Creating a Docker image with all the tools needed for the Full Framework SDK was really easy.

The other challenge in creating your build agent in Docker is having all the requirements in other Docker images. For SQL Server I rebuilt the container on top of the Windows Server Core 2019 image and it runs just fine; for MongoDB I enabled the experimental feature to mix Linux and Windows containers, so I used a Linux-based MongoDB image with no problem.

My only remaining dependency was Elasticsearch: I’m using an old version, 2.4, and I’d like to have that image running in a Windows container. It is not difficult to find some nice guy who published a Dockerfile for Elasticsearch; at https://github.com/StefanScherer/dockerfiles-windows you can find lots of interesting images for Windows. In my situation I only need Elasticsearch, so I took the image, modified it to install the version I need, and the game is done.

Now I need some specific plugins installed in Elasticsearch for my integration tests to run; these are the situations where Docker shows its full power. Since I already have an image with Elasticsearch, I only need to create a Dockerfile based on that image that installs the required plugin.

FROM elasticsearch_custom:2.4.6.1

RUN "C:\elasticsearch\bin\plugin.bat" install delete-by-query

Wow, just two lines of code: launch the docker build command and my Elasticsearch image is ready to be used.


Figure 2: All my images ready to be used in a Docker compose file

Now all you need to do is create the docker-compose file; here is an example.

version: '2.4'

services:
  agent:
    image: azdoagent:1.0
    environment:
      - AZP_TOKEN=****q
      - AZP_POOL=docker
      - AZP_AGENT_NAME=TestAlkampfer
      - AZP_URL=https://dev.azure.com/prxm
      - NSTORE_MONGODB=mongodb://mongo/nstoretest
      - NSTORE_MSSQL=Server=mssql;user id=sa;password=sqlPw3$secure
      - TEST_MONGODB=mongodb://mongo/
      - TEST_ES=http://es:9200

  mssql:
    image: mssqlon2019:latest
    environment:
      - sa_password=sqlPw3$secure
      - ACCEPT_EULA=Y
    ports:
      - "1433:1433"

  mongo:
    platform: linux
    image: mongo
    ports:
      - "27017:27017"

  es:
    image: elasticsearch_jarvis:1.0
    ports: 
      - "9200:9200"

Once the file is ready just start an agent with

docker-compose -f docker-compose.jarvis.yml up -d

and your agent is up and running.


Figure 3: All docker images up and running

Then I went to my C# project and included all the Dockerfiles needed to create the images, as well as the docker-compose file, directly in the source repository. This allows everyone to locally build the Docker images and spin up an agent with all the dependencies needed to build the project.

If your organization uses Azure Container Registry or any other container registry (you can also push to a public registry, because none of these images contains sensitive data), spinning up an agent is only a matter of starting a docker-compose instance.

After a couple of minutes (the time needed by the Azure DevOps agent to download everything and configure itself) my builds started running.


Figure 4: A build of a Full Framework solution, with integration tests on MongoDB and Elasticsearch, running in my Docker containers.

From now on, every time I need another agent to build my projects, I can simply spin up another docker-compose instance and in less than one minute I have another agent up and running.

Gian Maria.

Consume Azure DevOps feed in TeamCity

Azure DevOps has integrated feed management you can use for NuGet, npm, etc.; the feed is private and only authorized users can download / upload packages. Today I had a little problem setting up a build in TeamCity that uses a feed in Azure DevOps, because it failed with a 401 (unauthorized) error.

The problem with Azure DevOps NuGet feeds is how to authenticate other toolchains or build servers.

This project still has some old builds in TeamCity, but when they started consuming packages published in Azure DevOps, the TeamCity builds started failing with a 401 (unauthorized) error. The question is: how can I consume an Azure DevOps NuGet feed from agents or tools that are not related to the Azure DevOps site itself?

I must admit that this information is scattered across various resources, and Azure DevOps simply tells you to add a nuget.config to your project and you are ready to go; but this is true only if you are using Visual Studio connected to the account.


Figure 1: Standard configuration for nuget config to point to the new feed

The basic documentation forgets to mention authentication, except in the “Get the tools” instructions, where you are advised to download the Credential Provider if you do not have Visual Studio.


Figure 2: Instructions to get NuGet and the Credential Provider

The Credential Provider is useful, but it is really not the solution, because I want a more NuGet-friendly approach. The goal is: when the TeamCity agent issues a NuGet restore, it should just work.

A better alternative is to use the standard NuGet authentication mechanism, where you simply add a source with both user and password. Let’s start from the basics: if I use the NuGet command line to list packages for my private source, I get prompted for a user.
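The command is simply something like the following (organization and feed name are masked, as in the rest of this post):

nuget.exe list -Source https://pkgs.dev.azure.com/xxxxx/_packaging/yyyyyy/nuget/v3/index.json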


Figure 3: NuGet asking for credential to access a feed

Now, as with everything that involves Azure DevOps, when you are asked for credentials you can use anything for the username and provide an Access Token as the password.


Figure 4: Specifying anything for the user and my access token as the password, I can read the feed.

This is the easiest and most secure way to log in to Azure DevOps with command line tools, especially because to access the feed I’ve generated a token that has really reduced permissions.


Figure 5: Access token with only packaging read permission

As you can see in Figure 5, I created a token that has only read access to packaging; if someone steals it, he/she can only read packages and nothing more.

Now the question is: how can I make the TeamCity agent use that token? TeamCity actually has some special sections where you can specify username and password, or where you can add external feeds, but I want a general way to solve this for every external tool, not only for TeamCity. It is time to understand how NuGet configuration works.

NuGet can be configured with nuget.config files placed in some special directories, to specify configuration for the current user or for the entire computer.

Standard Microsoft documentation states we can have three levels of nuget.config for a solution: one file located in the same directory as the solution file, another one with user scope located at %appdata%/NuGet/nuget.config, and finally settings for all operations on the entire computer in %programfiles(x86)%/NuGet/Config/nuget.config.

You also need to know, from the standard nuget.config reference documentation, that you can add credentials directly into the nuget.config file, because you can specify a username and password for each package source. So I can place, on each computer with an active TeamCity agent, a computer-level nuget.config that specifies the credentials for my private feed, and the game is done. I created just one file and uploaded it to all TeamCity machines with agents.

The real problem with this approach is that the token is included in clear text in the config file located in C:\Program Files (x86)\NuGet\Config.

At this point you can hit this issue https://github.com/NuGet/Home/issues/3245: if you include a clear text password in the nuget.config Password field, you will get a strange “The parameter is incorrect” error, because clear text passwords must be specified in the ClearTextPassword parameter. After all, who writes a clear text password in a file?
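For reference, this is roughly what such a machine-level nuget.config can look like (the feed name and URL follow the masked example used later in this post, and the token value is obviously a placeholder):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="proximo" value="https://pkgs.dev.azure.com/xxxxx/_packaging/yyyyyy/nuget/v3/index.json" />
  </packageSources>
  <packageSourceCredentials>
    <proximo>
      <add key="Username" value="anything" />
      <add key="ClearTextPassword" value="myaccesstoken" />
    </proximo>
  </packageSourceCredentials>
</configuration>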

This leads to the final and most secure solution: on your computer, download the latest nuget.exe version into a folder, then issue this command.

 .\nuget.exe sources add -name proximo
  -source https://pkgs.dev.azure.com/xxxxx/_packaging/yyyyyy/nuget/v3/index.json 
  -username anything 
  -password **myaccesstoken**

This command will add, for the current user, a feed named proximo that points to the correct source with username and password. After you have added the source, you can simply go to the %appdata%/NuGet/ folder and open the nuget.config file.


Figure 6: Encrypted credentials stored inside nuget.config file.

As you can see, in your user configuration nuget.exe stored an encrypted version of your token; if you issue nuget list -Source xxxxx again, you can verify that NuGet is able to log in automatically without any problem, because it is using the credentials in the config file.

The real problem with this approach is that the credentials are encrypted only for the machine and the user that issued the command; you cannot reuse them on a different machine, or on the same machine with a different user.

NuGet password encryption cannot be shared between different users or different computers; it works only for the current user on the current computer.


Figure 7: Include a clear text password in the machine-level nuget.config to make every user able to access that specific feed

To conclude, you have two options: you can generate a token, include it in clear text in nuget.config and copy the file to all of your TeamCity build servers; or, if you do not like clear text tokens in files, have the agent run under a specific build user, log in to the agent machine with that specific build user and issue nuget sources add to have the token encrypted for that user.

In both cases you need to renew the configuration periodically, because tokens last for at most one year. (You can automate this process with a special build that calls nuget sources add.)

Gian Maria.

Multiline PowerShell on YAML pipeline

Sometimes a few lines of PowerShell in your pipeline are all you need to quickly customize a build, without using a custom task or having a PowerShell file in source code. One typical situation is writing a file whose content needs to be determined by a PowerShell script; in my case I need to create a configuration file based on some build variables.

Using the standard graphical editor to add a PowerShell task and then grabbing the YAML with the “View YAML” button is the quickest way to do this, but be warned: you can incur the following error.

can not read a block mapping entry; a multiline key may not be an implicit key

This error happens when you put multiline text inside a YAML file with bad indentation of a multiline string. The inline PowerShell task really comes in handy, but you need to pay special attention, because the “View YAML” button in the UI sometimes generates bad YAML.

In Figure 1 you can see what happens when I copy a task with the “View YAML” button of the standard graphical editor and paste it into a YAML build. In this situation the online editor immediately shows me that the syntax is wrong. The real problem is that Visual Studio Code with the Azure Pipelines extension did not catch the error, and you end up with a failing build.


Figure 1: Wrong YAML syntax due to multiline PowerShell command line

It turns out that the View YAML button of the classic graphical editor misses the extra indentation needed for the content of the PowerShell script; the above task should be fixed in this way:


Figure 2: Correct syntax to include a multiline script
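Since Figure 2 is a screenshot, here is a minimal sketch of what a correctly indented inline PowerShell step looks like (the display name, file name and variables are purely illustrative):

steps:
- task: PowerShell@2
  displayName: 'Write configuration file'
  inputs:
    targetType: 'inline'
    script: |
      $config = "environment=$(BuildConfiguration)"
      Set-Content -Path "$(Build.ArtifactStagingDirectory)\app.settings" -Value $config
      Write-Host "Configuration file written"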

If you want to include an inline PowerShell script, most of the time you do not want to limit yourself to a single line, so you need the multiline string syntax. Just use a pipe character (|) followed by a multiline string, where each newline is preserved. The important rule is: the string must be indented one extra level with respect to the line that starts the multiline string, as highlighted in Figure 2. The indentation is important because the YAML parser considers the string finished when it encounters a new line indented one level less than the multiline string.

The pipe symbol at the end of a line indicates that any indented text that follows is a single multiline string. See the YAML spec – Literal styles.

This is another reason to use the online editor for YAML builds: as you can see in Figure 1, it is able to immediately spot syntax errors.

Gian Maria

Azure DevOps multi stage pipeline environments

In a previous post on releasing with multi stage pipelines and YAML code I briefly introduced the concept of environments. In that example I used an environment called single_env, and you may be surprised that, by default, an environment is automatically created when the release runs.

This happens because an environment can be seen as a set of resources used as the target for deployments, but in the current preview version of Azure DevOps you can only add Kubernetes resources. The question is: why did I use an environment to deploy an application to Azure if there is no connection between the environment and my Azure resources?

At this stage of the preview, we can only connect Kubernetes to an environment; no other physical resource can be linked.

I have two answers for this. The first is: the multi stage release pipeline in YAML is still in preview and we know that it is still incomplete. The other is: an environment is a set of information and rules, and rules can be enforced even if there is no direct connection with physical resources.
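For reference, this is roughly how a deployment job in the YAML pipeline targets the environment (stage, job and step names are purely illustrative):

stages:
- stage: deploy
  jobs:
  - deployment: deploy_to_azure
    environment: single_env
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo This step runs only after all checks on single_env are fulfilled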


Figure 1: Environment in Azure DevOps

As you can see in Figure 1, a first advantage of an environment is that I can immediately check its status. From the above picture I can immediately spot that I have a release successfully deployed to it. Clicking on the status opens the details of the pipeline released on that environment.

Clicking on the environment name opens the environment detail page, where you can view all the information for the environment (its name and all the releases made on it), and where it is possible to add resources and manage Security and Checks.


Figure 2: Security and Checks configuration for an environment.

Security is pretty straightforward: it is used to decide who can use and modify the environment. The really cool feature is the ability to create checks. If you click Checks in Figure 2 you are redirected to a page that lists all the checks that need to pass before the environment can be used as a target for a deploy.


Figure 3: Creating a check for the environment.

As an example I created a simple manual approval, put myself as the only approver and added some instructions. Once a check is created, it is listed in the Checks list for the environment.


Figure 4: Checks defined for the environment single_env

If I trigger another run, something interesting happens: after the build_test stage completes, the deploy stage is blocked by the approval check.


Figure 4: Deploy stage suspended because the related environment has a check to be fulfilled

Check support in environments can be used to apply deployment gates to my environments, like manual approvals in the standard Azure DevOps classic release pipelines.

Even if there is no physical link between the environment and the Azure account where I’m deploying my application, Azure Pipelines detects that the environment has a check and blocks the execution of the script, as you can verify in Figure 4.

Clicking on the check link in Figure 4 opens a detail page with all the checks that must pass before continuing with the deploy script. In Figure 5 you can see that the deploy is waiting for me to approve it; I can simply press the Approve button to let the deploy script start.


Figure 5: Checks for deploy stage

Once an agent is available, the deploy stage can start, because all the checks for the related environment have been fulfilled.


Figure 6: Deploy stage started, all the checks have passed.

Once the deploy operation finished, I can always go back and verify the checks; in Figure 7 you can see how easy it is to find who approved the release to that environment.


Figure 7: Verifying the checks for deploy after pipeline finished.

At the moment the only check available is manual approval, but I’m expecting more and more checks to become available in the future, so keep an eye on future release notes.

Gian Maria.

Sample report for Azure DevOps

Reporting was always a pain point in Azure DevOps, because people who were used to SQL Server Reporting Services in the on-premises version missed a similar ability to create custom reports in Azure DevOps.

Now you have a nice integration with Power BI, and a nice article here that explains how to connect Power BI to your instance and create some basic queries. The nice part is that you can use a query that connects directly to the OData feed; no need to install anything.


Figure 1: Sample OData query that directly connects to your organization account to query for Work Item Data.
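Since Figure 1 is a screenshot, a hedged example of what such an OData query can look like is the following (the organization and project names are placeholders, and the Analytics endpoint version may differ in your account):

https://analytics.dev.azure.com/xxxxx/yyyyyy/_odata/v3.0-preview/WorkItems?$select=WorkItemId,Title,WorkItemType,State&$filter=State ne 'Closed'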

I strongly encourage you to experiment with Power BI, because it is a powerful tool that can create really good reports, closing the gap with the on-premises version and the old Reporting Services.

Gian Maria.