Consume Azure DevOps feed in TeamCity

Azure DevOps has integrated feed management that you can use for NuGet, npm, etc.; the feed is private and only authorized users can download / upload packages. Today I had a little problem setting up a build in TeamCity that uses a feed in Azure DevOps, because it failed with a 401 (Unauthorized) error.

The problem with Azure DevOps NuGet feeds is how to authenticate other toolchains or build servers.

This project still has some old builds in TeamCity, and when they started consuming packages published in Azure DevOps, the TeamCity builds began failing with a 401 (Unauthorized) error. The question is: how can I consume an Azure DevOps NuGet feed from an agent or a tool that is not related to the Azure DevOps site itself?

I must admit that this information is scattered across various resources, and Azure DevOps simply tells you to add a nuget.config to your project and you are ready to go, but this is true only if you are using Visual Studio connected to the account.

Figure 1: Standard configuration for nuget.config to point to the new feed

The basic documentation forgets to mention authentication, except in the Get the Tools instructions, where you are advised to download the Credential Provider if you do not have Visual Studio.

Figure 2: Instructions to get NuGet and the Credential Provider

The Credential Provider is useful, but it is really not the solution, because I want a more NuGet-friendly approach; the goal is: when the TeamCity agent issues a NuGet restore, it should just work.

A better alternative is to use the standard NuGet authentication mechanism, where you simply add a source with both user and password. Let's start from the basics: if I use the NuGet command line to list the packages of my private source, I get prompted for a user.

Figure 3: NuGet asking for credentials to access a feed

Now, as with everything that involves Azure DevOps, when you are asked for credentials you can use anything for the username and provide a Personal Access Token as the password.

Figure 4: Specifying anything for the user and my access token as the password, I can read the feed.
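In text form, a minimal sketch of the command used above (organization and feed names are placeholders):

.\nuget.exe list -Source https://pkgs.dev.azure.com/myorg/_packaging/myfeed/nuget/v3/index.json
# at the prompt: type any username, then paste the Personal Access Token as the password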

This is the easiest and most secure way to log in to Azure DevOps with command-line tools, especially because, to access the feed, I generated a token with really reduced permissions.

Figure 5: Access token with only packaging read permission

As you can see in Figure 5, I created a token that has only read access to packaging: if someone steals it, they can only access packages and nothing more.

Now the question is: how can I make the TeamCity agent use that token? Actually TeamCity has a special section where you can specify username and password, or where you can add an external feed, but I want a general solution that works for every external tool, not only for TeamCity. It is time to understand how NuGet configuration works.

NuGet can be configured with nuget.config files placed in special directories, to specify configuration for the current user or for the entire computer.

Standard Microsoft documentation states that we can have three levels of nuget.config for a solution: one file located in the same directory as the solution file; another one with user scope, located at %appdata%\NuGet\nuget.config; and finally settings for all operations on the entire computer in %programfiles(x86)%\NuGet\Config\nuget.config.

You also need to know, from the standard nuget.config reference documentation, that you can add credentials directly into a nuget.config file, because you can specify a username and password for each package source. So I can place, on each computer with an active TeamCity agent, a computer-level nuget.config that specifies the credentials for my private feed, and the game is done. I just created such a file and uploaded it to all TeamCity machines running agents.

The real problem with this approach is that the token is included in clear text in the config file located in C:\Program Files (x86)\NuGet\Config.

At this point you can hit this issue: https://github.com/NuGet/Home/issues/3245. If you put a clear-text password in the nuget.config Password field, you will get a strange “The parameter is incorrect” error, because clear-text passwords must be specified in the ClearTextPassword attribute instead. After all, who writes a clear-text password in a file?
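To make this concrete, here is a minimal sketch of a machine-level nuget.config with a clear-text token (source URL and token value are placeholders):

<configuration>
  <packageSources>
    <!-- the private Azure DevOps feed -->
    <add key="proximo" value="https://pkgs.dev.azure.com/xxxxx/_packaging/yyyyyy/nuget/v3/index.json" />
  </packageSources>
  <packageSourceCredentials>
    <proximo>
      <add key="Username" value="anything" />
      <!-- a clear-text token must go in ClearTextPassword, not Password -->
      <add key="ClearTextPassword" value="my-personal-access-token" />
    </proximo>
  </packageSourceCredentials>
</configuration>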

This leads to the final and most secure solution: on your computer, download the latest NuGet version into a folder, then issue this command:

.\nuget.exe sources add -Name proximo `
  -Source https://pkgs.dev.azure.com/xxxxx/_packaging/yyyyyy/nuget/v3/index.json `
  -UserName anything `
  -Password myaccesstoken

This command adds, for the current user, a feed named proximo that points to the correct source with username and password. After you have added the source, you can simply go to the %appdata%\NuGet folder and open the nuget.config file.

Figure 6: Encrypted credentials stored inside nuget.config file.
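The stored entry looks roughly like the following sketch (the encrypted value is shortened and purely illustrative):

<packageSourceCredentials>
  <proximo>
    <add key="Username" value="anything" />
    <!-- encrypted for the current user on the current machine -->
    <add key="Password" value="AQAAANCMnd8BFdERjHoAwE..." />
  </proximo>
</packageSourceCredentials>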

As you can see, in your user configuration nuget.exe stored an encrypted version of your token; if you issue nuget list -Source xxxxx again, you can verify that NuGet is able to log in automatically without any problem, because it uses the credentials in the config file.

The real problem with this approach is that the credentials are encrypted only for the machine and the user that issued the command: you cannot reuse them on a different machine or on the same machine with a different user.

NuGet password encryption cannot be shared between different users or different computers, so it works only for the current user on the current computer.

Figure 7: Include a clear-text password in the machine-level nuget.config to make every user able to access that specific feed

To conclude, you have two options: you can generate a token, include it in clear text in a nuget.config and copy it to all of your TeamCity build servers; or, if you do not like clear-text tokens in files, have the agent run under a specific build user, log in to the agent machine with that build user and issue nuget sources add to have the token encrypted for that user.

In both cases you need to renew the configuration each year (a token lasts at most one year). (You can automate this process with a special build that calls nuget sources add.)
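A sketch of what such a build could run when a fresh token is generated (the feed name comes from the example above; NEW_PAT is a hypothetical secret variable holding the new token):

# refresh the stored credential for the existing 'proximo' source
.\nuget.exe sources update -Name proximo -UserName anything -Password $env:NEW_PAT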

Gian Maria.

Multiline PowerShell on YAML pipeline

Sometimes having a few lines of PowerShell in your pipeline is the only thing you need to quickly customize a build, without using a custom task or keeping a PowerShell file in source code. One typical situation is writing a file whose content is determined by a PowerShell script; in my case I need to create a configuration file based on some build variables.

Using the standard graphical editor to set up a PowerShell task and then grabbing the YAML with the “View YAML” button is the quickest way to do this, but be warned, because you can run into the following error.

can not read a block mapping entry; a multiline key may not be an implicit key

This error happens when you put multiline text inside a YAML file with bad indentation of a multiline string. The inline PowerShell task really comes in handy, but you need to pay special attention, because the “View YAML” button in the UI sometimes generates bad YAML.

In Figure 1 you can see what happens when I grab the YAML of a task with the “View YAML” button of the standard graphical editor and paste it into a YAML build: the online editor immediately shows me that the syntax is wrong. The real problem here is that Visual Studio Code with the Azure Pipelines extension did not catch the error, and you end up with a failing build.

Figure 1: Wrong YAML syntax due to multiline PowerShell command line

It turns out that the View YAML button of the classic graphical editor misses an extra level of indentation needed for the content of the PowerShell script; the above task should be fixed in this way:

Figure 2: Correct syntax to include a multiline script
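In text form, a minimal sketch of the corrected task (the script content is illustrative):

steps:
- powershell: |
    # every line of the script is indented one level deeper than the 'powershell' key
    $content = "setting=$(Build.BuildNumber)"
    Set-Content -Path config.txt -Value $content
  displayName: 'Write a config file from an inline multiline script'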

If you want to include an inline PowerShell script, most of the time you do not want to limit yourself to a single line, so you need the multiline string syntax: just use a pipe character (|) followed by a multiline string, where each newline is kept as a regular \n. The important rule is that the string must be indented one extra tab with respect to the line that initiates the multiline string, as highlighted in Figure 2. The indentation is important because the YAML parser considers the string finished when it encounters a new line indented one tab less than the multiline string.

The pipe symbol at the end of a line indicates that any indented text that follows is a single multiline string. See the YAML spec – Literal styles.

This is another reason to use the online editor for YAML builds: as you can see in Figure 1, it is able to immediately spot syntax errors.

Gian Maria

Azure DevOps multi stage pipeline environments

In a previous post on releasing with Multi Stage Pipelines and YAML code I briefly introduced the concept of environments. In that example I used an environment called single_env, and you may be surprised that, by default, the environment is automatically created when the release runs.

This happens because an environment can be seen as a set of resources used as the target for deployments, but in the current preview version of Azure DevOps you can only add Kubernetes resources to it. The question is: why did I use an environment to deploy an application to Azure if there is no connection between the environment and my Azure resources?

At this stage of the preview we can only connect Kubernetes to an environment; no other physical resource can be linked.

I have two answers for this. The first is: Multi Stage Release pipelines in YAML are still in preview and we know the feature is still incomplete. The other is: an environment is a set of information and rules, and rules can be enforced even if there is no direct connection with physical resources.
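As a reminder, a deployment job targets an environment just by naming it; a minimal sketch (pool and step content are illustrative, the environment name comes from the previous post):

jobs:
- deployment: DeployWeb
  pool:
    vmImage: 'ubuntu-latest'
  # naming the environment is the only link; checks defined on it will gate this job
  environment: single_env
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo Deploying to single_env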

Figure 1: Environment in Azure DevOps

As you can see in Figure 1, a first advantage of an environment is that I can immediately check its status: from the above picture I can immediately spot that I have a release successfully deployed to it. Clicking on the status opens the details of the pipeline released on that environment.

Clicking on the environment name opens the environment detail page, where you can view all the information for the environment (name and all the releases made on it), and it is possible to add resources and manage Security and Checks.

Figure 2: Security and Checks configuration for an environment.

Security is pretty straightforward: it is used to decide who can use and modify the environment. The really cool feature is the ability to create checks. If you click Checks in Figure 2, you are redirected to a page that lists all the checks that must pass before the environment can be used as a target for deployment.

Figure 3: Creating a check for the environment.

As an example I created a simple manual approval, put myself as the only approver and added some instructions. Once a check is created, it is listed in the Checks list for the environment.

Figure 4: Checks defined for the environment single_env

If I trigger another run, something interesting happens: after the build_test stage completes, the deploy stage is blocked by the approval check.

Figure 5: Deploy stage suspended because the related environment has a check to be fulfilled

Check support in environments can be used to apply deployment gates to my environments, like the manual approvals of the standard Azure DevOps classic release pipelines.

Even if there is no physical link between the environment and the Azure account where I’m deploying my application, the pipeline detects that the environment has a check and blocks the execution of the script, as you can see in Figure 5.

Clicking on the check link in Figure 5 opens a detail page with all the checks that must pass before the deploy script can continue. In Figure 6 you can see that the deploy is waiting for me to approve it; I can simply press the Approve button to let the deploy script start.

Figure 6: Checks for the deploy stage

Once an agent is available, the deploy stage can start, because all the checks for the related environment have been fulfilled.

Figure 7: Deploy stage started, all the checks passed.

Once the deploy operation finished, I can always go back and verify the checks; in Figure 8 you can see how easy it is to find who approved the release to that environment.

Figure 8: Verifying the checks for the deploy stage after the pipeline finished.

Currently the only available check is manual approval, but I expect more and more checks to become available in the future, so keep an eye on future release notes.

Gian Maria.

Sample report for Azure DevOps

Reporting has always been a pain point in Azure DevOps, because people who were used to SQL Server Reporting Services in the on-premise version miss a similar ability to create custom reports in Azure DevOps.

Now you have a nice integration with Power BI, and a nice article here that explains how to connect Power BI to your instance and create some basic queries. The nice part is that you can use a query that connects directly to the OData feed; there is no need to install anything.

Figure 1: Sample OData query that connects directly to your organization account to query work item data.
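For reference, a sketch of the kind of Analytics OData URL you can use (organization, project and endpoint version are placeholders/assumptions):

https://analytics.dev.azure.com/myorganization/myproject/_odata/v3.0-preview/WorkItems?$select=WorkItemId,WorkItemType,Title,State

Pasting a URL like this into Power BI (Get Data –> OData feed) returns work item data you can then shape into reports.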

I strongly encourage you to experiment with Power BI, because it is a powerful tool that can create really good reports, closing the gap with the on-premise version and the old Reporting Services.

Gian Maria.

Azure DevOps gems, YAML Pipeline and Templates

If you read my blog you already know that I’m a great fan of YAML pipelines over the graphical editor in the web UI. There are lots of reasons why you should use YAML, first among them the ability to branch the pipeline definition together with the code, but there is another really important feature: templates.

There is really detailed documentation on MSDN on how to use this feature, but I want to give you a complete walkthrough on how to start using templates effectively. Thanks to templates you can define a standard build, with steps or with jobs and steps, in a template file, then reference that file from the real build, just adding parameters.

The ability to capture a sequence of steps in a common template file and reuse it over and over again in real pipelines is probably one of the top reasons for moving to YAML templates.

One of the most common scenarios for me is: an account with lots of utility projects (multitargeted for full framework and .NET Standard), each one with its own git repository and the need for a standard CI definition to:

1) Build the solution
2) Run the tests
3) Pack a NuGet package with semantic versioning
4) Publish the NuGet package to an Azure DevOps private package feed

If you work on a big project you usually have lots of these small projects: utilities for Castle, Serilog, security, general helpers, etc. In this scenario it is really annoying to define a pipeline for each project with the graphical editor, so it is pretty natural to move to YAML. You can start from a standard file, copy it into the repository and then adapt it to the specific project, but when a task is updated you need to re-update every project to fix all the references. The main problem with this approach is that after some time the builds are no longer in sync and each project starts to behave differently.

Instead, I define my template once, in a dedicated repository, then I can reuse it in any project. When the template changes, I want to be able to manually update each pipeline to reference the new version or, even better, decide which projects will be updated automatically.

Let's start with the real build file, the one included in the project repository, and check how to reference a template stored in another repository. The only limit is that the repository must be in the same organization or on GitHub. Here is the full content of the file.

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

resources:
  repositories:
    - repository: templatesRepository
      type: git
      name: Jarvis/BuildScripts
      ref: refs/heads/hotfix/0.1.1

jobs:

- template: 'NetStandardTestAndNuget.yaml@templatesRepository'
  
  parameters:
    buildName: 'JarvisAuthCi'
    solution: 'src/Jarvis.Auth.sln'
    nugetProject: 'src/Jarvis.Auth.Client/Jarvis.Auth.Client.csproj'
    nugetProjectDir: 'src/Jarvis.Auth.Client'

The file is really simple: it starts with the triggers (as in a standard YAML build), then comes a resources section, which allows you to reference objects that live outside the pipeline. In this specific example I’m declaring that this pipeline uses a resource called templatesRepository: an external git repository (in the same organization) called BuildScripts, contained in the Team Project called Jarvis. Finally, the ref property allows me to choose the branch or tag to use, with standard git refs syntax (refs/heads/master, refs/heads/develop, refs/tags/xxxx, etc.). In this specific example I’m freezing the build scripts to the hotfix/0.1.1 branch, so if the repository is upgraded this build will keep referencing that version. This implies that, if I change the BuildScripts repository, I need to manually update this build to reference the newer version. If I want this definition to automatically use new versions, I can simply reference the master or develop branch.

The real advantage of having the template versioned in another repository is that it can use GitFlow, so every pipeline that uses the template can choose a specific version, the latest stable or even the latest development version.

Finally I start defining jobs, but instead of defining them inside this YAML file I declare that this pipeline uses a template called NetStandardTestAndNuget.yaml contained in the templatesRepository resource. Following the template reference I specify all the parameters needed by the template to run. In this specific example I have four parameters:

buildName: the name of the build; I use a custom task based on GitVersion that renames each build with this parameter followed by the semantic version
solution: path of the solution file to build
nugetProject: path of the csproj that contains the package to publish
nugetProjectDir: directory of the csproj to publish

The last parameter could be derived from the third, but I want to keep the YAML simple, so I require the user of the template to explicitly pass the directory of the project, which will be used as the workingDirectory parameter for the dotnet pack command.

Now the real fun starts: let's examine the template file contained in the other repository. A template file usually starts with a parameters section where it declares the parameters it expects.

parameters:
  buildName: 'Specify name'
  solution: ''
  buildPlatform: 'ANY CPU'
  buildConfiguration: 'Release'
  nugetProject: ''
  nugetProjectDir: ''
  dotNetCoreVersion: '2.2.301'

As you can see the syntax is really simple: just specify the name of each parameter followed by its default value. In this example the caller really needs to supply only the four parameters described in the previous part; the others have usable defaults.

Following the parameters section, a template file can specify steps or even entire jobs; in this example I define two distinct jobs, one to build and run the tests and the other to pack and publish the NuGet package.

jobs:

- job: 'Build_and_Test'
  pool:
    name: Default

  steps:
  - task: DotNetCoreInstaller@0
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

As you can see I’m simply writing a standard jobs section, which starts with the job Build_and_Test, run on the Default pool. The job begins with a DotNetCoreInstaller step, where you can see that referencing a parameter requires the special syntax ${{parameters.parameterName}}. The beautiful aspect of templates is that they look exactly like a standard pipeline definition; just use the ${{}} syntax to reference parameters.

The Build_and_Test job continues with the standard build and test tasks and determines (with GitVersion) the semantic version for the package. Since this value will be used in other jobs, I need to make it available with a specific PowerShell task.

  - powershell: echo "##vso[task.setvariable variable=NugetVersion;isOutput=true]$(NugetVersion)"
    name: 'SetNugetVersion'

This task simply re-emits the value of $(NugetVersion) as an output variable named NugetVersion (isOutput=true), to make it available to other jobs in the pipeline. Now I can define the other job of the template, to pack and publish the NuGet package.

- job: 'Pack_Nuget'
  dependsOn: 'Build_and_Test'

  pool:
    name: Default

  variables:
    NugetVersion: $[ dependencies.Build_and_Test.outputs['SetNugetVersion.NugetVersion'] ]

The only difference from the previous job is the declaration of the variable NugetVersion, with a special syntax that allows referencing an output of a previous job. Now I simply trigger the build from the original project and everything runs just fine.
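For completeness, a sketch of how the Pack_Nuget job could then consume these values in its pack step (the actual step is not shown in this post; the options here are illustrative):

  steps:
  # NugetVersion comes from the Build_and_Test job; the directory comes from the caller
  - script: dotnet pack --configuration ${{parameters.buildConfiguration}} /p:PackageVersion=$(NugetVersion) --output $(Build.ArtifactStagingDirectory)
    workingDirectory: ${{parameters.nugetProjectDir}}
    displayName: 'Pack NuGet package'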

Figure 1: Standard build for a library project, where the whole definition lives in a template file.

As you can see, thanks to templates, the real pipeline definition for my project is 23 lines long, and I can simply copy and paste it to every utility repository, change 4 lines of code (the template parameters) and everything runs just fine.

Using templates lowers the barrier for continuous integration: every member of the team can start a new utility project and set up a standard pipeline even if he/she is not an expert.

Using templates brings lots of advantages to the team, on top of the standard advantages of plain YAML syntax.

First: you can create standards for all pipeline definitions. Instead of having a different pipeline structure for each project, templates allow you to define a set of standard pipelines and reuse them across multiple projects.

Second: you get automatic updates. Thanks to the ability to reference templates from another repository, it is possible to just update the template and have all the pipelines that reference it automatically use the new version (by referencing a branch). You keep the ability to pin a specific version if needed (by referencing a tag or a specific commit).

Third: you lower the barrier to creating pipelines for all the team members that do not have a good knowledge of Azure Pipelines: they can simply copy the build, change the parameters and they are ready to go.

If you still have pipelines defined with the graphical editor, it is time to start upgrading to YAML syntax right now.

Happy Azure DevOps.