Invalid provider type specified error accessing the X509Certificate2 private key on Full Framework

I have a library, written entirely in .NET Core, that deals with self-signed X509 certificates used to encrypt and digitally sign data. The software runs perfectly and is composed of a server part and a client part.

At a certain point we decided that the client should be usable not only from .NET Core software but also from full framework software, so I changed the library to target both netstandard 2.0 and full framework 4.6.1. Everything compiled perfectly, tests ran fine, everything seemed green. The problem is that the unit test project ran only on .NET Core, so the tests were never exercised on full framework.

Running the new library in a .NET Core application raises no error, but when it runs on full framework it throws an exception while accessing the private key of the X509Certificate2 object: System.Security.Cryptography.CryptographicException: Invalid provider type specified.

The code runs perfectly on the .NET Core runtime, but throws Invalid provider type specified on Full Framework.

Actually the error is quite straightforward: it tells me that the RSA private key of the certificate is probably handled differently in .NET Standard, something I already knew, but it was still difficult to find where my code was wrong.

After long digging on the internet I found an interesting article that pointed me in the right direction: a library that provides extension methods called HasCngKey() and GetCngPrivateKey(). I will spare you the gritty details, but basically the problem arises because Microsoft changed the cryptographic API in Windows and I was doing a bad thing.

The original code simply cast the PrivateKey property of the certificate to the RSA class, a piece of code that I found somewhere on the internet and that seemed pretty straightforward. Since my library created the certificate, I am pretty sure the private key is RSA, so I can safely cast it to an RSA object, right? I was dead wrong.

var rsa = (RSA)clientCertificate.PrivateKey;

The above code works only in .NET Standard because, even if full framework has an RSA class, its implementation is different from the .NET Standard one and makes it impossible to directly access the PrivateKey property once the certificate was written by .NET Standard code. Welcome to Cryptography API: Next Generation, CNG for short, which is used by .NET Standard.

The .NET Standard code used the CNG API to write the certificate; this means that, once the certificate is loaded by the same class on full framework, the PrivateKey property cannot be accessed directly, because full framework uses CAPI and is not able to decrypt the key. To validate this assumption I referenced a NuGet package that lets me interact with the CNG API from full framework (the code is wrapped in an #if NETFULL directive so it is used only on full framework).

if (clientCertificate.HasCngKey())
{
    // The private key is stored with CNG: retrieve the CngKey and wrap it in RSACng to decrypt
    var rsa2 = clientCertificate.GetCngPrivateKey();
    var cng = new RSACng(rsa2);
    aesDecryptedKey = cng.Decrypt(licenseVault.EncryptionKey, RSAEncryptionPadding.OaepSHA512);
}

This fixed the problem perfectly: the HasCngKey() method tells me whether the private key uses the CNG API, and GetCngPrivateKey() lets me retrieve the key.

Figure 1: GetCngPrivateKey() method explanation

Ok, that confirms all of my suspicions, but I felt really dirty because I had to run different code for full framework and .NET Core, and I was convinced I was doing something wrong. It turns out the error was in my code that directly cast the PrivateKey property of the certificate to RSA: the right way to obtain the RSA object from an X509Certificate2 object is a call to GetRSAPrivateKey().

RSA rsa = clientCertificate.GetRSAPrivateKey();

This method hides the nitty-gritty implementation details from the caller: it simply retrieves the RSA key, or returns null if the private key is not RSA. While it seems perfectly legitimate to cast the PrivateKey to an RSA object in .NET Standard, that assumption no longer holds on full framework.
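Here is a minimal sketch of how the decryption code looks once rewritten around GetRSAPrivateKey(); licenseVault.EncryptionKey and the OaepSHA512 padding come from the earlier snippet, while the null handling is just one possible choice.

// Minimal sketch: this works unchanged on both .NET Core and full framework.
using (RSA rsa = clientCertificate.GetRSAPrivateKey())
{
    // GetRSAPrivateKey returns null when the private key is not RSA
    if (rsa == null)
        throw new InvalidOperationException("The certificate private key is not RSA");

    aesDecryptedKey = rsa.Decrypt(licenseVault.EncryptionKey, RSAEncryptionPadding.OaepSHA512);
}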

As always, this is a classic example of how cryptography is a complex subject, and you should always double-check your code.

Gian Maria.

Strange Error uploading artifacts in Azure DevOps pipeline

I have a pipeline that worked perfectly for years, but yesterday a build failed while uploading artifacts. I queued it again and it still failed, so it did not seem to be an intermittent error (the network can be unreliable). I was really puzzled because since the last good build we had changed only 4 C# files, nothing that could justify the failure, and we had no network problems that could explain trouble uploading artifacts to Azure DevOps.

Well, the error itself was not telling me anything: it was a simple statement that the file upload failed, and since it also told me there had been retries, this was definitely not an intermittent network error.

Figure 1: Upload file error; nothing suggests the real error

Digging deeper into the execution logs I found something really strange: an error after a GetTempFileName() call.

Figure 2: The real error reported in the detail log

Ok, this is definitely weird: it seems a task is calling the GetTempFileName() method, but the call fails with a strange error, "The file already exists". Digging around on the internet, it seems the error can be generated by a temp directory containing too many files (someone claims the limit is 65535).

Temp directories tend to grow really big, and if you have plenty of space on your disk they are usually never cleared, but you can run into problems if the number of files becomes too high.

The tricky part is knowing the real temp directory used by the agent that runs the build, so I added a Command Line task that simply echoes the %TEMP% variable.
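If the build is defined in YAML, the equivalent is a tiny script step like this sketch (the display name is just an assumption, any step that echoes the variable will do):

- script: echo %TEMP%
  displayName: 'Dump agent temp directory'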

Figure 3: Dumping temp variable inside a build

Checking the build server, I verified that the temp folder contained more than 100k files taking several gigabytes of space. After deleting all those temp files the build started working again and the team was happy again. Yesterday was a strange day, because this was not even the weirdest bug I had to solve :P (another post will follow).
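If you prefer to clean the agent temp folder with a script rather than by hand, a rough sketch like this one does the job (run it as the user that executes the agent; files currently in use are simply skipped):

# Remove everything under the agent user's temp directory, ignoring files that are locked or in use
Get-ChildItem -Path $env:TEMP -Recurse -Force |
    Remove-Item -Recurse -Force -ErrorAction SilentlyContinue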

Gian Maria.

Azure DevOps Pipeline template steps and .NET Core 3 local tools

I am a strong fan of Azure DevOps pipeline templates because they are a really good feature to both simplify pipeline authoring and avoid the proliferation of too many ways to do the same thing. In my previous examples I always used a template that contains a full multi stage pipeline definition; this allows you to create a new pipeline with ease: reference the repository with the template, choose the right template, set parameters and you are ready to go.

Using a template file that contains all stages allows you to define an entire pipeline in a single place and reuse it with a simple parameters definition.

Today my dear friend Giulio Vian told me that he was investigating a cool feature of .NET Core 3.0 called local tools. Basically, with local tools you can create a special file called dotnet-tools.json that lists all the tools you need for your project. Since I am a heavy fan of GitVersion, it seems natural to include such a file in every project, pinned to the exact version of GitVersion used by that project. Here is an example of the file.

{
  "version": 1,
  "isRoot": true,
  "tools": {
    "gitversion.tool": {
      "version": "5.2.4",
      "commands": [
        "dotnet-gitversion"
      ]
    }
  }
}

Once you have this file in place, you can simply issue a dotnet tool restore command and all referenced tools will be automatically installed locally, ready to use. This makes it extra simple to use GitVersion in my pipelines, because a simple dotnet tool restore makes GitVersion available in the pipeline (given that I have previously created a dotnet-tools.json in my project).
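For reference, this is roughly the command sequence to create the manifest and restore it locally (the version is the one pinned in the manifest above; run the commands from the repository root):

dotnet new tool-manifest                              # creates .config/dotnet-tools.json
dotnet tool install gitversion.tool --version 5.2.4   # adds GitVersion to the manifest
dotnet tool restore                                   # installs every tool listed in the manifest
dotnet tool run dotnet-gitversion                     # runs the locally installed tool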

To experiment with this new feature I want to show you another approach to Azure DevOps templates: shared steps. Let's have a look at this template file:

parameters:
  buildName: 'Specify name'
  dotNetCoreVersion: '2.2.301'

steps:
  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

  - task: DotNetCoreCLI@2
    displayName: 'install if needed dotnet gitversion tool'
    inputs:
      command: 'custom'
      custom: 'tool'
      arguments: 'restore'
  
  - script: |
      dotnet tool run dotnet-gitversion $(Build.Repository.LocalPath) /output buildserver
    name: Run_dotnet_gitversion

  - powershell: |
      Write-Host "##vso[build.updatebuildnumber]${{parameters.buildName}}-$env:GITVERSION_FULLSEMVER"

      $var = (gci env:*).GetEnumerator() | Sort-Object Name
      $out = ""
      Foreach ($v in $var) {$out = $out + "`t{0,-28} = {1,-28}`n" -f $v.Name, $v.Value}

      write-output "dump variables on $env:BUILD_ARTIFACTSTAGINGDIRECTORY\test.md"
      $fileName = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\test.md"
      set-content $fileName $out

      write-output "##vso[task.addattachment type=Distributedtask.Core.Summary;name=Environment Variables;]$fileName"
    name: Update_build_number

  - powershell: |
      echo "[task.setvariable variable=GITVERSION_ASSEMBLYSEMVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMVER)"
      echo "[task.setvariable variable=GITVERSION_ASSEMBLYSEMFILEVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMFILEVER)"
      echo "[task.setvariable variable=GITVERSION_SHA;isOutput=true]$(GITVERSION.SHA)"
      echo "[task.setvariable variable=GITVERSION_FULLSEMVER;isOutput=true]$(GITVERSION.FULLSEMVER)"
      echo "##vso[task.setvariable variable=GITVERSION_ASSEMBLYSEMVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMVER)"
      echo "##vso[task.setvariable variable=GITVERSION_ASSEMBLYSEMFILEVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMFILEVER)"
      echo "##vso[task.setvariable variable=GITVERSION_SHA;isOutput=true]$(GITVERSION.SHA)"
      echo "##vso[task.setvariable variable=GITVERSION_FULLSEMVER;isOutput=true]$(GITVERSION.FULLSEMVER)"
    name: 'SetGitVersionVariables'

It is quite a long template, but basically it begins with the usual parameters section, followed by a series of steps, not an entire multi stage definition. The sequence of steps installs the requested .NET Core SDK, runs dotnet tool restore, runs GitVersion and finally does some PowerShell dumping of the variables. This kind of template is more similar to a Task Group, because it is basically just a sequence of steps with parameters.

With this approach you define small pieces of an entire pipeline, allowing more granular reuse. This other template contains the steps to build and run tests for a .NET Core solution.

parameters:
  dotNetCoreVersion: '3.1.201'
  buildConfiguration: release
  solution: ''
  continueOnTestErrors: true
  GitVersionFullSemVer: ''
  GitVersionAssemblyVer: ''
  GitVersionAssemblySemFileVer: ''
  SkipInstallDotNetCore: false

steps:

  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}
    condition: ne(${{parameters.SkipInstallDotNetCore}}, 'true')

  - task: DotNetCoreCLI@2 
    displayName: 'dotnet restore'
    inputs:
      command: restore
      projects: '${{parameters.solution}}'
      feedsToUse: config
      nugetConfigPath: src/NuGet.Config

  - task: DotNetCoreCLI@2
    displayName: 'dotnet build'
    inputs:
      command: build
      projects: '${{parameters.solution}}'
      configuration: '${{parameters.buildConfiguration}}'
      arguments: /p:AssemblyVersion=${{parameters.GitVersionAssemblyVer}} /p:FileVersion=${{parameters.GitVersionAssemblySemFileVer}} /p:InformationalVersion=${{parameters.GitVersionAssemblyVer}}_$(Build.SourceVersion)

  - task: DotNetCoreCLI@2
    displayName: 'dotnet test'
    inputs:
      command: test
      nobuild: true
      projects: '${{parameters.solution}}'
      arguments: '--configuration ${{parameters.buildConfiguration}} --collect "Code coverage" --logger trx' 
    continueOnError: ${{parameters.continueOnTestErrors}}

These steps declare as parameters some of the SemVer numbers extracted by GitVersion, so this sequence of steps relies on GitVersion having run in some preceding step. Finally, here is the last template, which is used to publish with NuGet.

parameters:
  dotNetCoreVersion: '3.1.201'
  buildConfiguration: release
  nugetProject: ''
  GitVersionFullSemVer: ''
  GitVersionAssemblyVer: ''
  GitVersionAssemblySemFileVer: ''
  SkipInstallDotNetCore: false

steps:

  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}
    condition: ne(${{parameters.SkipInstallDotNetCore}}, 'true')

  - task: DotNetCoreCLI@2
    displayName: NuGet Pack
    inputs:
      command: custom
      custom: pack
      projects: ${{parameters.nugetProject}}
      arguments: -o "$(Build.ArtifactStagingDirectory)\NuGet" -c ${{parameters.buildConfiguration}} /p:PackageVersion=${{parameters.GitVersionFullSemVer}} /p:AssemblyVersion=${{parameters.GitVersionAssemblyVer}} /p:FileVersion=${{parameters.GitVersionAssemblySemFileVer}} /p:InformationalVersion=${{parameters.GitVersionAssemblyVer}}_$(Build.SourceVersion)

  - task: NuGetCommand@2
    displayName: NuGet Push
    inputs:
      command: push
      packagesToPush: '$(Build.ArtifactStagingDirectory)\NuGet\*.nupkg'
      nuGetFeedType: internal
      publishVstsFeed: '95a01998-aa90-433c-8077-41da981289aa'
    continueOnError: true

Now that I have these three distinct template files in a repository of my Azure DevOps account, I can reference them from pipelines in other repositories.

Here is the real pipeline file of the final project, which uses all three step templates defined in the other repository.

trigger:
  branches:
    include:
      - master
      - develop
      - release/*
      - hotfix/*
      - feature/*

resources:
  repositories:
  - repository: templatesRepository
    type: git
    name: Jarvis/BuildScripts
    ref: refs/heads/develop

jobs:

- job: net_build_test

  pool:
    name: '$(Pool)'
    demands:
      - vstest
      - msbuild
      - visualstudio

  steps:
  - template: 'steps/GitVersionDotnetCoreLocal.yml@templatesRepository' # Template reference
    parameters:
      dotNetCoreVersion: '3.1.201'
      buildName: 'License Manager'
  
  - template: 'steps/BuildAndTestCore.yml@templatesRepository' # Template reference
    parameters:
      dotNetCoreVersion: '3.1.201'
      solution: 'src/LicenseManager.sln'
      SkipInstallDotNetCore: true
      GitVersionFullSemVer: '$(GITVERSION.FULLSEMVER)'
      GitVersionAssemblyVer: '$(GITVERSION.ASSEMBLYSEMVER)'
      GitVersionAssemblySemFileVer: '$(GITVERSION.ASSEMBLYSEMFILEVER)'

  - template: 'steps/PublishNuget.yml@templatesRepository' # Template reference
    parameters:
      dotNetCoreVersion: '3.1.201'
      nugetProject: 'src/LicenseManager.Client/LicenseManager.Client.csproj'
      GitVersionFullSemVer: '$(GITVERSION.FULLSEMVER)'
      GitVersionAssemblyVer: '$(GITVERSION.ASSEMBLYSEMVER)'
      GitVersionAssemblySemFileVer: '$(GITVERSION.ASSEMBLYSEMFILEVER)'
      SkipInstallDotNetCore: true

As you can see, with step templates I decide in the final pipeline how many stages I need; in this example I am perfectly comfortable with a single stage. Clearly this pipeline is more complex than one that uses a full template file, because we need to pass parameters to every template. The cool part of this approach is that I can mix standard steps and step template files, giving me more flexibility in how the pipeline is constructed, as shown in the sketch below.
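Here is a minimal, hypothetical sketch of that mixing: the inline echo step is invented, while the template references and parameters are the ones used above.

  steps:
  - template: 'steps/GitVersionDotnetCoreLocal.yml@templatesRepository'
    parameters:
      dotNetCoreVersion: '3.1.201'
      buildName: 'License Manager'

  # Hypothetical plain step mixed in between two step templates
  - script: echo "custom step between two step templates"
    displayName: 'Custom inline step'

  - template: 'steps/BuildAndTestCore.yml@templatesRepository'
    parameters:
      dotNetCoreVersion: '3.1.201'
      solution: 'src/LicenseManager.sln'
      SkipInstallDotNetCore: true
      GitVersionFullSemVer: '$(GITVERSION.FULLSEMVER)'
      GitVersionAssemblyVer: '$(GITVERSION.ASSEMBLYSEMVER)'
      GitVersionAssemblySemFileVer: '$(GITVERSION.ASSEMBLYSEMFILEVER)'

Running the pipeline gives you a standard run.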

Figure 1: Steps are expanded during execution.

As you can verify from Figure 1, all step templates are expanded into basic steps, allowing you to verify the output of every single step. Thanks to the local tools feature, I can simply run dotnet tool restore to have GitVersion automatically installed.

Figure 2: Restoring tooling with dotnet tool restore automatically restores GitVersion

This greatly simplifies agent requirements, because all the required tooling is automatically restored by the pipeline.

Gian Maria.

Continuous Integration in GitHub Actions, deploy in AzureDevops

My dear friend Matteo just published an interesting article on integration between GitHub Actions and Azure DevOps pipelines here. I have a different scenario: I have already published a GitHub release from a GitHub Action, but GitHub has nothing ready to deploy that release to my machines.

While GitHub is really fantastic for source code and is starting to have good CI support with Actions, on the release side it still lacks a solution. Usually this is not a problem, because we have Azure DevOps or other products that can fill the gap.

This specific project is a firewall that closes every port on a machine and listens on UDP ports for a specific message to open other ports; thus, a machine where the service is installed cannot be contacted unless you use the appropriate client to ask for a port to be opened. I want the deploy to be automatic: there is no way I am going to every machine, logging in via RDP and updating the service by hand; everything should happen automatically.

The really nice aspect of Azure DevOps release pipelines is that, once you have installed agents on one or more machines, those machines contact Azure DevOps and pull the work to do, without needing to be reachable from the outside world.

This is a key point of Azure DevOps release pipelines: you do not need any special setup on the deploy target, you simply need to let the target contact the Azure DevOps site (https://dev.azure.com).

Another nice aspect of Azure DevOps release pipelines is that they can use many sources for artifacts, not only those produced by an Azure DevOps CI pipeline. When you add an artifact, you can choose a GitHub release as well as Jenkins and other providers such as Azure Artifacts (check Matteo's article to see how to publish to Azure Artifacts from a GitHub Action).

Figure 1: Choose GitHub release as artifact source

To use GitHub as a source you should have already connected your Azure DevOps organization to GitHub with a service connection, another cool feature of Azure DevOps. As an administrator you can connect the Azure DevOps account to GitHub, then give specific people permission to use that service connection, without requiring them to know the real credentials of the service (GitHub in this example). Once you have one or more connections active you can simply choose the repository to use. In Figure 2 you can see the configuration I chose for my project.

Figure 2: Configure GitHub release as artifact source.

The settings are: the repository (1), the default version of the release to use (2) and finally the alias you use for that specific artifact in your release (3). Remember that a release can have more than a single artifact as a source; for a simple project like this one you probably have a single artifact.

Now you have the full power of Azure DevOps pipelines at your fingertips. In this specific example I just need to deploy a Windows service, and this is the pipeline I use to release to my stages.

Figure 3: Release pipeline for a Windows Service

This is a standard four-phase release for a service: the first step stops the service if it is running, then I extract the artifacts coming from GitHub as 7-zipped files, then I overwrite the directory where the service is installed and finally I install the service if needed and restart it.
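As a rough PowerShell sketch of the first and last phases (the service name and install path are hypothetical, and the real pipeline uses separate tasks for each phase):

$serviceName = 'PortOpenerFirewall'   # hypothetical service name

# Phase 1: stop the service only if it is installed and running
$service = Get-Service -Name $serviceName -ErrorAction SilentlyContinue
if ($service -and $service.Status -eq 'Running') {
    Stop-Service -Name $serviceName -Force
}

# Phases 2 and 3: extract the GitHub artifacts and overwrite the install directory (omitted)

# Phase 4: register the service if it is not installed yet, then start it
if (-not $service) {
    New-Service -Name $serviceName -BinaryPathName 'C:\Services\PortOpener\PortOpener.exe'
}
Start-Service -Name $serviceName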

Before launching the release, you need to be sure that you have at least one GitHub release associated with that repository; in this example I have release 0.4.1 and others available.

Figure 4: Available releases for my GitHub repository

When you create a release manually you can choose the GitHub release you want to use (if the release is triggered automatically it uses the release configured in the artifact configuration, usually the latest one). The connection is handled by Azure DevOps for you, with no need to know the GitHub credentials: just choose the version you want to install and bam, you are ready.

Figure 5: Choose the release you want to use directly from Azure DevOps

When the release starts, the target downloads the work to do, the agent is instructed to download the artifacts from GitHub, and then your scripts run, releasing the software.

Figure 6: Release completed, version 0.4.1 is now released on my machines.

As you can verify from the detail page, the artifacts are indeed downloaded from a standard GitHub release.

Figure 7: Artifacts downloaded directly from GitHub.

If everything runs successfully, you will have the new version installed on all the machines that are part of the deployment group used.

Figure 8: All steps executed successfully.

As you can see, Azure DevOps has a really powerful way to connect to other services like GitHub, and this is ideal to compensate for the gaps that other tools have at the moment. This leaves you free to compose your tool chain, using the service that is best for each specific part.

Gian Maria.

Azure DevOps pipeline template for build and release .NET core project

A few days ago I blogged about how to release projects on GitHub with Actions; now it is time to see how to do a similar thing in Azure DevOps to build / test / publish a .NET Core library with NuGet. The purpose is to create a generic template that can be reused by every project that needs to build a utility dll, run tests and publish to a NuGet feed.

The ability to create pipeline templates in Azure DevOps is a great opportunity to define a standard way to build / test / deploy projects in your organization.

Everything starts with a dedicated repository where I store a single build template file that creates a multi stage pipeline: the first stage builds and tests the .NET Core code, and the second stage publishes with NuGet. Such a simple build could be done with a single stage, but creating it as multi stage gives me the opportunity to explain some interesting aspects of Azure DevOps pipelines.

Everything starts with the parameters declaration.

parameters:
  buildName: 'Specify name'
  solution: ''
  buildPlatform: 'ANY CPU'
  buildConfiguration: 'Release'
  nugetProject: ''
  nugetProjectDir: ''
  dotNetCoreVersion: '2.2.301'
  pool: 'Default'
  nugetPublish: true

Every single parameter can have a default value and can be overridden. After the parameters, the first stage starts: it builds and tests the .NET Core project.

jobs:

- job: 'Build_and_Test'
  pool:
    name: ${{parameters.pool}}

  steps:
  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

  - task: DotNetCoreCLI@2
    displayName: 'install if needed dotnet gitversion tool'
    inputs:
      command: 'custom'
      custom: 'tool'
      arguments: 'update GitVersion.Tool --tool-path $(Agent.ToolsDirectory)/gitversion/5.1.3 --version 5.1.3'
  
  - script: |
      $(Agent.ToolsDirectory)/gitversion/5.1.3/dotnet-gitversion $(Build.Repository.LocalPath) /output buildserver

  - powershell: |
      Write-Host "##vso[build.updatebuildnumber]${{parameters.buildName}}-$env:GITVERSION_FULLSEMVER"

      $var = (gci env:*).GetEnumerator() | Sort-Object Name
      $out = ""
      Foreach ($v in $var) {$out = $out + "`t{0,-28} = {1,-28}`n" -f $v.Name, $v.Value}

      write-output "dump variables on $env:BUILD_ARTIFACTSTAGINGDIRECTORY\test.md"
      $fileName = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\test.md"
      set-content $fileName $out

      write-output "##vso[task.addattachment type=Distributedtask.Core.Summary;name=Environment Variables;]$fileName"

  - task: DotNetCoreCLI@2
    displayName: 'dotnet restore'
    inputs:
      command: restore
      projects: '${{parameters.solution}}'
      feedsToUse: config
      nugetConfigPath: src/NuGet.Config

  - task: DotNetCoreCLI@2
    displayName: 'dotnet build'
    inputs:
      command: build
      projects: '${{parameters.solution}}'
      configuration: '$(BuildConfiguration)'
      arguments: /p:AssemblyVersion=$(GITVERSION.ASSEMBLYSEMVER) /p:FileVersion=$(GITVERSION.ASSEMBLYSEMFILEVER) /p:InformationalVersion=$(GITVERSION.SHA)

  - task: DotNetCoreCLI@2
    displayName: 'dotnet test'
    inputs:
      command: test
      nobuild: true
      projects: '${{parameters.solution}}'
    continueOnError: true

  - powershell: |
      # The first four echoes only print the values to the build log for diagnostic purposes;
      # the ##vso lines below actually create the output variables consumed by the next stage.
      echo "[task.setvariable variable=GITVERSION_ASSEMBLYSEMVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMVER)"
      echo "[task.setvariable variable=GITVERSION_ASSEMBLYSEMFILEVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMFILEVER)"
      echo "[task.setvariable variable=GITVERSION_SHA;isOutput=true]$(GITVERSION.SHA)"
      echo "[task.setvariable variable=GITVERSION_FULLSEMVER;isOutput=true]$(GITVERSION.FULLSEMVER)"
      echo "##vso[task.setvariable variable=GITVERSION_ASSEMBLYSEMVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMVER)"
      echo "##vso[task.setvariable variable=GITVERSION_ASSEMBLYSEMFILEVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMFILEVER)"
      echo "##vso[task.setvariable variable=GITVERSION_SHA;isOutput=true]$(GITVERSION.SHA)"
      echo "##vso[task.setvariable variable=GITVERSION_FULLSEMVER;isOutput=true]$(GITVERSION.FULLSEMVER)"
    name: 'SetGitVersionVariables'

You can recognize in this script many of the techniques already discussed in the previous GitHub Actions post, just adapted for Azure DevOps. The main difference is that Actions lean toward a simple way to execute a "script" composed of a series of command line instructions and tasks, while Pipelines are more structured around creating a workflow, but everything is really similar.

Since this pipeline runs on Windows, I can simply use the PowerShell task to execute inline scripts. The most peculiar part is the last PowerShell script, which contains a series of pipeline logging commands echoing ##vso to the output stream. The purpose of that step is to save some variable values to be reused in subsequent stages. This is a killer feature: in this example I run GitVersion in the first stage only, then pass all of its output to be reused by the subsequent stages.

The ability to pass variable values between stages opens up a wide range of opportunities: you can run special tools on special agents, then reuse their output in all the other stages.

This is really handy if you need to execute subsequent stages in a different operating system / environment and you simply want to reuse some variable values calculated in a previous stage. Suppose you have a tool that runs only on Windows: you can run it in one stage, then reuse its output in subsequent stages that run on Linux.

The publish stage is really simple; the only really interesting part is its declaration.

- job: 'Pack_Nuget'
  dependsOn: 'Build_and_Test'
  condition: eq(${{parameters.nugetPublish}}, true)

  pool:
    name: ${{parameters.pool}}

  variables:
    GITVERSION_ASSEMBLYSEMVER: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_ASSEMBLYSEMVER'] ]
    GITVERSION_ASSEMBLYSEMFILEVER: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_ASSEMBLYSEMFILEVER'] ]
    GITVERSION_SHA: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_SHA'] ]
    GITVERSION_FULLSEMVER: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_FULLSEMVER'] ]

The stage starts with a name and a dependency declaration on the previous Build_and_Test stage; this implies that this stage can run only if the previous stage ran successfully. Execution is also conditioned on a parameter called nugetPublish, which must be true for this stage to execute. This allows the pipeline that uses this template to choose whether the publish stage should run.

The ability to conditionally execute stages allows for complex workflows, where each stage can influence the execution of the following ones.

Following the declaration we find a variables section, where I load variables from the previous stage into this stage. In this specific example I am retrieving all the GitVersion output values that I need to build the NuGet package.

The stage ends with a standard pack and publish of the NuGet package, using the SemVer numbers passed from the previous stage.

steps:

  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

  - powershell: |
      echo "GITVERSION_ASSEMBLYSEMVER $(GITVERSION_ASSEMBLYSEMVER)"
      echo "GITVERSION_ASSEMBLYSEMFILEVER $(GITVERSION_ASSEMBLYSEMFILEVER)"
      echo "GITVERSION_SHA $(GITVERSION_SHA)"
      echo "GITVERSION_FULLSEMVER $(GITVERSION_FULLSEMVER)"
    name: 'Dumpvariables'

  - task: DotNetCoreCLI@2
    displayName: NuGet Pack
    inputs:
      command: custom
      custom: pack
      projects: ${{parameters.nugetProject}}
      arguments: -o "$(Build.ArtifactStagingDirectory)\NuGet" -c ${{parameters.BuildConfiguration}} /p:PackageVersion=$(GITVERSION_FULLSEMVER) /p:AssemblyVersion=$(GITVERSION_ASSEMBLYSEMVER) /p:FileVersion=$(GITVERSION_ASSEMBLYSEMFILEVER) /p:InformationalVersion=$(GITVERSION_SHA)

  - task: NuGetCommand@2
    displayName: NuGet Push
    inputs:
      command: push
      packagesToPush: '$(Build.ArtifactStagingDirectory)\NuGet\*.nupkg'
      nuGetFeedType: internal
      publishVstsFeed: '95a01998-aa90-433c-8077-41da981289aa'
    continueOnError: true

Once this template file is checked in to an Azure DevOps repository, you can reference it from another project in the same organization. This is the real power of templates: I wrote the definition once, in a dedicated repository, and every other project that needs a pipeline to build / test / publish can simply refer to this template. With a few lines of YAML you can create a pipeline for your new project.

trigger:
  branches:
    include:
      - master
      - develop
      - release/*
      - hotfix/*
      - feature/*

resources:
  repositories:
  - repository: templatesRepository
    type: git
    name: Jarvis/BuildScripts
    ref: refs/heads/develop

jobs:

- template: 'NetStandardTestAndNuget.yaml@templatesRepository'
    
  parameters:
    buildName: 'LicenseManager'
    solution: 'src/LicenseManager.sln'
    pool: '$(pool)'
    dotNetCoreVersion: '3.1.100'
    nugetPublish: true
    nugetProject: 'src/LicenseManager.Core/LicenseManager.Core.csproj'

Look at how simple it is: just define the triggers, add the repository that contains the build script in the resources section, simply populate the parameters and, BAM, your project has a pipeline for build / test / publish.

The ref parameter of the repository resource lets you choose which branch to use to grab the script template. In this project I want the latest trunk version, so I chose develop; other projects can stay on master to have a more stable version.

The template once used an old GitVersion task of mine, which has become really obsolete and is not worth maintaining anymore. I decided to upgrade the template to use the dotnet-gitversion command line tool: I upgraded the template in a feature branch, using one project as a test, then I merged into develop, and when I finally close it on master, every project that uses this template will use the new definition, without any user intervention.

Thanks to templates I can upgrade the template in a dedicated branch, test it with a real project, then promote the upgrade through the standard develop, release and master branches to automatically upgrade the pipelines of all projects that use this template.

How cool is that.

It is actually superfluous to tell you how important it is to have an automatic build / test pipeline; as an example, it seems that last night I broke the tests :), shame on me.

Figure 1: Build results showing me that some tests fail

The nice aspect of Azure DevOps pipelines is that they have a dedicated section to examine test failures, which gives me immediate insight into what went wrong. It seems I messed something up in exception handling.

Figure 2: Dedicated pane to view test results.

Azure DevOps pipelines are more complex than GitHub Actions, but they can also solve more complex problems and are (as of today) probably better suited for an enterprise, especially with the ability to define templates that standardize how the company's projects are built. Another key value is the ability to immediately explore failed tests and code coverage for your build, not to mention multi stage pipelines to create complex build / release workflows.

We currently have an overlap between Azure DevOps pipelines and GitHub Actions, both now owned by Microsoft. My advice is simply to look at their capabilities as of today and choose what suits you better.

Remember also that you can easily use an Azure DevOps pipeline to build a GitHub project: just point the build to a GitHub repository after you have connected your GitHub organization / account to Azure DevOps. The only limitation is that the build template file must still reside in the Azure DevOps account (a limitation that will probably go away soon).

Remember that you can freely use Azure DevOps pipelines to build GitHub projects without any problem; just choose the product that better suits your needs.

Gian Maria.