Continuous Integration in GitHub Actions, deploy in Azure DevOps

My dear friend Matteo just published an interesting article on integration between GitHub Actions and Azure DevOps Pipelines here. I have a different scenario: I have already published a GitHub release from a GitHub Action, but I have nothing in place to deploy that release to my machines.

While GitHub is really fantastic for source code and is starting to have good support for CI with Actions, it still lacks a solution for the release part. Usually this is not a problem, because we have Azure DevOps or other products that can fill the gap.

This specific project is a firewall that closes every port on a machine and listens on UDP ports for a specific message to open other ports; thus, a machine where the service is installed cannot be contacted unless you use the appropriate client to ask for ports to be opened. I want the deployment to be automatic: there is no way I am going to log in to all my machines with RDP and update the service by hand, everything should happen automatically.

The really nice aspect of Azure DevOps release pipelines is that, once you have installed agents on one or more machines, those machines contact Azure DevOps and pull the work to do, without any need for the machines to be reachable from the outside world.

This is a key point of Azure DevOps release pipelines: you do not need any special setup on the deploy target, you simply need the target to be able to contact the Azure DevOps site (https://dev.azure.com).

Another nice aspect of Azure DevOps release pipelines is that they can use many sources for artifacts, not only those produced by an Azure DevOps CI pipeline. When you add an artifact, you can choose a GitHub release, as well as Jenkins, Azure Artifacts and other sources (check Matteo’s article to see how to publish to Azure Artifacts from a GitHub Action).


Figure 1: Choose GitHub release as artifact source

To use GitHub as a source you should have already connected your Azure DevOps organization to GitHub with a service connection, another cool feature of Azure DevOps. As an administrator you can connect the Azure DevOps account to GitHub, then give specific people permission to use that service connection, without requiring them to know the real credentials of the service (GitHub in this example). Once you have one or more active connections you can simply choose the repository to use. In Figure 2 you can see the configuration I chose for my project.


Figure 2: Configure GitHub release as artifact source.

The settings are: repository (1), the default version of the release to use (2) and finally the alias you will use for that specific artifact in your release (3). Remember that a release can have more than one artifact as a source; for a simple project like this one, you will probably have a single artifact.

Now you have the full power of Azure DevOps pipelines at your fingertips. In this specific example I just need to deploy a Windows Service, and this is the pipeline that releases it in my stages.


Figure 3: Release pipeline for a Windows Service

This is a standard four-phase release for a Windows service: the first step stops the service if it is running, then I extract the artifacts coming from GitHub as 7-zipped files, then I overwrite the directory where the service is installed, and finally I install the service if needed and restart it.
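
The release pipeline in Figure 3 is built with the visual designer, but the four phases boil down to a handful of steps. Just as a minimal sketch of what they do (the service name, install folder and the GitHubRelease artifact alias are hypothetical, and the extract step assumes the standard Extract files task, which understands 7z archives):

steps:
- powershell: |
    # Stop the service only if it is already installed and running
    $svc = Get-Service -Name 'OpenPortService' -ErrorAction SilentlyContinue
    if ($svc -and $svc.Status -eq 'Running') { Stop-Service -Name 'OpenPortService' }
  displayName: 'Stop service if running'

- task: ExtractFiles@1
  displayName: 'Extract 7-zipped GitHub release artifact'
  inputs:
    # Artifacts are downloaded to a folder named after the artifact alias chosen in Figure 2
    archiveFilePatterns: '$(System.DefaultWorkingDirectory)/GitHubRelease/*.7z'
    destinationFolder: 'C:\Services\OpenPortService'
    cleanDestinationFolder: false

- powershell: |
    # Install the service if it is missing, then (re)start it
    if (-not (Get-Service -Name 'OpenPortService' -ErrorAction SilentlyContinue)) {
      New-Service -Name 'OpenPortService' -BinaryPathName 'C:\Services\OpenPortService\OpenPortService.exe'
    }
    Start-Service -Name 'OpenPortService'
  displayName: 'Install and start service'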

Before launching the release, you need to be sure that you have at least one release associated with that repository; in this example I have release 0.4.1 and others available.


Figure 4: Available releases for my GitHub repository

When you create a release manually, you can choose the GitHub release you want to use (if the release is triggered automatically it will use the release configured in the artifact settings, usually the latest one). The connection is handled by Azure DevOps for you, no need to know GitHub credentials: just choose the version you want to install and, bam, you are ready.


Figure 5: Choose the release you want to use directly from Azure DevOps

When the release starts, the agent on your target machine pulls the job, downloads the artifacts from GitHub and then runs your scripts to release the software.


Figure 6: Release completed, version 0.4.1 is now released on my machines.

As you can verify from the detail page, the artifacts are indeed downloaded from a standard GitHub release.


Figure 7: Artifacts downloaded directly from GitHub.

If everything runs successfully, you will have the new version installed on all the machines that are part of the deployment group used.


Figure 8: All steps executed successfully.

As you can see, Azure DevOps has a really powerful way to connect to other services like GitHub, and this is ideal to compensate for the gaps that other tools have at the moment. This leaves you free to compose your tool chain, using the service that is best for each specific part.

Gian Maria.

Azure DevOps pipeline template to build and release a .NET Core project

Some days ago I blogged about how to release projects on GitHub with Actions; now it is time to see how to do a similar thing in Azure DevOps to build / test / publish a .NET Core library with NuGet. The purpose is to create a generic template that can be reused by every project that needs to build a utility dll, run tests and publish it to a NuGet feed.

The ability to create pipeline templates in Azure DevOps is a great opportunity to define a standard way to build / test / deploy projects in your organization.

Everything starts with a dedicated repository where I store a single build template file that creates a multistage pipeline: the first stage builds and tests the .NET Core code, the second stage publishes to NuGet. Such a simple build could be done in a single stage, but creating it as a multistage pipeline gives me the opportunity to explain some interesting aspects of Azure DevOps pipelines.

Everything starts with parameters declaration.

parameters:
  buildName: 'Specify name'
  solution: ''
  buildPlatform: 'ANY CPU'
  buildConfiguration: 'Release'
  nugetProject: ''
  nugetProjectDir: ''
  dotNetCoreVersion: '2.2.301'
  pool: 'Default'
  nugetPublish: true

Every parameter can have a default value and can be overridden by the consuming pipeline (you will see an example at the end of the post). After the parameters, the first stage starts: it builds and tests the .NET Core project.

jobs:

- job: 'Build_and_Test'
  pool:
    name: ${{parameters.pool}}

  steps:
  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

  - task: DotNetCoreCLI@2
    displayName: 'install if needed dotnet gitversion tool'
    inputs:
      command: 'custom'
      custom: 'tool'
      arguments: 'update GitVersion.Tool --tool-path $(Agent.ToolsDirectory)/gitversion/5.1.3 --version 5.1.3'
  
  - script: |
      $(Agent.ToolsDirectory)/gitversion/5.1.3/dotnet-gitversion $(Build.Repository.LocalPath) /output buildserver

  - powershell: |
      Write-Host "##vso[build.updatebuildnumber]${{parameters.buildName}}-$env:GITVERSION_FULLSEMVER"

      $var = (gci env:*).GetEnumerator() | Sort-Object Name
      $out = ""
      Foreach ($v in $var) {$out = $out + "`t{0,-28} = {1,-28}`n" -f $v.Name, $v.Value}

      write-output "dump variables on $env:BUILD_ARTIFACTSTAGINGDIRECTORY\test.md"
      $fileName = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\test.md"
      set-content $fileName $out

      write-output "##vso[task.addattachment type=Distributedtask.Core.Summary;name=Environment Variables;]$fileName"

  - task: DotNetCoreCLI@2
    displayName: 'dotnet restore'
    inputs:
      command: restore
      projects: '${{parameters.solution}}'
      feedsToUse: config
      nugetConfigPath: src/NuGet.Config

  - task: DotNetCoreCLI@2
    displayName: 'dotnet build'
    inputs:
      command: build
      projects: '${{parameters.solution}}'
      configuration: '$(BuildConfiguration)'
      arguments: /p:AssemblyVersion=$(GITVERSION.ASSEMBLYSEMVER) /p:FileVersion=$(GITVERSION.ASSEMBLYSEMFILEVER) /p:InformationalVersion=$(GITVERSION.SHA)

  - task: DotNetCoreCLI@2
    displayName: 'dotnet test'
    inputs:
      command: test
      nobuild: true
      projects: '${{parameters.solution}}'
    continueOnError: true

  - powershell: |
      # Plain echoes: these strings are only printed to the build log (no leading ##vso, so they are not processed as commands)
      echo "[task.setvariable variable=GITVERSION_ASSEMBLYSEMVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMVER)"
      echo "[task.setvariable variable=GITVERSION_ASSEMBLYSEMFILEVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMFILEVER)"
      echo "[task.setvariable variable=GITVERSION_SHA;isOutput=true]$(GITVERSION.SHA)"
      echo "[task.setvariable variable=GITVERSION_FULLSEMVER;isOutput=true]$(GITVERSION.FULLSEMVER)"
      # Actual logging commands (##vso prefix): export the values as output variables for subsequent stages
      echo "##vso[task.setvariable variable=GITVERSION_ASSEMBLYSEMVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMVER)"
      echo "##vso[task.setvariable variable=GITVERSION_ASSEMBLYSEMFILEVER;isOutput=true]$(GITVERSION.ASSEMBLYSEMFILEVER)"
      echo "##vso[task.setvariable variable=GITVERSION_SHA;isOutput=true]$(GITVERSION.SHA)"
      echo "##vso[task.setvariable variable=GITVERSION_FULLSEMVER;isOutput=true]$(GITVERSION.FULLSEMVER)"
    name: 'SetGitVersionVariables'

You can recognize in this script many of the techniques already discussed in the previous GitHub Actions post, just declined for Azure DevOps. The main difference is that Actions are geared toward a simple way to execute a “script” composed of a series of command-line instructions and tasks, while Pipelines are more structured to create a workflow, but everything is really similar.

Since this pipeline runs on Windows, I can simply use the PowerShell task to execute inline scripts. The most peculiar part is the last PowerShell step, which contains a series of pipeline logging commands, echoing ##vso strings to the output stream. The purpose of that step is to save some variable values so they can be reused in subsequent stages. This is a killer feature: in this example I run GitVersion in the first stage only, then pass all its output to subsequent stages.

The ability to pass variable values between stages opens a wide range of opportunities: you can run special tools on dedicated agents, then reuse their output in all the other stages.

This is really handy if you need to execute subsequent stages on different operating systems / environments and want to reuse variable values calculated in a previous stage. Suppose you have a tool that runs only on Windows: you can run it in one stage, then reuse its output in subsequent stages that run on Linux.
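
To see the mechanism in isolation, here is a condensed sketch of the producer / consumer pattern used above (job, step and variable names are purely illustrative):

jobs:

- job: Producer
  steps:
  - powershell: |
      # isOutput=true makes the variable reachable from other jobs through the dependencies context
      echo "##vso[task.setvariable variable=MyVersion;isOutput=true]1.2.3"
    name: 'SetVars'

- job: Consumer
  dependsOn: Producer
  variables:
    # Map the output variable of the Producer job into this job
    MyVersion: $[ dependencies.Producer.outputs['SetVars.MyVersion'] ]
  steps:
  - powershell: echo "Version calculated in the previous job is $(MyVersion)"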

The publish stage is really simple; the only really interesting part is its declaration.

- job: 'Pack_Nuget'
  dependsOn: 'Build_and_Test'
  condition: eq(${{parameters.nugetPublish}}, true)

  pool:
    name: ${{parameters.pool}}

  variables:
    GITVERSION_ASSEMBLYSEMVER: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_ASSEMBLYSEMVER'] ]
    GITVERSION_ASSEMBLYSEMFILEVER: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_ASSEMBLYSEMFILEVER'] ]
    GITVERSION_SHA: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_SHA'] ]
    GITVERSION_FULLSEMVER: $[ dependencies.Build_and_Test.outputs['SetGitVersionVariables.GITVERSION_FULLSEMVER'] ]

The stage starts with a name and a dependency declaration on the previous Build_and_Test stage; this implies that this stage can run only if the previous stage ran successfully. The execution also depends on a parameter called nugetPublish, which should be true for this stage to execute. This allows the pipeline that uses this template to choose whether the publish stage should run.

The ability to conditionally execute stages allows for complex workflows, where each stage can decide whether the following stages execute.
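
One caveat worth knowing: as far as I understand, specifying a custom condition replaces the implicit succeeded() check that dependsOn normally gives you, so if you also want the publish job to run only when Build_and_Test succeeded you can combine the two conditions, for example:

  condition: and(succeeded(), eq('${{ parameters.nugetPublish }}', 'true'))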

Following the declaration we find a variables section, where I load the variables of the previous stage into this stage. In this specific example I am retrieving all the GitVersion output values that I need to build the NuGet package.

The stage ends with a standard pack and push of the NuGet package, using the SemVer numbers that were passed from the previous stage.

steps:

  - task: DotNetCoreInstaller@2
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

  - powershell: |
      echo "GITVERSION_ASSEMBLYSEMVER $(GITVERSION_ASSEMBLYSEMVER)"
      echo "GITVERSION_ASSEMBLYSEMFILEVER $(GITVERSION_ASSEMBLYSEMFILEVER)"
      echo "GITVERSION_SHA $(GITVERSION_SHA)"
      echo "GITVERSION_FULLSEMVER $(GITVERSION_FULLSEMVER)"
    name: 'Dumpvariables'

  - task: DotNetCoreCLI@2
    displayName: NuGet Pack
    inputs:
      command: custom
      custom: pack
      projects: ${{parameters.nugetProject}}
      arguments: -o "$(Build.ArtifactStagingDirectory)\NuGet" -c ${{parameters.BuildConfiguration}} /p:PackageVersion=$(GITVERSION_FULLSEMVER) /p:AssemblyVersion=$(GITVERSION_ASSEMBLYSEMVER) /p:FileVersion=$(GITVERSION_ASSEMBLYSEMFILEVER) /p:InformationalVersion=$(GITVERSION_SHA)

  - task: NuGetCommand@2
    displayName: NuGet Push
    inputs:
      command: push
      packagesToPush: '$(Build.ArtifactStagingDirectory)\NuGet\*.nupkg'
      nuGetFeedType: internal
      publishVstsFeed: '95a01998-aa90-433c-8077-41da981289aa'
    continueOnError: true

Once this template file is checked in to an Azure DevOps repository, you can refer to it from other projects in the same organization. This is the real power of templates: I wrote the definition once in a dedicated repository, and every other project that needs a pipeline to build / test / publish can simply refer to this template. With a few lines of YAML you can create a pipeline for your new project.

trigger:
  branches:
    include:
      - master
      - develop
      - release/*
      - hotfix/*
      - feature/*

resources:
  repositories:
  - repository: templatesRepository
    type: git
    name: Jarvis/BuildScripts
    ref: refs/heads/develop

jobs:

- template: 'NetStandardTestAndNuget.yaml@templatesRepository'
  parameters:
    buildName: 'LicenseManager'
    solution: 'src/LicenseManager.sln'
    pool: '$(pool)'
    dotNetCoreVersion: '3.1.100'
    nugetPublish: true
    nugetProject: 'src/LicenseManager.Core/LicenseManager.Core.csproj'

Look at how simple it is: just define the triggers, add the repository that contains the build script to the resources section, populate the parameters and, bam, your project has a pipeline for build / test / publish.

The ref property of the repository resource lets you choose which branch to grab the script template from; in this project I want the latest trunk version, so I chose develop, while other projects can stay on master to have a more stable version.

The template originally used an old custom task of mine to run GitVersion, a task that has become really obsolete and is not worth maintaining anymore. I decided to upgrade the template to use the dotnet-gitversion command line tool: I upgraded the template in a feature branch, using one project as a test, then merged into develop, and when I finally merge it into master, every project that uses this template will pick up the new definition without any user intervention.

Thanks to templates I can upgrade the definition in a dedicated branch, test it with a real project, then promote the upgrade through the standard develop, release and master branches to automatically upgrade the pipelines of all the projects that use this template.

How cool is that.

It is actually superfluous to tell you how important it is to have an automatic build / test pipeline; as an example, it seems that last night I broke the tests, shame on me.


Figure 1: Build results showing me that some tests fail

The nice aspect of Azure DevOps pipelines is that they have a dedicated section to examine test failures, which gives me immediate insight into what went wrong. It seems that I messed something up in exception handling.


Figure 2: Dedicated pane to view test results.

Actually Azure DevOps pipelines are more complex than GitHub Actions, but they can also solve more complex problems and are (as of today) probably better suited for an enterprise, especially with the ability to define templates that standardize how the company builds its projects. Another key value is the ability to immediately explore failed tests and code coverage for your build, not to mention multistage pipelines to create complex build / release workflows.

Actually there is an overlap between Azure DevOps pipelines and GitHub Actions, both now owned by Microsoft. My advice is to just look at what the capabilities are today and choose what suits you better.

Remember also that you can easily use an Azure DevOps pipeline to build a GitHub project: just point the build to a GitHub repository after you have connected your GitHub organization / account to Azure DevOps. The only limitation is that the build template file must still reside in an Azure DevOps repository (a limitation that will probably be removed soon).

Remember that you can freely use Azure DevOps pipelines to build GitHub projects without any problem; just choose the product that better suits your needs.

Gian Maria.

GitHub Actions improvements

GitHub Actions is really the new kid on the block and, even if I still prefer Azure DevOps pipelines because they are much more production ready, GitHub Actions is evolving rapidly.


Figure 1: GitHub Actions now has a dedicated editor to quickly include actions in a workflow

As you can see in Figure 1, when you edit a workflow file in the GitHub online editor you can simply browse all the available actions. Choosing a specific action reveals the snippet of text you should enter to use that action, without the need to search around.


Figure 2: Detail of the action, with a nice button to copy the action text to the clipboard.

This feature suggests that it is better to use the GitHub online editor to create and edit your workflow files, even if they are simple text files that can be edited with your favorite code editor.

If you need to author a GitHub Actions workflow file, prefer the online editor over a simple offline editor.

If you edit your workflow directly in GitHub you also get syntax checking to avoid errors, as you can see in Figure 3.


Figure 3: Syntax checking during editing online

Syntax checking is not only available for classic YAML errors, like in Figure 3 where the editor spotted a basic indentation error; it can also check semantic errors based on the action schema, guiding the user during editing.


Figure 4: Syntax highlighting during editing that can suggest the syntax used to author the action

You also have an online diff, so you can check what you really changed during the editing session.


Figure 5: Diff on actions to verify what you changed in the workflow file during the editing session.

As you can see, the online editor is quite powerful and allows you to quickly edit a workflow definition directly on the web.

Gian Maria.

Use the latest OS image tag in GitHub Actions

I have a nice GitHub Actions workflow that runs builds and tests on my project, and I noticed that some of the latest runs have a problem.


Figure 1: My workflow ran only one of the matrix combinations

The workflow has two distinct runs because it uses a matrix: I want to run it against both Linux and Windows, but it seems that it does not run on Windows anymore.

GitHub actions can run the very same worfklow for a different combination of parameters, the most common setup is running on different operating systems.

A quick check reveals that the image I’m using is not available anymore.


Figure 2: Warning telling me that the image does not exist.

Actually I do not know why this is a simple warning and not a real error, but the reason is clear: the windows-2016 image has been removed, so the corresponding matrix entry is invalid.

When you write your workflows, it is probably better not to stick to a specific version of the image, but instead use the xxx-latest one to avoid similar problems.
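
As a quick sketch, a matrix that targets the latest image labels could look like this (the job name and steps are purely illustrative):

jobs:
  build:
    strategy:
      matrix:
        # -latest labels follow the hosted images as they are upgraded
        os: [windows-latest, ubuntu-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - run: dotnet test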


Figure 3: If you can, you should use the latest version of all images.

The reason why I chose windows-2016 is that I had no Windows Server 2019 version for some of the container images I’m using, so I forced the older image version. Clearly now it is time for me to update my Docker images (something I have already done, if you read the latest post) and move to windows-latest.

As a rule of thumb, if you do not have specific requirements, it is preferable to choose the xxxx-latest version of the images to avoid such problems.

Gian Maria.

GitHub Actions plus Azure Docker Registry

I have some projects that need SQL Server and MongoDB or ElasticSearch to run integration tests; these kinds of requirements make it difficult to use hosted agents for the build (in Azure DevOps, or in whatever build system you are using where a provider gives you pre-configured machines to run your workflow). Usually each build engine makes it possible to run your own agent, and GitHub Actions is no different (you can read about self-hosted action runners here: https://help.github.com/en/actions/hosting-your-own-runners/about-self-hosted-runners).
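
For completeness, routing a job to one of your self-hosted runners is just a matter of the runs-on labels; a minimal sketch (labels and steps are illustrative):

jobs:
  integration-tests:
    # self-hosted plus an OS label picks one of your own registered runners
    runs-on: [self-hosted, windows]
    steps:
      - uses: actions/checkout@v2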

Since running your own agents requires some work, using the available hosted agents is the best solution, but since the list of possible software is almost infinite, you cannot blame Microsoft or other providers for not having some specific software preinstalled on the build machines.

If you are using GitHub Actions, you can read the list of software pre-installed on GitHub Actions agents here: https://github.com/actions/virtual-environments/blob/master/images/win/Windows2019-Readme.md. The list is quite big, but it is clearly far from being enough if you need to run some sort of integration testing.

It is quite common for a complex project to require some software to be preinstalled on the build agent, making it difficult to use the already available public agents.

In my situation I need to run tests against MongoDB and Microsoft SQL Server, both of them missing from the GitHub Actions images. Luckily enough, Docker is available, allowing you to run any prerequisite that is available as a Docker image.

In my specific situation I need to run the build on Windows and, sadly enough, there are not as many Windows-based container images as there are for Linux. In such a scenario the solution is easy: just create your Dockerfile based on a Windows image and build your own images.

For SQL Server 2019 I was able to download the Dockerfile of SQL Server for Windows Server 2016, and simply changing the base image to match Windows Server 2019 did the trick. I did the very same with MongoDB, so I was able to build both Docker images for Windows Server 2019 in mere minutes.

The MongoDB and SQL Server images could be put in a public repository, because they do not contain anything related to my software, but I want to be sure I can also publish images to a private repository, because not every prerequisite can really be public. For this reason I created a private Docker registry on Azure.


Figure 1: My private registry on Azure

Thanks to Azure I was able to create a private registry with a few clicks. I can access the registry using a standard Azure login, but if I need to do a standard docker login with username and password, I can use the access keys as username / password.


Figure 2: Access to my registry enabled thanks to a standard username and password

While this is not the most secure way to access your registry, using a username and password allows you to easily access the registry directly from a GitHub Action.

The usual question now is: where do I store the username and password for this registry in a secure way, so they can be used by GitHub Actions? The answer is: repository secrets.


Figure 3: Secrets settings can be added to every repository to store sensitive data to be used by GitHub actions.

Since only admins of the repository can create and manage secrets, you can be sure that no one other than repo admins can read them. This is usually enough for security, but remember that, since secrets can be used in GitHub Actions, there is always the risk that someone with access to the code creates an action to dump those secrets somewhere.

If secrets are secure enough for you, you can simply use a specific action to log in to the Azure Docker registry using the secrets stored in the repository.

    - uses: azure/docker-login@v1
      with:
        login-server: 'prxmdocker.azurecr.io' 
        username: '${{ secrets.azure_docker_user }}'
        password: '${{ secrets.azure_docker_token }}'

As you can see, no key is stored directly inside my source code: the build can safely access my secrets, but no standard user can read them by simply looking at the code.

Actually this level of security is enough for most users; in my situation I only have SQL Server and MongoDB images, so this is more than enough, but I need to remind you again that anyone who can modify an action could possibly create or change an action definition to send those secrets somewhere.

Once everything is in place, thanks to the azure/docker-login@v1 action I can use all the Docker images that are present in my Azure registry.

    - name: Start Docker for MSSql
      run: docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=sqlPw3$secure' -e 'MSSQL_PID=Developer' -p 1433:1433 --name mssql -d prxmdocker.azurecr.io/sql-2019:1.0
      
    - name: Start Docker for Mongodb
      run: docker run -d -p 27017:27017 prxmdocker.azurecr.io/mongodb-2019:1.0

As you can see, I can simply ask my workflow to start a SQL Server container and a MongoDB container using images taken from my private registry.


Figure 4: GitHub action running using images from my private repository

As you can see, accessing a private Docker registry on Azure from a GitHub Action is really simple, and it allows you to automatically run the prerequisites for your workflows with a few lines of code. The only drawback is that, for big images like the SQL Server one (it is more than 1.5 GB in size), downloading the image takes some time in the workflow run. In my example, downloading and running my custom SQL Server image takes 5 minutes.

Gian Maria