Azure DevOps and SecDevOps

One of the cool aspects of Azure DevOps is its extensibility through the marketplace, and for security you can find a nice marketplace add-in called OWASP ZAP that can be used to automate OWASP tests for web applications.

You can also check a nice article on MSDN that explains how you can leverage OWASP ZAP analysis during a deployment with a release pipeline.
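As a rough sketch of the idea, you can also run the ZAP baseline scan from a pipeline using the official Docker image; the target URL and artifact names below are placeholders for illustration, and the marketplace task may offer a more polished experience:

```yaml
# Hedged sketch: run the OWASP ZAP baseline scan against a deployed site.
# https://my-staging-site.example.com is a placeholder target URL.
steps:
  - script: |
      docker run -v $(Build.ArtifactStagingDirectory):/zap/wrk/:rw \
        owasp/zap2docker-stable zap-baseline.py \
        -t https://my-staging-site.example.com -r zap-report.html
    displayName: 'OWASP ZAP baseline scan'
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: '$(Build.ArtifactStagingDirectory)/zap-report.html'
      artifactName: 'zap-report'
```

The baseline scan is passive, so it is safe to run against a staging environment on every deploy.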

Really good stuff to read and use.

Another gem of Azure DevOps: multi stage pipelines

With the deployment of Sprint 151 we have exciting news for Azure DevOps: multi stage pipelines. If you read my blog you should already know that I'm a huge fan of YAML build definitions, but until now, for the release part, you still had to use the standard graphical editor. Thanks to multi stage pipelines, you can now have both build and release definitions in a single YAML file.

Multi stage pipelines will be the unified way to create a YAML file that contains both build and release definitions for your projects.

This functionality is still in preview and you can find a good starting point here. Some key features are still missing, but you can read in a previous post about what's next for them, and this should reassure you that this is an area where Microsoft is investing a lot.

Let's start creating the first real pipeline to deploy an application based on IIS. First of all I'm starting from an existing YAML build: I just create another YAML file, then copy in all the YAML of the existing build, but at the head of the file I use a slightly different syntax.


Figure 1: New Multistage pipeline definition
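The head of the file looks more or less like this (a minimal sketch: the stage and job names match those described below, while the individual steps are illustrative placeholders):

```yaml
stages:
  - stage: Build
    jobs:
      - job: Build_and_package
        pool:
          vmImage: 'windows-latest'
        steps:
          - script: dotnet build MySolution.sln    # illustrative build step
          - script: dotnet test MySolution.sln     # illustrative test step
          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: '$(Build.ArtifactStagingDirectory)'
              artifactName: 'drop'
  - stage: Deploy
    dependsOn: Build
    jobs:
      - job: Deploy_fake
        steps:
          - script: echo "Deploy placeholder"      # the deploy stage does nothing yet
```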

As you can see, the pipeline starts with the stages keyword, then a stage section begins that basically contains a standard build: in fact I have a single job in the Build stage, called Build_and_package, that takes care of building, testing and finally publishing artifacts.

After the pipeline is launched, here is the result (Figure 2):


Figure 2: Result of a multistage pipeline

As you can see, the result is really different from a normal pipeline: first of all I can see all the stages (actually my deploy job is fake and does nothing). The pipeline is now composed of stages, where each stage contains jobs, and each job is a series of tasks. Clicking on the Jobs section you can see the outcome of each job, which allows me to have a quick look at what really happened.


Figure 3: Job results as a full list of all jobs for each stage.

When it is time to deploy, we target environments. Unfortunately in this early preview we can only add Kubernetes namespaces to an environment, but we expect to soon be able to add virtual machines through deployment groups and, clearly, Azure Web Apps and other Azure resources.
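A deployment job targeting an environment looks roughly like this (a sketch, where the environment name MyEnvironment is a placeholder you would replace with one defined in your project):

```yaml
stages:
  - stage: Deploy
    jobs:
      - deployment: DeployWeb
        environment: 'MyEnvironment'   # placeholder: an environment defined in the project
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "Deploying to the environment"
```

The deployment keyword (instead of job) is what ties the run to the environment, so the environment page can track deployment history.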

I strongly encourage you to start familiarizing yourself with the new syntax, so you will be able to take advantage of this new feature as soon as it is ready.

Gian Maria

Converting an existing pipeline to YAML, how to avoid double builds

YAML is now the preferred way to create an Azure DevOps build pipeline, and converting an existing build is really simple thanks to the “View YAML” button that can convert every existing pipeline into a YAML definition.


Figure 1: Converting an existing pipeline to YAML is easy with the View YAML button present in the editor page.

The usual process is: start a new feature branch to test the pipeline conversion to YAML, create the YAML file and a pipeline based on it, then start testing. Now a problem arises: until the YAML definition is merged into every branch of your Git repository, you should keep the old UI-based build and the new YAML build together.

What happens if a customer calls you because they have a bug in an old version? You create a support branch and then realize that the YAML build definition is not present in that branch. What if the current YAML script is not valid for that code? The obvious solution is to keep the old build around until you are 100% sure it is not needed anymore.

During conversion from legacy build to YAML it is wise to keep the old build around for a while.

This usually means that you gradually remove triggers for branches until you merge all the way to master (or the last branch), then you leave the definition around without triggers for a little while, and finally you delete it.
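On the YAML side, branch triggers live directly in the file, so during the transition you can restrict them in the definition itself (a sketch; the branch names here are illustrative):

```yaml
# Illustrative trigger section: build only the branches already converted to YAML,
# and keep the old UI-based build responsible for the support branches.
trigger:
  branches:
    include:
      - master
      - develop
    exclude:
      - support/*
```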

The real problem is that there is usually a transition phase where you want the old pipeline definition to run in parallel with the YAML one, but this triggers both builds at each push.


Figure 2: After a push, both builds, the old UI-based one and the new YAML-based one, were triggered.

From Figure 2 you can understand the problem: each time I push, two builds are spun up. Clearly you can start setting up branch triggers for each build to handle this situation, but it is usually tedious.

The ideal situation would be to trigger the right build based on whether the YAML definition file is present or not.

A viable solution is: abort the standard build if the corresponding YAML build file is present in the source. This will work perfectly until the YAML build file reaches the last active branch; after that moment you can disable the trigger on the original task-based build, or delete the build entirely because all the relevant branches now have the new YAML definition.

To accomplish this, you can simply add a PowerShell task to the original build, with a script that checks if the YAML file exists and, if the test is positive, aborts the current build. Luckily enough I've found a script ready to use: many thanks to the original author of the script. You can find the original script on GitHub and you can simply take the relevant part and put it inside a standard PowerShell task.


Figure 3: PowerShell inline task to simply abort the build.

The script expects a variable called Token containing a Personal Access Token with sufficient permission to cancel the build, as explained in the original project on GitHub.

Here is my version of the script:

if (Test-Path Assets/Builds/BaseCi.yaml) {
    Write-Host "Abort the build because corresponding YAML build file is present"
    $url = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$env:SYSTEM_TEAMPROJECTID/_apis/build/builds/$($env:BUILD_BUILDID)?api-version=2.0"
    # $(token) is the pipeline variable holding the Personal Access Token
    $pat = "$(token)"
    $pair = ":${pat}"
    $bytes = [System.Text.Encoding]::UTF8.GetBytes($pair)
    $base64 = [System.Convert]::ToBase64String($bytes)
    # Ask the Azure DevOps REST API to cancel the current build
    $body = @{ 'status' = 'Cancelling' } | ConvertTo-Json
    $pipeline = Invoke-RestMethod -Uri $url -Method Patch -Body $body -Headers @{
        'Authorization' = "Basic $base64"
        'Content-Type'  = "application/json"
    }
    Write-Host "Pipeline = $($pipeline)"
}
else {
    Write-Host "YAML Build is not present, we can continue"
}

This is everything you need. After this script is added to the original build definition, you can queue a build for a branch that has the YAML build definition on it and watch the execution being automatically canceled, as you can see in Figure 4:


Figure 4: Build cancelled because YAML definition file is present.

With this workaround we still have double builds triggered, but at least when the branch contains the YAML file, the original build definition will immediately cancel itself after checkout, because it knows that a corresponding YAML build was triggered. If the YAML file is not present, the build runs just fine.

This is especially useful because it avoids human error. Say a developer manually triggers the old build to create a release or to verify something: if they trigger the old build on a branch that has the new YAML definition, the build will be automatically aborted, so the developer can trigger the right definition.

Gian Maria.

Error publishing a .NET Core app in an Azure DevOps YAML build

Short story: I created a simple YAML build for a .NET Core project where one of the tasks publishes a simple .NET Core console application. After running the build I got a strange error in the output:

No web project was found in the repository. Web projects are identified by presence of either a web.config file or wwwroot folder in the directory.

This is extremely strange, because the project is not a web project; it is a standard console application written for .NET Core 2.2, so I did not understand why it was searching for a web.config file.

Then I decided to create a standard non-YAML build, and when I dropped the task on the build I immediately understood the problem. This happens because the .NET Core task with the publish command assumes by default that a web application is going to be published.


Figure 1: Default value for the dotnet publish command is to publish Web Project

Since I have no web project to publish, I immediately changed my YAML definition to explicitly set the publishWebProjects property to false.

  - task: DotNetCoreCLI@2
    displayName: .NET Core Publish
    inputs:
      command: publish
      projects: '$(serviceProject)'
      arguments: '--output $(Build.ArtifactStagingDirectory)'
      configuration: $(BuildConfiguration)
      workingDirectory: $(serviceProjectDir)
      publishWebProjects: False
      zipAfterPublish: true

And the build was fixed.

Gian Maria.

Azure DevOps is now 150 sprints old

I remember the old days when Azure DevOps was still in private preview, and yet it was already a really good product; now 150 sprints have passed, and the product is better than ever. Not everything is perfect but, as users, we can expect new features to be deployed every 3 weeks, the duration of a Microsoft sprint.

This means that the product is now 450 weeks old, and we finally got a nice little feature that shows news on the front page.


Figure 1: Widget with the new features of the latest deployed sprint

This allows users to be immediately notified of new features in their accounts, with a nice summary of the key ones. In this sprint we have the new Task Assistant to help edit YAML pipelines, and many other new features, like the new agent administration UI.


Figure 2: New administration page in action.

The new page is more consistent with the look and feel of the rest of the service; it also shows wait time and build duration when you drill down into a pool.

As always, I cannot stress enough how good it is to have all of your project administration tools in the cloud: no time spent upgrading, no time spent verifying and checking backup policies, and it is completely free for the first 5 users.

Gian Maria.