How to edit a YAML Azure DevOps Pipeline

I cannot stress enough how much better the experience of having builds defined in code is compared to having build definitions stored on the server, so I’m here to convince you to move to the new YAML build system in Azure DevOps :).

Having the build definition in code gives you many benefits; the first is that builds evolve together with code branches.

If you still think that editing a YAML file is a daunting experience because there are tons of possible tasks and configurations to use, take a peek at the Azure Pipelines extension for Visual Studio Code, which brings IntelliSense for pipeline editing into Visual Studio Code. I strongly encourage you to have a look at the YAML schema reference to get complete knowledge of the syntax, but for most people a quick approach with the tool is enough, leaving the deep dive for when they need to do complex stuff.

With the extension enabled, after you open a YAML build definition in Visual Studio Code, you can click the YAML button in the lower right part of the Visual Studio Code editor to change the language mode.

image

Figure 1: Language mode selection of Visual Studio Code

That area is the Language Mode Selection, and it is where you tell Visual Studio Code the language of the file you are editing. If you simply open a YAML file, VS Code recognizes the .yaml extension and helps with standard YAML syntax, but it does not know anything about Azure DevOps pipelines.

When you tell Visual Studio Code that the file is an Azure Pipelines file, IntelliSense kicks in and allows you to edit the file quickly.

Thanks to the Language Mode Selection, we can now specify that the file is an Azure Pipelines file and not a standard YAML file.

image

Figure 2: Selecting the right language type allows VS Code to give you tremendous help in editing the file.

This is everything you need to do; from now on, VS Code will help you in the context of Azure DevOps pipeline syntax. Even if the file is completely empty, the editor shows you the possible choices for the first-level nodes.

image

Figure 3: Suggestions on empty file

Since I usually start by specifying the pool, I can simply choose pool, then let VS Code guide me in filling in all its properties.

image

Figure 4: Intellisense in action editing the file
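For example, a definition that starts from the pool might look like the following minimal sketch (the hosted image name is just an illustration, any pool you have available works):

# Minimal skeleton: VS Code suggests 'pool' and then its child properties
pool:
  vmImage: 'ubuntu-latest'    # hosted agent image; use 'name: <your pool>' for a private pool

steps:
- script: echo "build steps go here"
  displayName: 'Placeholder step'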

In a real scenario you usually start from a template file you have already prepared with the standard build for your project (another advantage of having the build in code), but even in that scenario, having IntelliSense to refine the build helps you choose tasks.

image

Figure 5: Help in choosing tasks
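If, for instance, you keep your standard steps in a separate YAML file in the repository, the pipeline can simply include them; this is only a sketch and the template path is a hypothetical example:

steps:
# Include the standard steps you prepared for the project (hypothetical file name)
- template: build/standard-build-steps.yml
# ...then refine the pipeline with project specific tasks, helped by IntelliSense
- script: echo "extra project specific step"
  displayName: 'Extra step'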

I can assure you that, after some usage, it is far quicker and more powerful to edit a build with VS Code than to edit a standard build made with tasks in the web-based editor. Graphical editors are powerful and are a good entry point for those who do not know the tool, but IntelliSense-powered editors are more productive.

image

Figure 6: You not only get IntelliSense to choose the task, it also shows you information about the task

The only drawback I found is with custom tasks that are not recognized by IntelliSense, such as my GitVersion task, which was marked as wrong because VS Code does not know it.

image

Figure 7: Custom tasks were not automatically recognized by VS Code
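A custom marketplace task is referenced like any other task in YAML (the task name and version below are only illustrative, use whatever is installed in your organization), but the editor underlines it because the schema does not know about it:

steps:
# Custom marketplace task: perfectly valid at runtime, but flagged by the editor
- task: GitVersion@5          # illustrative name/version of the custom task
  displayName: 'Run GitVersion'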

IntelliSense completely removes the need for the old trick of creating a build with the old editor, placing tasks in the pipeline, and then letting the tool generate the YAML definition based on how you configured the tasks in the graphical editor. I assure you that it is faster to copy a reference build directly and then add the needed tasks with IntelliSense in VS Code than to use the UI editor.

If you are really a UI-oriented person, in the latest release of Azure DevOps (at the time of writing the feature is rolling out, so it is not yet available on all accounts) you can use the YAML Task Assistant.

image

Figure 8: YAML Task assistant in action

The assistant allows you to configure the task with the very same UI experience you have in a UI-based pipeline; once the task is configured you can simply add the corresponding YAML to the definition.

The Task Assistant gives you the same add-task experience as the old UI editor, so you can configure the task with the graphical editor, then add the corresponding YAML syntax to the definition.

I think that with the Task Assistant there are no more excuses not to move to YAML-based definitions.

Gian Maria

Troubleshoot YAML Build first run

Scenario: you create a branch in your Git repository to start with a shiny new YAML build definition for Azure DevOps, you create a YAML file, push the branch to Azure DevOps, and create a new build based on that YAML definition. Everything seems OK, but when you press the Run button you get an error:

Could not find a pool with name Default. The pool does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.

image

Figure 1: Error running your new shiny pipeline

OK, this is frustrating, and following the link gives you little clue about what really happened. The problem is that, with the new editor experience, when you navigate to the pipeline page all you see is the YAML build editor and nothing more.

image

Figure 2: New editor page of the YAML pipeline: the advanced editor and nothing more.

The new editor is fantastic, but it somewhat hides the standard configuration parameters page, where the default branch can be set. As you can see from Figure 2, you can specify the pool name (Default) and the triggers directly in the YAML build, so you would think that this is everything you need, but there is more. Clicking the three dots in the upper right corner, you can choose the Triggers menu item to open the old editor.
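For reference, the part of the YAML definition I am talking about is roughly the following (a sketch of my pipeline; the pool name Default refers to the private agent pool mentioned in the error, and the branch filter is illustrative):

# Trigger and pool specified directly in the YAML definition
trigger:
  branches:
    include:
    - feature/*               # the feature branch that actually contains this YAML file

pool:
  name: Default               # private agent pool; it must be authorized for this pipeline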

image

Figure 3: Clicking the Triggers menu item brings up the old UI

This is where the YAML pipeline experience still needs some love: you are surely puzzled about why you need to click the Triggers menu item if you already specified triggers directly in the YAML definition, but the reason is simple, it opens the old pipeline editor page.

The new page with the YAML editor is fantastic, but you should not forget that there are still some parameters, like the default branch, that are editable from the old interface.

The Triggers page is not really useful, it only gives you the ability to override the YAML configuration, but the important aspect is that we can now access the first tab of the pipeline configuration to change the default branch.

image

Figure 4: The Triggers page is not very useful, but now we can access the default configuration for the pipeline.

image

Figure 5: Default configuration tab where you can edit the default branch

In Figure 5 you can now understand what went wrong: the wizard created my pipeline using master as the default branch, but my build YAML file does not exist in master, it exists only in my feature branch. Just change the default branch to the branch that contains your build definition file, save, and queue again; now everything should work.

This trick also works when you get errors about not being authorized to use service endpoints, like a Sonar endpoint, a NuGet endpoint, etc.

Happy YAML Building experience.

Gian Maria.

Build and Deploy an ASP.NET App with Azure DevOps

I’ve blogged in the past about deploying ASP.NET applications, but lots of features have changed in Azure DevOps and it is time to refresh some basic concepts. Especially in the field of web.config transforms there is always a lot of confusion, and even if I’m an advocate of removing every configuration from files and source control, it is indeed something worth examining.

The best approach for configuration is removing it from source control, using configuration services, etc., and moving away from web.config.

But since most people still use web.config, let’s start with a standard ASP.NET application with a web.config and a couple of application settings that should be changed during deployment.

image

Figure 1: Simple configuration file with two settings

When it is time to configure your release pipeline, you MUST adhere to the mantra: build once, deploy many. This means that you should have one build that prepares the binaries to be installed, and the very same binaries will be deployed to several environments.

Since each environment will have different values for the app settings stored in web.config, I’ll start by creating a web.config transform for the Release configuration (the one that will be released), changing each setting to a specific token.

image

Figure 2: Transformation file that tokenizes the settings

In Figure 2 I show how I change the value of the Key1 setting to __Key1__ and Key2 to __Key2__. This is necessary because I’ll replace these values with the real ones during the release.

The basic trick is changing configuration values in files during the build, setting tokenized values that will be replaced during the release. Using a double underscore as prefix and suffix is enough for most situations.

Now it is time to create a build that generates the package to install. The pipeline is really simple: the solution is built with MSBuild with the standard configuration for publishing a web site. I’ve used the MSBuild task and not the Visual Studio Build task, because I do not want to need Visual Studio on my build agent; MSBuild is enough.

image

Figure 3: Build and publish web site with a standard MsBuild task.

If you run the build you will be disappointed, because the resulting web.config is not transformed; it keeps the very same content as the one in source control. This happens because the transformation is not performed during standard web site publishing, but by Visual Studio when you use the publish wizard. Luckily enough there is a task in preview that performs web.config transformation: you can simply place this task before the MSBuild task and the game is done.

image

Figure 4: File transform task is in preview but it does its work perfectly

As you can see in Figure 4, you should simply specify the directory of the application, then choose XML transformation, and finally the option to use the web.$(BuildConfiguration).config transformation file to transform web.config.
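In a YAML pipeline the same two steps would look roughly like this sketch (the project paths and MSBuild arguments are hypothetical, and the input names of the preview File Transform task may differ in your version, so double-check them against the task documentation):

# Apply the web.$(BuildConfiguration).config transform to web.config before building
- task: FileTransform@1
  displayName: 'Transform web.config'
  inputs:
    folderPath: '$(Build.SourcesDirectory)/src/MyWebApp'   # hypothetical project folder
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\*.$(BuildConfiguration).config -xml **\*.config'

# Build and publish the web site with plain MSBuild (no Visual Studio needed on the agent)
- task: MSBuild@1
  displayName: 'Build and publish web site'
  inputs:
    solution: 'src/MyWebApp.sln'                            # hypothetical solution path
    configuration: '$(BuildConfiguration)'
    msbuildArguments: '/p:OutDir=$(Build.BinariesDirectory)\'  # produces the _PublishedWebsites folder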

Now you only need to copy the result of the publish into the artifact staging directory, then upload it with the standard publish artifact task.

image

Figure 5: Copy the result of the publish into the staging directory and finally publish the artifact.

If you read other posts of my blog, you know that I usually use a PowerShell script that reorganizes files, compresses them, etc., but for this simple application it is perfectly fine to copy the _PublishedWebsites/ directory as the build artifact.
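The YAML equivalent of these two steps is straightforward (the source folder below assumes the MSBuild output layout with _PublishedWebsites and is just an illustration, adapt it to your project):

# Copy the published web site into the artifact staging directory
- task: CopyFiles@2
  displayName: 'Copy _PublishedWebsites to staging directory'
  inputs:
    SourceFolder: '$(Build.BinariesDirectory)/_PublishedWebsites'   # assumption: MSBuild output location
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

# Publish the staging directory as the build artifact
- task: PublishBuildArtifacts@1
  displayName: 'Publish web site artifact'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'WebSite'                                          # hypothetical artifact name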

image

Figure 6: Published artifacts after the build completes.

Take time to verify that the output of the build (Artifacts) is exactly what you expect before moving on to configuring the release.

Before going on to build the release phase, please download the web.config file and verify that the transformation was performed and web.config contains what you expect.

image

Figure 7: Both of my settings were substituted correctly.

Now it is time to create the release, but first of all I suggest you install this extension, which contains a nice task to perform substitutions during a release in an easy and intuitive way.

One of the great powers of Azure DevOps is extensibility: there are tons of custom tasks for lots of different jobs, so take time to look in the Marketplace if you are not able to find the task you need among the basic ones.

Let’s start creating a simple release that uses the previous build as the artifact and contains two simple stages, dev and production.

image

Figure 8: Simple release with two stages to deploy the web application.

Each of the two stages has a simple two-task job to deploy the application, based on the assumption that each environment has already been configured (IIS installed, site configured, etc.), so, to deploy our ASP.NET app, we can simply overwrite the old installation folder, replacing it with the new binaries.

The Replace Tokens task comes in handy in this situation: you simply need to add it as the first task of the job (before the task that copies files into the IIS directory), then configure the prefix and suffix with the two underscores to match the convention used to tokenize the configuration in web.config.

image

Figure 9: Configure the Replace Tokens prefix and suffix to perform the substitution.

In this example only web.config should be changed, but the task can perform substitution on multiple files.

image

Figure 10: Substitution configuration points to the web.config file.
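For reference, if you run the same task from a YAML pipeline, the configuration of Figures 9 and 10 corresponds roughly to the following sketch (the task name, version, and input names depend on the version of the Replace Tokens extension installed in your organization):

# Replace __Key1__ / __Key2__ style tokens with the values of release variables
- task: replacetokens@3                                   # Replace Tokens marketplace task
  displayName: 'Replace tokens in web.config'
  inputs:
    rootDirectory: '$(System.DefaultWorkingDirectory)'    # assumption: where the artifact is downloaded
    targetFiles: '**/web.config'
    tokenPrefix: '__'
    tokenSuffix: '__'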

The beautiful aspect of the Replace Tokens task is that it uses all the variables of the release to perform the substitution. For each variable it replaces the corresponding token wrapped in the prefix and suffix; this is the reason for the transformation file in my build: my web.config has the __Key1__ and __Key2__ tokens inside the configuration, so I can simply configure those two variables differently for the two environments and my release is finished.

If you use the Grid visualization it is immediately clear how each stage is configured.

image

Figure 11: Configure variables for each stage, the replace task will do the rest.

Everything is done; just trigger a release and verify that the web.config of the two stages is changed accordingly.

image

Figure 12: Sites deployed in two stages with different settings, everything worked as expected.

Everything worked well: I was able to build once with web.config tokenization, then release the same artifacts to different stages with different configurations managed by the release definition.

Happy AzDo

YAML Build in Azure DevOps

I’ve blogged in the past about YAML builds in Azure DevOps, but in those early days that kind of build was a little bit rough, and many people still preferred the old builds based on visual editing in a browser. One of the main complaints was that the build was not easy to edit and there were some glitches, especially when it came to accessing external services.

Months after the first version, the experience has really improved, and I strongly suggest you start migrating existing builds to this new system, to take advantage of having the build definition directly in the code, a practice that is more DevOps oriented and allows you to have different build tasks for different branches.

YAML builds are now first-class citizens in Azure DevOps, and it is time to plan switching old builds to the new engine.

You can simply start with an existing build: just edit it, select one of the phases (or the entire process), then press the View YAML button to grab the generated YAML definition.

image

Figure 1: Generate YAML definition from an existing build created with the standard editor

Now you can simply create a YAML file in any branch of your repository, paste the content into the file, commit it to the branch, and create a new build based on that file. I can select not only AzDO repositories; I can also build GitHub and GitHub Enterprise repositories.

image

Figure 2: I can choose GitHub as the source repository, not only Azure Repos

Then I can choose the project, searching among all the projects I have access to with the access token used to connect to GitHub.

image

Figure 3: Accessing GitHub repositories is simple; once you connect the account with an access token, AzDO can search the repositories

Just select a repository and choose the Existing Azure Pipelines option; if you are starting from scratch you can create a starter pipeline instead.

image

Figure 4: Choose the option to use an existing pipeline.
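If you go for the starter pipeline, the wizard generates something minimal that you can extend later, roughly equivalent to this sketch:

# Minimal starter pipeline: build on push to master using a hosted agent
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo "Hello, world! Replace this step with your real build."
  displayName: 'Placeholder build step'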

You are ready to go: just choose the branch and the YAML file and the game is done.

image

Figure 5: You can directly specify the build file in the pipeline creation wizard.

Converting an existing build pipeline to YAML is a matter of no more than 10 minutes of your time.

Now you can simply run and edit your build directly from the browser; the whole process took no more than 10 minutes, including connecting my AzDO account to GitHub.

image

Figure 6: Your build is ready to run inside Azure DevOps.

Your build is now ready to run without any problem. If you specified triggers as in Figure 6, you can just push to the repository to have the build automatically kick in and execute. You can also edit the build directly in the browser, and by pressing the Run button (Figure 6) you can trigger a run of the build without the need to push anything.
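The triggers themselves are just a few lines at the top of the YAML file; for example (branch names are illustrative):

# Continuous integration trigger: a push to any of these branches queues the build
trigger:
- master
- develop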

But the coolness of the current YAML build editor really starts to shine when you edit your build in the web editor, because you have IntelliSense, as you can see in Figure 7.

image

Figure 7: The YAML build web editor now has IntelliSense.

As you can see, the YAML build editor allows you to edit with full IntelliSense support: if you want to add a task, you can simply start writing task followed by a colon, and the editor will suggest all the available tasks. When it is time to edit properties, you have IntelliSense and help for each task parameter, as well as help for the entire task. This is really useful because it immediately spots deprecated tasks (Figure 9).

image

Figure 8: All parameters can be edited with full IntelliSense and help support

image

Figure 9: Help warns you about the deprecation of old tasks that should not be used.

With the new web editor and IntelliSense, maintaining a YAML build is now easy and no more difficult than using the standard graphical editor.

Outdated tasks are underlined in green, so you can immediately spot where the build definition is not optimal: as an example, if a task has a new version, the old version is underlined in green and IntelliSense tells me that the value is no longer supported. This area still needs some more love, but it works quite well.

There are no more excuses to stick with old build definitions based on the web editor: go and start converting everything to YAML definitions, your life as a build maintainer will be better :)

Gian Maria.

Sonar Analysis of Python with Azure DevOps pipeline

Once you have tests and code coverage for your Python build, the last step for a good build is adding support for code analysis with Sonar/SonarCloud. SonarCloud is the best option if your code is open source, because it is free and you do not need to install anything except the free add-in from the Azure DevOps Marketplace.

From the original build you only need to add two steps: Prepare analysis on SonarCloud and Run SonarCloud analysis, in the same way you do analysis for a .NET project.

image

Figure 1: Python build in Azure DevOps

You do not need to configure anything for a standard analysis with default options, just follow the configuration in Figure 2:

image

Figure 2: Configuration of Sonar Cloud analysis

The only trick I had to pull is deleting the /htmlcov folder created by pytest for the code coverage results. Once the coverage result has been uploaded to the Azure DevOps server I do not need it anymore, and I want to keep it out of the Sonar analysis. Remember that if you do not configure anything special for SonarCloud, it will analyze everything in the code folder, so you will end up with errors like these:

image

Figure 3: Failed SonarCloud analysis caused by the code coverage output.

You could clearly do a better job by simply configuring the SonarCloud analysis to skip those folders, but in this situation a simple Delete Files task does the job.

To avoid cluttering SonarCloud analysis with unneeded files, you need to delete any files that were generated in the directory and that you do not want to analyze, like code coverage reports.
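As an alternative to deleting the folder, you could exclude it directly in the Prepare analysis step through an extra Sonar property; this is only a sketch, and the exclusion pattern is relative to the analyzed base directory:

- task: SonarCloudPrepare@1
  displayName: 'Prepare analysis on SonarCloud'
  inputs:
    SonarCloud: SonarCloud
    organization: 'alkampfergit-github'
    scannerMode: CLI
    configMode: manual
    cliProjectKey: Pytest
    cliProjectName: Pytest
    extraProperties: |
     # Keep the pytest HTML coverage report out of the analysis
     sonar.exclusions=htmlcov/**
     sonar.python.coverage.reportPath=$(System.DefaultWorkingDirectory)/coverage.xml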

Another important setting is in the Advanced section, because you should specify the file containing the code coverage results as an extended Sonar property.

image

Figure 4: Extra property to specify location of coverage file in the build.

Now you can run the build and verify that the analysis was indeed sent to SonarCloud.

image

Figure 5: After the build I can analyze code smells directly in SonarCloud.

If, like me, you prefer YAML builds, here is the complete YAML build definition that you can adapt to your repository.

queue:
  name: Hosted Ubuntu 1604

trigger:
- master
- develop
- features/*
- hotfix/*
- release/*

steps:

- task: UsePythonVersion@0
  displayName: 'Use Python 3.x'

- bash: |
   pip install pytest 
   pip install pytest-cov 
   pip install pytest-xdist 
   pip install pytest-bdd 
  displayName: 'Install a bunch of pip packages.'

- task: SonarSource.sonarcloud.14d9cde6-c1da-4d55-aa01-2965cd301255.SonarCloudPrepare@1
  displayName: 'Prepare analysis on SonarCloud'
  inputs:
    SonarCloud: SonarCloud
    organization: 'alkampfergit-github'
    scannerMode: CLI
    configMode: manual
    cliProjectKey: Pytest
    cliProjectName: Pytest
    extraProperties: |
     # Additional properties that will be passed to the scanner, 
     # Put one key=value per line, example:
     # sonar.exclusions=**/*.bin
     sonar.python.coverage.reportPath=$(System.DefaultWorkingDirectory)/coverage.xml

- bash: 'pytest --junitxml=$(Build.StagingDirectory)/test.xml --cov --cov-report=xml --cov-report=html' 
  workingDirectory: '.'
  displayName: 'Run tests with code coverage'
  continueOnError: true

- task: PublishTestResults@2
  displayName: 'Publish test result /test.xml'
  inputs:
    testResultsFiles: '$(Build.StagingDirectory)/test.xml'
    testRunTitle: 010

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/coverage.xml'
    reportDirectory: '$(System.DefaultWorkingDirectory)/htmlcov'
    additionalCodeCoverageFiles: '$(System.DefaultWorkingDirectory)/**'

- task: DeleteFiles@1
  displayName: 'Delete files from $(System.DefaultWorkingDirectory)/htmlcov'
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)/htmlcov'
    Contents: '**'

- task: SonarSource.sonarcloud.ce096e50-6155-4de8-8800-4221aaeed4a1.SonarCloudAnalyze@1
  displayName: 'Run Sonarcloud Analysis'

The only setting you need to adapt is the name of the SonarCloud connection (in this example it is called SonarCloud), which you can add/change in Project Settings > Service Connections.

image

Figure 6: Service connection settings where you can add/change the connection with SonarCloud servers.

A possible final step is adding the Build Breaker extension to your account, which allows you to make your build fail whenever the SonarCloud Quality Gate fails.

Thanks to the Azure DevOps build system, creating a build that runs tests and analyzes your Python code is extremely simple.

Happy Azure DevOps.

Gian Maria