Sample report for Azure DevOps

Reporting has always been a pain point in Azure DevOps: people who used SQL Server Reporting Services with the on-premises version missed a similar ability to create custom reports in Azure DevOps.

Now there is a nice integration with Power BI, and a good article that explains how to connect Power BI to your instance and create some basic queries. The nice part is that you can use a query that connects directly to the OData feed, with no need to install anything.

image

Figure 1: Sample OData query that directly connects to your organization account to query for Work Item Data.
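
As a rough sketch of such a query (the organization name and the OData endpoint version are placeholders to adapt to your account), the M code in the Advanced Editor can be as simple as:

let
    // Placeholder organization name; use the OData version supported by your account
    Source = OData.Feed("https://analytics.dev.azure.com/MyOrganization/_odata/v3.0-preview/WorkItems")
in
    Source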

I strongly encourage you to experiment with Power BI, because it is a powerful tool that can create really good reports, closing the gap with the on-premises version and the old Reporting Services.

Gian Maria.

Azure DevOps gems, YAML Pipeline and Templates

If you read my blog you already know that I’m a great fan of YAML pipelines instead of the graphical editor in the Web UI. There are lots of reasons to use YAML, one above all the ability to branch the pipeline definition together with the code, but there is another really important feature: templates.

There is really detailed documentation on MSDN on how to use this feature, but I want to give you a complete walkthrough on how to start using templates effectively. Thanks to templates you can define a standard build (steps, or jobs and steps) in a template file, then reference that file from the real build, just adding parameters.

The ability to capture a sequence of steps in a common template file and reuse it over and over again in real pipelines is probably one of the top reasons for moving to YAML templates.

One of the most common scenarios for me is an account with lots of utility projects (multitargeted for full framework and .NET Standard), each one with its own git repository and the need for a standard CI definition to:

1) Build the solution
2) Run tests
3) Pack a NuGet package with semantic versioning
4) Publish the NuGet package to a private Azure DevOps package feed

If you work on a big project you usually have lots of these small projects: utilities for Castle, Serilog, security, general helpers, etc. In this scenario it is really annoying to define a pipeline for each project with the graphical editor, so it is pretty natural to move to YAML. You could start from a standard file, copy it into each repository and adapt it for the specific project, but when a task is updated you need to re-update all the projects to fix every reference. The main problem with this approach is that after some time the builds are no longer in sync and each project starts to behave differently.

Instead, I define my template once, in a dedicated repository, then reuse it in any project. When the template changes, I want to be able to manually update all pipelines to reference the new version or, even better, decide which projects will be updated automatically.

Let’s start with the real build file, the one included in the project repository, and see how to reference a template stored in another repository. The only limit is that the template repository must be in the same organization or on GitHub. Here is the full content of the file.

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

resources:
  repositories:
    - repository: templatesRepository
      type: git
      name: Jarvis/BuildScripts
      ref: refs/heads/hotfix/0.1.1

jobs:

- template: 'NetStandardTestAndNuget.yaml@templatesRepository'
  
  parameters:
    buildName: 'JarvisAuthCi'
    solution: 'src/Jarvis.Auth.sln'
    nugetProject: 'src/Jarvis.Auth.Client/Jarvis.Auth.Client.csproj'
    nugetProjectDir: 'src/Jarvis.Auth.Client'

The file is really simple: it starts with the triggers (as in a standard YAML build), then comes a resources section, which allows you to reference objects that live outside the pipeline. In this specific example I declare that this pipeline uses a resource called templatesRepository, an external git repository (in the same organization) called BuildScripts, contained in the Team Project called Jarvis; finally, the ref property lets me choose the branch or tag to use with the standard git refs syntax (refs/heads/master, refs/heads/develop, refs/tags/xxxx, etc.). In this specific example I am pinning the build script to the hotfix/0.1.1 branch: even if the BuildScripts repository is upgraded, this build will keep referencing that version, which implies that I need to manually update this build to reference a newer one. If I want this definition to automatically use new versions I can simply reference the master or develop branch.

The real advantage of having the template versioned in another repository is that it can use GitFlow, so every pipeline that uses the template can choose a specific version, the latest stable version, or even the latest development version.

Finally I start defining jobs, but instead of defining them inside this YAML file I declare that this pipeline will use a template called NetStandardTestAndNuget.yaml contained in the templatesRepository resource. Following the template reference I specify all the parameters needed by the template to run. In this specific example I have four parameters:

buildName: the name of the build; I use a custom task based on GitVersion that renames each build using this parameter followed by the semantic version.
solution: path of the solution file to build.
nugetProject: path of the csproj file that contains the package to publish.
nugetProjectDir: directory of the csproj to publish.

The last parameter could be derived from the third, but I want to keep the YAML simple, so I require the user of the template to explicitly pass the directory of the project, which will be used as the workingDirectory parameter for the dotnet pack command.

Now the real fun starts: let’s examine the template file contained in the other repository. A template file usually starts with a parameters section that declares the parameters it expects.

parameters:
  buildName: 'Specify name'
  solution: ''
  buildPlatform: 'ANY CPU'
  buildConfiguration: 'Release'
  nugetProject: ''
  nugetProjectDir: ''
  dotNetCoreVersion: '2.2.301'

As you can see the syntax is really simple: just specify the name of the parameter followed by its default value. In this example only the four parameters described in the previous section really need to be passed by the caller; the others have sensible defaults.

After the parameters section a template file can specify steps or even entire jobs; in this example I define two distinct jobs, one to build and run tests and the other to pack and publish the NuGet package.

jobs:

- job: 'Build_and_Test'
  pool:
    name: Default

  steps:
  - task: DotNetCoreInstaller@0
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

As you can see, this is simply a standard jobs section that starts with the Build_and_Test job, which runs on the Default pool. The job starts with a DotNetCoreInstaller step, where you can see that to reference a parameter you use the special syntax ${{parameters.parameterName}}. The beautiful aspect of templates is that they look exactly like a standard pipeline definition; just use the ${{}} syntax to reference parameters.

The Build_and_Test job continues with standard build and test tasks and determines (with GitVersion) the semantic version for the package. Since this value will be used in other jobs, I need to make it available with a specific PowerShell task.

  - powershell: echo "##vso[task.setvariable variable=NugetVersion;isOutput=true]$(NugetVersion)"
    name: 'SetNugetVersion'

This task simply re-emits the $(NugetVersion) variable as an output variable named NugetVersion; isOutput=true makes it available to other jobs in the pipeline. Now I can define the other job of the template, which packs and publishes the NuGet package.

- job: 'Pack_Nuget'
  dependsOn: 'Build_and_Test'

  pool:
    name: Default

  variables:
    NugetVersion: $[ dependencies.Build_and_Test.outputs['SetNugetVersion.NugetVersion'] ]

The only difference from the previous job is the declaration of the NugetVersion variable, with a special syntax that references the output of the previous job. Now I simply trigger the build from the original project and everything runs just fine.
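For completeness, here is a minimal sketch of how the packing steps of Pack_Nuget might consume the template parameters and the NugetVersion variable; this is not the author’s actual template, and the task choice and dotnet pack arguments are assumptions.

  steps:
  - task: DotNetCoreInstaller@0
    displayName: 'Use .NET Core sdk ${{parameters.dotNetCoreVersion}}'
    inputs:
      version: ${{parameters.dotNetCoreVersion}}

  # Pack using the semantic version computed by the Build_and_Test job
  - script: dotnet pack -c ${{parameters.buildConfiguration}} -p:PackageVersion=$(NugetVersion) --output $(Build.ArtifactStagingDirectory)
    workingDirectory: '${{parameters.nugetProjectDir}}'
    displayName: 'Pack NuGet package'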

image

Figure 1: Standard build for a library project, where the whole definition comes from a template file.

As you can see, thanks to templates, the real pipeline definition for my project is only 23 lines long, and I can simply copy and paste it to every utility repository, change four lines (the template parameters), and everything runs just fine.

Using templates lowers the barrier for continuous integration: every member of the team can start a new utility project and set up a standard pipeline, even without deep knowledge of Azure Pipelines.

Using templates brings a lot of advantages to the team, on top of the standard advantages of plain YAML pipelines.

First: you can create standards for all pipeline definitions. Instead of having a different pipeline structure for each project, templates allow you to define a set of standard pipelines and reuse them across multiple projects.

Second: you get automatic updates. Thanks to the ability to reference templates from another repository, you can simply update the template and have all the pipelines that reference it automatically use the new version (by referencing a branch). You keep the ability to pin a specific version if needed (by referencing a tag or a specific commit).

Third: you lower the barrier for creating pipelines for team members who do not have a good knowledge of Azure Pipelines; they can simply copy the build definition, change the parameters, and they are ready to go.

If you still have pipelines defined with the graphical editor, it is time to start upgrading to YAML syntax right now.

Happy Azure DevOps.

Converting a big project to .NET Standard without big bang

When you have a big project on the .NET full framework and want to convert it to .NET Standard / Core, multitargeting is usually a viable way to avoid a big bang conversion. You start with the very first assembly in the dependency chain, the one that does not depend on any other assembly in the project, and check that all referenced NuGet packages are compatible with .NET Standard. Once the first project is done you proceed with the remaining ones.

The very first step is converting all project files to the new SDK project format, leaving every project targeting the full framework; then you can apply multitargeting, starting with the aforementioned first assembly of the chain.

Multitargeting allows you to target both frameworks with a single Visual Studio project.

To enable it, just edit the project file, change TargetFramework to TargetFrameworks (mind the final s), and specify that you want the project compiled for both full framework and .NET Standard. Including .NET Standard in the list of target frameworks requires removing all code that depends on the full framework, but this is not always achievable in a single big bang conversion, because the amount of code can be really high.
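As a minimal sketch (the two monikers match the ones used later in this post), the relevant csproj change looks like this:

  <PropertyGroup>
    <!-- Note the trailing s: list every target framework, separated by semicolons -->
    <TargetFrameworks>net461;netstandard2.0</TargetFrameworks>
  </PropertyGroup>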

image

Figure 1: Multi Targeting in action

To ease the transition I usually add some compilation constants, so I can use #if directives to isolate pieces of code that should compile only when the project targets the full framework.

  <PropertyGroup Condition=" '$(TargetFramework)' == 'netstandard2.0'">
    <DefineConstants>NETCORE;NETSTANDARD;NETSTANDARD2_0</DefineConstants>
  </PropertyGroup>

  <PropertyGroup Condition=" '$(TargetFramework)' == 'net461'">
    <DefineConstants>NET45;NETFULL</DefineConstants>
  </PropertyGroup>

Thanks to conditional compilation we can have code that is compiled only for the full framework, or different implementations of a given class (one for full framework, one for .NET Standard).

After multitargeting is enabled and netstandard is one of the targets, the project usually stops compiling, because it usually contains some code that depends on the full framework. There are two distinct problems: A) NuGet packages that do not support netstandard, and B) references to full framework assemblies. To solve them you must use conditional referencing, setting each reference only for the specific framework version that needs it.

image

Figure 2: Conditional reference in action.

As you can see in Figure 2, I can reference different NuGet packages or assemblies depending on the target framework. The nice part is being able to reference a different version of a library (e.g. System.Buffers) for full framework and for .NET Standard.
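A pair of conditional ItemGroups of this shape can pick a different package version per framework (the version numbers below are placeholders, not the ones from the original screenshot):

  <ItemGroup Condition=" '$(TargetFramework)' == 'net461' ">
    <PackageReference Include="System.Buffers" Version="4.4.0" />
  </ItemGroup>

  <ItemGroup Condition=" '$(TargetFramework)' == 'netstandard2.0' ">
    <PackageReference Include="System.Buffers" Version="4.5.0" />
  </ItemGroup>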

At this point the project usually still fails to compile, because code references classes that are not available in netstandard; as an example, in Figure 2 you can verify that netstandard misses references like System.Runtime.Caching and System.Runtime.Remoting. For all code that uses references not available in netstandard, just use an #if NETFULL directive to compile those classes only for the full framework. This is not a complete solution, but at the end your project is still fully functional on the .NET full framework, and you have a nice list of #if NETFULL blocks that identify the only parts not yet available for netstandard. Now you can continue working as usual while slowly removing and upgrading all that code to netstandard.

Now repeat the process for every project in the solution; usually there are two scenarios:

1) The vast majority of the code is fully functional with netstandard; there are very few points in the code where you need #if NETFULL.
2) Some classes or functionality available only in the full framework are used extensively throughout your code, and the amount of code inside #if NETFULL is becoming too big.

Point 1 is the good case: you can continue working with the full framework and convert the remaining code at your own pace. If you find yourself in point 2, for example if some code uses MSMQ, you should isolate the full-framework-dependent code in a single class, then use Inversion of Control to inject the concrete class. Instead of having lots of points in the code that use MSMQ directly, abstract all that code behind an IQueue interface and create an MSMQQueue implementation: now you have a single piece of code that is not available for netstandard. You can then write an implementation that uses RabbitMQ and the problem is gone with a simple class rewrite.

Let’s do an example: I have an InMemoryCacheHelper class that abstracts the usage of MemoryCache, and since the MemoryCache class from System.Runtime.Caching is not available in netstandard, I simply protect the class with an #if NETFULL conditional compilation directive.

image

Figure 3: Conditional compilation; this is the version of the class for .NET Full.
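Since the screenshot cannot be read here, the following is a minimal sketch of what a NETFULL-only version of such a helper could look like; the member names and the get-or-add logic are assumptions for illustration, only the class name and the involved types come from the post.

#if NETFULL
using System;
using System.Runtime.Caching;

public static class InMemoryCacheHelper
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static T GetOrAdd<T>(string key, Func<T> factory, TimeSpan slidingExpiration)
    {
        // MemoryCache.Get returns null when the key is not in the cache.
        if (Cache.Get(key) is T cached)
        {
            return cached;
        }

        T value = factory();
        Cache.Set(key, value, new CacheItemPolicy { SlidingExpiration = slidingExpiration });
        return value;
    }
}
#endif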

Looking in the documentation, there is a NuGet package called Microsoft.Extensions.Caching.Memory meant as a replacement for the full framework MemoryCache class. I can use this NuGet package to create an implementation of InMemoryCacheHelper compatible with netstandard.

When you enable multitargeting it is quite common to manually edit references in the project file, because references differ between the full framework build and the .NET Standard build.

Remember that you need to reference the full framework assembly (1) for net461 compilation, but the NuGet package for the netstandard version. This can be done by manually editing the references in the csproj file (Figure 4).
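A sketch of what that conditional referencing might look like for the caching example (the package version is a placeholder):

  <ItemGroup Condition=" '$(TargetFramework)' == 'net461' ">
    <!-- Full framework: use the framework assembly that contains MemoryCache -->
    <Reference Include="System.Runtime.Caching" />
  </ItemGroup>

  <ItemGroup Condition=" '$(TargetFramework)' == 'netstandard2.0' ">
    <!-- netstandard: use the replacement NuGet package instead -->
    <PackageReference Include="Microsoft.Extensions.Caching.Memory" Version="2.2.0" />
  </ItemGroup>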

image

Figure 4: Reference the correct assembly or NuGet package based on framework version.

Now you go back to the InMemoryCacheHelper class, add an #else branch to the #if NETFULL directive, and start writing a version of the class that uses Microsoft.Extensions.Caching.Memory. One class after another, all of your code becomes able to target both .NET full and .NET Core. You can rewrite the entire class, or keep a single class and use #if NETFULL inside each method; I prefer the first approach, but this is what happens when I start editing the netstandard version of the class.

image

Figure 5: No highlight and no intellisense because we are in a conditional compilation branch that evaluates to false.

Ouch: since we are in a conditional compilation branch that evaluates to false, VS offers no highlighting and no IntelliSense, and everything is greyed out. At this point you should be aware that, when you use multitargeting, the order of the frameworks in the project file matters. The first framework in the <TargetFrameworks> node is the one VS uses to evaluate conditional compilation. This means that, when you are working on classes or pieces of code that should target both full framework and .NET Standard, you need to change the order to fit your needs.

In this example I needed to change <TargetFrameworks>net461;netstandard2.0;</TargetFrameworks> to <TargetFrameworks>netstandard2.0;net461</TargetFrameworks>, save the project file, and unload and reload the project (sometimes you need to force a project reload); Visual Studio will then use netstandard2.0 during code editing.

image

Figure 6: Reversing the order of the frameworks enables IntelliSense for the netstandard branch of the conditional compilation directive.

Now you can edit the netstandard version of your class and convert, one by one, all the parts of the code that do not natively run on netstandard.
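As a rough idea of where this leads, here is a sketch of the #else branch based on Microsoft.Extensions.Caching.Memory; the member names mirror the hypothetical helper sketched above and are not the author’s actual code.

#if NETFULL
// ... full framework implementation based on System.Runtime.Caching (see above) ...
#else
using System;
using Microsoft.Extensions.Caching.Memory;

public static class InMemoryCacheHelper
{
    private static readonly MemoryCache Cache = new MemoryCache(new MemoryCacheOptions());

    public static T GetOrAdd<T>(string key, Func<T> factory, TimeSpan slidingExpiration)
    {
        // GetOrCreate invokes the factory and caches its result on a cache miss.
        return Cache.GetOrCreate(key, entry =>
        {
            entry.SlidingExpiration = slidingExpiration;
            return factory();
        });
    }
}
#endif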

Happy coding.

How to configure Visual Studio as Diff and Merge tool for Git

After almost six years, the post on How to configure diff and merge tool in Visual Studio Git Tools is still read by people who find it useful, but it is now really old and needs to be updated.

That post was written when Visual Studio 2012 was the latest version and the Git integration was still really young, implemented as an external Microsoft plugin with really basic support. If you use Visual Studio 2017 or greater, you can simply go to Team Explorer and open the settings of the repository.

image

Figure 1: Git repository settings inside Visual Studio Team Explorer.

The Settings pane contains a specific section for Git, where you can configure settings for the current repository or global settings, valid for all repositories of the current user.

image

Figure 2: Git settings inside Visual Studio

If you open Repository Settings you usually find that no specific diff and merge tool is set. Diff and merge configuration is typically made at the user level, not for each single repository.

image

Figure 3: Diff and Merge tool configuration inside Visual Studio.

As you can see in Figure 3, no diff or merge tool is set for the current repository; this means that it will use the default for the user (in my situation, none). If you only use Visual Studio this setting is not so useful: if you have a conflict during a merge or rebase, Visual Studio will automatically show the conflicts and guide you through merging.

If you are inside Visual Studio it will handle diff and merge automatically, even if it is not configured as the diff or merge tool. The rationale behind this choice is: if you are inside a tool (like VS) that has full support for diff and merge, the tool will present its diff and merge capabilities without checking the repository configuration.

This happens because when you open a Git repository, Visual Studio monitors the status of the repository and, if some operation has unresolved conflicts, it shows the situation to the user without the need to do anything. The setting in Figure 3 is useful only if you are operating with some other tool or with the command line; if you get a conflict during an operation started from any other tool (GUI or command line), the procedure is:
1) Open VS
2) From the VS Team Explorer, locate the local git repository and open it
3) Go to the Team Explorer Changes pane to start resolving conflicts

If instead you configure VS as diff and merge tool, you can simply issue a git mergetool command and everything opens automatically without any extra steps. But to be honest, the latest VS git integration is really good and it is usually better to manually open the local repository. As an example, if you are doing a rebase from the command line and you get conflicts, it is better to manually open VS, solve the conflicts, and continue the rebase operation inside VS; if you get further conflicts, you do not need to wait for VS to reopen via the git mergetool command.

But if you really want to configure VS as diff and merge tool, pressing the “Use Visual Studio” button (Figure 3) modifies your local gitconfig. The net result is similar to what I suggested in my old post: VS just adds the six diff and merge sections to the config file.

image

Figure 4: Git diff and merge sections as saved by Visual Studio 2019 preview.

If Visual Studio is your tool of choice, I simply suggest configuring it globally (the file is %userprofile%\.gitconfig) so you can invoke the merge tool from anywhere and have Visual Studio handle everything.
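For reference, the diff and merge sections look roughly like the following abbreviated sketch; the exact vsDiffMerge.exe path, quoting, and switches depend on your Visual Studio version and installation, so treat every value below as a placeholder rather than what VS actually writes on your machine:

[diff]
    tool = vsdiffmerge
[difftool "vsdiffmerge"]
    # Placeholder: point cmd to vsDiffMerge.exe inside your Visual Studio installation
    cmd = "vsDiffMerge.exe \"$LOCAL\" \"$REMOTE\" //t"
    keepbackup = false
[merge]
    tool = vsdiffmerge
[mergetool "vsdiffmerge"]
    cmd = "vsDiffMerge.exe \"$REMOTE\" \"$LOCAL\" \"$BASE\" \"$MERGED\" //m"
    keepbackup = false
    trustexitcode = true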

Gian Maria.

Retrieve Attachment in Azure DevOps with REST API

In a previous post I dealt with how to retrieve images in Work Item descriptions or comments with a simple WebClient request, using network credentials taken from the TfsTeamProjectCollection class.

The solution presented in that article is not complete, because it does not work against Azure DevOps Services, only against an on-premises TFS or Azure DevOps Server. If you connect to Azure DevOps you will find that the Credentials of the TfsTeamProjectCollection class are null, so you cannot download the attachment because the web request is not authenticated.

To be completely honest, the TfsTeamProjectCollection class is quite obsolete and uses the old web services, but it is really useful if you have lots of code accumulated over the years that uses it, and it still works perfectly with the newest version of the service.

Azure DevOps has a different authentication scheme from the on-premises version, so you have no network credentials to make a simple web request for the attachment if you use the old TfsTeamProjectCollection class.

The key to solving the problem is using the new HTTP REST API, and the good news is that you can use the old SOAP-based C# API and the new REST API in the same project without a problem. Here is the correct code for the connection.

image

Figure 1: Connecting to the server.

The code in Figure 1 works both for an on-premises server and for Azure DevOps Services, and if you run it from a program with a UI (like a WPF application) it will show the Azure DevOps login page to perform the login with Azure authentication. This is the coolest part of the API: a few lines of code and you can connect to the service without worrying about the authentication method.
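Since Figure 1 is only a screenshot, here is an approximate sketch of that kind of connection code; the account URL is a placeholder and the credential options may differ slightly from the author’s actual code:

using System;
using Microsoft.VisualStudio.Services.Client;
using Microsoft.VisualStudio.Services.WebApi;

// Works for Azure DevOps Services as well as for an on-premises Azure DevOps Server / TFS.
var serverUri = new Uri("https://dev.azure.com/accountName");

// VssClientCredentials can prompt interactively (e.g. the Azure DevOps login page in a WPF app).
var credentials = new VssClientCredentials();

var _vssConnection = new VssConnection(serverUri, credentials);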

Once the application is connected, the VssConnection class can be used to grab references to a series of clients dedicated to accessing various sections of the service. After connecting I immediately retrieve a reference to the WorkItemTrackingHttpClient class. Remember that all services/clients that contain the Http string in the name are based on the new REST API.

_workItemTrackingHttpClient = _vssConnection.GetClient<WorkItemTrackingHttpClient>();

Thanks to this client we can perform various queries against the work item store, and we can use the same object to download any attachment. The only thing I need to do is use a regex to parse the attachment URI and grab two pieces of information: fileName and fileId (a GUID).

The great benefit of using a client, instead of a raw WebClient call, is that you do not need to worry about authentication: everything is handled by the library.

A typical attachment URL has this format: https://dev.azure.com/accountName/3a600197-fa66-4389-aebd-620186063db0/_apis/wit/attachments/a92c440e-374e-4349-a26c-b9ba553e1264?fileName=image.png. We need to extract the file name (image.png) and the file id (a92c440e-374e-4349-a26c-b9ba553e1264), i.e. the portion of the URL after the attachments part. One possible regex is _apis/wit/attachments/(?<fileId>.*)\?fileName=(?<fileName>.*), which extracts fileId and fileName from the attachment URL. Once you have the fileId and fileName parameters there is a dedicated call to download the attachment.
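As a small illustration of that parsing step (the url value is the sample from above):

using System.Text.RegularExpressions;

var url = "https://dev.azure.com/accountName/3a600197-fa66-4389-aebd-620186063db0/_apis/wit/attachments/a92c440e-374e-4349-a26c-b9ba553e1264?fileName=image.png";

// Extract the attachment id and the file name from the attachment url.
var match = Regex.Match(url, @"_apis/wit/attachments/(?<fileId>.*)\?fileName=(?<fileName>.*)");
string fileId = match.Groups["fileId"].Value;     // a92c440e-374e-4349-a26c-b9ba553e1264
string fileName = match.Groups["fileName"].Value; // image.png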

using (var fs = new FileStream(downloadedAttachment, FileMode.Create))
{
    var downloadStream = ConnectionManager.Instance
        .WorkItemTrackingHttpClient
        .GetAttachmentContentAsync(new Guid(fileId), fileName, download: true).Result;
    using (downloadStream)
    {
        downloadStream.CopyTo(fs);
    }
}

The variable downloadedAttachment is a temporary file name where I want to save the attachment. I simply open a writer stream with FileMode.Create, then call the GetAttachmentContentAsync method of the WorkItemTrackingHttpClient, which returns a Task<Stream>, and finally copy the attachment stream to the destination stream (the temporary file) to physically download the file.

When you interact with Azure DevOps via API, you should always try to use the official client libraries instead of the raw WebClient class; this will save you time and headaches.

Gian Maria.