Azure DevOps API, Retrieve Work Items Information

Post in the series:
1) API Connection

Now that we know how to connect to Azure DevOps services, it is time to understand how to retrieve information about Work Items to accomplish the requested task: exporting Work Item data into a Word document.

Once you are connected to an Azure DevOps account, you can start retrieving the helper classes used to work with the different services; if you need to interact with Work Items, you need a reference to the WorkItemStore class. Since this is the service I interact with most often, I simply expose a reference to it from my connection class.

Figure 1: Retrieving reference to the WorkItemStore helper class
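
In code the retrieval is a single GetService call on the underlying TfsTeamProjectCollection; this is the relevant fragment of my Connection class (shown in full in the API Connection post of this series):

// Inside the Connection class: grab the WorkItemStore once, right after
// authentication, and expose it through a property.
_workItemStore = _tfsCollection.GetService<WorkItemStore>();

public WorkItemStore WorkItemStore => _workItemStore;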

At this point people usually go looking for a way to execute the stored queries already present in the service, but this is quite often the wrong way to go if you are working on a tool that needs to export information from Work Items; the right way to go is to build your own query in code.

Creating a query in WIQL is the best way to programmatically query the WorkItemStore and find what you need.

Thanks to Microsoft, you have a full SQL-like query engine that supports a custom syntax called WIQL (Work Item Query Language) that you can use to find what you want. In my example I’m interested in exporting data belonging to a specific area path and iteration path (usually the data of a sprint), and here is the code to retrieve what I need.

public List<WorkItem> LoadAllWorkItemForAreaAndIteration(string areaPath, string iterationPath)
{
    // Build the WIQL query text; UNDER matches the given path and everything below it.
    StringBuilder query = new StringBuilder();
    query.AppendLine($"SELECT * FROM WorkItems WHERE [System.AreaPath] UNDER '{areaPath}' AND [System.IterationPath] UNDER '{iterationPath}'");
    if (!String.IsNullOrEmpty(_teamProjectName))
    {
        query.AppendLine($"AND [System.TeamProject] = '{_teamProjectName}'");
    }

    // Execute the query and cast the non-generic result collection to a typed list.
    return _connection.WorkItemStore.Query(query.ToString())
        .OfType<WorkItem>()
        .ToList();
}

As you can see, this really is a SQL-like query, but it is expressed with some Work Item specific concepts, such as the UNDER operator, which allows me to query for Work Items whose area path is under a certain path. Specifying the team project is completely optional and is present only for completeness, because if you specify both area path and iteration path you are actually already filtering for Work Items of a specific team project. Once the query text is ready, just call the Query method of the WorkItemStore and do some LINQ manipulation to cast everything to the WorkItem class (a helper class included in the NuGet packages).

Querying Work Items is just a matter of creating the query text and calling a method of the WorkItemStore.
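
WIQL also supports other standard query constructs such as IN and ORDER BY, so the query above can be refined further; a purely illustrative sketch (these extra clauses are my own example, not part of the exporter):

// Illustrative refinement: keep only some Work Item types and order results by id.
query.AppendLine("AND [System.WorkItemType] IN ('Product Backlog Item', 'Bug')");
query.AppendLine("ORDER BY [System.Id]");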

Another great advantage of using the NuGet packages is having IntelliSense to help you work with the results; here is a sample of the code used to dump all Work Items returned by the query.

// WorkItemManger wraps the query logic shown above.
WorkItemManger workItemManger = new WorkItemManger(connection);
workItemManger.SetTeamProject("zoalord insurance");
var workItems = workItemManger.LoadAllWorkItemForAreaAndIteration(
    "zoalord insurance",
    "zoalord insurance\\Release 1\\Sprint 6");

// Dump id, type and title of every Work Item returned by the query.
foreach (var workItem in workItems)
{
    Log.Debug("[{Id}/{Type}]: {Title}", workItem.Id, workItem.Type.Name, workItem.Title);
}

Nothing could be simpler: thanks to IntelliSense I can dump id, type and title of the Work Items in a few lines of code. Here is a sample output of the code.

Figure 2: Dump output of Work Items returned from the query.
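
Beyond the strongly typed properties used above (Id, Type, Title), every other value can be read through the Fields collection of the WorkItem class, which will come in handy for the Word export; a minimal sketch, using a standard system field name:

// Read arbitrary field values through the Fields collection: the string
// indexer returns a Field object whose Value property holds the raw content.
foreach (var workItem in workItems)
{
    var assignedTo = workItem.Fields["System.AssignedTo"].Value;
    Log.Debug("{Id} is assigned to {AssignedTo}", workItem.Id, assignedTo);
}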

As you can see, with very few lines of code I was able to connect to my Azure DevOps account, query for the Work Items belonging to a specific iteration, and then dump their information to the console.

The next step will be creating a skeleton Word document filled with this data instead of dumping it to the console.

Gian Maria.

Azure DevOps API, Connection

One of the great benefits of using Azure DevOps is the ability to interact with the service through API calls, making it possible to extend the service with a bit of C#, PowerShell or whatever language you want, because almost everything is exposed through REST APIs, and a simple HTTP call is enough.

Since I’m mostly a C# and .NET guy, I’ll explain how to build a C# program that interacts with an Azure DevOps account. Thanks to the NuGet packages offered by Microsoft, you can interact with your account through strongly typed C# classes, so you have IntelliSense and compile-time checking to verify that everything is good.

C# makes it easier to use the Azure DevOps API because the helper client libraries guide the programmer with IntelliSense and documentation.

The first example I’m going to show is how to retrieve a bunch of Work Items to export to a Word document, a requirement felt by almost everyone who uses the service. While there are commercial and non-commercial tools out there, if you really need to extract Word documents with maximum customization, a bit of C# code can make your life easier.

Sample code accompanying this series of blog posts can be found at this address: https://github.com/alkampfergit/AzureDevopsWordPlayground where I’m pushing some code to export (migrate) data from Azure DevOps to Word.

First of all we need to know how to set up the project and how to connect to an account, and it turns out to be really simple: create a full .NET Framework project and add these NuGet references:

Microsoft.TeamFoundationServer.Client
Microsoft.TeamFoundationServer.ExtendedClient
Microsoft.VisualStudio.Services.InteractiveClient
Microsoft.VisualStudio.Services.Client

With these packages you get a set of libraries that make your life easier, especially for connecting to and interacting with the base functions of the service.

Authentication can be done in several ways, but a token is the real way to go because it has several advantages over standard credentials.

When it is time to authenticate to the service you have several options, but the most useful is an Access Token, because it has several benefits: first of all, it does not require interactivity; second, it can be revoked whenever you want; third, you can generate a token with a reduced permission set; fourth, it expires automatically after at most one year.

Using a token is perfect for a tool that should run unattended, and it is usually my first choice. To make debugging simpler, I allow the access token to be specified on the command line in a couple of ways: I can pass the token directly, or I can pass the path of a file that contains the token. The second method is useful for debugging: I write my token in a file, encrypt it with standard NTFS encryption so that only my user can decrypt and use it, then configure the debugger to launch my console application with that token file. Everything stays secure, and I do not run the risk of accidentally committing my token to some repository.

Figure 1: Project options, where I specify startup arguments to pass the token and other parameters to my application

As you can see in Figure 1, I simply specify the c:\crypted\patOri.txt file on the command line to make my software authenticate to the service; the startup program is usually a normal console application.
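
In code, the two command line variants collapse to a single token string; here is a minimal sketch (the method and parameter names are mine, not part of the real tool):

// Resolve the access token either directly from the command line or from
// a file that contains it; the file can be NTFS-encrypted so that only my
// user is able to read it on the development machine.
private static string ResolveAccessToken(string token, string tokenFilePath)
{
    if (!String.IsNullOrEmpty(tokenFilePath))
    {
        return File.ReadAllText(tokenFilePath).Trim();
    }
    return token;
}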

Having a console application is really useful because it can be automated with a simple batch or PowerShell file, it can easily be converted to a service with TopShelf, and it is my usual choice over a full WPF, WinForms or web application that requires user interactivity.

Once I have an access token, here is the Connection helper class I use to connect to the service.

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;
using Microsoft.VisualStudio.Services.Client;
using Microsoft.VisualStudio.Services.Common;

public class Connection
{
    /// <summary>
    /// Perform a connection with an access token, the simplest way to give a program
    /// permission to access your account.
    /// </summary>
    public Connection(String accountUri, String accessToken)
    {
        ConnectToTfs(accountUri, accessToken);
        _workItemStore = _tfsCollection.GetService<WorkItemStore>();
    }

    private TfsTeamProjectCollection _tfsCollection;
    private WorkItemStore _workItemStore;

    public WorkItemStore WorkItemStore => _workItemStore;

    private bool ConnectToTfs(String accountUri, String accessToken)
    {
        // Login for VSTS: a basic credential with an empty user name and the token as password.
        VssCredentials creds = new VssBasicCredential(
            String.Empty,
            accessToken);
        creds.Storage = new VssClientCredentialStorage();

        // Connect to the collection and authenticate immediately.
        _tfsCollection = new TfsTeamProjectCollection(new Uri(accountUri), creds);
        _tfsCollection.Authenticate();
        return true;
    }
}

The class does nothing more than create a VssBasicCredential with an empty user name and the token as password, use a standard VssClientCredentialStorage, and pass everything to the TfsTeamProjectCollection class; finally it calls the Authenticate method and, if no exception is thrown, you are connected to the service.
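
Using the class is a one-liner; a quick sketch (the account URL is a placeholder for your own organization):

// Connect with the access token resolved earlier; from here on the
// WorkItemStore property is authenticated and ready to be queried.
var connection = new Connection(
    "https://dev.azure.com/myorganization",
    accessToken);
WorkItemStore store = connection.WorkItemStore;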

As you can see, connecting and authenticating to an Azure DevOps account is just a matter of a few NuGet packages and a few lines of C# code.

Gian Maria.

TFS 2019, Change Work Item Type and Move Between Team Projects

When the first version of Team Foundation Server on Azure was presented, it had fewer features than the on-premises version, but Azure DevOps has now reversed the situation: new features are introduced first in Azure DevOps, then in Azure DevOps Server (the on-premises version).

A couple of features were sorely missing from the on-premises version: the ability to change Work Item type and the ability to move Work Items between projects. These two features had been available for a long time in the online version, but they were not present on premises until Azure DevOps Server 2019, currently in RC1.

Figure 1: Change Type and Move to Team Project in Azure DevOps.

But if you installed Azure DevOps Server (TFS 2019) you could be disappointed, because those two functions seem to be still missing from the product.

The fact is that these two functions are actually present in the product, but they are not available if Reporting Services is enabled. The reason is that changing a Work Item type or moving it between projects would mess up the data in the Warehouse database, so, if you want these two features, you need to disable the reporting features. Everything is described in the product notes, but I noticed that most people missed this information.

To get Change Type and Move Work Item between Team Projects you need to disable the Reporting Services feature of the product.

Reporting Services is one of those features that is often installed but never used, so, if you are not using it, I suggest disabling it from the administration console, because being able to change Work Item type or to move Work Items between projects is a far more useful capability.

Figure 2: How to disable reporting in Administration console.

To disable Reporting Services just open the administration console, select the Reporting node (1), stop the job (2) and finally disable the reporting features (3). You will be prompted to enter the name of the server to confirm that you really want to disable reporting, and then you are done.

Figure 3: Warehouse and Reporting were disabled from the instance.

If you want to create custom reporting, I suggest you start having a look at Power BI, which recently added a connector for Azure DevOps Server instances as well.

Once reporting is disabled, just refresh the web UI and the Move To Team Project and Change Type options should be available in all Team Projects of every collection.

If you are not sure whether anyone is actually using the reporting features, ask the members of the team about their usage of base or custom reports, or whether there is some in-house or third-party tool that reads data from the Warehouse database.

If the reporting services are actually used, Microsoft encourages you to try the Analytics marketplace extension (https://marketplace.visualstudio.com/items?itemName=ms.vss-analytics), or you can have a look at Power BI.

Gian Maria

Deploy click-once application on Azure Blob with Azure DevOps

A long time ago I blogged about how to publish a click-once application from a VSTS build to Azure Blob storage; a lot of time has passed, and lots of stuff has changed. The whole process is now simpler, thanks to many dedicated tasks that avoid any manual work.

My new build always starts with a GitVersion custom task, which populates some environment variables with the version numbers generated by GitVersion; this allows me to simply add an MsBuild task to the build to publish click-once with automatic GitVersion versioning.

Figure 1: MsBuild task to publish click once application

You need to build the csproj that contains the project, using $(BuildConfiguration) to build in the right configuration and adding custom arguments (3) as in the following list:

/target:publish 
/p:ApplicationVersion=$(AssemblyVersion)  
/p:PublishURL=https://aaaaaaa.blob.core.windows.net/atest/logviewer/  
/p:UpdateEnabled=true  
/p:UpdateMode=Foreground 
/p:ProductName=Jarvis.LogViewer 

ApplicationVersion is set to the $(AssemblyVersion) variable that was populated by the GitVersion task; the publish URL is simply the address of an Azure blob storage account that contains a blob container called atest, with a folder named logviewer that has public read access.

Figure 2: Public read access for a simple blob in my azure subscription

This allows me to have a simple public blob, with a public address that everyone can read, so everyone can download my application.

An Azure blob can be given public read access for everyone, making your click-once application available to everybody.

The MsBuild task simply creates an app.publish subfolder containing all the files that need to be copied into the Azure blob storage; the first step is a Copy Files task to copy them into the artifacts staging directory, as in the following picture.

Figure 3: All files of click-once publishing were copied in the artifacts staging directory.

Now everything is ready: just add a final task of type Azure File Copy and configure it to copy everything from the artifacts directory to the right subfolder of the Azure blob.

Figure 4: Configuration of Azure Blob File to deploy click-once generated files.

Configuration is really simple: thanks to a direct connection to an Azure subscription (3) we can link Azure DevOps to the subscription and simply select the storage account (4), the blob container name (5) and the subfolder of the blob (6).

One of the advantages of the dedicated Azure File Copy task is that it can simply use one of the Azure accounts linked to the Azure DevOps account, making it super simple to choose the right blob where the click-once package should be uploaded.

Once you’ve finished the configuration, you can simply run a build and your application will be automatically published.

Figure 5: Summary of the build with click once publishing.

Now you can simply point users to the setup.exe file at the address of the blob, and the game is done.

Gian Maria.

Run code coverage for Python project with Azure DevOps

Creating a simple build that runs Python tests written with the PyTest framework is really easy, but the next step is getting code coverage. Even if I’m pretty new to Python, having code coverage in a build is really simple, thanks to a specific task that comes out of the box with Azure DevOps: Publish Code Coverage.

In Azure DevOps you can create a build with the web editor or with a simple YAML file. I usually prefer YAML, but since I already demonstrated a YAML build for Python in an older post, this time I’m going to demonstrate a classic build created with the standard web editor; here is the core part of the build.

Figure 1: Core build to run tests and have code coverage uploaded to Azure DevOps

As you can see, I decided to run the tests with a Bash script running on Linux; here is the task configuration, where I’ve added the pytest options needed to gather code coverage during the test run.

Figure 2: Configuration of the Bash script used to run the pytest tests

The task is configured to run an inline script (1); the command line (2) contains --cov options to specify the Python modules I want to monitor for code coverage, then a couple of --cov-report options to produce output in XML and HTML format. I’ve then specified the subfolder that contains the module I want to test (3) and finally configured the task with Continue on Error (4), so that if some tests fail the build will be marked as partially failed instead of failed.
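
The same invocation can be reproduced locally; here is a sketch of the command line, with hypothetical module and folder names standing in for the real ones in the screenshot:

pytest --cov mymodule.core --cov mymodule.api --cov-report xml --cov-report html tests/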

Thanks to pytest, running code coverage is just a matter of adding some options to the command line.

After the build finishes, you can see in the output how pytest generates the code coverage report: it creates a file called coverage.xml and an entire directory called htmlcov that contains an HTML report of the coverage.

Figure 3: Result of running tests with code coverage.

If you look at Figure 1 you can see that the final task of the build is a Publish Code Coverage task, whose duty is to grab the output of the pytest run and upload it to the server. Configuration is really simple: you need to choose Cobertura as the code coverage tool (the format used by pytest) and point the task to the output of the test run. Looking at the output in Figure 3 you can double-check that the summary file is called coverage.xml and that the whole report directory is the htmlcov subdirectory.

Figure 4: Configuration for Publish Code Coverage task.

Once you run the build, you can find the code coverage result on the summary page, as well as the code coverage report published as a build artifact; the whole configuration will take you no more than 10 minutes.

Figure 5: Artifacts containing code coverage reports as well as code coverage percentage are accessible from Build Summary page.

Finally, you also have a dedicated tab for code coverage, showing the HTML summary of the report.

Figure 6: Code coverage HTML report uploaded in a dedicated Code Coverage tab in the build results

Even if the code coverage output is not perfectly formatted, you can immediately verify the code coverage percentage of your tests.

Gian Maria.
