Azure DevOps and SecDevOps

One of the cool aspects of Azure DevOps is its extensibility through the Marketplace, and for security you can find a nice Marketplace add-in called OWASP ZAP Scan (https://marketplace.visualstudio.com/items?itemName=kasunkodagoda.owasp-zap-scan) that can be used to automate OWASP tests against web applications.

You can also check this nice article on the Microsoft Developer Blogs, https://devblogs.microsoft.com/premier-developer/azure-devops-pipelines-leveraging-owasp-zap-in-the-release-pipeline/, that explains how you can leverage OWASP ZAP analysis during a deploy with a release pipeline.
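Just to give an idea of how it plugs into a pipeline, a scan step could look roughly like the sketch below. The task reference and the input names are hypothetical placeholders (check the extension page for the real schema); the idea is simply that you point a running ZAP instance at the URL of the application you have just deployed.

# Purely illustrative sketch: task reference and input names are hypothetical,
# check the extension page for the real schema.
- task: OwaspZapScan@1
  displayName: 'OWASP ZAP scan'
  inputs:
    ZapApiUrl: 'http://my-zap-host:8080'          # hypothetical: URL of a running ZAP instance
    ZapApiKey: '$(ZapApiKey)'                     # hypothetical: API key stored as a secret variable
    TargetUrl: 'https://my-staging-app.example'   # hypothetical: the web application deployed by the release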

Really good stuff to read and use.

WIQL editor extension for Azure DevOps

One of the nice features of Azure DevOps is its extensibility: thanks to the REST API you can write add-ins or standalone programs that interact with the service. One of the add-ins I like the most is the Work Item Query Language Editor, a nice extension that allows you to interact directly with the underlying syntax of Work Item queries.

Once installed, whenever you are in the Query Editor you can edit the query directly in WIQL syntax, thanks to the “Edit Query wiql” menu entry.

Figure 1: WIQL query editor new menu entry in action

As you can see in Figure 2, there are lots of nice features in this add-in, not only the ability to edit a query directly in WIQL syntax.

Figure 2: WIQL editor in action

You can clearly edit and save the query (3), but you can also export the query to a file that is downloaded to your PC and can then be re-imported into a different Team Project. This is a nice function if you want to store some typical queries somewhere (in source control, for example) and re-import them into a different Team Project, or even a different organization.

If you start editing the query, you will be amazed by the intellisense support (Figure 3) that guides you in writing a correct query; it is really useful because it suggests a nice list of all available fields.

Figure 3: Intellisense in action in the Query Editor.

The intellisense actually seems to use the REST API to grab the list of all valid fields, because it even suggests custom fields used in your custom process. The only drawback is that it lists all the available fields, not only those available in the current Team Project, but this is a really minor issue.

With intellisense, syntax checking and field suggestions, this add-in is really a must-install for your Azure DevOps instance.

Figure 4: Intellisense is available not only for default fields, but also for custom fields used in a custom process.

If you are interested in the editor used, this add-in is based on the Monaco editor, another nice piece of open source software by Microsoft.

Another super cool feature of this extension is the Query Playground, where you can simply type your query, execute it and visualize the result directly in the browser.

Figure 5: WIQL playground in action; note the ASOF operator used to issue a query against a past date.

As you can see from Figure 5, you can easily test your query, but most importantly the ASOF operator is fully supported, which gives you the ability to run historical queries directly from the web interface instead of resorting to the API. If you need to experiment with WIQL and quickly create and test a query, this is the tool to go to.

I think this add-in is really useful, not only if you are interacting with the service through the REST API and raw WIQL, but also because it allows you to export/import queries between projects and organizations and to simply execute historical queries directly from the UI.

Having full support for WIQL allows you to use features that are not usually available through the UI, like the ASOF operator.
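Just to give an idea, a historical query typed in the playground could look like the following sketch (standard system fields, with an arbitrary example date):

SELECT [System.Id], [System.Title], [System.State]
FROM WorkItems
WHERE [System.TeamProject] = @project
    AND [System.WorkItemType] = 'Bug'
ORDER BY [System.Id]
ASOF '2019-01-01'

The ASOF clause at the end makes the query return the state of the Work Items as it was at that date.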

As a last trick: if you create a query in the web UI, then edit it with this add-in, add an ASOF operator and save, the ASOF will be saved in the query, so you have a historical query executable from the UI. The only drawback is that if you modify the query with the web editor and then save, the ASOF operator will be removed.

Gian Maria.

TFS 2019, Change Work Item Type and Move Between Team Projects

When the first version of Team Foundation Server on Azure was presented, it had fewer features than the on-premises version, but Azure DevOps has changed the situation: the reality is that new features are introduced first in Azure DevOps and then in Azure DevOps Server (the on-premises version).

A couple of features were really missing from the on-premises version: the ability to change Work Item Type and the ability to move Work Items between Team Projects. These two features have been available for a long time in the online version, but they were not present on-premises until Azure DevOps Server 2019, currently in RC1.

Figure 1: Change Type and Move to Team Project in Azure DevOps.

But if you installed Azure DevOps Server (TFS 2019) you could be disappointed, because those two functions seem to still be missing from the product.

The fact is that these two functions are actually present in the product, but they are not available if Reporting Services is enabled. The reason is that changing Work Item Type or moving between projects would mess up the data in the Warehouse database, so if you want these two features you need to disable the reporting features. Everything is described in the Product Notes, but I noticed that most people miss this information.

To have Change Type and Move Work Item between Team Projects you need to disable the Reporting Services feature of the product.

Reporting Services is one of those features that is often installed but never used, so if you are not using it I suggest disabling it from the Administration Console, because being able to change Work Item Type or to move Work Items between projects is a far more useful feature.

Figure 2: How to disable reporting in the Administration Console.

To disable Reporting Services just open the Administration Console, select the Reporting node (1), stop the job (2) and finally Disable Reporting features (3). You will be prompted to enter the name of the server to confirm that you really want to disable Reporting, then you are done.

Figure 3: Warehouse and Reporting disabled for the instance.

If you want to create custom reporting, I suggest you start having a look at Power BI, which recently added a connector even for Azure DevOps Server instances.

Once reporting is disabled, just refresh the web UI and the Move To Team Project and Change Type options should be available in all Team Projects of every Collection.

If you are not sure whether anyone is actually using the reporting feature, ask the members of the team about their usage of base or custom reports, or whether there is some in-house or third-party tool that reads data from the Warehouse database.

If Reporting Services is actually used, Microsoft encourages you to try the Analytics Marketplace extension (https://marketplace.visualstudio.com/items?itemName=ms.vss-analytics), or you can have a look at Power BI.

Gian Maria

Azure DevOps Pipelines and SonarCloud give free analysis to your OS project

In a previous post I’ve shown how easy it is to create a YAML build definition to build your GitHub open source project in Azure DevOps, without spending any money or installing anything on your servers.

Once you have a default build that compiles and runs tests, it would be super nice to create a free account in SonarCloud and have your project code analyzed automatically by the Azure Pipeline you’ve just created. I’ve already blogged about how to set up SonarCloud analysis for an OS project with a VSTS build, and the very same technique can be used in a YAML build.

Once you have a free YAML Azure DevOps pipeline, it makes sense to enable analysis with SonarCloud

First of all you need to register with SonarCloud, create a project, set up the project key and create a token to access the account. Once everything is in place you can simply modify the YAML build to perform the analysis.

Figure 1: Task to start SonarCloud analysis.

The above task definition can be obtained by simply creating a build with the standard graphical editor, dropping the task in it and then letting the UI generate the corresponding YAML.

YAML builds do not currently have a graphical editor, but it is super easy to just create a fake build with the standard editor, drop a task into the definition, populate its properties and then let the UI generate the YAML that can be copied into the definition.

Once the analysis task is in place, you can simply place the “Run Code Analysis” task after the build and test tasks. The full code of the build is the following.

# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/windows/dot-net

pool:
  vmImage: 'VS2017-Win2016'

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

variables:
  solution: 'migration/MigrationPlayground.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:

- task: GitVersion@1
  displayName: GitVersion 
  inputs:
    BuildNamePrefix: 'MigrationCI'

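# The 'SonarCloud' value below is the name of the service endpoint configured in the
# Azure DevOps project; project key, name and organization come from your SonarCloud account.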
- task: SonarSource.sonarqube.15B84CA1-B62F-4A2A-A403-89B77A063157.SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: 'SonarCloud'
    projectKey: xxxxxxxxxxxxxxxxxxx
    projectName: MigrationPlayground
    projectVersion: '$(AssemblyVersion)'
    extraProperties: |
     sonar.organization=alkampfergit-github
     sonar.branch.name=$(Build.SourceBranchName)

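# Standard NuGet restore, build and test of the solution.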
- task: NuGetToolInstaller@0

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: SonarSource.sonarqube.6D01813A-9589-4B15-8491-8164AEB38055.SonarQubeAnalyze@4
  displayName: 'Run Code Analysis'

Once you have changed the build, just push the code and let the build run; check that the build completes without errors, then verify that the analysis is present in the SonarCloud dashboard.

A couple of suggestions are useful at this point. First of all, you can encounter problems with endpoint authorization; if you have such a problem, check this link. Another issue is that the first analysis should be run on the master branch for SonarCloud to work properly: until you analyze the master branch, no analysis will be shown in SonarCloud.

If everything is green you should start seeing analysis data in the SonarCloud UI.

Figure 2: Analysis in SonarCloud after a successful master build

As you can see, with just a few lines of YAML I have my code automatically analyzed in SonarCloud, thanks to Azure DevOps pipelines already having tasks for SonarQube integration.

A nice finishing touch is to grab the badge link for the SonarCloud analysis and add it to the readme.md of your GitHub repository.
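For reference, the badge is just a markdown image link in readme.md, roughly of this form (copy the exact URL from your SonarCloud project page; the project key below is a placeholder):

[![Quality Gate](https://sonarcloud.io/api/project_badges/measure?project=YOUR_PROJECT_KEY&metric=alert_status)](https://sonarcloud.io/dashboard?id=YOUR_PROJECT_KEY)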

Figure 3: SonarCloud badge added to the readme.md of the project.

Gian Maria.

Code in GitHub, Build in Azure DevOps and for FREE

When you create a new open source project in GitHub, one of the first steps is to set up continuous integration; the usual question is: what CI engine should I use? Thanks to Azure DevOps, you can use free build pipelines to build projects even if they are in GitHub (not hosted in Azure DevOps).

Azure DevOps, formerly known as VSTS, allows you to define free build pipelines to build projects hosted in GitHub

After you have created a new project in GitHub, you can simply log in to your Azure DevOps account (https://dev.azure.com/yourname), go to Azure Pipelines and start creating a new pipeline.

Figure 1: Wizard to create a new pipeline in Azure DevOps

As you can see, you can choose a GitHub repository or Azure Repos. This is the latest UI available for Azure Pipelines and allows you to create a pipeline with YAML (definition as code). Since I really prefer this approach to the usual graphical editor, I choose to create my new pipeline with YAML, so I simply press GitHub to specify that I want to build a project hosted in GitHub.

Pressing the Authorize button you can authorize with OAuth in GitHub; if you prefer, you can use a token or install the Azure DevOps app from the GitHub Marketplace, but for this example I’m using OAuth because it is the simpler approach.

Figure 2: The Authorize button lets Azure DevOps Pipelines access your code in GitHub.

Once logged in, I can browse and search for the repository I want to build.

Figure 3: I’ve chosen the repository I want to build.

When you choose the repository, the wizard analyzes the code in it, suggesting the template that best suits your needs; in my example the code is a standard .NET Desktop application (it is a console app).

Figure 4: Template suggestion from the wizard.

You can choose another template or start from an empty one. Whatever your choice, you can always change it later, so I choose .NET Desktop and move on.

Thanks to the new wizard, you can start with a template and a YAML definition that contains basic steps to use as a starting point.

Once I’ve chosen the template, the wizard generates a YAML build definition based on that template, as shown in Figure 5.

Figure 5: Generated YAML template

Clearly the YAML code for the build should live in the repository, so I press the Save and run button and choose to create the file in a separate branch.

Figure 6: Create the YAML build directly in the GitHub repository, but in a different branch.

Once the wizard commits the YAML definition, the build starts immediately so you can verify that everything is ok. The nice aspect is that you do not need to configure any build agent, because the build is executed by a hosted agent, an agent automatically managed by Microsoft and hosted in Azure. For open source projects Azure DevOps gives you 10 concurrent builds with unlimited minutes per month, which is really cool.

Azure Pipelines gives you unlimited free build minutes per month and 10 concurrent builds for open source projects.

Figure 7: Build running on the hosted agent.

The YAML definition is created in the root folder of the repository; if you do not like that position you can simply change the location and name of the file manually, then update the build definition to point to the new location. Usually I change the location, add the list of branches I want to monitor with continuous integration and add my GitVersion task to assign the build number with GitVersion.

Figure 8: Small modifications to the build definition: triggers and GitVersion task.
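A minimal sketch of such a modification, borrowing the trigger list and the GitVersion task from the pipeline shown in the previous section (the build name prefix is just a placeholder), could be:

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

steps:

# Assign the build number from the git history using the GitVersion task
- task: GitVersion@1
  displayName: GitVersion
  inputs:
    BuildNamePrefix: 'MyProjectCI'   # placeholder prefix, use the name of your project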

Just push the new definition, and the triggers will automatically build every push to the standard master, develop, feature, release and hotfix branches. Thanks to the YAML definition, everything regarding the build is defined in the YAML file.

Once the build is up and running, you can go to the summary of the pipeline definition and grab the link for the badge to use in the GitHub readme.

Figure 9: The Status badge menu gives you the link for the status badge of the selected pipeline.

Pressing Status badge will show you links to render build badges. Usually you put these links in the Readme.md of your repository. If you look at the badge URL you can see that you can specify any branch name; for a gitflow-enabled repository I’m going to show the status of at least the master and develop branches.
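As an example, the markdown for two branch-specific badges could look roughly like this (copy the exact links from the Status badge dialog; organization, project and definition names below are placeholders):

![Build status master](https://dev.azure.com/myorg/myproject/_apis/build/status/MyProject-CI?branchName=master)
![Build status develop](https://dev.azure.com/myorg/myproject/_apis/build/status/MyProject-CI?branchName=develop)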

Et voilà, badges can be included in GitHub readme.

Figure 10: Badges in the GitHub readme show the status of continuous integration for your project.

Thanks to Azure Pipelines, with a few minutes of work I’ve set up a continuous integration pipeline: absolutely for free, without the need to install any agent, and directly with YAML code.

Gian Maria.