Analyze your GitHub project for free with Azure DevOps and SonarCloud

A few weeks ago I blogged about how to analyze Open Source code with SonarCloud, but it is time to update that post, because if you want to use SonarCloud you now have a dedicated extension in the marketplace.

image 

Figure 1: Official SonarCloud extension in the marketplace.

One of the great features of Azure DevOps is its extensibility, which allows people outside Microsoft to create extensions that expand the capabilities of the tool. Once you’ve added the SonarCloud extension to your account, you have a whole bunch of new build templates you can use:

image

Figure 2: Build templates based on SonarCloud

Having a template makes it super easy to create a build: just choose .NET Desktop with SonarCloud and you are ready to go. As you can see in Figure 2, you can also use Azure DevOps pipelines to build with Gradle, Maven or .NET Core, so you are not confined to Microsoft tooling.

In Figure 3 you can see the build created by the .NET Desktop project template (remember that this template can also be used for web applications, and in fact for every .NET application).

image

Figure 3: .NET SonarCloud analysis template.

The only task you need to configure for SonarCloud analysis is Prepare analysis on SonarCloud. As you can see in Figure 4, you should first create an endpoint that connects Azure DevOps to your SonarCloud account.

image

Figure 4: In the task configuration you have a nice button to create the connection to your SonarCloud account

Configuring the connection is really simple: just give a name to the connection and specify the access token (you should first generate a token in SonarCloud). Then, as shown in Figure 5, press Verify Connection to check that everything is OK.

image

Figure 5: Configuration and test of the connection between Azure DevOps and SonarCloud.

Thanks to the concept of external services, you can configure one or more connections to SonarCloud and have them available in the build without disclosing tokens.

Once you’ve selected the connection, just specify the name and key of the project, plus other optional parameters if you need a custom analysis. In less than a couple of minutes you have a build up and running. Just configure the build to use the Hosted VS2017 agent pool and queue a first build to verify that everything is OK.

Once you have configured the build with the visual web designer, you can convert it to a YAML build in a few steps.

I clearly prefer a YAML build for a lot of reasons; once the build is up and running, simply press the YAML button in the build definition to have your build converted to YAML.

# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/windows/dot-net

pool:
  vmImage: 'VS2017-Win2016'

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

variables:
  solution: 'migration/MigrationPlayground.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:

- task: GitVersion@1
  displayName: GitVersion 
  inputs:
    BuildNamePrefix: 'MigrationCI'

- task: SonarSource.sonarcloud.14d9cde6-c1da-4d55-aa01-2965cd301255.SonarCloudPrepare@1
  displayName: 'Prepare analysis on SonarCloud'
  inputs:
    SonarCloud: 'SonarCloud'
    organization: 'alkampfergit-github'
    projectKey: MigrationPlayground
    projectName: MigrationPlayground
    projectVersion: '$(AssemblyVersion)'

- task: NuGetToolInstaller@0

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: SonarSource.sonarcloud.ce096e50-6155-4de8-8800-4221aaeed4a1.SonarCloudAnalyze@1
  displayName: 'Run Code Analysis'

- task: SonarSource.sonarcloud.38b27399-a642-40af-bb7d-9971f69712e8.SonarCloudPublish@1
  displayName: 'Publish Quality Gate Result'




Finally, if you still have not enabled Azure DevOps Pipelines in your GitHub account, I strongly suggest you do so: just follow the instructions in this article. It is free and gives you hosted pipelines to run your builds.

Gian Maria

Converting a big project to new VS2017 csproj format

Visual Studio 2017 introduced a new .csproj file format for C# projects that is the key to moving to .NET Standard, but it is also useful for Full Framework projects, because it is much easier to manage by hand.

The main drawback of this approach is that you lose compatibility with older versions of VS: if you open the .csproj with VS2015 you are not able to compile the project anymore. For this reason, switching to the new format should be a decision that is well discussed in the team.

The new project format is really human friendly; the only drawback is losing compatibility with older versions of VS.

Luckily enough, when you have a solution with 68 projects, you do not need to do all the work manually, thanks to a nice tool found on GitHub: CsprojToVs2017. I’ve done manual conversions in the past; it is time consuming and frustrating. The above tool is not perfect, but it does most of the work for you.

First of all I ran the tool to convert the entire solution without creating backup files (why take a backup when you can do all the testing in a Git branch isolated from the rest of the code?).

csproj-to-2017 --no-backup .\mysln.sln

The tool outputs lots of information, warnings and other suggestions; I simply stored them in a file for reference, then immediately opened the solution to verify that it compiles successfully. The answer was no, but most of the work was already done for me.

An automated tool can do most of the work for you, but you will always need some manual work to finish and fine-tune the conversion.

First of all, the tool moves all assembly information out of the assemblyinfo.cs file, and in my situation this generated an error because I have an AssemblyInformationalVersion attribute in my assemblyinfo.cs. The value of this attribute is “localCompile”, and here is what the tool generates.

image

Figure 1: Incorrect version generated by the conversion tool

I still prefer all version information to be stored inside assemblyinfo.cs, so I removed all version information from every .csproj file, added the <GenerateAssemblyInfo>false</GenerateAssemblyInfo> property to avoid automatic generation of assembly info, and finally reverted all the modifications done to the assemblyinfo.cs files to restore their previous content.
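As a sketch, the relevant fragment of a converted .csproj with this property looks roughly like the following (the target framework value is just an example; the property names are the standard MSBuild ones):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net461</TargetFramework>
    <!-- Keep version attributes in assemblyinfo.cs:
         do not auto-generate assembly info from csproj properties -->
    <GenerateAssemblyInfo>false</GenerateAssemblyInfo>
  </PropertyGroup>
</Project>
```

With GenerateAssemblyInfo set to false, the build no longer emits its own AssemblyInfo attributes, so the hand-written assemblyinfo.cs does not conflict with generated ones.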

Then I started getting a strange error:

Assets file ‘…\obj\project.assets.json’ not found. Run a NuGet package restore to generate this file.

This error can be solved simply by running dotnet restore. In this particular situation it probably happens because the .sln file still contains a couple of web projects that were not converted, because web projects still need to use the old format. If we remove those two projects from the solution, everything compiles correctly; probably VS2017 gets confused when a solution mixes the two types of project. The funny thing is that this error did not happen to every member of the team, and it happened only when switching from a branch with the old format to one with the new format. This is usually not a big problem: when we switch branch we only need to issue a git clean -xdf to remove every file not in source control, then run dotnet restore, and we are ready to go. Now that we no longer have active branches with the old format, this issue has gone away.
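The branch-switch cleanup can be sketched as follows; this demonstrates git clean -xdf in a throwaway repository (in a real solution you would run the last two commands at the repository root, followed by dotnet restore):

```shell
# Create a throwaway repo to show what "git clean -xdf" removes.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m init
# Simulate stale restore output left behind by the old-format branch.
mkdir -p obj
echo '{}' > obj/project.assets.json
# -x: also remove ignored files, -d: remove directories, -f: force.
git clean -xdf
if [ -d obj ]; then echo "obj still present"; else echo "obj cleared"; fi
```

The key point is that obj folders are not under source control, so git clean -xdf wipes the stale project.assets.json files that confuse the build after a format switch.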

Then we started getting some compilation errors, because the new project format automatically includes all .cs files inside the project folder, and over the years some files were removed from the project but left on the file system, as you can see in the following picture, where the converted project is on the left and the original one on the right.

image

As you can verify, the file ConfigurationManagerProcessManagerConfiguration is present on the file system but excluded from the project (right part of the picture), and it is incorrectly included by the new project format.

To solve this problem you should open the converted version and the original version of the project side by side, and manually delete all .cs files that are still on disk but not referenced by the old project format, to avoid them being incorrectly included after conversion.

One nice side effect of the conversion is that we found a bunch of files still in source control but not included in the solution.

Then we had a problem with post-build actions, because we use pre- and post-build actions to xcopy some files around, and with the new format they seem not to work anymore. It turns out that in post-build and pre-build actions some of the properties, such as $(ConfigurationName), were not correctly populated, so all the post- and pre-build actions generated errors. The solution is really simple: convert them to a standard MSBuild Target, as seen in the following snippet, where you can see the correct way to use xcopy together with the original code that got converted.

image

The conversion is really simple: the original part is commented out in the lower part of the previous snippet, and it is composed of a series of xcopy commands inside a PreBuildEvent. To make it work with the new csproj format, add a Target with BeforeTargets="PreBuildEvent" so that it runs before the pre-build event, then add an Exec task for each xcopy instruction. Remember also that the output folder is not bin/Debug; you need to append the target framework folder (e.g. net461).
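As a sketch (the copied paths and the target name are hypothetical, since the original snippet is only shown as an image), the conversion looks roughly like this:

```xml
<!-- Old format: xcopy commands inside a PreBuildEvent property (now commented out) -->
<!--
<PropertyGroup>
  <PreBuildEvent>xcopy /Y "$(SolutionDir)config\*.config" "$(ProjectDir)"</PreBuildEvent>
</PropertyGroup>
-->

<!-- New format: a Target scheduled before the PreBuildEvent target,
     with one Exec task per xcopy instruction -->
<Target Name="CopyConfigFiles" BeforeTargets="PreBuildEvent">
  <Exec Command="xcopy /Y &quot;$(SolutionDir)config\*.config&quot; &quot;$(ProjectDir)&quot;" />
</Target>
```

Inside a Target, properties like $(ConfigurationName) are evaluated at build time, which is why this form works while the plain PreBuildEvent property does not.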

The whole conversion took almost a couple of hours for a 68-project solution, and thanks to the automatic conversion tool most of the work was automated; we only needed to tweak the project files a little and solve some problems due to the fact that this is a 5-year-old solution.

This conversion also forces you to review your builds in VSTS or whatever build system you are using. I encountered a couple of modifications needed for every build.

Once you switch to the new project format, it is time to check all the automated builds.

  1. First of all, you need to be sure that the version of NuGet used to restore dependencies is at least 4.6. To be sure of the NuGet version used, you should use the NuGet Tool Installer task together with the NuGet task, as I described in a previous post.

image

If you are using the wrong NuGet version, the build will probably fail with an error like: Assets file ‘…\project.assets.json’ not found. Run a NuGet package restore to generate this file.
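In a YAML build, the same pair of tasks can be sketched like this (the task versions match the ones used elsewhere in this post; the versionSpec value is the part that matters):

```yaml
- task: NuGetToolInstaller@0
  inputs:
    versionSpec: '4.6.0'   # the new csproj format needs at least NuGet 4.6 to restore

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'
```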

Another problem is due to switching between a branch that still uses the old project format and a branch that uses the new one. It is imperative that the obj folders are cleared before building again, or the build will probably fail. To accomplish this, I simply decided to clear everything before getting sources. (This is related to the issue I explained before.)

image

Symptoms of not clearing the obj folder are weird errors like: Your project file doesn’t list ‘win’ as a “RuntimeIdentifier”. You should add ‘win’ to the “RuntimeIdentifiers” property in your project file and then re-run NuGet restore.

The overall process went smoothly, and now we are ready to start the migration to .NET Core.

Gian Maria.

NullReferenceException in windows when Git fetch or pull

After updating Git for Windows to the newer 2.19.1, it can happen that you are no longer able to use the credential manager. The symptom is: whenever you git fetch or pull, you get a NullReferenceException and/or the error unable to read askpass response from ‘C:/Program Files/Git/mingw64/libexec/git-core/git-gui--askpass’

The Git Credential Manager for Windows shipped with version 2.19.1 can have some problems and generate a NullReferenceException

Clearing the Windows Credential Manager does not solve the problem; you still get the same error even if you clone the repo again in another folder. To fix this you can simply download and install the newest version of the Git Credential Manager for Windows. You can find everything at this address.

image

Figure 1: Download page for release of Git Credential Manager for Windows

Just install the newest version and the problem should be solved.

Gian Maria

Azure DevOps pipelines and Sonar Cloud gives free analysis to your OS project

In a previous post I’ve shown how easy it is to create a YAML build definition to build your GitHub Open Source project in Azure DevOps, without spending any money or installing anything on your server.

Once you have a default build that compiles and runs tests, it would be super nice to create a free account in SonarCloud and have your project code analyzed automatically by the Azure Pipeline you’ve just created. I’ve already blogged about how to set up SonarCloud analysis for OS projects with VSTS builds, and the very same technique can be used in a YAML build.

Once you have free YAML Azure DevOps pipeline, it makes sense to enable analysis with SonarCloud

First of all you need to register with SonarCloud, create a project, set up the key and create a token to access the account. Once everything is in place you can simply modify the YAML build to perform the analysis.

image

Figure 1: Task to start SonarCloud analysis.

The above task definition can be obtained by simply creating a build with the standard graphical editor, then pressing the YAML button to have the UI generate the YAML for the task.

Actually, YAML builds do not have an editor yet, but it is super easy to create a fake build with the standard editor, drop a task into the definition, populate its properties, then let the UI generate the YAML that can be copied into the definition.

Once the analysis task is in place, you can simply place the “Run Code Analysis” task after the build and test tasks. The full code of the build is the following.

# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/windows/dot-net

pool:
  vmImage: 'VS2017-Win2016'

trigger:
- master
- develop
- release/*
- hotfix/*
- feature/*

variables:
  solution: 'migration/MigrationPlayground.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:

- task: GitVersion@1
  displayName: GitVersion 
  inputs:
    BuildNamePrefix: 'MigrationCI'

- task: SonarSource.sonarqube.15B84CA1-B62F-4A2A-A403-89B77A063157.SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: 'SonarCloud'
    projectKey: xxxxxxxxxxxxxxxxxxx
    projectName: MigrationPlayground
    projectVersion: '$(AssemblyVersion)'
    extraProperties: |
     sonar.organization=alkampfergit-github
     sonar.branch.name=$(Build.SourceBranchName)

- task: NuGetToolInstaller@0

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: SonarSource.sonarqube.6D01813A-9589-4B15-8491-8164AEB38055.SonarQubeAnalyze@4
  displayName: 'Run Code Analysis'




Once you’ve changed the build, just push the code and let the build run; check that the build completes without errors, then verify that the analysis is present in the SonarCloud dashboard.

A couple of suggestions are useful at this point. First of all, you can encounter problems with endpoint authorization; if you have such a problem, check this link. Another issue is that you should analyze the master branch first for SonarCloud to work properly: until you analyze the master branch, no analysis will be shown in SonarCloud.

If everything is green you should start seeing analysis data on SonarCloud UI.

image

Figure 2: Analysis in SonarCloud after a successful master build

As you can see, with just a few lines of YAML I have my code automatically analyzed in SonarCloud, thanks to Azure DevOps pipelines that already include tasks for SonarQube integration.

A nice finishing touch is to grab the badge link for the SonarCloud analysis and add it to your GitHub readme.md.
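For reference, a SonarCloud quality-gate badge in readme.md can be sketched like this (the project key below reuses the one from this post; adjust it to your own):

```markdown
[![Quality Gate](https://sonarcloud.io/api/project_badges/measure?project=MigrationPlayground&metric=alert_status)](https://sonarcloud.io/dashboard?id=MigrationPlayground)
```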

image

Figure 3: SonarCloud badge added to readme.md of the project.

Gian Maria.

Code in GitHub, Build in Azure DevOps and for FREE

When you create a new open source project in GitHub, one of the first steps is to set up continuous integration; the usual question is: what CI engine should I use? Thanks to Azure DevOps, you can use free build pipelines to build projects even if they are hosted in GitHub (not in Azure DevOps).

Azure DevOps, formerly known as VSTS, allows you to define free build pipelines to build projects hosted in GitHub

After you create a new project in GitHub, you can simply log in to your Azure DevOps account (https://dev.azure.com/yourname), then go to Azure Pipelines and start creating a new pipeline.

image

Figure 1: Wizard to create a new pipeline in Azure DevOps

As you can see, you can choose a GitHub repository or Azure Repos. This is the latest UI available for Azure Pipelines and allows you to create a pipeline with YAML (definition as code). Since I really prefer this approach to the usual graphical editor, I chose to create my new pipeline with YAML, so I simply pressed GitHub to specify that I want to build a project hosted in GitHub.

Pressing the Authorize button lets you authorize with OAuth in GitHub; if you prefer, you can use a token or install the Azure DevOps app from the GitHub Marketplace, but for this example I’m using OAuth, because it is the simpler approach.

image

Figure 2: The Authorize button lets you authenticate with GitHub so that the Azure DevOps pipeline can access your code

Once logged in, I can browse and search for the repository I want to build.

image

Figure 3: I’ve chosen the repository I want to build.

When you choose the repository, the wizard analyzes the code in it, suggesting the template that best suits your needs; in my example the code is a standard .NET Desktop application (it is a console app).

image

Figure 4: Template suggestion from the wizard.

You can choose another template or start from an empty one. Whatever your choice, you can always change the template later, so I chose .NET Desktop and moved on.

Thanks to the new wizard, you can start with a template and a YAML definition that contains basic steps to use as a starting point.

Once I chose the template, the wizard generated a YAML build definition based on it, as shown in Figure 5.

image

Figure 5: Generated YAML template

Clearly the YAML code for the build should live in the repository, so I pressed the Save and Run button, then chose to create the file in a separate branch.

image

Figure 6: Create the YAML build directly in GitHub repository, but in a different branch.

Once the wizard commits the YAML definition, the build starts immediately so you can verify that everything is OK. The nice aspect is that you do not need to configure any build agent, because the build is executed by a Hosted Agent, an agent automatically managed by Microsoft and hosted in Azure. For open source projects, Azure DevOps gives you 10 concurrent builds with unlimited minutes per month, which is really cool.

Azure Pipelines gives you free build minutes per month and 10 concurrent builds for open source projects.

image

Figure 7: Build is running on hosted agent.

The YAML definition is created in the root folder of the repository; if you do not like that location, you can simply change the location and name of the file manually, then update the build definition to use the new location. Usually I change the location, add the list of branches I want to monitor with continuous integration, and add my GitVersion task to assign the build number with GitVersion.

image

Figure 8: Small modification of the build definition, triggers and GitVersion task.

Just push the new definition and you now have triggers that automatically build every push to the standard master, develop, feature, release and hotfix branches. Thanks to the YAML definition, everything regarding the build is defined in the YAML file.

Once the build is up and running, you can go to the summary of the pipeline definition and grab the link for the badge to use in the readme on GitHub.

image

Figure 9: The Status Badge menu lets you get the link for the status badge of the selected pipeline.

Pressing Status Badge will show you links to render build badges. Usually you put these links in the Readme.md of your repository. If you look at the badge URL you can see that you can specify any branch name; for a gitflow-enabled repository I’m going to show status for at least the master and develop branches.
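As a sketch (organization, project and pipeline names below are placeholders; copy the real URL from the Status Badge dialog), the two badges in Readme.md could look like this:

```markdown
![master](https://dev.azure.com/yourorg/yourproject/_apis/build/status/YourPipeline?branchName=master)
![develop](https://dev.azure.com/yourorg/yourproject/_apis/build/status/YourPipeline?branchName=develop)
```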

Et voilà, badges can be included in GitHub readme.

image

Figure 10: Badges in GitHub readme can show the status of the continuous integration for your project.

Thanks to Azure Pipelines, I’ve set up a continuous integration pipeline with a few minutes of work: absolutely for free, without the need to install any agent, and directly with YAML code.

Gian Maria.