Bash pornography

Suppose you have two Git repositories kept in sync with a VSTS build. After some time, the usual problem arises: branches deleted from the “main” repository are not deleted from the mirrored repository.

The reason is that whenever someone pushes to the main repository, the VSTS build pushes the branch to the other repository, but when someone deletes a branch, no build is triggered, so the mirrored repository still retains all branches.

This is usually not a big problem: I can accept that the cloned repository retains deleted branches for some time after a branch is deleted from the original repository, and I can accept that this kind of cleanup could be scheduled or done manually from time to time. My question is:

In a git repository with two remotes configured, how can I delete branches in remote B that are not present in remote A?

I am pretty sure that there are lots of possible solutions using Node.js, PowerShell or some other scripting language. After all, we are programmers, and it is quite simple to issue a git branch -r command, parse the output to determine the branches in repository B that are not in repository A, then issue a git push --delete command for each of them.

Since I use Git with Bash even on Windows, I really prefer a single-line Bash instruction that does everything for me.

The solution is this command and I must admit that it is some sort of Bash Pornography :D.

diff <(git branch -r | grep origin/ | sed 1d | sed "s/origin\///") <(git branch -r | grep vsts/ | sed 1d | sed "s/vsts\///") | grep "^>" | sed "s/>\s*//" | xargs git push --delete vsts

This is some sort of unreadable line, but it works. It is also not really complex; you only need to consider its basic constituents. First of all, I use the diff command to diff two input streams. The general syntax is diff <(1) <(2), where (1) and (2) are the outputs of two bash commands. In the above example (1) is:

(git branch -r | grep origin/ | sed 1d | sed "s/origin\///")

This command lists all the branches, then keeps only the lines that start with origin/ (grep origin/), then removes the first line (sed 1d) because it contains the origin HEAD pointer, and finally removes the leading origin/ from each line (sed "s/origin\///"). If I run this I get something like this.


Figure 1: Output of the command to list all branches in a remote

This is a simple list of all branches present on the remote origin. If you look at the big command line, you can notice that the diff instruction diffs the output of this command against an identical command that lists the branches of the vsts remote.

The output of the diff command contains one line, prefixed with >, for each branch that is in VSTS but not in origin, so you can keep only those lines, strip the prefix and pipe the resulting list to xargs to issue a push --delete for each branch.
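To see the filtering part in isolation, here is a minimal sketch that replaces the two git branch -r pipelines with hard-coded branch lists, and the final git push --delete with echo:

```shell
# Two sample branch lists: the second (the mirror) has one extra branch.
# In the real command these come from the two 'git branch -r' pipelines.
diff <(printf 'develop\nfeature/a\nmaster\n') \
     <(printf 'develop\nfeature/a\nfeature/old\nmaster\n') |
  grep "^>" |          # keep only the lines unique to the second list
  sed "s/>\s*//" |     # strip the '> ' prefix added by diff
  xargs echo deleting  # stand-in for: xargs git push --delete vsts
# prints: deleting feature/old
```

Only feature/old appears in the second list and not in the first, so it is the only branch name that survives the filter and reaches xargs.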

Now you can use this command: simply issue a git fetch --prune for each remote, then paste the instruction in your bash, press return, and the game is done.
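Before pasting the one-liner, refresh the remote-tracking branches of both remotes; without --prune, the local origin/ list would still contain the branches that were deleted upstream (vsts is the remote name used throughout this post):

```shell
# --prune removes remote-tracking refs for branches deleted on the remote,
# so the two branch lists compared by the one-liner are up to date.
git fetch --prune origin
git fetch --prune vsts
```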


Figure 2: Results of the command, two branches are deleted from VSTS remote because they were deleted from the origin remote

It is amazing what you can do with bash + grep + sed in a single line 😛

Gian Maria.

One reason to upgrade to TFS 2017: Code search

I always suggest teams use VSTS instead of on-premise TFS. The main reason is avoiding any issue with administration or upgrades. In recent years, one of the main risks of having TFS on-premise has been not planning for upgrades and keeping the first installed version for years. As a result, it is not uncommon to find teams that still use TFS 2013 even though version 2017 is out.

For small teams there is usually no dedicated TFS administrator; in this situation it is strongly advisable not to skip an entire major upgrade, to minimize the risk of a big-bang upgrade. If you have TFS 2015, I strongly suggest you upgrade to TFS 2017 before another major version is out. This will prevent having to upgrade two or more major versions at once, with the risk of long upgrade times and too many changes that could upset users.

As a general rule, if you need to use on-premise TFS, you should upgrade as frequently as possible and never skip one major upgrade.

If you are not sure that upgrading to TFS 2017 is worth the time needed to plan and perform the upgrade, I’d like to share with you some of the exciting new features.

My favorite feature of TFS 2017 is Code Search, which allows you to perform semantic searches within code.

Every developer loves experimenting with new libraries, new patterns, etc., and in the long run the hard disks of people in the team fill up with little projects. One day you remember that someone wrote a proof of concept to test bulk loading with ElasticSearch, but you do not know how to find the code. The obvious solution is storing everything inside your TFS (VSTS), using Git or TFVC, then searching the code with this new exciting functionality.

In TFS 2017 you have the “Search code” textbox in the upper left part of the Web UI, and if you start typing, a nice help popup shows you the syntax you can use.


Figure 1: All the syntax used to search inside code with the new Search Code Functionality

As you can see, you can search for text inside the name of a class, a comment, a constructor, etc., and you can also combine terms with AND, OR and NOT. This gives you tremendous flexibility, and usually in a few seconds you can find the code you need. Results are presented in a nice way, and you can immediately jump to the code to verify whether the result is really what you are searching for.
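For instance, to hunt for the ElasticSearch proof of concept mentioned earlier, a query along these lines would narrow the results (the identifiers are purely illustrative; the filter names are the ones listed in the help shown in Figure 1):

```
comment:ElasticSearch AND class:BulkLoader
```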


Figure 2: Results allow you to immediately browse the file with the match within the Web UI

If you are still on TFS 2015 or an earlier version, I strongly suggest you plan an upgrade to 2017 to start using this exciting new feature.

Gian Maria.

Building with an agent without Visual Studio installed

I had a build that ran fine on some agents; then I tried running the build on a different agent and it failed with this error:

Error MSB4019: The imported project “C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\WebApplications\Microsoft.WebApplication.targets” was not found

The problem originated from the fact that the build was configured to compile with VS2015 and to use the VS2015 test runner, but the only version of Visual Studio installed on the build machine was VS2013.

For various reasons I did not want to install VS 2015 on that build agent, so I tried to manually configure the agent to run my build and my unit tests without a full VS 2015 installation.

Warning: this technique worked for my build, but I cannot assure that it would work for your build.

Step 1: Install MSBuild 14 and targets

First of all I installed Microsoft Build Tools 2015 to have the very same version of MSBuild that VS2015 uses, but this was not enough, because the build still complained that it was unable to find Microsoft.WebApplication.targets. The solution was copying the entire directory C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0 from my developer machine (where everything compiles perfectly) to the very same directory on my build server.
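In practice this is just a recursive copy of the targets tree; here is a sketch with robocopy, assuming the agent’s disk is reachable through the hypothetical administrative share \\buildagent\c$ (otherwise simply copy the folder by any other means):

```
REM Hypothetical machine name; run from the developer machine.
REM /E copies the whole tree, including empty subdirectories.
robocopy "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0" "\\buildagent\c$\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0" /E
```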

This solution usually works because you are actually doing a manual copy of all the .targets files that MSBuild needs to compile the solution (ASP.NET, etc.). Now the source code compiled, but I got an error during test execution.

Step 2: Execute test with Visual Studio Test runner

Now the Test action failed with this error:

System.IO.FileNotFoundException: Unable to determine the location of vstest.console.exe

Since I had not installed Visual Studio 2015, the test runner was missing and tests could not execute. To solve this problem I installed the Visual Studio 2015 Agents, but the error was still there, even though I verified that the test runner was correctly installed. After some googling I discovered that I needed to modify the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\14.0, adding a string value named ShellFolder that points to the standard Visual Studio directory: C:\Program Files (x86)\Microsoft Visual Studio 14.0\


Figure 1: New registry value needed by the build agent to locate the test runner.
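The same value can also be created from an elevated command prompt on the agent with reg.exe; this is a sketch of the equivalent one-liner:

```
REM Adds the ShellFolder string value used to locate vstest.console.exe.
REM The doubled trailing backslash is needed because reg.exe would
REM otherwise treat \" as an escaped quote.
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\14.0" /v ShellFolder /t REG_SZ /d "C:\Program Files (x86)\Microsoft Visual Studio 14.0\\" /f
```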

After this last modification the solution builds and all tests run perfectly without VS 2015 installed on the build machine.

Please remember that this solution might not work in your environment. The official and suggested solution is installing on the build agents all the versions of Visual Studio you need to build your code.

Gian Maria

SonarQube UTF-8 error after upgrading

I upgraded a SonarQube instance, and suddenly some builds started failing due to a strange error in the SonarQube complete analysis task.

2016-12-09T11:41:00.8695497Z ##[error]WARN:
 Invalid character encountered in file C:\vso\_work\3\s\src\Jarvis.ConfigurationService.Client.CastleIntegration\Properties\AssemblyInfo.cs at line 25 for encoding UTF-8.
 Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.

This error in turn made the entire task execution fail with a really strange error:

2016-12-09T11:41:03.5135467Z ##[error]ERROR: Error during SonarQube Scanner execution
2016-12-09T11:41:03.5145430Z ##[error]java.lang.IllegalArgumentException: 5 is not a valid line offset for pointer. File [moduleKey=Jarvis.ConfigurationService:Jarvis.ConfigurationService:893ED554-3D86-47C8-B529-965329DB32AF, relative=Properties/AssemblyInfo.cs, basedir=C:\vso\_work\3\s\src\Jarvis.ConfigurationService.Client.CastleIntegration] has 1 character(s) at line 2
2016-12-09T11:41:03.5145430Z ##[error]at org.sonar.api.internal.google.common.base.Preconditions.checkArgument(Preconditions.java:145)
2016-12-09T11:41:03.5145430Z ##[error]at org.sonar.api.batch.fs.internal.DefaultInputFile.checkValid(DefaultInputFile.java:215)
2016-12-09T11:41:03.5145430Z ##[error]at org.sonar.api.batch.fs.internal.DefaultInputFile.newPointer(DefaultInputFile.java:206)

This is really annoying; I started investigating by logging into the build server and checking the file that generated the error. When you have strange encoding errors, I strongly suggest you inspect the file with a hex editor, not a standard text editor.

I immediately realized that the error was caused by my GitVersion task script, because it manipulates AssemblyInfo.cs files to change version numbers and saves them back with UTF-16 encoding. If you do not know what a Byte Order Mark is, I strongly suggest you take a look at Wikipedia, because it is an important concept for Unicode files.

I checked with a hex editor and verified that the original AssemblyInfo.cs had a UTF-8 BOM, but the PowerShell script converted it to UTF-16 (with a correct UTF-16 BOM). The annoying part is that the build worked perfectly until I updated SonarQube (server + analyzers); this means that, for some reason, the most recent version of Sonar does not check the BOM to determine the file encoding.

I solved the problem simply by changing the Set-Content call to force UTF-8 encoding, and the problem was gone.

Set-Content -Encoding UTF8 -Path $file.FullName -Force

I really like SonarQube, but you should always verify that everything works after every upgrade.

Gian Maria.

Enable new Work Item Form in TFS “15”

If you installed the TFS “15” Preview, one of the new features you expect to see is the new Work Item layout (already available in VSTS). You could be disappointed to find that your existing Work Items are still shown with the old interface, as you can see in Figure 1.


Figure 1: After upgrade TFS still shows the old Work Item Form

The new Work Item form is installed with an opt-in method, so it is disabled by default. To enable it you should navigate to the Project Collection administration page. From there you can see that the feature is actually disabled (Figure 2), along with the link to enable it.


Figure 2: New Work Item form is disabled by default after upgrade

If you click the “Enable the new work item form” link, you are informed that this operation will create a new layout for the Work Item, but you can choose, after the creation of the new layout, whether to use the new model and how to configure the opt-in model, as you can see in Figure 3.


Figure 3: Enabling the new Work Item layout starts with the creation of the new Layout.

Thanks to the opt-in method, you are not forced to use the new layout; you can activate it only if you want to.

After the creation of the layout you should configure the opt-in model, or you can disable the new Work Item form entirely.


Figure 4: Options to enable the new Work Item form with the opt-in model

The opt-in model basically allows you to decide who can view the new Work Item form layout. You have three options, as shown in Figure 5: you can give the ability to use the new layout only to administrators, extend it to all users, or force everyone to use the new layout, disabling the old one entirely.


Figure 5: Opt-in model options to enable the new Layout form

The opt-in model for the new layout form is really flexible, because you can leave the decision up to each single user.

The middle option is usually the least impacting, because each member of the team can choose to evaluate the new layout or stick with the old one. A new link appears at the top of the Work Item form, in the far right part, as you can see in Figure 6.


Figure 6: The opt-in model allows each user to choose to evaluate the new form.

If a user chooses to preview the new form, the page refreshes and the Work Item is rendered with the new layout. The user has the option to return to the old form if he does not like the new one, giving the whole team time to evaluate the feature and decide whether to use it.


Figure 8: The new form is active and the user can go back to the old form if they do not like it.

This setting is per collection, so you can choose a different opt-in model for each collection of your TFS.

Gian Maria.