Keep Git repository in sync between VSTS / TFS and Git

Scenario: you have a repository in Git, either open source or in a private repository, and you want to keep a synchronized mirror in VSTS / TFS.

There are some legitimate reasons to have a mirrored repository between GitHub (or some other external provider) and an instance of VSTS / TFS. Probably the most common one is keeping all development of a repository private and publishing only certain branches as open source. Another reason is having all the code on GitHub completely open source, but internally using VSTS Work Items to manage work with all the advanced tooling VSTS has to offer.

The solution to this problem is really simple: just use a build in VSTS that pushes new commits from GitHub to VSTS, or the opposite. Let's suppose that you have a GitHub repository and you want it to be mirrored in VSTS.

Step 1 – install extension to manipulate variables

Before creating the build you should install the Variable Toolbox extension from the marketplace. This extension allows you to manipulate build variables, and it is necessary if you use GitFlow.

From the list of Build Variables available in the build system there are two variables that contain information about the branch that is to be built. They are called Build.SourceBranch and Build.SourceBranchName, but neither of them contains the real name of the branch. SourceBranch contains the full name refs/heads/branchname, while SourceBranchName contains only the last path segment of the ref. If you use GitFlow and have a branch called hotfix/1.2.3, the full name of the branch is refs/heads/hotfix/1.2.3 and the variable SourceBranchName contains the value 1.2.3 … not really useful.

Thanks to the Variable Toolbox extension you can simply configure the task to replace the refs/heads part with an empty string, so you have a simple way to get a variable that contains the real name of the branch even if it contains a slash character.
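If you prefer not to install an extension, the same transformation can be done with a plain PowerShell task that uses the standard ##vso logging command to set a variable. This is a minimal sketch; the variable name GitBranchName is simply the one used in the rest of this post:

# Strip the refs/heads/ prefix from the full branch ref
$branch = $env:BUILD_SOURCEBRANCH -replace '^refs/heads/', ''
# Expose the clean name to subsequent tasks as GitBranchName
Write-Host "##vso[task.setvariable variable=GitBranchName]$branch"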

Step 2 – configure the build

The entire build is composed of three simple tasks: the very first is a Transform Value task (from Variable Toolbox), followed by two simple command-line tasks.


Figure 1: The entire build is three simple tasks.

The first task is used to remove the refs/heads/ part from $(Build.SourceBranch) and copy the result to the GitBranchName variable (you should have it defined in the Variables tab).


Figure 2: Transformation variable configured to remove refs/heads

Now we need a first command-line task that checks out the branch, because the build does not issue a checkout in git; it simply works on a detached HEAD.


Figure 3: Checkout of the branch through the git command line

As you can see in Figure 3 this operation is really simple: you invoke git from the command line, issuing the checkout $(GitBranchName) command with the variable created in the previous step; finally, you should specify that this command should be executed in $(Build.SourcesDirectory).
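In the command-line task configuration this boils down to a single git invocation, run with $(Build.SourcesDirectory) as the working folder:

git checkout $(GitBranchName)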

The last command-line task pushes the branch to the VSTS repository.


Figure 4: Git command line to push everything to VSTS

The configuration is really simple: I decided to push to the address https://$(token)@myaddress.visualstudio.com. The token variable (2) is a custom secret variable where I store a valid Personal Access Token that has rights to access the code. To push to the remote repository I use the syntax $(GitBranchName):$(GitBranchName), which pushes the local branch to the remote branch of the same name, with the --force option to allow forcing the push.
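Put together, the resulting git invocation looks roughly like this; the repository path after the host name is illustrative, use the clone url of your Target Repository:

git push https://$(token)@myaddress.visualstudio.com/DefaultCollection/_git/MyRepo $(GitBranchName):$(GitBranchName) --force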

Do not forget to mark your token variable as a secret variable, and configure continuous integration to keep synchronized only the branches you are interested in.


Figure 5: Configure the branches you want to keep synchronized

If you also need to keep tags synchronized, you can just add another git command-line invocation that pushes all tags with the --tags option.
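As a sketch, that extra task would invoke something like this (same illustrative remote url as above):

git push https://$(token)@myaddress.visualstudio.com/DefaultCollection/_git/MyRepo --tags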

The result

Thanks to this simple build, whenever we push something to GitHub, a build starts that automatically replicates that branch in VSTS without any user intervention.


Figure 6: Build result that shows the command line in action during a build.

Thanks to the free build minutes on the hosted build, we have a complete copy in VSTS of a GitHub repository, with automatic sync up and running in a few minutes.

The very same configuration can be reversed to automatically push some branches of your VSTS account to GitHub, useful if you want to automatically publish only some branches as open source.

Gian Maria.

Using PAT to authenticate your tools

One of the strong points of VSTS / TFS is extensibility through APIs, and now that we have a really nice set of REST APIs it is quite normal to write little tools that interact with your VSTS / TFS instances.

Whenever you write tools that interact with VSTS / TFS you need to decide how to authenticate to the server. While for TFS this is quite simple, because you can simply run the tool as an Active Directory user and use AD integration, in VSTS integrating with your AD requires more work and is not always a feasible solution.

Actually the best alternative is to use Personal Access Tokens to access your server, even if you are using TFS and could use AD authentication.
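As a quick reminder of how a PAT is used from a tool: REST calls authenticate with Basic authentication, with an empty username and the PAT as password. A minimal PowerShell sketch (account name and api-version are illustrative):

# Build a Basic authentication header from the PAT (username can be empty)
$pat = "YOUR_PAT_HERE"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Call a simple read-only endpoint to verify that the token works
Invoke-RestMethod -Uri "https://youraccount.visualstudio.com/DefaultCollection/_apis/projects?api-version=1.0" -Headers $headers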

PAT acts on behalf of a real user

You can generate a Personal Access Token from the security section of your user profile, and this immediately gives you the idea that the token is related to a specific account.


Figure 1: Accessing security information for your profile

From the Personal access tokens section of your profile you can generate tokens to access the server on behalf of your user. This means that the token cannot have more rights than your user has. This is interesting because if you revoke access to a user, all PATs related to that user are automatically disabled; also, whatever restriction you assign to the user (e.g. denying access to some code path) is inherently applied to the token.

PAT expires in time

You can see from point 1 of Figure 2 that the PAT has an expiration (maximum value is 1 year), and this implies that you run no risk of forgetting some tool authenticated somewhere over the years.


Figure 2: PAT Creation page in VSTS

A typical security problem happens when you create in your TFS / VSTS a user to run tools, such as TFSTool or similar ones. Then you use that user in every tool that needs unattended access to your TFS instance, and after some years you have no idea how many tools are deployed that have access to your server.

Thanks to PATs you can create a different PAT for each tool that needs to authenticate to your server unattended; after one year at most, the tool will lose authentication and will need a new token. This automatically prevents the risk of having old tools that after years still have access to your data even if they are not actively used anymore.

For VSTS (point 2) you should also specify the account that the PAT is able to access, if your user has rights to access more than one account.

PAT Scope can be reduced

In Figure 2, point 3 highlights that you can restrict the permissions of a PAT based on TFS / VSTS area. If your tool needs to manipulate Work Items and does not need to access code or other areas of TFS, it is a best practice to create the token giving it access only to Work Items. This means that, even if the user can read and write code, the token has access only to Work Items.

Another really important aspect is that many areas have the option to specify access in read-only mode. As an example, if your tool only needs to access Work Items to create some reports, you can give the PAT only Work Item (read) access, so the tool will be able to access Work Items only in a read-only way.

The ability to reduce the surface of data that can be accessed by a PAT is probably the number one reason to use PATs instead of AD authentication for on-premise TFS.

PAT can be revoked

Any PAT can be revoked at any time with a single click. This means that if you use the pattern of one PAT for each tool, you can selectively revoke authentication of any tool by revoking the associated PAT. This capability is really interesting for on-premise TFS, because to selectively revoke access to a specific tool without PATs you would need to use a different user for each tool and disable that specific user.

Conclusion

Using PATs is not only useful to create tokens for tools that need unattended authentication to the server; you can use a PAT even for tools that you use interactively, if you want to be sure that the tool will not have access to certain parts of your account (e.g. a PAT that can only access code, to use with Git tools), or if the tool does not support MSA or AAD authentication.

Import a Git Project with REST API between VSTS Team Projects

I’ve got an interesting question about the possibility of importing a Git repository between Team Projects of VSTS via REST API. The problem is: you want to import a private git repository from a Source Repository (in this situation another VSTS git repository, but it could be hosted anywhere) into a VSTS Target Repository using only REST APIs.

The operation is quite simple thanks to the new API described here (https://www.visualstudio.com/en-us/docs/integrate/api/git/import-requests#create-a-request-to-import-a-repository), and in this post I’ll give you all the details.

Step 1 – create a PAT

To access VSTS through REST APIs you have many options to authenticate the call, but the easiest one is using a PAT (Personal Access Token). If you do not already have a valid PAT you can create one from the security page of your account.


Figure 1: Open the security page of your account

Creating a PAT is really simple: you should select Personal Access Tokens (1), then give a description, an expiration time, and the account the PAT is valid for. Since I have more than one VSTS account, I have a combo where all of my accounts are listed (2).

Finally, you should select only the permissions you want to give to the token. The default option is All Scopes, which implies that the token can do pretty much anything you can do. If you need this token only to manage the import of repositories, you can select only code-related permissions.


Figure 2: Create a PAT to access your account.

Personal Access Tokens are the most secure way to authenticate an application in VSTS, because they can be revoked, you can choose the permissions you want to give to the token, and they have an automatic expiration.

If your Source Repository is in a different account than the Target Repository, you need to create a PAT both in the Source Account VSTS instance and in the Target Account VSTS instance. In this example the VSTS instance is the very same, so I need only one PAT.

Step 2 – Create endpoint to access Source Repository

My target repository is called ImportTest, and it is important that this repository is created empty. This is my Target Repository, the repository into which I want to import the Source Repository.


Figure 3: Create Target Repository with standard Web Interface

The import routine should be able to access the Source Repository, and this implies that it needs to be authenticated. To maximize security you need to create, in the Team Project of the Target Repository, an endpoint that points to the Source Repository. This can be easily done from the administration page of the Team Project that contains the Target Repository. My ImportTest repository lives in the GitMiscellaneous Team Project, so that is where I proceed to manually create the endpoint.


Figure 4: Create an endpoint of type External Git


Figure 5: Specify endpoint details

In Figure 5 you can see all the options needed: you should specify a connection name, then the URL parameter, which is the url of the Source Repository, the same url you use to clone it. Finally you need to use the PAT as username, then you can press OK.

This service endpoint should be created in the Team Project that contains the Target Repository, because it will be used by the import routine to authenticate to the Source Repository and take the data to import.

An endpoint is basically a URL plus the authentication information used by the server to access an external service.

If you need to automate the whole process, the endpoint can easily be created with REST APIs (https://www.visualstudio.com/en-us/docs/integrate/api/endpoints/endpoints); here is a simple call in Postman.


Figure 6: Creation of the endpoint with REST API

This does not need much explanation, because it is a simple call with the very same options that you specify in the UI.
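If you prefer a script to Postman, here is a hedged PowerShell sketch of the same call. The payload shape follows the endpoints documentation linked above, but the endpoint name and source url are illustrative, and the type string for an External Git endpoint is an assumption you should verify against the docs:

$pat = "YOUR_PAT_HERE"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Endpoint creation payload: an External Git endpoint that points to the
# Source Repository and uses the PAT as username (as in Figure 5)
$body = @{
	name = "SourceRepositoryEndpoint"   # illustrative name
	type = "git"                        # assumption: connection type for External Git endpoints
	url  = "https://gianmariaricci.visualstudio.com/DefaultCollection/_git/SourceRepo"  # illustrative source url
	authorization = @{
		scheme = "UsernamePassword"
		parameters = @{ username = $pat; password = "" }
	}
} | ConvertTo-Json -Depth 4

Invoke-RestMethod -Method Post `
	-Uri "https://gianmariaricci.visualstudio.com/GitMiscellaneous/_apis/distributedtask/serviceendpoints?api-version=3.0-preview.1" `
	-Headers $headers -ContentType "application/json" -Body $body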

Step 3 – Create the call to import repository

To create the call that starts the repository import routine you need some parameters: first of all you need the id of the endpoint you created in step 2. If you created the endpoint through REST APIs this is not a problem, because the id is present in the response.


Figure 7: Response of the request shown in Figure 6 contains endpoint Id

If you created the endpoint through the Web UI, the id can be grabbed from the url in the administration page of the endpoints, but a simpler and better method is to list all the endpoints of the Team Project through REST APIs. In my situation it is a simple GET call to this url: https://gianmariaricci.VisualStudio.com/GitMiscellaneous/_apis/distributedtask/serviceendpoints?api-version=3.0-preview.1

The answer is the very same as in Figure 7, and it gives me the id of the endpoint that points to the Source Repository: df12f2e3-7c40-4885-8dbd-310f1781369a

Now I need to create the import request, as described here (https://www.visualstudio.com/en-us/docs/integrate/api/git/import-requests#create-a-request-to-import-a-repository), and the only information I’m missing is the id of the Target Repository.


Figure 8: The repository part of the url in the call should be replaced by the repository ID

As shown in Figure 8, the only annoying part of the request is the id of the Target Repository, because it is the GUID of the repository, not the name. Obtaining this value is not difficult, because with REST APIs it is a simple GET call to this url: https://gianmariaricci.VisualStudio.com/DefaultCollection/GitMiscellaneous/_apis/git/repositories?api-version=1.0. From the answer of this call the id of the ImportTest repository is: 3037268a-0c91-4fe1-8435-a76e9b731f5e
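In PowerShell (reusing the $headers variable from the previous sketch), grabbing the id can look like this:

# List all repositories of the Team Project and pick the id of ImportTest
$repos = Invoke-RestMethod -Uri "https://gianmariaricci.visualstudio.com/DefaultCollection/GitMiscellaneous/_apis/git/repositories?api-version=1.0" -Headers $headers
$repoId = ($repos.value | Where-Object { $_.name -eq "ImportTest" }).id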

Now I have everything needed to create the import request; just forge the request in Postman or a similar tool and fire it.
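As a sketch, the same request in PowerShell looks roughly like this; the body shape and api-version follow the import-requests documentation linked above, but double check them against the docs (the source url is illustrative, $headers is reused from the earlier sketch):

# Import request body: the url of the Source Repository and the id of the
# endpoint created in Step 2
$importBody = @{
	parameters = @{
		gitSource = @{ url = "https://gianmariaricci.visualstudio.com/DefaultCollection/_git/SourceRepo" }  # illustrative source url
		serviceEndpointId = "df12f2e3-7c40-4885-8dbd-310f1781369a"
	}
} | ConvertTo-Json -Depth 4

# The GUID in the url is the id of the Target Repository found above
Invoke-RestMethod -Method Post `
	-Uri "https://gianmariaricci.visualstudio.com/DefaultCollection/GitMiscellaneous/_apis/git/repositories/3037268a-0c91-4fe1-8435-a76e9b731f5e/importRequests?api-version=3.0-preview" `
	-Headers $headers -ContentType "application/json" -Body $importBody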


Figure 9: The import request, where 1 is the id of the Target Repository and 2 is the id of the endpoint.

If you are quick enough and refresh the page of the Target Repository while the import routine is running, you should see the page shown in Figure 10.


Figure 10: The import is running

After a little while (depending on the size of the Source Repository) the Target Repository will be a perfect clone of the Source Repository.

If there are errors during the import process, you are warned with the error in the source code page of the Target Repository, as shown in Figure 11.


Figure 11: Errors in the import routine are shown in the source code page of the Target Repository

The error in the above image is due to a misconfiguration of the endpoint (done in step 2), for example if you created the endpoint with wrong credentials.

Gian Maria

Use different Excel TFS / VSTS Addin at the same time

If you are a consultant, it is quite common to work with various versions of TFS at the same time. I have my personal account on VSTS, always updated to the latest version, but I also have customers that still use TFS 2012 or TFS 2010.

Microsoft tests newer versions of TFS against lots of applications, to be sure that newer versions of TFS do not break existing tools. This means that usually you can upgrade your TFS without worrying that your VS 2010 or Visual Basic 6 stops working. You need to be aware that the opposite is not true: newer versions of Visual Studio might not work well with older versions of TFS. This decision was made because Microsoft encourages people to keep their TFS installation up to date, and it would be a nightmare to always guarantee that newer tools can communicate with older service APIs.

To minimize compatibility problems, you should keep your TFS on-premise updated to the latest version.

Tools such as Visual Studio are usually not a problem: you can keep as many VS versions as you want side by side, so if you still use TFS 2012 you can still use VS 2012 without any problem. But you can have problems with other tools.

The Office TFS addin is installed automatically with Visual Studio Team Explorer or with any version of Visual Studio. This means that whenever you update your VS or install a new VS version, the Office addin is also updated.

Starting from Visual Studio 2015 there is no Team Explorer anymore; if you want to install only the Office addin you can use the standalone installer, following the links in this post from Brian Harry.

In the Italian Visual Studio forum there is a question from a user who experienced problems exporting Work Item query results to Excel after upgrading to Visual Studio 2015 Update 3. He is able to connect Excel to VSTS, but the addin no longer works with his on-premise TFS 2012. This proves that the addin works correctly with the latest TFS version, but no longer supports older TFS versions.

The solution to this problem is simple, because in Excel you can choose the addin version you want to use. You just need to go to Excel Options, choose Add-ins (1), then manage COM Add-ins (2), and finally press the Go button.

Figure 1: Managing Excel add-ins from the Options pane.

If you scroll down the addin list, you should see several versions of the TFS addin, one for each version of Visual Studio you have installed. On my machine I have VS 2012, VS 2013 and VS 2015, so I have three distinct versions of the addin.

Figure 2: Multiple TFS addins installed if you have multiple versions of Team Explorer.

You can understand the version of each addin simply by looking at its location, but the cool part is that you can enable more than one addin at the very same time. As a result you have multiple Team ribbon tabs in your Excel, as shown in Figure 3.

Figure 3: Multiple TFS addins enabled at the very same time

I have to admit that this is not really a nice situation, because it is confusing: there is no clear clue as to which version of the addin each tab refers to. But at least you can use both of them at the very same time. If you prefer, you can simply enable only an old version (say the 2012 version) to make sure that it works with your main TFS instance. Usually if you enable an older version it should be capable of working with newer instances of TFS.

I’ve not tested this technique thoroughly, but it should work without problems.

Gian Maria.

Release to Azure with Azure ARM templates

Thanks to the new Release Management system in VSTS / TFS, creating a release to your on-premise environment is really simple (I’ve described the process here). Another option is creating a test environment in Windows Azure, and if you choose this option life can be even easier.

In this example I’m using Azure as IaaS, deploying software on a Windows Virtual Machine. While this is probably not the best approach to the cloud (PaaS is surely a better approach), for creating a test environment it can be perfectly acceptable.

I’m not going to give you an introduction to or explanation of Azure Resource Manager, because there are tons of resources on the web and Azure moves so quickly that any information I give you will probably be old by the time I press “publish” :). The purpose of this post is giving you a general idea of how to use Azure ARM to create a release definition that automatically generates the resources in Azure and deploys your software on them.

My goal is using Azure ARM for DevOps and automatic / continuous deployment, and the first step is creating a template file that describes exactly all the Azure resources needed to host my application. Instead of starting to write such a template file from scratch, I start checking on GitHub, because there are tons of template files ready to use.

As an example I took one of the simplest, called 101-vm-simple-windows. It creates a simple Windows Virtual Machine and nothing else. That template has various parameters that allow you to specify the VM name and other properties, and it can be used directly by a Release Management definition. I made simple modifications to the template file, and in this situation it is better to first check that everything works as expected by triggering the deploy process directly from the command line.
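If you want to start from the same template, it lives in the Azure quickstart templates repository on GitHub; a quick way to get a local copy (the folder name is assumed to match the template name):

git clone https://github.com/Azure/azure-quickstart-templates.git
cd azure-quickstart-templates/101-vm-simple-windows

From the template folder, the deployment can then be triggered like this: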

# The resource group must already exist; create it first with New-AzureRmResourceGroup if needed
New-AzureRmResourceGroupDeployment `
	-Name JarvisRm `
	-ResourceGroupName JarvisRm `
	-TemplateFile "azuredeploy.json" `
	-adminUsername alkampfer `
	-adminPassword ********** `
	-vmName JarvisCmTest `
	-storageAccount jarvisrmstorage `
	-dnsLabelPrefix jarvisrm

As you can see, I need to choose the name of the resource group (JarvisRm), specify the template file (azuredeploy.json), and finally all the parameters of the template as if they were parameters of the PowerShell cmdlet. Once the script finishes, verify that the resource group was created correctly and that all the resources are suitable for deploying your software.


Figure 1: Your Resource group was correctly created.

Once everything was verified to be correct, I deleted the JarvisRm resource group, ready to use the template in a release definition.

Always test your ARM template directly from the command line, to verify that everything is all right. When the resources are created, try to use them manually as the target of a deploy, and only once everything is OK start automating with Release Management.
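Note that you can also validate the template without actually creating anything, using the corresponding Test cmdlet of the AzureRM module. A minimal sketch with the same parameters as above:

# Validates the template and parameter values without deploying anything;
# you will be prompted for any mandatory template parameter omitted here (e.g. adminPassword)
Test-AzureRmResourceGroupDeployment `
	-ResourceGroupName JarvisRm `
	-TemplateFile "azuredeploy.json" `
	-adminUsername alkampfer `
	-vmName JarvisCmTest `
	-storageAccount jarvisrmstorage `
	-dnsLabelPrefix jarvisrm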

When you have a good template file, the best place to store it is in your source control; this allows you to version the file along with the version of the code that is supposed to use it. If you do not need versioning you can simply store it in a network share, but to avoid problems it is better to have the Release Management agent run the template from a local disk and not from a network share.


Figure 2: Copy template file from a network share to a local folder of the agent.

The first step of the release process is copying the template files from a network share to the $(System.DefaultWorkingDirectory)\ARM folder, so PowerShell can run against scripts placed on a local disk. The second task is an Azure Resource Group Deployment task, which uses the template to deploy all resources to Azure.


Figure 3: The Azure Deployment task is used to create a Resource Group from a Template definition.

You should specify only the template file (1) and all the parameters of the template (2), such as username, password, dns name of the VM, etc. As a nice option you can choose Enable Deployment Prerequisites (3) to have your VM ready to be used as a target for deploy actions. You can read more about prerequisites on the MSDN blog; basically, when you select this option the script will configure PowerShell and other options on the target machine to enable remote script execution.

Virtual machines need to be configured to be used as the target of deploy tasks, such as remote PowerShell execution, but the Azure Deployment task can take care of everything for you.

This task requires that you have already connected the target Azure subscription with your VSTS account. If you have never connected your TFS / VSTS account to your Azure subscription with ARM, you can follow the instructions at this link, which contains a PowerShell script that does EVERYTHING for you. Just run the script, and note down in a safe place all the data you need to insert into your TFS / VSTS instance to connect to Azure with ARM.

Another aspect you need to take care of is the version of the Azure PowerShell tools installed on the machine where the Release Agent is running. Release Management scripts are tested against specific versions of the Azure PowerShell tools, and since the Azure team is constantly upgrading the tools, it could happen that TFS / VSTS Release Management tasks are not compatible with the latest version of the Azure tools.

All of these tasks are open source, and you can find information directly on GitHub. As an example, at this link there is information about the DeployAzureResourceGroup task. If you go to the bottom you can verify the Azure PowerShell tools version suggested to run that task.


Figure 4: Supported version of the AzureRM module

Clearly you should install a compatible version on the machine where the agent is installed. If you are unsure whether the agent has a suitable version of the Azure PowerShell tools, you can go to the TFS admin page and verify the capabilities of the agent directly from VSTS / TFS.


Figure 5: Agent capabilities contain the version of the Azure PowerShell tools installed
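Alternatively, directly on the agent machine, a quick check from a PowerShell prompt shows which versions of the AzureRM module are installed:

# Lists the installed versions of the AzureRM module
Get-Module -ListAvailable -Name AzureRM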

A demand for Azure PS is present in the release definition, but it does not specify the version, so it is not guaranteed that your release will be successful.


Figure 6: The release process has a demand for Azure PowerShell tools, but not for a specific version.

As a result, I had problems setting up the release process because my agents had Azure PowerShell tools version 1.4 installed, which is not fully compatible with the Release Management tasks. Downgrading the tools solved the problem.

If your release fails with strange errors (such as NullReferenceException), you need to check on GitHub the version of the PowerShell tools needed to run that task and install the right version on the agent (or at least try changing the version until you find the most recent one that works).

The Azure Resource Group Deployment task takes care of everything; I’ve modified the base template to apply a specific Network Security Group to the VM, but the general concept is that it configures every Azure resource you need to use. At the end of the deployment you have everything you need to deploy your software (virtual machines, sites, databases, etc.).

In my example I need only a VM, and once it is configured I can simply use the Copy to Azure VM task and the Execute PowerShell on Azure VM task to release my software, as I did for my on-premise environment.


Figure 7: Configuration of the task used to copy files to the Azure VM

You can specify the files you want to copy (1), the login to the machine (2) and, thanks to the Enable Copy Prerequisites (3) option, you can let the task take care of every step needed to allow copying files to the VM. This option is not needed if you already chose it in the Azure Deployment task, but it can be really useful if you have a pre-existing virtual machine you want to use.

The final step is executing the release script on the target machine, and it has the same options you specify to run a script on an on-premise machine.


Figure 8: Run the installation PowerShell script on the target Azure VM

Once everything is in place, you only need to create a release and wait for it to finish.


Figure 9: Output of a release definition with Azure ARM

In this example, since I’m using a virtual machine, the deploy script is the same one I used for the on-premise release; with a PaaS approach you usually have a different script that targets Azure-specific resources (websites, DocumentDb, etc.).

If the release succeeded, you can log in to portal.azure.com to verify that your new resource group was correctly created (Figure 1), and check that the resource group contains all the expected resources (Figure 10).


Figure 10: Resources created inside the group.

To verify that everything is OK you should check the exact version of the software actually deployed on the environment. From Figure 11 I can see that the release deployed version 1.5.2.


Figure 11: List of the most recent releases.

Now I can log in to the VM and use the software, to verify that it is correctly installed and that the installed version is correct.


Figure 12: The software is correctly installed and the version corresponds to the version of the release.

Azure Resource Manager is a powerful feature that can dramatically simplify releasing your software to Azure, because you can just download scripts from GitHub to automatically create all the Azure resources needed by your application, and let the VSTS Release Management tasks take care of everything.

Gian Maria.