Create a safe clone of your TFS environment

Run ChangeServerId to avoid confusing client tools

Around the web there are a lot of resources about how to create a clone of your TFS environment for testing purposes. The most important step has always been running the TfsConfig ChangeServerId command, as described in the "Move Team Foundation Server from one hardware configuration to another" guide.

With the new wave of guidance for TFS 2015, an interesting new article came out on how to do a Dry Run in a pre-production environment. That article contains a couple of tricks worth mentioning, because they are really interesting and easy to apply.

Risk of corrupting production environment

TFS uses a lot of extra tools and products to fulfill its functions: it is based on SQL Server databases, but it also communicates with Reporting Services, SharePoint, SCVMM for Lab Management, test controllers and so on. When you restore a backup of your production environment to a cloned (pre-production) environment, you need to be sure that the cloned installation does not corrupt your production environment.

As an example, if the cloned server still uses the same Reporting Services instance as the production server, you will probably end up with a corrupted Reporting Services database.

Protect your environment

In the above article, a couple of simple techniques are described to prevent your cloned pre-production TFS from corrupting something in the production environment.

Edit your hosts file to make all production servers unreachable from the cloned server.

This is the simplest but most effective trick: if you modify the hosts file on the cloned machine, giving a nonexistent IP address for all the names of the machines related to the TFS environment, you can be reasonably sure that the cloned environment cannot corrupt other services.

If for some reason you forgot to change the Lab Management SCVMM address or the SharePoint address, the cloned machine will not be able to reach them, because the names resolve to an invalid address.
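As a rough sketch, the hosts file (C:\Windows\System32\drivers\etc\hosts) on the cloned machine could contain entries like the following; the machine names are purely hypothetical and must be replaced with the actual names of your production servers:

# Redirect production server names to an unreachable address (hypothetical names)
10.255.255.1    tfs.mydomain.local
10.255.255.1    sharepoint.mydomain.local
10.255.255.1    reporting.mydomain.local
10.255.255.1    scvmm.mydomain.local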

Use a different user to run the TFS service in the cloned environment, and be sure that this user has no special permissions

Usually TFS services run with an account called TFSService, and this account has lots of privileges on all machines related to the TFS environment. As an example, it has the right to manage SCVMM in a Lab Management scenario. If you create a user called TFSClonedService or TFSServiceCloned, with no special permissions, and use that user to run the cloned TFS environment, you can be reasonably sure that if the cloned environment tries to contact some external service (e.g. SCVMM, Reporting Services, etc.) you will get an Unauthorized exception.

Remember that running a cloned TFS instance is an operation that should be done with great care, and you should adopt every technique that helps limit accidental damage to the production environment.

Gian Maria.

TFS Integration Platform, copy from Agile 2010 to CMMI 2013

Today I needed to move a bunch of Work Items from TFS 2010 to TFS 2013, but I also needed to move from a Team Project based on the Agile template to a project based on the CMMI template.

The number of Work Items is small, but lots of them have attachments, so I decided to use the Integration Platform to migrate history and attachments. It turns out that we accomplished an acceptable result in little time. An alternative, if you do not care about attachments and history, is using Excel.

First of all you need to be aware of the EnableBypassRuleDataSubmission option, which allows the Integration Platform to bypass rule validation of Work Items. This option is especially useful if you migrate to a different Process Template, because you cannot be sure that Work Items are valid when they transition from one process to another. This feature is also useful to preserve the author of each Work Item change in the destination project: if you do not enable it, all changes will be recorded as done by the user who is performing the migration.
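As a rough sketch, the option is enabled through a custom setting of the work item tracking session in the configuration file; the exact placement can differ depending on the configuration template you start from:

<CustomSettings>
    <CustomSetting SettingKey="EnableBypassRuleDataSubmission" SettingValue="True" />
</CustomSettings>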

Moving to a different Process Template is mainly a matter of creating mappings between fields of the source template (Agile) and fields of the destination template (CMMI). A nice aspect is that you only need to map fields that are actually used in the source Team Project. As a suggestion, you should first copy everything to a test Team Project in a test Project Collection, and repeat the migration several times until the result is good.

You can start with a wildcard mapping, then start the migration; during the Analysis phase the Integration Platform will generate Conflicts that show you what is wrong. You can solve the problems and update the mapping until everything runs smoothly. In Figure 1 you can see the most common error: a field that is present in the source Process Template is not present in the destination Process Template. In that picture you can verify that Microsoft.VSTS.Common.AcceptanceCriteria is missing in the CMMI project.


Figure 1: Conflicts occur because not all used fields are mapped in the configuration.
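As mentioned above, you can start from a simple wildcard mapping; a rough sketch of how it might look in the configuration follows (the field map name and the surrounding elements depend on the configuration template you start from):

<FieldMap name="UserStoryToRequirementFieldMap">
    <MappedFields>
        <!-- Wildcard mapping: copy every field to the field with the same reference name -->
        <MappedField LeftName="*" RightName="*" MapFromSide="Left" />
    </MappedFields>
</FieldMap>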

Be sure to refer to the Work Item Field Reference to verify which fields are available in the destination template. One of the nicest features of the Integration Platform is that you can simply specify the target field during migration, and this automatically updates the mapping. Since CMMI does not have an acceptance criteria field, you can add it (editing the process template of the destination Team Project), you can use another field, e.g. Analysis, or you can do some more complex mapping (I will show an example later in the post).

You can also choose to update the mapping by ignoring the field: in this case the content of that field will not be migrated. Finally, you can manually update the XML mapping configuration if the resolution requires a complex modification of the mapping.

Complex resolution conflicts

Figure 2: Update configuration if you need to do some complex resolution

At the end of the migration you should double check what happened, because bypassing Work Item rules usually leads to Work Items in an inconsistent state. As an example, in the Agile process a Task can have the New status, while this is invalid in CMMI.


Figure 3: Some of the migrated Work Items can have an invalid state because we decided to bypass validation rules.

It is common to have the same field admit different values in different Process Templates. The Task Work Item type has an Activity field in Agile that can be mapped to Discipline in CMMI, but the allowed values are different. In such a situation you can map the Activity field to be copied to Discipline, using a lookup map to convert the values.

<MappedField
    LeftName="Microsoft.VSTS.Common.Activity"
    RightName="Microsoft.VSTS.Common.Discipline"
    MapFromSide="Left"
    valueMap="ActivityMap" />

<ValueMap name="ActivityMap">
    <Value LeftValue="Deployment" RightValue="Development">
        <When />
    </Value>
    <Value LeftValue="Design" RightValue="User Experience">
        <When />
    </Value>
    <Value LeftValue="Development" RightValue="Development">
        <When />
    </Value>
    <Value LeftValue="Documentation" RightValue="Analysis">
        <When />
    </Value>
    <Value LeftValue="Requirements" RightValue="Analysis">
        <When />
    </Value>
    <Value LeftValue="Testing" RightValue="Test">
        <When />
    </Value>
</ValueMap>

The last useful technique is the ability to compose the destination value from multiple source fields. As an example, CMMI does not have the AcceptanceCriteria field that is present in Agile. From my point of view, Acceptance Criteria in Agile can be considered part of the Description field in CMMI. Thanks to FieldsAggregationGroup I was able to copy both the Description and AcceptanceCriteria fields of the User Story (Agile) to the Description field of the Requirement (CMMI).

<FieldsAggregationGroup MapFromSide="Left" TargetFieldName="System.Description" Format="Description:{0} AcceptanceCriteria:{1}">
	<SourceField Index="0" SourceFieldName="System.Description" />
	<SourceField Index="1" SourceFieldName="Microsoft.VSTS.Common.AcceptanceCriteria" />
</FieldsAggregationGroup>

Thanks to this configuration, I can map multiple source fields to a single field. Figure 4 depicts a User Story that has both Description and Acceptance Criteria populated.

Acceptance criteria and Description

Figure 4: A User Story Work Item that has both Details and Acceptance Criteria

Thanks to FieldsAggregationGroup I'm able to compose the content of these two fields into a single field of the migrated Work Item. Here is the corresponding Work Item in the destination Team Project after the migration.

Result of composing two source fields in a single destination field

Figure 5: Result of composing two source fields in a single destination field

Another interesting feature is the ability to specify a default value for required fields that exist only in the destination Team Project, thanks to the @@MissingField@@ placeholder.
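As a rough idea, the placeholder is used as the left-hand field name together with a value map that supplies the default; the field name and default value below are purely illustrative, so check the Integration Platform documentation for the exact pattern expected by your version:

<MappedField
    LeftName="@@MissingField@@"
    RightName="Microsoft.VSTS.CMMI.RequiresReview"
    MapFromSide="Left"
    valueMap="DefaultRequiresReview" />

<ValueMap name="DefaultRequiresReview">
    <Value LeftValue="" RightValue="No">
        <When />
    </Value>
</ValueMap>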

If you need resources about the Integration Platform, I suggest looking at this article, which contains a huge number of links that cover almost every need: http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/06/06/toc-tfs-integration-tools-blog-posts-and-reference-sites.aspx

Gian Maria.

Work Item query by category

This is a really old feature of TFS, but it turns out that some people have missed it. When you create a query, you can add a condition on the Work Item Type.

This image shows the combo box rendered by the UI when you are using the equals operator

Figure 1: Add condition to Work Item Type

As you can see, you can require the Work Item Type to be equal to a specific value, and the UI renders a nice combo box with all permitted values to help the user choose the right one.

You can also use the in operator to specify a comma-separated list of allowed types.

The in operator in a Work Item Query allows you to specify a comma-separated list of values

Figure 2: The in operator in Work Item Query

Finally, TFS has a nice concept called Work Item Category to group together all Work Item Types that share some common behavior. As an example, all types that represent the concept of a requirement are shown on the Backlog Board, while Work Items that represent a Task are shown on the Task Board. If you use the In Group operator to specify a condition on Work Item Type, you can choose from Work Item Categories.

If you choose the In Group operator you can choose between Work Item Categories instead of types

Figure 3: A query with the "in group" operator allows you to choose between Work Item Categories

There are many use cases for this functionality: Microsoft Test Manager uses the Requirement category to create a generic query that lists "requirements" and is valid for every template. You can use this feature if you need to create a query that spans multiple projects with different process templates.


Figure 4: Query for requirements on multiple Team Project

Figure 4 shows a simple query that lists all requirements assigned to me across every Team Project. As you can see from the result, I got the Work Item Type "Requirement" from a CMMI project and "Product Backlog Item" from a Scrum project.
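As a rough sketch, the WIQL behind a query like this might look like the following; Microsoft.RequirementCategory is the standard reference name of the Requirement category, but verify it on your collection:

SELECT [System.Id], [System.WorkItemType], [System.Title], [System.State]
FROM WorkItems
WHERE [System.WorkItemType] IN GROUP 'Microsoft.RequirementCategory'
    AND [System.AssignedTo] = @Me
ORDER BY [System.Id]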

Gian Maria.

Managing tags with Tag Admin for VS 2013

Tag management in Team Foundation Server is a really good way to add custom information to Work Items without the need to customize the process template. The downside of this approach is that every person is able to add any tag to Work Items, with the risk of misspelling and duplication.

As an example, if the team is doing T-shirt sizing for User Stories, people may use the tag S to identify a small story, then decide to change it to SIZE_S to better indicate the purpose of the tag. Now you have some User Stories with S and others with SIZE_S. Misspelling is another typical problem: even if TFS suggests existing tags with a drop-down while editing, there is always the risk that someone writes a slightly different tag.

An optimal solution to cope with these problems is installing a Visual Studio extension that allows you to manage tags.


Figure 1: Tag Admin For Visual Studio 2013 in Visual Studio Gallery

This extension adds a nice link in your Team Explorer to manage your tags. If you open it, you are immediately presented with a complete list of all of your tags, with a counter that identifies how many Work Items are associated with each tag.


Figure 2: List of tags of team project.

If some tags are not associated with any Work Item and you wonder why they are listed there (you see 0 as the Work Item count), the reason is that TFS has not yet cleared the tag from the tag cache. After a tag has not been used by any Work Item for some days, TFS decides that the tag should no longer be available for suggestions.

In this example I have a misspelling problem between delighter and deligter, so I can click the misspelled tag, and nice action buttons appear in the UI allowing for some actions.


Figure 3: Available actions for tag

You can view a list of Work Items that contain that tag, you can delete the tag (effectively removing it from any Work Item), and you can also rename the tag. The current version of the tool does not allow renaming a tag to a name that already exists, and this prevents us from using the tool to "merge" a misspelled tag into a single tag, but it is still really useful because it allows an administrator to immediately spot misspelled tags, which can then be fixed manually.

In practice you can simply click "View Linked Workitem" and then apply the fix from the standard web interface, changing the tags accordingly.

Gian Maria.

Manage Artifacts with TFS Build vNext

Artifacts and Build vNext

Another big improvement of Build vNext in TFS and VSO is the ability to explicitly manage the content of artifacts during a build. With the term Artifacts in Continuous Integration we refer to every result of the build that is worth publishing together with the build result, to be further consumed by consumers of the build. Generally speaking, think of artifacts as the binary outputs of the build.

The XAML build system does not give you much flexibility: it just uses a folder on the build agent to store everything, then uploads everything to the server or copies it to a network share.

To handle artifacts, the vNext build system introduces a dedicated task called Publish Build Artifacts.

Publish Build Artifacts options

Figure 1: Publish artifacts task

The first nice aspect is that we can add as many Publish Build Artifacts tasks as we want. Each task requires you to specify the contents to include, with a default value (for a Visual Studio Build) of **\bin to include everything contained in directories called bin. This is an acceptable default to include the binary output of all projects, and you can change it to include whatever you want. Another important option is the Artifact Name, used to distinguish this artifact from the other ones. Remember that you can include multiple Publish Build Artifacts tasks, and the Artifact Name is a simple way to categorize what you want to publish. Finally, you need to specify whether the Artifact Type is Server (the content will be uploaded to TFS) or File Share (you specify a standard UNC share path where the build will copy the artifacts).

Artifacts browser

With a standard configuration like the one represented in Figure 1, after a build is completed we can go to the Artifacts tab, where you should see an entry for each Publish Build Artifacts task included in the build.

List of artifacts included in build output.

Figure 2: Build details lists all artifacts produced by the build

You can easily download the whole content of the folder as a single zip, but you can also press the Explore button to browse the content of the artifacts container directly from the web browser. You can easily use the Artifacts Explorer to locate the content you are interested in and download it with a single click.

With the artifacts browser you can explore the content of an artifact directly from the browser and download single files.

Figure 3: Browsing content of an artifact

Using multiple artifacts tasks

In this specific example, using the **\bin approach is probably not the best option. As you can see from the previous image, we are including binaries from test projects, wasting space on the server and making it harder for the consumer to find what he or she needs.

In this specific situation we are interested in publishing two distinct sets of artifacts: a host program and a client dll used to talk to the host. In this scenario the best approach is using two distinct Publish Build Artifacts tasks, one for the client and the other for the host. If I reconfigure the build using two tasks and configure the Contents parameter to include only the folder of the project I need, the result is much better.

Multiple artifacts included in build output

Figure 4: Multiple artifacts for a single build output

As you can see from the previous image, using multiple tasks for publishing artifacts produces a better organization of the artifacts. In such a situation it is simple to immediately locate what you need and download only the client or the host program. The only drawback is that we still miss a "download all" link to download all the artifacts at once.

Prepare everything with a PowerShell script

If a project starts to become really complex, organizing artifacts can become a complex task. In our situation the approach of including the whole bin folder of a project is not really good; what I need is some folder manipulation before publishing the artifacts:

  • We want to remove all .xml files
  • We want to change some settings in the host configuration file
  • We need to copy content from other folders of source control

In such a scenario, the Publish Build Artifacts task alone does not fulfill our requirements, and the obvious solution is adding a PowerShell script to your source code to prepare what we are calling a "release" of artifacts. A really nice thing about PowerShell is that you can create a ps1 file with the logic that does what you need and declare named parameters:

Param
(
    [String] $Configuration,
    [String] $DestinationDir = "",
    [Bool] $DeleteOriginalAfterZip = $true
)

My script accepts three parameters: the configuration I want to release (Debug or Release), the destination directory where the script will copy all the files, and finally whether you want the script to delete all uncompressed files in the destination directory.

The third option is needed because I'd like to use 7zip to compress the files in the output directory directly from my script. The two main reasons to do this are listed below (a sketch of the script body follows the list).

  • 7zip is a better compressor than a simple zip
  • It is simpler to create pre-zipped artifacts
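As a rough sketch, the body of such a script might look like the following; paths, project names and the location of 7z.exe are assumptions to adapt to your own solution:

# Hypothetical sketch of the script body: copy the build output of the chosen
# configuration, remove .xml files, then pre-compress everything with 7zip.
$sourceDir = "$PSScriptRoot\src\MyProject.Host\bin\$Configuration"   # hypothetical project path

if ($DestinationDir -eq "")
{
    $DestinationDir = "$PSScriptRoot\release"
}
New-Item -ItemType Directory -Force -Path $DestinationDir | Out-Null

# Copy binaries, then drop the .xml documentation files
Copy-Item -Path "$sourceDir\*" -Destination $DestinationDir -Recurse -Force
Get-ChildItem -Path $DestinationDir -Filter *.xml -Recurse | Remove-Item -Force

# Create a single pre-compressed package (7z.exe assumed to be on the PATH)
& 7z a -t7z "$DestinationDir\MyProject.Host.7z" "$DestinationDir\*" "-x!*.7z"

# Optionally remove the uncompressed files, keeping only the 7z package
if ($DeleteOriginalAfterZip)
{
    Get-ChildItem -Path $DestinationDir -Exclude *.7z | Remove-Item -Recurse -Force
}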

Using a PowerShell script also has the great advantage that it can be launched manually to verify that everything goes as expected, or to create artifacts with the exact same layout as a standard build, an aspect that should not be underestimated. Once the script is tested on a local machine (an easy task) I have two files in my output directory.

Content of the folder generated by PowerShell script

Figure 5: Content of the output folder after PowerShell script ran

One of the biggest advantages of using PowerShell scripts is the ability to launch them locally to verify that everything works as expected, instead of the standard "modify", "launch the build", "verify" cycle needed if you use only build tasks.

Now I customize the build to use this script to prepare my release, instead of relying on some obscure and hard-to-maintain pattern strings in the Publish Build Artifacts task.

Include a Powershell Task in the build to prepare artifacts folder

Figure 6: PowerShell task can launch my script and prepare artifacts directory

Thanks to the parameters I can easily specify the current configuration I'm building (Release, Debug) and the DestinationDir (I'm using the $(build.stagingDirectory) variable, which contains the staging directory for the build). You can use whatever destination directory you want, but using the standard folder is probably the best option.
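For example, the Arguments box of the PowerShell task might contain something like this (assuming the build defines the standard BuildConfiguration variable; adjust the names to your build):

-Configuration $(BuildConfiguration) -DestinationDir "$(build.stagingDirectory)"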

After this script you can place a standard Publish Build Artifacts task, specifying $(build.stagingDirectory) as the Copy Root folder and filtering the content if you need to. Here is the actual configuration.

The Publish Build Artifacts task can be used to publish the PowerShell output

Figure 7: Include single Publish Build Artifacts to publish from directory prepared by PowerShell script

The only drawback of this approach is that we are forced to give an Artifact Name that will be used as the container of the files; you cannot directly publish pre-zipped files in the root of the build artifacts. If you want, you can include multiple Publish Build Artifacts tasks to publish each zipped file with a different Artifact Name.

Build artifacts contain a single artifact with all zipped files

Figure 8: Output of the build

Even if this can be a limitation, sometimes it is actually the best option. As you can see from the previous image, I have a single primary artifact, and you can press the Download button to download everything with a single click. Using the Artifacts Explorer you can still download separate packages, and this is probably the best approach.

Artifacts browser permits you to download single zip files

Figure 9: Artifact browser shows distinct zip file in the output

If you use a script to create one separate pre-compressed package for each distinct artifact, your publishing experience will probably be better than with any other approach.

Conclusions

Build vNext gives us great flexibility on what to publish as artifacts. Even though we can manage everything with the dedicated task, if you want a good organization of your artifacts, using a PowerShell script to organize everything and pre-compress it into single files is usually the best approach.

Gian Maria.