Package manager in VSTS

One of the cool features of Visual Studio Team Services is extensibility: you can find lots of add-ins in the official Marketplace. One of the coolest is an official add-in by Microsoft that allows you to host private NuGet packages inside your VSTS account. You can find the add-in here: https://marketplace.visualstudio.com/items?itemName=ms.feed. It is free and can be installed with a couple of simple clicks.


Figure 1: Package management adds a new PACKAGE menu to your VSTS account

As you can see, Package Management is still considered to be in preview (note the asterisk after the menu entry and the toolbar that links to the documentation), but you can already use it because all the basic functionality is present.

One of the nice aspects of Package Management is security: you can publish private packages and decide who can access them. To start, press the “New Feed” button in the feed page to create a new feed.


Figure 2: Create a new feed.

After the feed is created, you can simply right-click it and choose Edit to manage security with fine granularity. As you can see in Figure 3, you can specify who owns the feed, who can publish packages to it and, finally, who can read packages from it. With this level of granularity you can easily protect your packages from unwanted use.


Figure 3: Package management security page.

Once the feed is created, you can press the “Connect to feed” link to gather all the information needed to consume and publish packages. You can find instructions for Visual Studio 2015, Visual Studio 2013 and other tools / NuGet versions.


Figure 4: Instruction on how to connect to the feed for Visual Studio 2015.
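
If you prefer the command line, the same information can be used to register the feed with nuget.exe. The following is only a minimal sketch: the feed name and URL are placeholders, the real address is the one shown in the Connect to feed dialog, and depending on the NuGet version you may be prompted for credentials or need a Personal Access Token.

# Register the private feed as a package source (feed name and URL are placeholders)
nuget.exe sources add -Name "MyPrivateFeed" -Source "https://myaccount.pkgs.visualstudio.com/_packaging/MyFeed/nuget/v2"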

Once the feed is created, the easiest way to populate it is with a TFS Build; the whole process is explained in the post Publishing a Nuget package to Nuget/Myget with VSO Build vNext.

The main difference is that a private feed can use standard VSTS authentication: you just configure the feed as an Internal NuGet Feed and put its address in the NuGet Publisher task configuration. You can see from Figure 3 that Project Collection Build Service is included in the Contributor list; this is what allows the build service to publish packages to that feed during a build.


Figure 5: The NuGet Publisher task can publish to an internal NuGet feed without the need for separate authentication.
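
To give an idea of what happens under the hood, the task ends up doing something similar to the sketch below; package name and feed URL are placeholders, and the API key is just a dummy string because the real authentication is performed with the build service identity.

# Roughly what the NuGet Publisher task runs for an internal feed (all names are placeholders)
nuget.exe push "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\MyCompany.MyPackage.1.2.0.nupkg" -Source "https://myaccount.pkgs.visualstudio.com/_packaging/MyFeed/nuget/v2" -ApiKey VSTS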

Once the build is finished, you can simply check if the package was correctly published to the feed.


Figure 6: Check your published package in feed management.

You can now consume the package from whatever client you like: Visual Studio, Command Line, etc.
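
As an example, from the command line you can install a package directly from the private feed; the package name below is purely hypothetical and the sketch assumes the feed was already registered as a package source as shown earlier.

# Install a package from the private feed into a local packages folder (names are placeholders)
nuget.exe install MyCompany.MyPackage -Source "MyPrivateFeed" -OutputDirectory packages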

Gian Maria.

View results of deleted builds in VSTS

One of the nice new features of the new build system (vNext) introduced in VSTS is the ability to view the result summary of deleted builds.


Figure 1: View Deleted builds from VSTS

Clearly not all data is maintained: you cannot retrieve artifacts or logs, so you cannot troubleshoot a failed build, but at least you are able to view the build outcome, who triggered it and some global data such as the test result summary.

One of the most important pieces of information you can retrieve is the Source Version. Suppose you released the output of a build to some test server, or even worse to a production server, and then the build was accidentally deleted. If you did a good job of writing the commit id or changeset id in AssemblyInformationalVersion you can safely recreate the build, but if you relied only on the build number to retrieve data and the build is gone, you cannot be sure which version of the code produced those artifacts.
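
As a side note on the AssemblyInformationalVersion trick just mentioned: if the informational version is embedded in the binaries, you can read it back from a deployed assembly with a couple of lines of PowerShell. The path and the version value below are purely hypothetical.

# AssemblyInformationalVersion is exposed as ProductVersion of the deployed assembly
$info = [System.Diagnostics.FileVersionInfo]::GetVersionInfo("C:\Deploy\MyApp\MyApp.exe")
$info.ProductVersion   # e.g. 2.5.0+sha.9f2c1ab if the commit id was embedded at build time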


Figure 2: Thanks to build detail we can retrieve Source Version even for deleted builds.

As you can see in Figure 2, you can easily retrieve source version from deleted build.

Gian Maria.

Different approaches for publishing Artifacts in build vNext

I wrote an old post that explains how you can manage your artifacts with Build vNext; in that post I suggested using a custom PowerShell script that identifies all the files that need to be published as artifacts and moves everything into the staging directory.

I believe this is the perfect approach for complex applications, where some logic must be applied before publishing an artifact. It also makes it super easy to compress everything with 7-Zip to reduce the usage of your upload bandwidth if you have an on-premises agent that needs to publish to VSTS (or simply because you want to save space in the shared folder used as a drop folder).
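
As a rough idea, the compression step can be as simple as the following sketch; it assumes 7-Zip is installed on the agent at the standard location and that the files to compress were already copied under the artifact staging directory.

# Compress prepared artifacts before publishing them (paths are assumptions)
$staging = $env:BUILD_ARTIFACTSTAGINGDIRECTORY
& "C:\Program Files\7-Zip\7z.exe" a -t7z "$staging\MyApp.7z" "$staging\MyApp\*"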

Publishing artifacts is a two-step process: the first step identifies and prepares what to publish, the second one actually does the real publishing to the target storage.

If you create a new build definition today on VSTS, this is the default template that is created for a Visual Studio solution.

Figure 1: Default template for a standard Visual Studio build

As you can see from Figure 1, publishing artifacts is composed of two distinct tasks: the first one copies files into the ArtifactStagingDirectory, the other one publishes the artifacts. This is a good approach, because you can use several Copy Files tasks if you need to publish different types of files to different folders, but a PowerShell task gives you better flexibility.

Preparing files with PowerShell gives you maximum flexibility in deciding what needs to be published.

Using a PowerShell approach is definitely a must if you care about DevOps and continuous delivery. With real-world, complex projects you need to publish in the drop folder something that can be taken by a script or by some deployer and deployed to a target environment. Here are some common tasks done by the PowerShell script before copying data to the staging directory (a minimal sketch follows the list).

Rename all .exe.config files: you can take every applicationname.exe.config file and rename it to applicationname.exe.config.default. With such a technique, if you simply unzip and overwrite executables on a test server you do not overwrite configuration files. The application can have logic that copies every setting that does not exist in the .config.default file to the .config file. This helps manual deployment tremendously (it avoids overwriting a config file carefully crafted for that environment).

Prepare configuration files for your release pipeline: release pipelines can change configuration files for a given environment, but often you need to take the default config in source control and replace parts of it with various patterns (e.g. __SettingsName__), and this can be easily done with XML or JSON manipulation in PowerShell.

Prepare different archives / packages for different configurations: if you have a plugin-based architecture, you probably want to create an archive with all the plugins, another one with no plugins, and a series of archives with different standard plugin configurations. Generally speaking, it is not so uncommon to need multiple packages / archives of your software composed of different combinations of files from the same build. This is often done simply by copying dlls and tweaking configuration files, and it should be done automatically so that a single build creates all the different packages.

Sanity and security checks: all configuration files should be checked for some common patterns to identify whether sensitive information could end up in the artifacts. What if a developer leaves some password unencrypted in a config file?
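
To give a concrete idea, here is a minimal sketch of such a prepare-artifacts script; folder names, the configuration file convention and the security pattern are illustrative assumptions, not the script of a real project.

# Minimal prepare-artifacts sketch; all paths and patterns are illustrative
param(
    [string] $sourceDir  = $env:BUILD_SOURCESDIRECTORY,
    [string] $stagingDir = $env:BUILD_ARTIFACTSTAGINGDIRECTORY
)

# Copy build output to the staging directory (bin path is hypothetical)
Copy-Item (Join-Path $sourceDir "src\MyApp\bin\Release") -Destination (Join-Path $stagingDir "MyApp") -Recurse

# Rename every .exe.config to .exe.config.default so a manual deploy does not overwrite live configuration
Get-ChildItem (Join-Path $stagingDir "MyApp") -Filter *.exe.config -Recurse |
    Rename-Item -NewName { $_.Name + ".default" }

# Naive sanity / security check: fail the build if a config file seems to contain a clear-text password
$suspect = Get-ChildItem (Join-Path $stagingDir "MyApp") -Filter *.config.default -Recurse |
    Select-String -Pattern "password=" -SimpleMatch
if ($suspect) { throw "Possible unencrypted credentials found in configuration files." }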

Surely you can think of other file manipulations or checks that need to be done to prepare your artifacts, so I still think that, unless your project is really simple, preparing artifacts with PowerShell is the right way to go.

Gian Maria Ricci

Versioning assembly with powershell and build vNext

In an old blog post I explained how to version assemblies during a TFS 2013 build with PowerShell scripts. The goal is modifying AssemblyInfo.cs and AssemblyInfo.vb with PowerShell in a TFS 2013 build for a project based on TFVC. If you are interested in Git I have other posts on the subject.

Now that the build system has changed in Visual Studio Team Services and in TFS 2015, people asked me to update those scripts to work with the new build system. It turns out that the work needed to update the scripts is only one line of code, because an environment variable changed between the two build systems, while all the rest stays the same.

If you use PowerShell scripts to customize the build, you are less dependent on the build infrastructure and you have an easier path when moving to a new build system.
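
To give an idea of the kind of change involved, I am assuming here that the line in question is the one reading the build number from the environment (the source path, as you will see below, is passed to the script as an argument).

# TFS 2013 (XAML) build: the build number came from the TF_BUILD_* environment variables
$buildNumber = $env:TF_BUILD_BUILDNUMBER

# Build vNext: only the variable name changes, the rest of the script stays the same
$buildNumber = $env:BUILD_BUILDNUMBER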

You can download a zip with the scripts from this address, and the usage is straightforward. Just check in the scripts under a directory of your project, or in a common directory in TFVC. Once you have checked in the files, you can simply add a PowerShell Script task before the actual build step.


Figure 1: Add PowerShell script before the build stage

Then I simply specify where the script is located in my source control and the list of arguments it needs.


Figure 2: Configure the script to run

This script needs a bunch of parameters to run:

-srcPath $(Build.SourcesDirectory)\src -assemblyVersion 2.5.0.0 -fileAssemblyVersion 2.5.J.B

The version number has a special syntax where J is substituted with the date expressed with five digits: the first two represent the year (2016 is 16), while the other three represent the progressive number of the day in the year. The other special char is B, which is substituted with the build progressive number. If you remember, the default build number in TFS / VSTS ends with a dot followed by the daily incremental number of the build.
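
In PowerShell terms the substitution boils down to something like this minimal sketch; variable names are mine and not necessarily the ones used in the downloadable script.

# Build the J (date) and B (daily build counter) parts of the file version
$now = Get-Date
$j = $now.ToString("yy") + $now.DayOfYear.ToString("000")      # 23 January 2016 -> 16023
$b = ($env:BUILD_BUILDNUMBER -split '\.')[-1]                  # the default build number ends with the daily counter
$fileVersion = "2.5.J.B".Replace("J", $j).Replace("B", $b)     # -> 2.5.16023.4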

There are no special operations to do in your build. Here is the output.


Figure 3: Output of the build

Selecting the PowerShell task, you can verify (2) that the script correctly determines the version number as 2.5.16023.4; this is the fourth build of 23 January 2016. In point 3 you can see that the script simply changed the various AssemblyInfo.cs and AssemblyInfo.vb files to update the number before the build.
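
The update itself can be done with a simple regular expression replace over every AssemblyInfo file; the following is only a sketch of the technique, the actual script in the zip is more complete.

# Patch AssemblyVersion / AssemblyFileVersion attributes before compilation (sketch only)
Get-ChildItem $srcPath -Include AssemblyInfo.cs, AssemblyInfo.vb -Recurse | ForEach-Object {
    $content = Get-Content $_.FullName -Raw
    $content = $content -replace 'AssemblyVersion\("[^"]*"\)', "AssemblyVersion(""$assemblyVersion"")"
    $content = $content -replace 'AssemblyFileVersion\("[^"]*"\)', "AssemblyFileVersion(""$fileVersion"")"
    Set-Content $_.FullName $content
}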

Thanks to PowerShell I was able to fully reuse a script prepared for the old build system in the new one with very little work.

Gian Maria.

Invalidate cache of TFS after a Server Move

If you move your TFS server to new hardware, be sure to follow the instructions in MSDN: Move or clone Team Foundation Server (hardware move).

There are many reasons why you might want to move TFS to different hardware: probably you want to use new, more powerful hardware and you have not virtualized TFS, or you need to upgrade TFS and want to move SQL Server to new hardware with a more recent version of SQL. Sometimes you simply want to start over with a clean TFS machine (probably you installed too much stuff on the old one, or you have other services running on the very same machine).

One of the important steps is refreshing the data cache on client computers; if you forget this step, clients can start behaving weirdly. An example: a user reports that Visual Studio shows some files as “pending add”, but the files are already in TFVC and the user can also see them from the TFS web interface. The problem is that Visual Studio erroneously believes a file still needs to be added to the source code repository even though the file is already there.

To globally invalidate all caches for all users you can use the witadmin rebuildcache command (as described in the MSDN article listed above). With this command you are sure that, upon a new connection, all clients will have their cache invalidated.
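
The command needs the collection URL and should be run once per collection; the URL below is obviously a placeholder for your own server.

witadmin rebuildcache /collection:http://yourtfsserver:8080/tfs/DefaultCollection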

Also follow these instructions to refresh the version control cache on client computers, to ensure that all workspaces are in sync with the new server.
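
On the client the procedure basically amounts to closing Visual Studio and clearing the local Team Foundation cache folder; the wildcard in the sketch below is just a convenience to cover whatever client versions are installed on the machine.

# Close Visual Studio first, then clear the local Team Foundation client cache
Remove-Item "$env:LOCALAPPDATA\Microsoft\Team Foundation\*\Cache\*" -Recurse -Force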

Always remember: after moving TFS to new hardware it is a good idea to invalidate the cache on the server and to tell all users to refresh their local version control cache, to avoid weird problems.

Gian Maria.