TFS New Build System: vNext agents

With the latest Visual Studio Online update, the new build system is now online for all users. As I said in an older post, it has been completely rewritten, and covering all the new features really requires a lot of time. Since I'm a great fan of Continuous Integration and Continuous Deployment, I'd like to write some posts to introduce this new build system, along with the reasons why it is really superior to the old one.

In this post I'll outline some of the improvements regarding build infrastructure management, starting from the most common problems that customers encountered with the older approach.

Each project collection needs a separate build machine

In the old build system you had to install the TFS bits and configure Agents and a Controller to create a build machine, and each machine could provide build capabilities for only one project collection.

With the new build system the Controller is part of the TFS/VSO core installation; you only need to deploy agents on your build machines, and an agent can provide build capabilities for an entire instance of TFS.

This means that an agent is connected to an entire TFS instance and can build stuff from every Project Collection on your server. This greatly simplifies your infrastructure, because you can use a single build machine even if you have multiple Project Collections. If you are a small shop and use multiple Project Collections to achieve full project separation, a single build machine is probably enough.

Deploying a new agent is complex

With the traditional build system you need to deploy Controllers and Agents. Both of them require a full TFS installation, followed by the configuration of Build. Everything is administered through the standard TFS Administration Console.

With the new build system the build controller is part of the core installation; you only need to deploy agents on your machines.

Deploying an agent is a matter of downloading a file, unzipping it, and running a PowerShell script.

If you are interested in the full installation details, you can follow the step-by-step MSDN procedure here: https://msdn.microsoft.com/Library/vs/alm/Build/agents/windows. This aspect greatly simplifies the setup of a new agent.
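As a rough sketch, the whole procedure boils down to a few commands from an elevated PowerShell prompt. Treat the local paths and the configuration script name below as assumptions; the MSDN page linked above has the exact, version-specific steps:

```powershell
# 1. Download the agent zip from your VSO account (the download link is
#    in the web portal) and extract it into a folder of your choice,
#    e.g. C:\Agents\Agent1 (folder name is just an example).

# 2. From the extracted folder, run the configuration script shipped
#    inside the package; it asks for server URL, agent name and pool.
cd C:\Agents\Agent1
.\ConfigureAgent.ps1
```

After answering the prompts, the agent appears in the pool on the server and is ready to pick up builds.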

I have multiple VSO accounts, so I need multiple build machines

Since the agent is a simple console application, you can install more agents simply by unzipping the agent files into different directories on the same machine. In the end you can have a single machine with agents connected to multiple VSO or TFS instances.

You can use a single build machine for multiple TFS 2015 or VSO instances

Building is limited to Windows machines

Since agents are installed with the TFS bits, the old build system only allows agents to run on a Windows machine. Thanks to some build extensions you can run Ant and Maven builds, but only on a Windows box.

The new build system also provides agents for non-Windows machines, thanks to the xPlat agent: https://msdn.microsoft.com/Library/vs/alm/Build/agents/xplat

You can run TFS/VSO builds on non-Windows machines such as Linux and Mac.

Running a local build is complex for developers

There are lots of reasons to run a build on a local machine, but with the old system this required a full TFS installation, leading to a maintenance nightmare. You usually end up with a lot of stuff in your Continuous Integration scripts, such as publishing NuGet packages, and much more. In such a scenario, running a build on a local machine should be an easy task for developers.

The new agent can run as a service, or run interactively with a simple double-click on the VsoAgent.exe file. Each developer can install an agent in minutes and run it only when he or she really needs it.

Running a local build is really easy, because you can start the local agent with a double-click, and only when you need it.
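A sketch of the two modes, with the agent extracted as described above (the folder layout is an assumption; check the MSDN page for the exact structure of your agent version):

```powershell
# Interactive mode: just launch the agent executable from its folder;
# it listens for queued builds until you close the console window.
cd C:\Agents\Agent1\Agent
.\VsoAgent.exe

# Service mode: re-run the configuration script and answer yes when it
# asks whether to install the agent as a Windows service.
cd C:\Agents\Agent1
.\ConfigureAgent.ps1
```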

Because the agent is a simple console application, you can also attach a debugger to the build if you need to debug custom code that runs during a build.

Keeping up with updates is complex

If you are a large enterprise, you probably have several build machines, and with older versions of TFS you had to update all build agents and controllers each time you updated your TFS instance. With TFS 2012 Update 2 this requirement was relaxed, and the older build system can target a newer version of TFS. This gives you a timeframe to update all build machines after you have updated the TFS core installation, but you still need to run the TFS update on every machine with build agents or controllers installed.

You will be happy to know that the new build agent has automatic update capabilities. It checks for a new version at each startup; if a new version of the agent is available for the server it is connected to, it automatically downloads the new version, upgrades itself, and restarts.

The only drawback is that, if you are running the agent as a Windows service, you need to restart the service for the upgrade check to take place. In a future version we expect the agent to check periodically for an upgrade instead of requiring a restart.
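If you script your maintenance, the restart can be done with standard cmdlets. The service name below is hypothetical; use Get-Service to discover the actual name on your machine:

```powershell
# Find the agent service (the name pattern is an assumption)
Get-Service | Where-Object { $_.Name -like "*vsoagent*" }

# Restart it so the agent performs its startup upgrade check
Restart-Service -Name "vsoagent.youraccount.Agent1"
```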

The overall feeling is that the new build infrastructure will be much easier to maintain and deploy, and it also gives you new capabilities, like building on non-Windows machines.

Gian Maria.

TF Service: deploy on Azure Web Site with Database Project

The ability to automatically deploy a site to an Azure Web Site from TF Service is really interesting, but sadly there is no out-of-the-box solution to update the structure of an Azure database with a VS2012 Database Project. In this post I'll show how to modify the standard build template to deploy a Database Project during Azure Web Site deployment. I've blogged in the past about how to deploy a Database Project with TFS Build, but that post refers to the old type of Database Project (VS2010); now I want to explain how to customize the AzureContinuousDeployment build to deploy a VS2012 database project to Azure.

First of all, create a copy of the standard build file in the BuildProcessTemplates directory.


Figure 1: Create a copy of the original AzureContinuousDeployment.11.xaml file and check-in

This avoids messing with the original build definition; to accomplish this, simply create a copy of the file in your workspace, check in the new file, and open your local copy from Visual Studio to modify the workflow definition. The main problem, if you are not familiar with the TFS Build workflow, is finding the place to put the new instructions. To help you locate the point where you should modify the build file, look at Figure 2, which shows where to find the Try Compile and Test sequence.


Figure 2: Try Compile and Test is the part of the workflow you need to modify.

Now expand Try Compile and Test and scroll down until you find a sequence called Deploy on Test Success; expand it and you will find a Publish Output sequence containing a call to an MSDeploy action, which actually deploys the web site.


Figure 3: The point of the workflow where the site is deployed

The right place to insert additional deployment operations is right after the MSDeploy action.

First of all, you should add a couple of arguments to the workflow to make it more generic and reusable; with arguments, the user can specify whether to deploy a database (DeployDatabase argument) and the database output file name (DatabaseProjectOutputFile) directly from the build editor. This lets you keep a single .xaml file with the build workflow, have multiple builds based on that workflow, and choose for each one which database to deploy. You should also configure the Metadata property to specify the name, description, and other properties of your custom arguments. Have a look at my TFS2010 Create a Personalized Build post for details about Metadata and workflow arguments.


Figure 4: Add a couple of arguments to parameterize the workflow.

The whole customization is shown in Figure 5, and you can see that it is really easy. Start by adding a condition to verify whether the DeployDatabase argument is true, name this condition If Should Deploy Database, and then add a standard Sequence inside its Then area. If you exclude the WriteBuildMessage actions, which are merely logging, the whole operation is done by only two actions.


Figure 5: The sequence added to the standard workflow to deploy a database project.

The ConvertWorkspaceItem action is used to convert a path expressed in TFS source control (starting with a dollar sign) into a local path on the build server. This is needed because the easiest way to deploy a database project is to use the SqlPackage.exe program (installed locally on your development machines when you install SQL Server Data Tools). I simply located the files on my hard disk, then copied them into a folder under source control to have them included along with the source code of my project. This technique is really common in Continuous Deployment: whenever possible, all the tools needed to build and deploy your solution should be included in the source code of the project.


Figure 6: Insert the whole SqlPackage.exe distribution in your source code and check in; this will make the tool available to the Build Agent

This is the easiest way to make an executable available to build agents. When an agent starts a build it does a get-latest before compiling the source, and this automatically fetches the database deployment tools along with the source code. This technique is really useful for TF Service, because you have no access to the hosted (elastic) build installation and no way to install SqlPackage.exe on the build server. (Note: if your build agent is on premises, you can simply install SqlPackage.exe, find its location on the hard disk, hardcode the installation path in the build definition (or add another variable), and get rid of the ConvertWorkspaceItem action.)
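For example, a source-control layout along these lines would work (the Tools/Deploy/SqlDac folder name is the one used later in this post; the rest of the tree is illustrative):

```
$/MyProject/trunk
├── src/             solution and projects, including the database project
└── Tools/
    └── Deploy/
        └── SqlDac/  full SqlPackage.exe distribution (exe plus required DLLs)
```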

The ConvertWorkspaceItem action is needed because I know the path in source control (e.g. $/myproject/trunk/tools/Deploy) but I do not know where the build server creates the workspace to build the source. ConvertWorkspaceItem is the action that lets me convert a path in source control to a real path on disk. Its configuration is shown in Figure 7.


Figure 7: Configuration of the ConvertWorkspaceItem.

This is the configuration of the ConvertWorkspaceItem action. The Input path is composed from the Workspace.Folders.First().ServerItem variable, which represents the first server folder mapped in the workspace used to build the solution. This is a convenient way to make the build file reusable, as long as you remember that the first mapped folder should be the top-level folder of your project (trunk, or a specific branch), and it should contain a folder called /Tools/Deploy/SqlDac with the SqlPackage.exe distribution. The Result property points to a workflow variable (which I declared previously) that will contain the physical location of the SqlPackage.exe tool after the action executes.
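In the .xaml file the action looks roughly like this; the namespace prefix and the exact attribute layout may differ slightly in your template, so treat this as a sketch of the configuration described above:

```xml
<mtbwa:ConvertWorkspaceItem
    DisplayName="Get SqlPackage.exe local path"
    Direction="ServerToLocal"
    Workspace="[Workspace]"
    Input="[Workspace.Folders.First().ServerItem + &quot;/Tools/Deploy/SqlDac/SqlPackage.exe&quot;]"
    Result="[SqlPackageLocation]" />
```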

The other interesting action is InvokeProcess, used to invoke the SqlPackage.exe tool to deploy the database. You can customize the build with custom activities or MSBuild scripts (I've talked a lot about this in the past), but the easiest way is to include an executable in the source control Tools directory and invoke it with InvokeProcess. This solution does not require deploying custom assemblies and is really simple to set up. If you expand the action, you will find a couple of slots where you can place activities that are invoked with the standard output and the standard error; usually you set them up as in Figure 8. The WriteBuildError activity is really useful because it writes the error message to the build log and also marks the build as Partially Succeeded, so if the database deploy operation fails, my build is automatically marked as partially succeeded without any further customization work.


Figure 8: Intercept standard output and error and have them dumped as build messages

This is the complete configuration of the InvokeProcess action.


Figure 9: Configuration of the InvokeProcess activity

The interesting part is the FileName property, which uses the SqlPackageLocation variable (populated by ConvertWorkspaceItem) to locate where the agent downloaded SqlPackage.exe, and the Arguments property, which contains all the information for the deploy. Here is its full value.


Figure 10: Configuration of the Arguments property; it contains the full command-line arguments for the SqlPackage.exe tool

The cool part is the ability to use the azureWebSiteProfile.SqlServerDBConnectionString variable, which contains the database connection string extracted from the publish file; it was parsed for you by the AzureContinuousDeployment build. The outputDirectory variable is the directory where the build copied the output of the various projects, and it is the location of the .dacpac file of your database project. Finally, you can check the modified workflow back into TFS.
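The exact expression in Figure 10 is not readable here, but based on the variables just described and the standard SqlPackage.exe publish syntax, the Arguments property should look roughly like this VB expression (the quoting details are an assumption; adapt them to your template):

```
"/Action:Publish /SourceFile:""" + outputDirectory + "\" + DatabaseProjectOutputFile +
""" /TargetConnectionString:""" + azureWebSiteProfile.SqlServerDBConnectionString + """"
```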

Now you can let Azure create a standard build for you, with standard configuration, from the Azure Management portal; this creates a build in your Team Project. Once the build is created, you can simply edit it to use the new build workflow. As a first step you need to change the workspace in Source Settings, mapping the entire trunk (main) directory, because you want the Tools folder to be downloaded with the source.


Figure 11: Change the configuration of the workflow; map the entire trunk of the project to include the Tools directory

Finally, go to the Process section and change the workflow from the standard AzureContinuousDeployment to the one you customized in the previous steps.


Figure 12: Choose the new Build workflow for the build

Now you should see all the variables added in Figure 4; if you filled in their metadata correctly, you should also see their descriptions. I also created a new configuration area called DeployCustom to separate the standard build properties from the new ones introduced by the customization.


Figure 13: Specify values for your custom variables

Now I can set Enable Database Deploy to true and specify the name of the dacpac file I want to deploy. Ninety-nine percent of the time, the name of the dacpac file is the name of the database project followed by the .dacpac extension.

Now, at each build, the Azure database connected to the web site will be updated to the new structure of the database project.

If you prefer, you can download the modified build definition file from here, but I strongly suggest you modify your own file. First of all, the original workflow file may have been updated since this post was published; it is also important to become familiar with build customization, so you will be able to modify the workflow further if you need to deploy other stuff.

Gian Maria.

Speed up TFS builds with parallel compilation and incremental get

For large projects, the time needed to run a build can grow considerably, until it eventually passes the famous barrier of 10 minutes. To speed up build time you can use a couple of TFS features.

Parallel builds: MSBuild can run faster in a multicore environment by using parallel builds. To enable this feature, go to the machine where the TFS build agent is installed and modify the file tfsbuildservice.exe.config, located in C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies, changing the value of the MaxProcesses key:

    <!-- MaxProcesses
         Set this value to the maximum number of processes MSBuild.exe should
         use for builds started by agents hosted by this executable.
        -->
    <add key="MaxProcesses" value="1" />

Remember also that some properties of the build process are related to parallel builds: BuildInParallel can be set to false to disable parallel builds entirely, and BuildConfigurationInParallel and BuildSolutionInParallel let you fine-tune the parallelization of the process. Both of these properties are set to true by default. These settings can be used to solve specific problems.
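As a sketch, these switches live in TFSBuild.proj, just like the incremental settings shown further down; the values below are the defaults, so you would only add the lines you want to change:

```xml
<PropertyGroup>
  <!-- Master switch: set to false to disable parallel builds entirely -->
  <BuildInParallel>true</BuildInParallel>
  <!-- Fine tuning: build configurations / solutions in parallel -->
  <BuildConfigurationInParallel>true</BuildConfigurationInParallel>
  <BuildSolutionInParallel>true</BuildSolutionInParallel>
</PropertyGroup>
```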

Incremental builds: by default, for each build the build agent downloads every file specified in the build workspace and regenerates all binaries. For large projects you can get better performance if you get only the changed files and generate only the corresponding binaries. To achieve this you can set a couple of properties to true: IncrementalGet, which instructs the build machine to get only the files changed since the last build, and IncrementalBuild, which instructs the build machine to generate binaries only for changed source files. To change them, simply edit TFSBuild.proj:

<PropertyGroup>
  <IncrementalGet>true</IncrementalGet>
  <IncrementalBuild>true</IncrementalBuild>
</PropertyGroup>

These two simple tricks can speed up the build process for large projects, helping to keep your build time under 10 minutes.

alk.
