Build and Deploy Asp.Net App with Azure DevOps

I’ve blogged in the past about deploying ASP.NET applications, but lots of new features have changed in Azure DevOps and it is time to refresh some basic concepts. There is always plenty of confusion around web.config transforms in particular, and even if I’m an advocate of removing every configuration value from files and source control, the topic is worth examining.

The best approach to configuration is removing it from source control entirely, using configuration services, etc., and moving away from web.config.

But since most people still use web.config, let’s start with a standard ASP.NET application with a web.config containing a couple of application settings that should be changed during deployment.


Figure 1: Simple configuration file with two settings
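Since the screenshot may be hard to read, here is the relevant fragment of the web.config; the key names Key1 and Key2 are the ones used throughout this post, while the values themselves are just placeholders:

```xml
<appSettings>
  <add key="Key1" value="OriginalValue1" />
  <add key="Key2" value="OriginalValue2" />
</appSettings>
```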

When it is time to configure your release pipeline, you MUST adhere to the mantra: build once, deploy many. This means that you should have one build that prepares the binaries to be installed, and the very same binaries will be deployed to several environments.

Since each environment will have a different value for the app settings stored in web.config, I’ll start by creating a web.config transform for the Release configuration (the one that will be released), replacing each setting with a specific token.


Figure 2: Transformation file that tokenizes the settings

In Figure 2 I show how I change the value of the Key1 setting to __Key1__ and Key2 to __Key2__. This is necessary because I’ll replace these values with the real values during release.

The basic trick is changing configuration values in files during the build, setting tokenized values that will be replaced during release. Using a double underscore as prefix and suffix is enough for most situations.
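The transform in Figure 2 boils down to a Web.Release.config along these lines, using standard XDT syntax (SetAttributes to change the value, Match(key) to find the right entry):

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <add key="Key1" value="__Key1__"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
    <add key="Key2" value="__Key2__"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>
```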

Now it is time to create a build that generates the package to install. The pipeline is really simple: the solution is built with MsBuild using the standard configuration for publishing a web site. I’ve used the MsBuild task and not the Visual Studio task because I do not want to need Visual Studio on my build agent; MsBuild is enough.


Figure 3: Build and publish web site with a standard MsBuild task.
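For reference, the MsBuild arguments in such a task typically look like this (the exact set depends on your project; setting OutDir is what makes web projects publish into the _PublishedWebsites folder mentioned later):

```
/p:OutDir="$(build.stagingDirectory)" /p:DeployOnBuild=true /p:WebPublishMethod=FileSystem
```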

If you run the build you will be disappointed: the resulting web.config is not transformed, but keeps the exact content of the one in source control. This happens because the transformation is not applied during a standard web site publish; it is applied by Visual Studio when you use the publish wizard. Luckily there is a task in preview that performs web.config transformation: simply place it before the MsBuild task and you are done.


Figure 4: File transform task is in preview but it does its work perfectly

As you can see in Figure 4, you simply specify the directory of the application, then choose XML transformation and finally the option to use the web.$(BuildConfiguration).config transformation file to transform web.config.
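If you prefer YAML pipelines, the same step looks roughly like this; the task name and input names are from memory, so verify them against the File Transform task reference before using:

```yaml
# Hedged sketch: inputs may differ slightly across Azure DevOps versions.
- task: FileTransform@1
  displayName: 'Transform web.config'
  inputs:
    folderPath: '$(Build.SourcesDirectory)/MyWebSite'   # application directory (assumed path)
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\Web.$(BuildConfiguration).config -result **\Web.config'
```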

Now you only need to copy the result of the publish into the artifact staging directory, then upload it with the standard publish artifact task.


Figure 5: Copy result of the publish task into staging directory and finally publish the artifact.
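In YAML form these two final steps would look more or less like the following sketch; the source folder depends on where your MsBuild arguments placed the output, so treat the paths as assumptions:

```yaml
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(build.stagingDirectory)\_PublishedWebsites'  # assumed output location
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'Web'
```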

If you read other posts on my blog you know that I usually use a PowerShell script that reorganizes files, compresses them, etc., but for this simple application it is perfectly fine to copy the _PublishedWebsites/ directory as the build artifact.


Figure 6: Published artifacts after the build completes.

Take time to verify that the output of the build (artifacts) is exactly what you expected before moving on to configuring the release.

Before building the release phase, download the web.config file and verify that the substitutions were performed and the file contains what you expected.


Figure 7: Both of my settings were substituted correctly.

Now it is time to create the release, but first of all I suggest installing this extension, which contains a nice task to perform substitutions during a release in an easy and intuitive way.

One of the great strengths of Azure DevOps is extensibility: there are tons of custom tasks for lots of different jobs, so take time to look in the Marketplace if you cannot find the task you need among the built-in ones.

Let’s start creating a simple release that uses the previous build as its artifact and contains two simple stages: dev and production.


Figure 8: Simple release with two stages to deploy the web application.

Each of the two stages has a simple two-task job to deploy the application, based on the assumption that each environment is already configured (IIS installed, site configured, etc.), so to deploy our app we can simply overwrite the old installation folder, replacing it with the new binaries.

The Replace Tokens task comes in handy in this situation: simply add it as the first task of the job (before the task that copies files into the IIS directory), then configure prefix and suffix with the double underscore to match the criteria used to tokenize the configuration in web.config.


Figure 9: Configure replace token suffix and prefix to perform substitution.

In this example only web.config should be changed, but the task can perform substitution on multiple files.


Figure 10: Substitution configuration points to the web.config file.

The beautiful aspect of the Replace Tokens task is that it uses all the variables of the release to perform substitution. For each variable it replaces the corresponding token, built from the prefix and suffix; this is the reason for the transformation file in the build. My web.config file has __Key1__ and __Key2__ tokens inside the configuration, so I can simply configure those two variables differently for the two stages and my release is finished.
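To make the mechanism concrete, here is a rough sketch, in plain shell just for illustration, of what the task effectively does for each release variable (the values are made up):

```shell
# Write a tokenized web.config like the one produced by the build...
cat > web.config <<'EOF'
<appSettings>
  <add key="Key1" value="__Key1__" />
  <add key="Key2" value="__Key2__" />
</appSettings>
EOF

# ...then, for each release variable of the current stage, replace
# the matching __Name__ token with the variable's value.
Key1="DevValue1"
Key2="DevValue2"
sed -i -e "s/__Key1__/$Key1/g" -e "s/__Key2__/$Key2/g" web.config
cat web.config
```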

If you use the Grid visualization it is immediately clear how each stage is configured.


Figure 11: Configure variables for each stage, the replace task will do the rest.

Everything is done: just trigger a release and verify that the web.config of the two stages is changed accordingly.


Figure 12: Sites deployed in two stages with different settings, everything worked as expected.

Everything worked well: I was able to build once with web.config tokenization, then release the same artifacts to different stages with different configurations managed by the release definition.

Happy AzDo

Mounting network share in Release Definition

Using Deployment Groups with Release Management in VSTS is really nice, because you can use a pull release model, where the agent runs on the machines that are the deployment targets and all scripts are executed locally (instead of using PowerShell Remoting and WinRM).

A typical release definition depends on artifacts produced by a build, and with VSTS it is sometimes convenient to store build artifacts in a network share instead of on VSTS. This is especially true if, like me, you have an internet connection with really slow upload bandwidth (256 Kbps). Storing artifacts in a network share reduces both the time the build needs to upload the artifacts and the time the release needs to download them to just a few seconds.

Storing build artifacts in a network share is really useful in situations where internet bandwidth is limited.

In this scenario, if the machines that belong to the Deployment Group are outside your domain, you have an authentication problem when the release process tries to access the network share to download the artifacts. Here is the error I got when triggering a release.

Downloading artifacts failed: Microsoft.VisualStudio.Services.Agent.Worker.Release.Artifacts.ArtifactDownloadException: 
The artifact directory does not exist: \\neuromancer\Drops\VSO\Jarvis - CI - Package For UAT Test\JarvisPackage debug - 2.1.0-sprint7-team.2078.
 It can happen if the password of the account JVSTSINT\Administrator is changed recently and is not updated for the agent. 

The error is clear: the user that runs the agent in the Deployment Group is not part of the domain, thus it cannot access a network share that belongs to the domain.

Storing artifacts in a network share is useful to reduce bandwidth, but you need to be sure that all agents have access to it.

I want to solve this problem without joining the machine to the domain or configuring the agent in any special way; my goal is to solve it entirely inside the release definition.

The net use command line tool can map a network share with specific credentials, but the artifact download phase of the release takes place before any task, so the release would fail before any of your tasks had the opportunity to run.


Figure 1: Task used to map the network share.

A quick solution is inserting a dedicated Deployment Group phase (Figure 1) before any other phase: call it “mount network share” (1), add a simple Command Line task (2), and finally be sure to select the “Skip download of artifacts” option (3). Point 3 is the most important one, because downloading artifacts normally takes place before the execution of any task.

Then I declare a couple of release variables to store the username and password of a user that has access to that share (in my domain I have a dedicated TfsBuild account).


Figure 2: Variables to mount network share with a valid domain user

Now I only need to configure the Command Line task to use the net use command to mount the network share with the user specified in the release variables. The configuration is straightforward and is represented in Figure 3.


Figure 3: Configuration of Command Line task to use net use command
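The task configuration boils down to a single command along these lines; the share path is from the error message above, while the variable names are just examples, so adapt them to whatever you defined in Figure 2:

```
net use \\neuromancer\Drops $(NetworkPassword) /USER:$(NetworkUser)
```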

Thanks to the net use command, the release is able to mount the network share on each machine of the Deployment Group using the TfsBuild user. You can verify from the release logs that the Command Line task ran correctly and mapped the network share.


Figure 4: Net use command in action in build logs.

Using a special Deployment Group phase with “Skip download of artifacts” selected allows you to run any task you need before the download of the artifacts takes place.

Gian Maria.

Running UAT tests in a VSTS / TFS release

I’ve blogged on how to run UAT and integration tests during a VSTS build; that solution works quite well, but it is probably not the right way to proceed. Generally speaking that build does its job, but I have two main concerns.

1) Executing tests remotely requires installing the test agent and involves WinRM, a beast that is not so easy to tame outside a domain.

2) I’m deploying the new version of the application with an XCopy deployment, which is different from a real deployment to production.

The second point is the one that bothers me, because we already deploy to production with PowerShell scripts and I’d like to use the very same scripts to deploy on the machine used for UAT testing. Using the same scripts used for the real release puts those scripts under test as well.

If you want to run UAT and integration testing, the best scenario is when you install the new version of the application with the very same script you use to deploy to production.

If you have a script (or any automated way) to release a new version of your application, it is much better to use a different strategy to run UAT tests in VSTS / TFS: instead of a build you should use release management. If you still have no way to automatically release your application, but you have UAT tests to run automatically, it is time to invest in automating your deployment: it is a prerequisite for automating UAT runs and will simplify your life.

The first step is a build that prepares the package with all the files needed by the installation. In my situation I have a couple of .7z files: the first contains all the binaries, the other all the updated configurations. These are the two files I use for deployment with a PowerShell script. The script is quite simple: it stops services, backs up the current version, deletes everything, replaces the binaries with the latest version, then updates the configuration with the new default values, if any. It is not rocket science, just a simple script that automates everything on our release checklist.
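The shape of such a script is roughly the following; this is an illustrative sketch only, where paths, service names, and archive names are assumptions, not my actual script:

```powershell
# Sketch of a deploy script: all names and paths below are assumptions.
$installDir = "C:\Apps\MyService"
$backupDir  = "C:\Apps\Backup\MyService-$(Get-Date -Format yyyyMMddHHmmss)"

Stop-Service -Name "MyService"                      # stop services
Copy-Item $installDir $backupDir -Recurse           # back up the current version
Remove-Item "$installDir\*" -Recurse -Force         # delete everything
& 7z x MyService-binaries.7z "-o$installDir" -y     # replace binaries with the latest version
& 7z x MyService-config.7z "-o$installDir" -y       # apply updated default configuration
Start-Service -Name "MyService"
```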

Once you have the prerequisites (a build creating binaries and installation scripts), running UAT tests in a release is really simple: a dependency on the build artifacts, a single environment, and you are done.


Figure 1: General schema for the release that will run UAT tests.

The release depends on the artifacts of a single build, specially crafted for UAT. To run UAT testing I need the .7z files with the new release of the software, but I also need a .7z file with all the UAT tests (NUnit dll files and test adapter) and all the installation scripts.

To simplify everything I’ve cloned the original build used to create the package for a new release and added a couple of tasks to package the UAT test files.


Figure 2: Package section of the build

I’ve blogged a lot in the past about my love for PowerShell scripts to create the packages used for release. This technique is really simple: you can test the scripts outside of the build system, it is super easy to integrate in whatever build engine you are using, and with PowerShell you can do almost anything. In my source code I have two distinct PowerShell package scripts: the first creates the package with the new binaries, the second creates a package with all the UAT assemblies as well as the NUnit test adapters. All the installation scripts are simply included in the artifacts directly from source code.

The build for UAT produces three distinct artifacts: a compressed archive with the new version to release, a compressed archive with everything needed to run the UAT tests, and an uncompressed folder with all the installation scripts.

When the build is stable, the next step is configuring a Deployment Group to run UAT. The concept of Deployment Group is new in VSTS and allows you to specify a set of machines, called a deployment group, that will be used in a release definition. Once you create a new Deployment Group you can simply go to its details page and copy a script that you can run on any machine to join it to that deployment group.


Figure 3: Script to join a machine to a Deployment Group

As you can see from Figure 3, you can join Windows, Ubuntu, or RedHat machines to the group. Once you run the script, the machine will be listed as part of the group, as you can see in Figure 4.


Figure 4: Deployment groups to run UAT tests.

The concept of Deployment Group is really important, because it allows for pull deployment instead of push deployment. Instead of having an agent that remotely configures machines, the machines of the Deployment Group download the build artifacts and run the release locally. This deployment method completely removes all WinRM issues, because the release scripts are executed locally.

When designing a release, a pull model allows you to run installation scripts locally, and this leads to a more stable release process.

There are other advantages of Deployment Groups, like executing in parallel on all machines of a group. This MSDN post is a good starting point to learn about all the goodness of Deployment Groups.

Once the Deployment Group is working, creating a release is really simple if you have already created PowerShell scripts for deployment. The whole release definition is represented in Figure 5.


Figure 5: Release definition to run UAT testing

First of all I run the installer script (it is an artifact of the build, so it is downloaded locally), then I uncompress the archive that contains the UAT tests and delete the app_offline.htm file that was generated by the script to take the IIS website offline during the installation.

Then I modify a special .application file that points to a specific configuration set on the UAT machine. This step is needed because the same machine is used to run UAT tests both during a release and during a build (with the technique discussed in the previous post), so I need to run UAT testing with two different sets of parameters.

Then I run another PowerShell script that changes the Web.config of the application to use Forms authentication instead of Integrated authentication (we use fake users during UAT). After these steps everything is ready, and I can run the UAT tests using the standard Visual Studio Test task, because the release scripts run locally on the machines belonging to the Deployment Group.
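A script that switches the authentication mode can be as small as the following sketch; the file path is an assumption, and it presumes the authentication element with a mode attribute already exists in Web.config:

```powershell
# Illustrative sketch: switch the application to Forms authentication.
$path = "C:\inetpub\MyApp\Web.config"   # assumed install location
[xml]$config = Get-Content $path
$config.configuration.'system.web'.authentication.mode = "Forms"
$config.Save($path)
```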

Most of the steps are peculiar to this specific application; if yours is simpler, like a plain IIS application, the release will probably be even simpler. In my situation I need to install several Windows services, update an IIS application, another Angular application, and so on.

If you configure the release to start automatically as soon as new artifacts are available, you can simply trigger the build and everything will run automatically. Just queue the build and you will end up with a nice release that contains the results of your UAT tests.


Figure 6: Test result summary in release detail.

This technique is superior to running UAT tests during a standard build: first of all you do not need to deal with WinRM, but the real advantage is continuously testing your installation scripts. If for some reason a release script stops working, you will end up with a failing release, or all UAT tests will fail because the application was not installed correctly.

The other big advantage is having the tests run locally with the standard Visual Studio test runner, instead of dealing with remote test execution, which is slower and more error prone.

The final great advantage of this approach is that you gain confidence in your installation scripts, because they run constantly against your code instead of only when you actually release a new version.

As a final note, Deployment Groups is a feature that, at the time I’m writing this post, is available only for VSTS and not for TFS.

Gian Maria.