VSTS Package, packages failed to publish

I have a build that publishes NuGet packages to MyGet. We decided to move our packages to VSTS internal Package Management, so I simply added another build task that pushes the packages to the VSTS internal feed. Sadly enough, I got a really generic error:

Error: An unexpected error occurred while trying to push the package with VstsNuGetPush.exe.
Packages failed to publish

Those two errors do not give any real information about what went wrong but, looking at the whole log, I verified that the error happens when the task tries to publish the symbol packages (2).

Figure 1: Complete error log

It turns out that VSTS Package Management is not yet capable of hosting symbol packages, and this is what generates the error.

To solve this problem you can avoid generating symbol packages, or you can simply avoid pushing them to the VSTS feed. In the future VSTS feeds will probably support symbol packages and this problem will be gone.
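
If you still want the symbol packages on MyGet, a hedged example of how to exclude them from the VSTS push: recent versions of the NuGet push task accept minimatch exclusion patterns in the field that selects the packages to publish, so a pattern similar to the following pushes every .nupkg except the symbol ones (adapt the path to where your build drops the packages):

$(Build.ArtifactStagingDirectory)/**/*.nupkg;!$(Build.ArtifactStagingDirectory)/**/*.symbols.nupkg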

In the meanwhile there is an issue on GitHub where we asked the team for a much clearer error message, one that immediately points out the problem during the push.

Gian Maria.

Migrate your TFS to VSTS

I’ve discussed a lot with many customers about the benefits of VSTS over TFS, especially for small companies where there is no budget for a dedicated TFS administrator. The usual risk is not updating TFS, losing the update train and then facing a painful upgrade such as TFS 2008 to TFS 2017.

For those realities, adopting VSTS is a huge benefit: no administration costs, no hardware costs, automatic upgrades, accessibility from everywhere, the same licensing (licenses for VSTS are also valid for TFS) and much more. Also, one of the original limitations, the inability to customize the process, is now gone and, in certain respects, VSTS is superior to the on-premise version (in VSTS you can do less customization, but everything is done through the web interface, without the need to edit XML process files).

If you are a small company, VSTS is the perfect tool: it contains everything for DevOps, has zero maintenance cost, is extensible and is free for the first 5 users.

To perform the migration, Microsoft released a dedicated tool that does a complete migration; you can find the details at https://www.visualstudio.com/team-services/migrate-tfs-vsts/. The process is well documented, with a dedicated step-by-step guide.

Figure 1: Step by step migration guide download.

Please read the guide carefully and verify the TFS version required to be able to migrate. As you can see from Figure 1, you can migrate from TFS 2017 Update 2, TFS 2017 Update 3 and TFS 2018 RTW. Please notice that you cannot use the migration tool on a TFS version different from the ones declared by the tool: the tool will be updated to support new versions of TFS, and it will support only the most recent ones. If you have an old TFS instance, the suggestion is to upgrade it as soon as possible, then plan for the migration.

If you need support, check one of the official DevOps partners at http://devopsms.com/SearchVSTSPartner; they can support your company for a smooth and successful migration, especially if you are a large organization with a big TFS instance.

Gian Maria

Converting a regular build to a YAML build

YAML build in VSTS / TFS is one of the most welcome features of the Continuous Integration engine, because it really opens up many new possibilities. Two of the most important advantages of this approach are: build definitions follow branches, so each branch can have a different definition; and, since the build is in the code, everything is audited, you can modify a build through a pull request and you can test a different build in a branch exactly as you do with code.

Given that build definition in code is a HUGE step forward, especially for people with a strong DevOps attitude, the common question is: how can I convert all of my existing builds to YAML builds?

Edit: this is the nice aspect of VSTS, it really moves forward fast. I prepared this post while this feature was in preview, when converting took a little bit of effort; now converting a build is really simple, because you have a nice View YAML button while editing a build.

Figure 1: YAML definition of an existing build can be seen directly from build editing.

This button opens the corresponding YAML definition for the current build, making it straightforward to convert existing builds.

This is VSTS, it moves forward so fast 🙂 and if you want to write a post about a preview feature, you often need to be really quick, before new functionality makes the post obsolete.

Old way (before the View YAML button)

While the feature was in early preview, converting was more manual: we did not have an automatic tool to do it, but with a little bit of patience the operation was not too complex.

First of all, the engine behind YAML builds is the very same as for standard builds, only the syntax is different; thus every standard build can be converted to a YAML build, you only need to understand the right syntax. This is how I proceeded to convert a definition.

First step: install an agent on your local machine, which is a matter of minutes, then delete everything in the _work folder and finally queue a build with an agent.name demand equal to the name of your agent. This schedules the regular build on your machine, and in the _work folder you will find a _tasks subfolder that contains the definitions of all the tasks used in your build. This greatly simplifies finding the name and version of every task used in your build, because you can easily open the folder with Visual Studio Code and browse everything.
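
For reference, written in the demand syntax VSTS uses, the request looks something like this (MYMACHINE is a hypothetical agent name):

agent.name -equals MYMACHINE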

Figure 2: List of tasks downloaded by the agent to execute the build.

Then you can simply move to the History tab of the original build definition, where you can choose “Compare Difference” to easily view the JSON definition of the build, which contains all the tasks' parameters and their values in a super easy way.

Figure 3: Thanks to the Compare Difference feature you can quickly have a look at the actual definition and all the parameters.

Figure 4: All task parameters are included in the definition and can be used to generate the YAML build file.
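
Armed with the task names, versions and parameters, you can write the YAML file by hand. What follows is only a minimal sketch of what a converted definition could look like (queue name, task versions and inputs are illustrative; take the real ones from the _tasks folder and from the JSON definition):

queue: Hosted VS2017
steps:
- task: NuGetCommand@2
  inputs:
    command: restore
    restoreSolution: '**/*.sln'
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    configuration: Release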

Happy building.

Gian Maria.

Multitargeting in DotNetCore and Linux and Mac builds

One of the most important features of DotNetStandard is the ability to run on Linux and Mac, but if you need to use a DotNetStandard compiled library in a project that uses the full .NET Framework, sometimes you can have little problems. You can indeed reference a dll compiled for DotNetCore from a project that uses the full framework, but in a couple of projects we experienced some trouble with certain assemblies.

Thanks to multitargeting you can instruct the DotNet compiler to produce libraries compiled against different versions of the framework: you can tell the compiler that you want both DotNetStandard 2.0 and full framework 4.6.1 simply by modifying the project file to use the TargetFrameworks tag to request compilation for multiple frameworks.

<TargetFrameworks>netstandard2.0;net461</TargetFrameworks>

Et voilà, now if you run the dotnet build command you will find both versions of the assembly in the output folder.

Figure 1: Multiple versions of the same library, compiled for different versions of the framework.

Multitargeting allows you to produce libraries compiled for different versions of the framework in a single build.

The nice aspect of multitargeting is that you can use the dotnet pack command to request the creation of NuGet packages: the generated packages contain libraries for every version of the framework you chose.
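
As an example, a typical invocation that creates the packages in a dedicated output folder (the folder name is just an example) is:

dotnet pack -c Release -o ./artifacts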

Figure 2: The package published on MyGet contains both versions of the framework.

The only problem with this approach arises when you try to compile a multitargeted project on Linux or on Macintosh, because the compiler is unable to compile for the full framework, which can be installed only on Windows machines. To solve this problem you should remember that .csproj files of DotNetCore projects are really similar to standard MsBuild project files, so you can use conditional options. This is how I defined multitargeting in a project.

Figure 3: Conditional multitargeting

The Condition attribute instructs the compiler to consider that XML node only if the condition is true, and with the dollar syntax you can reference environment variables. The above example can be read this way: if the DOTNETCORE_MULTITARGET environment variable is defined and equal to true, the compiler will generate netstandard2.0 and net461 libraries; otherwise (no variable defined, or defined with a false value) the compiler will generate only netstandard2.0 libraries.
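
As a sketch, the conditional definition shown in Figure 3 looks something like this (the exact condition in the real project may differ slightly):

<TargetFrameworks Condition=" '$(DOTNETCORE_MULTITARGET)' == 'true' ">netstandard2.0;net461</TargetFrameworks>
<TargetFrameworks Condition=" '$(DOTNETCORE_MULTITARGET)' != 'true' ">netstandard2.0</TargetFrameworks>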

Using the Condition attribute you can choose the target frameworks with an environment variable

All the people with Windows machines can define this variable as true, and every project that uses this syntax is automatically compiled for both frameworks. On the contrary, all the people using Linux or Macintosh can work perfectly with only the netstandard2.0 version, simply by not defining the variable.
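
On Windows, a quick way to define the variable once for your user profile (newly started processes will pick it up) is:

setx DOTNETCORE_MULTITARGET true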

The risk of this solution is that, if you always work on Linux, you can potentially introduce code that compiles for netstandard2.0 but not for net461: even if that has not happened to us yet, working only on Linux or Mac means the code is never compiled and tested against the full framework. The solution to this problem is simple: create a build in VSTS that is executed on a Windows agent and remember to set DOTNETCORE_MULTITARGET to true, to be sure that the build targets all the desired frameworks.

Figure 4: Use build variables to set environment variables during the build

Thanks to the VSTS / TFS build system it is super easy to define DOTNETCORE_MULTITARGET at the build level, and you can decide at each build whether the value is true or false (so you are also able to trigger a build that publishes packages only for netstandard2.0). In this build I usually publish the NuGet packages to a MyGet feed automatically; thanks to GitVersion, the numbering is automatic.

Figure 5: Package published as pre-release.

This publishes a pre-release package at each commit, so I can test it immediately. Everything is done automatically and runs in parallel with the build running on Linux, so I’m always 100% sure that the code compiles on both Windows and Linux and that tests are 100% green on each operating system.

Gian Maria.

Configure a VSTS Linux agent with docker in minutes

It is really simple to create a build agent for VSTS that runs on Linux and is capable of building and packaging your DotNetCore projects. I explained everything in a previous post, but I want to remind you that, with Docker, the whole process is really simple.

Everyone knows that setting up a build machine often takes time. VSTS makes it super simple to install the agent: just download a zip, call a script to configure the agent, and the game is done. But this is only one side of the story. Once the agent is up, if you fire a build it will fail if you did not install all the tools needed to compile your project (the .NET Framework, for instance), and often you need to install the whole Visual Studio environment because you have specific dependencies. I also have code that needs MongoDB and Sql Server to run tests against those two databases, which usually requires more manual work to set everything up.

In this situation Docker is your lifesaver, because it allowed me to set up a build agent on Linux in less than one minute.

Here are the steps. First of all, the unit tests use environment variables to grab the connection strings to MongoDB, MSSQL and every external service they need. This is a key part, because each build agent can set those environment variables to point to the right servers. You may think that 99% of the time the connection is something like mongodb://localhost:27017/, because the build agent usually has MongoDB installed locally to speed up the tests, but you cannot be sure, so it is better to leave each agent the ability to change those variables.
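
As an example, on an agent machine where the services run locally, the variables could be defined like this (values are illustrative and match the containers started below):

export TEST_MONGODB=mongodb://localhost:27017/
export TEST_MSSQL='Server=localhost;user id=sa;password=my_password'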

With this prerequisite, I installed a plain Ubuntu machine and then installed Docker. Once Docker is up and running, I just fire up three Docker containers. The first one is the Mongo database:

sudo docker run -d -p 27017:27017 --restart unless-stopped --name mongommapv1 mongo

Then, thanks to Microsoft, I can run Sql Server on Linux in a container; here is the second Docker container, running MSSQL:

sudo docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=my_password' -p 1433:1433 --name msssql --restart=unless-stopped -d microsoft/mssql-server-linux

This starts a container with Microsoft Sql Server listening on the standard port 1433, with the sa user and password my_password. Finally, I start the Docker agent for VSTS:

sudo docker run \
  -e VSTS_ACCOUNT=prxm \
  -e VSTS_TOKEN=xxx \
  -e TEST_MONGODB=mongodb://172.17.0.1 \
  -e TEST_MSSQL='Server=172.17.0.1;user id=sa;password=my_password' \
  -e VSTS_AGENT='schismatrix' \
  -e VSTS_POOL=linux \
  --restart unless-stopped \
  --name vsts-agent-prxm \
  -it microsoft/vsts-agent

Thanks to the -e option I can specify any environment variable I want; this allows me to pass the TEST_MSSQL and TEST_MONGODB variables to the third Docker container, the VSTS agent. The IPs used for MongoDB and MSSQL belong to a special interface called docker0, a virtual network interface shared by the Docker containers.

Figure 1: Configuration of the docker0 interface on the host machine

Since I configured the containers to bridge the Mongo and SQL ports to the same ports of the host, I can access MongoDB and MSSQL directly through the docker0 IP address of the host. You can use docker inspect to find the exact IP of a container on this subnet, but you can simply use the IP of the host.
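
If you prefer to query the exact container IP, a sketch of the docker inspect invocation for the default bridge network is:

sudo docker inspect -f '{{ .NetworkSettings.IPAddress }}' mongommapv1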

Figure 2: Connecting to the MongoDB instance

With just three commands my agent is up and running and capable of executing builds that require external database engines to verify the code.

This is the perfect technique to spin up a new build server in minutes (apart from the time my network needed to download the Docker images 🙂 ) with a few lines of code, and on a machine that has no UI (clearly you want to do a minimal Linux installation to have only the things you need).

Gian Maria.