Release app with Azure DevOps Multi Stage Pipeline

Multi stage pipelines are still in preview on Azure DevOps, but it is time to experiment with a real build-release pipeline and get a taste of the news. The biggest limit at the moment is that you can use multi stage pipelines to deploy to Kubernetes or to the cloud, but there is no support for agents inside VMs (like the standard release engine has). This support will be added in the upcoming months, but if you target Azure or Kubernetes you can already use it.

My sample solution is on GitHub; it contains a really basic ASP.NET Core project with some basic REST APIs and a really simple Angular application. One of the advantages of having everything in the repository is that you can simply fork my repository and experiment.

Thanks to multi stage pipelines we can finally have the build-test-release process expressed directly in source code.

First of all you need to enable Multi Stage Pipelines for your account in the Preview Features panel, reached by clicking on your user icon in the upper right part of the page.

image

Figure 1: Enable MultiStage Pipeline with the Preview Features option for your user

Once Multi Stage Pipelines is enabled, all I need to do is create a nice release file to deploy my app to Azure. The complete file is here https://github.com/alkampfergit/AzureDevopsReleaseSamples/blob/develop/CoreBasicSample/builds/build-and-package.yaml and I will highlight the most important parts here. This is the starting part.

image 

Figure 2: First part of the pipeline

One of the core differences from a standard pipeline file is the structure of jobs: after trigger and variables, instead of directly having jobs, we have a stages section, followed by a list of stages that in turn contain jobs. In this example the first stage is called build_test; it contains all the jobs to build my solution, run some tests and compile the Angular application. Inside a single stage we can have more than one job, and in this particular pipeline I divided the build_test stage into two jobs: the first is devoted to building the ASP.NET Core app, the other builds the Angular application.
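As a rough sketch of this structure (job names, images and commands are illustrative, following the shape described above, not copied verbatim from my file):

```yaml
trigger:
- develop

variables:
  buildConfiguration: 'Release'

stages:
- stage: build_test
  jobs:
  # First job: build and test the ASP.NET Core app
  - job: build_aspnet
    pool:
      vmImage: 'windows-latest'
    steps:
    - script: dotnet build --configuration $(buildConfiguration)
    - script: dotnet test
  # Second job: build the Angular application, can run in parallel
  - job: build_angular
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - script: |
        npm install
        npm run build
```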

image

Figure 3: Second job of first stage, building angular app.

This part should be familiar to everyone used to YAML pipelines, because it is, indeed, a standard sequence of jobs; the only difference is that we put them under a stage. The convenient aspect of having two distinct jobs is that they can run in parallel, reducing overall compilation time.

If you have groups of tasks that are completely unrelated, it is probably better to divide them into multiple jobs and have them run in parallel.

The second stage is much more interesting, because it contains a completely different type of job, called deployment, used to deploy my application.

image

Figure 4: Second stage, used to deploy the application

The dependsOn section is needed to specify that this stage can run only after the build_test stage has finished. Then the jobs section starts, containing a single deployment job. This is a special type of job where you can specify the pool, the name of an environment and then a deployment strategy; in this example I chose the simplest, a runOnce strategy composed of a list of standard tasks.
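The shape of such a deployment job can be sketched like this (stage, job and environment names are illustrative placeholders):

```yaml
- stage: deploy
  dependsOn: build_test
  jobs:
  - deployment: deploy_website
    pool:
      vmImage: 'windows-latest'
    # Environment gives a name to where you are deploying
    environment: 'MyWebSite'
    strategy:
      runOnce:
        deploy:
          steps:
          # Standard release tasks go here
          - script: echo Deploying...
```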

If you are asking yourself what the meaning of the environment parameter is, I'll cover it in much more depth in a future post; for this example just ignore it and consider it a way to give a name to the environment you are deploying to.

Multi stage pipelines introduce a new job type called deployment, used to perform the deployment of your application.

All child steps of the deployment job are standard tasks used in standard releases; the only limitation of this version is that they run on the agent, you cannot run them on machines inside the environment (you cannot add anything other than a Kubernetes cluster to an environment today).

The nice aspect is that, since this stage depends on build_test, when the deployment section runs it automatically downloads the artifacts produced by the previous stage and places them in the $(Pipeline.Workspace) folder, inside a subdirectory named after the artifact itself. This solves the need to transfer the artifacts of the first stage (build and test) to the deployment stage.

image

Figure 5: Steps for deploying my site to azure.

Deploying the site is really simple: I just unzip the ASP.NET website to a subdirectory called FullSite, then copy all the compiled Angular files into the www folder and finally use a standard AzureRmWebAppDeployment task to deploy the site to my Azure website.
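Those three steps can be sketched roughly like this (the artifact names, the FullSite folder, the web app name and the service connection name are illustrative assumptions, not the exact values from my pipeline):

```yaml
steps:
# Unzip the ASP.NET website into a FullSite subdirectory
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(Pipeline.Workspace)/WebSite/*.zip'
    destinationFolder: '$(Pipeline.Workspace)/FullSite'
# Copy compiled Angular files into the www folder of the site
- task: CopyFiles@2
  inputs:
    sourceFolder: '$(Pipeline.Workspace)/AngularApp'
    targetFolder: '$(Pipeline.Workspace)/FullSite/www'
# Deploy everything to the Azure web app
- task: AzureRmWebAppDeployment@4
  inputs:
    azureSubscription: 'MyAzureConnection'
    WebAppName: 'my-sample-site'
    packageForLinux: '$(Pipeline.Workspace)/FullSite'
```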

Running the pipeline shows you a different user interface than a standard build, clearly showing the result for each distinct stage.

image

Figure 6: Result of a multi stage pipeline has a different User Interface

I really appreciate this nice graphical representation of how the stages are related. For this example the structure is really simple (two sequential stages), but it clearly shows the flow of deployment and it is invaluable for more complex scenarios. If you click on Jobs you will have the standard view, where all the jobs are listed in chronological order, with the Stage column that allows you to identify in which stage each job was run.

image

Figure 7: Result of the multi stage pipeline in jobs view

All the rest of the pipeline is pretty much the same as a standard pipeline; the only notable difference is that you need to use the Stages view to download artifacts, because each stage has its own artifacts.

image

Figure 8: Downloading artifacts is possible only in the Stages view, because each stage has its own artifacts.

Another nice aspect is that you can simply rerun each stage, which is useful in some special situations (like when your site is corrupted and you want to redeploy without rebuilding everything).

Now I only need to check if my site was deployed correctly and … voilà, everything worked as expected, my site is up and running.

image

Figure 9: Interface of my really simple sample app

Even if multi stage pipelines are still in preview, if you need to deploy to Azure or Kubernetes they can be used without problems; the real limitation of the current implementation is the inability to deploy with agents inside VMs, a real must-have if you have on-premises environments.

In the next post I'll deal a little more with Environments.

Gian Maria.

Another gem of Azure Devops, multistage pipelines

With the deployment of Sprint 151 we have exciting news for Azure DevOps called multi stage pipelines. If you read my blog you already know that I'm a huge fan of having YAML build definitions, but until now, for the release part, you still had to use the standard graphical editor. Thanks to multi stage pipelines you can now have both build and release definitions directly in a single YAML file.

Multi stage pipelines will be the unified way to create a YAML file that contains both build and release definitions for your projects.

This functionality is still in preview and you can find a good starting point here; basically we still miss some key features, but you can read in a previous post about what's next for them, and this should reassure you that this is an area where Microsoft is investing a lot.

Let's start creating the first real pipeline to deploy an ASP.NET application based on IIS. First of all I'm starting with an existing YAML build: I just create another YAML file, then copy all the YAML of the existing build, but at the head of the file I use a slightly different syntax.

image

Figure 1: New Multistage pipeline definition

As you can see the pipeline starts with a stages node, then a stage section starts that basically contains a standard build; in fact I have one single job in the Build stage, a job called Build_and_package that takes care of building, testing and finally publishing artifacts.
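The head of such a file looks roughly like this (the pool image and individual steps are illustrative, only the stage and job names follow the description above):

```yaml
stages:
- stage: Build
  jobs:
  - job: Build_and_package
    pool:
      vmImage: 'windows-latest'
    steps:
    - script: dotnet build
    - script: dotnet test
    # Publish the build output so later stages can deploy it
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: 'drop'
```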

After the pipeline is launched, here is what I have as result (Figure 2):

image

Figure 2: Result of a multistage pipeline

As you can see the result is really different from a normal pipeline: first of all I can see all the stages (actually my deploy job is fake and does nothing). The pipeline is now composed of stages, where each stage contains jobs, and each job is a series of tasks. Clicking on the Jobs section you can see the outcome of each job, which allows me to have a quick look at what really happened.

image

Figure 3: Job results as a full list of all jobs for each stage.

When it is time to deploy, we target environments, but unfortunately in this early preview we can only add Kubernetes namespaces to an environment; we expect soon to be able to add virtual machines through deployment groups and clearly Azure web apps and other Azure resources.
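A deployment job targeting a Kubernetes resource in an environment could be sketched like this (environment, namespace and manifest paths are purely illustrative assumptions):

```yaml
- deployment: deploy_to_k8s
  pool:
    vmImage: 'ubuntu-latest'
  # 'environment.resource' syntax targets a namespace added to the environment
  environment: 'production.my-namespace'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: KubernetesManifest@0
          inputs:
            action: 'deploy'
            manifests: 'manifests/deployment.yaml'
```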

I strongly encourage you to start familiarizing yourself with the new syntax, so you will be able to take advantage of this new feature as soon as it is ready.

Gian Maria

Azure DevOps is now 150 sprints old

I remember the old days when Azure DevOps was still in private preview, and yet it was already a really good product; now 150 sprints have passed, and the product is better than ever. Not everything is perfect but, as users, we can expect new features to be deployed every 3 weeks, the duration of a Microsoft sprint.

This means that the product is now 450 weeks old, and we finally got a nice little feature that shows the news on the front page.

image

Figure 1: Widget with new feature of newest deployed sprint

This allows users to be immediately notified of new features in their accounts, with a nice summary of the key new features. In this sprint we have the new Task Assistant to help edit YAML pipelines, and many other new features, like the new agent administration UI.

image

Figure 2: New administration page in action.

The new page is more consistent with the look and feel of the rest of the service; it also shows wait time and build duration when you drill down into a pool.

As always I cannot stress enough how good it is to have all of your project administration tools in the cloud: no time spent upgrading, no time spent verifying and checking backup policies and, on top of that, it is completely free for the first 5 users.

Gian Maria.

How to edit a YAML Azure DevOps Pipeline

I cannot stress enough how much better the experience of having builds defined in code is compared to having build definitions on the server, so I'm here to convince you to move to the new YAML build system in Azure DevOps :).

Having build definitions in code gives you many benefits; the first is that builds evolve with code branches.

If you still think that editing a YAML file is a daunting experience because there are tons of possible tasks and configurations to use, take a peek at the Azure Pipelines extension for Visual Studio Code, which brings intellisense for pipeline editing into Visual Studio Code. I strongly encourage you to have a look at the YAML schema reference to get complete knowledge of the syntax, but for most people a quick approach to the tool is enough, leaving the deep dive for when they need to do complex stuff.

With the extension enabled, after you open a YAML build definition in Visual Studio Code, you can click on the YAML button in the lower right part of the Visual Studio Code editor to change the language.

image

Figure 1: Language mode selection of Visual Studio Code

That area is the Language Mode Selection, and it is where you tell Visual Studio Code the language of the file you are editing. If you simply open a YAML file, VS Code recognizes the yaml extension and helps with standard YAML syntax, but it does not know anything about Azure DevOps pipelines.

When you tell Visual Studio Code that the file is a YAML pipeline, intellisense kicks in and allows you to quickly edit the file.

Thanks to the Language Mode Selector, we can now specify that the file is an Azure Pipelines file and not a standard YAML file.

image

Figure 2: Selecting the right language type allows VS Code to give you tremendous help in editing the file.

This is everything you need to do; from now on, VS Code will help you in the context of Azure DevOps pipeline syntax. Even if the file is completely empty, the editor shows you the possible choices for the first level nodes.

image

Figure 3: Suggestions on empty file

Since I usually start by specifying the pool, I can simply choose pool, then let VS Code guide me through the completion of all its properties.
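A typical beginning of a file completed with the help of intellisense could look like this (the image name and the step are just examples):

```yaml
# Hosted pool, selected by image name
pool:
  vmImage: 'ubuntu-latest'

trigger:
- master

steps:
- script: echo Hello from my first YAML pipeline
```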

image

Figure 4: Intellisense in action editing the file

In real scenarios you usually start from some template file you already prepared with a standard build for your projects (another advantage of having builds in code), but even in that scenario having intellisense to refine the build will help you in choosing tasks.

image

Figure 5: Help in choosing tasks

I can assure you that, after some usage, it is far quicker and more powerful to edit a build with VS Code than to edit a standard build made with tasks in the web based editor. Graphical editors are powerful and are a good entry point for those who do not know the instrument, but intellisense powered editors are more productive and powerful.

image

Figure 6: You do not only have intellisense to choose the task, it also shows you information about the task

The only drawback I found is with custom tasks that are not recognized by the intellisense, like my GitVersion task, which was marked as wrong because VS Code does not know it.

image

Figure 7: Custom tasks were not automatically recognized by VS Code

Intellisense completely removes the need for the old trick of creating a build with the old editor, placing tasks in the pipeline and then letting the tool generate the YAML definition based on how you configured the tasks in the graphical editor. I assure you that it is faster to directly copy a reference build and then add the needed tasks with intellisense in VS Code than to use the UI editor.

If you are really a UI oriented person, in the latest release of Azure DevOps (at the time of writing the feature is rolling out, so it is not available on all accounts) you can use the YAML Task Assistant.

image

Figure 8: YAML Task assistant in action

The assistant allows you to configure the task with the very same UI experience you have in UI based pipelines; once the task is configured you can simply add the corresponding YAML to the definition.

The Task Assistant gives you the same task add experience of the old UI editor, so you can configure the task with the graphical editor, then add the corresponding YAML syntax to the definition.

I think that with the Task Assistant there are no more excuses not to move to YAML based definitions.

Gian Maria

Troubleshoot YAML Build first run

Scenario: you create a branch in your Git repository to start with a new shiny YAML build definition for Azure DevOps, you create a YAML file, push the branch to Azure DevOps and create a new build based on that YAML definition. Everything seems ok, but when you press the run button you get an error:

Could not find a pool with name Default. The pool does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.

image

Figure 1: Error running your new shiny pipeline

Ok, this is frustrating, and following the link gives you little clue about what really happened. The problem is that, with the new editor experience, when you navigate to the pipeline page, all you see is the YAML build editor and nothing more.

image

Figure 2: New Editor page of YAML pipeline, advanced editor and nothing more.

The new editor is fantastic, but it somewhat hides the standard configuration parameters page, where the default branch can be set. As you can see from Figure 2, you can specify the pool name (default) and triggers directly in the YAML build, so you think that this is everything you need, but there is more. Clicking on the three dots in the upper right corner you can click on the Triggers menu item to open the old editor.

image

Figure 3: Clicking on the Triggers menu item will bring on the old UI

This is where the YAML pipeline experience still needs some love: you are surely puzzled about why you need to click the Triggers menu item if you already specified triggers directly in the YAML definition, but the reason is simple, it opens the old pipeline editor page.

The new page with the YAML editor is fantastic, but you should not forget that there are still some parameters, like the default branch, that are editable only from the old interface.

The Triggers page is not really useful, it only gives you the ability to override the YAML configuration, but the important aspect is that we can now access the first tab of the YAML configuration to change the default branch.

image

Figure 4: The Triggers page is not useful, but now we can access the default configuration for the pipeline.

image

Figure 5: Default configuration tab where you can edit default branch

In Figure 5 you can now understand what went wrong: the wizard created my pipeline using master as the default branch, but clearly my build YAML file does not exist in master, it exists only in my feature branch. Just change the default branch to the branch that contains your build definition file, save and queue again; now everything should work again.

This trick also works when you get errors about not being authorized to use endpoints, like a Sonar endpoint, a NuGet endpoint etc.

Happy YAML Building experience.

Gian Maria.