One Team Project to rule them all

A similar post was written a long time ago, but since this is always a hot topic, it is probably time to refresh it with the new UI and new concepts of Azure DevOps.

The subject is: how can I apply security to backlogs if I adopt the strategy of a single Team Project subdivided by teams?

The approach One Team Project to rule them all is still valid today, because, once you have a Team Project, you can divide it with Teams, where each team has its own backlog (or shares a single backlog with other teams), making everything more manageable.

If you adopt the Single Team Project approach, a question of security usually arises: what if I need people of Team A to be able to view only the backlog of Team A, and people of Team B to view only the backlog of Team B? Clearly, if you create more than one Team Project the solution is obvious: if John is in Team Project A and Jane is in Team Project B, each one will see only the backlog of the Team Project he/she belongs to. This happens because each Team Project has its own users, and you can see code, work items and pipelines of a Team Project only if you belong to that Team Project.

A Team Project is useful if you need to segregate information between members, so that members of one Team Project cannot see information of other Team Projects.

If you create a single Team Project and then create Team A and Team B, with the default options you will have two sub areas called TeamProject\Team A and TeamProject\Team B, and two security groups, one for each team. The problem is: if you put John in Team A and Jane in Team B, both of them can see Work Items belonging to both teams.

This happens because once you are added to a Team Project, you are usually added to the security group of the corresponding team, which in turn belongs to a special group called Team Project Contributors, which in turn can see all Work Items, work with code and so on.

Let's recap: this is the dialog you get when you create a new team.

Figure 1: Interface of new Team Creation

There are two important points in Figure 1: the first is that the security group behind the team is [Team Project Name]\Contributors, the second is that a new area path with the name of the team will be created. After you press create, a new security group with the name of the team is created, and that group is made part of the default [Team Project Name]\Contributors group. This implies that each person added to Team A will also be part of Contributors, a special group that has access to all Work Items, code and other resources in the project.

This is the default behavior; you can remove the group in (1), but if you already created the team, this is the situation you get.

Now you need to solve the original problem: members of Team A should see only Work Items of Team A and members of Team B should see only Work Items of Team B. An obvious solution is to change the security of the corresponding area. Just go to the project administration page where you configure areas, as shown in Figure 2.

Figure 2: Administration of the area security for Team A.

Figure 3 shows the actual permissions for Work Items in the area of Team A; as you can see, the Contributors group can edit everything. From that figure you can also see (3) an option that allows this area to inherit all permissions from parent areas.

Figure 3: Contributors permissions for Work Items in the area belonging to Team A.

The first step is to remove inheritance and remove Contributors from the permission list, as shown in Figure 4. To remove the group you can simply press the trash bin icon.

Figure 4: Remove permissions for Contributors Group and disable inheritance for the area.

Now only Administrators, Readers and Build Administrators can access Work Items, and it is time to add the corresponding team to the current area, as shown in Figure 5.

Figure 5: Just start typing in the search textbox to find the team corresponding to the current area. PAY ATTENTION to choose the group that belongs to the right Team Project, because you could have multiple teams called Team A in different Team Projects.

Since you are configuring the Team A area, select the corresponding team called Team A (pay attention to the Team Project name if you have more teams called Team A in different Team Projects). Now you should give it permission to see and edit Work Items in that area, as shown in Figure 6.

Figure 6: Give Team A all permissions to access Work Items in the Team A area.

Now people that are only in Team B cannot access Work Items that are in the Team A area; they cannot even find those Work Items with a query. If you want to explicitly check the permission level of another team, just search for Team B in the same UI and check its effective permissions. As you can see in Figure 7, Team B has no permission in this area.

Figure 7: Permissions of Team B.

It is important that the permissions are “Not set” and not Deny, because a Deny wins over all other settings.

Now you should repeat this configuration for each area and for each team. To recap:

1) Open security settings for the corresponding area
2) Remove Contributors
3) Stop inheritance
4) Add the corresponding team group and give it permissions

Enjoy.

Gian Maria.

Release app with Azure DevOps Multi Stage Pipeline

Multi stage pipelines are still in preview on Azure DevOps, but it is time to experiment with a real build-release pipeline to get a taste of the new features. The biggest limitation at this moment is that you can use multi stage pipelines to deploy to Kubernetes or to the cloud, but there is no support yet for agents inside VMs (like in the standard release engine). This support will be added in the upcoming months, but if you use Azure or Kubernetes as a target you can already use it.

My sample solution is on GitHub; it contains a really basic ASP.NET Core project with some basic REST APIs and a really simple Angular application. One of the advantages of having everything in the repository is that you can simply fork my repository and experiment.

Thanks to multi stage pipelines we can finally have the build-test-release process directly expressed in source code.

First of all you need to enable multi stage pipelines for your account in the Preview Features, by clicking on your user icon in the upper right part of the page.

Figure 1: Enable MultiStage Pipeline with the Preview Features option for your user

Once multi stage pipelines are enabled, all I need to do is to create a nice release file to deploy my app to Azure. The complete file is here https://github.com/alkampfergit/AzureDevopsReleaseSamples/blob/develop/CoreBasicSample/builds/build-and-package.yaml and I will highlight the most important parts here. This is the starting part.

Figure 2: First part of the pipeline

One of the core differences from a standard pipeline file is the structure of jobs: after trigger and variables, instead of directly having jobs, we have a stages section, followed by a list of stages that in turn contain jobs. In this example the first stage is called build_test and it contains all the jobs to build my solution, run some tests and compile the Angular application. Inside a single stage we can have more than one job, and in this particular pipeline I divided the build_test stage into two jobs: the first is devoted to building the ASP.NET Core app, the other builds the Angular application.
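To make the structure easier to visualize, here is a minimal sketch of that shape; the build_test stage name follows the post, while the job names and steps are only illustrative placeholders, not the exact content of the linked file.

```yaml
trigger:
- develop

variables:
  buildConfiguration: 'Release'

stages:
- stage: build_test
  jobs:
  # First job: build and test the ASP.NET Core application
  - job: build_core
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - script: dotnet build --configuration $(buildConfiguration)
      displayName: 'Build ASP.NET Core app'
    - script: dotnet test --configuration $(buildConfiguration)
      displayName: 'Run tests'
  # Second job: build the Angular application; the two jobs can run in parallel
  - job: build_angular
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - script: |
        npm install
        npm run build
      displayName: 'Build Angular app'
```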

Figure 3: Second job of the first stage, building the Angular app.

This part should be familiar to everyone that is used to YAML pipelines, because it is, indeed, a standard sequence of jobs; the only difference is that we put them under a stage. The convenient aspect of having two distinct jobs is that they can run in parallel, reducing the overall compilation time.

If you have groups of tasks that are completely unrelated, it is probably better to split them into multiple jobs and have them run in parallel.

The second stage is much more interesting, because it contains a completely different type of job, called deployment, used to deploy my application.

Figure 4: Second stage, used to deploy the application

The dependsOn section is needed to specify that this stage can run only after the build_test stage has finished. Then the jobs section starts, and it contains a single deployment job. This is a special type of job where you can specify the pool, the name of an environment and then a deployment strategy; in this example I chose the simplest one, a runOnce strategy composed of a list of standard tasks.

If you are asking yourself what the meaning of the environment parameter is, I'll cover it in much more detail in a future post; for this example just ignore it and consider it a way to give a name to the environment you are deploying to.

Multi stage pipelines introduce a new job type called deployment, used to perform the deployment of your application.

All child steps of the deployment job are standard tasks used in a standard release; the only limitation of this version is that they run on the agent, you cannot run them on machines inside the environment (today you cannot add anything other than a Kubernetes cluster to an environment).
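Here is a minimal sketch of the shape of such a stage; the stage, job and environment names are illustrative and the steps are placeholders, the real ones are shown in Figure 5 and in the linked file.

```yaml
# Sketch of the second stage: a deployment job with a runOnce strategy
- stage: deploy
  dependsOn: build_test        # run only after the build_test stage has finished
  jobs:
  - deployment: deploy_website # deployment is a special job type
    pool:
      vmImage: 'ubuntu-latest'
    environment: 'MySampleEnvironment'   # just a name for this example
    strategy:
      runOnce:                 # simplest strategy: run the steps once
        deploy:
          steps:
          - script: echo "deployment tasks go here"
            displayName: 'Placeholder for the real deployment steps'
```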

The nice aspect is that, since this stage depends on build_test, when the deployment section runs it automatically downloads the artifacts produced by the previous stage and places them in the $(Pipeline.Workspace) folder, followed by a subdirectory that has the name of the artifact itself. This removes the need to explicitly transfer artifacts from the first stage (build and test) to the deployment stage.

Figure 5: Steps for deploying my site to Azure.

Deploying the site is really simple: I just unzip the ASP.NET website into a subdirectory called FullSite, then copy all the compiled Angular files into the www folder and finally use a standard AzureRmWebAppDeployment task to deploy the site to my Azure Web App.
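As a rough sketch, the steps inside the deploy section follow this pattern; the artifact names, paths and service connection are hypothetical, and the AzureRmWebAppDeployment inputs are written from memory, so check them against the linked file before reusing them.

```yaml
# These steps live under strategy -> runOnce -> deploy of the deployment job
steps:
# Artifacts of the previous stage are already available under
# $(Pipeline.Workspace)/<artifact name>
- task: ExtractFiles@1
  displayName: 'Unzip ASP.NET Core site into the FullSite folder'
  inputs:
    archiveFilePatterns: '$(Pipeline.Workspace)/website/*.zip'   # hypothetical artifact name
    destinationFolder: '$(Pipeline.Workspace)/FullSite'
- task: CopyFiles@2
  displayName: 'Copy compiled Angular files into the www folder'
  inputs:
    SourceFolder: '$(Pipeline.Workspace)/angular'                # hypothetical artifact name
    TargetFolder: '$(Pipeline.Workspace)/FullSite/www'
- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy the site to the Azure Web App'
  inputs:
    azureSubscription: 'MyAzureServiceConnection'   # hypothetical service connection name
    WebAppName: 'my-sample-web-app'                 # hypothetical web app name
    packageForLinux: '$(Pipeline.Workspace)/FullSite'
```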

Running the pipeline shows a different user interface than a standard build, clearly presenting the result of each distinct stage.

Figure 6: The result of a multi stage pipeline has a different user interface.

I really appreciate this nice graphical representation of how the stages are related. For this example the structure is really simple (two sequential stages), but it clearly shows the flow of deployment and it is invaluable for more complex scenarios. If you click on Jobs you will have the standard view, where all the jobs are listed in chronological order, with the Stage column that allows you to identify in which stage each job was run.

Figure 7: Result of the multi stage pipeline in the Jobs view.

All the rest of the pipeline is pretty much the same as a standard pipeline; the only notable difference is that you need to use the Stages view to download artifacts, because each stage has its own artifacts.

Figure 8: Downloading artifacts is possible only in the Stages view, because each stage has its own artifacts.

Another nice aspect is that you can simply rerun each stage, which is useful in some special situations (like when your site is corrupted and you want to redeploy without rebuilding everything).

Now I only need to check if my site was deployed correctly and… voilà, everything worked as expected, my site is up and running.

Figure 9: Interface of my really simple sample app

Even if multi stage pipelines are still in preview, if you need to deploy to Azure or Kubernetes they can be used without problems; the real limitation of the current implementation is the inability to deploy with agents inside VMs, a real must have if you have on-premises environments.

In the next post I'll deal a little more with Environments.

Gian Maria.

Another gem of Azure DevOps, multi stage pipelines

With the deployment of Sprint 151 we have exciting news for Azure DevOps called multi stage pipelines. If you read my blog you already know that I'm a huge fan of having YAML build definitions, but until now, for the release part, you still had to use the standard graphical editor. Thanks to multi stage pipelines you can now have both build and release definitions directly in a single YAML file.

Multi stage pipelines will be the unified way to create a YAML file that contains both build and release definitions for your projects.

This functionality is still in preview and you can find a good starting point here; basically some key features are still missing, but you can read in a previous post about what's next for them, and this should reassure you that this is an area where Microsoft is investing a lot.

Let's start creating the first real pipeline to deploy an ASP.NET application based on IIS. First of all I'm starting from an existing YAML build: I just create another YAML file and copy in all the YAML of the existing build, but at the head of the file I use a slightly different syntax.

Figure 1: New Multistage pipeline definition

As you can see, the pipeline starts with a node named stages, then a stage section starts that basically contains a standard build; in fact I have one single job in the Build stage, a job called Build_and_package that takes care of building, testing and finally publishing artifacts.
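The head of the file has roughly this shape; the Build stage and the Build_and_package job are the ones mentioned above, while the steps and the second stage are only placeholders (as shown later, the deploy job of this first experiment is fake and does nothing).

```yaml
stages:
- stage: Build
  jobs:
  - job: Build_and_package
    pool:
      vmImage: 'windows-latest'
    steps:
    - script: echo "build, test and publish artifacts here"
      displayName: 'Placeholder for the standard build steps'
- stage: Deploy
  dependsOn: Build
  jobs:
  - job: Deploy_fake
    pool:
      vmImage: 'windows-latest'
    steps:
    - script: echo "fake deploy job, does nothing yet"
      displayName: 'Fake deploy job'
```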

After the pipeline is launched, here is the result (Figure 2):

Figure 2: Result of a multistage pipeline

As you can see the result is really different from a normal pipeline: first of all I can see all the stages (actually my deploy job is fake and does nothing). The pipeline is now composed of stages, where each stage contains jobs and each job is a series of tasks. Clicking on the Jobs section you can see the outcome of each job, which allows me to have a quick look at what really happened.

Figure 3: Job results as a full list of all jobs for each stage.

When it is time to deploy, we target environments; unfortunately in this early preview we can only add a Kubernetes namespace to an environment, but we expect soon to be able to add Virtual Machines through deployment groups and, clearly, Azure Web Apps and other Azure resources.
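For reference, a deployment job targets an environment (and optionally one of its resources, today a Kubernetes namespace) with a syntax along these lines; the environment and namespace names here are purely hypothetical.

```yaml
jobs:
- deployment: deploy_to_k8s
  pool:
    vmImage: 'ubuntu-latest'
  # 'production' is the environment name, 'my-namespace' a Kubernetes namespace
  # resource registered in that environment (both names are hypothetical)
  environment: 'production.my-namespace'
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "deployment steps go here"
          displayName: 'Placeholder deployment step'
```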

I strongly encourage you to start familiarizing yourself with the new syntax, so you will be able to take advantage of this new feature as soon as it is ready.

Gian Maria

Azure DevOps is now 150 sprints old

I remember the old days when Azure DevOps was still in private preview, and yet it was already a really good product; now 150 sprints have passed and the product is better than ever. Not everything is perfect, but, as users, we can expect new features to be deployed every 3 weeks, the duration of a Microsoft sprint.

This means that the product is now 450 weeks old, and we finally got a nice little feature that shows the news on the front page.

Figure 1: Widget with the new features of the latest deployed sprint.

This allows users to be immediately notified of new features in their accounts, with a nice summary of the key ones. In this sprint we have the new Task Assistant to help edit YAML pipelines, and many other new features, like the new agent administration UI.

Figure 2: New agent administration page in action.

The new page is more consistent with the look and feel of the rest of the service, and it also shows wait times and build durations when you drill down into a pool.

As always, I cannot stress enough how good it is to have all of your project administration tools in the cloud: no time spent upgrading, no time spent verifying and checking backup policies, and it is completely free for the first 5 users.

Gian Maria.

How to edit a YAML Azure DevOps Pipeline

I cannot stress enough how much better the experience of having builds defined in code is compared to having build definitions on the server, so I'm here to convince you to move to the new YAML build system in Azure DevOps :).

Having build definitions in code gives you many benefits; the first is that builds evolve with code branches.

If you still think that editing a YAML file is a daunting experience because there are tons of possible tasks and configurations to use, take a peek at the Azure Pipelines extension for Visual Studio Code, which brings IntelliSense for pipeline editing into Visual Studio Code. I strongly encourage you to have a look at the YAML schema reference to get a complete knowledge of the syntax, but for most people a quick approach to the tool is enough, leaving the deep dive for when they need to do complex stuff.

With the extension enabled, after you open a YAML build definition in Visual Studio Code, you can click on the YAML button in the lower right part of the editor to change the language mode.

Figure 1: Language mode selection of Visual Studio Code

That area is the Language Mode Selection, and it is where you tell Visual Studio Code what the language of the file you are editing is. If you simply open a YAML file, VS Code recognizes the yaml extension and helps with standard YAML syntax, but it does not know anything about Azure DevOps Pipelines.

When you tell Visual Studio Code that the file is a YAML pipeline, IntelliSense kicks in and allows you to quickly edit the file.

Thanks to the Language Mode Selector, we can now specify that the file is an Azure Pipelines file and not a standard YAML file.

Figure 2: Selecting the right language type allows VS Code to give you tremendous help in editing the file.

This is everything you need to do; from now on, VS Code will give you help in the context of Azure DevOps pipeline syntax. Even if the file is completely empty, the editor shows you the possible choices for the first-level nodes.

Figure 3: Suggestions on empty file

Since I usually start by specifying the pool, I can simply choose pool, then let VS Code guide me in filling in all the properties.

Figure 4: Intellisense in action editing the file
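To give an idea of where this quickly leads, a minimal skeleton built this way could look like the following (branch, pool and steps are purely illustrative):

```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: dotnet build --configuration Release
  displayName: 'Build the solution'
- script: dotnet test --configuration Release
  displayName: 'Run tests'
```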

In a real scenario you usually start from some template file you already prepared with the standard build for your project (another advantage of having builds in code), but even in that scenario having IntelliSense to refine the build will help you in choosing tasks.

Figure 5: Help in choosing tasks

I can assure you that, after some usage, it is far quicker and more powerful to edit a build with VS Code than to edit a standard build made with tasks in the web-based editor. Graphical editors are powerful and are a good entry point for those who do not know the tool, but IntelliSense-powered editors are more productive and powerful.

Figure 6: IntelliSense does not only help you choose the task, it also shows you information about the task.

The only drawback I found is with custom tasks that are not recognized by IntelliSense, like my GitVersion task, which was marked as wrong because VS Code does not know it.

Figure 7: Custom tasks are not automatically recognized by VS Code.

IntelliSense completely removes the need for the old trick of creating a build with the old editor, placing tasks in the pipeline and then letting the tool generate the YAML definition based on how you configured the tasks in the graphical editor. I assure you that it is faster to directly copy a reference build and then add the needed tasks with IntelliSense in VS Code than to use the UI editor.

If you are really a UI-oriented person, in the latest release of Azure DevOps (at the time of writing the feature is rolling out, so it is not available on all accounts) you can use the YAML Task Assistant.

Figure 8: YAML Task assistant in action

The assistant allows you to configure the task with the very same UI experience you have in UI-based pipelines; once the task is configured you can simply add the corresponding YAML to the definition.

The Task Assistant gives you the same add-task experience of the old UI editor, so you can configure the task with the graphical editor, then add the corresponding YAML syntax to the definition.

I think that with the Task Assistant there are no more excuses not to move to YAML-based definitions.

Gian Maria