Why I love DevOps and hate DevSecOps

DevOps is becoming a buzzword: it generates hype and everyone wants to be part of it, even without knowing exactly what DevOps is. One symptom of this is the “DevOps Engineer”, a title that does not fit in my head.

We could debate for days or years on the right definition of DevOps, but essentially it is a cultural approach to building software, focused on building the right thing with maximum quality and customer satisfaction.

Remember, DevOps is a cultural approach based on transparency and inclusion, not a set of tools and practices.

In my head DevOps is not really different from Agile; it just has a different perspective. And I know, many of you are probably crying out loud, because plenty of gurus, articles, and sites explain the difference between Agile and DevOps, but I simply do not care. If you think that DevOps is only about continuous deployment, tools, and practices, you are probably wrong. I can admit that tools and practices can be a backbone of DevOps culture, but everything should start with culture, collaboration, inclusion, and transparency; tools and practices come later in the game.

What I care about, as a professional who gets paid to create software, is the satisfaction of the Customer, because it brings me more work and makes me proud of what I do; after all, in this industry, we all love our work. I read “The Goal” many years ago, and it is still so relevant; the Theory of Constraints is still relevant, even in software. In my mind DevOps is just another attempt to change the approach to making software for the good of the team, ops, Customer, and users.

Given that, what is a DevOps Engineer? Creating DevOps-prefixed roles in a DevOps culture is really bad, because every person on the team is part of DevOps: Customer + Developers + Operations.

DevOps culture permeates the work environment, and we do not need DevOps XXX roles; everyone is part of the DevOps culture, and if you feel the urge to find a DevOps Engineer, it just means that other members of the team are outside the DevOps culture. A DevOps XXX role is just a patch over your culture problem, and it simply will not work.

Given that, since I love DevOps, I hate every DevXXXOps, because there is nothing more to add to DevOps.

This is why I hate DevSecOps with all my heart; since DevOps is now a buzzword, as is Security, why not create a super buzzword like DevSecOps?

Let me be crystal clear: if you claim that your organization has a DevOps culture and you do not care about security, you are doing it dead wrong. Security is paramount; it should be part of every profession, practice, and culture, and it should permeate every part of the software lifecycle. If you think that you need to add Sec to DevOps, it just means that you are not caring about security in your culture, and this is a HUGE problem that should be addressed before bringing a new buzzword into the game.

Gian Maria.

WIQL editor extension for Azure DevOps

One of the nice features of Azure DevOps is extensibility: thanks to the REST API you can write add-ins or standalone programs that interact with the service. One of the add-ins I like the most is the Work Item Query Language Editor, a nice add-in that allows you to interact directly with the underlying syntax of Work Item queries.
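
To give an idea of what “raw WIQL over REST” looks like, here is a minimal PowerShell sketch that runs a WIQL query through the Azure DevOps REST API. Organization, project, and personal access token are placeholders you need to replace; the exact `api-version` may differ on your instance.

```powershell
# Sketch: run a raw WIQL query against Azure DevOps via REST.
# $organization, $project and $pat are placeholders (assumptions).
$organization = "your-organization"
$project      = "YourTeamProject"
$pat          = "your-personal-access-token"

# Basic auth header built from an empty user name plus the PAT.
$token  = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$header = @{ Authorization = "Basic $token" }

$body = @{
    query = "SELECT [System.Id], [System.Title] FROM WorkItems WHERE [System.WorkItemType] = 'Bug' ORDER BY [System.ChangedDate] DESC"
} | ConvertTo-Json

$url = "https://dev.azure.com/$organization/$project/_apis/wit/wiql?api-version=6.0"
$result = Invoke-RestMethod -Uri $url -Method Post -Headers $header -ContentType "application/json" -Body $body

# The response contains only work item references; field values
# must be fetched with a separate call to the work items endpoint.
$result.workItems | Select-Object -First 10 id
```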

Once installed, whenever you are in the query editor you have the ability to directly edit the query with WIQL syntax, thanks to the “Edit Query wiql” menu entry.

Figure 1: Wiql query editor new menu entry in action

As you can see in Figure 2, there are lots of nice features in this add-in, not only the ability to edit a query directly in WIQL syntax.

Figure 2: WIQL editor in action

You can clearly edit and save the query (3), but you can also export the query to a file that will be downloaded to your PC, and then re-import it into a different Team Project. This is a nice function if you want to store some typical queries somewhere (source control) and re-import them into a different Team Project, or into a different organization.

If you start editing the query, you will be amazed by the IntelliSense support (Figure 3), which guides you in writing a correct query; it is really useful because it contains a nice list of all available fields.

Figure 3: IntelliSense in action during query editing.

The IntelliSense seems to actually use the API to grab the list of all valid fields, because it even suggests custom fields that you used in your custom process. The only drawback is that it lists all available fields, not only those available in the current Team Project, but this is a really minor issue.

With IntelliSense, syntax checking, and field suggestions, this add-in is really a must-install in your Azure DevOps instance.

Figure 4: IntelliSense is available not only on default fields, but also on custom fields used in a custom process.

If you are interested in the editor used, you will find that this add-in uses the Monaco editor, another nice piece of open source software by Microsoft.

Another super cool feature of this extension is the Query Playground, where you can simply type your query, execute it, and visualize the result directly in the browser.

Figure 5: WIQL playground in action; note the ASOF operator used to issue a query in the past.

As you can see from Figure 5, you can easily test your query, but what is most important, the ASOF operator is fully supported, and this guarantees the ability to run historical queries directly from the web interface, instead of resorting to the API. If you need to experiment with WIQL and quickly create and test a WIQL query, this is the tool to go to.
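
For reference, a historical query with ASOF looks like the following sketch (field names are the standard System ones; the accepted date format may vary by locale, so verify it in the playground):

```sql
SELECT [System.Id], [System.Title], [System.State]
FROM WorkItems
WHERE [System.WorkItemType] = 'Bug'
  AND [System.State] = 'Active'
ASOF '2019-01-01'
```

This returns the bugs that were active on that date, regardless of their current state.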

I think that this add-in is really useful, not only if you are interacting with the service through the REST API and raw WIQL, but also because it allows you to export/import queries between projects and organizations, and to simply execute historical queries directly from the UI.

Having full support for WIQL allows you to use features that are not usually available through the UI, like the ASOF operator.

As a last trick: if you create a query in the web UI, then edit it with this add-in, add an ASOF operator, and save, the ASOF will be saved in the query, so you have a historical query executable from the UI. The only drawback is that, if you modify the query with the web editor and then save, the ASOF operator will be removed.

Gian Maria.

Deploy click-once application on Azure Blob with Azure DevOps

It was a long time ago that I blogged about how to publish a click-once application from a VSTS build to Azure Blob storage; a long time has passed, and lots of stuff has changed. The whole process is now simpler, thanks to many dedicated tasks that avoid any manual work.

My new build always starts with a GitVersion custom task, which populates some environment variables with version numbers generated by GitVersion; this allows me to simply add an MSBuild task to the build to publish click-once using automatic GitVersion versioning.

Figure 1: MSBuild task to publish the click-once application

You need to build the csproj that contains the project, using $(BuildConfiguration) to build in the right configuration and adding custom arguments (3) as in the following list:

/target:publish 
/p:ApplicationVersion=$(AssemblyVersion)  
/p:PublishURL=https://aaaaaaa.blob.core.windows.net/atest/logviewer/  
/p:UpdateEnabled=true  
/p:UpdateMode=Foreground 
/p:ProductName=Jarvis.LogViewer 

ApplicationVersion is set to the variable $(AssemblyVersion), which was populated by the GitVersion task; the publish URL is simply the address of an Azure blob storage account that contains a blob container called atest, with a folder named logviewer that has public read access.

Figure 2: Public read access for a simple blob in my Azure subscription

This allows me to have a simple public blob, with a public address, that everyone can read, so everyone can download my application.

An Azure blob can be given public read access, making your click-once application available to everyone.

The MSBuild task simply creates an app.publish subfolder that contains all the files that need to be copied into the Azure blob storage; the first step is a Copy Files task to copy them into the artifacts directory, as in the following picture.

Figure 3: All files of the click-once publishing were copied to the artifacts staging directory.

Now everything is ready; just add a final task of type Azure File Copy and configure it to copy everything from the artifacts directory to the right subfolder of the Azure blob.

Figure 4: Configuration of the Azure File Copy task to deploy click-once generated files.

Configuration is really simple: thanks to a direct connection to an Azure subscription (3) we can simply connect Azure DevOps to the subscription, then select the storage account (4), the blob container name (5), and the subfolder of the blob (6).

One of the advantages of the dedicated Azure File Copy task is that it can simply use one of the Azure accounts linked to the Azure DevOps account, making it super simple to choose the right blob to upload the click-once package to.
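
For those who prefer YAML builds over the classic designer, the same three steps could be sketched like this. Task versions, input names, and the paths and service connection name below are assumptions to verify against your own Azure DevOps instance:

```yaml
steps:
- task: MSBuild@1
  inputs:
    solution: 'src/LogViewer/LogViewer.csproj'   # assumption: path to the csproj
    configuration: '$(BuildConfiguration)'
    msbuildArguments: >-
      /target:publish
      /p:ApplicationVersion=$(AssemblyVersion)
      /p:PublishURL=https://aaaaaaa.blob.core.windows.net/atest/logviewer/
      /p:UpdateEnabled=true
      /p:UpdateMode=Foreground
      /p:ProductName=Jarvis.LogViewer

- task: CopyFiles@2
  inputs:
    contents: '**/app.publish/**'
    targetFolder: '$(Build.ArtifactStagingDirectory)'

- task: AzureFileCopy@4
  inputs:
    sourcePath: '$(Build.ArtifactStagingDirectory)'
    azureSubscription: 'MyAzureConnection'       # assumption: service connection name
    destination: 'AzureBlob'
    storage: 'aaaaaaa'
    containerName: 'atest'
    blobPrefix: 'logviewer'
```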

Once you’ve finished configuring, you can simply run a build and your application will be automatically published.

Figure 5: Summary of the build with click once publishing.

Now you can simply point to the setup.exe file at the address of the blob and the game is done.

Gian Maria.

Create a release in TFS 2015 / VSTS Release Management

This is the end of the journey of the last series of posts. I’m now at the point where I have a build that produces a single zip file with everything I need to deploy the software, and a bunch of PowerShell scripts that release the software using that zip as a source artifact.

Now it is time to automate the process with Release Management. I want to use RM not only because the process is automated on a chain of environments, but also because I get traceability, auditing, and verification of the release procedures. I’m not going to cover all the steps needed to create a release definition; I want to focus on how simple creating a Release Management process is when you adopt the Zip+PowerShell approach. I strongly suggest having a look at the official site if you have never heard of VSTS / TFS Release Management.

If you have a zip and PowerShell files that release the software using only the zip, you are ten minutes away from a working Release Management definition.

Let’s start creating a first release; the most important aspect is adding a build as an artifact source, which allows the release to consume what is produced by the build.

Figure 1: Add your build as an artifact source for the release process

This build produces a couple of artifacts: the zip file with everything, and the set of PowerShell scripts needed to install the software (as I suggested, they should always be stored in source control and published as build artifacts). Thus the release process is really simple and is composed of only three steps.

Figure 2: The entire release process is composed of only three scripts

The first two steps are used to copy the zip file and installation scripts into a folder of the target machine (c:\deploy\jarvis\…). As you can see, I’m using a local administrator user, so this technique can be used even if the machine is not in the domain. You should now run this script on the machine where the agent is running:

Set-Item WSMan:\localhost\Client\TrustedHosts -Value WebTest -Force

This instruction should be run on the machine where the build agent is running, and it specifies that the target machine (WebTest) is a trusted host. If you are using domain credentials, this is not needed.

The password is stored inside a Release variable, using the secret variable feature, so no one who can edit this release can see the password in clear text.

Figure 3: Password for target machine is stored as secret variable in Configuration section
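
On the script side, a secret release variable passed as an argument can be turned into a credential for the remote operations. This is only a sketch; parameter names and the machine/user names are assumptions:

```powershell
param(
    [string] $TargetMachine,   # e.g. WebTest
    [string] $UserName,        # e.g. WebTest\Administrator
    [string] $Password         # passed from the secret release variable
)

# Build a PSCredential from the plain-text password received as argument.
$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($UserName, $securePassword)

# Use the credential to run commands on the target machine.
Invoke-Command -ComputerName $TargetMachine -Credential $credential -ScriptBlock {
    Write-Output "Connected to $env:COMPUTERNAME"
}
```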

The final step in Figure 2 uses a task to run the installation script on a remote machine.

Figure 4: Run PowerShell on a remote machine to install the software

If you have already tested the script manually, there is no reason why it should fail in the RM process. Remember that you should never use commands that interact with the shell, and you should use Write-Output instead of Write-Host in your PowerShell scripts to be sure that no shell is used.

With only three tasks I’ve created a release definition for my software.

Now you can trigger a new release, or have each build trigger a new release on the dev machine. The really interesting aspect is that, using GitVersion and GitFlow, when you select the build you want to release, you can choose the branch you want to deploy instead of guessing what is in a build from its number.

Figure 5: Triggering a release manually allows you to specify the build you want to release, using a semantic version like 1.5.0 or 1.5.0-beta1 instead of a generic build number

As you can see from Figure 5, you can easily choose the version of the software you want to release; it is really clear from the name of the build. If the deploy is set to occur automatically, the release starts immediately.

Figure 6: Release is in progress

Once the release is in progress, from the summary you can verify which version of the software is released in which environment. I have only one environment for this demo, and I can verify that version 1.5.0 was deployed to that environment 4 minutes ago.

Figure 7: New release was deployed successfully

I can now verify that the new version was really deployed correctly by opening the software on the target machine.

Figure 8: The new version of the software is correctly deployed

Once you have everything in place and use the simple PowerShell approach, setting up a Release Management process is really straightforward.

Gian Maria.

How to manage PowerShell installation scripts

In a previous post I explained how I like to release software using a simple paradigm:

the build produces one zip file with everything needed for a release, then a PowerShell script accepts the path of this zipped release plus installation parameters and executes every step to install/upgrade the software.
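
A skeleton of such an installation script could look like the following sketch; parameter names, the process name, and the paths are assumptions for illustration:

```powershell
param(
    [Parameter(Mandatory=$true)]
    [string] $DeployFileName,      # path of the zipped release produced by the build
    [Parameter(Mandatory=$true)]
    [string] $InstallationRoot     # e.g. c:\jarvis\logviewer
)

# Stop the application if a previous version is running (hypothetical process name).
Get-Process -Name "Jarvis.LogViewer" -ErrorAction SilentlyContinue | Stop-Process -Force

# Make sure the installation folder exists, then unzip the release over it.
if (-not (Test-Path $InstallationRoot)) {
    New-Item -ItemType Directory -Path $InstallationRoot | Out-Null
}
Expand-Archive -Path $DeployFileName -DestinationPath $InstallationRoot -Force

# Use Write-Output, not Write-Host, so the script works without a shell.
Write-Output "Installation of $DeployFileName completed in $InstallationRoot"
```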

This approach has numerous advantages; first of all, you can always test the script with PowerShell ISE on a developer machine. Just download the version you want to test from the build artifacts, load the installation script in PowerShell ISE, then run the script; if something goes wrong (the script has a bug or needs to be updated) just debug and modify it until it works.

My suggestion is to use Virtual Machines with snapshots. The process is:

restore a snapshot of the machine without the software installed, then run the script; if some error occurs, just restore the snapshot, fix the script, and run it again.

You can do the very same thing using a snapshot of a VM where a previous version of the software is installed, so you can verify that the script works for an upgrade, not only for a fresh installation. This is a really simple process that does not involve any tool related to any release technology.

The ability to debug the script using a VM and snapshots is a big speedup for release script development. If you are using some third-party engine for Release Management, you will probably need to trigger a real release to verify your script. Another advantage is that this process allows you to do a manual installation, where you can simply launch the script and manually verify that everything is good.

You should store all scripts in source control; this allows you to:

1) Keep scripts aligned with the version of the software they install: if a modification of the software requires a change in the installation scripts, they are maintained together.
2) Publish the script directory as build artifacts, so each build contains both the zipped release and the scripts needed to install it.
3) Use the history of the scripts to track their entire lifecycle, so you can understand why someone changed the script in version Y of your software.
4) Test installation scripts during builds, or run them during a build for a quick release to a test environment.

The final advantage is that the script runs on every Windows machine, without the need for tasks, agents, or other stuff. Once the script is ready, you first test it in the DEV, QA, and production environments manually. Manual installation is really simple: just download the artifacts from the build, run the script, check the script log, and manually verify that everything is OK. If something goes bad (in DEV or QA), open a Bug and let developers fix the script until everything is OK.

Once the script starts to be stable, you can proceed to automate the release.

Gian Maria.