Windows Docker Container for Azure DevOps Build Agent

Thanks to Docker Compose, I can spin up an agent for Azure DevOps in mere seconds (once all the images are available). All I need to do is insert the address of my account and a valid token, and an agent is ready.

With .NET Core everything is simple, because we have a nice build task that automatically installs the .NET Core SDK on the agent, and the very same is true for Node.js. This approach is really nice because it does not require preinstalling too much stuff on your agent; everything is downloaded and installed on the fly when a build needs that specific tooling.

.NET Core, Node.js, and other SDKs can be installed directly from the build definition, without any requirement on the machine where the agent is running
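
For example, in a YAML pipeline this on-the-fly installation looks more or less like the following sketch (the version numbers are placeholders to adapt to your project):

steps:
  # Installs the requested .NET Core SDK on the agent before the build runs
  - task: UseDotNet@2
    inputs:
      packageType: 'sdk'
      version: '3.1.x'

  # Same idea for Node.js
  - task: NodeTool@0
    inputs:
      versionSpec: '12.x'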

In my projects I also need to build solutions based on the full .NET Framework SDK, and clearly I want to use the very same Docker approach for agents. All I need is a machine running Windows Server 2019 with Docker enabled, and to look on the internet for the needed Dockerfiles or pre-built images on a public registry.

I started with a nice article that explains how to install Build Tools inside a container. The article gives you a Dockerfile that basically does everything: just run docker build and the image is ready to be used.

The only real modification I needed was excluding most of the packages I'm not using in my projects; I also changed the base image to use .NET Framework 4.8 instead of 4.7.2.

Figure 1: RUN instruction to install build tools inside a Docker Image.
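
For reference, here is a condensed sketch of what that RUN instruction does, adapted from the Microsoft documentation; the aka.ms channel URL and the workload list are the values to adjust for your project, and the workload list is exactly where I trimmed the packages I do not need:

# escape=`
FROM mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019

SHELL ["cmd", "/S", "/C"]

RUN `
    # Download the Build Tools bootstrapper
    curl -SL --output vs_buildtools.exe https://aka.ms/vs/16/release/vs_buildtools.exe `
    # Install only the workloads the projects actually need
    && (start /w vs_buildtools.exe --quiet --wait --norestart --nocache `
        --installPath C:\BuildTools `
        --add Microsoft.VisualStudio.Workload.MSBuildTools `
        || IF "%ERRORLEVEL%"=="3010" EXIT 0) `
    # Remove the bootstrapper to keep the image small
    && del /q vs_buildtools.exe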

Be prepared to wait a little bit for the image to be built, because it downloads all the packages from the network and then installs them into the image. To build the container image from the Dockerfile you can simply type:

docker build -t buildtools2019:1.0 -m 4GB .

Now I need to modify my Azure DevOps agent Dockerfile, built following the MSDN instructions, and change the base image to FROM buildtools2019:1.0. This bases the agent image on the new image that contains the full .NET Framework SDK and all the build tools, so the agent can build full framework projects.
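
In practice the only change to the Dockerfile from the MSDN walkthrough is the FROM line; the agent bootstrap stays identical (start.ps1 is the download-and-configure script from that walkthrough):

FROM buildtools2019:1.0

WORKDIR /azp
COPY start.ps1 .
CMD powershell .\start.ps1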

Creating a Docker image with all the tools needed for the full framework SDK was really easy.

The other challenge in creating your build agent in Docker is having all the requirements available as other Docker images. For SQL Server I rebuilt the container on top of the Windows Server Core 2019 image and it runs just fine; for MongoDB I enabled the experimental feature that mixes Linux and Windows containers, so I used a Linux-based MongoDB image with no problem.

My only remaining dependency was Elasticsearch: I'm using an old version (2.4) and I'd like to have that image running in a Windows container. It is not difficult to find some nice guy who has published Dockerfiles for Elasticsearch; at https://github.com/StefanScherer/dockerfiles-windows you can find lots of interesting images for Windows. In my situation I only needed Elasticsearch, so I took the image, modified it to install the version I need, and the game was done.

Now I need some specific plugins installed in Elasticsearch for my integration tests to run; these are the situations where Docker shows its full power. Since I already have an image with Elasticsearch, I only need to create a Dockerfile based on that image that installs the required plugin.

FROM elasticsearch_custom:2.4.6.1

RUN "C:\elasticsearch\bin\plugin.bat" install delete-by-query

Wow: just two lines of code, a docker build command, and my Elasticsearch image is ready to be used.
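
The build command, run from the folder containing that Dockerfile, is the same as before, just with a different tag (the same name referenced later in the compose file):

docker build -t elasticsearch_jarvis:1.0 .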

Figure 2: All my images ready to be used in a Docker compose file

Now everything you need to do is create the Docker Compose file; here is an example:

version: '2.4'

services:
  agent:
    image: azdoagent:1.0
    environment:
      - AZP_TOKEN=****q
      - AZP_POOL=docker
      - AZP_AGENT_NAME=TestAlkampfer
      - AZP_URL=https://dev.azure.com/prxm
      - NSTORE_MONGODB=mongodb://mongo/nstoretest
      - NSTORE_MSSQL=Server=mssql;user id=sa;password=sqlPw3$secure
      - TEST_MONGODB=mongodb://mongo/
      - TEST_ES=http://es:9200

  mssql:
    image: mssqlon2019:latest
    environment:
      - sa_password=sqlPw3$secure
      - ACCEPT_EULA=Y
    ports:
      - "1433:1433"

  mongo:
    platform: linux
    image: mongo
    ports:
      - "27017:27017"

  es:
    image: elasticsearch_jarvis:1.0
    ports: 
      - "9200:9200"

Once the file is ready, just start an agent with

docker-compose -f docker-compose.jarvis.yml up -d

and your agent is up and running.

Figure 3: All docker images up and running

Then I went to my C# project and included all the Dockerfiles needed to create the images, as well as the Docker Compose file, directly in the source repository. This allows everyone to build the Docker images locally and spin up an agent with all the dependencies needed to build the project.

If your organization uses Azure or any other container registry (you can also push to a public registry, because none of these images contains sensitive data), spinning up the agent is only a matter of starting a Docker Compose instance.

After a couple of minutes (the time needed by the Azure DevOps agent to download everything and configure itself) my builds start running.

Figure 4: A build of a full framework solution, with integration tests on MongoDB and Elasticsearch, running in my Docker container.

From now on, every time I need another agent to build my project, I can simply spin up another docker-compose instance, and in less than one minute I have another agent up and running.

Gian Maria.

Why I love DevOps and hate DevSecOps

DevOps is becoming a buzzword: it generates hype and everyone wants to be part of it, even those who do not know exactly what DevOps is. One of the symptoms of this is the “DevOps Engineer”, a title that does not fit in my head.

We could debate for days or years about the right definition of DevOps, but essentially it is a cultural approach to building software, focused on building the right thing with maximum quality and satisfaction for the customer.

Remember, DevOps is a cultural approach based on transparency and inclusion, not a set of tools/practices

In my head DevOps is nothing really different from Agile; it just has a different perspective. And I know, most of you are probably crying out loud, because we have had lots of gurus, articles, and sites telling you the difference between Agile and DevOps, but I simply do not care. If you think that DevOps is only about continuous deployment, tools, and practices, you are probably wrong. I can admit that tools and practices can be a backbone of DevOps culture, but everything should start with culture, collaboration, inclusion, and transparency; tools and practices come later in the game.

What I care about, as a professional who gets paid to create software, is the satisfaction of the customer, because it brings me more work and makes me proud of what I do; after all, in this industry, we all love our work. I read “The Goal” many years ago, and it is still so relevant; the Theory of Constraints is still relevant, even in software. In my mind DevOps is just another attempt at changing the approach to making software for the good of the team, ops, customer, and users.

Given that, what is a DevOps Engineer? Handing out DevOps-prefixed roles in a DevOps culture is really bad, because every person on the team is part of DevOps: customer + developers + operations.

DevOps culture permeates the work environment, and we do not need DevOps XXX roles; everyone is part of the DevOps culture, and if you feel the urge to find a DevOps Engineer, it just means that the other members of the team are outside the DevOps culture. A DevOps XXX is just a patch over your culture problem, and it simply will not work.

Given that, since I love DevOps, I hate every DevXXXOps, because there is nothing more to add to DevOps.

This is why I hate DevSecOps with all my heart; since DevOps is now a buzzword, and so is security, why not create a super buzzword like DevSecOps?

Let me be crystal clear: if you claim that your organization has a DevOps culture and you do not care about security, you are doing it dead wrong. Security is paramount; it should be part of every professional, every practice, and every culture, and it should permeate every part of the software lifecycle. If you think you need to add Sec to DevOps, it just means you are not caring about security in your culture, and that is a HUGE problem that should be addressed before bringing a new buzzword into the game.

Gian Maria.

Home Made Zero Trust Security?

I have a small office with some computers and servers, and since I'm a fan of Zero Trust security, I have the firewall enabled even on the local network. I'm especially concerned about my primary workstation, a Windows machine where I have explicitly created firewall rules to block EVERY packet coming from other machines on the network.

I have backups and I have antivirus, but that machine is important and I do not want it to be compromised. Working with a rule that blocks every inbound connection is nice and keeps the machine secure, but sometimes it is inconvenient.

Zero Trust security can be complex, but for small offices it can be implemented with home-made tools. In my scenario I'm pretty annoyed because there are situations where I want to access my home network over VPN and then use my workstation over RDP; but since every port is closed, no RDP for me.

The alternative is to leave the RDP port always open for all local machines, or for the VPN IP subnet, but that is not something I like. What about someone cracking my VPN? Once inside the VPN, they could try to attack my RDP port, something I want to prevent.

To solve this problem I've authored a really stupid program to manage the hardening of my machine: https://github.com/alkampfergit/StupidFirewallManager. It is a proof of concept, but it is built on a simple idea: to conditionally open a TCP port based on a secret, the program listens on some UDP ports, and if it receives a message containing a specific secret, it opens the related TCP port, only for the caller's IP and only for a certain amount of time.

Here is a configuration:

{
  "Rules" : [
    {
      "Name": "Rdp",
      "UdpPort": 23457,
      "TcpPort": 3389,
      "Secret": "this_is_a_secret"
    }
  ]
}

This means that I want a rule called Rdp that listens on UDP port 23457; when another computer sends a single UDP packet on that port with the expected secret, the tool opens TCP port 3389 for that IP for a couple of hours.
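
Under the hood the effect is equivalent to creating a scoped firewall rule with netsh; here is a hypothetical example of the kind of rule the tool creates (rule name, timestamp, GUID, and caller IP are all made up for illustration):

REM Allow RDP (TCP 3389) only from the caller 192.168.1.50;
REM the rule name embeds the expiration timestamp and a GUID
netsh advfirewall firewall add rule ^
    name="Rdp_202012251830_9f8c2d1e-0a3b-4c5d-8e7f-112233445566" ^
    dir=in action=allow protocol=TCP localport=3389 ^
    remoteip=192.168.1.50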

You can find the code in the above project, but pay attention: as soon as the program starts, it adds rules that close all ports. Please be sure to use this POC only on a computer to which you have physical access.

Figure 1: All ports blocked upon program execution

In Figure 1 you can see that the tool added two rules to deny access to every port that is not in the controlled range. This basically leaves only UDP port 23457 open; every TCP port except 3389 is explicitly closed by those deny rules, while 3389 itself simply stays closed until an allow rule is created for it.

If I try to connect to my machine over RDP from another machine on my local network, I get no response.

Figure 2: Cannot access my workstation with RDP

This happens because the tool explicitly deleted all rules for port 3389 and denies access to every other port. If I want to connect to that machine I need to fire up the client and ask for the port to be opened.

Figure 3: Client program used to ask for port opening

Now, if you got both port and secret right, a new rule is created.

Figure 4: New rule created upon correctly matching port and secret

The name of the rule is unique (it ends with a GUID) and embeds the expiration date in the form yyyyMMddHHmm; a timer in the server part of the program checks the firewall rules every minute, finds all the rules matching this pattern, and removes the ones that have expired.
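
When the timer finds a rule whose embedded timestamp has passed, removing it is a single call (same hypothetical rule name as in the sketch above):

REM Executed by the cleanup timer once the timestamp in the name has expired
netsh advfirewall firewall delete rule name="Rdp_202012251830_9f8c2d1e-0a3b-4c5d-8e7f-112233445566"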

Figure 5: New rule is created scoped for only the computer that sent correct UDP packet with correct secret.

If you look at the new rule, its scope is limited to the caller computer.

This simple POC allows me to remotely manage my firewall, opening only predetermined ports, for a predetermined amount of time, and only for the specific IP that knows the secret.

This is really nothing more than a proof of concept, but if you want to apply Zero Trust security in your environment and have no commercial or enterprise-ready solution, learning how to manage local firewalls with your own code is a good starting point to strongly limit the attack surface of your local network. These kinds of techniques strongly limit lateral movement in your network.

Gian Maria.

Azure DevOps agent with Docker Compose

I've dealt in the past with using Docker for your Azure DevOps Linux build agent, in a post called Configure a VSTS Linux agent with docker in minutes, and I've also blogged about how you can use Docker inside a build definition to provide prerequisites for testing (like MongoDB and SQL Server). Now it is time to move a little step further and leverage Docker Compose.

Using Docker commands in a pipeline definition is nice, but it has some drawbacks. First of all, this approach suffers in execution speed, because the containers must start each time you run a build (and should be stopped at the end of the build). It is true that if the Docker image is already present on the agent machine the startup time is not that high, but some images, like MS SQL, are not immediately operative, so you need to wait for them to be ready on every build. The alternative is to leave them running even after the build has finished, but this could lead to resource exhaustion.

Another problem is the dependency on the Docker engine. If I include Docker commands in the build definition, I can build only on a machine that has Docker installed. If most of my projects use MongoDB, MS SQL, and Redis, I could simply install all three on my build machine, maybe using a fast SSD as storage. In that scenario I expect to use my physical instances, not to wait for Docker to spin up new containers.

Including Docker commands in the pipeline definition is nice, but it ties the pipeline to Docker and can carry a penalty in execution speed

What I'd like to do is leverage Docker to spin up an agent and all the needed dependencies once, then use that agent with a standard build that does not require Docker. This gives me the flexibility of setting up a build machine with everything preinstalled, or of simply using Docker to spin up, in seconds, an agent that can build my code. Removing the Docker dependency from the pipeline definition gives users the most flexibility.

For my first experiment I also want to use Docker on Windows Server 2019 to leverage Windows containers.

First of all, you can read the nice MSDN article about how to create a Windows Docker image that downloads, installs, and runs an agent inside a Windows Server machine with Docker for Windows. This allows you to spin up a new Windows-based Docker agent in minutes (just the time to download and configure the agent).

Thanks to Windows Containers, running an Azure DevOps agent based on Windows is a simple Docker Run command.
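
With the image built from that article (tagged dockeragent:latest, as in the compose file below), starting an agent is something like this; URL, token, and agent name are placeholders:

docker run -e AZP_URL=https://dev.azure.com/yourorganization ^
           -e AZP_TOKEN=<your-personal-access-token> ^
           -e AZP_AGENT_NAME=mydockeragent ^
           dockeragent:latest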

Now I need that agent to be able to use MongoDB and MS SQL to run integration tests. Clearly I could install both database engines on my host machine and let the Docker agent use them, but since I already have my agent in Docker, I want the dependencies to run in Docker too; so… welcome Docker Compose.

Thanks to Docker Compose, I can define a YAML file with a list of images that are part of a single scenario, so I specified an agent image followed by SQL Server and MongoDB images. The beauty of Docker Compose is the ability to refer to the other containers by name. Let's do an example: here is my complete Docker Compose YAML file.

version: '2.4'

services:
  agent:
    image: dockeragent:latest
    environment:
      - AZP_TOKEN=efk5g3j344xfizar12duju65r34llyw4n7707r17h1$36o6pxsa4q
      - AZP_AGENT_NAME=mydockeragent
      - AZP_URL=https://dev.azure.com/gianmariaricci
      - NSTORE_MONGODB=mongodb://mongo/nstoretest
      - NSTORE_MSSQL=Server=mssql;user id=sa;password=sqlPw3$secure

  mssql:
    image: mssqlon2019:latest
    environment:
      - sa_password=sqlPw3$secure
      - ACCEPT_EULA=Y
    ports:
      - "1433:1433"

  mongo:
    platform: linux
    image: mongo
    ports:
      - "27017:27017"

To simplify everything, all of my integration tests that need a connection string to MS SQL or MongoDB grab it from an environment variable. This is convenient because each developer can use the database instances of their choice, but it also makes it super easy to configure a Docker agent by specifying database connection strings, as seen in Figure 1. I can put the connection strings to use for testing in environment variables, and I can simply use the other Docker service names directly in the connection strings.

Figure 1: Environment variable to specify connection string.

As you can see (1), the connection strings refer to the other containers by name; nothing could be easier.

The real advantage of using Docker Compose is the ability to include the Docker Compose file (as well as the Dockerfiles for all the custom images you might need) inside your source code. With this approach you can keep in the repository both the build pipelines, leveraging Azure DevOps YAML builds, and the configuration of the agent with all its dependencies.

Since you can configure as many agents as you want for Azure DevOps (you actually pay for the number of concurrently executing pipelines), thanks to Docker Compose you can set up an agent suitable for your project in less than a minute. But this is optional: if you do not like Docker Compose, you can simply set up an agent manually, just as you did before.

Including a Docker Compose file in your source code allows consumers of the code to start a compatible agent with a single command.
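
Assuming the compose file is committed as docker-compose.yml in the repository root, that single command is:

docker-compose -f docker-compose.yml up -d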

Thanks to Docker Compose, you pay the price of pulling the images only once, and you also pay only once the time needed for an image to become operative (like MS SQL or other databases that need a little while before they can satisfy requests). After everything is up and running, your agent is operative and can immediately run your standard builds: no Docker reference inside your YAML build file, no time wasted waiting for images to become operational.

Thanks to an experimental feature of Windows Server 2019, I was able to specify a docker-compose file that contains not only Windows images but also Linux images. The only problem I had is that I did not find a Windows 2019 image for SQL Server. I started getting errors using the standard MS SQL images (built for Windows 2016), so I decided to download the official Dockerfile, change the referenced base image, and recreate the image; everything worked like a charm.

Figure 2: Small modification to Sql Server docker windows image file, targeting Windows Server 2019.
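
A minimal sketch of the kind of change shown in Figure 2 (the original base tag is from memory; check the official Dockerfile for the exact value):

# The official Dockerfile targeted Windows Server 2016 with something like:
# FROM microsoft/windowsservercore:ltsc2016
# Retargeted to Windows Server 2019:
FROM mcr.microsoft.com/windows/servercore:ltsc2019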

Since it is used only for tests, I'm pretty confident that it should work, and indeed my build runs just fine.

Figure 3: My build result, run on the Docker agent.

My agent created with Docker Compose is absolutely equal to all the other agents; from the point of view of Azure DevOps there is nothing different about it, but I started it with a single Docker Compose command.

Figure 4: An agent running in Docker is treated as a standard agent.

That's all: with a little effort I'm able to include in my source code both the YAML build definition and the YAML Docker Compose file that specifies the agent with all the prerequisites to run the build. This is especially useful for open source projects, where you want to fork a project and then activate CI with no effort.

Gian Maria.

Check for Malware in an Azure DevOps Pipeline

In a previous post I showed Credential Scanner, a special task from Microsoft Security Code Analysis, available in Azure; today I want to have a quick peek at the Anti Malware scanner task.

First of all, a simple consideration: I've been asked several times whether there is any need for antivirus or anti-malware tools on build machines; after all, the code being built is developed by your own developers, so there should be no need for such tools, right? In my opinion this is a false assumption; here are some quick considerations on how malware can end up on your build machine.

1) If you build open source code where others can contribute, and there is no constant analysis of the code, it is simple for a malicious user to modify a script or a YAML build to do something nasty to your build machine. OK, this is really an edge case, but think of an angry employee who got fired and wants to damage the company…

2) NuGet, npm, and in general every package manager download stuff from the internet; this alone should be enough to justify keeping an antivirus on your build agent. An npm package you are using can be hijacked to download anything, and, generally speaking, everything can go south when you download stuff from the internet. I know that npm and NuGet probably do some checking of packages, but there is no real formal approval process, so I think no one can guarantee that everything coming from NuGet, npm, or the like is safe.

3) Custom tasks in Azure DevOps are also downloaded from the server, but in this situation the risk is mitigated, because Microsoft checks the products that are in the marketplace.

In my opinion, since a build agent downloads stuff from the internet and executes scripts written by humans, it is better to have a good security solution constantly monitoring the agent working folder.

OK, point 2 is the real risk. To mitigate it, the only solution is to point to private NuGet or npm repositories and double-check every package that you allow from nuget.org or the main npm repository. The goal: before a new version of a library is allowed to be used, someone should check that it introduces no risks. npm is especially annoying, because npm install can automatically update libraries; this is why in a build you should always prefer npm ci over npm install.
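
In an Azure DevOps YAML pipeline, that preference is a one-line step (a sketch; display name and working directory are up to you):

steps:
  # npm ci installs exactly what package-lock.json pins,
  # while npm install may resolve and pull newer versions
  - script: npm ci
    displayName: 'Restore npm packages from the lock file'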

Generally speaking, in my opinion it is better to have an antivirus on your build machine and to be 100% sure that the agent working folder is constantly monitored.

To add an extra level of security, I'd also like to have a report in my build certifying that the output of the build is safe: welcome, Microsoft Malware Scanner Analyzer. This is another task from Microsoft Security Code Analysis, whose purpose is to scan a specific folder and report the analysis in the build.

Task configuration is quite simple; you can usually leave all the default settings and you are ready to go.

Figure 1: Configuration of Malware scanner Task

The only real parameter you want to configure is the path you want to scan, usually the artifacts directory, so you can be confident that the output of the build that will be uploaded to the service is malware free. Having another antivirus, as I said before, gives you double security: the standard antivirus kicks in automatically, and this task performs another check and uploads the result to the build.
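
For YAML pipelines the same configuration should translate to something like the sketch below; I'm writing task and input names from memory of the Microsoft Security Code Analysis extension, so treat them as assumptions and verify them against the extension documentation:

- task: AntiMalware@3
  inputs:
    ScanType: 'CustomScan'
    ScanDirPath: '$(Build.ArtifactStagingDirectory)'  # scan what will be published
    EnableServices: true                              # start Defender services if stopped
    TreatStaleSignatureAs: 'Warning'                  # do not fail the build on old signatures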

Figure 2: Output of Malware scanner in build output

Call me paranoid, but I really like having assurance that my artifacts are secure. I know perfectly well that this is not a 100% guarantee that everything is good, but it is a good place to start.

Another nice aspect of the tool is that the output of the scan is also included as an artifact.

Figure 3: Anti-malware scanner log uploaded as an artifact.

This allows everyone who downloads the artifacts for installation to check the output of the scanner.

Remember: when it comes to security, a double check is better than a single check.

Gian Maria