I built a CI pipeline for my project. And you should too.


In the first post, I briefly described the project that I’m currently working on. In this post, I’m going to share how I built a CI pipeline for my project.

What I used:

  • Azure DevOps
  • Unit & acceptance tests using xUnit and SpecFlow
  • The Testcontainers library
  • Build Quality Checks
  • Built-in container support in the .NET 7 SDK

The book

I’m very grateful to Dave Farley and his book “Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation”, which has had a profound impact on me. I urge every IT professional to read it. This book made me re-think the value of automated testing and turned me into an advocate of Test-Driven Development (TDD), which I now practice in almost every project that I work on. TDD has been a part of my toolset for nearly two years, and throughout that time it has helped me greatly enhance the quality of my work.

… high-quality software is only possible if testing becomes the responsibility of everybody involved in delivering software and is practiced right from the beginning of the project and throughout its life. — Dave Farley

I started applying what I’d learned in my project after reading about a third of the book. It was quite difficult! Writing tests before code turned out to be considerably harder than I had anticipated. Oftentimes, I had the impulse to skip writing tests and jump straight into coding, which I occasionally did but then felt bad afterward. Getting used to something new takes time, and practice makes perfect. Later I came to understand something crucial about TDD that changed my perception of the technique: TDD is more about design than testing. Writing tests before the code helps the programmer see the system from the user’s perspective, making it easier to produce maintainable, short, and focused code. Give TDD a try, I’m sure you will love it!

The book also emphasizes the importance of Continuous Integration (CI), so I decided to implement a CI pipeline for my project using Azure DevOps. It works like this: on every push to the repo, the new changes are automatically picked up by the CI server, which executes the following stages:

  • Determine the build number
  • Check changed files
  • Build and test
  • Create and publish the Docker container
  • Publish artifacts

The stages in the pipeline are designed to catch integration issues early on, which helps keep the code reliable. I’ll describe what happens in each stage next.
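
If you haven’t seen a multi-stage Azure Pipelines definition before, here is a minimal, condensed sketch of the overall shape. It shows only two of the five stages, and the stage, job, and step names are illustrative placeholders, not my exact pipeline:

trigger:
  - main

stages:
  # Each stage needs at least one job with steps to be valid
  - stage: BuildAndTest
    jobs:
      - job: Build
        steps:
          - script: dotnet build --configuration Release
            displayName: Build the solution

  - stage: Containerize
    dependsOn: BuildAndTest
    jobs:
      - job: PublishImage
        steps:
          - script: echo "build and push the Docker image here"
            displayName: Publish Docker image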

Determine the build number

The first stage of the CI pipeline determines the build number, which also serves as the tag for the Docker container. The CI pipeline generates a new version number with each execution. I use Semantic Versioning (MAJOR.MINOR.PATCH).
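
Here is a minimal sketch of one common way to do this in Azure Pipelines (the Major and Minor variables are my own illustration, not necessarily how my pipeline defines them). The name property sets the run’s build number, and $(Rev:r) is a built-in revision counter that increments for each run with the same name prefix:

variables:
  Major: '1'
  Minor: '0'

# Sets $(Build.BuildNumber) to e.g. 1.0.5 on the fifth run of version 1.0
name: $(Major).$(Minor).$(Rev:r)

The resulting $(Build.BuildNumber) can then be reused further down the pipeline, for example as the Docker image tag.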

Check changed files

CI pipelines are meant to give quick feedback to the developer on whether their changes can be integrated with other developers’ changes. The most time-consuming part of a CI pipeline is running tests. However, there are situations when tests can be safely skipped. In my case, I don’t want to run tests when I only change YAML files that are part of the solution.

One of the optimizations that I implemented is to check if any source code files have been modified, and if not, skip the execution of tests.

To perform this kind of check, I use the Changed Files task, which detects which files were modified and makes it easy to apply conditions based on those changes.
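
If you’d rather not pull in a marketplace task, the same check can be hand-rolled with plain git. This is a simplified sketch: it only compares the last commit, and the sourceChanged variable name is illustrative:

# Sets an output variable when anything other than YAML files changed
- script: |
    if git diff --name-only HEAD~1 HEAD | grep -qvE '\.ya?ml$'; then
      echo "##vso[task.setvariable variable=sourceChanged;isOutput=true]true"
    fi
  name: changes
  displayName: Check for non-YAML changes

A later step in the same job can then skip the tests with a condition like eq(variables['changes.sourceChanged'], 'true').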

Build and test

This is the most interesting part of the pipeline where the code is checked out from the repo, built, and tested. The stage includes the following steps:

  • Check out the latest changes from the repo
  • Build the project using dotnet build
  • Run unit tests using dotnet test and collect code coverage with Coverlet
  • Execute acceptance tests
  • Build quality check with BuildQualityChecks task

Two flavors of tests

Unit tests are a good indicator that the code works as the developer intended. However, they are insufficient on their own, since the code must also deliver value to the end user. The best way to ensure this is with acceptance tests. These tests also serve as a regression test suite, verifying that no bugs are introduced by new changes. Acceptance tests usually take much longer to run than unit tests and are the most time-consuming part of my pipeline. However, when all acceptance tests pass, it gives me confidence that the code does what it was designed to do from the user’s perspective.

For my acceptance tests, I chose SpecFlow, a BDD framework for .NET. I have a dedicated project for the specifications, and I use WebApplicationFactory<> to streamline bootstrapping the SUT with TestServer. Each test is executed in isolation; a new instance of the application is spun up for each test, which is why they take much longer than unit tests.

Running tests is a piece of cake with Azure DevOps. Check this out:

- task: DotNetCoreCLI@2
  displayName: "Run unit tests"
  inputs:
    command: test
    projects: "./tests/Untrap.Api.UnitTests/Untrap.Api.UnitTests.csproj"
    arguments: "--configuration $(buildConfiguration) --no-restore --settings ./tests/Untrap.Api.UnitTests/coverlet.runsettings"

- task: DotNetCoreCLI@2
  displayName: "Execute specs"
  inputs:
    command: test
    projects: "./tests/Untrap.Api.Specs/Untrap.Api.Specs.csproj"
    arguments: "--configuration $(buildConfiguration) --no-restore"

How to deal with database reset

My application uses a database — MongoDB. Most likely your application uses a database too. If you’ve ever wondered how to reset the database for your acceptance or integration tests — you’re not alone. I faced this issue early on in my project. My initial solution was to run tests sequentially and reset the database at the end of each test. However, this approach was error-prone, didn’t scale well, and took a long time to execute all the tests.

But thanks to Nick Chapsas’ video about the Testcontainers library, I discovered a much better solution!

Testcontainers for .NET is a library to support tests with throwaway instances of Docker containers for all compatible .NET Standard versions. The library is built on top of the .NET Docker remote API and provides a lightweight implementation to support your test environment in all circumstances.

After integrating this library into my project, I was able to run my acceptance tests in parallel, which cut test execution time by a factor of ten. Most importantly, I no longer had to worry about resetting the database — the library did it for me. If you’ve had a similar problem, I highly recommend checking out Testcontainers on GitHub, and if you like it, give the project a star!

Build quality checks

The BuildQualityChecks task helps me to maintain the high quality of my code by alerting me if any of the following conditions are violated:

  • overall code coverage drops below 95%
  • new warnings are introduced compared to the previous version

- task: BuildQualityChecks@8
  inputs:
    checkWarnings: true
    warningFailOption: 'build'
    allowWarningVariance: true
    warningVariance: '1'
    showStatistics: true
    checkCoverage: true
    coverageFailOption: 'fixed'
    coverageType: 'lines'
    coverageThreshold: '95'

The pipeline terminates if either condition fails.

I like keeping my code warning-free. You know those annoying yellow triangles? I used to ignore them in the past, but I’ve developed an intolerance toward them. Despite using StyleCop and code analyzers, every now and then warnings creep into the code. The CI pipeline takes care of those unnoticed warnings and signals me whenever there are any.

Now that I’ve covered the most important part of my CI pipeline, let’s move on to the next stage.

Create and publish the Docker container

.NET 7 brought an amazing feature: built-in container support in the .NET SDK. The feature lets you ditch the Dockerfile and configure the Docker build properties right in your .csproj!

This is what it looks like in my .csproj:

<PropertyGroup>
  <DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
  <PublishProfile>DefaultContainer</PublishProfile>
  <ContainerBaseImage>mcr.microsoft.com/dotnet/aspnet:7.0</ContainerBaseImage>
  <ContainerImageName>untrap-api</ContainerImageName>
</PropertyGroup>

<ItemGroup>
  <ContainerPort Include="80" Type="tcp" />
</ItemGroup>

Then, the CI pipeline builds and pushes the Docker image to my Azure Container Registry:

- task: CmdLine@2
  displayName: Publish Docker image
  inputs:
    script: |
      dotnet publish ./Untrap.Api.csproj --os linux --arch x64 -p:ContainerImageTag=$(tag)

- task: Docker@2
  displayName: Push Docker image to CR
  inputs:
    containerRegistry: 'AZ CR'
    repository: '$(imageRepository)'
    command: 'push'
    tags: '$(tag)'

Easy as pie, and no Dockerfile is needed!

Publish artifacts

The final stage of my CI pipeline is publishing artifacts, which in my case are Helm chart YAML files. These files are used in the Release pipeline to deploy the app into Kubernetes.
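
This stage is essentially a one-liner with the built-in publish task. A minimal sketch, assuming the chart lives in a charts folder (the path and artifact name are illustrative):

- task: PublishBuildArtifacts@1
  displayName: Publish Helm chart
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)/charts'
    ArtifactName: 'helm'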

To give you an idea of the overall speed: a full run of the pipeline took only 4.5 minutes to run 900 unit tests and 160 acceptance tests, assemble and push a Docker container, and publish the artifacts.

Main takeaways

  • Tests give you peace of mind
  • A CI pipeline streamlines code development and delivery
  • Resetting the database for tests is not an issue — use Testcontainers

In the next post, I’ll show my release pipeline.

Stay tuned, cheers!
