Last Updated: 12 September 2022
A common use case is to build multiple Docker images inside a pipeline, often from the same code repo or Visual Studio solution. The pipeline tasks that come out-of-the-box with Azure DevOps work well where there is a one-to-one mapping between repo and Docker image, but not so well in this case.
This is because the Docker steps require you to explicitly specify the path to the Dockerfile, and only allow exactly one Dockerfile. As a result, you will find yourself creating inefficient pipelines as you copy and paste bits of yaml to deal with each Docker image that you want to build. The desire to be DRY ('don't repeat yourself') is thwarted and you eventually give up trying to parameterize the Dockerfile paths, output image names, tags and architectures.
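To illustrate the repetition, here is a sketch of what building just two of three images looks like with the out-of-the-box Docker@2 task (the service connection name and repository names here are placeholders, not values from this article):

```yaml
# Each image needs its own near-identical copy of the step.
# 'my-registry-connection' and the repository names are illustrative.
- task: Docker@2
  inputs:
    containerRegistry: 'my-registry-connection'
    command: 'buildAndPush'
    repository: 'demo-web'
    dockerfile: 'ParkSquare.Demo.WebApp/Dockerfile'
    tags: 'latest'

- task: Docker@2
  inputs:
    containerRegistry: 'my-registry-connection'
    command: 'buildAndPush'
    repository: 'demo-api'
    dockerfile: 'ParkSquare.Demo.Api/Dockerfile'
    tags: 'latest'

# ...and a third copy for each additional image.
```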
Our build task is the result of several years of in-house development and use of our own tools, which are now available to all users of Azure DevOps.
We do not rely on the Azure DevOps 'Service Connection' paradigm, which makes the task easier to script and re-use. Instead, you pass in the credentials of your container registry directly. We recommend that you store any sensitive credentials in Azure Key Vault, or in library variable secrets.
Consider this fairly standard C# solution and project layout:
```
ParkSquare.Demo (folder)
|-- ParkSquare.Demo.sln
|-- azure-pipelines.yml
|   ...
|
|-- ParkSquare.Demo.WebApp (folder)
|   |-- ParkSquare.Demo.csproj
|   |-- SomeFiles.cs
|   |   ...
|   |-- Dockerfile
|
|-- ParkSquare.Demo.Api (folder)
|   |-- ParkSquare.Demo.Api.csproj
|   |-- MoreFiles.cs
|   |   ...
|   |-- Dockerfile
|
|-- ParkSquare.Demo.BackendDaemon (folder)
    |-- ParkSquare.Demo.BackendDaemon.csproj
    |-- YetMoreFiles.cs
    |   ...
    |-- Dockerfile
```
As you can see, there is one solution and three projects - one for the frontend web app, one for an API, and another for some backend processing daemon. Each one builds into its own container. How best to handle this in the Azure DevOps pipeline yaml?
One solution would be to have three separate pipelines, using path filters to prevent changes to one project from triggering unwanted builds of the others. This can quickly get unwieldy, as the pipeline yaml becomes complicated, specific, and less reusable. It will also cause problems with tools like SonarCloud, as different code could be built and analysed each time, causing SonarCloud statistics to fluctuate wildly. Further, it becomes harder to manage versioning as each component is built at a different time.
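As a rough sketch of that approach, each of the three pipelines would carry its own trigger filtered to its own folder (the branch name and path here are illustrative):

```yaml
# One of three separate pipelines, each masked to a single project folder.
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - ParkSquare.Demo.Api/*
```

Multiply this by three pipelines, keep the filters in sync as folders are added or renamed, and the maintenance burden becomes clear.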
Another option is to split the solution into three repositories, one for each project. The resulting build pipelines remain fairly straightforward, but any assemblies that are shared across the three projects will need managing separately, e.g. using a private Nuget feed. Rebuilding the common code each time is possible, but generally undesirable.
The most elegant solution is to use our Multi Docker Build task. This allows you to retain the solution structure, and build all three images in one step.
To use our task, simply install the extension into your organization and add the task to your pipeline using the UI or by copy/pasting our example YAML.
|Parameter|Required|Description|
|---|---|---|
|registryUrl|Y|URL of your container registry.|
|registryUsername|Y|Username to connect to registry.|
|registryPassword|Y|Password to connect to registry.|
|pushAfterBuild|Y|If true, will also push the image to your registry. If false, will just build it.|
|imagesToBuild|Y|List of images to build, one per line.|
|tags|Y|List of tags to apply to the image, one per line.|
In all cases, variables are permitted and will be evaluated correctly.
Here you specify one or more images that you want to build, one specification per line. The format is:

```
<docker file>|<image name>|<context path>
```
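For example, the three Dockerfiles in the demo solution above might be listed like this (the image names and context paths are illustrative, not prescribed values):

```
ParkSquare.Demo.WebApp/Dockerfile|demo-web|.
ParkSquare.Demo.Api/Dockerfile|demo-api|.
ParkSquare.Demo.BackendDaemon/Dockerfile|demo-daemon|.
```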
When building the image, you can specify any tags to be applied. We recommend tagging with a specific version and also 'latest'. This is the default.
```yaml
- task: multiDockerBuild@1
  displayName: 'Demo Task'
  inputs:
    registryUrl: '$(privateDockerRepoUrl)'
    registryUsername: '$(privateDockerRepoUsername)'
    registryPassword: '$(privateDockerRepoPassword)'
    repositoryName: 'my-repo'
    imagesToBuild: |
      ParkSquare.Demo.Api/Dockerfile|$(Build.Repository.Name)-api|$(Build.ArtifactStagingDirectory)/Api
      ParkSquare.Demo.WebApp/Dockerfile|$(Build.Repository.Name)-web|$(Build.ArtifactStagingDirectory)/WebApp
      ParkSquare.Demo.Daemon/Dockerfile|$(Build.Repository.Name)-daemon|$(Build.ArtifactStagingDirectory)/Daemon
    tags: |
      $(Build.BuildNumber)
      latest
      something-else
    pushAfterBuild: true
```
A free trial is available to everybody for 30 days so you can try out our extension. Only one free trial is permitted per organization. Simply start using our build task and your trial period will automatically start when you run your first build.
After your free trial ends, you can upgrade to the full version without having to reinstall or reconfigure anything! Your pipelines will 'just work' exactly as they did during the trial, without any modifications.
|License Type|Annual Cost|Builds Permitted|
|---|---|---|
|Personal|FREE|5 per month|
|Small (1-5 Developers)|£129.00|Unlimited|
|Medium (6-10 Developers)|£299.00|Unlimited|
|Corporate (11+ Developers)|£499.00|Unlimited|
Team size is the total number of Azure DevOps users that have read or write access to any code repository accessible by your Azure DevOps pipeline(s).
All prices are exclusive of VAT at the prevailing rate where applicable.
We accept payment by credit or debit card, bank transfer in the UK and international bank transfers in GBP (Pounds Sterling).
Individual developers can use our extension free of charge; however, this is limited to 5 builds (i.e. pipeline executions) per month. To qualify, you must be working on your own hobby or not-for-payment projects. This edition is not permitted for commercial projects, or where any user is receiving any form of payment related to their project.
Contact us to get your free license!
We can also offer technical support packages related to this extension, including setting up and configuring multi docker builds in your Azure DevOps pipelines.
Please contact us with your requirements.
Upgrade to the full version by contacting us. No additional work is involved once upgraded: your pipelines will keep working exactly as before!
For license terms and conditions, please refer to License Terms.