Imagine a very simple distributed application comprising the following:
- A back-end tier that implements a single REST endpoint
- A front end consisting of a static HTML page that invokes the endpoint and renders the data
- A code repository maintained on GitHub
- Build and deployment pipelines on Azure DevOps, with the application deployed to Azure App Service and an Azure Storage Account
In this article, I describe step by step how to implement a robust and repeatable Continuous Integration (CI) and Continuous Deployment (CD) process which takes code commits through build and unit tests, followed by deployment to various environments. Links to the accompanying code are at the very end. The environments are:
- DEV - This is where code that has been committed by the developer but not yet merged into the master branch is deployed. This environment gives the developer an opportunity to test her changes and get quick feedback. Pull request review comments are addressed and verified here.
- UAT - After the pull request has been completed, the code is merged into the master branch. This environment gives the developers and product owners an opportunity to examine the product and possibly recommend improvements.
- PROD - When the product owner is happy, she approves and triggers the deployment of the master branch. This is a gated stage and might take several factors into account, e.g. deploy during quieter hours, do not deploy during long weekends.
You want your team to follow a pull request workflow to ensure good quality code commits. You want the PR branch to be deployed to a DEV environment: an environment that is safe and allows rapid feedback, thereby leading to better product quality. The PR deployment should be automated as much as possible.
The accompanying code in this article uses Azure PowerShell and Azure CLI to implement the automation of the infrastructure. You do not need any third-party product, just the following:
- Azure CLI
- PowerShell Core
I recommend using Visual Studio Code for editing PowerShell scripts.
I wanted to demonstrate that if we stay committed to fundamental tools and products like Azure CLI, PowerShell, GitHub and Azure DevOps, then a fairly complex process like CI/CD can be implemented with relative ease. I often hear developers stressing how important it is to use a third-party cloud management product like Terraform or Pulumi. If you are on Azure and have no intention of a multi-cloud solution, then the CLI and PowerShell coupled with an automation agent like Azure DevOps or Jenkins are all you need for your "Infrastructure as Code" solution.
For the purpose of demonstrating a working CI/CD solution, I have developed a very simple application which comprises the following:
- One REST endpoint implemented in a .NET Core Web API project. This endpoint generates dummy weather forecast records
- Static HTML served out of a storage account (with the static website option enabled). This invokes the REST endpoint and renders the forecast in the web browser
In a more practical scenario you would have other assets like a central database, a Redis cache, a message bus and most probably multiple microservices hosted in their respective web apps.
- Github - Source code repository
- Azure Devops - Build (CI) and release(CD) pipelines
- Azure - The cloud platform where the infrastructure is deployed
- Azure App Service - Azure's managed (PaaS) compute service for hosting the REST endpoint implemented as a .NET Core Web API
- Azure storage account - Azure storage account configured for serving static HTML content over HTTP
A classic CI/CD workflow with deployments to DEV-UAT-PROD stages would be as follows. The workflow could be even simpler or more complex. Examples of more complex CI/CD scenarios:
- Carry out integration tests after deployment to DEV, where the REST endpoints are tested
- Deploy to a QA environment for carrying out load tests
- Carry out automated UI tests immediately after deploying to DEV
I have followed a Monorepo approach for the toy application, mainly for simplicity. If you are following a Monorepo, then certain guidelines must be followed in the YAML file so that the CI/CD automation environment (Azure DevOps) can respond correctly to code commits in the source repository (GitHub). This is discussed further down.
|
+---BackEnd
| +---build
| | build.yml
| |
| +---infrastructure
| | common.ps1
| | createwebapp.ps1
| | deploy.ps1
| |
| \---src
|
|
+---FrontEnd
| +---build
| | build.yml
| |
| +---infrastructure
| | deploy.ps1
| |
| \---src
|
You will notice that the repo is structured into two top-level folders, BackEnd and FrontEnd, which contain a .NET Core Web API project and static HTML assets respectively.
Both folders have their respective build.yml and IaC PowerShell scripts.
I have separated the deployment into six resource groups: three for the BackEnd and three for the FrontEnd, one per environment. Such a structure simplifies management and cost monitoring.
The CI pipeline is defined by an ordered list of tasks in a YAML file. The most common tasks performed by the CI stage are:
- code checkout
- code compilation
- unit tests
- code coverage analysis
- quality gates assessment (e.g. Sonar Cloud integration)
Example of a YAML snippet which executes the .NET unit tests in the BackEnd project:
- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*[Tt]est*/*.csproj'
    arguments: '--configuration $(BuildConfiguration)'
I would refrain from deploying assets to the cloud from the CI pipeline, even though YAML allows you to specify Azure CLI/PowerShell tasks. Deployment should be the responsibility of the CD pipeline.
The CD stage is responsible for collecting the build output from the CI stage and then executing the IaC scripts. The CD is further split into environment-specific stages. It could get more complex, e.g. kick off a load test after deployment to the DEV environment is complete.
The steps are logically identical. They all point to the same PowerShell scripts. So what is the difference?
- The variables differ, e.g. the variable environment
- The service connection used by Azure DevOps to connect to the Azure cloud may differ, e.g. a UAT subscription for the UAT environment and a PROD subscription for the production environment
- Higher compute resources may be allocated depending on the environment
The following snippet from BackEnd\infrastructure\common.ps1 demonstrates how we can use the environment variable to name our assets:
$environment = $env:ENVIRONMENT
if ([string]::IsNullOrWhiteSpace($environment)) {
    # -ErrorAction Stop halts the script instead of continuing with empty asset names
    Write-Error -Message "The variable 'environment' was empty" -ErrorAction Stop
}
$ResourceGroup = "rg-$environment-demo-webapp-with-cicd"
$Location = "uksouth"
$PlanName = "WebAppPlanName"
$WebAppName = "MyDemoWebApi123-$environment"
$StaticSiteStorageAccount = "saustorageaccount001$environment"
$StaticSiteResourceGroup = "rg-demo-staticwebsite-with-cicd"
Do not compile code or run unit tests here; let the CI pipeline handle that and be responsible for producing the drops.
- Azure DevOps will expect a YML file in your repo
- This file must be in the master branch if you want to test out the entire CI/CD flow
- Therefore you must get a working YML committed into master before you can finish the rest of the CI/CD pipeline
- Remember to have trigger and pr elements if you are following a monorepo approach
If you are following a Monorepo approach, then Azure DevOps faces a challenge: whenever a file is committed to the GitHub repo, how would Azure DevOps know which CI pipeline to execute?
The YML specification addresses this problem with the trigger and pr elements. Every YML file should have something similar at the very beginning. The trigger and pr settings work in tandem to guarantee execution of the correct YML file:
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - '/DemoWebAppWithCiCd/BackEnd/*'

pr:
  paths:
    include:
      - '/DemoWebAppWithCiCd/BackEnd/*'
The trigger element places a path filter on every commit to the master branch. If any of the committed files match the specified pattern and the branch is master, then the CI is kicked off.
The pr element places a path filter on every commit to a feature branch which is under an active pull request. If any of the committed files in the feature branch match the specified pattern and the branch is under an active PR, then the CI is kicked off.
Every successful pull request ends with a merge of the feature branch into master. The merge operation kicks off the CI once more, but this time on the master branch.
The CD stage requires configuration in several places.
The variable environment (it could be any name) influences the PowerShell script. The PowerShell script uses this environment variable to control the names of the assets in the cloud.
Example: in the following PowerShell snippet, the Azure resource group containing our Azure web app is named as per the environment. Notice that the name of the web app is also derived from the environment variable.
$environment = $env:ENVIRONMENT
if ([string]::IsNullOrWhiteSpace($environment)) {
    # -ErrorAction Stop halts the script instead of continuing with empty asset names
    Write-Error -Message "The variable 'environment' was empty" -ErrorAction Stop
}
$ResourceGroup = "rg-$environment-demo-webapp-with-cicd"
$Location = "uksouth"
$PlanName = "WebAppPlanName"
$WebAppName = "MyDemoWebApi123-$environment"
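As a sketch of how such a variable can be supplied, a multi-stage YAML pipeline could set ENVIRONMENT per stage before invoking the script. The stage, service connection and path names below are illustrative assumptions, not taken from the repo:

stages:
  - stage: DEV
    variables:
      ENVIRONMENT: dev          # picked up as $env:ENVIRONMENT by the PowerShell script
    jobs:
      - job: Deploy
        steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: 'my-azure-service-connection'   # assumed connection name
              scriptType: pscore
              scriptLocation: scriptPath
              scriptPath: 'BackEnd/infrastructure/deploy.ps1'

The same pattern repeats for the UAT and PROD stages with their own variable values (and, if applicable, their own service connections).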
A one-time connection trust between Azure DevOps and the Azure cloud needs to be established. This is done via the Service connections panel of Azure DevOps.
You will also need to register a new application in Azure Active Directory.
The service connection created in the previous step is now available for use in any of the stages. Example: in the case of an Azure CLI task or an Azure PowerShell task, this can be specified in the drop-down labelled Azure Resource Manager Connection. Pay attention to the check boxes; this is necessary if we want to run PowerShell scripts with Azure CLI and Azure PowerShell commands.
For full automation, the CD should be configured to "listen" to drops from the CI pipeline.
In this section we will examine the pull request workflow in detail. To drive this example, we will make a code commit to our toy BackEnd application.
git branch feature/do-logging
git checkout feature/do-logging
For this exercise, I have simply added one line of logging to the API endpoint.
Notice that the BackEnd CI has kicked off automatically. The CI is running on the feature branch.
When the CI completes, Azure Devops kicks off the CD stage.
Notice that Github is reporting that BackEnd CI and BackEnd CD are mandatory checks.
Note that Azure Devops has completed the CD stage on the feature branch.
GitHub has merged the pull request. Azure DevOps will automatically kick off the CI once more, this time on the master branch.
CI on master is complete. DEV stage of CD is now running
DEV stage is complete. UAT stage of CD is now running
UAT stage is complete. PROD stage is waiting for approval
Click on the Approve button
PROD stage of CD is now complete
How do we know that our BackEnd assets have actually been updated on Azure cloud? The Activity Log of the App Service displays the following:
This is absolutely important: run all the infrastructure scripts from your local workstation before you expect CI/CD to work smoothly. You want a system where there is quick feedback to the developer. CI/CD is an automation engine and should not be used as a debugging environment.
You will need to do an az login. This performs a one-time interactive login into Azure. All subsequent az CLI calls will use the saved credentials to interact with Azure.
https://docs.microsoft.com/en-us/cli/azure/authenticate-azure-cli#sign-in-interactively
You can verify by opening another PowerShell console and typing az account show.
You will need to run Connect-AzAccount from a PowerShell Core shell (and Set-AzContext to select the subscription). This performs a one-time interactive login into Azure. All subsequent Azure PowerShell cmdlet invocations will use the saved credentials to interact with Azure.
https://docs.microsoft.com/en-us/powershell/module/az.accounts/set-azcontext?view=azps-7.2.0
You can verify by opening another PowerShell Core shell and running Get-AzContext.
Attention! Azure CLI and Azure PowerShell require their own authentication, hence you need to do both of the above.
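As a quick reference, a sketch of both one-time logins from a PowerShell Core shell (both are interactive and open a browser prompt):

# Azure CLI login - used by all subsequent az commands
az login
az account show          # verify the active subscription

# Az PowerShell login - used by all subsequent Az.* cmdlets
Connect-AzAccount
Get-AzContext            # verify the active context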
https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd
The back end code is managed by a single .sln file: https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd/BackEnd/src/Demo.sln
- Execute the following PS script from a PowerShell Core shell.
- The script will create the resource group, app service plan and app service. https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd/BackEnd/infrastructure/createwebapp.ps1
- Build the solution
- Zip the binaries
- Attention! The ZIP file should contain the assemblies at the very top level
- Use the script https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd/BackEnd/infrastructure/deploy.ps1
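The build-zip-deploy steps above could be sketched as follows. The paths are illustrative, $ResourceGroup and $WebAppName are assumed to come from common.ps1, and an az login must already have been done:

# Build and publish the solution
dotnet publish .\src\Demo.sln --configuration Release --output .\publish

# Zip the binaries - the assemblies must sit at the very top level of the ZIP
Compress-Archive -Path .\publish\* -DestinationPath .\drop.zip -Force

# Push the ZIP to the App Service
az webapp deploy --resource-group $ResourceGroup --name $WebAppName --src-path .\drop.zip --type zip

Note that Compress-Archive with .\publish\* (rather than .\publish) is what keeps the assemblies at the top level of the ZIP.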
The FrontEnd comprises simple HTML and JS files. Hence there is no build step required, unlike a React or Vue application.
- Execute the following PS script from a PowerShell Core shell.
- The script will upload the static content to the Azure blob container $web
https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd/FrontEnd/infrastructure/deploy.ps1
You will need to create two CI pipelines in your Azure DevOps account. These should use the following YML files:
- https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd/FrontEnd/build/build.yml
- https://github.com/sdg002/AnyDotnetStuff/tree/master/DemoWebAppWithCiCd/BackEnd/build/build.yml
You will need to create two CD releases: one wired up to the BackEnd CI and another wired up to the FrontEnd CI.
To see a full pull request workflow in action, branch protection must be applied to the master branch.
Refer to the PowerShell scripts inside the folders \BackEnd\infrastructure and \FrontEnd\infrastructure for the complete implementation.
New-AzResourceGroup -Name $ResourceGroup -Location $Location -Force
function CreatePlan() {
    Write-Host "Creating plan $PlanName"
    az appservice plan create --name $PlanName --resource-group $ResourceGroup --sku $PlanSKu --number-of-workers $NumOfWorkers --subscription $ctx.Subscription.Id
}

function CreateWebApp() {
    Write-Host "Creating Web App $WebAppName"
    az webapp create --name $WebAppName --plan $PlanName --resource-group $ResourceGroup --subscription $ctx.Subscription.Id
}
Write-Host "Creating storage account $StaticSiteStorageAccount"
az storage account create --name $StaticSiteStorageAccount --resource-group $ResourceGroup --location $Location --sku Standard_LRS --subscription $ctx.Subscription.Id
Write-Host "Uploading files from $Sourcefolder"
az storage blob upload-batch --account-name $StaticSiteStorageAccount --source $Sourcefolder -d '$web'
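Note that the $web container only exists once the static website option has been enabled on the storage account. The repo's script presumably does this; as a sketch, the CLI call would look something like the following (the index and error document names are assumptions):

az storage blob service-properties update --account-name $StaticSiteStorageAccount --static-website --index-document index.html --404-document error.html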
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops
https://docs.microsoft.com/en-us/cli/azure/webapp?view=azure-cli-latest#az-webapp-deploy
https://docs.microsoft.com/en-us/azure/devops-project/azure-devops-project-github
dotnet publish --configuration Release --output c:\truetemp\someoutput\ demo.sln
Specifying the file version helps immensely in post-deployment application support.
dotnet publish --output c:\truetemp\someoutput\ /p:FileVersion=1.2.3.4 demo.sln
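To confirm the stamp took effect, you can read the file version back from a published assembly. The assembly name below is an assumption for illustration:

# Read the FileVersion stamped during publish (assembly name assumed)
(Get-Item c:\truetemp\someoutput\demo.dll).VersionInfo.FileVersion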
https://en.wikipedia.org/wiki/Monorepo
The PowerShell script BackEnd\infrastructure\deploy.ps1 creates the App Service Plan, which then hosts the App Service.
The App Service Plan controls the compute resources allocated for the web application.
You would like a cheaper resource for DEV, but PROD and UAT need more realistic compute resources.
The --sku command line parameter governs the compute capabilities of the plan:
az appservice plan create --sku "FREE"
We could easily tweak deploy.ps1 such that the --sku parameter is governed by the $env:environment variable value.
Example:
$sku = "FREE"
if ($env:environment -eq "prod") {
    $sku = "P1v2"
}
Refer to the documentation for a full listing of available plans and their pricing: https://azure.microsoft.com/en-us/pricing/details/app-service/windows/
If you continue with the Monorepo approach, you can create a new folder at the same level as FrontEnd and BackEnd. See AnotherService1 and AnotherService2 below.
|
+---BackEnd
| +---build
| | build.yml
| |
| +---infrastructure
| | common.ps1
| | createwebapp.ps1
| | deploy.ps1
| |
| \---src
|
|
+---FrontEnd
| +---build
| | build.yml
| |
| +---infrastructure
| | deploy.ps1
| |
| \---src
|
+---AnotherService1
| +---build
| | build.yml
| |
| +---infrastructure
| | deploy.ps1
| |
| \---src
|
+---AnotherService2
| +---build
| | build.yml
| |
| +---infrastructure
| | deploy.ps1
| |
| \---src
|
|
You would need a new CI pipeline and a new CD pipeline for each of the new back end services.
How would the repository design change if there were central resources like MSSQL, Redis and KeyVault in the mix?
As the application matures you will need central infrastructure pieces, for example SQL Server/Postgres/Cosmos, Redis Cache, Application Insights, KeyVault.
- You could follow the Monorepo approach and create a new folder as described above
- You could create a new repository
Regardless of which approach you choose, you will need a CI YAML and a CD stage.