Azure DevOps Explained
Sjoukje Zaal
Stefano Demiliani
Amit Malik
BIRMINGHAM—MUMBAI
Azure DevOps Explained
Copyright © 2020 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system,
or transmitted in any form or by any means, without the prior written permission of the
publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the
information presented. However, the information contained in this book is sold without
warranty, either express or implied. Neither the author(s), nor Packt Publishing or its
dealers and distributors, will be held liable for any damages caused or alleged to have been
caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the
companies and products mentioned in this book by the appropriate use of capitals.
However, Packt Publishing cannot guarantee the accuracy of this information.
ISBN 978-1-80056-351-3
www.packt.com
Packt.com
Subscribe to our online digital library for full access to over 7,000 books and videos, as
well as industry leading tools to help you plan your personal development and advance
your career. For more information, please visit our website.
Why subscribe?
• Spend less time learning and more time coding with practical eBooks and Videos
from over 4,000 industry professionals
• Improve your learning with Skill Plans built especially for you
• Get a free eBook or video every month
• Fully searchable for easy access to vital information
• Copy and paste, print, and bookmark content
Did you know that Packt offers eBook versions of every book published, with PDF and
ePub files available? You can upgrade to the eBook version at packt.com and as a print
book customer, you are entitled to a discount on the eBook copy. Get in touch with us at
[email protected] for more details.
At www.packt.com, you can also read a collection of free technical articles, sign up for
a range of free newsletters, and receive exclusive discounts and offers on Packt books and
eBooks.
Contributors
About the authors
Sjoukje Zaal is a CTO, Microsoft Regional Director, and Microsoft Azure MVP with over
20 years of experience in architecture-, development-, consultancy-, and design-related
roles. She works at Capgemini, a global leader in consultancy, technology services, and
digital transformation.
She loves to share her knowledge and is active in the Microsoft community as a
co-founder of the user groups Tech Daily Chronicle, Global XR Community, and the
Mixed Reality User Group. She is also a board member of Azure Thursdays and Global
Azure. Sjoukje is an international speaker and is involved in organizing many events. She
has written several books and writes blogs.
Stefano Demiliani is a Microsoft MVP in business applications, an MCT, a Microsoft
Certified DevOps Engineer and Azure Architect, and a long-time expert on Microsoft
technologies. He works as a CTO for EID NAVLAB and his main activities are
architecting solutions with Azure and Dynamics 365 ERPs. He’s the author of many IT
books for Packt and a speaker at international conferences about Azure and Dynamics
365. You can reach him on Twitter or on LinkedIn or via his personal website.
I dedicate this book to my little daughter, Sara. In the past few months, I
have spent so much time away from you; I hope you can now appreciate the
work done and understand me.
Amit Malik is an IT enthusiast and technology evangelist focused on the cloud and
emerging technologies. He is currently employed by Spektra Systems as the director
of technology, where he helps Microsoft partners grow their cloud businesses by using
effective tools and strategies. He specializes in the cloud, DevOps, software-defined
infrastructure, application modernization, data platforms, and emerging technologies
around AI. Amit holds various industry-admired certifications from all major OEMs
in the cloud and data space, including Azure Solutions Architect Expert. He is also a
Microsoft Certified Trainer (MCT). Amit is an active community member of various
technology groups and is a regular speaker at industry conferences and events.
About the reviewers
Vassili Altynikov is the founder and a principal DevOps architect at Blend Master
Software.
With nearly two decades of software development, application architecture, and technical
consulting experience, he is helping organizations establish and improve their DevOps
practices to deliver better-quality software faster.
I would like to thank my parents and brothers for their support and
motivation.
Table of Contents

Chapter 2, Managing Projects with Azure DevOps Boards
• Technical requirements
• Understanding processes and process templates
• Creating an organization
• Creating a project
• Creating and managing project activities
• Work Items
• Backlogs
• Boards
• Sprints
• Queries
• Summary
• Further reading

Chapter 4, Understanding Azure DevOps Pipelines
• Technical requirements
• Implementing a CI/CD process
• Overview of Azure Pipelines
• Understanding build agents

Chapter 5, Running Quality Tests in a Build Pipeline
• Technical requirements
• Benefits of automatic testing
• Introduction to unit testing
• Running unit tests in a build pipeline
• Downloading the source code
• Creating the pipeline
• Assigning test results to work items
• Introduction to Feature Flags
• Using Feature Flags to test in production
• Creating a new .NET Core application

Chapter 6, Hosting Your Own Azure Pipeline Agent
• Technical requirements
• Azure pipeline agent overview
• Understanding the types of agents in Azure Pipelines

Chapter 8, Deploying Applications with Azure DevOps
• Technical requirements
• An overview of release pipelines
• Creating a release pipeline with Azure DevOps
• Creating the Azure DevOps release
• Configuring the release pipeline triggers for continuous deployment
• Creating a multi-stage release pipeline
• Using approvals and gates for managing deployments
• Creating approvals
• Using gates to check conditions
• Using deployment groups
• YAML release pipelines with Azure DevOps
• Summary

Chapter 10, Using Test Plans with Azure DevOps
• Technical requirements
• Introduction to Azure Test Plans
• Exploratory testing

Chapter 11, Real-World CI/CD Scenarios with Azure DevOps
• Technical requirements
• Setting up a CI/CD pipeline for .NET-based applications
• Introduction to the sample application
• Preparing the pre-requisite Azure infrastructure
• Setting up an Azure DevOps project
• Setting up a CI/CD pipeline for a container-based application
• Introduction to the sample app
• Setting up the required infrastructure
• Setting up Azure Repos for the voting application
• Setting up the CI pipeline
• Setting up the CD pipeline
• Simulating an end-to-end CI/CD experience
• Azure Architecture Center for DevOps
• Summary

Preface
Chapter 4, Understanding Azure DevOps Pipelines, shows you how to create a build
pipeline for your code with Azure Pipelines and how best to handle continuous
integration.
Chapter 5, Running Quality Tests in a Build Pipeline, explains how to create and execute
quality tests for your code in a build pipeline.
Chapter 6, Hosting Your Own Azure Pipeline Agent, shows you how to create your own
build agents and use them in a build pipeline.
Chapter 7, Using Artifacts with Azure DevOps, explains how to use artifacts (package
feeds) to create and share packages and add fully integrated package management to your
continuous integration/continuous delivery pipelines.
Chapter 8, Deploying Applications with Azure DevOps, explains how to use release
pipelines to handle the continuous deployment of your code and how to use stages and
approvals before releasing code into a production environment.
Chapter 9, Integrating Azure DevOps with GitHub, shows you how to integrate Azure
DevOps tools with GitHub and use both applications for your continuous integration/
continuous delivery processes.
Chapter 10, Using Test Plans with Azure DevOps, shows you how to manage your project's
testing life cycle with test plans in Azure DevOps.
Chapter 11, Real-World CI/CD Scenarios with Azure DevOps, shows you some real-world
scenarios of continuous integration/continuous delivery processes being handled with
Azure DevOps.
If you are using the digital version of this book, we advise you to type the code yourself
or access the code via the GitHub repository (link available in the next section). Doing
so will help you to avoid any potential errors related to the copying and pasting of
code.
Conventions used
There are a number of text conventions used throughout this book.
Code in text: Indicates code words in text, database table names, folder names,
filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles.
Here is an example: 'You can download the file named node-v6.12.3-x64.msi and
install it using the interactive installer.'
A block of code is set as follows:
using System;
using PartsUnlimited.Models;

namespace AzureArtifacts
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
        }
    }
}
When we wish to draw your attention to a particular part of a code block, the relevant
lines or items are set in bold:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Install-Module AzureRM -AllowClobber
docker run \
-e VSTS_ACCOUNT=<name> \
-e VSTS_TOKEN=<pat> \
-it mcr.microsoft.com/azure-pipelines/vsts-agent
Bold: Indicates a new term, an important word, or words that you see onscreen. For
example, words in menus or dialog boxes appear in the text like this. Here is an example:
'Log in with your Microsoft account and in the left menu, select Artifacts.'
Get in touch
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, mention the book
title in the subject of your message and email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes
do happen. If you have found a mistake in this book, we would be grateful if you would
report this to us. Please visit www.packtpub.com/support/errata, selecting your
book, clicking on the Errata Submission Form link, and entering the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet,
we would be grateful if you would provide us with the location address or website name.
Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise
in and you are interested in either writing or contributing to a book, please visit
authors.packtpub.com.
Reviews
Please leave a review. Once you have read and used this book, why not leave a review on
the site that you purchased it from? Potential readers can then see and use your unbiased
opinion to make purchase decisions, we at Packt can understand what you think about
our products, and our authors can see your feedback on their book. Thank you!
For more information about Packt, please visit packt.com.
Section 1:
DevOps Principles
and Azure DevOps
Project Management
In this section, DevOps principles, Azure DevOps key concepts, and project management
will be covered.
This section contains the following chapters:
• Introducing DevOps
• Understanding DevOps principles
• Introducing Azure DevOps key concepts
• Discovering Azure DevOps services
• Introducing the scenarios
Introducing DevOps
For a long time, development and operations were divided into isolated teams with
separate concerns and responsibilities. Developers wrote the code and made sure that
it worked on their development systems, while the system administrators were responsible
for the actual deployment and integration in the organization's IT infrastructure.
As there was limited communication between development and operations, both teams
worked mostly separately on their projects. However, they depended heavily on each
other because knowledge was not shared across the different teams.
This fitted in nicely with the Waterfall Methodology that was used for most projects.
The Waterfall Methodology is based on the Software Development Life Cycle (SDLC),
which has clearly defined processes for creating software. The Waterfall Methodology is a
breakdown of project deliverables into linear sequential phases, where each phase depends
on the deliverables of the previous phase. This sequence of events may look as follows:
• Early in the development life cycle, customers and developers agree on what will be
delivered, with minimal to no changes during the development of the project.
• For integration with external systems, it is common for multiple components of the
software to be designed in parallel. In these cases, it is desirable to have the design
document complete at an early stage in the development life cycle.
• Various team members are involved in other projects simultaneously as well. For
example, business analysts can gather the requirements and create the design while
developers are working on another project.
• Where it is not possible to break down the requirements phase, customers are not
fully engaged in smaller deliverables.
However, customers may not exactly know what their requirements are before they
see working software. This can result in changing the requirements, thus leading to
redesign, reimplementation, and reverification. This can dramatically increase the
costs of the project.
Due to this, Agile (dating back to the Agile Manifesto of 2001) and DevOps (a term
coined around 2009) have slowly taken over the world of software development,
replacing the Waterfall Methodology for most projects. DevOps is a natural extension of Agile and continuous
delivery approaches, and it stands for development and operations. It is a practice that
merges development, IT operations, and quality assurance into one single, continuous
set of processes.
The following diagram illustrates the different parts that DevOps consists of:
• By working directly with the project team throughout the whole project,
the customer will experience a stronger sense of ownership.
• The customer has opportunities to see the work being delivered in an early stage
of the project and can make appropriate decisions and changes to it.
• Development is more business and value focused. This is a result of working closer
with the customer and having a better understanding of their needs.
• An Agile way of working enables us to quickly create a base version of the product,
which can be built upon in the next iterations.
Now that we have covered a very brief introduction to DevOps, we are going to look at the
different DevOps principles.
In this section, we have covered the six principles that are very important when adopting
or migrating to a DevOps way of working. In the next few sections, we are going to look
at what Azure DevOps has to offer as a tool that supports teams so that they can work
in a DevOps-oriented manner.
Plan
During the planning phase, teams can use Kanban boards and backlogs to define, track,
and lay out the work that needs to be done in Azure Boards. They can also use GitHub
for this. In GitHub, an issue can be created by suggesting a new idea or stating that a bug
should be tracked. These issues can be organized and assigned to teams.
Develop
The development phase is supported by Visual Studio Code and Visual Studio. Visual
Studio Code is a cross-platform editor, while Visual Studio is a Windows- and Mac-only
IDE. You can use Azure DevOps for automated testing and use Azure Pipelines to create
automatic builds for building the source code. Code can be shared across teams with
Azure DevOps or GitHub.
Deliver
The deliver phase is about deploying your applications and services to target
environments. You can use Azure Pipelines to deploy code automatically to any Azure
service or on-premises environments. You can use Azure Resource Manager templates or
Terraform to spin up environments for your applications or infrastructure components.
You can also integrate Jenkins and Spinnaker inside your Azure DevOps Pipelines.
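As a hedged sketch of what such an automated deployment can look like in a pipeline definition, the following Azure Pipelines YAML stage deploys a build artifact to an Azure web app. The service connection name, app name, and package path are assumptions for illustration, not values from the book:

```yaml
# Illustrative deploy stage; service connection, app name, and package path are invented.
stages:
  - stage: Deploy
    jobs:
      - job: DeployWebApp
        pool:
          vmImage: 'ubuntu-latest'          # Microsoft-hosted agent
        steps:
          - task: AzureWebApp@1             # built-in task for Azure App Service
            inputs:
              azureSubscription: 'my-service-connection'   # assumed service connection
              appName: 'my-web-app'                        # assumed target web app
              package: '$(Pipeline.Workspace)/drop/*.zip'  # assumed build artifact
```

In a real project, this stage would typically follow a build stage and could be gated by approvals, as described in Chapter 8.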
Operate
In this phase, you implement full-stack monitoring for monitoring your applications and
services. You can also manage your cloud environment with different automation tools,
such as Azure Automation, Chef, and more. Keeping your applications and services secure
is also part of this phase. Therefore, you can use features and services such as Azure Policy
and Azure Security Center.
To support the full life cycle of analyzing, designing, building, deploying, and maintaining
software and infrastructure products and services, Azure DevOps provides integrated
features that can be accessed through any web browser.
Azure DevOps offers a combination of solutions and tooling that can be used to create
unique and custom workflows throughout each of the application life cycle phases.
These solutions will be described in the upcoming sections.
Version control
A version control system, also known as a source control system, is an essential tool for
multi-developer projects. It allows developers to collaborate on the code and track changes.
The history of all the code files is also maintained in the version control system. This makes
it easy to go back to a different version of the code files in case of errors or bugs.
Azure DevOps supports two different types of source control: Git (distributed) and
Team Foundation Version Control (TFVC), which is centralized. With Git, each developer has a copy of the
source repository on their development machine. All branch and history information is
included inside the source repository. Each developer works directly with their copy of
the repository and all the changes are shared between the local and source repositories
as a separate step. Changes can be committed on the local filesystem, and version control
operations can be executed without a network connection. Branches can be created easily
on the dev machine and later, they can be merged, published, or disposed of by the developer
separately. With TFVC, developers have only one version of each file on their local dev
machines. All the others, as well as the historical data, are maintained only on the server.
The branches are created on the server as well.
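The offline workflow that distinguishes Git can be sketched with a few commands. Everything below runs on the local filesystem with no server involved; the repository location, branch name, and file are invented for illustration:

```shell
# A local-only sketch of the distributed Git workflow described above.
# The branch and file names are illustrative; no network connection is needed.
set -e
repo=$(mktemp -d)                         # throwaway local repository
cd "$repo"
git init -q
git config user.email "dev@example.com"   # local identity for the demo commits
git config user.name "Dev"

echo "v1" > app.txt
git add app.txt
git commit -q -m "Initial commit"         # history is recorded on the local filesystem

git checkout -q -b feature/edit-profile   # branch created on the dev machine
echo "v2" > app.txt
git commit -q -am "Edit profile feature"

git checkout -q -                         # back to the default branch
git merge -q feature/edit-profile         # merged locally; sharing with a remote
                                          # (for example, Azure Repos) is a separate step
```

Only when the developer later runs `git push` are these commits shared with the central repository, which is exactly the separation between local and remote operations described above.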
Infrastructure as Code
Teams can also manage the infrastructure in Azure DevOps. Infrastructure components
that are used in a project, such as networks, virtual machines, and load balancers, can
be managed using the same versioning features and capabilities that are used for the
source code.
Used together with continuous delivery, an Infrastructure as Code (IaC) model generates
the same environment every time it is deployed. Without IaC, teams need to configure
and maintain the settings of all the individual deployment environments manually, which
is a time-consuming and error-prone task. The most plausible outcome is that, over time,
each environment becomes a snowflake, which is a unique configuration that cannot be
reproduced automatically anymore. This inconsistency across environments will lead
to issues during the deployment phase.
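To illustrate the idea, the following hedged Azure Pipelines step deploys an ARM template that is kept under version control next to the application code. The service connection, subscription variable, resource group, and file path are all assumptions for illustration:

```yaml
# Illustrative IaC step; connection, subscription, group, and paths are invented.
steps:
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-service-connection'  # assumed connection
      subscriptionId: '$(subscriptionId)'                      # assumed pipeline variable
      resourceGroupName: 'rg-demo'                             # assumed resource group
      location: 'West Europe'
      templateLocation: 'Linked artifact'
      csmFile: 'infra/azuredeploy.json'    # the versioned template in the repository
      deploymentMode: 'Incremental'
```

Because the template is versioned and the deployment is automated, re-running this step produces the same environment every time, which is the property that prevents snowflake environments.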
Configuration Management
Configuration Management refers to all the items and artifacts that are relevant to the
project and the relationship between them. Those items are stored, retrieved, uniquely
identified, and modified. This includes items such as source code, files, and binaries. The
configuration management system is the one true source of configuration items.
Using Azure DevOps, resource configuration across the entire system can be managed by
teams to roll out configuration updates, enforce desired states, and automatically resolve
unexpected changes and issues. Azure offers multiple DevOps tools and capabilities for
configuration management, such as Chef, Puppet, Ansible, and Azure Automation.
Monitoring
You can use Azure Monitor to practice full-stack continuous monitoring. The health
of your infrastructure and applications can be integrated into existing dashboards
in Grafana, Kibana, and the Azure portal with Azure Monitor. You can also monitor
the availability, performance, and usage of your applications, whether they are hosted
on-premises or in Azure. Most popular languages and frameworks, such as .NET, Java,
and Node.js, are supported by Azure Monitor, and they are integrated with the DevOps
processes and tools in Azure DevOps.
Azure Boards
Azure Boards can be used to plan, track, and discuss work across teams using the Agile
planning tools that are available. Using Azure Boards, teams can manage their software
projects. It also offers a unique set of capabilities, including native support for Scrum and
Kanban. You can also create customizable dashboards, and it offers integrated reporting
and integration with Microsoft Teams and Slack.
You can create and track user stories, backlog items, tasks, features, and bugs that are
associated with the project using Azure Boards.
The following screenshot shows an example of an Azure Board:
Azure Repos
Azure Repos provides support for private Git repository hosting and for Team
Foundation Version Control (TFVC). It offers a set of version control tools that can be
used to manage the source code of every development project, large or small. When you
edit the code, you ask the source control system to create a snapshot of the files. This
snapshot is saved permanently so that it can be recalled later if needed.
Today, Git is the most used version control system among developers. Azure Repos
offers standard Git so that developers can use the tools and clients of their choice, such
as Git for Windows, Mac, third-party Git services, and tools such as Visual Studio and
Visual Studio Code.
The following screenshot shows an example of the commits you can push to a repo
in Azure:
Azure Pipelines
You can use Azure Pipelines to automatically build, test, and deploy code to make it
available to other users and deploy it to different targets, such as a development, test,
acceptance, and production (DTAP) environment. It combines CI/CD to automatically
build and deploy your code.
Before you can use Azure Pipelines, you should put your code in a version control system,
such as Azure Repos. Azure Pipelines can integrate with a number of version control
systems, such as Azure Repos (Git and TFVC), GitHub, GitHub Enterprise, Subversion,
and Bitbucket Cloud. You can also use Pipelines with most application types, such as Java,
JavaScript, Node.js, Python, .NET, C++, Go, PHP, and Xcode. Applications can be
deployed to multiple target environments, including container registries, virtual machines,
Azure services, or any on-premises or cloud target.
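As a minimal sketch, a CI pipeline for a .NET application can be defined in a YAML file stored with the code. The trigger branch, agent image, and project globs below are assumptions for illustration:

```yaml
# azure-pipelines.yml — minimal CI sketch; branch, image, and paths are illustrative.
trigger:
  - main                        # build on every push to main

pool:
  vmImage: 'ubuntu-latest'      # Microsoft-hosted build agent

steps:
  - task: DotNetCoreCLI@2       # build every project in the repository
    inputs:
      command: 'build'
      projects: '**/*.csproj'

  - task: DotNetCoreCLI@2       # run the test projects
    inputs:
      command: 'test'
      projects: '**/*Tests.csproj'
```

Committing a change to the watched branch triggers the pipeline, which is the continuous integration half of the CI/CD combination described above.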
Azure Artifacts
With Azure Artifacts, you can create and share NuGet, npm, Python, and Maven
packages from private and public sources with teams in Azure DevOps. These packages
can be used in source code and can be made available to the CI/CD pipelines. With Azure
Artifacts, you can create multiple feeds that you can use to organize and control access
to the packages.
Extension Marketplace
You can download extensions for Azure DevOps from the Visual Studio Marketplace.
These extensions are simple add-ons that can be used to customize and extend your team's
experience with Azure DevOps. They can help by extending the planning and tracking of
work items, code testing and tracking, pipeline build and release flows, and collaboration
among team members. The extensions are created by Microsoft and the community.
The following screenshot shows some of the extensions that can be downloaded from
the marketplace:
• ARM Outputs: This extension reads the output values of ARM deployments
and sets them as Azure Pipelines variables. You can download and install
the extension from https://marketplace.visualstudio.com/items?itemName=keesschollaart.arm-outputs.
• Team Project Health: This extension enables users to visualize the overall health
of builds, thereby delivering a visual cue similar to Codify Build Light. You can
download the extension from https://marketplace.visualstudio.com/items?itemName=ms-devlabs.TeamProjectHealth.
Once the extensions have been installed inside your Azure DevOps organization, you can
generate the sample project:
Tip
For more information about the Tailwind Traders sample project, refer
to the following site: https://github.com/Microsoft/TailwindTraders. For more
information about the Parts Unlimited example, refer to
https://microsoft.github.io/PartsUnlimited/.
Summary
In this chapter, we covered some of the basics of DevOps and covered the six different
DevOps principles. Then, we covered the key concepts of Azure DevOps and the different
solutions that Azure DevOps has to offer to support teams throughout each of the
application life cycle phases. After that, we looked at the different features that Azure
DevOps has to offer, and we introduced and created the two scenarios that we will use
in the upcoming chapters of this book.
In the next chapter, we are going to cover how to manage projects with Azure Boards.
Further reading
Check out the following links for more information about the topics that were covered
in this chapter:
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization. The
organization that we'll be using in this chapter was created in Chapter 1, Azure DevOps
Overview.
• Basic: This is the simplest model that teams can choose. It uses Epics, Issues, and
Tasks to track the work. These artifacts are created when you create a new basic
project, as follows:
• Agile: Choose Agile when your team uses the Agile planning process. You can track
different types of work, such as Features, User Stories, and Tasks. These artifacts
are created when you create a new project using the Agile process. Development
and test activities are tracked separately here, and Agile uses the Kanban board to
track User Stories and bugs. You can also track them on the task board:
• Scrum: When your team is practicing the Scrum methodology, choose the Scrum
process. This will create Product backlog items (PBIs), Tasks, Bugs, and more
artifacts for the team. You can track artifacts using the Kanban board, or break PBIs
and bugs down into tasks on the task board. The Scrum process is shown in the
following diagram:
Creating an organization
An organization in Azure DevOps is used to connect groups of related projects. You can
plan and track your work here and collaborate with others when developing applications.
From the organization level, you can also integrate with other services, set permissions
accordingly, and set up continuous integration and deployment.
In the previous chapter, we introduced the scenarios that we will be using throughout
this book. Tailwind Traders is an example retail company that is showcasing the future of
intelligent application experiences. By generating a project using the DevOps generator,
the organization and the project were automatically created.
However, there are cases where you might need to create an organization manually, such
as when you first start to use Azure DevOps in an organization, or when it is a logical fit
to create a separate organization based on permission requirements. So, we are going to
cover this step as well. To create an organization, perform the following steps:
With that, the organization has been created. In the next section, we are going to learn
how to add a new project to this organization.
Creating a project
After creating a new organization, Azure DevOps automatically gives you the ability to
create a new project. Perform the following steps:
1. The wizard for creating a project is automatically displayed once you've created a
new organization. There, you can specify the project's name. In my case, I named it
LearnDevOps.
2. You can also choose if you want your project to be Public, so that everyone on the
internet can view it, or Private. If you choose the latter, you need to give access to
users manually. We will choose Private for this demo:
We have now covered how to create a new organization and add a project to it. For the
remaining sections of this chapter, we are going to leave this organization and project
as-is, and we are going to use the Tailwind Traders project that we imported in Chapter 1,
Azure DevOps Overview.
In the next section, we will cover how to create and manage different project activities.
Work Items
Teams use Work Item artifacts to track all of their work. Here, you will describe
what is needed for the software development project. You can track the features and the
requirements, the code defects or bugs, and all other items. The Work Items that are
available to you are based on the process that was chosen when the project was created.
Work Items have three different states: new, active, and closed. During the development
process, the team can update the items accordingly so that everyone has a complete
picture of the work related to the project.
Now, let's create a new Work Item.
4. Next, from the left menu, select Boards and then Work items:
5. On the next screen, you will see an overview of all the Work Items that were
generated automatically when we created the Tailwind Traders project:
1. A new window will open where you can specify the values for the User Story. Add
the following:
a) Title: As a user, I want to edit my user profile.
b) Assigned: Here, you can assign the Work Item to a specific person.
c) Add tag: You can also add tags to this Work Item. These tags can be used for
searching later. I've added a tag called Profile Improvements.
d) State: Because this is a newly created item, the state is automatically set to
New.
e) Iteration: Here, you can specify which sprint you want to add this User Story
to. You can also do this later from the backlog. I've added it to iteration 2.
f) Description: As a user, I want to edit my user profile. This is a rich text editor
where you can also format the description to your needs.
g) Discussion: Here, you can add additional comments related to this Work
Item. You can link it to another Work Item using "#" followed by "the name of the
Work Item", link a particular pull request using "!" followed by the "name of the pull
request", or mention a person using "@" followed by the "name of the person".
h) Priority: You can prioritize your User Story here. The value is just a
number that indicates the relative importance of the Work Item; the actual order
of work can be decided from the board by dragging the User Story up and down.
i) Classification: You can also classify this item. The generator created two
different categories for the Tailwind Traders project. Here, you can select Business
or Architecture. In this case, the item is more business-related.
j) Development: Here, you can link the item to a specific branch, build, pull
request, and so on.
k) Story points: Using story points, you can estimate the amount of work
required to complete a User Story using any numeric unit of measurement. The
value in this field is based on the velocity of the team:
2. Related Work: You can also link the item to other Work Items or GitHub issues,
using link types such as Parent/Child, Tested By, Duplicate Of, and so on:
3. After filling in these fields, click the Save button at the top-right-hand side of the
screen:
Important Note
For more information on how to create the different Work Items, refer to the following website: https://docs.microsoft.com/en-us/azure/devops/boards/work-items/about-work-items?view=azure-devops&tabs=agile-process.
For more information about the different fields that are used in the Work Item forms, refer to this website: https://docs.microsoft.com/en-us/azure/devops/boards/work-items/guidance/work-item-field?view=azure-devops.
In the next section, we are going to look at backlogs and sprints in more detail.
Backlogs
The product backlog is a roadmap for what teams are planning to deliver. By adding
User Stories, requirements, or backlog items to it, you can get an overview of all the
functionality that needs to be developed for the project.
From the backlog, Work Items can be easily reordered, prioritized, and added to sprints.
Let's take a look at how the backlog works:
3. Here, you will see all the different User Stories for the project, including the one that
we created in the previous demo. From the top-right, you can select the different
types of Work Items that come with the project template:
5. From the backlog, you can also add Work Items to the different sprints. During
creation of the Work Item, we added this User Story to Sprint 2. From here, we can
drag this item to a different sprint if we want to:
6. You can also change the view to see more Work Items that are related to these User Stories. By clicking on the view options shown on the left-hand side of the screen, you can enable different views. Enable Parents, which displays epics and features:
7. By dragging the different types of Work Items, you can also easily create different
types of parent-child relationships. For instance, you can drag our newly created
User Story to the Membership signup feature and create a relationship between
them:
Boards
Another way to look at the different Work Items you have is by using boards. Each project
comes with a preconfigured Kanban board that can be used to manage and visualize the
flow of the work.
This board comes with different columns that represent different work stages. Here, you
can get a comprehensive overview of all the work that needs to be done and what the
current status of the Work Items is.
Let's look at Boards in Azure DevOps in more detail:
1. From the left menu, under Boards, select Boards. Here, you will see an overview of
the different Work Items that have been added to cards on the board, as shown in
the following screenshot:
There are also Work Items that are being Resolved, which means the development
part has finished but they still need to be tested. Items that have passed these tests
and meet the Definition of done are moved to the Closed column.
2. From here, you can also drag items to different columns, view the items in the
backlog, and make changes to them by clicking on the three (…) at the top-right of
the item, as follows:
Sprints
Depending on the project template chosen, sprints can have a different name. In our Tailwind Traders project, the Agile project template is being used, which changes the name to Iterations. However, Azure DevOps treats Iterations the same as Sprints.
Iterations, or Sprints, are used to divide the work into fixed periods of (usually) a few weeks. The length is based on the velocity that a team can handle; that is, the rate at which the team burns down User Stories.
1. From the left menu, under Boards, select Sprints. By default, the backlog view will
be displayed. Here, you will see an overview of the User Stories again, except this
time for the current sprint, as shown in the following screenshot:
2. By clicking on Taskboard from the top menu, you will see a different view of the
Work Items in the sprint, similar to what happens in Boards. This time, the items
that are in the current sprint are displayed at the backlog task level:
3. From here, you can drag items to different columns, create new Work Items if
needed, and filter through the different sprints:
Queries
You can filter Work Items in Azure DevOps based on the criteria that you provide. This way, you can easily get an overview of all the Work Items of a particular type, in a particular state, or with a particular label. This can be done within a project, but also across different projects.
To create different queries and search for Work Items, perform the following steps:
1. From the left menu, under Boards, select Queries. Then, from the top menu, select
+ New query:
3. Then, click on Run query. The result will display the Work Item that we created in
the first step of this section:
Important Note
This was a basic example of the search queries that you can create. For more in-depth information, you can refer to https://docs.microsoft.com/en-us/azure/devops/project/search/overview?view=azure-devops.
With that, we have covered the basics of how to run a query to filter Work Items. This
concludes this chapter.
Summary
In this chapter, we covered Azure Boards in more depth. We started by looking at the
different project templates that you can choose from based on the methodology that your
organization embraces. Based on that project template, different Work Items are created
that can be used for planning the project. These Work Items can be added to backlogs and
relationships can be created for a logical view of the project items. They can also be added
to sprints.
In the next chapter, we are going to focus on source code management in Azure DevOps.
Further reading
Check out the following links for more information about the topics that were covered in
this chapter:
In this section, Azure builds are covered as well as how to manage your source code in
Azure DevOps.
This section contains the following chapters:
By the end of this chapter, you will have learned about all the concepts you can use to
apply SCM techniques to your team using Azure DevOps.
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization and Visual
Studio or Visual Studio Code installed on your development machine.
Understanding SCM
Source control (or version control) is a software practice used to track and manage changes in source code. This is an extremely important practice because it lets you maintain a single source of truth for the code base across different developers and helps with collaboration on a single software project (where different developers work on the same code base).
SCM is an essential practice in any DevOps process. To adopt a source control policy, you
should do the following:
• Select a source control management system to adopt (for example, install Git on a
server or use a cloud-based SCM such as Azure DevOps Repos or GitHub)
• Store your code base in a repository managed by your source control
management system
• Clone the repository locally for development by taking the latest code version (pull)
stored in the central repository
• Commit and push your released code to the central repository
• Use different copies of the repository for developing in a parallel way (branches)
1. You create a repository for your project on your Git hosting system.
2. You copy (or clone) the repository to your local development machine.
3. You create a new file in your local repository and then you save the changes locally
(stage and commit).
4. You push the changes to the remote repository (push).
5. You pull the changes from the remote repository to the local one (to align your code
with the remote repository if other developers have made modifications).
6. You merge the changes with your local repository.
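The six steps above can be sketched with plain Git commands. In this minimal sketch, the Git hosting system is simulated with a local bare repository (an assumption made purely for illustration; with Azure DevOps Repos or GitHub, the clone URL would point at the remote service):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# 1. Create the repository on the hosting system
#    (simulated here with a local bare repository).
git init -q --bare remote.git

# 2. Clone the repository to your development machine.
git clone -q remote.git work
cd work
git config user.email dev@example.com
git config user.name Dev

# 3. Create a new file, then stage and commit the change locally.
echo "hello" > readme.txt
git add readme.txt
git commit -m "Add readme"

# 4. Push the change to the remote repository.
git push -q origin HEAD:main

# 5-6. Pull downloads and merges any commits other developers have
#      pushed since your last synchronization.
git pull -q origin main
```

After the push, the commit exists both locally and on the (simulated) remote, which is exactly the state the pull/merge steps keep synchronized.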
When using Git as an SCM system, you need to keep some key concepts in mind:
• Snapshots are the way Git keeps track of your code history. A snapshot essentially
records what all your files look like at a given point in time. You decide when to take
a snapshot and of what files.
• Commit is the act of creating a snapshot. In a project, you create different commits.
A commit contains three sets of information:
-- Details of how the files have changed from the previous version
-- A reference to the parent commit (the commit that came immediately before it)
-- A hash code name
• Repositories are collections of all the necessary files and their history. A repository
can be on a local machine or on a remote server.
• Cloning is the act of copying a repository from a remote server.
• Pulling is the process of downloading commits that don't exist on your machine
from a remote repository.
• Pushing is the process of adding your local changes to a remote repository.
• Branches are "versions" of your code base. All commits in Git live on a branch, and you can have different branches. The main branch in a project is known as the master.
As an example, these are some Git commands you can use to activate the previously
described SCM process:
6. Merge branch1 into the master branch and save it to the remote server:
git merge branch1
git push
Once you've mastered these commands, you'll be ready to start using Git. In the next section, we'll provide an overview of branches and the possible branching strategies you can use.
• GitHub Flow
• GitLab Flow
• Git Flow
GitHub Flow
GitHub Flow is one of the most widely used branching strategies and is quite simple
to adopt.
According to this workflow, you start from a master branch (which always contains
the deployable code). When you start developing a new feature, you create a new
branch and you commit regularly to this new branch. When the development work
has been completed, you create a pull request to merge the secondary branch with the
master branch:
GitLab Flow
GitLab Flow is another popular branching strategy that's widely used, especially when
you need to support multiple environments (such as production, staging, development,
and so on) in your SCM process. The following diagram represents this flow:
This is useful if you want to maintain a stable production release, work separately on new
features that can be moved to a testing environment (in order to be tested), and then
merge that environment into the production release when testing has been completed.
Git Flow
Git Flow is a workflow that's used when you have a scheduled release cycle. The following
diagram represents this flow:
Every time you add a new feature to your code base, you create a feature branch, starting
from the develop branch, and then you merge the feature branch into develop when the
implementation is finished. Here, you never merge into the master branch.
When you need to release a set of features, you create a release branch, starting from the
develop branch. Code in the release branch must be tested (maybe with bug fixes merged
in) and then when you're ready to release the code, you merge the release branch into the
master branch and then into the develop branch.
If a serious bug occurs in production, this flow says that you can create a fix branch from
the master, fix the bug, and then merge this branch into master again directly. You can
also merge it into the release branch if it's present, or into develop otherwise. If you have
merged the code into the release branch, the develop branch will have the fix when you
merge the release branch.
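The Git Flow branch structure described above can be sketched with plain Git commands against a throwaway local repository (the branch names, such as feature/new-feature and release/1.0, are hypothetical):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
git config user.email dev@example.com
git config user.name Dev
git commit --allow-empty -m "initial"
git branch -M master

# develop is the long-lived integration branch.
git checkout -b develop

# A feature branch starts from develop and is merged back into develop;
# it is never merged into master directly.
git checkout -b feature/new-feature
git commit --allow-empty -m "implement feature"
git checkout develop
git merge --no-ff feature/new-feature -m "merge feature"

# A release branch starts from develop; once tested, it is merged into
# master and then back into develop.
git checkout -b release/1.0
git commit --allow-empty -m "stabilization fix"
git checkout master
git merge --no-ff release/1.0 -m "release 1.0"
git checkout develop
git merge --no-ff release/1.0 -m "back-merge release 1.0"
```

After these steps, master holds only released code, while develop holds the release plus anything merged for the next cycle.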
• Git: This is a distributed version control system and is the default version control
provider in Azure DevOps when you create a new project.
• Team Foundation Version Control (TFVC): This is a centralized version control
system where developers have only one version of a file locally, data is stored on a
server, and branches are created on the server (path-based).
The first step when working with Azure DevOps is to create a new project inside your
organization. When you create a new project with Azure DevOps, you're prompted to
choose the version control system you want to use (shown in the red box in the following
screenshot):
Once the project has been provisioned, you can manage your repositories by going to the
Repos hub on the left bar in Azure DevOps (see the following screenshot). This is where
your files will be stored and where you can start creating repositories and managing
branches, pull requests, and so on:
Starting from Repos, every developer can clone a repository locally and work directly
from Visual Studio or Visual Studio Code while being connected to Azure DevOps in
order to push code modifications, pull and create branches, make commits, and start pull
requests.
When you start a new project from scratch, Azure DevOps creates an empty repository
for you. You can load your code into this repository manually (via upload) or you can
clone from a remote repository (for example, GitHub) to Azure DevOps.
In a single project, you can create different repositories and each can have its own set of
permissions, branches, and commits. To create a new repository, just select the Repos hub
and click on New repository, as shown in the following screenshot:
From here, you'll see a window that shows you the clone repository's URL. You can clone
this repository by using the git clone <Repository URL> command or directly in
Visual Studio or Visual Studio Code by using one of the options shown in the following
screenshot:
In Visual Studio Code, you can also clone a repository by going to the Command Palette (Ctrl + Shift + P), selecting the Git: Clone command, and then pasting the repository URL into the prompt that appears:
In order to work with remote repositories on Azure DevOps with Visual Studio Code
more efficiently, I recommend that you install an extension (from the Visual Studio Code
Marketplace) called Azure Repos:
When you click the Import button, the remote GitHub repository import process will
start and you will see an image showing its progress:
Now, all the code modifications have been pushed online to the master branch. If you go
to Azure DevOps in the Repos hub and select the Commits menu, you will see the history
of every commit for the selected branch:
You can create a new branch in Azure DevOps or directly from Visual Studio Code. To
create a new branch, follow these steps:
1. From Azure DevOps, select Branches and then click on New branch:
To create a new branch directly from Visual Studio Code, just click on the branch
name on the bottom bar and select Create new branch…:
4. Now, you can work on your code (maybe developing a new set of features) and make commits on this new branch without affecting the master branch (which will continue to hold the currently released version of your code base).
As an example, here, I have added a new modification to the
MedicineController.cs file. I can stage and commit the modification on the
development branch locally:
5. Then, I can push these modifications to the remote repository on Azure DevOps.
When pushed online, if this is the first time the development branch is being
created, you will receive a message that looks as follows:
Figure 3.37 – The Team:View History command from Visual Studio Code
A branch can be deleted (manually or automatically after a pull request), restored from
accidental deletion, and also be locked (in order to be placed in a read-only state or
to avoid new commits on this branch affecting a merging that is in place). To lock a
particular branch, just select the branch from Azure DevOps and then, from the menu,
select the Lock option:
To specify the branch policies for a particular branch, go to the Branch section in Azure
DevOps, select your branch, and then select the Branch policies menu:
• Basic merge (no fast-forward): This option merges the commit history of the
source branch and creates a merge commit in the target branch. The complete
non-linear history of commits that occurs during development is preserved.
• Squash merge: This creates a single commit in the target branch by compressing the
source branch commits (linear history).
• Rebase and fast-forward: A rebase allows the integration of a pull request branch
into the master branch. Each commit on the pull request is merged into the target
branch individually (linear history).
• Rebase with merge commit: This creates a semi-linear history by rebasing the source branch commits onto the target branch and then creating a merge commit.
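Although Azure DevOps performs these merges server-side, the options map approximately onto local Git operations (an assumption for illustration; branch names here are hypothetical). The following throwaway-repository sketch demonstrates a basic no-fast-forward merge and a squash merge:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
git config user.email dev@example.com
git config user.name Dev
git commit --allow-empty -m "initial"
git branch -M master

# Two commits on a topic branch.
git checkout -b topic
echo a > a.txt && git add a.txt && git commit -m "change 1"
echo b > b.txt && git add b.txt && git commit -m "change 2"
git checkout master

# Basic merge (no fast-forward): preserves the full non-linear history
# and adds a merge commit.
git merge --no-ff topic -m "merge topic"

# Squash merge: the two topic commits are compressed into a single
# commit on a second, hypothetical target branch (linear history).
git checkout -b master2 HEAD~1
git merge --squash topic
git commit -m "squashed topic"
```

Comparing the two branches afterwards shows the difference: master carries every commit plus the merge commit, while master2 carries just one combined commit.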
Build validation
This section allows you to specify a set of rules for building your code before the pull
request can be completed (useful for catching problems early). Upon clicking Add build
policy, a new panel appears:
Cross-repo policies
Instead of defining a policy for each branch you create manually, Azure DevOps allows
you to define cross-repository policies (which will be automatically applied to all the
branches that you create for your project).
To define a policy that will be valid for each repository you'll create, you need to go
to Project settings and then select Cross-repo policies, as shown in the following
screenshot:
• Protect the default branch of each repository (for example, the master branch of
each repo).
• Protect current and future branches matching a specified pattern. Here, you can define a pattern for filtering branches, and the policy will be applied to all the branches that match this pattern.
As an example, if you want to define a policy that will be automatically applied to all the
branches you create for your project, do the following:
In the following sections, we'll learn how to start pull requests in each of these situations.
In the Files section, you can see what this pull request will do in the destination branch
(for every file). As an example, this is what my pull request shows me:
When you click on Complete, you'll be prompted to fill in the Complete pull request
window, which looks as follows:
Regarding the type of merge operation to apply, you can choose from the following
options:
Azure DevOps gives you a nice animated graph to show the final result of the merge. To complete the pull request, click on Complete merge. If any merge conflicts occur, you'll need to resolve them first. With this, the merging phase starts:
If you have an automatic build policy on the target branch (the master branch here), the
build pipeline is executed and then the merge operation is completed:
Tagging a release
Git Tags are references that point to specific points in the Git history. Tags are used in
Azure DevOps for marking a particular release (or branch) with an identifier that will be
shared internally in your team to identify, for example, the "version" of your code base.
As an example, in the previous section, we merged the development branch into the
master branch by using a pull request. Now, the master branch contains our latest
release of the code, which we're now ready to share internally.
To use tags for your branches, in the Repos hub in Azure DevOps, go to the Tags menu:
When you click on Create, the tag will be applied to your branch:
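The same tag can also be created from the command line and shared with your team. In this sketch, v1.0.0 is a hypothetical version identifier, and the remote push is shown as a comment because the throwaway repository has no remote configured:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
git config user.email dev@example.com
git config user.name Dev
echo "release build" > release.txt
git add release.txt
git commit -m "release build"

# Create an annotated tag that marks this commit as a release.
git tag -a v1.0.0 -m "First public release"

# With Azure DevOps as the remote, the tag would be shared using:
#   git push origin v1.0.0
git tag -l
```

Annotated tags carry their own message, author, and date, which makes them well suited to marking release points in the history.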
Summary
In this chapter, we learned how to handle source control management with Azure DevOps and why it's so important when teams work together on code.
We looked at the basic concepts of source control management and Git, the possible strategies to apply when merging code, how to use Azure DevOps to apply SCM, and how to handle repositories, commits, branches, and pull requests from Azure DevOps and development tools such as Visual Studio Code and Visual Studio. We also learned how to apply policies to control source code releases in order to improve the SCM life cycle, how to protect branches, and how to use tags for a branch.
In the next chapter, we'll learn how to create build pipelines with Azure DevOps for
implementing CI/CD practices.
4
Understanding
Azure DevOps
Pipelines
When adopting Azure DevOps in your organization, one of the most important decisions you must make is how to define the pipeline of your development process.
A pipeline is a company-defined model that describes the steps and actions that a code base must go through, from the build phase to the final release phase. It's a key part of any DevOps architecture.
In this chapter, we'll learn how to define and use pipelines with Azure DevOps for
building code.
We will cover the following topics:
• Retention of builds
• Multi-stage pipeline
• Build pipeline with GitHub repositories
• Using container jobs in Azure Pipelines
Let's get started!
Technical requirements
To follow this chapter, you need to have the following:
In DevOps, you can also have a continuous deployment process in place, where
you can automate the deployment of your code modifications to the final production
environments without manual intervention.
The typical DevOps CI/CD loop is represented in the following famous "loop" diagram:
• Commit stage: Here, new code modifications are integrated into the code base and a set of unit tests is run in order to check code integrity and quality.
• Build stage: Here, the code is automatically built and then the final results of the
build process (artifacts) are pushed to the final registry.
• Test stage: The build code will be deployed to preproduction, where the final testing
will be performed and then go to production deployment. Here, the code is tested
by adopting alpha and beta deployments. The alpha deployment stage is where
developers check the performance of their new builds and the interactions between
builds. In the Beta deployment stage, developers execute manual testing in order to
double-check whether the application is working correctly.
• Production deployment stage: This is where the final application, after successfully
passing all the testing requirements, goes live to the production stage.
There are lots of benefits of implementing a CI/CD process in your organizations. The
main benefits are as follows:
• Improved code quality and early bug detection: By adopting automated tests, you
can discover bugs and issues at an early stage and fix them accordingly.
• Complete traceability: The whole build, test, and deployment process is tracked
and can be analyzed later. This guarantees that you can inspect which changes in
a particular build are included and the impact that they can have on the final tests
or release.
• Faster testing and release phases: Automating the building and testing of your code base on every new commit (or before a release) speeds up the testing and release cycle.
In the next section, we'll provide an overview of the service offered by the Azure platform
for implementing CI/CD: Azure Pipelines.
• It's platform and language independent, which means you can build code on every
platform using the code base you want.
• It can be integrated with different types of repositories (Azure Repos, GitHub,
GitHub Enterprise, BitBucket, and so on).
• Lots of extensions (standard and community-driven) are available for building your
code and for handling custom tasks.
• It allows you to deploy your code to different cloud vendors.
• You can work with containerized applications and platforms such as Docker, Azure Container Registry, or Kubernetes.
• An organization in Azure DevOps, where you can create public or private projects
• Source code stored in a version control system (such as Azure DevOps Repos or GitHub)
• Using the Classic interface: This allows you to select some tasks visually from a list
of possible tasks. You only need to fill in the parameters for these tasks.
• Using a markup language called YAML: The pipeline is defined by creating a YAML file inside your repository with all the needed steps.
Using the classic interface can be easier initially, but remember that many features are only available in YAML pipelines. A YAML pipeline definition is a file that can be versioned and controlled just like any other file inside a repository. You can also easily move the pipeline definition between projects (this is not possible with the Classic interface).
pool:
  vmImage: 'ubuntu-latest'
jobs:
- job: job1
  steps:
  - bash: echo "Hello!"
  - bash: echo "I'm job 1"
- job: job2
  steps:
  - bash: echo "Hello again…"
  - bash: echo "I'm job 2"
If you're using stages when defining your pipeline, this is what is called a fan-out/fan-in
scenario:
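As a sketch of this fan-out/fan-in shape in YAML (the stage and job names here are hypothetical), a Build stage can fan out into two parallel test stages, which then fan back in to a single Deploy stage via dependsOn:

```yaml
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - bash: echo "building"

# Fan-out: both test stages depend only on Build, so they run in parallel.
- stage: TestWindows
  dependsOn: Build
  jobs:
  - job: TestWin
    steps:
    - bash: echo "testing on Windows"
- stage: TestLinux
  dependsOn: Build
  jobs:
  - job: TestLin
    steps:
    - bash: echo "testing on Linux"

# Fan-in: Deploy waits for both test stages to complete.
- stage: Deploy
  dependsOn:
  - TestWindows
  - TestLinux
  jobs:
  - job: DeployJob
    steps:
    - bash: echo "deploying"
```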
When defining agents for your pipeline, you have essentially two types of possible agents:
Microsoft-hosted agents
Microsoft-hosted agents are the simplest way to define an agent for your pipeline. Azure Pipelines provides a Microsoft-hosted agent pool by default, called Azure Pipelines:
Table 1.1
Each of these images has its own set of software automatically installed. You can install
additional tools by using the pre-defined Tool Installer task in your pipeline definition.
More information can be found here: https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/?view=azure-devops#tool.
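For example (a hedged sketch; the specific version numbers are placeholders), tool installer tasks such as NodeTool and UseDotNet can pin specific tool versions on top of the software the image preinstalls:

```yaml
steps:
# Install a specific Node.js version before the build steps run.
- task: NodeTool@0
  inputs:
    versionSpec: '16.x'
# Install a specific .NET SDK version.
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '6.0.x'
```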
When you create a pipeline using a Microsoft-hosted agent, you just need to specify
the name of the virtual machine image to use for your agent from the preceding table. As
an example, this is the definition of a hosted agent that's using Windows Server 2019 with
a Visual Studio 2019 image:
- job: Windows
  pool:
    vmImage: 'windows-latest'
Self-hosted agents
While Microsoft-hosted agents are a SaaS service, self-hosted agents are private agents that you can configure as per your needs by using Azure virtual machines or directly using your on-premises infrastructure. You are responsible for providing all the necessary software and tools to execute your pipeline, and for maintaining and upgrading your agent.
• Windows
• Linux
• macOS
• Docker
These steps are similar for all the environments. Next, we'll learn how to create a self-
hosted Windows agent.
Here, you can create a new personal access token for your organization with an
expiration date and with full access or with a custom defined access level (if you select
the custom defined scope, you need to select the permission you want for each scope).
To see the complete list of available scopes, click on the Show all scopes link at the bottom
of this window:
Please check that the Agent Pools scope has the Read & manage permission enabled.
When finished, click on Create and then copy the generated token before closing the
window (it will only be shown once).
Important Note
The user that you will be using for the agent must be a user with permissions
to register the agent. You can check this by going to Organization Settings |
Agent pools, selecting the Default pool, and clicking on Security.
Now, you need to download the agent software and configure it. From Organization
Settings | Agent Pools, select the Default pool and from the Agents tab, click on
New agent:
The Get the agent window will open. Select Windows as the target platform, select
x64 or x86 as your target agent platform (machine) accordingly, and then click on the
Download button:
When the agent is online, it's ready to accept queued code builds. Remember that you can also install multiple agents on the same machine (for example, if you want to execute multiple pipelines or handle jobs in parallel), but this scenario is only recommended if the agents will not share resources.
Scalars
As an example, the following are scalar variables that have been defined in YAML:
Number: 1975
quotedText: "some text description"
notQuotedtext: strings can also be written without quotes
boolean: true
nullKeyValue: null
You can also define multi-line keys by using ?, followed by a space, as follows:
? |
This is a key
that has multiple lines
: and this is its value
You can define lists (sequences) of items, which can also be nested, as follows:
Cars:
- Fiat
- Mercedes
- BMW
Drivers:
- name: Stefano Demiliani
  age: 45
  Driving license type:
  - type: full car license
    license id: ABC12345
    expiry date: 2025-12-31
Dictionaries
You can define a Dictionary object by using YAML in the following way:
CarDetails:
  make: Mercedes
  model: GLC220
  fuel: Gasoline
Document structure
YAML uses three dashes, ---, to separate directives from document content and
to identify the start of a document. As an example, the following YAML defines two
documents in a single file:
---
# Products purchased
- item: Surface Book 2
  quantity: 1
- item: Surface Pro 7
  quantity: 3
- item: Arc Mouse
  quantity: 1
---
invoice: 20-198754
date: 2020-05-27
bill-to: C002456
Name: Stefano Demiliani
address:
  lines: |
    Viale Pasubio, 21
    c/o Microsoft House
  city: Milan
  state: MI
  postal: 20154
ship-to: C002456
product:
- itemNo: ITEM001
  quantity: 1
  description: Surface Book 2
  price: 1850.00
- sku: ITEM002
  quantity: 1
  description: Arc Mouse
  price: 65.00
tax: 80.50
total: 1995.50
comments: >
  Please deliver during office hours.
  Leave at the reception.
Now that we've provided a quick overview of the YAML syntax, in the next section, we'll
learn how to create a build pipeline with Azure DevOps.
To create a build pipeline with Azure DevOps, you need to go to the Pipelines hub and
select the Pipelines action:
1. Using a YAML file to create your pipeline definition. This is what happens when you
select the repository in this window.
2. Using the classic editor (graphical user interface). This is what happens when you
click on the Use the classic editor link at the bottom of this page.
In the next section, we'll learn how to create a build pipeline by using these two methods.
Then, you need to choose a template for the kind of app you're building. You have a set of
predefined templates to choose from (that you can customize later), but you can also start
from an empty template:
If predefined templates fit your needs, you can start by using them; otherwise, it's
recommended to create a custom pipeline by selecting the actions you need.
Here, my application that's stored in the Azure DevOps project repository is an ASP.NET web application (an e-commerce website project called PartsUnlimited; you can find the public repository at the following URL: https://github.com/Microsoft/PartsUnlimited), so I've selected the ASP.NET template.
When selected, this is the pipeline template that will be created for you automatically:
The agent job starts by installing the NuGet package manager and restoring the required
packages for building the project in the selected repository. For these actions, the pipeline
definition contains the tasks that you can see in the following screenshot:
There's also a task for testing the solution and publishing the test results:
The next section is called Triggers. Here, you can define the triggers that start your pipeline. By default, no triggers are enabled initially, but you can enable CI to automatically start your pipeline on every commit to the selected branch:
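In a YAML pipeline, the equivalent CI trigger is expressed directly in the definition file; the branches below are a hedged example, and the `release/*` pattern is hypothetical:

```yaml
# Queue the pipeline on every push to master or to any release/* branch.
trigger:
  branches:
    include:
    - master
    - release/*
```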
Important Note
Enabling CI is a recommended practice if you want every piece of code that's
committed on a branch (for example, on the master branch) to always be tested
and safely controlled. In this way, you can be assured that the code is always
working as expected.
In the Options tab, you can set some options related to your build definition. For example, here, you can create links to all the work items so that they're linked to associated changes when a build completes successfully, create work items when a build fails, set the status badge for your pipeline, specify timeouts, and so on:
Creating a build pipeline with Azure DevOps 133
Once you've finished defining the pipeline, you can click Save & queue to save your
definition. By clicking on Save and run, the pipeline will be placed in a queue and wait
for an agent:
You can follow the execution of each step of the pipeline and see the related logs. If the
pipeline ends successfully, you can view a summary of its execution:
To start creating a YAML pipeline, go to the Pipeline section in Azure DevOps and click
on New Pipeline.
Here, instead of selecting the classic editor (as we did in the previous section), just select
the type of repository where your code is located (Azure Repos Git, GitHub, BitBucket,
and so on):
The system now analyzes your repository and proposes a set of available templates
according to the code stored in the repository itself. You can start from a blank YAML
template or you can select a template. Here, I'm selecting the ASP.NET template:
# ASP.NET
# Build and test ASP.NET projects.
# Add steps that publish symbols, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Here, I add an extra task at the end of the pipeline for publishing the build artifacts:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: '$(Parameters.ArtifactName)'
  condition: succeededOrFailed()
As you can see, the YAML file contains the trigger that starts the pipeline (here, this is
a commit on the master branch), the agent pool to use, the pipeline variables, and the
sequence of each task to execute (with its specific parameters).
Click on Save and run as shown in the previous screenshot to queue the pipeline and have
it executed. The following screenshot shows the executed YAML pipeline.
Figure 4.37 – YAML pipeline task selection
When you choose to create a pipeline with YAML, Azure DevOps creates a file that's
stored in the same repository that your code is stored in:
For a complete reference to the YAML schema for a pipeline, I suggest following this link: https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema
Retention of builds
When you run a pipeline, Azure DevOps logs each step's execution and stores the final artifacts and test results for each run.
Azure DevOps has a default retention policy for pipeline executions of 30 days. You can change these default values by going to Project settings | Pipelines | Settings. If you need to keep build outputs beyond the retention period, you can add a Copy files task to your pipeline to copy them to an external location, such as a network share:

- task: CopyFiles@2
  displayName: 'Copy files to shared network'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**'
    TargetFolder: '\\networkserver\storage\$(Build.BuildNumber)'
Important Note
Remember that any data saved as artifacts with the Publish Build Artifacts
task is periodically deleted.
More information about the Copy files task can be found here: https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/copy-files?view=azure-devops&tabs=yaml.
Multi-stage pipeline
As we explained previously, you can organize the jobs in your pipeline into stages.
Stages are logical boundaries inside a pipeline flow (units of work that you can assign to an agent) that allow you to isolate work, pause the pipeline, and execute checks or other actions. By default, every pipeline is composed of a single stage, but you can create more than one and arrange those stages into a dependency graph.
The basic YAML definition of a multi-stage pipeline is as follows:
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build!
- stage: Test
  jobs:
  - job: TestOne
    steps:
    - script: echo Test 1
  - job: TestTwo
    steps:
    - script: echo Test 2
- stage: Deploy
  jobs:
  - job: Deploy
    steps:
    - script: echo Deployment
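In the skeleton above, the stages run sequentially in the order they are declared. To arrange them into an explicit dependency graph instead, you can use the dependsOn keyword — a sketch (the stage names here are illustrative):

```yaml
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build!
# TestOne and TestTwo both depend only on Build, so they run in parallel.
- stage: TestOne
  dependsOn: Build
  jobs:
  - job: TestOneJob
    steps:
    - script: echo Test 1
- stage: TestTwo
  dependsOn: Build
  jobs:
  - job: TestTwoJob
    steps:
    - script: echo Test 2
# Deploy waits until both test stages have finished.
- stage: Deploy
  dependsOn:
  - TestOne
  - TestTwo
  jobs:
  - job: DeployJob
    steps:
    - script: echo Deployment
```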
As an example of how to create a multi-stage pipeline with YAML, let's look at a pipeline
that builds code in your repository (with .NET Core SDK) and publishes the artifacts
as NuGet packages. The pipeline definition is as follows. The pipeline uses the stages
keyword to identify that this is a multi-stage pipeline.
In the first stage definition (Build), we have the tasks for building the code:
trigger:
- master

stages:
- stage: 'Build'
  variables:
    buildConfiguration: 'Release'
  jobs:
  - job:
    pool:
      vmImage: 'ubuntu-latest'
    workspace:
      clean: all
    steps:
    - task: UseDotNet@2
      displayName: 'Use .NET Core SDK'
      inputs:
        packageType: sdk
        version: 2.2.x
        installationPath: $(Agent.ToolsDirectory)/dotnet
    - task: DotNetCoreCLI@2
      displayName: 'NuGet Restore'
      inputs:
        command: restore
        projects: '**/*.csproj'
    - task: DotNetCoreCLI@2
      displayName: 'Build Solution'
      inputs:
        command: build
        projects: '**/*.csproj'
        arguments: '--configuration $(buildConfiguration)'
Here, we installed the .NET Core SDK by using the UseDotNet standard task template that's available in Azure DevOps (more information can be found here: https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/tool/dotnet-core-tool-installer?view=azure-devops). After that, we restored the required NuGet packages and built the solution.
Now, we have the task of creating the release version of the NuGet package. This package is saved in the packages/releases folder of the artifact staging directory. Here, we set nobuild to true because this task does not have to rebuild the solution again (no more compilation):
- task: DotNetCoreCLI@2
  displayName: 'Create NuGet Package - Release Version'
  inputs:
    command: pack
    packDirectory: '$(Build.ArtifactStagingDirectory)/packages/releases'
    arguments: '--configuration $(buildConfiguration)'
    nobuild: true
As the next step, we have the task of creating the prerelease version of the NuGet package. In this task, we're using the buildProperties option to append the build number to the package version as a prerelease suffix (for example, if the package version is 2.0.0.0 and the build number is 20200521.1, the resulting package version will be 2.0.0.0-20200521.1). Here, a build of the package is mandatory (for retrieving the build ID):
- task: DotNetCoreCLI@2
  displayName: 'Create NuGet Package - Prerelease Version'
  inputs:
    command: pack
    buildProperties: 'VersionSuffix="$(Build.BuildNumber)"'
    packDirectory: '$(Build.ArtifactStagingDirectory)/packages/prereleases'
    arguments: '--configuration $(buildConfiguration)'

- publish: '$(Build.ArtifactStagingDirectory)/packages'
  artifact: 'packages'
The second stage (PublishPrereleaseNuGetPackage) publishes the prerelease package to the internal feed called Test:

- stage: 'PublishPrereleaseNuGetPackage'
  displayName: 'Publish Prerelease NuGet Package'
  dependsOn: 'Build'
  condition: succeeded()
  jobs:
  - job:
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - checkout: none
    - download: current
      artifact: 'packages'
    - task: NuGetCommand@2
      displayName: 'Push NuGet Package'
      inputs:
        command: 'push'
        packagesToPush: '$(Pipeline.Workspace)/packages/prereleases/*.nupkg'
        nuGetFeedType: 'internal'
        publishVstsFeed: 'Test'
The final stage (PublishReleaseNuGetPackage) pushes the release package to the external NuGet feed:

- stage: 'PublishReleaseNuGetPackage'
  displayName: 'Publish Release NuGet Package'
  dependsOn: 'PublishPrereleaseNuGetPackage'
  condition: succeeded()
  jobs:
  - deployment:
    pool:
      vmImage: 'ubuntu-latest'
    environment: 'nuget-org'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: NuGetCommand@2
            displayName: 'Push NuGet Package'
            inputs:
              command: 'push'
              packagesToPush: '$(Pipeline.Workspace)/packages/releases/*.nupkg'
              nuGetFeedType: 'external'
              publishFeedCredentials: 'NuGet'
This stage uses a deployment job to publish the package to the configured environment
(here, this is called nuget-org). An environment is a collection of resources inside
a pipeline.
In the NuGetCommand task, we specify the package to push and that the feed where
we're pushing the package to is external (nuGetFeedType). The feed is retrieved
by using the publishFeedCredentials property, set to the name of the service
connection we created.
Once the environment has been created, in order to publish the package to NuGet, you need to create a new service connection by going to Project Settings | Service Connections | Create Service Connection, selecting NuGet from the list of available service connection types, and then configuring the connection according to your NuGet account:
1. To use Azure Pipelines to build your GitHub repository, you need to add the
Azure DevOps extension to your GitHub account. From your GitHub page,
select the Marketplace link from the top bar and search for Azure Pipelines.
Select the Azure Pipelines extension and click on Set up a plan, as shown in the
following screenshot:
2. Select the Free plan, click the Install it for free button, and then click Complete
order and begin installation.
3. Now, the Azure Pipelines installation will ask you if this app should be available for
all your repositories or only for selected repositories. Select the desired option and
click on Install:
4. You will now be redirected to Azure DevOps, where you can create a new project
(or select an existing one) for handling the build process. Here, I'm going to create
a new project:
5. Now, you need to authorize Azure Pipelines so that it can access your
GitHub account:
When the necessary authorization is given, the project will be created for you on
Azure DevOps and the pipeline creation process will start. You'll be immediately
prompted to select a GitHub repository for the build from the list of available
GitHub repositories in your account:
6. Here, I'm selecting a repository where I have an Azure Function project. As you
can see, Azure Pipelines has recognized my project and proposed a set of available
templates for the pipeline (but you can also start from a blank template or from
a YAML file that you have in any branch of the repository). Here, I will select
.NET Core Function App to Windows on Azure:
8. To do so, select your pipeline in Azure DevOps, click on the three dots on the right,
and select Status badge:
Then, to add a new agent job to your pipeline (for executing the other task independently), click the three dots beside the pipeline and select Add an agent job:
Finally, we'll add a new agent job (here, this is called Agent Job 3) to execute the Final Task, which will run on a Microsoft-hosted agent. However, this job depends on Agent Job 1 and Agent Job 2:
In this way, the first two jobs start in parallel, and the final job waits until the two previous jobs have finished executing.
For more information about parallel jobs in an Azure pipeline, I recommend that you check out this page: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml
• --image is used to select the name of the Azure Pipelines image for creating your agent, as described here: https://hub.docker.com/_/microsoft-azure-pipelines-vsts-agent.
• VSTS_POOL is used to select the agent pool for your build agent.
Remember that you can start and stop an ACI instance by using the az container
stop and az container start commands. This can help you save money.
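For example (the container name and resource group below are placeholders for the values you used when creating the ACI instance):

```shell
# Stop the agent container when no builds are running, to avoid paying for idle compute
az container stop --name <aci-agent-name> --resource-group <resource-group>

# Start it again before queuing new builds
az container start --name <aci-agent-name> --resource-group <resource-group>
```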
pool:
  vmImage: 'windows-2019'

container: mcr.microsoft.com/windows/servercore:ltsc2019

steps:
- script: date /t
  displayName: Gets the current date
- script: dir
  workingDirectory: $(Agent.BuildDirectory)
  displayName: List the content of a folder
As we mentioned previously, to run a job inside a Windows container, you need to use
the windows-2019 image pool. It's required that the kernel version of the host and the
container match, so here, we're using the ltsc2019 tag to retrieve the container's image.
For a Linux-based pipeline, you need to use the ubuntu-16.04 image:
pool:
  vmImage: 'ubuntu-16.04'

container: ubuntu:16.04

steps:
- script: printenv
As you can see, the pipeline creates a container based on the selected image and runs the
command (steps) inside that container.
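If several jobs need the same image, you can also declare it once as a named container resource and reference it by its alias — a sketch (the alias linux-build is illustrative):

```yaml
resources:
  containers:
  - container: linux-build   # alias used below
    image: ubuntu:16.04

pool:
  vmImage: 'ubuntu-16.04'
container: linux-build
steps:
- script: printenv
```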
Summary
In this chapter, we provided an overview of the Azure Pipelines service and we saw
how to implement a CI/CD process by using Azure DevOps. We also saw how to create
a pipeline for code hosted in a repository by using the graphical interface and by using
YAML, as well as how to use and create build agents. We then looked at how to create
a build pipeline by using the classic editor and by using a YAML definition. We also saw
an example of a multi-stage pipeline and how to use Azure DevOps pipelines to build
code inside a GitHub repository, before looking at how to use parallel tasks in a build
pipeline to improve build performance. Finally, we learned how to create a build agent
on Azure Container Instances and how to use a container's jobs.
In the next chapter, we'll learn how to execute quality tests for our code base in
a build pipeline.
5
Running Quality Tests in a Build Pipeline
In the previous chapter, we introduced Azure Pipelines and learned how to implement a
CI/CD process using Azure DevOps, GitHub, and containers.
In this chapter, we are going to cover how to run quality tests in a build pipeline. We
will begin by explaining what the benefits of automatic testing are. Then, we will look
at how to run unit tests in a build pipeline, how to perform code coverage testing, and
how to view the test results. Finally, we will cover how to use Feature Flags to test code in
production.
The following topics will be covered in this chapter:
• The benefits of automatic testing
• Running unit tests in a build pipeline
• Performing code coverage testing
• Viewing the test results
• Using Feature Flags to test code in production
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization. The
organization that will be used in this chapter is called the Parts Unlimited organization.
It was created in Chapter 1, Azure DevOps Overview. You also need to have Visual Studio 2019 installed, which can be downloaded from https://visualstudio.microsoft.com/downloads/. For the last demo, you will need Visual Studio Code with the C# extension installed and the .NET Core SDK, version 3.1 or later.
The source code for our sample application can be downloaded from the following link: https://github.com/PacktPublishing/Learning-Azure-DevOps---B16392/tree/master/Chapter%205/RazorFeatureFlags
In conjunction with CI/CD, where code changes are automatically built, tested, and released, automatic testing protects teams from releasing bugs into their software. There is a trade-off, however: developers need to dedicate more time to writing and maintaining test code. By investing this extra time, the outcome will be higher-quality code that has been proven to function as expected.
There are different types of automated testing you can perform; for instance, you can
run regression, acceptance, and security tests. In this chapter, we are going to focus on
development testing, which is also used in CI and can be done directly from the build
pipeline.
Visual Studio and Azure DevOps both offer features for testing. They are test framework-
agnostic, so you can plug in your own framework and bring third-party tools as well. You
can easily add test adapters in order to run the tests and explore the results. This can make
testing part of your daily software build process.
In the upcoming sections, we will cover unit testing and code coverage testing, which
is part of development testing. First, we will describe how to run an automatic unit test
from a build pipeline, and then how to perform code coverage and UI tests from a build
pipeline.
7. From Visual Studio, you will be able to build and run the application. To do this,
press F5. Alternatively, from the top menu, select Debug > Start Debugging:
Now that everything is working, we can start creating a build pipeline, which includes
running the unit test projects.
Running unit tests in a build pipeline 171
4. On the next screen, make sure that Azure Repos Git is selected. Keep the default
settings as they are and click Continue:
5. Next, we need to select a template. Select ASP.NET from the overview and click
Apply:
6. With that, the pipeline will be created. Various tasks are added to the pipeline by
default. We are going to use these tasks here. For this demo, we are going to focus
on the Test Assemblies task. Click on this task and make sure that version 2 is
selected. Under Test selection, you will see the following settings:
9. The search folder is the folder that's used to search for test assemblies. In this case,
this is the default working directory.
10. The test results folder is where test results are stored. The results directory will
always be cleaned before the tests are run.
11. We are now ready to run the test. Click on Save & queue from the top menu and
then again on the Save & queue sub-menu item to execute the build pipeline:
12. The wizard for running the pipeline will open. Here, you can specify a comment
and then select an Agent Pool, Agent Specification, and which Branch/tag you
would like to use:
The overview page of the job will be displayed, which is where you can view the
status of the execution:
15. On the Tests screen, you will see the number of tests you have, as well as the tests
that passed and failed. You can also see the duration of the run from here.
16. At the bottom of the screen, you can filter by specific tests. For instance, you can
filter for tests that have been Passed, Failed, and Aborted:
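The classic Test Assemblies task configured in the steps above corresponds to the VSTest@2 task in a YAML pipeline. A sketch using the same default test selection values (assumed to match the classic defaults):

```yaml
- task: VSTest@2
  displayName: 'Test Assemblies'
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\*test*.dll
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
```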
In this demonstration, we have created a build pipeline that includes automatic unit
testing for our source code. In the next section, we are going to look at code coverage
testing.
Important Note
Cobertura and JaCoCo are both Java tools that calculate the percentage of code that's accessed by tests. For more information about Cobertura, you can refer to https://cobertura.github.io/cobertura/. For more information about JaCoCo, you can refer to https://www.eclemma.org/jacoco/.
In the next section, we are going to look at how to perform code coverage testing by using Azure DevOps.
1. With the build pipeline open, select the Edit button in the right-hand corner:
3. Now, Save and queue the build, specify a save comment, and wait until the pipeline
is fully executed. The Visual Studio Test task creates an artifact that contains
.coverage files that can be downloaded and used for further analysis in Visual
Studio.
4. After executing the pipeline, on the overview page of the build, select Code
Coverage from the top menu and click on Download code coverage results. A file
with the .coverage extension will be downloaded to your local filesystem.
5. Double-click the downloaded file so that it opens in Visual Studio. From here, you
can drill down into the different classes and methods to get an overview of the test
results:
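In a YAML pipeline, the same coverage collection is enabled with a single input on the VSTest@2 task — a sketch:

```yaml
- task: VSTest@2
  inputs:
    codeCoverageEnabled: true   # produces .coverage files alongside the test results
```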
1. Go back to the build pipeline and select the pipeline that ran last. Click Test from
the top menu.
2. For the results table, make sure that Passed is selected and that Failed and Aborted
have been deselected:
3. Then, select a couple of tests. After doing this, from the top menu, click Link:
6. Now, click on one of the test results that's linked to the work item. This will show
the details for this item. From here, you can click on work items from the top menu.
This will display the work item that we linked in the previous step:
In this demonstration, we covered how to link test results to work items. In the next
section, we are going to cover how to use Feature Flags to test in production.
1. With Visual Studio Code open, click on Terminal > New terminal from the top
menu.
2. In the Terminal, add the following line of code to create a new project:
dotnet new webapp -o RazorFeatureFlags
code -r RazorFeatureFlags
3. The newly created project will now open. Open the Terminal once more and add
the following line of code to test the project:
dotnet run
4. Navigate to the .NET Core application by clicking on one of the localhost URLs in
the Terminal output. You will then see the following:
6. Once the package has been installed, open the Program.cs class and add the
following using statement:
using Microsoft.FeatureManagement;
8. Then, open the Startup.cs class. Here, add the using statement again and add
the following to the ConfigureServices method:
public void ConfigureServices(IServiceCollection services)
{
    //...
    services.AddFeatureManagement();
}
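By default, AddFeatureManagement reads flags from the FeatureManagement section of the application configuration. The ChangeBanner flag used later in this demo would therefore be declared in appsettings.json roughly as follows (a sketch; the flag name must match the one the code checks):

```json
{
  "FeatureManagement": {
    "ChangeBanner": false
  }
}
```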
9. Now, we can inject this into a page model. Open the code behind the home page of the application, which can be found in the Index.cshtml.cs file, and add the using statement again. After updating the IndexModel class to expose the welcome message, reference it from the markup in Index.cshtml:
<div class='text-center'>
    <h1 class='display-4'>@IndexModel.WelcomeMessage</h1>
    <p>Learn about <a href='https://docs.microsoft.com/aspnet/core'>building Web apps with ASP.NET Core</a>.</p>
</div>
12. This will inject the welcome message into the web page.
13. Build and run the code by opening a new Terminal window and adding the
following line of code to the Terminal:
dotnet run
14. Let the application open in the browser and open the appsettings.json
file again in Visual Studio Code. Change the ChangeBanner Feature Flag to
true and reload the website in your browser by pressing F5. This will result in the
following output:
Figure 5.26 – Welcome message changed based on the Feature Flag provided
In this demonstration, we added some Feature Flags to our application using Microsoft's Microsoft.FeatureManagement NuGet package. Using these Feature Flags, we changed the welcome message on the home page of the application. This concludes this chapter.
Summary
In this chapter, we covered how to run quality tests in a build pipeline in more depth.
With this, you can now run unit tests from the build pipeline and execute coverage tests
from Azure DevOps. Lastly, we covered how to create Feature Flags inside an application that you can use in your future projects as well.
In the next chapter, we are going to focus on how to host build agents in Azure Pipelines.
Further reading
Check out the following links for more information about the topics that were covered in
this chapter:
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization and an
Azure subscription to create a VM.
Getting the project pre-requisites ready: This section requires you to have the
PartsUnlimited project ready in your own DevOps organization. If you are continuing
from the previous chapter, Chapter 5, Running Quality Tests in a Build Pipeline, you should
have it already.
If you do not have the project ready in your DevOps org, you can import it using the Azure DevOps Demo Generator – https://azuredevopsdemogenerator.azurewebsites.net/:
Azure pipeline agents are grouped into agent pools. You can create as many agent pools
as you require.
194 Hosting Your Own Azure Pipeline Agent
Important note
Azure Pipelines supports running basic tasks, such as invoking a REST API or an Azure Function, without the need to have any agents. Please refer to https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml#server-jobs for more details about the agentless execution of Azure Pipelines.
Azure Pipelines offers two types of agents:
• Microsoft-hosted agents
• Self-hosted agents
Microsoft-hosted agents
Microsoft-hosted agents are fully managed VMs, deployed and managed by Microsoft.
You can choose to use a Microsoft-hosted agent with no additional pre-requisites
or configurations. Microsoft-hosted agents are the simplest and are available at no
additional cost.
Every time you execute a pipeline, you get a new VM for running the job, and it's
discarded after one use.
Self-hosted agents
Self-hosted agents are servers owned by you, running in any cloud platform or data center
owned by you. Self-hosted agents are preferred due to various reasons, including security,
scalability, and performance.
You can configure your self-hosted agent to have the dependencies pre-installed, which
will help you decrease the time for your pipeline execution.
Planning and setting up your self-hosted Azure pipeline agent 195
If your code base is small and the build pipeline is optimized, it's better to use Microsoft-hosted agents, as it won't take much time to download all the dependencies on the fly. However, if you have a large code base with numerous dependencies, using a self-hosted agent will be a better option, as you can eliminate various build preparation tasks from the pipeline by configuring them in your self-hosted environment in advance. Self-hosted agents are also the only option in the case of highly secure and customized build pipelines that interact with other services running in your network. If you need more CPU and memory than what is provided with Microsoft-hosted agents, you can use self-hosted agents with your customized sizing.
It is recommended to start with Microsoft-hosted agents and move to self-hosted at
a later stage when the Microsoft-hosted agents become a bottleneck in your build and
release process.
Based on the specifications identified in this process, you can choose to start with a vanilla
OS and install your required frameworks and build tools, or choose a pre-created image in
the cloud.
Supported OSes
The following list shows the supported OSes:
• Windows-based:
a) Windows 7, 8.1, or 10 (if you're using a client OS)
b) Windows Server 2008 R2 SP1 or higher (Windows Server OS)
• Linux-based:
a) CentOS 7, 6
b) Debian 9
c) Fedora 30, 29
d) Linux Mint 18, 17
e) openSUSE 42.3 or later
f) Oracle Linux 7
g) Red Hat Enterprise Linux 8, 7, 6
h) SUSE Enterprise Linux 12 SP2 or later
i) Ubuntu 18.04, 16.04
• ARM32:
a) Debian 9
b) Ubuntu 18.04
• macOS-based:
a) macOS Sierra (10.12) or higher
Pre-requisite software
Based on the OS you choose, you will have to install the following pre-requisites before
you can set up the host as an Azure pipeline agent:
• Windows-based:
a) PowerShell 3.0 or higher
b) .NET Framework 4.6.2 or higher
• Linux/ARM/macOS-based:
a) Git 2.9.0 or higher.
b) RHEL 6 and CentOS 6 require installing the specialized RHEL.6-x64 version of
the agent.
The agent installer for Linux includes a script to auto-install the required pre-requisites. You can complete the pre-requisites by running ./bin/installdependencies.sh, which is available in the downloaded agent directory. Downloading the agent is covered in the following sections of this chapter.
Important note
Please note that the preceding pre-requisites are just to install the Azure
Pipelines agent on the host; based on your application development
requirements, you may need to install additional tools, such as Visual Studio
build tools, a Subversion client, and any other frameworks that your application
might need.
Now that we have understood the pre-requisites, we will create an agent VM for our
sample project, PartsUnlimited.
Important note
Visual Studio 2019-based images are available in the Azure portal directly in
the search results.
4. Click Create to start creating a VM. Choose the required subscription, resource
group, and other settings based on your preference.
5. In further pages, you can modify the settings to use a pre-created virtual network,
as well as customize the storage settings and other management aspects. Please
review the documentation to explore more on VM creation in Azure.
Important note
Please follow the Microsoft docs to learn more about creating a VM in Azure: https://docs.microsoft.com/en-us/azure/virtual-machines/windows/quick-create-portal.
Now that we have the VM ready, we'll set it up as an Azure Pipelines agent for our project.
b) Name and Description: Give a meaningful name that you can use to refer to it later:
We will now set up an access token for the agent VM to be able to authenticate with Azure
DevOps.
1. Sign in to your Azure DevOps organization with the admin user account.
2. Go to your user profile and click Personal access tokens:
Figure 6.11 – Creating a personal access token for the agent pool
7. Once you click Create, Azure DevOps will display the personal access token. Please
copy the token and save it in a secure location. If you happen to lose this token, you
must create a new token for setting up new agents.
We will use this token when setting up the Azure Pipelines agent.
Important note
You will need to give additional permissions when creating a token if you plan to use deployment groups (more information here: https://docs.microsoft.com/en-us/azure/devops/pipelines/release/deployment-groups/?view=azure-devops).
Now that we have completed the agent pool setup in Azure DevOps, we can start
installing the agent in the VM we created earlier.
1. In your Azure DevOps account, browse to Organization Settings > Agent Pools.
2. Select your newly created agent pool and click on New agent:
3. On the next page, you can download the agent installer based on the OS and
architecture (x64/x32). In our example, we're using a Windows Server 2016-
based VM. We'll choose Windows and the x64 architecture. You can also copy the
download URL and use it to download the agent directly inside your self-hosted
agent machine. You can also choose to follow the installation steps given in the
Azure DevOps portal based on the OS for your agent machine:
Tip
If you are unable to download the agent file on your agent VM, you can use a different browser than Internet Explorer or disable the Enhanced IE Security configuration from Server Manager. You can refer to https://www.wintips.org/how-to-disable-internet-explorer-enhanced-security-configuration-in-server-2016/ to learn how to disable the enhanced Internet Explorer security configuration.
6. It will take a minute or two to extract the files. Please browse to the new directory
once it's completed. You should see files as displayed in the following screenshot:
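After extracting the files, you configure the agent by running the configuration script from that directory. A typical unattended run looks roughly like this (the organization URL, PAT, pool, and agent name are placeholders for your own values):

```powershell
# Hypothetical unattended configuration of the agent; --runAsService installs it
# as a Windows service so it starts automatically with the VM.
.\config.cmd --unattended --url https://dev.azure.com/<organization> --auth pat --token <personal-access-token> --pool <agent-pool> --agent <agent-name> --runAsService

# Alternatively, configure interactively and start the agent in the foreground:
.\run.cmd
```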
You now have a ready-to-use, self-hosted agent for your Azure pipelines! This self-hosted
agent can be used to run your Azure pipeline jobs, and you can add as many agents as you
want in a similar fashion. You may have various types of hosted agents in one pool; the
appropriate agent for the job is automatically selected based on the pipeline requirements,
or you can select an agent specifically at execution time. Typically, in a large environment,
you'd pre-create an image of an agent server so that it is faster to provision additional
agents whenever needed. In the next section, we will update our pipeline to leverage
this newly set-up self-hosted agent. Please refer to this documentation if you wish to use
a Linux-based hosted agent: https://docs.microsoft.com/en-us/azure/
devops/pipelines/agents/v2-linux?view=azure-devops.
Refer to the following link for macOS-based agents: https://docs.microsoft.
com/en-us/azure/devops/pipelines/agents/v2-osx?view=azure-
devops.
Important note
If your self-hosted agent machine is behind a network firewall or proxy, you
must define the proxy address while installing the Azure pipeline agent. You
can do that by specifying the proxy URL, username, and password with a
config command:
./config.cmd --proxyurl http://127.0.0.1:8888
--proxyusername "myuser" --proxypassword "mypass"
1. Log in to the Azure DevOps portal and browse to the PartsUnlimited project.
2. Browse to Pipelines:
9. This will start executing the pipeline job on your self-hosted agent. This may take a
few minutes to complete:
You've now completed setting up an Azure pipeline, which is using a VM-based self-
hosted pipeline agent for running the jobs.
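With the agent registered, a YAML pipeline can target the self-hosted pool by name. Here is a minimal sketch — the pool name MySelfHostedPool is a placeholder for the pool you created under Organization Settings > Agent Pools:

```yaml
# Minimal pipeline that runs its jobs on a self-hosted pool.
# "MySelfHostedPool" is a placeholder - substitute your own pool name.
trigger:
- master

pool:
  name: MySelfHostedPool

steps:
- script: echo Running on a self-hosted agent
```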
2. Create a new file named Dockerfile (no extension) and update it with the
following content. You can use Notepad to open the file:
FROM mcr.microsoft.com/windows/servercore:ltsc2019
WORKDIR /azp
COPY start.ps1 .
# Run the agent bootstrap script when the container starts
CMD powershell .\start.ps1
3. Create a new PowerShell file with the name start.ps1 and copy the content from here: https://github.com/PacktPublishing/Learning-Azure-DevOps---B16392/blob/master/Chapter-6/start.ps1.
4. Run the following command from the directory containing the Dockerfile to build the container image (note that the trailing . specifies the build context):
docker build -t dockeragent:latest .
Your container image is now ready to use. You can use dockeragent as the image name
to refer to this image. Optionally, you can save this image in your container repository.
Your container-based Azure pipeline agent is now ready to use. If you want a fresh container for each job, you can use the --once flag so that a container runs exactly one job, and use a container orchestrator such as Kubernetes to re-create the container as soon as it finishes executing its current job.
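As a sketch of this one-job-per-container pattern, assuming your start.ps1 forwards extra arguments to the agent's run command (check your copy of the script; the AZP_* variable names are also an assumption based on the script's conventions):

```shell
# Run the containerized agent so it processes exactly one job and exits.
# All values are placeholders; --once is forwarded to the agent's run command.
docker run \
  -e AZP_URL=https://dev.azure.com/<your-organization> \
  -e AZP_TOKEN=<personal-access-token> \
  -e AZP_AGENT_NAME=one-shot-agent \
  dockeragent:latest --once
```

An orchestrator such as Kubernetes can then restart the container after each job, giving you a clean agent per job.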
Important note
Refer to the Microsoft docs – https://docs.microsoft.com/en-us/virtualization/windowscontainers/about/ – for additional details about Windows containers.
In the next section, we'll take a look at setting up Linux-based containers as Azure
Pipelines agents.
docker run \
-e VSTS_ACCOUNT=<name> \
-e VSTS_TOKEN=<pat> \
-it mcr.microsoft.com/azure-pipelines/vsts-agent
Alternatively, you can also choose to build your own Docker image to use for your
Pipelines agents. The process is similar to building an image for Windows containers.
Please refer to the Microsoft docs here – https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/docker?view=azure-devops – for reference to the entry point script.
• The --image parameter is used to select the name of the Azure Pipelines image for creating your agent, as described here: https://hub.docker.com/_/microsoft-azure-pipelines-vsts-agent.
• The VSTS_POOL parameter is used to select the agent pool for your build agent.
Remember that you can start and stop an ACI instance by using the az container
stop and az container start commands, and this can help you save money.
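Putting the parameters described above together, an ACI-based agent could be created with a command along these lines (resource group, names, and pool are placeholders):

```shell
# Create an ACI instance running the Azure Pipelines agent image.
# All names and values below are placeholders for your own environment.
# Consider --secure-environment-variables for the PAT in real use.
az container create \
  --resource-group my-rg \
  --name azp-agent-1 \
  --image mcr.microsoft.com/azure-pipelines/vsts-agent \
  --environment-variables \
      VSTS_ACCOUNT=<organization-name> \
      VSTS_TOKEN=<personal-access-token> \
      VSTS_POOL=MySelfHostedPool

# Stop/start the instance on demand to save money:
az container stop --resource-group my-rg --name azp-agent-1
az container start --resource-group my-rg --name azp-agent-1
```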
Let's take a look at some of the additional environment variables you can use with Azure
pipeline agent-based containers.
Environment variables
Azure DevOps pipeline agents running on containers can be customized further by using
additional environment variables. The environment variables and their purposes are
described as follows:
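For reference, the environment variables commonly documented for the vsts-agent image are sketched below (confirm against the image's Docker Hub page, since the supported set can change):

```shell
# VSTS_ACCOUNT - Azure DevOps organization name (required)
# VSTS_TOKEN   - PAT with the Agent Pools (read, manage) scope (required)
# VSTS_AGENT   - display name to register the agent under (optional)
# VSTS_POOL    - agent pool to join; defaults to the Default pool (optional)
# VSTS_WORK    - working directory used for pipeline jobs (optional)
docker run \
  -e VSTS_ACCOUNT=<organization-name> \
  -e VSTS_TOKEN=<personal-access-token> \
  -e VSTS_AGENT=my-agent \
  -e VSTS_POOL=MySelfHostedPool \
  -e VSTS_WORK=/var/vsts \
  -it mcr.microsoft.com/azure-pipelines/vsts-agent
```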
In this section, we learned about using containers as your Azure pipeline agents for
executing your pipeline jobs.
• You need more compute power (CPU and memory) at certain times, and this requirement fluctuates based on workload.
• Microsoft-hosted agents are not able to meet your pipeline requirements.
• Your job runs for a long time or takes time to complete.
• You wish to use the same agent for various jobs consecutively to take advantage of
caching and so on.
• You don't want to run dedicated agents all the time as these incur costs.
• You want to regularly update the image of VMs running jobs.
Azure VM scale sets can automatically increase/decrease the number of pipeline agents
available based on the current demand of Azure Pipelines. This helps you save money,
as well as supports your scaling requirements.
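The portal walkthrough that follows creates the scale set interactively; as an alternative sketch, the Azure CLI can create a scale set with the settings Azure Pipelines expects (no load balancer, overprovisioning disabled, manual upgrade mode — all other values are placeholders):

```shell
# Create a VM scale set suitable for use as an Azure Pipelines agent pool.
# Resource group, name, image, SKU, and instance count are placeholders.
az vmss create \
  --resource-group my-rg \
  --name vmss-agent-pool \
  --image UbuntuLTS \
  --vm-sku Standard_D2_v3 \
  --instance-count 2 \
  --disable-overprovision \
  --upgrade-policy-mode manual \
  --load-balancer ""
```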
7. On the networking page, you can choose to connect the scale set to an existing
virtual network if your scale set needs to access any of your network resources
securely. Please ensure that Use a load balancer is set to No for the Azure Pipelines
scale set. Once configured, click Next:
10. On the Health and Advanced Setting page, optionally change any settings you want
to customize for your environment. Click Review and Create once you're ready to
start creating the scale set.
11. Once the validation is successful, click Create to start the deployment.
It may take a few minutes for the deployment to complete. Please wait while the deployment finishes. Once the VM scale set is ready, we'll set it up to be used as an Azure Pipelines agent.
1. Log in to your Azure DevOps organization and browse to Project Settings > Agent
Pools.
2. Click on Add Pool.
3. Fill in the values as defined here:
--Pool Type: Virtual machine scale set
--Project for Service Connections: Choose your Azure DevOps project
--Azure Subscription: Select the Azure subscription where you created the VM
scale set:
4. Click Authorize to enable access to your Azure subscription. You may be asked to
log in to your Azure account in this process.
5. Once authenticated, select an existing VM scale set and fill in the values as
described here:
--The name and description of your choice for this agent pool.
--Optionally, configure the scale set to delete the VM after each execution.
--The maximum number of VMs.
--The number of agents to keep as standby. While this can help you in completing
jobs quickly, it may increase your Azure cost.
6. Click Create once you have filled in all the values:
7. Your agent pool creation will now start. Please note that it may take around 15
minutes before your agent pool is ready to take up the jobs. You should see the
agents live in your agent pool upon completion:
Summary
In this chapter, we looked at using Microsoft-hosted agents and self-hosted agents to run
your Azure pipeline jobs. We dug deep into the process of setting up a self-hosted agent
and updated our pipelines to use the self-hosted agent.
We also looked at how you can use Docker containers, Azure container instances, and
Azure VM scale sets as your Azure pipeline agents. With this chapter, you should be able
to plan and implement the appropriate pipeline agent solution for your projects.
In the next chapter, we'll learn about Artifacts in Azure DevOps.
Section 3:
Artifacts and
Deployments
In this section, we are going to cover how to use artifacts with Azure DevOps and how to
deploy your applications using pipelines.
This section contains the following chapters:
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization. The
organization we'll be using in this chapter is the PartsUnlimited organization, which
we created in Chapter 1, Azure DevOps Overview. You also need to have Visual Studio
2019 installed, which can be downloaded from https://visualstudio.
microsoft.com/downloads/.
The source code for our sample application can be downloaded from https://github.com/PacktPublishing/Learning-Azure-DevOps---B16392/tree/master/Chapter%207.
It is highly recommended to use Azure Artifacts as the main source for publishing
internal packages and remote feeds. This is because it allows you to keep a comprehensive
overview of all the packages being used by the organization and different teams. The feed tracks the provenance of all packages saved via upstream sources, and packages remain available in the feed even if the original source goes down or the package is deleted upstream. Packages are versioned, and you typically reference a package by specifying the version of the package that you want to use in your application.
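For example, a consuming project pins the package version in its project file; the package name and version below are illustrative:

```xml
<!-- Reference a specific version of a package from the feed
     (name and version are placeholders). -->
<ItemGroup>
  <PackageReference Include="PartsUnlimited.Models" Version="1.0.0" />
</ItemGroup>
```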
Many packages allow for unrestricted access, without the need for users to sign in.
However, there are packages that require us to authenticate by using a username and
password combination or access token. Regarding the latter, access tokens can be set to
expire after a given time period.
In the next section, we are going to look at how to create an Artifact Feed in Azure
DevOps.
3. In the Create new feed dialog box, add the following values (make sure that
Upstream sources is disabled; we are not going to use packages from remote feeds
in this chapter):
With that, we have created a new feed so that we can store our packages. In the next
section, we are going to produce a package using a build pipeline.
1. Navigate to the PartsUnlimited project in Azure DevOps and go to Repos > Files.
2. Select Import repository from the PartsUnlimited dropdown:
5. Once the project has been imported, the repository will look as follows:
6. Reorder the tasks and drag the NuGet task so that it's after the Build Solution task. Delete the Test Assemblies task since we don't have any tests in this project:
The build pipeline will now run successfully. In the next section, we are going to publish
the PartsUnlimited.Models NuGet package that we created in the first demo to our feed.
1. Log in with your Microsoft account and from the left menu, select Artifacts.
2. Go to the settings of the feed by selecting the Settings button from the
top-right menu:
3. Then, click on Permissions from the top menu and click on + Add users/groups:
Now that the identity of the build pipeline has the required permissions on the feed, we can push the package to it while it's being built.
3. Click on the + button again next to Agent job 1 and search for NuGet. Add the task
to the pipeline.
4. Drag the newly added task below the NuGet task that we created in the previous
step. Make the following changes to the settings of the task:
--Display name: NuGet push
--Command: push
--Path to NuGet package(s) to publish: $(Build.ArtifactStagingDirectory)/**/*.nupkg;!$(Build.ArtifactStagingDirectory)/**/*.symbols.nupkg
--Target feed location: This organization/collection
--Target feed: PacktLearnDevOps
5. After making these changes, the task will look as follows:
1. Open Visual Studio 2019 and create a new .NET Core console application:
2. Once the application has been created, navigate to Azure DevOps and from the left
menu, select Artifacts.
3. From the top menu, select Connect to feed:
11. Now, we can reference the package in our code and use the model classes. Add
a using statement and create a new CarItem by replacing the code in the
Program.cs file with the following:
using System;
using PartsUnlimited.Models;

namespace AzureArtifacts
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
            // Create a new CarItem using the model class from the package
            var carItem = new CarItem();
        }
    }
}
In this demonstration, we consumed the package that is automatically built and released
from the feed. In the next and last section of this chapter, we are going to look at how to
scan a package for vulnerabilities using WhiteSource Bolt.
Important Note
For more information about WhiteSource Bolt, you can refer to the following website: https://bolt.whitesourcesoftware.com/.
In this section, we are going to install the extension in our Azure DevOps project and
implement the tasks that come with it into our existing build pipeline. Let's get started:
3. Install the extension in your DevOps organization by selecting the organization and
clicking the Install button:
5. On the Settings screen, specify a work email address, company name, and country.
Then, click the Get Started button:
8. Drag the task below the Build solution task since we want to scan the solution
before the package is packed and pushed into the Artifact feed. This will look
as follows:
11. Go to the top menu once more and select WhiteSource Bolt Build Support. There,
you will see an overview of the scan:
Summary
In this chapter, we looked at Azure Artifacts in more depth. First, we set up a feed and
created a new NuGet package using the model classes in the PartsUnlimited project.
Then, we created a build pipeline where we packed and pushed the package to the feed
automatically during the build process. Finally, we used the WhiteSource Bolt extension
from the Azure marketplace to scan the package for vulnerabilities.
In the next chapter, we are going to focus on how to deploy applications in Azure DevOps
using release pipelines.
Further reading
Check out the following links for more information about the topics that were covered in
this chapter:
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization. The
organization used in this chapter is the PartsUnlimited organization we created in
Chapter 1, Azure DevOps Overview.
You can define an Azure Pipelines environment in a YAML file, where you include an environment section that specifies the environment you'll deploy your artifact to, or by using the classic UI-based editor.
In the next section, we'll see how to define a release pipeline with the Azure DevOps UI
in detail.
1. To create a release pipeline with Azure DevOps, click on Pipelines on the left menu,
select Releases, and then click on New release pipeline:
3. Now, provide a name for the stage that will contain the release tasks. Here, I'm
calling it Deploy to cloud:
You have now defined the stage of your release pipeline (single-stage). In the next section,
we'll see how to specify the artifacts for your release pipeline.
1. To select artifacts, on the main release pipeline screen, click on Add an artifact:
2. In the Add an artifact panel, you have Source type automatically set to Build (this
means that you're deploying the output of a build pipeline). Here, you need to select
the build pipeline that you want to use as the source (the name or ID of the build
pipeline that publishes the artifact; here, I'm using the PartsUnlimitedE2E build
pipeline) and the default version (the default version will be deployed when new
releases are created. The version can be changed for manually created releases at the
time of release creation):
1. To create a release, on the release pipeline definition page, click on the Create
release button in the top-right corner:
2. On the Create a new release page, accept all the default values (you need to have a
successfully completed build with artifacts created), and then click on Create:
4. Now, you can click on the release name (here, it is Release-1) and you will be
redirected to the details of the release process:
Using variables is recommended if you have parameters that change on your pipeline. In
the next section, we'll see how to configure triggers for continuous deployment.
4. In the Pre-deployment conditions pane, check that the trigger for this stage is set
to After release (this means that the deployment stage will start automatically when
a new release is created from this pipeline):
1. In the previously defined pipeline, as a first step, I renamed the Deploy to cloud
stage to Production. This will be the final stage of the release pipeline.
2. Now, click on the Clone action to clone the defined stage into a new stage:
5. As you can see, the pipeline diagram has now changed (you have the QA and
Production stages executed in parallel). Now, select the Pre-deployment
conditions properties for the Production stage; set the trigger to After stage and
select QA as the stage:
You have now created a release pipeline with different stages (Dev, QA, and Production) for controlling the deployment steps of your code.
In the next section, we'll see how to add approvals for moving between stages.
Creating approvals
Let's follow these steps to create approvals:
1. To create an approval step, from our pipeline definition, select the Pre-deployment
conditions properties of the Production stage. Here, go to the Pre-deployment
approvals section and enable it. Then, in the Approvers section, select the
users that will be responsible for approving. Please also check that the The user
requesting a release or deployment should not approve it option is not ticked:
3. Now, create a new release to start our pipeline and click on the name of the created
release (here, it is called Release-2):
5. The release pipeline is waiting for approval. You can click on the Pending approval
icon and the approval dialog is opened. Here, you can insert a comment and then
approve or reject the release:
8. If you now click on the Azure App Service instance deployed by your pipeline, you
can see that the final code (the PartsUnlimited website) is deployed in the cloud:
As an example, here we want to configure a gate for our previously created release pipeline
where we check for open bugs on Azure Boards. We will see how to do this with the help
of the following steps:
Important note
If there are open bugs, the release pipeline cannot go ahead.
1. To check for open bugs in our project, we need to define a query for work items.
From our Azure DevOps project, select Boards, click on Queries, and then select
New query:
Here, you can also specify the delay before the evaluation of gates (the time before the added gates are evaluated for the first time; if no gates are added, the deployment will wait for the specified duration before proceeding), and you can specify the deployment gates (gates that evaluate health parameters). These gates are periodically evaluated in parallel; if the gates succeed, the deployment proceeds; otherwise, it is rejected.
5. To specify our gate, click on Add and then select Query work items (this will
execute a work item query and check the results):
You can define a deployment group in Azure DevOps by going to the Pipeline section and
selecting Deployment groups:
You can add a deployment group job by going to the release pipeline editor, selecting
the job, and clicking on the three-dots button. Here, you can see the Add a deployment
group job option:
Environments are a group of resources targeted by a pipeline – for example, Azure Web
Apps, virtual machines, or Kubernetes clusters. You can use environments to group
resources by scope – for example, you can create an environment called development
with your development resources and an environment called production with the
production resources. Environments can be created by going to the Environments section
under Pipelines:
stages:
- stage: Build_Source # Build source code for the .NET Core web app
  jobs:
  - job: Build
    pool: 'Hosted VS2017'
    variables:
      buildConfiguration: 'Release'
    continueOnError: false
    steps:
    - task: DotNetCoreCLI@2
      inputs:
        command: build
        arguments: '--configuration $(buildConfiguration)'
    - task: DotNetCoreCLI@2
      inputs:
        command: publish
        arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'
        modifyOutputPath: true
        zipAfterPublish: true
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: $(Build.ArtifactStagingDirectory)
        ArtifactName: drop
- stage: Deploy_In_Dev # Deploy artifacts to the dev environment
  jobs:
  - deployment: azure_web_app_dev
    pool: 'Hosted VS2017'
    variables:
      WebAppName: 'PartsUnlimited-dev'
    environment: 'dev-environment'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureRMWebAppDeployment@4
            displayName: Azure App Service Deploy
            inputs:
              WebAppKind: webApp
              ConnectedServiceName: 'pay-as-you-go'
              WebAppName: $(WebAppName)
              Package: $(System.WorkFolder)/**/*.zip
- stage: Deploy_In_QA # Deploy artifacts to the qa environment
  jobs:
  - deployment: azure_web_app_qa
    pool: 'Hosted VS2017'
    variables:
      WebAppName: 'PartsUnlimited-qa'
    environment: 'qa-environment'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureRMWebAppDeployment@4
            displayName: Azure App Service Deploy
            inputs:
              WebAppKind: webApp
              ConnectedServiceName: 'pay-as-you-go'
              WebAppName: $(WebAppName)
              Package: $(System.WorkFolder)/**/*.zip
- stage: Deploy_In_Production # Deploy artifacts to the production environment
  jobs:
  - deployment: azure_web_app_prod
    pool: 'Hosted VS2017'
    variables:
      WebAppName: 'PartsUnlimited'
    environment: 'prod-environment'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureRMWebAppDeployment@4
            displayName: Azure App Service Deploy
            inputs:
              WebAppKind: webApp
              ConnectedServiceName: 'pay-as-you-go'
              WebAppName: $(WebAppName)
              Package: $(System.WorkFolder)/**/*.zip
As you can see in the preceding YAML file, the pipeline defines four stages: Build Source,
Deploy in Dev, Deploy in QA, and Deploy in Production. At each of these stages, the
application is deployed on the specified environment.
Summary
In this chapter, we had a full overview of how to work with release pipelines in Azure
DevOps.
We created a basic release pipeline for the PartsUnlimited project, defined artifacts, and
created our first release by adding continuous deployment conditions.
Then, we improved our pipeline definition by using multiple stages (Dev, QA, and Production), and at the end of this chapter, we saw how to define approvals and gates for managing the release of our code in a more controlled way, as well as the concepts around YAML-based release pipelines.
In the next chapter, we'll see how to integrate Azure DevOps with GitHub.
Section 4:
Advanced Features
of Azure DevOps
In this part, we are going to integrate Azure DevOps with GitHub and we are going to
cover some real-world examples.
This section contains the following chapters:
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization and a GitHub account. You can sign up for a GitHub account here: https://github.com/join.
Let's get this chapter's prerequisites ready. This chapter requires that you have the Parts
Unlimited GitHub repository cloned to your GitHub account. You will also need an
Azure DevOps project to follow the examples in this chapter. Follow these steps before
moving on to the next section:
--Azure Pipelines: This extension allows you to build and release software
using Azure Pipelines while your code is stored and maintained in your GitHub
repository:
1. Log into your Azure DevOps account and select the project we created in the
Technical requirements section.
4. You will need to grant permission from Azure Pipelines to your GitHub account:
5. Upon successful completion, you will have your GitHub repositories listed in Azure
DevOps. Select the newly created PartsUnlimitedE2E repository:
8. Azure DevOps will automatically generate a pipeline YAML file. You can review and modify it based on your requirements. PartsUnlimited E2E is designed to run build operations with Visual Studio 2017 on a Windows Server 2016 image. Please change the vmImage name to vs2017-win2016 before continuing:
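After this change, the pool section of the generated azure-pipelines.yml should look roughly like the following sketch:

```yaml
# Build on the Visual Studio 2017 / Windows Server 2016 hosted image.
trigger:
- master

pool:
  vmImage: 'vs2017-win2016'
```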
12. While this completes, let's look at the changes you made to your GitHub repository.
Browse to your GitHub account and go to the PartsUnlimitedE2E repository.
13. You will see a commit and a newly added azure-pipelines.yml file, which
stores the pipeline's configuration:
1. Browse to your GitHub account and open the PartsUnlimited E2E repository.
2. Click on Readme.MD and click Edit:
5. This will open the Pull request page. It'll take a couple of minutes for the Azure
Pipelines job to start. Once started, you can click on Details to see the status of the
pipeline job:
6. This concludes testing out the continuous integration capabilities of GitHub and
Azure Pipelines. As we can see, Azure Pipelines and GitHub integrate very well
and provide a whole new DevOps experience. You can merge the pull request to
complete this process.
1. Log into Azure DevOps and browse to Your project > Pipelines > PartsUnlimited
E2E.
2. Click on the ellipses (…) and select Status badge:
3. Copy the Sample markdown text box's value. Optionally, you can choose to get
the markdown for a specific branch. Please save this markdown in a temporary
location:
6. Turn off the Disable anonymous access to badges setting. If you find this option
grayed out, you must turn this off in the organization settings first:
8. Upon committing your changes, you should see the Azure Pipelines badge:
• Work item and Git commit/issue/pull request linking means you can link your work
items to the corresponding work being done in GitHub.
• You can update your work item's status from GitHub itself.
• Overall, integration allows us to track and link the deliverable across the two
platforms easily.
1. Log into Azure DevOps and browse to your Parts Unlimited project > Project
settings > Boards > GitHub connections:
3. Azure DevOps will list your repositories. Please choose PartsUnlimited E2E for the
purpose of this project and click Save:
4. This will redirect you back to GitHub so that you can install the Azure Boards
application. You can choose to install it for specific repositories or for all your
repositories:
5. Upon installing Azure Boards, you should see your GitHub connection listed with a
green checkmark, meaning it has been successful:
1. Log into Azure DevOps, browse to Boards, and click on the settings gear icon:
2. On the Settings page, browse to the status badge and set the following settings:
a) Check the Allow anonymous users to access the status badge box.
b) You can choose to show only the 'In Progress' columns or include all columns.
Your screen should look as follows:
1. In Azure Boards, create a new work item. You can use the Azure board status badge
task we completed earlier as an example here:
3. Since this task has already been completed, we can link it to the respective GitHub
commit. Open the newly created task and click on Add link:
4. Click on the Link type drop-down and choose GitHub Commit. Provide your
GitHub commit URL and click OK. Note that you also have the options to link to a
GitHub issue or pull request:
5. You will now see the GitHub commit linked to the work item. Change its State to
Done:
6. By doing this, you can view your GitHub objects in Azure Boards, which can be
used to directly open the respective commit link in GitHub:
1. Go to Azure Boards > Boards > New item. Create a test work item with a name of
your choice:
5. Once you've completed the pull request, you will see that the commit message is
now hyperlinked to Azure Boards and that the status for this work item in Azure
Boards is updated to Done:
6. Along with the link objective, in this demonstration, you also updated the state of
the work item by using a simple instruction in the commit message. Let's take a look
at some of the sample messages you can use:
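Azure Boards recognizes mentions of the form AB#{work item ID} in commit messages and pull request titles; a few illustrative messages (the work item IDs are placeholders):

```text
# Link the commit to work item 123 without changing its state:
Update the readme AB#123

# Link and transition the work item to the completed state on merge:
Fixes AB#123 - corrected the typo in the readme

# Link several work items from a single commit:
Refactor build scripts AB#123 AB#124
```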
It is outside the scope of this book to talk about GitHub Actions in detail, but you can refer to the GitHub Actions documentation at https://github.com/features/actions to get started.
Summary
In this chapter, we looked at how to use GitHub and Azure DevOps together to build an
integrated software development platform for our software teams. To do this, we learned
how to set up and manage Azure DevOps pipelines from GitHub, as well as build and
integrate CI/CD solutions.
We also learned about how to plan and track our work better in Azure Boards while
doing software development in GitHub. You should now be able to use GitHub and Azure
DevOps together and improve your overall productivity and DevOps experience. You
should also be able to set up integration between the two services and use it in your daily
DevOps work.
In the next chapter, we'll look at several real-world CI/CD examples with the help of
Azure DevOps.
10
Using Test Plans
with Azure DevOps
In the previous chapter, we covered how you can integrate Azure DevOps with GitHub.
In this chapter, we are going to cover how to use test plans with Azure DevOps.
Comprehensive testing should be added to each software development project, because
it delivers quality and a great user experience for your applications. We will begin with
a brief introduction to Azure Test Plans. Then we will look at how you can manage test
plans, suites, and cases in Azure DevOps. We will run and analyze a test as well. After that,
we will cover exploratory testing and we will install the Test & Feedback extension.
The following topics will be covered in this chapter:
Technical requirements
To follow this chapter, you need to have an active Azure DevOps organization. The
organization used in this chapter is the Parts Unlimited organization that we created in
Chapter 1, Azure DevOps Overview. You also need to have Visual Studio 2019 installed,
which can be downloaded from https://visualstudio.microsoft.com/downloads/.
The test plan that is used to run and analyze a manual test plan can be downloaded from https://github.com/PacktPublishing/Learning-Azure-DevOps---B16392/tree/master/Chapter%2010.
Exploratory testing
With exploratory testing, testers are exploring the application to identify and document
potential bugs. It focuses on discovery and relies on the guidance of the individual tester
to discover defects that are not easily discovered using other types of tests. This type of
testing is often referred to as ad hoc testing.
Most quality testing techniques use a structured approach by creating test cases up front
(just like we did in our previous demo). Exploratory testing is the opposite of this and is
mostly used in scenarios where someone needs to learn about a product or application.
They can review the quality of the product from the user perspective and provide feedback
quickly. This will also make sure that you don't miss cases that can lead to critical quality
failures. The outcome of these ad hoc tests can later be converted into a test plan as well.
Microsoft has released a Test & Feedback extension for exploratory testing. This extension
can be installed on the browser and used by all the stakeholders that are involved in the
software development project, such as developers, product owners, managers, UX or UI
engineers, marketing teams, and early adopters. The extension can be used to submit bugs
or provide feedback to contribute to the quality of the software.
In the next demonstration, we are going to look at how we can install the Test & Feedback
extension.
Important note
For a detailed overview of what browsers and features are supported, you can refer to the following article: https://docs.microsoft.com/en-us/azure/devops/test/reference-qa?view=azure-devops#browser-support.
2. Select the Azure DevOps tab and search for Test & Feedback. Select the Test & Feedback extension from the list:
4. Next, you will be redirected to the supported browsers where this extension can be
installed. Click the Install button below the browser that you are currently using to
install the extension. You will be redirected to the extension page of your current
browser. There, you can install it.
5. Once the extension is installed, the icon will be added to the right of the address bar.
Select the Connections button:
6. You need to specify the Azure DevOps server URL there to connect to your Azure
DevOps instance. The URL begins with https://dev.azure.com/ and ends
with the project name. After providing the URL, click Next. After connecting to the
project, you can select the team. Select Parts.Unlimited Team:
Now that the extension is configured, we can start using it. You can use the extension for
exploratory testing or for providing feedback:
8. We are going to start an exploratory testing session. Click the Start button:
Important note
For more information about how to create feedback items in Azure DevOps,
refer to the following website: https://docs.microsoft.com/en-us/azure/devops/test/request-stakeholder-feedback?view=azure-devops. To respond to these feedback items using the Test & Feedback extension, visit https://docs.microsoft.com/en-us/azure/devops/test/provide-stakeholder-feedback?view=azure-devops#provide.
In this demonstration, we installed the Test & Feedback extension from the Visual Studio
Marketplace, which can be used for exploratory testing.
In the next section, we are going to look into planned manual testing.
Important note
The different agile processes that are supported and integrated in Azure
DevOps are covered in more detail in Chapter 2, Managing Projects with Azure
DevOps Boards.
Software development teams can begin manual testing right from the Kanban board in
Azure Boards. From the board, you can monitor the status of the tests directly on the
cards. This way, all team members get an overview of which tests are connected to which
work items and stories, and what the status of each test is.
In the following image, you can see the tests and statuses that are displayed on the board:
In the following image, you can see how the Test Hub can be accessed from the left menu,
together with the different menu options:
• Test plans: A test plan groups different test suites, configurations, and individual
test cases together. In general, every major milestone in a project should have its
own test plan.
• Test suites: A test suite can group different test cases into separate testing scenarios
within a single test plan. This makes it easier to see which scenarios are complete.
• Test cases: With test cases, you can validate individual parts of your code or app
deployments. They can be added to both test plans and test suites. They can also
be added to multiple test plans and suites if needed. This way, they can be reused
effectively without the need to copy them. A test case is designed to validate a work
item in Azure DevOps, such as a feature implementation or a bug fix.
In the next section, we are going to put this theory into practice and see how you can
create and manage test plans in Azure DevOps.
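To make the relationship concrete, the hierarchy described above can be sketched informally as follows. This is illustrative only, not a file format used by Azure DevOps; the suite name "Credit card scenario" is hypothetical, while the test case names are taken from the demos in this chapter.

```yaml
# Illustrative sketch of the Test Plans hierarchy (not an Azure DevOps file format)
testPlan: Parts.Unlimited_TestPlan1        # one plan per major milestone
suites:
  - name: Credit card scenario             # a test suite groups related cases
    testCases:
      - Verify that user is allowed to save his credit card detail
      - Verify that user is not allowed to save invalid credit card details
  - name: End-to-end tests                 # a case can be reused in several suites
    testCases:
      - Verify that user is allowed to save his credit card detail
```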
4. Then, right-click the first item in the list: Verify that user is allowed to save his
credit card detail. In the menu, select Edit test case:
6. In the Add link window, select Parent and then search for credit card. Select
the Credit Card Purchase feature to link the test case to:
In some cases, test cases should be run in a specific order. To do this, click Define in
the top menu and select the Verify that user is not allowed to save invalid credit card
details test case. Then, drag it above the first test case in the list:
Important note
Adding and managing test plan configurations is beyond the scope of this book.
However, if you want more information, you can refer to the following article:
https://docs.microsoft.com/en-us/azure/devops/test/test-different-configurations?view=azure-devops.
Next, we will create a new test suite. You can create three different types of test suites:
• Static: You manually assign the test cases to the suite.
• Requirement-based: You create the suite based on common requirements.
• Query-based: Test cases are automatically added based on the outcome of a query.
1. Let's add a new requirement-based test suite. For this, select the three dots next to
Parts.Unlimited_TestPlan1 > New Suite > Requirement based suite:
2. Here, you can create your own query for retrieving work items based on your
requirements. For this demonstration, we will use the default settings. Click Run
query and then select the three items that are related to shipping:
In this demonstration, we have managed some test cases and created a new
requirement-based test suite. In the next section, we are going to run and analyze a
manual test plan.
1. Open the test plan of the Parts.Unlimited project again in Azure DevOps.
2. First, we need to add a new static test suite. For this, select the three dots next to
Parts.Unlimited_TestPlan1 > New Suite > Static suite. Name the suite End-to-end tests.
3. Select the newly created suite and in the top menu, select the import button to
import test cases:
6. Click Save & Close in the top menu. Now that we have our test suite in place, we
can start testing. Click on the Execute tab in the top menu and click on the test
case, then Run, then Run with options:
4. If you discover bugs or issues, you can add a comment to the step directly, or create
a separate bug at the top:
5. To finish the test, click Save and close in the top menu of the test runner.
6. Now go back to Azure DevOps. In the left menu, click Test Plans > Runs.
7. In the list of recent runs, select the run that we just executed:
8. There, you can see all the details of the test and the outcome:
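The Runs view is not limited to manual runs: automated tests executed in a build pipeline surface there too. As a sketch (VSTest@2 and PublishTestResults@2 are real Azure Pipelines tasks, but the assembly and results patterns below are assumptions about a typical .NET build):

```yaml
steps:
- task: VSTest@2                      # run the unit tests built earlier in the pipeline
  inputs:
    testAssemblyVer2: |
      **\*Tests.dll
      !**\obj\**
- task: PublishTestResults@2          # results then appear under Test Plans > Runs
  inputs:
    testResultsFormat: 'VSTest'
    testResultsFiles: '**/*.trx'      # assumed output location of the .trx files
```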
Summary
In this chapter, we have covered Azure DevOps Test Plans. We looked at the different
features and capabilities, and managed test plans, test suites, and test cases. We then
imported a test case from a CSV file and tested the Parts Unlimited application. Finally,
we covered exploratory testing in detail and used the Test & Feedback extension to
report a bug.
In the next chapter, we are going to focus on real-world CI/CD scenarios with Azure
DevOps.
Further reading
Check out the following links for more information about the topics that were covered in
this chapter:
Technical requirements
To follow along with this chapter, you need to have an active Azure DevOps organization
and an Azure subscription.
1. Resource groups: The following resource groups will be created for hosting the
Azure resources for both environments:
a) Contoso-ToDo-Staging
b) Contoso-ToDo-Production
2. Application components: We'll be creating the following resources for both the
staging and production environments:
a) Azure App Service to host the web application
b) Azure SQL Database to host the SQL database
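As an alternative to clicking through the portal in the tasks that follow, both resources can be provisioned from a pipeline with the Azure CLI. The following is a sketch, not the book's exact setup: the service connection name, location, plan, and web app names are assumptions, while the SQL server and database names match the connection string used later in this chapter.

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: 'contoso-azure-connection'   # hypothetical service connection
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Resource group and App Service for the staging environment
      az group create --name Contoso-ToDo-Staging --location westeurope
      az appservice plan create --name todo-staging-plan \
        --resource-group Contoso-ToDo-Staging --sku S1
      az webapp create --name contoso-todo-staging --plan todo-staging-plan \
        --resource-group Contoso-ToDo-Staging
      # Logical SQL server and database (names match the connection string used later)
      az sql server create --name contosotodostagingdb \
        --resource-group Contoso-ToDo-Staging \
        --admin-user azadmin --admin-password "$SQL_PASSWORD"
      az sql db create --name ContoSoToDoStageDB --server contosotodostagingdb \
        --resource-group Contoso-ToDo-Staging
```

Repeat with the production names (Contoso-ToDo-Production) for the second environment.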
In this example, we'll be using Azure App Service to host the ToDo application:
1. In the Azure portal, click on + Create a resource and click on Web App:
In this task, we created an Azure app service for hosting the ToDo web application.
1. In the Azure portal, click on + Create a resource and select SQL Database:
In this task, we've created Azure SQL databases for our application.
3. Select your Azure repo and master branch, then click Continue to move to the next
step:
5. Review the pipeline configuration. For the purpose of this project, the default
configuration does the job. Once it's reviewed, click on Save & queue:
7. Once the job is in progress, you can review the status by clicking on the job name:
8. Now, let's enable CI on the pipeline to auto-start the build on commit to the master
branch. Edit the pipeline and browse to Triggers, and enable CI. You can choose to
filter by branch or change to a different branch if you are not using master as your
primary branch:
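For reference, the same continuous integration trigger is expressed declaratively in a YAML pipeline; a minimal sketch, assuming master is the primary branch:

```yaml
trigger:
  branches:
    include:
      - master        # build on every commit to master; add filters as needed
```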
5. Azure DevOps will now require you to authenticate to Azure. Please log in with an
account with at least subscription owner rights and global admin rights in the Azure
Active Directory (AD) tenant. You can choose to allow the service connection
scope to be limited to a resource group or allow the entire subscription. Select your
Azure subscription and give it a name:
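Once created, a service connection is referenced by name from pipeline tasks. As a sketch (the connection and app names here are hypothetical):

```yaml
- task: AzureWebApp@1
  inputs:
    azureSubscription: 'contoso-azure-connection'   # the service connection created above
    appName: 'contoso-todo-staging'                 # hypothetical App Service name
    package: '$(Pipeline.Workspace)/**/*.zip'
```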
4. This will open a page to select a template. Since we're planning to deploy our ToDo
app to App Service, select Azure App Service deployment:
6. You can now close the Stage blade. Your pipeline should look as follows:
8. Select Build as Source type and select the build pipeline created in the previous
task. You can choose to configure which version is to be deployed by default:
11. Inside the tasks view, select your Azure subscription service connection and the app
service that you deployed earlier:
d) Inline SQL Script: Provide the following script code. This will create the
required tables in the SQL database. Please note that this is a sample SQL script
to create the required schema (also available at https://github.com/PacktPublishing/Learning-Azure-DevOps---B16392/tree/master/Chapter11);
in a production environment, you may choose to do this using the SQL Server
Data Tools project in Azure Pipelines. Please refer to this documentation to learn
more about DevOps for Azure SQL: https://devblogs.microsoft.com/azure-sql/devops-for-azure-sql/:
/****** Object: Table [dbo].[__MigrationHistory] Script Date: 8/24/2020 12:35:05 PM ******/
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
IF NOT EXISTS
( SELECT [name]
  FROM sys.tables
  WHERE [name] = '__MigrationHistory'
)
BEGIN
CREATE TABLE [dbo].[__MigrationHistory](
    [MigrationId] [nvarchar](150) NOT NULL,
    [ContextKey] [nvarchar](300) NOT NULL,
    [Model] [varbinary](max) NOT NULL,
    [ProductVersion] [nvarchar](32) NOT NULL,
    CONSTRAINT [PK_dbo.__MigrationHistory]
    PRIMARY KEY CLUSTERED
    (
        [MigrationId] ASC,
        [ContextKey] ASC
    ) WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
END

/****** Object: Table [dbo].[Todoes] Script Date: 8/24/2020 12:35:05 PM ******/
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
IF NOT EXISTS
( SELECT [name]
  FROM sys.tables
  WHERE [name] = 'Todoes'
)
BEGIN
CREATE TABLE [dbo].[Todoes](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [Description] [nvarchar](max) NULL,
    [CreatedDate] [datetime] NOT NULL,
    CONSTRAINT [PK_dbo.Todoes] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
END
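In a YAML pipeline, the equivalent of this inline-script step would be the Azure SQL deployment task. A hedged sketch follows — the server and database names come from this chapter, the service connection name is hypothetical, the password should come from a secret pipeline variable, and input names should be verified against the current task reference:

```yaml
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'contoso-azure-connection'        # hypothetical
    ServerName: 'contosotodostagingdb.database.windows.net'
    DatabaseName: 'ContoSoToDoStageDB'
    SqlUsername: 'azadmin'
    SqlPassword: '$(SqlPassword)'                        # secret pipeline variable
    deployType: 'InlineSqlTask'
    InlineSql: |
      -- the schema script shown above goes here
```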
15. Click Save and then + to add another task. We now need a task that updates the
database connection string in the connection settings of Azure App Service.
16. Search for Azure App Service Settings in the task's menu:
[
  {
    "name": "MyDbConnection",
    "value": "Server=tcp:contosotodostagingdb.database.windows.net,1433;Initial Catalog=ContoSoToDoStageDB;Persist Security Info=False;User ID=azadmin;Password=<YourPassword>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;",
    "type": "SQLAzure",
    "slotSetting": false
  }
]
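In YAML, step 16's task would look roughly as follows; the connection and app names are hypothetical, and the JSON above is passed through the connectionStrings input:

```yaml
- task: AzureAppServiceSettings@1
  inputs:
    azureSubscription: 'contoso-azure-connection'   # hypothetical
    appName: 'contoso-todo-staging'                 # hypothetical
    connectionStrings: |
      [
        {
          "name": "MyDbConnection",
          "value": "<the connection string shown above>",
          "type": "SQLAzure",
          "slotSetting": false
        }
      ]
```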
19. Once all the tasks are updated, click on Save. You can save the pipeline in the root
folder upon prompt. This should be the order of the tasks:
a) Apply Database migration script
b) Apply Azure App Service Settings
c) Deploy Azure App Service:
21. Typically, you wouldn't want to auto-deploy to production. Let's modify the flow to
include a manual approval for production deployment. Click on Pre-Deployment
Conditions:
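The classic-editor approval above is configured in the UI. For comparison, YAML pipelines achieve the same gate through approval checks attached to an environment; a sketch (the stage, job, and environment names are assumptions, and the approval itself is still configured on the environment in the Azure DevOps UI):

```yaml
- stage: Production
  dependsOn: Staging
  jobs:
  - deployment: DeployWeb
    environment: 'production'      # approvals/checks on this environment gate the stage
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deployment steps go here"
```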
The Azure release pipeline to deploy the application is now ready. Let's create a release and
see whether we can get our application up and running through CI/CD pipelines.
Creating a release
Let's test the release pipeline by creating a release manually:
Once the development environment deployment has completed, you should try to launch
the app service and see whether the ToDo application is working well for you:
1. In Azure Repos, modify the view for the home page. Go to Repos |
DotNetAppSQLDB | Views | Todos | index.cshtml and modify the label from
Create new to Create New ToDo Item:
2. Commit the change in a new branch and follow through the pull request. You
should approve and complete the pull request.
This should start an automated build pipeline execution followed through
automated release execution.
In the end, you should have your application updated with the change without
having to do any manual steps except the approval task configured for production.
Congratulations, you've now completed the setup and testing of an end-to-end CI/
CD pipeline! In the next section, we'll set up a similar pipeline for a Kubernetes-based
application.
5. If you need to select a specific subscription for provisioning resources, run the
following command:
az account set --subscription 'Your Subscription Name'
7. Repeat the resource group creation command to create another resource group
named Contoso-Voting-Prod for the production environment.
You have now completed the required resource groups. In the next step, you'll create an
Azure Kubernetes cluster.
4. Once your cluster is ready, you can get the Kubernetes authentication configuration
in your Cloud Shell session by running the following command:
az aks get-credentials --resource-group Contoso-Voting-Stage --name Contoso-Stage-AKS
5. You can try running kubectl commands now to interact with Kubernetes. Run
the following command to get a list of all the Kubernetes nodes:
kubectl get nodes
Your Azure Kubernetes cluster is now ready; please repeat the process to create another
AKS cluster for the production environment.
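Cluster creation for the production environment can be scripted rather than repeated by hand; a sketch (the node count, location, and service connection name are assumptions — the resource group and cluster names follow this chapter's convention):

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: 'contoso-azure-connection'   # hypothetical
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az aks create --resource-group Contoso-Voting-Prod \
        --name Contoso-Prod-AKS --node-count 1 --generate-ssh-keys
```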
1. Log in to Azure Cloud Shell and run the following command to create a container
registry:
az acr create --resource-group Contoso-Voting-Stage --name ContosoStageACR --sku Basic
2. Once your container registry is ready, you can get the status and details of it by
running the following command:
az acr list
Now that our infrastructure is ready, we'll begin with setting up the code repository for
the application.
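Before AKS can pull images from the registry, the cluster needs pull permission on ACR. One hedged way to grant it, reusing the names from this chapter (run once per cluster; the service connection name is hypothetical):

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: 'contoso-azure-connection'   # hypothetical
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Grants the cluster's identity pull rights on the registry
      az aks update --resource-group Contoso-Voting-Stage \
        --name Contoso-Stage-AKS --attach-acr ContosoStageACR
```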
1. Log in to Azure DevOps and create a new project named Contoso Voting App
or any other name of your choice.
2. Navigate to Azure Repos and click Import a Git repository. Please import the
Azure voting app repository from https://github.com/Azure-Samples/azure-voting-app-redis:
10. Upon completion, navigate to the Azure portal and open the container registry you
created earlier.
11. Navigate to Repositories; you should see a new image being created there. Let's look
at the image and find out the image name to update in our application deployment
configuration:
4. Replace the value with your own container registry and image name. It should look
like the following. You should specify the latest tag to ensure that the newest image
is always used:
image: contosostageacr.azurecr.io/contosovotingapp:latest
5. In Artifact, select the Azure repo and choose the repository we imported. Click
Add:
6. In the Tasks section, let's configure a task to perform the application deployment.
Configure the kubectl task as follows:
a) Display Name: Deploy to Kubernetes.
b) Kubernetes Service Connection: Create a new server connection and connect
to your AKS cluster created earlier:
c) Command: Apply.
d) Click on Choose configuration file to provide a path to your deployment
YAML file (azure-vote-all-in-one-redis.yaml). Browse to your default
directory and select the deployment YAML file. We can define additional options,
such as Kubernetes secrets and config maps, if required. Click Save after verifying
that all the configurations are valid:
e) Review the task configurations and click Save to save the progress so far:
8. Configure the task so that it uses the same Kubernetes connection. Under
Command, keep set as the command and use image deployments/azure-vote-front azure-vote-front=youracrname.azurecr.io/contosovotingapp:latest
as the argument. In a production deployment, you may not want to use the latest
tag in your pipeline; instead, refer to the version tag generated by the build pipeline.
This helps you manage your deployments with specific versions and roll back easily
if you wish to.
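In YAML form, the same step with a build-specific tag instead of latest might look like this (task inputs abbreviated; the Kubernetes service connection name is hypothetical):

```yaml
- task: Kubernetes@1
  inputs:
    connectionType: 'Kubernetes Service Connection'
    kubernetesServiceEndpoint: 'contoso-stage-aks'   # hypothetical connection name
    command: 'set'
    arguments: >-
      image deployments/azure-vote-front
      azure-vote-front=contosostageacr.azurecr.io/contosovotingapp:$(Build.BuildId)
```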
9. Once you're ready, save the pipeline and create a release to test the deployment
pipeline.
10. Review the release logs to understand the deployment steps and flow.
11. Once it's completed successfully, go back to editing the pipeline again and enable
continuous deployment:
Summary
In this chapter, we looked at a .NET and SQL-based application and set up a CI/CD
pipeline for it using Azure DevOps. We looked at how you manage your production and
staging environments through approval workflows.
Similarly, we also looked at a container-based application and did a walkthrough of setting
up an end-to-end CI/CD pipeline for the application using ACR and AKS.
In the end, we talked about Azure Architecture Center, which can be referred to while
planning your DevOps architecture.
This was the final chapter, and we hope you have enjoyed reading this book!
Other Books You
May Enjoy
If you enjoyed this book, you may be interested in these other books by Packt: