DevOps Lab Manual
BCSL657D
Prepared By
Prof M Rajesh
Prof Chandrashekar G
Prof Lakshmi M R
Appendix
Before starting any experiment, ensure you have the following installed and configured on your
Ubuntu system:
2. Java (JDK 11 or later):
Verify installation:
java -version
3. Maven:
Verify installation:
mvn -version
4. Gradle:
Verify installation:
gradle -v
5. Jenkins:
Follow these steps to install Jenkins:
wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo apt-key add -
sudo sh -c 'echo deb https://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt update
sudo apt install jenkins
sudo systemctl start jenkins
sudo systemctl status jenkins
6. Ansible:
Verify installation:
ansible --version
1. Understanding DevOps
What is DevOps?
DevOps is a set of cultural philosophies, practices, and tools that combine software
development (Dev) and IT operations (Ops). Its goal is to shorten the systems development life
cycle while delivering features, fixes, and updates frequently, in close alignment with business
objectives. DevOps promotes collaboration between development and operations teams, automation of
the delivery pipeline, and continuous feedback.
What is Maven?
Maven is a build automation and project management tool primarily used for Java projects. It
uses a central configuration file known as the POM (Project Object Model), written in XML,
which defines the project structure, its dependencies, build order, and plugins.
What is Gradle?
Gradle is a modern build automation tool that is known for its flexibility and performance. It
uses a Groovy or Kotlin DSL (Domain Specific Language) to define build logic, which
allows for more dynamic and customizable configurations compared to Maven’s XML-based
approach.
• Flexible Build Scripts: Instead of a rigid XML file, Gradle build scripts written in
Groovy or Kotlin allow you to include logic, conditionals, and loops.
• Incremental Builds: Gradle tracks changes in source files and only rebuilds what is
necessary, which can significantly speed up the build process.
• Multi-project Builds: Easily manages complex projects with multiple modules or
subprojects.
• Extensibility: A robust plugin system that enables you to integrate various languages,
frameworks, and tools.
• Parallel Execution: Can run tasks in parallel, optimizing build times for large projects.
Both Maven and Gradle are used to automate the build process and manage project
dependencies. Here’s why they are so popular in the DevOps ecosystem:
• Automated Build and Testing: They allow developers to compile code, run tests, and
package applications without manual intervention.
• Consistent Build Environment: By enforcing standardized project structures and
dependency management, they help avoid the “it works on my machine” problem.
• Ease of Integration: Both tools integrate well with Continuous Integration (CI) servers
like Jenkins, enabling automated pipelines.
• Dependency Resolution: They simplify the process of managing external libraries and
ensure that all developers are using the same versions.
A. Installing Maven
Install Maven from the Ubuntu repositories:
sudo apt update
sudo apt install maven
Verify installation:
mvn -version
B. Installing Gradle
Gradle can be installed in two primary ways: via the Ubuntu repositories (which may not be
the latest version) or by manually installing the latest version.
1. Download Gradle:
wget https://services.gradle.org/distributions/gradle-8.0-bin.zip
2. Extract the Archive:
sudo unzip -d /opt/gradle gradle-8.0-bin.zip
3. Set Up the Environment Variables: Add Gradle to your system PATH by appending
the following line to your ~/.bashrc (or ~/.profile):
export PATH=$PATH:/opt/gradle/gradle-8.0/bin
Reload your shell configuration (source ~/.bashrc) and verify the installation:
gradle -v
Gradle 8.0
Comparison of Maven and Gradle:
• Configuration Style: Maven is declarative and rigid, following strict conventions (convention over configuration); Gradle is flexible and dynamic, allowing you to write custom logic and conditions within the build script.
• Build Lifecycle: Maven provides a fixed lifecycle (e.g., validate, compile, test, package, install, deploy); Gradle uses a task-based approach where tasks can be defined, customized, and linked in a flexible manner.
• Extensibility & Plugins: Maven has a rich ecosystem of plugins, but customization can be more challenging due to XML's verbosity and limited scripting; Gradle is highly extensible through its scripting capabilities, and writing custom tasks or plugins in Groovy/Kotlin is more straightforward.
• Learning Curve: Maven is easier for beginners due to its structured, convention-based approach, but can become complex with large configurations; Gradle is more flexible but may have a steeper initial learning curve if you need to leverage its dynamic features and custom logic.
• Community & Maturity: Maven has been around longer, so many legacy projects and extensive documentation exist; Gradle is relatively newer and has gained popularity due to its modern features, especially in Android and multi-language projects.
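The configuration-style difference is easiest to see side by side. Below, the same JUnit test dependency used elsewhere in this manual is declared in each tool:

```xml
<!-- Maven: pom.xml -->
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.13.2</version>
  <scope>test</scope>
</dependency>
```

```groovy
// Gradle: build.gradle
dependencies {
    testImplementation 'junit:junit:4.13.2'
}
```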
1. Introduction to Maven
Maven is a powerful build automation and project management tool primarily used for Java
projects. It simplifies the build process by standardizing the project structure, managing
dependencies automatically, and providing a consistent build lifecycle (compile, test, package,
install, deploy).
Step-by-Step Process
Make sure you have Maven installed (refer to Experiment 1). Open your terminal on your
Ubuntu system.
Maven comes with a set of archetypes that provide you with a standard project template. Use
the following command to create a new Maven project:
mvn archetype:generate -DgroupId=com.example -DartifactId=MyMavenApp -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
Once the command completes successfully, change your directory to the newly created project:
cd MyMavenApp
After generating the project, you will notice the following standard Maven directory structure:
MyMavenApp/
├── pom.xml
└── src
├── main
│ └── java
│ └── com
│ └── example
│ └── App.java
└── test
└── java
└── com
└── example
└── AppTest.java
• pom.xml:
The Project Object Model (POM) file is the core of any Maven project. It contains
configuration details such as project coordinates (groupId, artifactId, version),
dependencies, plugins, and build settings.
• src/main/java:
This directory holds the source code of your application. In our example, the package
structure com.example is created, and you have an App.java file.
• src/test/java:
This directory is for your test cases. The default example includes a basic test class,
AppTest.java.
The pom.xml file is written in XML and is essential to how Maven operates. It includes several
key sections:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>MyMavenApp</artifactId>
  <version>1.0-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.13.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
• Project Coordinates:
o <groupId>: Acts like a namespace, usually following the reverse domain name
convention.
o <artifactId>: The name of the project.
o <version>: The version of the project artifact (e.g., 1.0-SNAPSHOT).
o <properties>: Used to define values that can be referenced elsewhere in the POM (e.g., Java
source and target versions).
• Dependencies:
o Dependency Management: Maven downloads and manages external libraries.
In the example, JUnit is added for testing.
o Scope: Determines when a dependency is used (e.g., compile, test, runtime).
• Build and Plugins:
o Maven Compiler Plugin: Ensures that your Java code is compiled with the
specified Java version.
o Maven Surefire Plugin: Executes unit tests during the build process.
Example Dependency
To add a dependency for JUnit, include the following snippet in your <dependencies> section:
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13.2</version>
<scope>test</scope>
</dependency>
• groupId, artifactId, version: These three elements uniquely identify the dependency.
• scope: The test scope ensures that this dependency is only available during the test
phase and not included in the final artifact.
Plugins are key to Maven’s flexibility, adding tasks to your build process.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <source>11</source>
    <target>11</target>
  </configuration>
</plugin>
• Configuration: Specifies that the project should be compiled using Java 11.
The Maven Surefire Plugin runs your unit tests during the test phase:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.2</version>
</plugin>
Once your project is set up and your pom.xml is defined, you can use Maven commands to
build and test your application.
mvn compile
This command compiles the application source code.
mvn test
This command runs the unit tests.
mvn package
This command compiles, tests, and packages your code into a JAR file located in the
target directory. Screenshot Tip: Capture the listing of the target directory showing
the JAR file.
mvn clean
This command removes the target directory, deleting all previous build artifacts.
1. Introduction to Gradle
Gradle is a modern build automation tool designed to be highly flexible, fast, and scalable. It
is widely used in Java projects, Android development, and many multi-language projects.
Here’s what makes Gradle stand out:
• Flexible Build Scripts: Gradle uses a Domain Specific Language (DSL) based on
either Groovy (by default) or Kotlin. This provides a more dynamic and expressive way
to define build logic compared to static XML configurations (as used in Maven).
• Incremental Builds: Gradle optimizes build times by determining what parts of the
project have changed and rebuilding only those parts.
• Task Automation: Everything in Gradle is treated as a task, allowing you to create
custom tasks or reuse existing ones for compiling code, running tests, packaging, and
more.
• Dependency Management: Like Maven, Gradle can automatically download and
manage dependencies from remote repositories (e.g., Maven Central, JCenter, or
custom repositories).
Step-by-Step Instructions
Before starting, verify that Gradle is installed on your Ubuntu system. Open a terminal and run:
gradle -v
If you see version information, you’re set. If not, follow the Gradle installation instructions
from Experiment 1.
Gradle offers an interactive command to generate a new project using an initialization script.
To create a basic Java application project:
mkdir HelloGradle && cd HelloGradle
gradle init --type java-application
You will be prompted to choose between a few options (unless you use non-interactive mode).
Choose the defaults or specify details as needed.
After project creation, the directory structure typically looks like this:
HelloGradle/
├── build.gradle      // The primary build script (Groovy DSL by default)
├── gradle/           // Contains Gradle wrapper files (if generated)
├── gradlew           // Unix shell script to run the Gradle wrapper
├── gradlew.bat       // Windows batch script for the Gradle wrapper
├── settings.gradle   // Contains project settings and names
└── src
    ├── main
    │   └── java
    │       └── App.java      // Your main application source file
    └── test
        └── java
            └── AppTest.java  // Your test cases
Explanation of Components:
• build.gradle:
This is the main build script written in Groovy (or Kotlin if you choose). It defines
plugins, repositories, dependencies, and tasks.
• settings.gradle:
A small script that defines the project’s name and, in multi-project builds, the included
subprojects.
• gradlew / gradlew.bat:
The Gradle wrapper scripts. They allow you to run Gradle without requiring a separate
installation on every machine by automatically downloading the correct Gradle version.
• src/main/java:
Contains your application’s source code.
• src/test/java:
Contains your unit tests.
A build script in Gradle is a programmatic file that instructs Gradle on how to build your
project. It can be written in two main DSLs: Groovy (build.gradle) or Kotlin (build.gradle.kts).
Below is an example of a basic build.gradle for a Java application using the Groovy DSL:
plugins {
    // Apply the Java plugin for compiling Java code
    id 'java'
    // Apply the application plugin to add support for building an application
    id 'application'
}

group = 'com.example'
version = '1.0'

repositories {
    // Use Maven Central for resolving dependencies.
    mavenCentral()
}

dependencies {
    // Define your dependencies. For example, JUnit for testing:
    testImplementation 'junit:junit:4.13.2'
}

application {
    // Define the main class for the application.
    mainClass = 'com.example.App'
}
Explanation:
• plugins block: Declares the plugins used. The java plugin adds Java compilation and
testing tasks, while the application plugin adds tasks for running the application.
• group/version: Sets project coordinates.
• repositories block: Configures the repository (Maven Central) where dependencies
will be resolved.
• dependencies block: Lists the libraries your project depends on.
• application block: Specifies the main class of your application.
• Custom tasks: You can also define your own tasks in this script; an example hello task is shown later in this experiment.
Here’s how the same build configuration might look in Kotlin DSL:
Create a file named build.gradle.kts (or convert your file) with the following content:
plugins {
    // Apply the Java plugin for compiling Java code
    java
    // Apply the application plugin to add support for building an application
    application
}

group = "com.example"
version = "1.0"

repositories {
    mavenCentral()
}

dependencies {
    // Define dependencies using Kotlin DSL syntax
    testImplementation("junit:junit:4.13.2")
}

application {
    // Set the main class for the application
    mainClass.set("com.example.App")
}
Explanation:
• Kotlin DSL Syntax: Uses a statically typed syntax, which can be more intuitive for
developers familiar with Kotlin.
• Plugins, Repositories, and Dependencies: Defined similarly to the Groovy DSL but
with Kotlin-style function calls and property access.
• Custom Task Registration: Uses tasks.register to define a task, similar in
functionality to the Groovy DSL.
How It Works:
When you declare a dependency, Gradle resolves it against the configured repositories, downloads
the artifact (along with its transitive dependencies), and caches it locally for reuse.
Example:
dependencies {
testImplementation 'junit:junit:4.13.2'
}
This line tells Gradle to download JUnit version 4.13.2 from Maven Central and include it in
the test classpath.
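Beyond testImplementation, Gradle offers other dependency configurations. A brief sketch (the Guava coordinates below are illustrative and not part of this lab's project):

```groovy
dependencies {
    // On the compile and runtime classpath of the application itself
    implementation 'com.google.guava:guava:31.1-jre'
    // Only on the test classpath; excluded from the final artifact
    testImplementation 'junit:junit:4.13.2'
}
```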
In Gradle, nearly everything is a task. Tasks represent individual units of work (compiling
code, running tests, packaging applications). Gradle comes with many built-in tasks (provided
by plugins) and allows you to define your own custom tasks.
Built-in tasks are provided by plugins (for example, compileJava, test, and jar come from the
java plugin). You can also define your own. Here is a custom task in the Groovy DSL:
task hello {
    doLast {
        println 'Hello, Gradle!'
    }
}
• Command:
gradle hello
• Expected Output:
You should see:
Hello, Gradle!
The Kotlin DSL equivalent:
tasks.register("hello") {
    doLast {
        println("Hello, Gradle with Kotlin DSL!")
    }
}
• Command:
./gradlew hello
• Expected Output:
You should see the greeting printed from the custom task.
Task Dependencies:
You can make one task depend on another. For instance, you might want your custom task to
run after the build task:
Groovy DSL:
task greet {
    dependsOn 'build'
    doLast {
        println 'Build is complete! Time to celebrate!'
    }
}
Kotlin DSL:
tasks.register("greet") {
    dependsOn("build")
    doLast {
        println("Build is complete! Time to celebrate!")
    }
}
Screenshot Tip: Capture your terminal output after running these custom tasks.
• Command:
gradle build
• What it does:
o Compiles source code (compileJava).
o Runs tests (test).
o Packages the application into a JAR file (jar).
• Expected Output:
Look for a "BUILD SUCCESSFUL" message.
• Screenshot Tip: Capture the full output of the gradle build command
Command:
gradle run
What it does:
Runs your main class as specified in the application block.
Expected Output:
Any output from your application (for example, if your App.java prints a message).
Command:
gradle hello
Expected Output:
The custom greeting message you defined earlier.
2. mvn archetype:generate -DgroupId=com.example -DartifactId=HelloMaven -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
o What it does:
This command creates a new Maven project with the group ID com.example
and the artifact ID HelloMaven. The archetype sets up a basic Java application,
including a sample test.
o Expected Output:
Maven will display messages as it downloads dependencies and generates the
project files.
o Screenshot Tip:
Capture the terminal output showing the successful project generation.
3. Change Directory into the Newly Created Project:
cd HelloMaven
o Screenshot Tip:
Capture the output of the pwd or ls command to show the project structure.
HelloMaven/
├── pom.xml
└── src
├── main
│ └── java
│ └── com
│ └── example
│ └── App.java
└── test
└── java
└── com
└── example
└── AppTest.java
• pom.xml: The Maven configuration file (POM) that defines your project’s coordinates,
dependencies, plugins, and build settings.
Build the project with mvn clean package.
o Expected Output:
You should see a “BUILD SUCCESS” message along with information on the
created JAR file.
o What it does:
This command runs the com.example.App class from the JAR file generated
in the previous step.
o Expected Output:
The output should display:
Hello World!
o Screenshot Tip:
Capture the terminal output showing “Hello World!” as proof that the
application ran successfully.
In this part, you will create a Gradle project that contains the same Java application code, then
build and run it using Gradle.
HelloMavenGradle/
├── build.gradle
├── settings.gradle
└── src
├── main
│ └── java
│ └── App.java
└── test
└── java
└── AppTest.java
application {
    // Update the mainClass to reflect the package structure
    mainClass = 'com.example.App'
}
4. gradle run
o What it does:
Uses the application plugin to run the main class defined in the
build.gradle file.
o Expected Output:
The output should display:
Hello World!
1. What Is Jenkins?
Jenkins is an open-source automation server used to build, test, and deploy software, enabling
Continuous Integration and Continuous Delivery (CI/CD).
Key Functionalities
Below are detailed step-by-step instructions for installing Jenkins on an Ubuntu machine.
Jenkins requires Java to run. It is recommended to use Java 11 or later. Install OpenJDK 11:
Download and add the repository key so that your system trusts the Jenkins packages:
• What it does:
o start jenkins launches the Jenkins service.
o status jenkins confirms Jenkins is running.
• Expected Output: A status message indicating Jenkins is active (running).
2. Open Jenkins in your browser by navigating to http://localhost:8080
3. Unlock Jenkins:
The initial Jenkins screen will ask for an administrator password. Retrieve it by
running:
4. sudo cat /var/lib/jenkins/secrets/initialAdminPassword
o What it does: Displays the auto-generated admin password.
o Screenshot Tip: Capture the terminal output showing the password and the
Jenkins unlock screen.
5. Follow the Setup Wizard:
o Install Suggested Plugins: Click on “Install suggested plugins” for a typical
setup.
o Create an Admin User: Follow prompts to create your first admin user.
o Finalize Configuration: Complete the remaining setup steps (e.g., instance
configuration).
o Screenshot Tip: Capture each major step (unlocking Jenkins, plugin installation,
and admin user creation).
There are several ways to run Jenkins on the cloud. One common method is to run Jenkins
using a Docker container on a cloud virtual machine. Below are instructions using Docker on
an Ubuntu cloud server (for example, on AWS EC2, DigitalOcean, or any cloud provider that
supports Ubuntu).
After installing Jenkins (locally or on the cloud), complete these steps to configure it for first
use:
• Instance Configuration:
Jenkins may ask you to confirm the URL for your Jenkins instance. Verify that it is
correct (e.g., http://localhost:8080 for local installations or your cloud server’s IP
address for cloud installations).
• Save the Configuration:
Confirm the settings and proceed.
• Dashboard:
Once the setup is complete, you will be taken to the Jenkins dashboard where you can:
o Create new jobs (Freestyle projects, Pipelines, etc.)
o Install additional plugins as needed.
1. Overview
What is a CI Pipeline?
A Continuous Integration (CI) pipeline automates the process of building, testing, and
integrating code changes every time code is committed to the repository. Why use Jenkins for this?
• Automation: Jenkins automates the build and test cycle, reducing manual intervention.
• Immediate Feedback: Developers get rapid notifications of any integration issues.
• Extensibility: With hundreds of plugins available, Jenkins can integrate with version
control systems, build tools (Maven, Gradle), testing frameworks, and more.
• Pipeline as Code: Using Jenkins Pipelines (defined in a Jenkinsfile), you can
manage the CI process as part of your source code repository.
This section explains how to create a CI pipeline as a Freestyle project that integrates with a
Maven or Gradle project.
1. Select SCM:
o In the job configuration page, scroll down to the “Source Code Management”
section.
o Select “Git” (if using Git for version control).
2. Enter Repository Details:
o Repository URL: Enter the URL of your Git repository (for example,
https://github.com/yourusername/your-maven-project.git).
Add a build step with the Maven goals clean package.
This command instructs Maven to clean the previous build artifacts, compile
the code, run tests, and package the application into a JAR/WAR file.
o Optionally, set the POM File location if it is not in the default location
(pom.xml).
Add a build step that runs gradle clean build.
This instructs Gradle to clean previous build outputs and then build the project,
running tests along the way.
o Switches: If needed, you can add additional flags (for example, --info or --
stacktrace for more detailed output).
o Verify that Jenkins successfully checks out the code, runs the build commands
(Maven or Gradle), and executes tests.
o Look for “BUILD SUCCESS” or the equivalent output to confirm that the
build and tests passed.
For greater flexibility and version-controlled CI configuration, you can use a Jenkins Pipeline
defined in a Jenkinsfile.
Below are sample pipeline scripts for Maven and Gradle projects.
Maven project Jenkinsfile:
pipeline {
agent any
stages {
stage('Checkout') {
steps {
// Check out code from Git repository
git url: 'https://github.com/yourusername/your-maven-project.git', branch: 'main'
}
}
stage('Build') {
steps {
// Run Maven build
sh 'mvn clean package'
}
}
stage('Test') {
steps {
// Optionally, separate test execution if needed
sh 'mvn test'
}
}
}
post {
always {
// Archive test reports
junit '**/target/surefire-reports/*.xml'
}
success {
echo 'Build and tests succeeded!'
}
failure {
echo 'Build or tests failed.'
}
}
}
Gradle project Jenkinsfile:
pipeline {
agent any
stages {
stage('Checkout') {
steps {
// Check out code from Git repository
git url: 'https://github.com/yourusername/your-gradle-project.git', branch: 'main'
}
}
stage('Build') {
steps {
// Run Gradle build
sh './gradlew clean build'
}
}
stage('Test') {
steps {
// Run tests (if not already run in the build stage)
sh './gradlew test'
}
}
}
post {
always {
// Archive test reports (modify the path according to your project structure)
junit '**/build/test-results/test/*.xml'
}
success {
echo 'Build and tests succeeded!'
}
failure {
echo 'Build or tests failed.'
}
}
}
1. Introduction to Ansible
What Is Ansible?
Ansible is an open-source automation tool for configuration management, application deployment,
and orchestration. Key concepts:
• Inventory:
An inventory is a file (usually in INI or YAML format) that lists the hosts (or groups
of hosts) you want to manage. It tells Ansible which machines to target.
• Playbook:
A playbook is a YAML file that defines a set of tasks to be executed on your target
hosts. It is the heart of Ansible automation. In a playbook, you specify:
o Hosts: The target machines (or groups) on which the tasks should run.
o Tasks: A list of actions (using modules) that should be executed.
o Modules: Reusable, standalone scripts that perform specific actions (e.g.,
installing packages, copying files, configuring services).
• Modules:
Ansible comes with a large collection of built-in modules (such as apt, yum, copy,
service, etc.). These modules perform specific tasks on target hosts. You can also
write custom modules if needed.
• Agentless: Ansible uses SSH to communicate with target hosts, so no agent needs to
be installed on them.
• Simplicity: Playbooks use simple YAML syntax, making them easy to write and
understand.
• Idempotence: Ansible tasks are idempotent, meaning running the same playbook
multiple times yields the same result, ensuring consistency.
• Scalability: Ansible can manage a small number of servers to large infrastructures with
hundreds or thousands of nodes.
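As a loose analogy for idempotence, consider mkdir -p in the shell: the end state is the same whether the command runs once or many times (the directory path here is just an example):

```shell
# First run creates the directory; repeat runs change nothing and still succeed.
mkdir -p /tmp/idempotent_demo
mkdir -p /tmp/idempotent_demo
test -d /tmp/idempotent_demo && echo "directory present"
```

Ansible modules apply the same principle: a task like "state: present" only acts when the system is not already in the desired state.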
Before writing a playbook, you need to install Ansible on your control machine (your local
Ubuntu system).
ansible --version
ansible 2.9.x
config file = /etc/ansible/ansible.cfg
...
An inventory file lists the hosts you want to manage. For this experiment, you can use the local
host.
While our experiment covered the basics, here’s how you can extend it:
• Configuring Services:
Use modules like service to start, stop, or restart services. For example, you can
automate the configuration of web servers (e.g., Apache or Nginx).
• Managing Files and Templates:
Use the copy or template modules to deploy configuration files across your servers.
This is useful for maintaining consistent configuration settings.
• User and Group Management:
The user and group modules allow you to create or modify user accounts, ensuring
that the correct permissions and roles are applied automatically.
• Advanced Orchestration:
Ansible playbooks can include conditionals, loops, and error handling to manage more
complex setups, ensuring idempotence and consistency across your infrastructure.
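A sketch of a loop and a conditional in a playbook (the module and variable names are standard Ansible; the package list is illustrative):

```yaml
tasks:
  # Loop: install several packages with a single task
  - name: Install common utilities
    apt:
      name: "{{ item }}"
      state: present
    loop:
      - git
      - curl
    # Conditional: run only on Debian-family hosts
    when: ansible_os_family == "Debian"
```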
If you wanted to automate the configuration of an Nginx server, your playbook might include
tasks such as:
tasks:
  - name: Install Nginx
    apt:
      name: nginx
      state: present
  - name: Deploy Nginx configuration
    template:
      src: nginx.conf.j2
      dest: /etc/nginx/nginx.conf
    notify: Restart Nginx
handlers:
  - name: Restart Nginx
    service:
      name: nginx
      state: restarted
You will now create a simple playbook that performs two common tasks, such as installing a package and creating a file.
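A minimal sketch of such a playbook, assuming localhost as the target; the package and file names are illustrative:

```yaml
---
- name: Basic setup playbook
  hosts: localhost
  connection: local
  become: yes
  tasks:
    - name: Install a package (illustrative choice)
      apt:
        name: tree
        state: present
    - name: Create a marker file
      copy:
        content: "Configured by Ansible\n"
        dest: /tmp/ansible_marker.txt
```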
• Explanation:
o ansible-playbook: The command to run an Ansible playbook.
Expected Output:
Each task reports ok or changed, followed by a PLAY RECAP summary for each host.
1. Overview
In this experiment, you will:
• Set up a Jenkins job to automatically build a Maven project from source control.
• Archive the build artifact (a JAR file) produced by Maven.
• Integrate an Ansible deployment step within Jenkins (using a post-build action) to
deploy the artifact to a target location.
• Verify that the artifact is deployed successfully.
This exercise demonstrates how Continuous Integration (CI) and automated configuration
management can work together to streamline the build-and-deploy process.
2. Prerequisites
Tip: Verify installations and repository access before starting this exercise.
2. cd path/to/HelloMaven
3. git init
4. git add .
5. git commit -m "Initial commit of HelloMaven project"
HelloMaven/
├── pom.xml
└── src
    ├── main/java/com/example/App.java
    └── test/java/com/example/AppTest.java
o Credentials: If the repository is private, click “Add” and provide the necessary
credentials.
o Branch Specifier: (e.g., */main or */master).
This command cleans any previous builds, compiles the code, runs tests, and
packages the application into a JAR file.
After the Maven build completes, you need to archive the generated artifact so that it can be
used later by the deployment process.
Use the pattern target/*.jar. This pattern tells Jenkins to archive any JAR file found in the target directory.
Now, integrate an Ansible deployment step into the Jenkins job. You can do this as a post-build
action that executes a shell command.
Note:
Create an Ansible playbook that deploys the Maven artifact (the JAR file) generated by Jenkins
to a target directory.
If you haven’t already, create an inventory file (e.g., hosts.ini) that targets the deployment
machine. For a local deployment, use:
[local]
localhost ansible_connection=local
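A sketch of the deployment playbook described below; the workspace and artifact paths are assumptions, so adjust them to your Jenkins job name and artifact version:

```yaml
---
- name: Deploy the Maven artifact
  hosts: local
  become: yes
  tasks:
    - name: Copy the JAR to the deployment directory
      copy:
        # Assumed Jenkins workspace path; adjust for your job
        src: /var/lib/jenkins/workspace/HelloMaven/target/HelloMaven-1.0-SNAPSHOT.jar
        dest: /opt/deployment/HelloMaven.jar
```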
Explanation:
o hosts: local means the playbook runs on the local machine. Adjust this if
deploying to a remote server.
o become: yes: Uses sudo privileges to write to system directories.
o src: The path should point to the archived artifact in the Jenkins workspace.
(Adjust the path if your Jenkins workspace is different.)
o dest: The target directory where you want the artifact deployed (ensure this
directory exists or modify accordingly).
13. Save and Exit the File.
Screenshot Tip: Capture the console output showing the full pipeline execution,
including the deployment step.
2. Verify Deployment:
o Log into your target machine (or check locally) and verify that the artifact has
been copied to the destination directory (e.g.,
/opt/deployment/HelloMaven.jar).
Azure DevOps is a comprehensive suite of cloud-based services designed to support the entire
software development lifecycle. It provides tools for planning, developing, testing, delivering,
and monitoring applications. Here are the primary services offered:
• Azure Repos:
A set of version control tools that allow you to host Git repositories or use Team
Foundation Version Control (TFVC). It offers collaboration features such as pull
requests, branch policies, and code reviews.
• Azure Pipelines:
A CI/CD service that helps automate builds, tests, and deployments. It supports multiple
languages, platforms, and can run on Linux, Windows, or macOS agents.
• Azure Boards:
A work tracking system that helps teams manage work items, sprints, backlogs, and
Kanban boards. It facilitates agile planning and reporting.
• Azure Test Plans:
Provides a solution for managing and executing tests, capturing data about defects, and
tracking quality.
• Azure Artifacts:
Allows you to create, host, and share packages (such as Maven, npm, NuGet, and
Python packages) with your team, integrating package management into your CI/CD
pipelines.
These services integrate with each other and with popular third-party tools to create a cohesive
DevOps ecosystem.
Before you can start using Azure DevOps services, you need to set up an account and create
an organization. Follow these steps:
o Select a Region: Choose the geographic region where your data will be stored
(select the one closest to you for optimal performance).
o Click “Continue” or “Create”.
After setting up your organization, the next step is to create a project. A project in Azure
DevOps is a container for all your source code, pipelines, boards, and other resources.
1. Project Overview:
o Once your project is created, you will be directed to the project dashboard. Here
you will see navigation options for:
▪ Repos: Where your code is stored.
▪ Pipelines: For build and release automation.
▪ Boards: For work tracking and agile planning.
▪ Test Plans: For managing and running tests.
▪ Artifacts: For hosting packages.
2. Familiarize Yourself with the Interface:
o Click through each section (e.g., Repos, Pipelines, Boards) to get a sense of the
available features.
Create a project by clicking on the New Project option; choose an appropriate name and
select the following features as mentioned in the image (Build system: Maven and Language: Java).
1. Check whether Git is installed using the following command:
$ git --version
2. If needed, install Git:
$ sudo apt install git
3. Once Git has been installed, execute the following commands one by one:
$ git init
$ git add .
$ git commit -m "First Commit"
$ git branch -M main
Go to https://github.com, sign in with your account, and create a new private repository
with the same name as your Maven project. Add it as a remote with
git remote add origin <repository-URL>.
Now run the push commands to push the code from the local repository to the remote repository:
$ git push --set-upstream origin main (type yes at the prompt, if asked)
$ git push
4. A YAML file is automatically created based on the configuration details found in the
pom.xml file in the repository.
Below is a detailed section that explains how to run unit tests and generate reports in an Azure
DevOps pipeline after your Maven project has been built. This section explains what happens
during the Maven build process, how test results are generated, and how to publish these results
as part of your pipeline execution.
Running Unit Tests and Generating Reports with Maven in Azure DevOps
When you build your Maven project using Azure Pipelines, the build process usually includes
running unit tests with the Maven Surefire plugin. This plugin executes tests (typically written
with JUnit) and produces test result files in XML format. Azure Pipelines can then pick up
these XML files and present them as part of the build summary. Below are the steps and details
to ensure that your unit tests are executed and the reports are published.
2. Configuring Your Azure Pipeline to Publish Test Results
After your Maven build runs and tests are executed, you need to add a step in your Azure Pipeline
YAML file that locates these test reports and publishes them in Azure DevOps. This is
accomplished by using the PublishTestResults@2 task.
Below is a sample snippet of a YAML pipeline configuration that includes both the Maven
build step and a step to publish test results:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'clean package'
- task: PublishTestResults@2
  inputs:
    testResultsFiles: '**/target/surefire-reports/TEST-*.xml'
    mergeTestResults: true
After committing the YAML file to your repository, the pipeline is triggered (either
automatically or manually):
4. Troubleshooting Tips
1. Overview
• Deploy your build artifact (e.g., a JAR or WAR file from a Maven/Gradle project) to
an Azure App Service.
• Manage secrets and configuration securely using Azure Key Vault.
• Set up a release pipeline in Azure DevOps that automatically deploys your application
when a new artifact is available (continuous deployment).
This experiment bridges the gap between build automation (CI) and release automation (CD)
while ensuring secure management of sensitive information.
2. Prerequisites
• An Azure DevOps account with a project set up (see Experiment 9 and Experiment
10).
• A build artifact (e.g., your Maven/Gradle artifact) available from a build pipeline.
• An Azure Subscription with an Azure App Service instance already created to host
your application.
• An Azure Key Vault instance created in your Azure Subscription for storing secrets
(e.g., connection strings, API keys).
• Appropriate permissions to create and manage resources in Azure DevOps and your
Azure Subscription.
1. Sign in to Azure DevOps:
o Open your web browser and navigate to your Azure DevOps project (e.g.,
https://dev.azure.com/YourOrganization).
2. Navigate to the Releases Section:
o In the left-hand menu, click on “Pipelines” and then “Releases”.
3. Create a New Pipeline:
o Click on “New pipeline”.
o When prompted, select “Empty job” (or start with a template if one suits your
needs).
B. Add an Artifact
1. Create a Key Vault:
o Follow the prompts to create a new Key Vault (enter a name, select subscription,
resource group, and region).
2. Add Secrets to Your Key Vault:
o Once the Key Vault is created, navigate to it.
o Click on “Secrets” and then “Generate/Import”.
o Create new secrets (e.g., DBConnectionString, APIKey) and note their names.
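Once the secrets exist, a pipeline can fetch them with the AzureKeyVault task. In the sketch below, the service connection and vault names are placeholders you would replace with your own:

```yaml
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-azure-service-connection'  # placeholder service connection
    KeyVaultName: 'my-keyvault'                       # placeholder vault name
    SecretsFilter: 'DBConnectionString,APIKey'        # the secrets created above
    RunAsPreJob: true
```

The fetched secrets become pipeline variables (e.g., $(DBConnectionString)) that later tasks can reference.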
1. Trigger a Build:
o Commit a change to your code repository to trigger your build pipeline (or
trigger it manually).
o Verify that the build pipeline creates a new artifact.
2. Automatic Release:
o Once the new artifact is published, the release pipeline should automatically
trigger a new release.
o Monitor the release pipeline execution to ensure that:
▪ The artifact is deployed to the Azure App Service.
▪ The deployment task uses the configuration and secrets from Key Vault.
1. Overview
In this experiment, you will create an end-to-end DevOps pipeline that demonstrates the
following processes:
• Version Control: Code is maintained in a Git repository (e.g., GitHub or Azure Repos).
• Continuous Integration (CI):
o A CI tool (Jenkins and/or Azure Pipelines) automatically checks out the code,
builds it using Maven/Gradle, runs unit tests, and archives the build artifact.
• Artifact Management: The artifact (e.g., a JAR file) is archived and made available
for deployment.
• Continuous Deployment (CD):
o Deployment automation is handled either by an Ansible playbook or an Azure
Release pipeline, deploying the artifact to a target environment (such as an
Azure App Service or a local server).
• Secrets and Configuration Management: Securely manage configuration details and
secrets using Azure Key Vault.
• Pipeline as Code: Use YAML (for Azure Pipelines) or a Jenkinsfile (for Jenkins) to
define your build and release processes.
After setting up and running the pipeline, we will discuss best practices for designing and
maintaining such pipelines and open the floor for a Q&A discussion.
2. Prerequisites
Before starting, ensure you have completed or have access to the following:
o The project should follow a standard structure (with pom.xml for Maven or
build.gradle for Gradle, and appropriate src/main/java and src/test/java
directories).
• Jenkins and/or Azure DevOps Setup:
o Jenkins installed and configured on your local machine or a cloud server (refer
to Experiments 5 and 6), or an Azure DevOps project set up with a build pipeline
(Experiments 9 and 10).
• Ansible Installed:
o Ansible is installed on your control machine (or Jenkins server) with a basic
inventory file (see Experiment 7).
• Azure Resources: (Optional but recommended for cloud deployment)
o An Azure App Service instance created to host your application.
o An Azure Key Vault instance set up to store sensitive data (e.g., connection
strings).
• Access Credentials:
o Permissions to commit code to the repository.
o Administrative access on Jenkins/Azure DevOps.
o Proper permissions on Azure to deploy to App Services and to manage Key
Vault secrets.
            }
            stage('Test') {
                steps {
                    sh 'mvn test'
                }
            }
            stage('Archive') {
                steps {
                    archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
                }
            }
        }
    }
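Only the tail end of the Jenkinsfile appears above. For reference, a complete declarative pipeline around those stages could look like this (the Build stage and its mvn clean package step are assumed from context, not taken verbatim from the manual):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // assumed build step, matching the Test/Archive stages above
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Archive') {
            steps {
                archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
            }
        }
    }
}
```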
4. Run the Jenkins Job:
o Click “Build Now” and monitor the console output. Ensure that the build is
successful and that test reports are generated.
pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'clean package'
- task: PublishTestResults@2
  inputs:
    testResultsFiles: '**/target/surefire-reports/TEST-*.xml'
    mergeTestResults: true
    testRunTitle: 'Maven Unit Test Results'
3. Run the Pipeline and Verify Test Reports:
o Commit and run the pipeline.
o Navigate to the “Tests” tab to view the summary of executed tests.
src: "/var/lib/jenkins/workspace/HelloMaven-CI/target/HelloMaven-1.0-SNAPSHOT.jar"
dest: "/opt/deployment/HelloMaven.jar"
o Adjust paths according to your environment.
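For context, the src/dest pair shown above belongs to a copy task inside a playbook. A minimal deploy.yml along those lines might look like this (the directory-creation task and become: yes are assumptions, not part of the original):

```yaml
---
- hosts: deployment
  become: yes
  tasks:
    - name: Ensure the deployment directory exists   # assumed setup step
      file:
        path: /opt/deployment
        state: directory
        mode: '0755'

    - name: Copy the built JAR to the target
      copy:
        src: "/var/lib/jenkins/workspace/HelloMaven-CI/target/HelloMaven-1.0-SNAPSHOT.jar"
        dest: "/opt/deployment/HelloMaven.jar"
```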
2. Configure Your Ansible Inventory:
o Create or update your hosts.ini file:
[deployment]
target-server ansible_host=your.server.ip ansible_user=yourusername
o For a local deployment, you can use:
[deployment]
localhost ansible_connection=local
3. Integrate Ansible into Your Jenkins/Azure Pipeline:
o Add a post-build (or post-release) step to execute the Ansible playbook:
ansible-playbook -i /path/to/hosts.ini /path/to/deploy.yml
Best Practices:
• Pipeline as Code:
Use YAML (or a Jenkinsfile) to define your build and release pipelines. This allows
you to version control your pipeline configuration alongside your code.
• Automate Everything:
Automate code checkout, builds, tests, artifact archiving, and deployments. Reduce
manual interventions to minimize human error.
• Idempotence:
Ensure that your deployment scripts (whether Ansible playbooks or release tasks) are
idempotent—running them multiple times produces the same result.
• Secure Secrets Management:
Use Azure Key Vault (or a similar tool) to securely store sensitive data (e.g., API
keys, connection strings) and reference these values in your pipelines.
• Monitoring and Logging:
Integrate logging and monitoring into your pipeline. Review test reports, deployment
logs, and set up notifications for build failures.
• Modular and Scalable Design:
Break down your pipeline into clear stages (checkout, build, test, deploy) and design
it to handle multi-environment deployments (development, staging, production).
• Continuous Improvement:
Regularly review and refine your pipeline. Use metrics and feedback to optimize
build times, reduce failures, and ensure high-quality releases.