
How to Use Docker Compose With Jenkins

Last Updated : 08 Jul, 2024

In the fast-paced world of software development, reliability comes largely from automating repetitive tasks. This is where Docker Compose and Jenkins come into play. Docker Compose is a tool for defining and running multi-container Docker applications, while Jenkins is a widely used open-source automation server for continuous integration and continuous delivery. By combining Docker Compose with Jenkins, teams can automate their build, test, and deployment pipelines, ensuring applications are built and deployed in a repeatable fashion.

This article describes how to use Docker Compose with Jenkins to build automated pipelines. We'll cover the essential terminology, guide you through the setup step by step, and illustrate the concepts with examples. We'll also answer some frequently asked questions so you can use this powerful combination to its full potential. Whether you're just starting with these tools or want to improve an existing setup, this guide gives you the background and instructions to streamline your development workflow.

Primary Terminologies

Docker

  • Docker is a platform that lets developers build, ship, and run applications in containers. It packages an application together with everything it needs (runtime, libraries, and dependencies) into a single unit, ensuring consistency across environments from development to production.

Jenkins

  • Jenkins is an open-source automation server used to build, test, and deploy software. It excels at setting up continuous integration/continuous deployment (CI/CD) pipelines.

Jenkins Pipeline

  • A Jenkins pipeline is a suite of plugins that lets users implement and manage continuous delivery pipelines as code. It defines the steps for building, testing, and deploying applications using a Groovy-based DSL.

Docker Image

  • A Docker image is a lightweight, standalone, executable package that includes everything needed to run a piece of software; for instance, the code, runtime, libraries, environment variables, and configuration files are included.

Docker Container

  • A Docker container is a running instance of a Docker image: an isolated environment in which the application code executes, behaving consistently regardless of the host it runs on.

Dockerfile

  • A Dockerfile is a text document containing the instructions Docker uses to build an image. Each instruction in a Dockerfile adds a new layer on top of the previous one.

Docker Hub

  • Docker Hub is an online registry service where you can publish your Docker images and make them easily available to others. It supports sharing, collaboration, and deployment of containerized applications.

Continuous Integration (CI)

  • Continuous Integration is the software development practice of merging code into a shared repository frequently. Each integration triggers automated builds and tests, allowing teams to detect problems early.

Continuous Delivery (CD)

  • Continuous Delivery (CD) is a software engineering approach in which small, incremental code changes are automatically built and tested so they are always ready for production. This means the software can be released reliably at any time.

Step-by-Step Integration Process

Step 1: Launch EC2 instance

  • Go to the AWS Console and log in with your credentials.
  • Navigate to the EC2 dashboard and launch an EC2 instance.

Step 2: Install Docker and Docker Compose

Install Docker:

  • Install Docker on the instance with the following command:
sudo yum -y install docker
  • Start and enable Docker with the following commands:
sudo systemctl start docker
sudo systemctl enable docker
sudo systemctl status docker

Install Docker Compose:

sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
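To see which binary that curl command actually downloads, the `$(uname -s)-$(uname -m)` substitution can be checked on its own. Below is a minimal sketch; the `compose_url` helper is just an illustrative name, not part of Docker Compose:

```shell
# Illustrative helper: print the release URL the install command above expands to.
compose_url() {
  ver="$1"
  # uname -s gives the kernel name (e.g. Linux), uname -m the architecture (e.g. x86_64)
  echo "https://github.com/docker/compose/releases/download/${ver}/docker-compose-$(uname -s)-$(uname -m)"
}

compose_url 1.29.2
```

On a typical EC2 Linux host this prints a URL ending in `docker-compose-Linux-x86_64`. After installation, run `docker-compose --version` to confirm the binary works.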

Step 3: Set Up Jenkins

  • Install Jenkins with the following commands, or follow the official installation guide:

sudo wget -O /etc/yum.repos.d/jenkins.repo \
    https://pkg.jenkins.io/redhat-stable/jenkins.repo
sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io-2023.key
sudo yum upgrade
sudo yum -y install java-17*
sudo yum -y install jenkins

  • Start and enable Jenkins with the following commands:
sudo systemctl start jenkins
sudo systemctl enable jenkins
sudo systemctl status jenkins
  • Unlock Jenkins with the initial administrator password, which you can read with:
sudo cat /var/lib/jenkins/secrets/initialAdminPassword

Step 4: Configure Docker in Jenkins

Install Docker and Docker Pipeline Plugins:

  • Navigate to Manage Jenkins > Manage Plugins.
  • Install Docker and Docker Pipeline plugins.

Step 5: Create a Jenkins Pipeline Job

  • Open Jenkins Dashboard.
  • Click on New Item.
  • Enter the name of the job, select Pipeline, and click OK.

Configure the Pipeline:

  • Go to the Pipeline section and add the following pipeline script:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://github.com/Sada-Siva-Reddy07/docker-jenkins.git', branch: 'main'
            }
        }
        stage('Build and Test') {
            steps {
                dir('path/to/your/docker-compose-directory') {
                    // Check the docker-compose version
                    sh 'docker-compose --version'
                    // Bring up the services
                    sh 'docker-compose up -d'
                    // Ensure the services are running
                    sh 'docker-compose ps'
                    // Run tests on the correct service (adjust if necessary)
                    sh 'docker-compose exec -T wordpress /bin/bash -c "apt-get update && apt-get install -y maven && mvn test"'
                }
            }
        }
        stage('Deploy') {
            when {
                expression { currentBuild.result == null || currentBuild.result == 'SUCCESS' }
            }
            steps {
                echo 'Deploying...'
                // Add your deploy steps here
            }
        }
    }
    post {
        always {
            echo 'Post actions'
        }
        success {
            echo 'Pipeline completed successfully.'
        }
        failure {
            echo 'Pipeline failed.'
        }
    }
}

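Steps like the `mvn test` call in the Build and Test stage can fail transiently while containers are still warming up. One common workaround is a small retry wrapper around the shell commands the pipeline issues. The sketch below is plain POSIX shell, and the `retry` helper is a hypothetical name introduced here, not a Jenkins or Docker feature:

```shell
# Hypothetical retry helper: run a command up to N times before giving up.
retry() {
  n="$1"; shift
  i=1
  while [ "$i" -le "$n" ]; do
    if "$@"; then
      return 0            # command succeeded
    fi
    echo "attempt $i of $n failed" >&2
    i=$((i + 1))
  done
  return 1                # all attempts failed
}

# Example use inside an sh step (assumes the wordpress service from the compose file):
# retry 3 docker-compose exec -T wordpress mvn test
```

A sleep between attempts, or a health check before the test step, would make this gentler on slow-starting services.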

Here is the docker-compose.yml file that is cloned from GitHub.

  • Use this docker-compose.yml or create your own:

version: '3.3'
services:
  db:
    image: mysql:8.0.27
    environment:
      MYSQL_ROOT_PASSWORD: rootpassword
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
    volumes:
      - db_data:/var/lib/mysql
  wordpress:
    image: wordpress:latest
    ports:
      - "80:80"
    environment:
      WORDPRESS_DB_HOST: db:3306
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress
    depends_on:
      - db
volumes:
  db_data:
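Before handing the file to Jenkins, it is worth a quick sanity check that the expected services are declared. The snippet below writes a trimmed stand-in for the compose file (the `/tmp` path and the trimmed content are for illustration only) and greps for the two service names; with Docker Compose installed you would run `docker-compose config` instead, which fully validates the file:

```shell
# Write a minimal stand-in for the compose file above (illustration only).
cat > /tmp/docker-compose-check.yml <<'EOF'
version: '3.3'
services:
  db:
    image: mysql:8.0.27
  wordpress:
    image: wordpress:latest
EOF

# Check that both services are declared at the expected indentation.
for svc in db wordpress; do
  if grep -q "^  ${svc}:" /tmp/docker-compose-check.yml; then
    echo "found ${svc}"
  else
    echo "missing ${svc}" >&2
  fi
done
```

This prints "found db" and "found wordpress" for the file above; a missing service would be reported on stderr before the pipeline ever runs.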

  • Now build the pipeline by clicking Build Now.
  • The pipeline stage view shows the progress of each stage.

Step 6: Verification

  • Navigate to the EC2 dashboard, copy the public IP of the instance, and browse to it on port 80.
  • The application has been deployed successfully using the Jenkins pipeline and Docker Compose.

Conclusion

In conclusion, integrating Docker Compose with Jenkins for pipeline automation is a significant advancement in modern software development. It combines containerization with Docker and powerful automation with Jenkins, bringing efficiency throughout the development lifecycle. Docker Compose minimizes the burden of managing multi-container applications and ensures consistent environments from development to production, while Jenkins complements this by automating build, test, and deployment tasks, enabling smoother CI/CD (continuous integration/continuous delivery).

This integration enhances collaboration between teams, enabling fast, glitch-free application deployment, and promotes scalability and flexibility that adapt to changing workload demands. Standardization, together with the automated testing and validation that Docker Compose and Jenkins enable, improves software quality. Going forward, this setup leaves room for container orchestration with Kubernetes and further integration with cloud-native technologies. Overall, Docker Compose and Jenkins help organizations streamline development workflows, reduce time to market, and consistently deliver quality software.

  • Some benefits of integrating Docker Compose with Jenkins for CI/CD are as follows:

    Consistency: Ensures consistent environments across different pipeline stages.

    Automation: Builds, tests, and deploys automatically with minimal manual intervention, reducing the chance of human error.

    Scalability: Applications and services described in Docker Compose files can be scaled easily.

    Adaptability: Adjusts rapidly to changing development and deployment needs.

    Speed: Delivers software updates and improvements faster through continuous integration and delivery pipelines.

Can I trigger Jenkins pipelines with Docker events?

Yes. Docker events can trigger Jenkins pipelines through plugins such as Docker Pipeline, or by configuring Jenkins to listen for Docker events via webhooks. In this way, Jenkins can start pipeline runs based on container lifecycle events such as image pushes or container starts and stops.

How can Docker Compose ease local development environments?

Docker Compose simplifies setting up and managing local development environments by defining all required services (databases, web servers, and so on) in a single docker-compose.yml file. Developers can then bring up the whole environment with a single command (docker-compose up), gaining consistency between their local setups and the production environment.


