TP Jenkins (jenkins automate)
Uploaded by gamingirl699

Lab Assignment: Jenkins Pipeline to Simulate Deploying a Python Project to EC2

Prerequisites

Jenkins

S3 bucket
An S3 bucket already allocated to store the project files.

AWS credentials
An AWS account with permissions to create EC2 instances, interact with S3, and use the EC2 API.

Private GitHub repository
Containing the project files.

S3 file creation

Create the credentials in Jenkins:

Create a new agent:

home dir: /var/lib/jenkins

Run the following command on our ec2-user machine:

java -jar agent.jar -url http://3.89.124.27:8080/ -secret fbd479947b4f674ada7ba041b1e28d4b158e0afde6460824fbf649fd6a911f8d -name agent1 -workDir "/var/lib/jenkins"
Stage 1

pipeline {
    agent { label 'master' } // Replace with your Jenkins agent label

    stages {
        stage('get_code') {
            steps {
                script {
                    // Variable configuration
                    def repoUrl = '[email protected]:Mrxa69K/tp_deploy.git' // Repository URL
                    def zipFile = 'project.zip'
                    def s3Bucket = 'jenkinsbucket69' // Your S3 bucket name
                    def awsRegion = 'us-east-1' // Replace with your AWS region

                    // Remove the existing directory if it exists
                    sh "rm -rf tp_deploy"

                    // Fetch the code from the GitHub repository
                    sshagent(['github-jenkins']) { // Your SSH credentials ID
                        sh "git clone ${repoUrl} --depth 1"
                    }

                    // Zip the required files
                    sh "zip -r ${zipFile} tp_deploy/hello_world.py tp_deploy/test_hello.py tp_deploy/requirements.txt"

                    // Upload the zip file to S3
                    withCredentials([
                        string(credentialsId: 'aws-access-key-id', variable: 'AWS_ACCESS_KEY_ID'),
                        string(credentialsId: 'aws-secret-access-key', variable: 'AWS_SECRET_ACCESS_KEY'),
                        string(credentialsId: 'aws-session-token', variable: 'AWS_SESSION_TOKEN')
                    ]) {
                        // Set AWS credentials and region for the CLI
                        sh """
                            export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
                            export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
                            export AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
                            export AWS_DEFAULT_REGION=${awsRegion}
                            aws s3 cp ${zipFile} s3://${s3Bucket}/
                        """
                    }
                }
            }
        }
    }
}
Stage 2

pipeline {
    agent { label 'master' }

    stages {
        stage('run_code') {
            steps {
                script {
                    // Variable configuration
                    def zipFile = 'project.zip'
                    def s3Bucket = 'jenkinsbucket69' // Your S3 bucket name
                    def awsRegion = 'us-west-2' // Change this to your AWS region if needed

                    // Download the zip file from S3
                    withCredentials([
                        string(credentialsId: 'aws-access-key-id', variable: 'AWS_ACCESS_KEY_ID'),
                        string(credentialsId: 'aws-secret-access-key', variable: 'AWS_SECRET_ACCESS_KEY'),
                        string(credentialsId: 'aws-session-token', variable: 'AWS_SESSION_TOKEN')
                    ]) {
                        // Set AWS credentials and region for the CLI
                        sh """
                            export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
                            export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
                            export AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
                            export AWS_DEFAULT_REGION=${awsRegion}
                            aws s3 cp s3://${s3Bucket}/${zipFile} ./
                        """
                    }

                    // Unzip the file
                    sh "unzip ${zipFile}"

                    // Create a Python virtual environment
                    // (each sh step runs in a fresh shell, so the venv is re-activated in every step)
                    sh '''
                        python3 -m venv venv
                        source venv/bin/activate
                    '''

                    // Install the dependencies
                    sh '''
                        source venv/bin/activate
                        pip install -r tp_deploy/requirements.txt
                    '''

                    // Run the Python script
                    sh '''
                        source venv/bin/activate
                        python tp_deploy/hello_world.py
                    '''
                }
            }
        }
    }
}
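One detail worth noting: each `sh` step starts a fresh shell, which is why `source venv/bin/activate` is repeated in every step. Run by hand in a single shell, the same sequence collapses to one activation (a sketch; the dependency install is omitted since it depends on the repo's requirements file):

```shell
# Create the virtual environment once.
python3 -m venv venv

# Activate it; `.` is the POSIX-portable spelling of `source`.
. venv/bin/activate

# `python` and `pip` now resolve inside the venv.
python -c 'import sys; print(sys.prefix)'

# Leave the venv when done.
deactivate
```

Inside Jenkins, an alternative to re-sourcing is to call `venv/bin/python` and `venv/bin/pip` by path, which works identically without activation.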
Stage 3

pipeline {
    agent { label 'master' } // Use the same agent as Stage 2

    stages {
        stage('run_code') {
            steps {
                script {
                    // Variable configuration
                    def zipFile = 'project.zip'
                    def s3Bucket = 'jenkinsbucket69' // Your S3 bucket name

                    // Download the zip file from S3
                    withCredentials([
                        string(credentialsId: 'aws-access-key-id', variable: 'AWS_ACCESS_KEY_ID'),
                        string(credentialsId: 'aws-secret-access-key', variable: 'AWS_SECRET_ACCESS_KEY'),
                        string(credentialsId: 'aws-session-token', variable: 'AWS_SESSION_TOKEN')
                    ]) {
                        // Set AWS credentials for the CLI
                        sh """
                            export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
                            export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
                            export AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
                            aws s3 cp s3://${s3Bucket}/${zipFile} .
                        """
                    }

                    // Unzip the file, overwriting existing files without prompting
                    sh "unzip -o ${zipFile}"

                    // Create a Python virtual environment
                    sh '''
                        python3 -m venv venv
                        source venv/bin/activate
                    '''

                    // Install the dependencies
                    sh '''
                        source venv/bin/activate
                        pip install -r tp_deploy/requirements.txt
                    '''

                    // Run the Python script
                    sh '''
                        source venv/bin/activate
                        python tp_deploy/hello_world.py
                    '''
                }
            }
        }

        stage('test_code') {
            steps {
                script {
                    // Run the unit tests with pytest
                    sh '''
                        source venv/bin/activate
                        pip install pytest
                        pytest tp_deploy/test_hello.py
                    '''
                }
            }
        }
    }
}
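The `test_code` stage assumes `tp_deploy/test_hello.py` imports from `hello_world.py`. A stand-in pair in the same layout (hypothetical contents; the real repo's files may differ). pytest resolves the import because, with no `__init__.py` present, it prepends the test file's directory to `sys.path`:

```shell
# Stand-in module and test, mirroring the layout the stage runs.
mkdir -p tp_deploy
cat > tp_deploy/hello_world.py <<'PY'
def hello():
    return "Hello, World!"

if __name__ == "__main__":
    print(hello())
PY

cat > tp_deploy/test_hello.py <<'PY'
from hello_world import hello

def test_hello():
    assert hello() == "Hello, World!"
PY

# Sanity-check the script directly; the stage then runs:
# pytest tp_deploy/test_hello.py   (needs pytest installed in the venv)
python3 tp_deploy/hello_world.py
```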
Stage 4

After deleting the restriction prefix (command="..." etc.) from the first line of the authorized_keys file, the authorized keys worked perfectly.
pipeline {
    agent { label 'agent1' }

    stages {
        stage('Launch EC2 Instance') {
            steps {
                script {
                    // Variable configuration
                    def instanceType = 't2.micro'
                    def amiId = 'ami-0ebfd941bbafe70c6' // AMI ID
                    def keyName = 'jenkin'
                    def securityGroupId = 'sg-05c656607d714e983' // Security group ID
                    def tagName = 'MyEC2Instance' // Tag name for the instance

                    // Use the AWS credentials
                    withCredentials([
                        string(credentialsId: 'aws-access-key-id', variable: 'AWS_ACCESS_KEY_ID'),
                        string(credentialsId: 'aws-secret-access-key', variable: 'AWS_SECRET_ACCESS_KEY'),
                        string(credentialsId: 'aws-session-token', variable: 'AWS_SESSION_TOKEN')
                    ]) {
                        // Set AWS credentials for the CLI
                        sh '''
                            export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
                            export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
                            export AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN
                            export AWS_DEFAULT_REGION=us-east-1

                            # Launch the EC2 instance
                            aws ec2 run-instances --image-id ''' + amiId + ''' --count 1 \
                                --instance-type ''' + instanceType + ''' \
                                --key-name ''' + keyName + ''' --security-group-ids ''' + securityGroupId + ''' \
                                --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=''' + tagName + '''}]'
                        '''
                    }
                }
            }
        }
    }
}
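Before wiring this into the pipeline, the `run-instances` call can be dry-assembled in a plain shell to check the flags. A sketch using the stage's IDs (substitute your own; the final line stays commented because it needs exported AWS credentials and would launch a real instance):

```shell
# Parameters from the stage (substitute your own values).
AMI_ID='ami-0ebfd941bbafe70c6'
INSTANCE_TYPE='t2.micro'
KEY_NAME='jenkin'
SG_ID='sg-05c656607d714e983'
TAG_NAME='MyEC2Instance'

# Assemble the exact command the stage would run.
CMD="aws ec2 run-instances --image-id $AMI_ID --count 1 \
--instance-type $INSTANCE_TYPE --key-name $KEY_NAME \
--security-group-ids $SG_ID \
--tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=$TAG_NAME}]'"

# Inspect the command before spending an instance.
echo "$CMD"
# eval "$CMD"   # run only with AWS credentials exported
```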
Stage 5

pipeline {
    agent { label 'agent1' }

    stages {
        stage('deploy_server') {
            steps {
                script {
                    // Variable configuration
                    def ec2PublicIp = '54.235.239.14' // Replace with the EC2 instance public IP
                    def keyPath = '/var/lib/jenkins/.ssh/id_rsa' // Path to your SSH key
                    def s3Bucket = 'jenkinsbucket69'
                    def s3ObjectKey = 'project.zip'
                    def ec2User = 'ec2-user' // or 'ubuntu', depending on the AMI used

                    // Use the AWS credentials to access the S3 bucket
                    withCredentials([
                        string(credentialsId: 'aws-access-key-id', variable: 'AWS_ACCESS_KEY_ID'),
                        string(credentialsId: 'aws-secret-access-key', variable: 'AWS_SECRET_ACCESS_KEY'),
                        string(credentialsId: 'aws-session-token', variable: 'AWS_SESSION_TOKEN')
                    ]) {
                        // SSH into the EC2 instance and deploy the server
                        sh """
ssh -o StrictHostKeyChecking=no -i ${keyPath} ${ec2User}@${ec2PublicIp} << EOF

# Download the project zip from S3
export AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
export AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
export AWS_SESSION_TOKEN=${AWS_SESSION_TOKEN}
aws s3 cp s3://${s3Bucket}/${s3ObjectKey} .

# Unzip the project
unzip -o ${s3ObjectKey}

# Start the Python server in the background using nohup
# (redirect output so the ssh session can close)
nohup python3 -m http.server 8080 > server.log 2>&1 &

EOF
"""
                    }
                }
            }
        }
    }
}
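The last line of the remote script is the fragile one: over a non-interactive ssh session, `nohup ... &` keeps the session open unless stdout and stderr are redirected, because the remote shell holds the ssh pipe. A local sketch of the corrected pattern (port 8765 is arbitrary for the demo; the stage uses 8080):

```shell
# Start the server detached, with output redirected so the parent
# shell (or ssh session) can close immediately.
nohup python3 -m http.server 8765 > server.log 2>&1 &
SERVER_PID=$!

# Give it a moment to come up, then clean up for this demo.
sleep 1
kill "$SERVER_PID" 2>/dev/null || true
```

On the real instance the `kill` line is omitted; the server keeps running after the Jenkins job finishes, which is the point of `nohup`.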
