90 Days of DevOps Part-I
PART 1
The Fundamentals
Days 1-3
Previously on...
Before we step into the world of DevOps transformation at NexusCorp,
you need to grasp the core principles that will be your guiding light. In
Part I, we explore the essentials of DevOps—what it is, why it matters,
and how it evolved.
Exercise: Research a case study about a company that failed due to the lack of
DevOps principles. Write down your observations.
DevOps Culture
What makes DevOps culture special? How does it differ from traditional IT
culture?
Exercise: List the DevOps cultural principles that you feel would have helped
prevent NexusCorp's initial problems.
PART 2
Unlock the Terminal
Days 4-12
Transition into Linux
Linux is ubiquitous. It's the backbone of vast cloud infrastructures and the
cornerstone of countless data centers. Government organizations rely on
Linux for its robust security features. Space agencies use Linux to interface
with complex systems on Earth and beyond. Even in cybersecurity, Linux
stands as a titan for its versatility and robustness. Financial institutions,
healthcare systems, telecom industries—the list goes on, and I could keep
naming sectors that leverage Linux for its unparalleled capabilities.
What to Expect
This next section offers you a day-by-day learning plan designed to take you from
a complete beginner to a confident Linux user. We've broken down the
essential skills into manageable, bite-sized chunks. By the end of this, you'll
not only understand the Linux operating system but also be prepared to take
on more advanced DevOps topics.
Hands-On Learning
This book is committed to a learning-by-doing approach. Each day's lesson
comes with practical exercises and mini-projects that encourage you to get
your hands dirty. It's one thing to read about Linux commands; it's another to
actually execute them.
DAY 4
Introduction to Linux and Setting Up
❖ Brief history of Linux and its variants (distros).
❖ Installing Linux on your system through a Virtual Machine or dual boot.
Why: Before you start, you need to understand what you're getting into and
have a system to practice on.
Breadth: Cover the pros and cons of using a VM vs dual boot, and briefly
discuss cloud-based Linux solutions.
Practical Exercise
❖ Install a Linux distro on your system either through a VM or by setting up a dual
boot.
❖ Write down the steps you took during the installation and any challenges you
faced.
Practical Exercise
❖ Navigate to your home directory and create a new directory called Day1.
❖ Inside Day1, create a text file named intro.txt and write "My first day
learning Linux" in it.
❖ Use commands to find out which directory you're currently in and list the
contents of the directory.
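If you get stuck, here is one way to do it from the shell (a minimal sketch; the directory and file names match the exercise above):
cd ~                                               # go to your home directory
mkdir Day1                                         # create the new directory
cd Day1
echo "My first day learning Linux" > intro.txt     # create the file with that line of text
pwd                                                # show which directory you are currently in
ls -l                                              # list the contents of the directory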
DAY 5
File Operations
❖ touch, mkdir, cp, mv, rm.
❖ Manipulating files and directories.
Why: File operations are your basic verbs in the Linux language; you need
to know them to do almost anything.
Practical Exercise
❖ Create a directory called Day2Files. Inside, generate three empty files named
file1.txt, file2.txt, and file3.txt using touch.
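The file-operation verbs from this lesson in action, as a hedged sketch (the copied and renamed file names are illustrative):
mkdir Day2Files && cd Day2Files
touch file1.txt file2.txt file3.txt     # create three empty files
cp file1.txt file1_copy.txt             # copy a file
mv file2.txt notes.txt                  # move (here: rename) a file
rm file3.txt                            # remove a file
ls -l                                   # confirm the results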
Breadth: Introduce nano and vi, two text editors, mentioning that vi is
generally available on all systems by default, whereas nano is more beginner-
friendly.
Practical Exercise
❖ Use echo to add the text "Linux is great" to a file named opinion.txt.
❖ Use cat to display the contents of opinion.txt.
❖ Open opinion.txt with nano and vi, make some changes and save them.
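One possible way to run through this exercise (a sketch; note the difference between > and >>):
echo "Linux is great" > opinion.txt     # > creates/overwrites the file; >> would append instead
cat opinion.txt                         # display the contents
nano opinion.txt                        # edit, then Ctrl+O to save and Ctrl+X to exit
vi opinion.txt                          # edit, then press Esc and type :wq to save and quit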
DAY 6
Permissions and Ownership
❖ Understanding User Roles, Groups, and Permissions.
❖ chmod, chown.
❖ Special Permissions: SUID, SGID, and Sticky Bit
Why: Security and permissions are critical for managing a stable and secure
system, an essential skill for DevOps.
Breadth: Discuss the significance of the root user and how permissions can
affect system stability and security. Understand permission groups: user,
group, and others.
Practical Exercise
❖ Create a file called private.txt. Remove all permissions for group and others.
❖ Create a directory called shared_folder. Give read and execute permissions
to the group and others.
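A hedged sketch of the chmod commands this exercise calls for (the exact permission strings shown by ls depend on your umask):
touch private.txt
chmod go-rwx private.txt      # remove all permissions for group and others
ls -l private.txt             # owner keeps access; group and others show ---
mkdir shared_folder
chmod go+rx shared_folder     # give group and others read and execute
ls -ld shared_folder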
Breadth: Discuss how chown relates to system administration tasks and the
precautions to take while using it.
Practical Exercise
❖ Create a file named ownership.txt. Use chown to change its ownership to another
user on your system (you may need superuser privileges for this).
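For example (assuming another account exists on your system; the user name below is a placeholder):
touch ownership.txt
sudo chown someotheruser ownership.txt    # requires superuser privileges
ls -l ownership.txt                       # the owner column should now show the new user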
Breadth: Explain where and when to use these special permissions. Discuss
their implications in shared environments and why they can be both useful
and potentially dangerous.
Practical Exercise
❖ Create a directory called sticky_folder. Set the Sticky Bit on this folder and
verify the permissions.
❖ Create a script called special_script.sh. Set it with SUID and execute the script
as a different user. Observe the behavior.
❖ Create a directory and set it with SGID. Create a file in that directory and
observe the group ownership.
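A hedged sketch of these three tasks (note that most Linux kernels ignore SUID on interpreted scripts, which is exactly the kind of behavior worth observing here):
mkdir sticky_folder
chmod +t sticky_folder                    # set the Sticky Bit
ls -ld sticky_folder                      # look for the trailing 't' in the permission string

printf '#!/bin/bash\nwhoami\n' > special_script.sh
chmod a+x,u+s special_script.sh           # make it executable and set SUID
ls -l special_script.sh                   # 's' appears in the owner's execute position

mkdir sgid_dir
chmod g+s sgid_dir                        # set SGID on the directory
touch sgid_dir/test.txt
ls -l sgid_dir                            # new files inherit the directory's group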
DAY 7
Process Management
❖ ps, top, kill, nice, and other commands for process monitoring and control
Why: Understanding how to manage processes is crucial for system
optimization and troubleshooting.
Practical Exercise
❖ Use ps -e to list all processes and identify their Process IDs (PIDs).
❖ Use ps -u [username] to see all processes started by a particular user.
Breadth: Explain how top is useful for system administrators and DevOps
engineers for real-time monitoring.
Practical Exercise
❖ Open top and identify the top 3 CPU-consuming processes.
❖ Use the h key in top to display a list of commands you can use within top.
Practical Exercise
❖ Start a long-running command and find its PID. Use kill to terminate it.
❖ Use nice to start a new process with a lower priority, then observe its CPU
utilization compared to other tasks.
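A hedged sketch of the kill and nice exercises (replace <PID> with the number you find):
sleep 600 &                  # start a long-running command in the background
echo $!                      # PID of that background job
ps -e | grep sleep           # another way to find it
kill <PID>                   # terminate the process
nice -n 10 sleep 600 &       # start a process with lower priority (niceness 10)
top                          # compare CPU usage; press 'q' to quit, 'h' for help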
DAY 8
Package Management and Software Installation
❖ apt, yum
❖ Installing, updating, and removing software
❖ Update vs. Upgrade
Why: Managing software is a routine part of system administration, and
understanding it rounds off your basic Linux toolkit.
Breadth: Discuss why package managers are useful for not just installing
software, but also for handling dependencies and updates.
Practical Exercise
❖ Use apt or yum to install a new package.
❖ List all installed packages using apt list --installed or yum list installed.
apt vs yum
Depth: Describe the differences between apt and yum, focusing on syntax,
configuration files, and the underlying package formats (deb vs rpm).
Breadth: Offer scenarios where one might be preferable over the other,
depending on the Linux distribution and specific requirements.
Practical Exercise
❖ Refresh the package lists using apt update (on yum-based systems, yum check-update lists available updates).
❖ Upgrade a specific package using apt upgrade [package_name] or yum
upgrade [package_name].
Practical Exercise
❖ Perform an 'update' and then an 'upgrade' on your system.
❖ Check the logs to verify the updates and upgrades.
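On a Debian/Ubuntu-style system this might look like the following (log locations differ on yum-based systems, where 'yum history' is a useful alternative):
sudo apt update                   # refresh package metadata
sudo apt upgrade                  # apply available upgrades
less /var/log/apt/history.log     # review what was installed or upgraded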
DAY 9
Networking Basics
❖ ping, ifconfig, netstat
❖ Basic network troubleshooting and configuration
Why: Networking is fundamental to the interconnected world of DevOps.
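A few hedged starting points with these tools (on newer distros, ip and ss replace ifconfig and netstat, which may require the net-tools package):
ping -c 4 example.com     # send four echo requests to test connectivity
ifconfig                  # show network interfaces and their IP addresses
netstat -tuln             # list listening TCP and UDP ports
ip addr                   # modern replacement for ifconfig
ss -tuln                  # modern replacement for netstat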
Text Processing Tools
❖ grep, awk, sed
Breadth: Discuss the flexibility these tools offer, and how they are often used
together to perform complex text processing tasks efficiently.
Practical Exercise
❖ Use grep to search for a specific string in a text file.
❖ Use awk to process a CSV file and extract specific fields.
❖ Use sed to find and replace text in a file.
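For example (the file names here are only illustrative):
grep "error" logfile.txt                         # print lines containing "error"
awk -F',' '{print $1, $3}' data.csv              # extract the first and third comma-separated fields
sed 's/foo/bar/g' notes.txt > notes_fixed.txt    # replace every "foo" with "bar"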
DAY 10
Basic Shell Scripting
❖ Writing your first Bash script
❖ Variables, loops, and conditionals
Why: Automation is a big part of Linux and DevOps, and basic scripting
helps you automate tasks.
Task Scheduling
❖ cron and at for automated task scheduling.
Why: Automation includes not just scripting but also scheduling tasks to
run unattended.
Breadth: Briefly touch upon other shell scripting languages and their use
cases, but focus on Bash for its ubiquity and ease of use for beginners.
Practical Exercise
❖ Write a basic Bash script that prints "Hello, World!".
❖ Modify that script to use a variable for the greeting.
❖ Write a Bash script that loops through numbers 1 to 10 and prints them.
❖ Create a Bash script that checks if a number is even or odd using conditionals.
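One hedged way to combine these pieces into a single script (the file name is illustrative):
#!/bin/bash
# hello.sh - a greeting variable, a loop, and an even/odd check

greeting="Hello, World!"
echo "$greeting"

for i in $(seq 1 10); do      # print numbers 1 to 10
    echo "$i"
done

read -p "Enter a number: " number
if (( number % 2 == 0 )); then
    echo "$number is even"
else
    echo "$number is odd"
fi

Make it executable with chmod +x hello.sh and run it with ./hello.sh.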
Practical Exercise
❖ Schedule a Bash script to run every minute using cron.
❖ Use at to run a script at a specific time in the future.
❖ Create a cron job that runs a script every day at a specific time.
❖ Experiment with scheduling different types of tasks, like sending emails or
running system checks.
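A hedged sketch of both schedulers (the script path is a placeholder; the at utility may need to be installed separately):
crontab -e                               # edit your user's cron table, then add lines such as:
# * * * * * /home/user/hello.sh          (run every minute)
# 30 7 * * * /home/user/hello.sh         (run every day at 7:30 AM)
echo "/home/user/hello.sh" | at 18:00    # run once today at 6 PM
atq                                      # list pending at jobs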
RESOURCES
Resource
Pomodoro Timer for Terminal on dev.to
Resource
BashBlaze on GitHub
DAY 11
Filesystem Hierarchy and Disk Management
Exercise
❖ Navigate through different directories and identify their purposes.
❖ Practice disk partitioning on a virtual machine.
❖ Monitor disk usage with df and du.
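A few commands to support this exercise (a hedged sketch; device names will differ on your VM):
ls /                  # top-level directories of the filesystem hierarchy (/etc, /home, /var, ...)
df -h                 # disk usage per mounted filesystem, human-readable
du -sh /var/log       # total size of a single directory
lsblk                 # block devices and partitions, useful before and after partitioning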
DAY 12
Logs and Monitoring
Exercise
❖ Browse through the logs in /var/log and interpret common entries.
❖ Configure rsyslog to handle custom logging.
❖ Use monitoring tools to identify system bottlenecks.
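Some hedged starting points (log file names vary by distro; journalctl applies only to systemd-based systems):
ls /var/log                      # see which logs your distro keeps
sudo tail -f /var/log/syslog     # follow new entries (RHEL-style systems use /var/log/messages)
journalctl -p err -n 20          # the last 20 error-level journal entries
top                              # a quick look at CPU and memory pressure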
PART 3
What's Next?
Up next, we'll delve into the Version Control System, another cornerstone
in the DevOps landscape. But before that, make sure you've got a good
handle on Linux. After all, a well-constructed building needs a solid
foundation.
DAY 13
Introduction to Version Control System
❖ Explanation.
❖ Types of VCS.
Why: Version control systems are a vital part of modern software
development. They allow teams to manage changes to code over time, enabling
collaboration and rollback capabilities.
In the earliest days, developers simply kept manual copies of their files to track changes.
While this worked for solo developers, it quickly broke down for teams.
Developers would overwrite each other's changes, leading to lost work and
bugs.
The first version control systems emerged in the 1970s and 1980s. These
centralized systems stored all code in a central repository. Developers would
pull the latest version, make changes locally, and push those changes back.
While this brought some order, issues remained. If the central repository went
down, no work could be done. And changes from multiple developers could
conflict when pushing.
The rise of distributed version control in the mid-2000s was a game changer.
In these systems, each developer has a full copy of the repository history.
Changes are made on local branches before being merged into the central
repo.
If the central repo fails, any local copy can restore it.
In short, the story of version control is the story of code itself - constantly
evolving and improving to enable human creativity at scale.
Types of VCS:
Local Version Control Systems (LVCS): These are simple databases that
maintain changes in files on a local system. They're essentially the
ancestors of modern VCS, best suited for solo projects.
Benefits:
1. Collaboration: Multiple developers can work on a project without stepping on each other's toes.
2. History: You can travel back in time, review changes, and understand the evolution of a project.
3. Backup: Every copy in DVCS is a full-fledged repository, reducing risks associated with failures.
4. Branching and Merging: Facilitates parallel development and seamless integration.
5. Accountability: With a VCS, you know who made what change and when.
Practical Exercise
❖ Blog Writing: Share your newfound knowledge with the world! Draft a blog post
elucidating the differences between centralized and distributed version control.
❖ What issues did early version control systems have that distributed systems aimed
to solve?
RESOURCES
Debunking Myths & Some Key Points
Additionally, many other platforms, like GitLab and Bitbucket, also utilize
Git for version control but provide different features and interfaces. Being
clear about what Git is (and isn't) allows you to make informed choices
about the platforms you choose to work with.
DAY 14
Setting Up and Basic Commands
❖ Configuring Git.
❖ Basic Commands.
Why: Version control systems are a vital part of modern software
development. They allow teams to manage changes to code over time, enabling
collaboration and rollback capabilities.
Configuring Git:
Before diving into commands, ensure you've set up Git with your personal
configurations. This helps in identifying who is making changes, especially
when collaborating.
Basic Commands:
1. git init: Initializes a new Git repository and starts tracking an existing
directory.
2. git clone: Creates a local copy of an existing repository, including its history.
3. git add: Adds changes in the working directory to the staging area.
4. git commit: Records a snapshot of the staged changes in the repository's history,
ready to be pushed to a remote repository.
Practical Exercise
1. Initialize and Commit:
❖ Create a new directory named my_git_project and navigate into it.
❖ Initialize it as a Git repository.
❖ Create a file named readme.md.
❖ Write a brief introduction about yourself in this file.
❖ Add and commit this file to your repository.
2. Clone and Explore:
❖ Clone a public repository from GitHub that you find interesting.
❖ Explore its contents, check the commit history, and examine a few commit
messages.
After doing the above exercises, write a brief blog post or journal entry detailing
your experience. Share what you've learned, what was challenging, and what you're
looking forward to next.
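A hedged walkthrough of exercise 1 (the name, email, and introduction text are placeholders):
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

mkdir my_git_project && cd my_git_project
git init                                   # initialize the repository
echo "Hi, I'm learning Git as part of 90 Days of DevOps." > readme.md
git add readme.md                          # stage the file
git commit -m "Add readme with a brief introduction"
git log --oneline                          # confirm the commit exists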
DAY 15
Remote Repositories, Branching, and Branching Strategies
❖ Remote Repositories.
❖ Branching.
Remote Repositories:
Remote repositories allow you to collaborate with others and keep a
backup of your local repository on a remote server. Platforms like GitHub,
GitLab, and Bitbucket offer hosting for remote repositories.
Basic Commands:
git remote: Used to view remote repositories connected to your local
repository.
git push: Pushes changes from the local repository to a remote repository.
git pull: Fetches changes from a remote repository and merges them into
the local repository.
git fetch: Retrieves changes from a remote repository but doesn't merge
them.
Branching:
Branching in Git is a powerful feature that allows developers to work on
different features, fixes, or experiments concurrently without affecting the
main codebase.
Basic Commands:
git branch: Lists all branches in the repository. The active branch is
indicated with an asterisk (*).
Practical Exercise
1. Pushing to a Remote:
❖ Fork a public repository on GitHub.
❖ Clone your fork to your local machine.
❖ Make a change in a file, add, and commit it.
❖ Push the change to your fork on GitHub.
2. Branch Creation and Switching:
❖ In the previously cloned repository, create a new branch named feature-x.
❖ Switch between main (or master) and feature-x multiple times to get a feel for it.
3. Simulating Collaboration:
❖ On the feature-x branch, make some changes to a file, add, and commit.
❖ Switch to the main branch and make changes to the same file, then add and commit.
❖ Try merging feature-x into main. Handle any merge conflicts that arise.
4. Experiment with Different Branching Strategies:
❖ Research a branching strategy from the GitKraken guide.
❖ Implement the chosen strategy in a new repository, simulating a workflow with
multiple features and releases.
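A hedged sketch of the branch and merge steps in exercises 2 and 3 (use master instead of main if that is your default branch):
git checkout -b feature-x              # create and switch to the new branch
git checkout main                      # switch back
git checkout feature-x                 # ...and forth

# On feature-x: edit a file, then
git add . && git commit -m "Change on feature-x"
git checkout main
# Edit the same file on main, then
git add . && git commit -m "Change on main"
git merge feature-x                    # Git reports a conflict if both edits touch the same lines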
BONUS
Organize a group activity with friends or classmates. Each member should clone the
same repository, create a feature branch, make changes, and then attempt to merge
them into the main codebase. Discuss merge conflicts and how to resolve them.
RESOURCES
DAY 16
Merging and Handling Conflicts
❖ Merging.
❖ Handling Conflicts.
❖ Why is Understanding Merging and Conflict Resolution Essential?
Merging:
In Git, merging is the process of integrating changes from one branch into
another. It's a primary way of combining the separate lines of
development. When you're done with a feature or fix in a branch, you'll
merge it into your main branch to roll out the change.
Basic Commands:
git merge [branch]: Merges the specified branch into the current branch.
Handling Conflicts:
Sometimes, when you attempt to merge, Git can't automatically combine
the changes because both branches have edited the same line of a file
differently. This results in a merge conflict. Thankfully, Git doesn't just leave
you stranded. It marks the problematic area in the file and asks you to
resolve it manually.
Commands:
1. Open the conflicted file and look for the conflict markers (<<<<<<<,
=======, and >>>>>>>).
2. Decide if you want to keep only your branch's changes, the other branch's
changes, or a combination of both.
3. Once resolved, remove the conflict markers and save the file.
4. Run git add [filename] to mark the conflict as resolved.
5. Complete the merge with git commit.
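Put together, the whole flow might look like this (a hedged sketch; the branch and file names are illustrative):
echo "original line" > shared.txt
git add shared.txt && git commit -m "Add shared.txt"

git checkout -b conflict-demo
echo "change from conflict-demo" > shared.txt
git commit -am "Edit shared.txt on conflict-demo"

git checkout main
echo "change from main" > shared.txt
git commit -am "Edit shared.txt on main"

git merge conflict-demo          # CONFLICT: both branches changed the same line
# edit shared.txt, keep what you want, delete the conflict markers, then:
git add shared.txt
git commit                       # completes the merge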
Practical Exercise
1. Simulate a Merge Conflict:
❖ Clone or use an existing repository.
❖ Create a new branch.
❖ In both the new branch and the main branch, make different changes to the same
line of a file.
❖ Attempt to merge the new branch into the main branch and observe the conflict.
❖ Resolve the conflict.
2. Collaborative Merge Challenge:
ADVANCED CHALLENGE:
RESOURCES
DAY 17
Advanced Rebasing, Git Log, Stash, Reset, and More
❖ Advanced Rebasing.
❖ Git Log.
❖ Stash.
❖ Reset.
❖ Cherry-Picking.
❖ Other Useful Concepts
Advanced Rebasing:
Rebasing is the process of moving or combining a sequence of commits to
a new base commit. It's a way to integrate changes from one branch into
another, without creating a merge commit.
Basic Commands:
git rebase [branch]: Replays the commits of the current branch on top of
[branch].
Git Log:
Git log shows a list of commits in a repository. It's a way to see the history
of your repo, allowing you to understand the changes and navigate
through them.
Commands:
git log: Show commit logs.
git log --oneline: Show commit logs in a concise format.
Stash:
Stashing takes the changes of the working directory and saves them for
later, allowing you to switch branches without committing your changes.
Commands:
git stash save "message": Save changes with a descriptive message.
git stash list: List stashed changes.
git stash apply: Apply the latest stashed changes.
Reset:
Git reset is used to undo changes by moving the current branch back to an
earlier commit, optionally updating the staging area and working directory to match.
Basic Commands:
git reset [commit]: Move the current branch tip to [commit].
git reset --hard [commit]: Move the current branch tip to [commit] and
match the working directory.
Cherry-Picking:
Cherry-picking in Git means to choose a commit from one branch and
apply it onto another.
Commands:
git cherry-pick [commit]: Apply changes introduced by the named
commit.
Practical Exercise
1. Rebase Playground:
❖ Create a new feature branch and make a few commits.
❖ Go to the main branch and make a few commits.
❖ Now, rebase the feature branch onto the main branch.
2. Stashing Challenge:
❖ Make some changes to a file without committing them. Stash the changes, switch to
another branch, then come back and apply the stash.
3. Reset Game:
❖ Commit three changes sequentially.
❖ Use git reset to go back to the first commit, simulating the need to undo the last
two changes.
4. Cherry-Picking Exercise:
❖ Commit a change in one branch.
❖ Use git cherry-pick to apply that change to another branch.
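A hedged command summary for these exercises (replace <commit-hash> with a real hash from git log):
git rebase main                  # replay the current feature branch's commits on top of main
git log --oneline --graph        # inspect the resulting history

git stash                        # set aside uncommitted changes
git stash list
git stash apply                  # bring the latest stash back

git reset --hard HEAD~2          # move the branch back two commits (discards those changes!)

git cherry-pick <commit-hash>    # apply a single commit from another branch onto the current one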
DAY 18
Engage, Reflect & Share: Best Practices and Collaboration
in Git
❖ Objective.
❖ Tasks: Reflect on Your Git Journey.
❖ Tasks: Design Your Git Cheat Sheet.
❖ Tasks: Learning in Public: Share on LinkedIn.
❖ Tasks: Engage with the Community.
❖ Why This Exercise?
Objective:
By the end of this session, learners will have a better understanding of
their own unique journey with Git and will have shared their insights, tools,
and best practices with a wider audience, fostering community learning
and collaboration.
PART 4
What's Next?
We'll dive deep (quite literally, to the ocean floors) and fly high (through
the satellite-studded skies) to explore how data packets navigate this
vast expanse.
1. Introduction to Networking
❖ What is Networking?
❖ Understanding Data Packets
2. Foundational Concepts
❖ IP Addressing and Subnetting
❖ Ports and Protocols
❖ OSI and TCP/IP Models
3. Network Devices and Infrastructure
❖ Routers, Switches, and Hubs
❖ Ethernet vs. Wi-Fi
❖ Network Topologies
4. Domain Name System (DNS)
❖ How DNS Works
❖ Importance of DNS in Web Traffic
5. Dynamic Host Configuration Protocol (DHCP)
❖ Role of DHCP in IP Assignment
❖ DHCP Lease Process
6. Network Protocols
❖ Transmission Control Protocol (TCP) vs. User Datagram Protocol (UDP)
15. Software-Defined Networking (SDN)
❖ Introduction to SDN
❖ Benefits and Use Cases
16. Network Automation
❖ The Need for Automation in Networking
❖ Basics of Infrastructure as Code (IaC)
17. Internet Protocols
❖ Understanding IPv4 vs. IPv6
❖ IP Address Allocation and Management
19. Content Delivery Networks (CDN)
❖ Understanding CDNs and Their Role in Web Traffic
RESOURCES
PART 5
What's Next?
Learning Python will arm you with a tool that can:
❖ Simplify Complex Tasks
❖ Enhance Problem Solving
❖ Bridge Communication Gaps
❖ Future-proof Your Career
DAY 20
Building Logic with Python: Why It Matters for DevOps
❖ Introduction.
❖ Simplify Complex Tasks:
❖ Enhance Problem Solving:
❖ Bridge Communication Gaps:
❖ Future-proof Your Career:
Introduction:
Firstly, what is logic in the context of programming? It's the ability to think
systematically, anticipate outcomes, and design sequences that produce
desired results. When you write a program, you're creating a series of
steps, a logical flow, for the computer to follow.
Now, transpose that concept to the vast world of DevOps. Here, every day
is about creating, managing, and optimizing workflows—be it deploying
applications, automating infrastructure, or ensuring CI/CD pipelines run
flawlessly. At its heart, DevOps is about efficient and effective workflows,
and what is a workflow if not a logical sequence?
NOTE
DAY 21
Introduction to Python and Tasks
❖ Introduction to Python.
❖ Tasks for the Day.
Introduction:
Python is an interpreted, high-level, general-purpose programming
language. Due to its simplicity and readability, it's become one of the most
popular languages for a wide range of tasks, including DevOps.
2. Subtask:
❖ After installation, open your terminal or command prompt and type python --
version to check the installed version.
❖ Study Data Types: Python has several built-in data types like integers, float
(decimal), string (text), list, tuple, etc.
❖ Spend some time getting familiar with them. The official Python documentation
is a good starting point.
4. Exploration:
❖ Spend some time exploring the Python interactive shell.
❖ Simply type python in your terminal or command prompt, and you'll enter a mode
where you can type Python commands directly.
❖ Try doing some basic arithmetic or define simple variables. To exit, you can
type exit().
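One way to work through these subtasks from the terminal (use python instead of python3 if that is how it is installed on your system):
python3 --version       # confirm the installed version
python3                 # start the interactive shell (REPL)
# Inside the REPL, try:
#   2 + 3 * 4
#   name = "DevOps"
#   print("Hello,", name)
#   exit()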
NOTE
DAY 22
Python Fundamentals and Tasks
❖ Python Fundamentals.
❖ Tasks for the Day.
Introduction:
Python, while known for its simplicity, has powerful features that allow for
complex operations and logic. This chapter will cover basic programming
constructs that are essential for any DevOps engineer.
2. Control Structures:
❖ These are the building blocks of any program. Begin with if, else, and elif.
❖ Subtask: Write a script that checks if a number is positive, negative, or zero.
3. Loops:
❖ Learn about the two main types of loops in Python: for and while.
❖ Subtask: Write a script that prints numbers from 1 to 10 using both types of
loops.
4. Functions:
❖ Functions allow for code reuse. Explore the def keyword to create a function.
❖ Subtask: Create a function named greet that takes a name as a parameter and prints
"Hello, [name]!".
8. Exploring Errors:
❖ Mistakes happen. When you encounter an error, Python will throw an exception.
❖ Subtask: Deliberately make a syntax error in your code (like missing a closing
parenthesis).
❖ Note the type of error and the message Python gives you.
❖ This will help you in troubleshooting in the future.
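To see the subtasks end to end, the commands below write a small Python script covering the conditional, loop, and function tasks and then run it (a hedged sketch; the file name is illustrative):
cat > day22_basics.py <<'EOF'
def greet(name):
    print(f"Hello, {name}!")

number = 7
if number > 0:                      # positive / negative / zero check
    print(f"{number} is positive")
elif number < 0:
    print(f"{number} is negative")
else:
    print("The number is zero")

for i in range(1, 11):              # print 1 to 10 with a for loop
    print(i)

if number % 2 == 0:                 # even or odd check
    print(f"{number} is even")
else:
    print(f"{number} is odd")

greet("DevOps learner")
EOF
python3 day22_basics.py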
NOTE
DAY 23
Python Libraries, Cloud Integration, and Data Parsing
❖ Introduction.
❖ Tasks for the Day.
Introduction:
Python offers an extensive ecosystem of libraries that can simplify complex
tasks. Coupled with the cloud and data parsing abilities, Python becomes
an indispensable tool for the modern DevOps professional.
❖ Subtask: Visit the YouTube resource provided and watch the introductory videos on
Python libraries.
❖ Subtask: Use the pip command to install the requests library: pip install
requests.
❖ Subtask: Skim through the Hashnode blog listed in the resources to familiarize yourself with
the integration of Python with cloud technologies.
❖ Subtask: Extend the script to modify the data and write it back to the JSON file.
❖ Subtask: Explore Python's csv module and write a script to read a mock CSV file.
❖ Subtask: Pick one library from the blog post, such as boto3 (for AWS) or docker,
and install it using pip. Skim through its documentation to understand its basic
functionalities.
6. Final Reflection:
❖ After diving deep into these topics, reflect on your learning journey.
❖ Subtask: List down three key takeaways from today's tasks. It can be something you
found interesting, something challenging, or any new idea you might have
encountered.
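A hedged sketch of the installation and JSON subtasks (the file and key names are illustrative):
pip install requests                                         # or: python3 -m pip install requests
python3 -c "import requests; print(requests.__version__)"    # confirm the library is importable

echo '{"environment": "dev", "replicas": 2}' > config.json
python3 - <<'EOF'
import json

with open("config.json") as f:        # read the JSON file
    data = json.load(f)

data["replicas"] = 3                  # modify the data

with open("config.json", "w") as f:   # write it back
    json.dump(data, f, indent=2)

print(data)
EOF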
RESOURCES
NOTE
PART 6
Containerization and
Automation with
Docker and Jenkins
In the realm of DevOps, learning Docker and automation is
vital. Docker streamlines application deployment, while
automation ensures efficiency and reliability, enabling faster
delivery and improved collaboration in the fast-paced world of
software development and IT operations.
DAY 24
Containerization and Automation with Docker and Jenkins
❖ Docker: Simplifying Containerization.
❖ What is a Container?
❖ Introduction to Docker in Layman's Language.
❖ What is an Image?
What is a Container?
Imagine you're trying to move your house. Instead of packing each item
individually and hoping they all fit well together in the truck, wouldn't it be
easier to pack an entire room into a giant box and transport that? When
you reach your new home, you just place these room-sized boxes where
they belong. Your house is set, without the hassles of figuring out which
item goes where.
Now, while Docker is arguably the most popular 'company' providing this
container service, it isn't the only one. There are other 'movers' in town. For
instance, Podman is another tool that allows you to manage containers.
Just like Docker, Podman lets you create, deploy, and run applications in
containers.
What is an Image?
Let's stretch our analogy a tad more. Before packing up your room into a
container, you take a photo of the room. This photo captures everything—
the arrangement, the items, the color of the walls. If ever in the future, you
want to recreate this exact room, you just look at the photo and set it up.
❖ This will help solidify your understanding and ensure you've grasped the basics.
2. Real-world Analogy Creation:
❖ The house-moving analogy was one perspective.
❖ Can you think of another real-world analogy that describes the relationship
between Docker, Containers, and Images?
DAY 25
Docker Installation and some key points
❖ Docker Installation.
❖ Note on Using Linux (EC2 Instance).
❖ Post-Installation Steps for Linux.
❖ Why Add Your User to the Docker Group?
❖ Tasks.
Docker Installation:
Docker is a platform-independent tool, which means it can be run on
various operating systems. However, the installation process varies slightly
depending on the OS. Let's walk through the primary installation methods
for the most popular platforms.
NOTE
Practical Exercise
1. Verify Docker Installation:
❖ Run the following command in your terminal or command prompt:
-> docker --version
❖ This command will display the installed version of Docker. Make sure the output
aligns with the version you installed.
Practical Exercise
2. Test Docker with Hello World:
❖ Docker provides a simple "Hello World" container, which, when run, sends a
Hello World message to your screen. This is a great way to confirm that Docker can
pull and run images on your machine.
❖ Upon execution, Docker will attempt to find the "hello-world" image locally. If
it's not present (which it won't be the first time you run this), Docker will
fetch the image from Docker Hub and then run it. You should see a message
confirming that your installation appears to be working.
❖ Run docker images afterwards. This will display a list of images, and you should
see the hello-world image listed.
❖ The more you interact with Docker commands, the more comfortable and efficient
you'll become. Don't hesitate to explore further, read the Docker documentation,
and experiment on your own.
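The commands this exercise refers to look like this:
docker run hello-world     # pulls the image on first run, then prints the welcome message
docker images              # the hello-world image should now appear in the list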
DAY 26
Delving into Docker Images and Containers
❖ Docker Images: The Blueprints of Containers.
❖ Running and Interacting with Containers.
❖ Common Commands and Their Nuances.
Practical Exercise
1. Pull and List Images:
❖ Pull a Basic Image: To fetch an image from Docker Hub, use the pull command.
-> docker pull ubuntu:latest
❖ List Your Images: Run docker images. This will display all images you've pulled
or built. You should see the ubuntu image you just pulled.
Practical Exercise
2. Run and Interact with an Ubuntu Container:
-> docker run -it ubuntu /bin/bash
❖ This command initializes a container from the Ubuntu image and lets you
interact with it using the bash shell. The -it flags allow interactive processes
by keeping STDIN open and attaching a terminal to the container.
❖ Inside the Container: Now, you're inside the container's environment. Try these
commands:
-> echo "Hello from inside the container!" - Outputs the provided string.
Practical Exercise
3. Explore Multiple Containers and Clean Up:
❖ Run Multiple Containers: Try pulling and running other images like nginx,
alpine, or httpd. Remember to use docker ps to manage your running containers.
❖ Interact with Multiple Containers: Run two different containers and switch
between their interactive shells.
❖ Clean-Up Exercise: Pull a few images, run containers from them, stop those
containers, and then remove both containers and images. This will familiarize you
with the full lifecycle of images and containers.
❖ Remember, these exercises will help you understand the practical side of
Docker. The more scenarios you experience, the better equipped you'll be in real-
world applications.
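As a hedged sketch of the clean-up flow (the image and container names are examples):
docker pull nginx
docker run -d --name web1 nginx       # run a container in the background
docker ps                             # list running containers
docker exec -it web1 /bin/sh          # open a shell inside it (type exit to leave)
docker stop web1
docker rm web1                        # remove the stopped container
docker rmi nginx                      # remove the image once no container uses it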
DAY 27
Into Docker Networking
❖ Introduction.
❖ Docker Networking Basics.
❖ Networking with standalone containers | Docker Docs.
❖ Tasks.
Introduction:
Docker, at its core, is about creating isolated environments for your
applications. But often, these isolated containers need to talk to each
other, to the host machine, or to the external world. Today, we dive into
how Docker handles networking and data storage.
NOTE
When using Docker Compose, services within the same Compose file are
automatically connected to a default network, allowing them to communicate.
You can also define custom networks that only some services are connected
to.
TASKS
Practical Exercise
1. Communication Between Containers:
❖ Run two containers inside the same custom bridge network.
❖ Install ping or curl utilities and test communication using container names.
2. Isolation:
❖ Run two containers on different networks and attempt communication.
❖ Observe the isolation provided by Docker's network modes.
3. Exploring Docker Network Drivers:
❖ Apart from bridge and host, explore other network drivers like overlay and
macvlan.
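A hedged sketch of the first two tasks (network and container names are illustrative; the alpine image ships with ping):
docker network create mynet
docker run -d --name app1 --network mynet alpine sleep 3600
docker run -d --name app2 --network mynet alpine sleep 3600
docker exec app1 ping -c 3 app2            # container names resolve on the same custom network

docker network create othernet
docker run -d --name app3 --network othernet alpine sleep 3600
docker exec app1 ping -c 3 app3            # fails: the containers are on different networks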
NOTE
DAY 28
Docker Data Management - Volumes and Bind Mounts
❖ Introduction.
❖ Understanding Docker's Storage Mechanics.
❖ Tasks.
Introduction:
For containers to be truly portable, they shouldn’t store data within them.
This is where Docker volumes come in. They provide the ability to store and
manage data outside of the container's lifecycle.
Bind Mounts: This relies on the directory structure of the host machine. A
directory (or file) on the host system is mounted into a container. It's
suitable for specific scenarios, such as developing applications where you
need to reflect changes immediately inside the container.
Practical Exercise
1. Create a Volume:
❖ -> docker volume create myvolume
2. Inspect the Volume:
❖ -> docker volume inspect myvolume
3. Run a Container with Volume Attached:
❖ -> docker run -d --name=mycontainer -v myvolume:/app nginx
4. Bind Mount a Host Directory:
❖ -> docker run -d --name=mybindcontainer -v /path/on/host:/path/in/container nginx
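To see that the volume outlives the container, a hedged follow-up to steps 1-3:
docker exec mycontainer sh -c 'echo persisted > /app/data.txt'    # write into the mounted volume
docker rm -f mycontainer                                          # remove the container entirely
docker run --rm -v myvolume:/app alpine cat /app/data.txt         # the data is still there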
KEY POINTS
NOTE
DAY 29
Docker Compose - Orchestrating Multi-Container
Applications
❖ Introduction.
❖ Understanding Docker Compose.
❖ Tasks.
❖ Real-World Task: WordPress Deployment.
Introduction:
In many real-world scenarios, applications aren't just a single container
but rather a set of interlinked containers that work together. For instance,
a web application might consist of a web server, a database, and a
cache. Managing such multi-container setups individually can be
cumbersome. Docker Compose offers a solution by allowing you to define
and run multi-container Docker applications.
Practical Exercise
1. Install Docker Compose:
❖ Depending on your OS, follow Docker's official documentation to install Docker
Compose.
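The steps that follow assume a docker-compose.yml exists in your working directory. As an illustrative sketch only (the services, images, and ports are placeholders, not the book's example):
cat > docker-compose.yml <<'EOF'
version: "3.8"
services:
  web:
    image: nginx:latest
    ports:
      - "8080:80"
  cache:
    image: redis:latest
EOF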
Practical Exercise
3. Run the Compose File:
❖ -> docker-compose up
4. Stop the Compose Services:
❖ -> docker-compose down
Practical Exercise
❖ Docker Compose Commands: Familiarize yourself with commands such as docker-
compose start, docker-compose stop, docker-compose ps, and docker-compose logs.
NOTE
DAY 30
Building Custom Docker Images - Dockerfile Basics
❖ Introduction to Dockerfile.
❖ Tasks.
❖ Real-World Task: Customized Database Image.
Introduction:
Dockerfile: It's a textual script that contains all the commands a user
would call to assemble an image. Using the docker build command,
Docker reads these instructions and constructs a final image.
Topics Covered:
Understanding Base Images: Starting point for building your image.
Dockerfile Directives: FROM, RUN, CMD, ADD, COPY, EXPOSE,
WORKDIR, ENV, and others.
Layering in Docker Images: Every instruction in a Dockerfile creates
a new layer in the image.
Tasks:
2. Building the Image: Use the following command to build your
Docker image:
❖ -> docker build -t flask-app:latest .
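The build command above assumes a Dockerfile in the current directory. A minimal illustrative Dockerfile for a small Flask-style app might look like this (the Python version, app file, and port are assumptions, not the book's example):
cat > Dockerfile <<'EOF'
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
EOF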
Real-World Task: Customized Database Image:
❖ You'll need SQL scripts to set up the database schema and load data. Create a
Dockerfile that starts from the official MySQL image, adds your scripts, and
has them run automatically when the container is first initialized.
Hands-On Exercises:
❖ Optimizing Image Size: Explore using different base images, such as Alpine
variants, to reduce the size of your final image. Compare the sizes using docker
images.
❖ Tagging and Pushing Images: Tag your custom images and push them to Docker Hub
or another registry. Understand the importance of tagging in versioning and
distribution.
DAY 31
Consolidation and Creation of a Docker Cheat Sheet
❖ Tasks.
❖ Bonus Task.
Tasks:
❖ For today's exercise, your mission is to create a comprehensive Docker cheat
sheet. This is a summary of all the important Docker commands and concepts you've
learned over the past days. This task serves multiple purposes:
1. Reinforcement:
❖ By revisiting all the Docker commands and concepts, you're reinforcing your
memory and understanding.
2. Quick Reference:
❖ Once you've made your cheat sheet, you can refer back to it anytime you need
to quickly remember a Docker command or concept.
NOTE
Bonus Tasks:
❖ If you're up for a challenge, try to make your cheat sheet interactive. For
instance, you could create a web page where clicking on a Docker command displays
its description and an example of its use.
PART 7
DAY 32
CI/CD - The Heartbeat of DevOps
❖ Understanding CI/CD - The Foundation of Modern DevOps.
❖ Key Concepts in CI/CD.
❖ Benefits of CI/CD:
❖ CI/CD in the Real World:
❖ Stepping into Jenkins - The Automation Maestro
❖ Setting Up and Getting Started with Jenkins:
❖ Your First Dive into Jenkins:
❖ Jenkins in the CI/CD World:
Tasks/Exercises:
1. Identify Stages in Your Workflow: Reflect on a past project or an ongoing one.
Can you identify stages that can be integrated into a CI/CD pipeline?
2. Tool Research: There are many tools in the CI/CD ecosystem. Make a list of
tools you've heard of or used (like Jenkins, Travis CI, CircleCI) and write a
short note on what each one offers.
DAY 33
Building and Managing Jobs in Jenkins
❖ Understanding Jenkins Jobs.
❖ Managing and Monitoring Jobs.
❖ Tasks/Exercises.
Tasks/Exercises:
RESOURCES
DAY 34
Jenkins Integrations and Plugins
❖ Introduction to Jenkins Plugins.
❖ Essential Jenkins Plugins.
❖ Integrating Jenkins with Version Control Systems (VCS).
Tasks/Exercises:
1. Plugin Exploration:
❖ Navigate to the 'Available' tab under 'Manage Plugins'. Browse and familiarize
yourself with some of the popular plugins available.
3. Notification Setup:
❖ Install a notification plugin (like Slack Notification). Integrate it with a
Slack workspace and set Jenkins to send a message upon successful builds.
RESOURCES
NOTE
DAY 35
Integrating Jenkins with Docker
❖ Why Integrate Jenkins with Docker?
❖ Setting up Jenkins inside a Docker Container.
❖ Integrating Jenkins with Version Control Systems (VCS).
❖ Running Jenkins in Docker.
❖ Granting Docker Access.
1. You can run Jenkins itself inside a Docker container. This ensures that
Jenkins always runs in the same environment.
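A commonly used way to do this, shown as a hedged sketch (the ports and volume name are typical defaults; adjust them to your setup):
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword   # initial admin password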
Tasks/Exercises:
RESOURCES
NOTE
DAY 36
Jenkins Declarative Pipelines: Deep Dive
❖ Introduction to Jenkins Pipelines.
❖ Declarative vs. Scripted Pipelines.
❖ Advantages of Declarative Pipelines.
❖ Creating Your First Declarative Pipeline.
❖ Tasks/Exercises.
1. Declarative Pipelines:
Structured and Simplified: Use a more rigid structure and offer a simpler,
more straightforward syntax.
Jenkinsfile: Typically begins with a pipeline block.
Syntax: Each section of the pipeline is predefined. You can't write
arbitrary Groovy code.
2. Scripted Pipelines:
Flexible and Complex: They are based on a subset of the Groovy scripting
language. Hence, you can use most Groovy language features.
Jenkinsfile: Starts with a node block.
Syntax: More flexibility, but with greater complexity.
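For reference, a minimal illustrative declarative Jenkinsfile, written here from the shell (the stage names and echo steps are placeholders, not a recommended pipeline):
cat > Jenkinsfile <<'EOF'
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
        stage('Test') {
            steps {
                echo 'Running tests...'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying...'
            }
        }
    }
}
EOF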
RESOURCES
Tasks/Exercises:
2. Pipeline Visualization:
❖ After running the job, view the Blue Ocean visualization. Notice how each
stage is visually represented.
3. Introduce Errors:
❖ Modify the Pipeline script to introduce errors intentionally, like a missing
steps block. Use the built-in linter to validate the script and notice the
feedback.
RESOURCES
DAY 37
Scaling Jenkins: Master-Agent Architecture
❖ Jenkins Master (Server).
❖ Jenkins Agent.
❖ Why Master-Agent Architecture?
❖ Pre-requisites for Setting up an Agent.
Jenkins Agent:
Agents are the executors. They do the heavy lifting:
Execution Role: Performs the actual steps defined in the job. From
cloning repositories, building the code, to tests and deployments.
Labelled Identity: Each agent has a unique label to distinguish it,
helping direct specific tasks or builds to particular agents.
Distribution: By adding more agents, Jenkins can distribute the
workload, running multiple jobs in parallel across different
environments.
Tasks/Exercises:
❖ Name your node, select 'Permanent Agent', and specify the number of executors.
❖ Enter the SSH key details and ensure the master can communicate with the
agent.
❖ Trigger the jobs and monitor their execution on the new agent.
3. Agent Environment Validation:
❖ Create a new Jenkins job.
❖ Design the job to print the environment variables (env command) and
installed software versions (like java -version and docker --version).
END OF PART ONE
Engage with your community!
Post updates, ask questions, and
share your progress using the
#90DaysOfDevOps hashtag.