Lab # 3
Objectives:
1. Use your existing knowledge of the Linux command line and C programming (PF, OOP, Data Structures) to
learn Linux Bash shell scripting and automate basic Linux jobs.
2. Be able to write an error-free Linux Bash shell script given a problem description.
Lab Tasks:
Strictly follow the content delivery strategy below. Ask students to take notes during the lab.
1st Hour
- Pre-Lab (up to 15 minutes)
- Explain the importance and history of shell scripting, and the difference between command-line use and shell scripting. (15 minutes)
- Ask students to type and run Task # 1 (15 minutes). Observe their weaknesses and note the basic scripting commands
they struggle with, which you can cover from Handout # 1 and/or Handout # 2.
2nd Hour
- Cover Handout # 2. A few of you may also cover details from Handout # 1 (keep this session within 45 minutes).
- Ask students to type and execute Task # 2 (15 minutes)
3rd Hour
- Devote the full hour to the in-lab problem. (60 minutes)
** ChatGPT, along with other Internet sources, was heavily used to create the contents of this document.
EXPERIMENT 3
Creating and Executing a Linux Bash Shell Script
With the widespread adoption of Linux, shell scripting became an essential tool for automating tasks, executing
multiple commands sequentially, and performing various system administration tasks. Bash scripting became
the most used scripting language on Linux systems due to its availability and compatibility with POSIX
standards. A Linux Bash shell script is a text file containing a series of commands written in the Bash
scripting language. Bash (Bourne Again Shell) is a popular command-line interpreter and scripting language
for Unix-like operating systems, including Linux. Bash scripts can incorporate control structures such as
loops and conditional statements, variables, functions, and command-line arguments to enhance their
functionality.
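The short script below is a minimal illustrative example (the file name greet.sh and the greeting text are arbitrary) that puts these building blocks together: a shebang line, a variable, a command-line argument, a conditional, a loop, and a function.

#!/bin/bash
# greet.sh - small demo of basic Bash building blocks (illustrative only)

greet() {                        # function definition
    echo "Hello, $1!"
}

name="${1:-world}"               # first command-line argument, with a default value

if [ "$name" = "world" ]; then   # conditional
    echo "No name given, using the default."
fi

for i in 1 2 3; do               # loop
    greet "$name ($i)"
done

Saved as greet.sh and made executable with chmod +x greet.sh, it can be run as ./greet.sh Ali.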
1. Bourne Shell (sh):
- Developed by Stephen Bourne at AT&T Bell Laboratories in the late 1970s.
- Became the default shell for Unix systems.
- Provided basic scripting capabilities with loops, conditionals, and command execution.
2. C Shell (csh):
- Developed by Bill Joy at the University of California, Berkeley, in the late 1970s.
- Featured a C-like syntax and interactive features such as command history and job control.
- Popular among users who preferred its interactive features.
Bash (Bourne Again Shell):
- Became the de facto standard shell for Linux systems due to its ubiquity and powerful scripting capabilities.
5. Other Shells:
- Several other shells exist, including tcsh (an enhanced version of C Shell), zsh (Z Shell), and fish (Friendly Interactive Shell), each with its own features and syntax.
#!/bin/bash
SRC_DIR=/path/to/source/directory
DST_DIR=/path/to/backup/directory

if [ ! -d "$DST_DIR" ]; then
    mkdir -p "$DST_DIR"
fi

if [ ! -d "$SRC_DIR" ]; then
    echo "Error: Source directory does not exist"
    exit 1
fi
Description
1. Variable Declarations:
- `SRC_DIR=/path/to/source/directory`: Defines a variable `SRC_DIR` containing the path to the source
directory.
- `DST_DIR=/path/to/backup/directory`: Defines a variable `DST_DIR` containing the path to the backup
directory.
4. Error Handling:
- `if [ ! -d "$SRC_DIR" ]; then`: Checks if the source directory (`$SRC_DIR`) does not exist.
- `echo "Error: Source directory does not exist"`: If the source directory does not exist, prints an error message.
- `exit 1`: Exits the script with a non-zero status code (1) to indicate an error.
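Note that the excerpt above only performs the checks and never copies anything. As a hedged sketch of the missing step (assuming a plain recursive copy is acceptable; cp -r and the timestamped sub-directory name are choices made here for illustration, not part of the original script), the backup itself could be added as:

# Copy the source directory into a timestamped sub-directory of the backup location
timestamp=$(date +%Y%m%d_%H%M%S)
cp -r "$SRC_DIR" "$DST_DIR/backup_$timestamp"
echo "Backup of $SRC_DIR stored in $DST_DIR/backup_$timestamp"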
#!/bin/bash
process_data() {
    input_file=$1
    output_file=$(cut -f 1,3 "$input_file" | grep 'foo' | sort -n)
    echo "$output_file"
}

for file in /path/to/files/*.txt; do
    processed_data=$(process_data "$file")
    echo "$processed_data" > "${file}_processed.txt"
done
Description
1. Function Definition (`process_data`):
- `process_data() { ... }`: Defines a function named `process_data` for processing input data.
2. Function Argument:
- `input_file=$1`: Assigns the first argument passed to the function (`$1`) to the variable `input_file`.
3. Data Processing:
- `output_file=$(cut -f 1,3 "$input_file" | grep 'foo' | sort -n)`: Processes the data from the input file using a
series of command-line tools (the variables are quoted so that file names with spaces and multi-line output are handled correctly):
- `cut -f 1,3 "$input_file"`: Extracts the first and third (tab-separated) fields from the input file using the `cut` command.
- `grep 'foo'`: Filters the extracted data to include only lines containing the string 'foo' using the `grep`
command.
- `sort -n`: Sorts the filtered data numerically using the `sort` command.
6. Function Call:
- `processed_data=$(process_data "$file")`: Calls the `process_data` function with the current file as an
argument (`"$file"`) and captures the output in the variable `processed_data`.
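To make the pipeline concrete, here is a small hypothetical example (the file sample.txt and its contents are invented for illustration); assuming tab-separated fields, the stages behave as follows:

# sample.txt (tab-separated):
#   10    alpha    foo
#   2     beta     foobar
#   7     gamma    bar
cut -f 1,3 sample.txt                            # fields 1 and 3: "10 foo", "2 foobar", "7 bar"
cut -f 1,3 sample.txt | grep 'foo'               # keep lines containing "foo": "10 foo", "2 foobar"
cut -f 1,3 sample.txt | grep 'foo' | sort -n     # sort numerically: "2 foobar", then "10 foo"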
In-Lab
Consider a web-server access log in which each line records one request, with the client IP address as the first field and the response size in bytes as the tenth field (as in the Apache common/combined log format). You are tasked with creating a shell script to analyze such log files. The script should perform the following tasks:
a. Accept the log file as a command-line argument.
b. Count the total number of requests.
c. Determine the number of unique IP addresses.
d. Identify the top 5 most frequent IP addresses.
e. Calculate the total size of data transferred.
f. Generate a summary report containing the results.
Sample solution:
#!/bin/bash
# Log File Analyzer
# Accept log file as input
log_file="$1"
# Count total number of requests
total_requests=$(wc -l < "$log_file")
# Determine number of unique IP addresses
unique_ips=$(awk '{print $1}' "$log_file" | sort -u | wc -l)
# Identify top 5 most frequent IP addresses
top_ips=$(awk '{print $1}' "$log_file" | sort | uniq -c | sort -nr | head -5)
# Calculate total size of data transferred
total_size=$(awk '{sum += $10} END {print sum}' "$log_file")
# Generate summary report
echo "Log File Analysis Report" > analysis_report.txt
echo "-------------------------" >> analysis_report.txt
echo "Total Requests: $total_requests" >> analysis_report.txt
echo "Unique IP Addresses: $unique_ips" >> analysis_report.txt
echo "Top 5 IP Addresses:" >> analysis_report.txt
echo "$top_ips" >> analysis_report.txt
echo "Total Size of Data Transferred: $total_size bytes" >> analysis_report.txt
echo "Report generated on: $(date)" >> analysis_report.txt
Post-Lab
You are tasked with creating a bash script that renames multiple files in a directory according to a
specified naming convention. The script should:
a. Accept two arguments: the directory path containing the files and the new file name pattern.
b. Rename each file in the directory by appending a sequential number to the new file name
pattern (e.g., `file1.txt`, `file2.txt`, etc.).
c. Preserve the original file extension during the renaming process.
d. Provide feedback to the user about the renaming process, including any errors encountered.
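One possible starting sketch for this renaming task is shown below; it assumes the new-name pattern is supplied without an extension and that every file has an extension, and all names in it are placeholders to be adapted and tested.

#!/bin/bash
# rename_files.sh - hypothetical sketch; usage: ./rename_files.sh <directory> <new-name-pattern>
dir="$1"
pattern="$2"

if [ ! -d "$dir" ]; then
    echo "Error: directory '$dir' does not exist"
    exit 1
fi

count=1
for file in "$dir"/*; do
    [ -f "$file" ] || continue                  # skip sub-directories
    ext="${file##*.}"                           # original extension (text after the last dot)
    new_name="$dir/${pattern}${count}.${ext}"
    if mv "$file" "$new_name"; then
        echo "Renamed: $(basename "$file") -> $(basename "$new_name")"
    else
        echo "Error: could not rename $(basename "$file")"
    fi
    count=$((count + 1))
done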
Develop a bash script to automate directory cleanup tasks by removing old files and directories. The
script should:
Create a bash script to monitor system resources and generate a report. The script should:
a. Collect information about CPU usage, memory usage, disk space, and network traffic.
b. Calculate average values for each resource over a specified time period.
c. Generate a report containing the collected data and average values.
d. Provide options for the user to customize the time period and output format of the report.
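As a hedged starting point for collecting single readings (averaging over a time period, network traffic, and report options are left to the student; the commands and fields below are one common choice on Linux, not the only one):

#!/bin/bash
# monitor_sketch.sh - illustrative resource snapshot
load_avg=$(awk '{print $1}' /proc/loadavg)        # 1-minute CPU load average
mem_used=$(free -m | awk '/^Mem:/ {print $3}')    # used memory in MB
disk_used=$(df -h / | awk 'NR==2 {print $5}')     # root filesystem usage percentage
echo "Load average: $load_avg | Memory used: ${mem_used} MB | Disk used: $disk_used"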