CONTENTS
MODULE I
INTRODUCTION TO SECURITY PRINCIPLES IN CLOUD
COMPUTING
EXP.NO: 01 Create a VPC using Cloud Shell
MODULE II
Strategies for Cloud Security Risk Management
EXP.NO: 02 Use reports to remediate findings
EXP.NO: 03 Create a role in Google Cloud IAM
MODULE III
Cloud Security Risks: Identify and Protect Against Threats
EXP.NO: 04 Access a firewall and create a rule
EXP.NO: 05 Identify vulnerabilities and remediation techniques
EXP.NO: 06 Change firewall rules using Terraform and Cloud Shell
EXP.NO: 07 Create symmetric and asymmetric keys
EXP.NO: 08 Determine the difference between normal activity and an incident
Exp. no: 01 Create a VPC using Cloud Shell
Date:
How to start your lab and sign in to the Google Cloud console
1. Click the Start Lab button. On the left is the Lab Details panel with the following:
Time remaining
The Open Google Cloud console button
The temporary credentials that you must use for this lab
Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.
Note: If the Choose an account dialog displays, click Use Another Account.
3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.
"Google Cloud username"
You can also find the Google Cloud username in the Lab Details panel.
4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"
You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.
Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.
Cloud Shell is an online development and operations environment accessible anywhere with your
browser. Cloud Shell provides command-line access to your Google Cloud resources.
1. Click Activate Cloud Shell at the top right of the Google Cloud console. You may be asked to click Continue.
After Cloud Shell starts up, you'll see a message displaying your Google Cloud Project ID for
this session:
2. A pop-up will appear asking you to Authorize Cloud Shell. Click Authorize.
3. Your output should now look like this:
Output:
ACTIVE: *
ACCOUNT: [email protected]
To set the active account, run:
$ gcloud config set account `ACCOUNT`
4. List the project ID with this command:
gcloud config list project
Example output:
[core]
project = qwiklabs-gcp-44776a13dea667a6
Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.
There are two types of VPC networks you can choose to create depending on your subnet
requirements. You can choose to create an auto mode or a custom mode VPC network. An auto
mode VPC automatically creates a subnet in each region for you while a custom mode VPC
provides you with the control to manually create subnets. Each new network that you create must
have a unique name within the same project. You can create up to four additional networks in a
project.
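As a local illustration of auto mode addressing: Google Cloud allocates auto mode subnets from the 10.128.0.0/9 block, one /20 per region. The following Python sketch (not part of the lab; the specific /20 shown is the typical us-central1 range) verifies that containment with the standard ipaddress module:

```python
import ipaddress

# Auto mode VPCs allocate their regional subnets from this block.
AUTO_MODE_BLOCK = ipaddress.ip_network("10.128.0.0/9")

# An example auto-created regional subnet (us-central1 typically uses 10.128.0.0/20).
regional_subnet = ipaddress.ip_network("10.128.0.0/20")

print(regional_subnet.subnet_of(AUTO_MODE_BLOCK))  # True: the /20 sits inside the /9
print(regional_subnet.num_addresses)               # 4096 addresses in a /20
```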
To create a custom mode network named labnet, run the following command in Cloud Shell:
gcloud compute networks create labnet --subnet-mode=custom
Example output:
NAME: labnet
SUBNET_MODE: CUSTOM
BGP_ROUTING_MODE: REGIONAL
IPV4_RANGE:
GATEWAY_IPV4:
Click Check my progress to verify that you have completed this task correctly.
When you create a subnet, its name must be unique in that project for that region, even across
networks. The same name can appear twice in a project as long as each one is in a different
region. Additionally, each subnet must have a primary IP address range, which must be unique
within the same region in a project.
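The uniqueness requirement above amounts to a no-overlap check between primary IP ranges in the same region. A small Python sketch (illustrative, not part of the lab) shows how such a conflict can be detected locally:

```python
import ipaddress

def ranges_conflict(cidr_a: str, cidr_b: str) -> bool:
    """Return True if two primary IP ranges overlap (not allowed in the same region)."""
    return ipaddress.ip_network(cidr_a).overlaps(ipaddress.ip_network(cidr_b))

# Overlapping ranges: the second subnet would be rejected in the same region.
print(ranges_conflict("10.0.0.0/24", "10.0.0.128/25"))  # True
# Disjoint ranges: both subnets can coexist.
print(ranges_conflict("10.0.0.0/24", "10.0.1.0/24"))    # False
```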
Create the labnet-sub subnet in the labnet network (replace REGION with your assigned region and RANGE with an IP range such as 10.0.0.0/28):
gcloud compute networks subnets create labnet-sub \
  --network=labnet \
  --region=REGION \
  --range=RANGE
This command creates a sub-network called labnet-sub.
2. Press ENTER.
Click Check my progress to verify that you have completed this task correctly.
List the networks in your project with the following command:
gcloud compute networks list
2. Press ENTER.
The output should list the default and labnet networks.
The default network was created when the project was created. The labnet network was created
by the gcloud command you ran earlier.
You can either list all subnets in all networks in your project, or you can show only the subnets
for a particular network or region. Auditing subnets ensures that the network is properly secured,
and helps identify any misconfigurations or potential security vulnerabilities in your VPCs, such
as subnets that might be unintentionally exposed to the public internet.
List the subnets in the labnet network with the following command:
gcloud compute networks subnets list --network=labnet
2. Press ENTER.
What is the name of the subnet in the labnet network?
labnet
default
labnet-sub
Conclusion
Great work! By completing this lab activity, you now have hands-on experience in setting up a test VPC network and subnet. This is the first step in creating a test environment, which will eventually help you secure the production environment that protects company data. You then confirmed the successful creation of the network and subnet.
Through observing the network and its subnetworks in the testing environment, you can gather
significant data for research. This data is highly beneficial when configuring and creating
security plans for the production environment.
Exp. no: 02 Use reports to remediate findings
Date:
How to start your lab and sign in to the Google Cloud console
Follow the same steps as in Experiment 01: start the lab, open the Google Cloud console, and sign in with the temporary credentials provided in the Lab Details panel.
Public bucket ACL (PUBLIC_BUCKET_ACL): This entry indicates that there is an Access
Control List (ACL) entry for the storage bucket that is publicly accessible which means that
anyone on the internet can read files stored in the bucket. This is a high-risk security
vulnerability that needs to be prioritized for remediation.
Bucket policy only disabled (BUCKET_POLICY_ONLY_DISABLED): This entry indicates
that uniform bucket-level permissions are not enabled on a bucket. Uniform bucket-level access
provides a way to control who can access Cloud Storage buckets and objects, simplifying how
you grant access to your Cloud Storage resources. This is a medium-risk vulnerability that must
also be remediated.
Bucket logging disabled (BUCKET_LOGGING_DISABLED): This entry indicates that there
is a storage bucket that does not have logging enabled. This is a low-risk vulnerability that you
are not required to remediate in this scenario.
Note: If the Public bucket ACL or Bucket policy only disabled are not listed or don't display
any active findings, you may have to wait a few minutes and refresh. Wait until these
vulnerabilities display active findings before continuing.
3. In the Security Command Center menu, click Compliance. The Compliance page
opens.
4. In the Google Cloud compliance standards section, click View details in the CIS
Google Cloud Platform Foundation 2.0 tile. The CIS Google Cloud Platform
Foundation 2.0 report opens.
5. Click on the Findings column to sort the findings and display the active findings at the
top of the list.
Which of the following rules in the report have active findings for the Cloud Storage bucket?
Select all that apply.
Firewall rules should not allow connections from all IP addresses on TCP or SCTP port 22
Note: If the active findings for the Public bucket ACL or Bucket policy only disabled don't
display as 0 (zero) after you have successfully remediated the vulnerabilities, you may have to
wait a few minutes and refresh.
Click Check my progress to verify that you have completed this task correctly.
Throughout this lab, you have gained practical experience in identifying and prioritizing threats
using the Security Command Center. You also remediated the vulnerabilities identified for your
project, and generated a report to confirm that the vulnerabilities have been remediated.
By remediating the vulnerabilities and ensuring the compliance status of the Cloud Storage
bucket, you’ve helped your organization to prevent data breaches, unauthorized access, and data
loss.
Exp. no: 03 Create a role in Google Cloud IAM
Date:
How to start your lab and sign in to the Google Cloud console
Follow the same steps as in Experiment 01: start the lab, open the Google Cloud console, and sign in with the temporary credentials provided in the Lab Details panel. (This lab provides two usernames; sign in with Google Cloud username 1.)
In this task, you'll create a custom role for the audit team at Cymbal. You'll then grant the custom
role restricted access for viewing the database contents.
1. In the Google Cloud console, in the Navigation menu, click IAM & Admin > Roles. The Roles page opens.
2. On the Explorer bar, located near the top of the Roles page, click + Create Role.
3. In the Create Role dialog, specify the following settings and leave the remaining settings
as their defaults:
4. Click + Add permissions. The Add permissions dialog box opens.
5. In the Filter permissions by role field, type Firebase Realtime.
6. In the results drop-down field, select the Firebase Realtime Database Viewer checkbox.
7. Click OK.
8. Under Filter, select
the firebase.clients.list and firebasedatabase.instances.list checkboxes to add these
permissions to the custom role.
9. Click Add.
10. In the Create Role dialog, click Create.
Property Value (type or select)
Title Audit Team Reviewer
ID CustomRole
Role launch stage General Availability
Each custom role can be given a role launch stage which reflects the different phases of a role's
development, testing, and deployment. These stages help users understand the current state of a
role and its suitability for various use cases.
There are several launch stages in Google Cloud. The three primary role launch stages you
should know about are:
Alpha: Roles in the Alpha stage are typically experimental and may undergo significant changes.
They are not recommended for production environments. Users can provide feedback on alpha
roles to influence their development.
Beta: Roles in the Beta stage are more mature than alpha roles but might still receive updates
and improvements based on user feedback. They are considered suitable for certain non-
production scenarios but may not be fully stable.
General Availability (GA): Roles that have reached General Availability have undergone
thorough development, testing, and refinement. They are considered stable, reliable, and suitable
for widespread use in production environments. GA roles have been extensively reviewed and
are intended to provide consistent and dependable behavior.
The new role should now be created and added to the existing roles in the project.
Click Check my progress to verify that you have completed this task correctly.
Note: For the purposes of this lab, you’ll grant the new role to Google Cloud username
2 provided in the Lab Details panel.
1. In the Google Cloud console, in the Navigation menu, click IAM & Admin > IAM. The IAM page opens.
2. On the View By Principals tab, click Grant access. The Grant access dialog window
will open.
The Grant access dialog box is a crucial component of the IAM system in Google Cloud. It provides you with the ability to precisely define and manage permissions for users, groups, and service accounts.
3. Copy the Google Cloud username 2: Username 2 and paste it into the New
principals field.
4. Expand the Select a role drop-down menu, select Custom, and then select Audit Team
Reviewer. This is the role you created in the previous task.
5. Click Save.
The custom role should now be assigned to the user.
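Conceptually, granting a role adds a binding to the project's IAM allow policy: a role name paired with a list of principals. The following Python sketch (illustrative; the project ID, role ID, and user are hypothetical) models that structure:

```python
def grant_role(policy: dict, role: str, member: str) -> None:
    """Add `member` to the binding for `role`, creating the binding if needed."""
    for binding in policy["bindings"]:
        if binding["role"] == role:
            if member not in binding["members"]:
                binding["members"].append(member)
            return
    policy["bindings"].append({"role": role, "members": [member]})

policy = {"bindings": []}
# Hypothetical custom role ID and user, mirroring the lab's grant step.
grant_role(policy, "projects/my-project/roles/CustomRole", "user:[email protected]")
print(policy["bindings"][0]["members"])  # ['user:[email protected]']
```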
Click Check my progress to verify that you have completed this task correctly.
In this task, you'll use Google Cloud's Policy Analyzer to create a query to check the roles
granted to the user.
1. In the Google Cloud console, in the Navigation menu, click IAM & Admin > Policy Analyzer. The Policy Analyzer page opens.
2. In the Analyze policies section, on the Custom Query tile, click Create custom query.
A pop-up may appear at the top left Google Cloud menu with the text “Click on the menu anytime to find solutions for your business”. Select Got it and proceed to the next step.
3. In the Set the query parameters section, expand the Parameter 1 drop-down menu and
select Principal.
4. Copy the Google Cloud username 2: Username 2 and paste it into the Principal field.
5. Click Continue.
6. In the Advanced options for query results section, select the List resources within
resource(s) matching your query checkbox.
7. Click Analyze and then select Run query in the drop-down menu.
The results should return the role granted to the user.
CONCLUSION
Great work! You have successfully utilized IAM to create a custom role, grant access to a user
for that role, and verified the permissions within Google Cloud. Cymbal Bank's audit team can
now begin working on their database audit using the custom role you created.
IAM defines who has access to which resources based on their role. It is critical for managing
digital identities in an organization's environment and will be integral in your work as a cloud
security analyst.
By using IAM services, you are well on your way to effectively managing access and
permissions to storage resources.
Exp. no: 04 Access a firewall and create a rule
Date:
How to start your lab and sign in to the Google Cloud console
Follow the same steps as in Experiment 01: start the lab, open the Google Cloud console, and sign in with the temporary credentials provided in the Lab Details panel.
Task 1. Create a firewall rule
In this task, you'll create a firewall rule that allows HTTP and SSH connectivity. You will also
specify a target tag for the newly created firewall rule.
In Google Cloud, firewall rules must specify targets to define which VM instances they apply
to. Target tags can be used to apply a firewall rule to a specific group of VMs, helping simplify
the management of firewall rules. You'll use target tags to enable this firewall rule to the web
server only.
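The target-tag behavior described above can be modeled as a simple membership check: a rule with target tags applies to a VM only if the VM carries at least one of those tags, while a rule with no target tags applies to every VM in the network. A Python sketch (illustrative, not part of the lab):

```python
def rule_applies(rule_target_tags: set[str], vm_tags: set[str]) -> bool:
    """A firewall rule with target tags applies only to VMs carrying one of them.
    A rule with no target tags applies to every VM in the network."""
    return not rule_target_tags or bool(rule_target_tags & vm_tags)

# The lab's rule targets the web server via the http-server tag.
print(rule_applies({"http-server"}, {"http-server", "env-test"}))  # True
print(rule_applies({"http-server"}, {"db-server"}))                # False
print(rule_applies(set(), {"db-server"}))                          # True (no targets)
```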
3. On the toolbar, click + Create Firewall Rule. The Create a firewall rule dialog
displays.
4. Specify the following, and leave the remaining settings as their defaults:
Field Value
Name allow-http-ssh
Logs On
Network vpc-net
5. Click Create.
Note: Wait until the Successfully created firewall rule "allow-http-ssh" message displays
before continuing.
Click Check my progress to verify that you have completed this task correctly.
In this task, you'll generate HTTP network traffic to the web server by visiting its external IP
address. The network traffic you generate will then be recorded as logs that you can analyze in
the Logs Explorer.
(Alternatively, you can open http://EXTERNAL_IP/ in a new browser window or tab, replacing EXTERNAL_IP with the web server's external IP address.) A default web page should display.
Next, you need to find the IP address of the computer you’re using.
4. Find your IP address by visiting whatismyip.com. The page will display your IP address.
Note: Ensure that the IP address only contains numerals (IPv4) and is not represented in
hexadecimal (IPv6).
5. Copy the IP address and save it in a notepad. You’ll need to use this in the next task.
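To confirm that the address you saved is IPv4 rather than IPv6, you can check it locally with Python's ipaddress module (an optional check, not part of the lab; the addresses shown are documentation examples):

```python
import ipaddress

def ip_version(addr: str) -> int:
    """Return 4 or 6 for a valid IP address string."""
    return ipaddress.ip_address(addr).version

print(ip_version("203.0.113.7"))  # 4 -> suitable for this lab
print(ip_version("2001:db8::1"))  # 6 -> obtain an IPv4 address instead
```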
Click Check my progress to verify that you have completed this task correctly.
In this task, you'll access and analyze the VPC Flow Logs for the web server using the Logs
Explorer.
Entries from the subnetwork logs will display on the Query results pane to the right of the Log
fields pane.
4. On the Log fields pane, in the Log name section,
select compute.googleapis.com/vpc_flows to access the VPC Flow logs for the network.
If this option doesn’t display, wait a few minutes for this log type to show up.
Once selected, entries from the VPC Flow Logs display on the Query results pane.
5. In the Query builder at the top of the page, at the end of line 2, press ENTER to create a
new line.
6. On line 3, enter the following:
jsonPayload.connection.src_ip=YOUR_IP
resource.type="gce_subnetwork"
log_name="projects/PROJECT_ID/logs/compute.googleapis.com%2Fvpc_flows"
jsonPayload.connection.src_ip=YOUR_IP
7. Replace YOUR_IP with the IP address you saved from Task 2. This query will search for
network traffic logs originating from your IP address that you had generated in the
previous task.
8. Click Run query. The query results should display on the Query results pane.
Note: If the vpc_flows filter option doesn’t display or if there are no logs, you might have to wait
a few minutes and refresh. If after a couple of minutes, the vpc_flows filter option still doesn’t
display, navigate to the Compute Engine page and click on the External IP of the web server a
few times to generate more traffic and check back on the vpc_flows filter option.
Here you can examine the details about the network connection to the web server, such as the source and destination IP addresses, ports, and protocol.
In this task, you'll create a new firewall rule that denies traffic from port 80.
Field Value
Name deny-http
Logs On
Network vpc-net
Action on match Deny
5. Click Create.
Click Check my progress to verify that you have completed this task correctly.
jsonPayload.connection.src_ip=YOUR_IP DENIED
9. Replace YOUR_IP with the IP address you saved from Task 2. This query will search for firewall logs that denied your IP address connection to the web server. Your query should resemble the following:
resource.type="gce_subnetwork"
log_name="projects/PROJECT_ID/logs/compute.googleapis.com%2Ffirewall"
jsonPayload.connection.src_ip=YOUR_IP DENIED
10. Click Run query. The query results should display on the Query results pane.
11. In the Query results pane, expand one of the log entries.
12. Within the log entry, expand the jsonPayload field by clicking the expand arrow >.
Then, expand the connection field. You can examine the details about the network
connection to the web server to verify if the firewall rule was successfully triggered:
dest_ip - This is the destination IP address of the web server which is 10.1.3.2.
dest_port - This is the destination port number of the web server which is HTTP port 80.
protocol - The protocol is 6 which is the IANA protocol for TCP traffic.
src_ip - This is the source IP address of your computer.
src_port - This is the source port number that's assigned to your computer.
disposition - This field indicates whether the connection was allowed or denied. Here,
it's denied which indicates that the connection to the server was denied.
13. Within the log entry, expand the rule_details field by clicking the expand arrow >. You
can examine the details about the firewall rule. Additionally, you can extract more
information from the following fields in the log entry by expanding them:
action - The action taken by the rule, DENY in this case.
direction - The rule's traffic direction can be either ingress or egress. Here it is INGRESS, which means the rule applies to incoming traffic.
ip_port_info - The protocol and ports this rule controls. The ip_protocol and port_range fields list TCP port 80.
source_range - The traffic sources that the firewall rule is applied to. Here it is 0.0.0.0/0.
target_tag - This lists all the target tags that the firewall rule applies to. Here, it is http-server,
the target tag you added to the firewall rule in the previous task.
By examining the details of this firewall log entry, you should notice that the deny-http firewall rule you set up to deny HTTP traffic was successfully triggered. This rule denied incoming network traffic on port 80.
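The log fields described above can also be processed programmatically. The following Python sketch (illustrative; the sample values are made up, but the field names match the log entry structure described above) extracts the key connection details, treating IANA protocol number 6 as TCP:

```python
import json

# A trimmed, illustrative firewall log entry using the fields described above.
entry = json.loads("""
{
  "jsonPayload": {
    "connection": {"src_ip": "198.51.100.4", "src_port": 51234,
                   "dest_ip": "10.1.3.2", "dest_port": 80, "protocol": 6},
    "disposition": "DENIED",
    "rule_details": {"action": "DENY", "direction": "INGRESS",
                     "source_range": ["0.0.0.0/0"], "target_tag": ["http-server"]}
  }
}
""")

conn = entry["jsonPayload"]["connection"]
# IANA protocol number 6 is TCP.
proto = "TCP" if conn["protocol"] == 6 else str(conn["protocol"])
print(f'{conn["src_ip"]} -> {conn["dest_ip"]}:{conn["dest_port"]} '
      f'({proto}) {entry["jsonPayload"]["disposition"]}')
# 198.51.100.4 -> 10.1.3.2:80 (TCP) DENIED
```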
Click Check my progress to verify that you have completed this task correctly.
CONCLUSION
You now have practical experience in creating and testing firewall rules for a web server in a cloud environment. By creating firewall rules and analyzing log entries, you have gained familiarity with the intricacies of perimeter protection. This is useful for monitoring and analyzing potential security incidents or threats, which is an essential part of a security analyst's role.
You’re well on your way to understanding how to modify firewall rules to ensure maximum
network security.
Exp. no: 05 Identify vulnerabilities and remediation techniques
Date:
How to start your lab and sign in to the Google Cloud console
Follow the same steps as in Experiment 01: start the lab, open the Google Cloud console, and sign in with the temporary credentials provided in the Lab Details panel.
1. On the Google Cloud console title bar, click the Activate Cloud Shell icon. If prompted, click Continue.
2. Copy the following command into the Cloud Shell terminal:
gcloud compute addresses create xss-test-ip-address --region="REGION"
This command creates a static IP address named xss-test-ip-address in the REGION region. This
static IP will be used for scanning the vulnerable web application.
3. Press ENTER.
If prompted, click Authorize.
4. Copy the following command into the Cloud Shell terminal:
gcloud compute addresses describe xss-test-ip-address \
--region="REGION" --format="value(address)"
5. Press ENTER.
6. Copy the IP address from the output and save it in a notepad. You’ll need to use this in a
later task.
7. Copy the following command into the Cloud Shell terminal:
gcloud compute instances create xss-test-vm-instance \
  --address=xss-test-ip-address --no-service-account \
  --no-scopes --machine-type=e2-micro --zone="ZONE" \
  --metadata=startup-script='apt-get update; apt-get install -y python3-flask'
8. Press ENTER.
Note: The startup script will install python3-flask, which provides Flask, a web application framework used for running a simple Python application. This application demonstrates a cross-site scripting (XSS) vulnerability, which is a common web application security vulnerability.
Click Check my progress to verify that you have completed this task correctly.
First, you’ll create a firewall rule that will allow Web Security Scanner to access the vulnerable
application.
This command creates a firewall rule that allows access to the web application from any source
IP address. This allows the Web Security Scanner to access the vulnerable application and
perform a scan.
2. Press ENTER.
Next, use an SSH connection to connect to the VM instance.
6. A pop-up may appear asking you to allow SSH in-browser to connect to VMs.
Click Authorize.
Now, extract the web application files.
7. Copy the following command into the SSH-in-browser page (not in Cloud Shell):
gsutil cp gs://cloud-training/GCPSEC-ScannerAppEngine/flask_code.tar . && tar xvf flask_code.tar
This command downloads and extracts the vulnerable web application files.
8. Press Enter.
9. Finally, copy the following command into the SSH-in-browser page:
python3 app.py
Click Check my progress to verify that you have completed this task correctly.
A Cymbal Bank corporate banking portal with a web form should appear.
3. Copy the following HTML code including the script tags into the web form:
<script>alert('This is an XSS Injection to demonstrate one of OWASP vulnerabilities')</script>
The alert window opens with the following message: “This is an XSS Injection to demonstrate
one of OWASP vulnerabilities”.
7. In the Google Cloud console, click the Navigation menu > View All Products.
8. Select Security > Web Security Scanner.
If the Web Security Scanner API is enabled then the Cloud Web Security Scanner page
displays the Scan configs details.
In the Starting URLs section, the Starting URLs 1 field should be pre-populated with your
static IP address.
11. Add a colon and the port number 8080 at the end of the IP address. The Starting URL
1 should resemble the following:
http://<EXTERNAL_IP>:8080
17. When the scan is complete, return to the Google Cloud console.
Note: The scan might take 5-10 minutes to complete.
The Results tab should indicate the cross-site scripting vulnerabilities, demonstrating how Web Security Scanner can detect an XSS vulnerability.
The vulnerabilities can also be found in the Vulnerabilities tab under the Security Command Center.
Click Check my progress to verify that you have completed this task correctly.
The recommendation for fixing the current vulnerabilities is to validate and escape untrusted
user-supplied data, which also points to the corresponding OWASP® rules.
You will do this by editing the code of the vulnerable application to include lines of code that
validate and escape the user-supplied data.
3. Copy the following command into the SSH-in-browser terminal to open the application code in the nano editor:
nano app.py
4. Press ENTER.
5. To fix the XSS vulnerability, you validate the output string variable. The output string is
the processed output of the user-supplied web form input.
Ensure that the application does not accept user input as HTML code; instead, it should escape special characters supplied by user input. To do this, locate the two lines that set the output string:
    # output_string = "".join([html_escape_table.get(c, c) for c in input_string])
    output_string = input_string
6. Remove the # symbol from the first line, and add it to the beginning of the next line
(ensure that you indent your code properly.) The final lines must resemble the following:
@app.route('/output')
def output():
    output_string = "".join([html_escape_table.get(c, c) for c in input_string])
    # output_string = input_string
    return flask.render_template("output.html", output=output_string)
7. Press CTRL + X to exit nano, then Y to confirm, and then ENTER to save your changes.
8. Copy the following command into the SSH-in-browser terminal:
python3 app.py
9. Press ENTER.
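The fix above escapes markup-significant characters so the browser renders them as text instead of executing them. Python's standard library offers the same behavior via html.escape, which the lab's hand-built html_escape_table mirrors (illustrative sketch, not part of the lab steps):

```python
import html

# The XSS payload used earlier in this lab.
payload = "<script>alert('XSS')</script>"

# html.escape replaces &, <, >, and quote characters with HTML entities.
safe = html.escape(payload)
print(safe)
# &lt;script&gt;alert(&#x27;XSS&#x27;)&lt;/script&gt;
```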
Click Check my progress to verify that you have completed this task correctly.
1. Return to the Cloud Web Security Scanner page in the Google Cloud console.
2. Click Run to re-run the scan.
Note: The scan might take 5-10 minutes to complete.
The Results tab should now indicate that there are no vulnerabilities found.
Click Check my progress to verify that you have completed this task correctly. Be sure you wait
until the scan completes to get credit for completing this task.
CONCLUSION
Through this lab, you gained practical experience in scanning for application vulnerabilities. You
learned the importance of a security analyst's ability to scan for application vulnerabilities, which
is essential for helping identify and address potential weaknesses, managing risks, meeting
compliance requirements, and ultimately, maintaining a robust security posture to protect an
organization’s assets and reputation.
By closing security gaps and addressing weaknesses, you can help prevent potential exploitation,
minimize the impact of security incidents, and maintain compliance with industry regulations.
In this lab, you completed one of the fundamental aspects of proactive cybersecurity strategies.
Exp. no: 06 Change firewall rules using Terraform and Cloud Shell
Date:
How to start your lab and sign in to the Google Cloud console
1. Click the Start Lab button. On the left is the Lab Details panel with the following:
Time remaining
The Open Google Cloud console button
The temporary credentials that you must use for this lab
Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.
Note: If the Choose an account dialog displays, click Use Another Account.
3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.
"Google Cloud username"
You can also find the Google Cloud username in the Lab Details panel.
4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"
You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.
Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.
4. Press ENTER.
When the command completes, your prompt should resemble the following:
student_01_c2e095df84e2@cloudshell:~/cloudshell_open/docs-examples/firewall_basic
(qwiklabs-gcp-04-fde36f013e65)$
5. Copy the following command into the Cloud Shell terminal to list the contents of the
directory:
ls
You should notice that several files in the directory have been
downloaded: backing_file.tf, main.tf, motd, and tutorial.md.
6. Copy the following command into the Cloud Shell terminal to analyze the configuration
of the firewall rule:
cat main.tf
7. Press ENTER.
The main.tf file is the configuration file that defines the resources that Terraform will create.
Two resources will be created: a google_compute_firewall firewall rule named
test-firewall-${local.name_suffix}, with rules that allow ICMP traffic and TCP traffic on ports
80, 8080, and 1000-2000, and a google_compute_network VPC network named
test-network-${local.name_suffix}. The ${local.name_suffix} variable is a local variable that
automatically generates unique names for resources.
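Based on the description above, the resource definitions in main.tf resemble the following sketch (the resource names, protocols, and ports are taken from the lab text; other details, such as the source_tags value, are assumptions, so check the actual file with cat main.tf):

```hcl
# Sketch of main.tf (illustrative; your downloaded file is authoritative).
resource "google_compute_network" "default" {
  name = "test-network-${local.name_suffix}"
}

resource "google_compute_firewall" "default" {
  name    = "test-firewall-${local.name_suffix}"
  network = google_compute_network.default.name

  # Allow ICMP traffic.
  allow {
    protocol = "icmp"
  }

  # Allow TCP traffic on ports 80, 8080, and 1000-2000.
  allow {
    protocol = "tcp"
    ports    = ["80", "8080", "1000-2000"]
  }

  # Assumed detail: restrict the rule to instances tagged "web".
  source_tags = ["web"]
}
```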
Which one of the following protocols are being modified (allow/deny) on the firewall using the
Terraform main.tf file in Cloud Shell Editor?
icmp, web
test-firewall, test-network
web, test-network
icmp, tcp
Submit
In this task, you'll deploy a new VPC network and a new firewall rule. This task provides
hands-on experience with building a VPC network and subnets.
Note: Run the following commands in sequence in the Cloud Shell terminal.
2. Press ENTER.
3. Copy the following command into the Cloud Shell terminal:
terraform init
4. Press ENTER.
The output should return a message stating that Terraform has been successfully initialized.
Take a moment to examine the output. You'll notice that Terraform will create a new firewall
and VPC network:
5. Once the initialization is complete, copy the following command into the Cloud Shell
terminal:
terraform apply
This command applies the changes and deploys the Terraform script.
6. Press ENTER.
Note: If an Authorize Cloud Shell dialog box appears, click Authorize to grant permission to
use your credentials for the gcloud command.
7. Terraform will prompt you to Enter a value. Type "yes", and press ENTER.
This will start creating the VPC network and firewall rules.
Once it’s completed, the output should return the following message:
Click Check my progress to verify that you have completed this task correctly.
1. In the Google Cloud console, from the Navigation menu ( ), select VPC network >
VPC networks. The VPC networks page opens.
2. You should notice two VPC networks, default and the newest one you just created, test-
network. Click test-network to access the VPC network details.
3. Click Firewalls. Use the expand arrow to expand vpc-firewall-rules. Under Protocols
and ports and Action, you should notice that the firewall rules are the same rules as defined
in the configuration file: Allow, with tcp:80, 1000-2000, 8080 and icmp.
Note: To ensure that resource names are unique, both the test-network and test-firewall names
will be dynamically appended with a unique identifier. For example,
test-network-curly-penguin. This unique identifier is generated automatically by the
${local.name_suffix} local variable, which is defined in the configuration file. This helps
prevent resource naming conflicts and ensures the proper organization of infrastructure
components.
CONCLUSION
You've successfully built a VPC network and subnet using Terraform and Cloud Shell. This
lab provides the foundation for developing advanced automated solutions that system
administrators can use with Terraform.
By creating the VPC network and firewall, you have gained a better understanding of how it
enables you to automate the process of provisioning and modifying firewall rules. This helps
establish consistency across various environments, while also helping reduce the chance of
human error.
Exp. no: 07 Create symmetric and asymmetric keys
Date:
How to start your lab and sign in to the Google Cloud console
1. Click the Start Lab button. On the left is the Lab Details panel with the following:
Time remaining
The Open Google Cloud console button
The temporary credentials that you must use for this lab
Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.
Note: If the Choose an account dialog displays, click Use Another Account.
3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.
You can also find the Google Cloud username in the Lab Details panel.
4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"
You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.
Note: Be sure to select the specified region, otherwise you get an error.
7. Click Create.
8. In the Name and protection level section, in the Key name field, enter demo-key.
The Protection level should be set to Software by default, if not, select it now.
Symmetric keys are commonly used to encrypt sensitive data before storage or transmission.
When data needs to be accessed or shared, the same symmetric key is used to decrypt the
encrypted content, ensuring that only authorized parties can access the original information.
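The "same key encrypts and decrypts" property can be sketched with a minimal toy cipher in Python (a repeating-key XOR, for demonstration only; it is not a secure cipher and is unrelated to the Cloud KMS key you just created):

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key; because XOR is its own
    # inverse, the exact same function both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                      # the shared symmetric key
plaintext = b"sensitive customer record"
ciphertext = xor_cipher(plaintext, key)   # encrypt before storage/transmission
recovered = xor_cipher(ciphertext, key)   # decrypt with the SAME key
assert recovered == plaintext
```

Anyone holding the key can recover the plaintext, which is why symmetric keys must be shared only with authorized parties.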
Click Check my progress to verify that you have completed this task correctly.
In this task, you'll create an asymmetric key with specific settings, including that of its algorithm
and protection level.
Asymmetric keys can also be used for digital signatures. Digital signatures help verify the
authenticity and integrity of messages, files, or software, ensuring that they have not been
tampered with during transmission. Digital signatures use two keys, one for signing which
involves the user's private key, and one for verifying signatures which involves the user's public
key. The output of the signature process is called the digital signature.
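The sign-with-private-key / verify-with-public-key flow can be sketched with textbook RSA and Python's standard library (a toy with tiny numbers for illustration only; real asymmetric keys, such as those Cloud KMS manages, are thousands of bits long and use padded signature schemes):

```python
import hashlib

# Tiny, insecure textbook-RSA key pair for illustration.
p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    # Signing uses the PRIVATE key (d).
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Verifying uses the PUBLIC key (e, n).
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

signature = sign(b"release-v1.0.tar.gz")
assert verify(b"release-v1.0.tar.gz", signature)            # authentic
assert not verify(b"release-v1.0.tar.gz", (signature + 1) % n)  # tampered signature fails
```

Only the private-key holder can produce a signature that the public key verifies, which is what makes digital signatures useful for proving authenticity and integrity.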
Click Check my progress to verify that you have completed this task correctly.
CONCLUSION
Great work! Through this lab activity, you have gained practical experience in creating both
symmetric and asymmetric keys, which play a crucial role in ensuring secure data and
communication over networks.
Having created both types of keys, you now have a better understanding of their significance in
cryptography. Your newfound ability to create these keys allows you to assist customers in
securely storing large amounts of data.
Exp. no: 08 Determine the difference between normal activity and an
incident
Date:
How to start your lab and sign in to the Google Cloud console
1. Click the Start Lab button. On the left is the Lab Details panel with the following:
Time remaining
The Open Google Cloud console button
The temporary credentials that you must use for this lab
Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.
Note: If the Choose an account dialog displays, click Use Another Account.
3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.
You can also find the Google Cloud username in the Lab Details panel.
4. Copy the Google Cloud password below and paste it into the Welcome dialog. Click Next.
"Google Cloud password"
You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.
1. In the Google Cloud console, in the Navigation menu ( ), click IAM & Admin > IAM.
The IAM page opens.
On the View By Principals tab, note the two student users that have been automatically
configured for the qwiklabs.net organization. These two users are also the same users listed in
the Lab details panel as Google Cloud username 1 and Google Cloud username 2.
These two users have automatically been granted owner roles to the lab project by a service
account as part of a normal provisioning process. This will trigger an alert finding or incident
because an external principal has an owner role. However, because both users belong to the
qwiklabs.net organization, this alert is considered normal activity. You will examine this alert
finding later.
2. On the View By Principals tab, click Grant Access. The Grant access dialog displays.
3. Under the Add principals section, in the New principals field,
type [email protected].
4. Expand the Select a role drop-down menu, select Basic, and then select Owner.
5. Click Save.
You have now assigned the owner role to the external user [email protected]. This
will trigger a finding in SCC because this user is outside of the qwiklabs.net organization.
Click Check my progress to verify that you have completed this task correctly.
1. In the Google Cloud console, in the Navigation menu ( ), click Security > Findings.
The Findings page opens.
You should notice three findings with high severities listed in the Finding query results panel.
In this lab, you’ll examine two Persistence: IAM anomalous grant findings to determine
whether the finding is normal activity or whether it is malicious.
Note: If the Persistence: IAM anomalous grant findings are not listed, you may have to wait a few
minutes and refresh. Wait until both these active findings display before continuing.
The Persistence: IAM anomalous grant indicates that an anomalous IAM grant was detected.
This means that a user or service account was granted access to a resource that they should not
have had access to. This could be a potential indication of a malicious actor attempting to gain
unauthorized access to your environment.
Next, filter the findings to display a list of Persistence: IAM anomalous grant category
findings.
2. In the Quick filters panel, in the Category section, select the checkbox for the Persistence: IAM
anomalous grant category.
Note: Selecting attributes with quick filters automatically adds them to the query. Notice that the Query
preview is updated with the Persistence: IAM anomalous grant category you selected. You can locate
specific findings or groups of findings by editing the findings query.
3. Click the Event time column header to sort the findings in ascending order, so that the
earliest finding is at the top.
Task 3. Analyze the findings
In this task, you'll examine these findings to determine which is normal activity and which is a
genuine incident.
1. In the Findings query results panel, in the Category column, click the Persistence:
IAM Anomalous Grant finding with the earliest event time. The Persistence: IAM
Anomalous Grant dialog opens on the Summary tab, which displays the finding
summary.
2. Find the Principal email row. This is the user account that granted the owner role to the
user. Notice that the service account belongs to the qwiklabs.net organization. With this
information, you can establish that this finding represents normal and expected activity.
3. Click the Source Properties tab, and expand properties > sensitiveRoleGrant >
members. Again, the email address listed for principalEmail is the user that granted the
owner role, and the email address(es) listed for members is the user that was granted the
owner role.
Which user was granted the owner role in the earliest Persistence: IAM Anomalous Grant finding record?
Submit
Next, you'll locate the malicious activity associated with the external user account you had
granted access to: [email protected].
1. In the Google Cloud console, in the Navigation menu ( ) click Logging > Logs Explorer.
The Logs Explorer page opens. (You may need to click More Products to expand
the Navigation menu options and locate Logging under Operations.)
2. Copy the following query into the Query builder at the top of the page:
protoPayload.authorizationInfo.permission="resourcemanager.projects.setIamPolicy"
protoPayload.methodName="InsertProjectOwnershipInvite"
3. Click Run query. The query results should display on the Query results pane.
4. In the Query results pane, expand the audit log listed for your project.
5. Click Expand nested fields. All the nested fields contained in the log are made visible.
You can now examine the details of the anomalous request event including information such as:
Which user account made the request to grant the project owner role to the [email protected] user?
Submit
1. In the Google Cloud console, in the Navigation menu ( ), click IAM & Admin > IAM.
The IAM page opens.
2. Next to the [email protected] user, click the Edit principal ( ) icon. The Edit
permissions page opens.
3. Click the Delete ( ) icon to delete the owner role.
4. Click Save.
The policy will be updated, and the owner role removed from the [email protected]
user.
Click Check my progress to verify that you have completed this task correctly.
CONCLUSION
Great work! Through this lab activity, you have gained practical experience in analyzing a
security alert to determine whether it is a genuine malicious activity.
You did this by granting permissions to an external user, viewing the Event Threat Detection
findings in the Security Command Center, and accessing the findings in Cloud Logging. Finally,
you remediated the finding by removing the project owner role from the external user.
As a security analyst, these are skills that can enable you to quickly take steps to contain,
mitigate, and remediate any threats.
Exp. no: 09 Explore false positives through incident detection
Date:
How to start your lab and sign in to the Google Cloud console
1. Click the Start Lab button. On the left is the Lab Details panel with the following:
Time remaining
The Open Google Cloud console button
The temporary credentials that you must use for this lab
Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The lab Sign in page opens in a new browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.
Note: If the Choose an account dialog displays, click Use Another Account.
3. If necessary, copy the Google Cloud username 1 below and paste it into the Sign
in dialog. Click Next.
You can also find the Google Cloud username 1 in the Lab Details panel.
4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"
You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials from the left panel. Do not use your Google Cloud
credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.
In this task, you’ll create a service account and grant it permissions sufficient to trigger an
anomalous threat finding in SCC.
1. In Google Cloud console, in the Navigation menu ( ), click IAM & Admin > Service
Accounts.
2. In the action bar, click + Create Service Account.
3. In the Service account details section, in the Service account name field, enter test-account.
4. In the Grant this service account access to project section, expand the Select a
role drop-down menu, select Basic, and then select Owner.
5. Click Continue, and then click Done.
Notice the test-account service account listed in the Service accounts list.
Click Check my progress to verify that you have completed this task correctly.
In this task, you’ll create and download a JSON authentication key for the new service account
you created in the previous task. You’ll then use Cloud Shell to upload that key to your Google
Cloud account. This will trigger a threat finding in SCC.
1. Still on the Service Accounts page, inline with the test-account service account,
click Actions ( ) > Manage keys. The test-account page opens.
2. In the Keys section, click Add Key > Create new key.
3. In the Create private key dialog, set the Key type to JSON.
4. Click Create.
The console prompts you to download the key to your local device. Once downloaded,
you’ll use Cloud Shell to upload the key to your Google Cloud (student) account.
5. On your local device, navigate to the key file you just downloaded and rename it
test-account.
6. In the Google Cloud console, click the Activate Cloud Shell ( ) icon.
7. Click Continue.
It should only take a few moments to provision and connect to the Cloud Shell
environment.
8. In the Cloud Shell title bar, click More ( ) > Upload > Choose Files.
9. Navigate to and select the file on your local machine, and then in the Upload dialog,
click Upload.
10. Copy the following command into the Cloud Shell terminal:
ls
This command lists the key file you just uploaded.
Click Check my progress to verify that you have completed this task correctly.
In this task, you’ll reconfigure the Cloud Shell environment to use the new test-account service
account that you created in Task 1. This will trigger a threat finding in SCC. Then, you’ll assign
excessive permissions to the lab project.
2. Press ENTER.
3. Copy the following command into the Cloud Shell terminal:
gcloud auth list
This command confirms that you activated the service account, and that gcloud is using this
service account.
4. Press ENTER.
ACTIVE: *
ACCOUNT: test-account@"Google Cloud project ID".iam.gserviceaccount.com
5. Copy the following commands into the Cloud Shell terminal:
export STUDENT2="Google Cloud username 2"
gcloud projects add-iam-policy-binding $PROJECT_ID --member user:$STUDENT2 --role roles/editor
This command grants the editor role to user 2 so that you can access and remediate the false
positive finding in the next task.
6. Press ENTER.
Click Check my progress to verify that you have completed this task correctly.
1. In the Google Cloud console, click on the user icon in the top-right corner of the screen,
and then click Add account.
2. Navigate back to the Lab Details panel, copy the Google Cloud username 2: Google
Cloud username 2 and password. Then, paste the username and password into the
Google Cloud console Sign in dialog.
Note: Make sure you are on the username 2: Google Cloud username 2 Google Cloud console.
In this task, you’ll locate and examine the SCC finding generated by the service Event Threat
Detection. This finding is a false positive that was triggered by the activity you generated in
Tasks 1-3.
The Findings query results panel updates to display only the selected finding category.
3. In the Findings query results panel, display the details of the finding by clicking the
most recent (see Event time) User managed service account key in
the Category column. The details panel for the finding opens and displays
the Summary tab.
Leave the User managed service account key page open to answer the following questions.
High
Critical
Medium
Low
Submit
Vulnerability
Misconfiguration
Observation
Threat
Submit
4. Which tab in the User managed service account key page provides compliance standards,
explanation of the threat, and a recommendation on how to handle the threat?
Summary
Source Properties
JSON
Submit
In this task, you'll remediate the false positive by deleting the JSON authentication key for
the test-account service account.
1. In Google Cloud console, in the Navigation menu ( ), click IAM & Admin > Service
Accounts.
2. On the Service accounts page, click the email address of the test-account service
account.
3. Click the Keys tab.
4. From the list of keys, click the Delete service account key ( ) icon to delete the key.
A pop-up will appear asking you to confirm the action. Click Delete.
Click Check my progress to verify that you have completed this task correctly.
CONCLUSION
You have completed this lab! You used SCC to investigate a false positive and took action to
remediate it. As a cloud security analyst, you'll likely encounter false positive alerts. It's
important to understand how and why false positive alerts are triggered and how you can take
action to remediate them.
Date:
How to start your lab and sign in to the Google Cloud console
1. Click the Start Lab button. On the left is the Lab Details panel with the following:
Time remaining
The Open Google Cloud console button
The temporary credentials that you must use for this lab
Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.
Note: If the Choose an account dialog displays, click Use Another Account.
3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.
You can also find the Google Cloud username in the Lab Details panel.
4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"
You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.
Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.
Activate Cloud Shell
Cloud Shell is an online development and operations environment accessible anywhere with your
browser. Cloud Shell provides command-line access to your Google Cloud resources.
1. Click Activate Cloud Shell ( ) at the top right of the Google Cloud console. You may
be asked to click Continue.
After Cloud Shell starts up, you'll see a message displaying your Google Cloud Project ID for
this session:
3. A pop-up will appear asking you to Authorize Cloud Shell. Click Authorize.
4. Your output should now look like this:
Output:
ACTIVE: *
ACCOUNT: [email protected]
To set the active account, run:
$ gcloud config set account `ACCOUNT`
5. List the project ID with this command:
gcloud config list project
Example output:
[core]
project = qwiklabs-gcp-44776a13dea667a6
Note: For full documentation of gcloud, in Google Cloud, refer to the gcloud CLI overview
guide.
In this task, you'll create and delete cloud resources to generate account activity which you'll
access as Cloud Audit Logs.
2. Press ENTER.
Click Check my progress to verify that you have completed this task correctly.
The activity you generated in the previous task was recorded as audit logs. In this task you'll
export these logs to a BigQuery dataset for further analysis.
1. In the Google Cloud console, in the Navigation menu ( ) click Logging > Logs
Explorer. The Logs Explorer page opens. (You may need to click More Products to
expand the Navigation menu options and locate Logging under Operations.)
2. When exporting logs, the current filter will be applied to what is exported. Copy the
following query into the Query builder:
logName = ("projects/Project ID/logs/cloudaudit.googleapis.com%2Factivity")
3. Click Run query. The query results should display on the Query results pane. This
query filters for Cloud Audit logs within your project.
4. Under the Query editor field, click More actions > Create sink. The Create logs
routing sink dialog opens.
Note: If your browser window is narrow, the UI may display More instead of More actions.
5. In the Create logs routing sink dialog, specify the following settings and leave all other
settings at their defaults:
Sink details: Click Next.
Sink destination: Uncheck the Use Partitioned Tables checkbox, if it is already selected, and
click Next.
Choose logs to include in sink: Notice the pre-filled Build inclusion filter:
logName=("projects/[PROJECT ID]/logs/cloudaudit.googleapis.com%2Factivity"). Click Next.
Click Create Sink.
Return to the Logs Explorer page.
6. In the Logging navigation pane, click Log Router to view the AuditLogsExport sink in
the Log Router Sinks list.
7. Inline with the AuditLogsExport sink, click More actions ( ) > View sink details to
view information about the AuditLogsExport sink you created. The Sink details dialog
opens.
8. Click Cancel to close the Sink details dialog when you're done viewing the sink
information.
All future logs will now be exported to BigQuery, and the BigQuery tools can be used to perform
analysis on the audit log data. The export does not export existing log entries.
Click Check my progress to verify that you have completed this task correctly.
Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.
In this task, you'll create and delete cloud resources to generate additional account activity which
you'll then access in BigQuery to extract additional insights from the logs.
These commands generate more activity to view in the audit logs exported to BigQuery.
2. Press ENTER.
When prompted, enter Y, and press ENTER. Notice you created two buckets and deleted a
Compute Engine instance.
3. When the prompt appears after a few minutes, continue by entering the following
commands into the Cloud Shell terminal:
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID
4. Press ENTER.
Notice you deleted both buckets.
Click Check my progress to verify that you have completed this task correctly.
In this task, you'll review the Admin activity logs generated in the previous task. Your goal is to
identify and apply filters to isolate logs that may indicate suspicious activity. This will enable
you to export this subset of logs and streamline the process of analyzing them for potential
issues.
Admin Activity logs record the log entries for API calls or other administrative actions that
modify the configuration or metadata of resources. For example, the logs record when VM
instances and App Engine applications are created and when permissions are changed.
Note: You can view audit log entries in the Logs Viewer, Cloud Logging, and in the Cloud SDK.
You can also export audit log entries to Pub/Sub, BigQuery, or Cloud Storage.
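For orientation, the fields you'll filter on in the next steps sit inside an audit log entry roughly like this (an abridged, illustrative sketch with placeholder values; real entries contain many more fields):

```json
{
  "logName": "projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity",
  "resource": { "type": "gcs_bucket" },
  "protoPayload": {
    "serviceName": "storage.googleapis.com",
    "methodName": "storage.buckets.delete",
    "authenticationInfo": {
      "principalEmail": "Google Cloud username 1"
    }
  }
}
```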
7. Within this entry, click on the storage.googleapis.com text, and select Show matching
entries. The Query results should now display only six entries related to created and
deleted cloud storage buckets.
8. In the Query editor field, notice that the
protoPayload.serviceName="storage.googleapis.com" line was added to the query builder.
This filters your query to only the entries matching storage.googleapis.com.
9. Within those query results, click storage.buckets.delete in one of the entries, and
select Show matching entries.
Notice another line was added to the Query builder text:
logName = ("projects/"PROJECT_ID"/logs/cloudaudit.googleapis.com%2Factivity")
protoPayload.serviceName="storage.googleapis.com"
protoPayload.methodName="storage.buckets.delete"
The Query results should now display all entries related to deleted Cloud Storage buckets. You
can use this technique to easily locate specific events.
10. In the Query results, expand a storage.buckets.delete event by clicking the expand
arrow > next to the line:
11. Expand the authenticationInfo field by clicking the expand arrow > next to the line:
Notice the principalEmail field, which displays the email address of the user account that
performed this action. This is the user 1 account you used to generate the user activity.
Note: Make sure you are on the username 2: Google Cloud username 2 Google Cloud console.
You've generated and exported logs to a BigQuery dataset. In this task, you'll analyze the logs
using the Query editor.
Note: When you export logs to a BigQuery dataset, Cloud Logging creates dated tables to hold
the exported log entries. Log entries are placed in tables whose names are based on the entries'
log names.
3. In the Explorer pane, click the expand arrow beside your project, Google Cloud project
ID. The auditlogs_dataset dataset is displayed.
Note: If auditlogs_dataset is not displayed, reload your browser window.
Next, verify that the BigQuery dataset has appropriate permissions to allow the export writer to
store log entries.
This permission is assigned automatically when log exports are configured, so this is a
useful way to check that log exports have been configured.
12. In the Untitled tab of the query builder, delete any existing text and copy and paste the
following command:
SELECT
timestamp,
resource.labels.instance_id,
protopayload_auditlog.authenticationInfo.principalEmail,
protopayload_auditlog.resourceName,
protopayload_auditlog.methodName
FROM
`auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND
CURRENT_DATE()
AND resource.type = "gce_instance"
AND operation.first IS TRUE
AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
ORDER BY
timestamp,
resource.labels.instance_id
LIMIT
1000;
This query returns the users that deleted virtual machines in the last 7 days.
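The same query can also be run from Cloud Shell with the bq CLI instead of the Query editor. A minimal sketch with an abbreviated column list (the dataset name is the one used in this lab):

```shell
# Run a standard-SQL query against the exported audit log tables.
bq query --use_legacy_sql=false '
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.methodName
FROM
  `auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  resource.type = "gce_instance"
  AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
LIMIT 10'
```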
14. Replace the previous query in the Untitled tab with the following:
SELECT
timestamp,
resource.labels.bucket_name,
protopayload_auditlog.authenticationInfo.principalEmail,
protopayload_auditlog.resourceName,
protopayload_auditlog.methodName
FROM
`auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND
CURRENT_DATE()
AND resource.type = "gcs_bucket"
AND protopayload_auditlog.methodName = "storage.buckets.delete"
ORDER BY
timestamp,
resource.labels.bucket_name
LIMIT
1000;
This query returns the users that deleted Cloud Storage buckets in the last 7 days. You should
see two entries, corresponding to the activity you generated in the previous tasks as user 1.
Click Check my progress to verify that you have completed this task correctly.
CONCLUSION
Great work! You have successfully queried in Logs Explorer. You then exported logs and
created a dataset that you analyzed in BigQuery.
You have shown how you can use audit logs and filter for types of malicious activity and then
further analyze those logs in BigQuery as a way to analyze the threats.
Date:
Follow the standard steps described at the start of this manual to start your lab and sign in to the Google Cloud console using the temporary credentials from the Lab Details panel. Do not use your own Google Cloud account credentials.
1. In the Google Cloud console, click the Navigation menu ( ) > Backup and DR. (You
will have to click More Products and then scroll down to find Backup and DR in
the Operations section).
2. From the left navigation pane, click Management console.
3. In the Log in to the management console section, click Log in to the management
console.
4. If asked to Choose an account, click your Google Cloud Username: USERNAME.
5. Skip the Welcome to Google Backup and DR! tour. The Backup and DR management
console opens.
6. In the Backup and DR management console titlebar, click Manage > Appliances.
If the management server and the Backup and Recovery server are successfully installed, the
Connectivity status has a green check.
Note: If the Update Status is Pending (yellow exclamation point), an update is waiting for
installation. You can ignore this and continue to your next task.
1. In the Backup and DR management console titlebar, click Backup Plans > Templates,
and then click +Create Template.
2. In the Template field, set the template name to vm-backup.
Note: Template names are text strings. The only allowed special characters are spaces,
underscores (_), and dashes (-).
5. In the Create/Edit Policy section, set the following fields and leave all other settings at
their defaults:
Field Value
Scheduling Continuous
Every 2 Hour(s)
Note: The Scheduling policy type can be either Windowed or Continuous. The default
is Windowed:
• Windowed defines a discrete snapshot backup schedule adhering to a specific frequency and
time window.
• Continuous defines a continuous snapshot backup schedule
Click Check my progress to verify that you have completed this task correctly.
Every appliance has a dedicated service account attached to it, created during appliance
deployment in the project where the appliance was deployed. For appliances installed on
version 11.0.2 and higher, a corresponding cloud credential for this service account is
automatically created at the time of appliance deployment.
The name of the cloud credential is the appliance name followed by the suffix -sa. For
example, if the name of the backup/recovery appliance is bur-appliance-us-east1, then the name
of the appliance's corresponding cloud credential is bur-appliance-us-east1-sa.
1. Return to the Google Cloud console, in the Navigation menu ( ), click IAM &
Admin > IAM.
2. In the Name column, find the service account attached to your backup appliance. The
service account's name should be Service account for backup and recovery appliance.
3. In the Role column, notice that the Backup and DR Cloud Storage Operator role is
already assigned.
Task 4. Discover and add Compute Engine instances to the management console
In this task, you’ll use the onboarding wizard to onboard your Compute Engine instances.
Onboarding an instance means you attach the template to the instance.
5. Click Search.
The matching instances are listed in the search results. You may have to scroll down to view them:
lab-vm
qwiklabs-appliance
6. Select the lab-vm Compute Engine instance for backup, and then click Next.
Note: If no instances or only one instance appears, ensure that the selected zone matches the zone
where your Compute Engine instance (lab-vm) is located or running.
7. In the Enable backups for Compute Engine VM instances? page, select the lab-
vm and then set the following:
11. Click Finish to complete the onboarding process. This triggers the backup of the
selected Compute Engine instances based on the Policy Template you attached.
12. Click Finish to confirm your intent to finish.
After onboarding is complete, the Status shows a green check. This means the policy template is
attached to the selected VM.
Note: Backup and DR ensures that the chosen Compute Engine instances get backed up at the
frequency you set in the backup policy.
13. In the Backup and DR management console titlebar, click Monitor > Jobs.
You can monitor the progress of the backup job. When the job is finished, you have an image
that you can restore if needed.
If the jobs list is empty, the backup job has either not started or has already completed. Use
different filter options to populate the jobs list, for example, the Succeeded or All filters.
Filter results are listed in the Jobs list.
Click Check my progress to verify that you have completed this task correctly.
Now that you have an image of your Compute Engine instance, in this task, you’ll create a brand
new Compute Engine instance using the backup image that you created in the previous task.
1. From the Backup and DR management console titlebar, click Backup &
Recover > Recover.
2. Click the name of the Compute Engine instance you want to recover (lab-vm) to select it.
Click Next.
3. In the action bar, click Table. In the Images list, one image is displayed because there
has only been one backup image created.
4. Select the image and click Mount.
Note: Typically, the Mount panel has many selection choices that allow you to choose where
and how to restore an image. In this lab, you may have only one timeline option as you just
created the first backup.
Note: The job may take five minutes or longer depending on the region you selected.
To view the recovered VM, go to the Google Cloud console, in Navigation menu ( ),
click Compute Engine > VM instances to view three VM instances:
lab-vm
lab-vm-recovered
qwiklabs-appliance
Click Check my progress to verify that you have completed this task correctly.
You can also create a brand new Compute Engine instance in a different project from backup
images.
Note: Before you set the default service account as a Principal in a different project, you must
add the default service account as a Principal in the target project.
To restore a Compute Engine instance to an alternate project, you first add the service account of
project 1 as a principal to Google Cloud project 2 and then recover the instance in Google Cloud
project 2:
1. In the Google Cloud console, in the Navigation menu ( ), click IAM & Admin > IAM.
2. In the list of principals, find and copy the email of the Service account for backup and
recovery appliance to use in Step 6. The email is similar to the following: qwiklabs-
[email protected].
3. In the Google Cloud console, click the Project selection drop-down. If the project lists
only one project, click All to open the All tab.
4. Search for Google Cloud project ID 2:PROJECT_ID_2 and then click to select that
project ID. You are now in the Permissions page for Google Cloud project ID
2:PROJECT_ID_2.
5. Click Grant access.
6. In the Add principals section, in the New principals field, paste the email address of the
service account of Google Cloud project 1, named Service account for backup and
recovery appliance. It should still be in your clipboard.
7. In the Assign roles section:
Click Select a role and assign the Backup and DR > Backup and DR Compute
Engine Operator role.
Click +Add Another Role.
Click Select a role and assign the Backup and DR > Backup and DR Cloud
Storage Operator role.
8. Click Save.
You’ve added the service account of Google Cloud project 1 as a principal to Google Cloud
project 2. You can now recover the instance in Google Cloud project 2.
Click Check my progress to verify that you have completed this task correctly.
Conclusion
Great work! You successfully used Google Backup and DR Service to create a backup template
and then applied it to two Compute Engine instances.
You have shown how to prepare for issues with VMs and the service. When a device
malfunctions, you can use Backup and DR Service to restore malfunctioning devices across
multiple Google Cloud projects.
Date:
Follow the standard steps described at the start of this manual to start your lab and sign in to the Google Cloud console using the temporary credentials from the Lab Details panel. Do not use your own Google Cloud account credentials.
One morning, the security team detects unusual activity within their systems. Further
investigation into this activity quickly reveals that the company has suffered a massive security
breach across its applications, networks, systems, and data repositories. Attackers gained
unauthorized access to sensitive customer information, including credit card data, and personal
details. This incident requires immediate attention and thorough investigation. The first step
towards understanding the scope and impact of this breach is to gather information and analyze
the available data.
In this task, you'll examine the vulnerabilities and findings in Google Cloud Security Command
Center to determine how the attackers gained access to the data, and which remediation steps to
take.
Important: The vulnerabilities listed in this section rely on specific security checks being run
beforehand. If some checks haven't run yet, the related vulnerabilities might not appear in the
Security Command Center when you complete the steps in this section. Don't worry though! You
can still use the information provided in this task to analyze the available findings and proceed
with the remediation steps in the tasks that follow.
First, navigate to the Security Command Center to view an overview of the active
vulnerabilities.
1. In the Google Cloud console, in the Navigation menu ( ), click Security > Risk
Overview. The Security Command Center Overview page opens.
2. Scroll down to Active vulnerabilities. This provides an overview of current security
vulnerabilities or issues that need attention within the Google Cloud environment.
3. Select the Findings By Resource Type tab. The security findings or vulnerabilities are
organized by the type of cloud resource affected (for example, instances, buckets, or databases).
By reviewing active vulnerabilities and findings by resource type, you can prioritize and
address security issues effectively.
You'll note that there are both high and medium severity findings relating to the Cloud Storage
bucket, the Compute Instance virtual machine, and the firewall.
Which three resource types are listed with high severity findings?
4. In the Security Command Center menu, click Compliance. The Compliance page
opens.
5. In the Google Cloud compliance standards section, click View details in the PCI DSS
3.2.1 tile. The PCI DSS 3.2.1 report opens.
6. Click on the Findings column to sort the findings and display the active findings at the
top of the list.
Note: Make sure to follow these steps to assess the PCI report, and do not refresh the page, as
the required filters will be removed, and the correct information won't be displayed.
The Payment Card Industry Data Security Standard (PCI DSS) is a set of security requirements
that organizations must follow to protect sensitive cardholder data. As a retail company that
accepts and processes credit card payments, Cymbal Retail must also ensure compliance with the
PCI DSS requirements, to protect cardholder data.
As you examine the PCI DSS 3.2.1 report, notice that it lists the rules that are non-compliant,
which relate to the data breach:
Firewall rule logging should be enabled so you can audit network access: This medium
severity finding indicates that firewall rule logging is disabled, meaning that there is no record of
which firewall rules are being applied and what traffic is being allowed or denied. This is a
security risk as it makes it difficult to track and investigate suspicious activity.
Firewall rules should not allow connections from all IP addresses on TCP or UDP port
3389: This high severity finding indicates that the firewall is configured to allow Remote
Desktop Protocol (RDP) traffic for all instances in the network from the whole internet. This is a
security risk as it allows anyone on the internet to connect to the RDP port on any instance in the
network.
Firewall rules should not allow connections from all IP addresses on TCP or SCTP port 22:
This high severity finding indicates that the firewall is configured to allow Secure Shell (SSH)
traffic to all instances in the network from the whole internet. SSH is a protocol that allows
secure remote access to a computer. If an attacker can gain access to a machine through SSH,
they could potentially steal data, install malware, or disrupt operations.
VMs should not be assigned public IP addresses: This high severity finding indicates that a
particular IP address is actively exposed to the public internet and is potentially accessible to
unauthorized individuals. This finding is considered a potential security risk because it could
allow attackers to scan for vulnerabilities or launch attacks on the associated resource.
Cloud Storage buckets should not be anonymously or publicly accessible: This high severity
finding indicates that there is an Access Control List (ACL) entry for the storage bucket that is
publicly accessible which means that anyone on the internet can read files stored in the bucket.
This is a high-risk security vulnerability that needs to be prioritized for remediation.
Instances should not be configured to use the default service account with full access to all
Cloud APIs: This medium severity finding indicates that a particular identity or service account
has been granted full access to all Google Cloud APIs. This finding is considered a significant
security risk because it grants the identity or service account the ability to perform any action
within the Google Cloud environment, including accessing sensitive data, modifying
configurations, and deleting resources.
Since you're focusing on identifying and remediating the issues related to the security incident,
please disregard the following findings as they do not relate to the remediation tasks you’re
completing:
VPC Flow logs should be Enabled for every subnet VPC Network: There are a number of
low severity findings for Flow Logs disabled. This indicates that Flow Logs are not enabled for a
number of subnetworks in the Google Cloud project used for this lab. This is a potential security
risk because Flow Logs provide valuable insights into network traffic patterns, which can help
identify suspicious activity and investigate security incidents.
Note: Enabling logging for cloud resources is important in maintaining observability. However,
you will not remediate this finding in this lab activity as the subnetworks are part of this lab
environment. As a result, this finding will still be visible on the report after you have completed
the remediation tasks.
Basic roles (Owner, Writer, Reader) are too permissive and should not be used: This
medium severity finding indicates that primitive roles are being used within the Google Cloud
environment. This is a potential security risk because primitive roles grant broad access to a wide
range of resources.
An egress deny rule should be set: This low severity finding indicates that no egress deny rule
is defined for the monitored firewall. This finding raises potential security concerns because it
suggests that outbound traffic is not restricted, potentially exposing sensitive data or allowing
unauthorized communication.
The following table pairs the rules listed in the report with their corresponding findings category.
This will assist you when examining the findings according to resource type later:
Rule: VPC Flow logs should be Enabled for every subnet VPC Network
Findings category: Flow logs disabled
Overall, these findings indicate a critical lack of security controls and non-compliance with
essential PCI DSS requirements; they also point to the vulnerabilities associated with the data
breach.
Next, navigate to the Security Command Center, and filter the findings for further examination
and analysis of the vulnerabilities in the Google Cloud environment.
7. In the Google Cloud console, in the Navigation menu ( ), click Security > Findings.
The Findings page opens.
8. In the Quick filters panel, in the Resource Type section, select the checkbox for
the Google Cloud storage bucket resource type.
The following active findings pertaining to the storage bucket should be listed:
Public bucket ACL: This finding is listed in the PCI DSS report, and indicates that anyone with
access to the internet can read the data stored in the bucket.
Bucket policy only disabled: This indicates that there is no explicit bucket policy in place to
control who can access the data in the bucket.
Bucket logging disabled: This indicates that there is no logging enabled for the bucket, so it will
be difficult to track who is accessing the data.
These findings indicate that the bucket is configured with a combination of security settings that
could expose the data to unauthorized access. You'll need to remediate these findings by
removing the public access control list, disabling public bucket access, and enabling the uniform
bucket level access policy.
Note: Enabling logging for cloud resources is important in maintaining observability. However,
you will not remediate the Bucket logging disabled finding in this lab activity as this would
require working with multiple projects. As a result, this finding will still be visible after you have
completed the remediation tasks.
9. In the Quick filters panel, in the Resource Type section, uncheck Google Cloud
storage bucket, and select the checkbox for the Google compute instance resource type.
The following active findings that pertain to the virtual machine named cc-app-01 should be
listed:
Malware bad domain: This finding indicates that a domain known to be associated with
malware was accessed from the google.compute.instance named cc-app-01. Although this
finding is considered to be of low severity, it indicates that malicious activity has occurred on the
virtual machine instance and that it has been compromised.
Compute secure boot disabled: This medium severity finding indicates that secure boot is
disabled for the virtual machine. This is a security risk as it allows the virtual machine to boot
with unauthorized code, which could be used to compromise the system.
Default service account used: This medium severity finding indicates that the virtual machine is
using the default service account. This is a security risk as the default service account has a high
level of access and could be compromised if an attacker gains access to the project.
Public IP address: This high severity finding is listed in the PCI DSS report and indicates that
the virtual machine has a public IP address. This is a security risk as it allows anyone on the
internet to connect to the virtual machine directly.
Full API access: This medium severity finding is listed in the PCI DSS report, and indicates that
the virtual machine has been granted full access to all Google Cloud APIs.
These findings indicate the virtual machine was configured in a way that left it very vulnerable to
the attack. To remediate these findings, you'll shut down the original VM (cc-app-01) and create
a new VM (cc-app-02) using a clean snapshot of the disk. The new VM will have the following
settings in place:
10. In the Time range field, expand the drop-down, and select Last 30 days. This will
ensure the list includes findings for the last 30 days.
11. In the Quick filters panel, in the Resource Type section, uncheck Google compute
instance, and select the checkbox for the Google compute firewall resource type.
The following active findings should be listed that pertain to the firewall:
Open SSH port: This high severity finding indicates that the firewall is configured to allow
Secure Shell (SSH) traffic to all instances in the network from the whole internet.
Open RDP port: This high severity finding indicates that the firewall is configured to allow
Remote Desktop Protocol (RDP) traffic to all instances in the network from the whole internet.
Firewall rule logging disabled: This medium severity finding indicates that firewall rule
logging is disabled. This means that there is no record of which firewall rules are being applied
and what traffic is being allowed or denied.
These findings are all listed in the PCI DSS report and highlight a significant security gap in the
network's configuration. The lack of restricted access to RDP and SSH ports, coupled with
disabled firewall rule logging, makes the network highly vulnerable to unauthorized access
attempts and potential data breaches. You'll need to remediate these by removing the existing
overly broad firewall rules and replacing them with a firewall rule that allows SSH access only
from the address range used by Google Cloud's IAP SSH service.
Now that you have analyzed the security vulnerabilities, it’s time to work on remediating the
report findings.
Task 2. Fix the Compute Engine vulnerabilities
In this task, you'll shut down the vulnerable VM cc-app-01, and create a new VM from a
snapshot taken before the malware infection. VM snapshots are effective in restoring the system
to a clean state, ensuring that the new VM will not be infected with the same malware that
compromised the original VM.
Next, create a new VM from a snapshot. This snapshot has already been created as part of
Cymbal Retail's long term data backup plan.
23. In the VM instances section, click the cc-app-02 link. The cc-app-02 page opens.
24. In the cc-app-02 toolbar, click Edit. The Edit cc-app-02 instance page opens.
25. Scroll down to the Security and access section, and under Shielded VM, select the
checkbox for the Turn on Secure Boot option. This will address the Compute secure
boot disabled finding.
26. Click Save.
27. In the Compute Engine menu, select VM instances.
28. Select the checkbox for the cc-app-02 VM.
29. Click Start/Resume.
30. A pop-up appears asking you to confirm that the VM should be started; click Start.
The cc-app-02 VM instance restarts, and the Compute secure boot disabled finding will be
remediated.
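The console steps above can also be performed with gcloud from Cloud Shell. A hedged sketch, where ZONE is a placeholder for the zone your lab assigned to cc-app-02:

```shell
# Secure Boot can only be changed while the instance is stopped.
gcloud compute instances stop cc-app-02 --zone=ZONE
# Turn on Secure Boot to address the Compute secure boot disabled finding.
gcloud compute instances update cc-app-02 --zone=ZONE --shielded-secure-boot
gcloud compute instances start cc-app-02 --zone=ZONE
```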
Click Check my progress to verify that you have completed this task correctly.
By following these steps, you have effectively created a new VM from the snapshot, ensuring it
is free from malware and misconfigurations. You also deleted the compromised VM, eliminating
the source of the security breach.
1. In the Navigation menu ( ), select Cloud Storage > Buckets. The Buckets page opens.
2. Click the project_id_bucket storage bucket link. The Bucket details page opens.
You'll note there is a myfile.csv file in the publicly accessible bucket. This is the file that
contains the sensitive information that was dumped by the malicious actor. Perform the
following steps to address the Public bucket ACL finding.
Switch the access control to uniform and remove permissions for the allUsers principals from
the storage bucket to enforce a single set of permissions for the bucket and its objects. You'll also
need to ensure that users who rely on basic project roles to access the bucket won't lose their
access.
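These steps can be sketched with gcloud storage commands from Cloud Shell. The bucket name here is a placeholder, and the exact role held by allUsers may differ in your project, so inspect the bucket's IAM policy before removing a binding:

```shell
BUCKET="gs://BUCKET_NAME"  # placeholder: your project_id_bucket

# Enforce a single, uniform set of permissions for the bucket and its objects.
gcloud storage buckets update "$BUCKET" --uniform-bucket-level-access

# Check which role allUsers currently holds before removing it:
gcloud storage buckets get-iam-policy "$BUCKET"

# Remove the public grant (roles/storage.objectViewer is an assumption).
gcloud storage buckets remove-iam-policy-binding "$BUCKET" \
  --member=allUsers --role=roles/storage.objectViewer
```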
Click Check my progress to verify that you have completed this task correctly.
By following these steps, you have effectively prevented public access to the bucket, switched to
uniform bucket-level access control, and removed all user permissions, addressing the Public
bucket ACL and Bucket policy only disabled findings.
In this task, you'll restrict access to RDP and SSH ports to only authorized source networks to
minimize the attack surface and reduce the risk of unauthorized remote access.
Exercise extreme caution before modifying overly permissive firewall rules. The rules may be
allowing legitimate traffic, and improperly restricting it could disrupt critical operations. In this
lab, ensure the Compute Engine virtual machine instances tagged with target tag "cc" remain
accessible via SSH connections from the Google Cloud Identity-Aware Proxy address range
(35.235.240.0/20). To maintain uninterrupted management access, create a new, limited-access
firewall rule for SSH traffic before removing the existing rule allowing SSH connections from
any address.
Create a new firewall rule. This rule must restrict SSH access to only authorized IP addresses
from the source network 35.235.240.0/20 to compute instances with the target tag cc.
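As a sketch, the rule described above could be created from Cloud Shell as follows. The rule name limit-ports matches the rule referenced later in this lab; the network name is an assumption, so adjust it if your lab uses a different VPC:

```shell
# Allow SSH only from Google Cloud's IAP TCP forwarding range,
# and only to instances tagged "cc".
gcloud compute firewall-rules create limit-ports \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:22 \
  --source-ranges=35.235.240.0/20 \
  --target-tags=cc \
  --network=default
```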
Click Check my progress to verify that you have completed this task correctly.
In this task, you'll delete three specific VPC firewall rules that are responsible for allowing
unrestricted access to certain network protocols, namely ICMP, RDP, and SSH, from any source
within the VPC network. Then, you'll enable logging on the remaining firewall rules.
By deleting these rules, you have restricted access to these protocols, limiting the potential for
unauthorized access attempts and reducing the attack surface of your network.
Enable logging for the remaining firewall rules limit-ports (the rule you created in a previous
task) and default-allow-internal.
Enabling logging allows you to track and analyze the traffic that is allowed by this rule, which is
likely to be internal traffic between instances within your VPC.
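The deletion and logging steps above can be sketched from Cloud Shell as follows (the rule names are the ones this lab references):

```shell
# Remove the three overly permissive default rules.
gcloud compute firewall-rules delete default-allow-icmp default-allow-rdp \
  default-allow-ssh --quiet

# Enable logging on the remaining rules.
gcloud compute firewall-rules update limit-ports --enable-logging
gcloud compute firewall-rules update default-allow-internal --enable-logging
```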
Click Check my progress to verify that you have completed this task correctly.
By customizing firewall rules and enabling logging, you've addressed the Open SSH port, Open
RDP port, and Firewall rule logging disabled findings. The new firewall rule better protects
the network and improves network visibility.
After diligently addressing the vulnerabilities identified in the PCI DSS 3.2.1 report, it's crucial
to verify the effectiveness of your remediation efforts. In this task, you'll run the report again to
ensure that the previously identified vulnerabilities have been successfully mitigated and no
longer pose a security risk to the environment.
1. In the Security Command Center menu, click Compliance. The Compliance page
opens.
2. In the Google Cloud compliance standards section, click View details in the PCI DSS
3.2.1 tile. The PCI DSS 3.2.1 report opens.
3. Click on the Findings column to sort the findings and display the active findings at the
top of the list.
All major vulnerabilities are now resolved.
Note: While you addressed the high and medium severity vulnerabilities, the flow logs remain
disabled for a number of subnetworks. This finding will still be visible on the report after you
have completed the remediation tasks, as this relates to this lab environment.
CONCLUSION
You have helped the security team at Cymbal Retail to mitigate the impact of the data breach,
address the identified vulnerabilities, and significantly enhance the security posture of Cymbal
Retail's Google Cloud environment.
First, you examined and analyzed the vulnerabilities and findings in Google Cloud Security
Command Center.
Next, you shut the old VM down and created a new VM from a snapshot taken before the
malware infection.
Then, you fixed the cloud storage permissions by revoking public access to the storage bucket
and switching to uniform bucket-level access control. You also removed all user permissions
from the storage bucket.
Next, you fixed the firewall rules by deleting the default-allow-icmp, default-allow-rdp, and
default-allow-ssh firewall rules, and enabling logging for the remaining firewall rules.
Finally, you ran a compliance report to confirm that the vulnerability issues have been
remediated.
Remember, as a security analyst it is crucial to maintain regular security audits and implement
ongoing monitoring practices for continued protection against evolving threats and
vulnerabilities.