
INDEX

Expt. No   Date   Name of the Experiment   Page No   Marks Awarded   Staff Signature
CONTENTS

MODULE I
INTRODUCTION TO SECURITY PRINCIPLES IN CLOUD COMPUTING

EXP. NO: 01  Create a VPC using Cloud Shell

MODULE II
Strategies for Cloud Security Risk Management

EXP. NO: 02  Use reports to remediate findings

MODULE III
Cloud Security Risks: Identify and Protect Against Threats

EXP. NO: 03  Create a role in Google Cloud IAM
EXP. NO: 04  Access a firewall and create a rule
EXP. NO: 05  Identify vulnerabilities and remediation techniques
EXP. NO: 06  Change firewall rules using Terraform and Cloud Shell
EXP. NO: 07  Create symmetric and asymmetric keys

MODULE IV
Detect, Respond, and Recover from Cloud Cybersecurity Attacks

EXP. NO: 08  Determine the difference between normal activity and an incident
EXP. NO: 09  Explore false positives through incident detection
EXP. NO: 10  Analyze audit logs using BigQuery
EXP. NO: 11  Recover VMs with Google Backup and DR Service

MODULE V
Put It All Together: Prepare for a Cloud Security Analyst Job

EXP. NO: 12  Respond and recover from a data breach


Exp. no: 01 Create a VPC using Cloud Shell

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is an online development and operations environment accessible anywhere with your
browser. Cloud Shell provides command-line access to your Google Cloud resources.

1. Click Activate Cloud Shell ( ) at the top right of the Google Cloud console. You may
be asked to click Continue.
After Cloud Shell starts up, you'll see a message displaying your Google Cloud Project ID for
this session:

Your Cloud Platform project in this session is set to YOUR_PROJECT_ID


The command-line tool for Google Cloud, gcloud, comes pre-installed on Cloud Shell and
supports tab completion. To access Google Cloud, you'll first have to authorize gcloud.

2. List the active account name with this command:


gcloud auth list

3. A pop-up will appear asking you to Authorize Cloud Shell. Click Authorize.
4. Your output should now look like this:

ACTIVE: *
ACCOUNT: [email protected]
To set the active account, run:
$ gcloud config set account `ACCOUNT`

5. List the project ID with this command:
gcloud config list project

Example output:

[core]
project = qwiklabs-gcp-44776a13dea667a6
Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview
guide.

Task 1. Create a network


A network forms the basis of communication between devices. You'll need to first create a
network for your test environment before you can begin testing security functionality,
experimenting with configurations, or building a proof-of-concept for security tools in your role
at Cymbal Bank. Here, you'll use software-defined networking to easily set up a network in
Google Cloud.

There are two types of VPC networks you can choose to create depending on your subnet
requirements. You can choose to create an auto mode or a custom mode VPC network. An auto
mode VPC automatically creates a subnet in each region for you while a custom mode VPC
provides you with the control to manually create subnets. Each new network that you create must
have a unique name within the same project. You can create up to four additional networks in a
project.

In this task, you'll create an initial custom mode VPC network.

1. Copy the following command into the Cloud Shell terminal:


Note: When pasting the command into the Cloud Shell terminal, note that each command
option beginning with "--" needs to be on its own line; entering the information
incorrectly will result in having to fix the network.

gcloud compute networks create labnet --subnet-mode=custom



This command creates a custom mode network called labnet.


2. Press ENTER.
Although you don't need to memorize this command, the following breaks it down to help you
better understand the syntax:

 gcloud invokes the Cloud SDK gcloud command line tool.


 compute is one of the groups available in gcloud. It lets you create and configure Compute
Engine resources and forms part of a nested hierarchy of command groups.
 networks is a subgroup of compute with its own specialized commands. It lets you list, create,
and delete Compute Engine networks.
 create is the action to be executed on this group.
 labnet is the name of the network you're creating.
 --subnet-mode=custom is the flag that specifies the type of VPC you are creating, in this
instance custom.
The output should list the labnet network you created:

NAME: labnet
SUBNET_MODE: CUSTOM
BGP_ROUTING_MODE: REGIONAL
IPV4_RANGE:
GATEWAY_IPV4:
Click Check my progress to verify that you have completed this task correctly.

Create a network
Check my progress

Task 2. Create a subnet


In this task, you'll create a subnet within the newly created custom mode VPC network.
Configuring subnets is a network management best practice. For test environments, subnets
allow you to split your VPC into logical segments to improve the organization of cloud
resources, to improve network performance, and to improve security.

When you create a subnet, its name must be unique in that project for that region, even across
networks. The same name can appear twice in a project as long as each one is in a different
region. Additionally, each subnet must have a primary IP address range, which must be unique
within the same region in a project.

1. Copy the following command into the Cloud Shell terminal:


gcloud compute networks subnets create labnet-sub \
--network labnet \
--region "REGION" \
--range 10.0.0.0/28
This command creates a sub-network called labnet-sub.

2. Press ENTER.
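The --range 10.0.0.0/28 flag above gives the subnet a deliberately small primary range. As an illustrative sketch (the arithmetic uses Python's standard ipaddress module; the four reserved addresses per subnet are documented Google Cloud behavior), you can compute what that range actually provides:

```python
import ipaddress

# Primary range used for labnet-sub above.
subnet = ipaddress.ip_network("10.0.0.0/28")

print(subnet.num_addresses)          # 16 addresses in a /28

# Google Cloud reserves four addresses in every subnet's primary range:
# the network address, the default gateway (second address), the
# second-to-last address, and the last (broadcast) address.
reserved = [
    subnet.network_address,          # 10.0.0.0
    subnet.network_address + 1,      # 10.0.0.1 (default gateway)
    subnet.broadcast_address - 1,    # 10.0.0.14
    subnet.broadcast_address,        # 10.0.0.15
]
usable = subnet.num_addresses - len(reserved)
print(usable)                        # 12 addresses left for VM instances
```

A /28 is therefore enough for a small test environment, but a production subnet would typically use a larger range.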
Click Check my progress to verify that you have completed this task correctly.

Create a subnet
Check my progress

Task 3. View networks


In this task, you'll list the available networks to ensure that you have successfully created them.

1. Copy the following command into the Cloud Shell terminal:


gcloud compute networks list

This command lists the networks in your project.

2. Press ENTER.
The output should list the default and labnet networks.

The default network was created when the project was created. The labnet network was created
by the gcloud command you ran earlier.

What is the subnet mode of the labnet network you created?

Custom

Default

Auto

None of these options

Submit

Task 4. List subnets


In this task, you'll list all subnets within the networks of your project.

You can either list all subnets in all networks in your project, or you can show only the subnets
for a particular network or region. Auditing subnets ensures that the network is properly secured,
and helps identify any misconfigurations or potential security vulnerabilities in your VPCs, such
as subnets that might be unintentionally exposed to the public internet.

1. Copy the following command into the Cloud Shell terminal:


gcloud compute networks subnets list --network=labnet

This command lists the subnets in the labnet network.

2. Press ENTER.
What is the name of the subnet in the labnet network?

labnet

default

labnet-sub

None of these options

Submit

Conclusion
Great work! By completing this lab activity, you now have hands-on experience setting up a
test VPC network and subnet. This is the first step in creating a test environment that will
help you eventually secure the production environment that protects company data. You then
confirmed that the network and subnet were created successfully.

By observing the network and its subnetworks in the test environment, you can gather
significant data for research. This data is highly beneficial when configuring and creating
security plans for the production environment.
Exp. no: 02 Use reports to remediate findings

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Identify the vulnerabilities with Security Command Center (SCC)


In this task, you’ll use the Security Command Center (SCC) to check the compliance status of
your project, and identify the high- and medium-risk vulnerabilities that need to be remediated.

1. In the Google Cloud console, from the Navigation menu ( ),


select Security > Overview. The Security Command Center Overview page opens.
2. In the Security Command Center menu, click Vulnerabilities. The Vulnerabilities page
opens.
There are many active vulnerabilities listed. You can use the filter to search for the specified
findings using the Module ID. You will focus on the following active findings listed for your
storage bucket:

 Public bucket ACL (PUBLIC_BUCKET_ACL): This entry indicates that there is an Access
Control List (ACL) entry for the storage bucket that is publicly accessible which means that
anyone on the internet can read files stored in the bucket. This is a high-risk security
vulnerability that needs to be prioritized for remediation.
 Bucket policy only disabled (BUCKET_POLICY_ONLY_DISABLED): This entry indicates
that uniform bucket-level permissions are not enabled on a bucket. Uniform bucket-level access
provides a way to control who can access Cloud Storage buckets and objects, simplifying how
you grant access to your Cloud Storage resources. This is a medium-risk vulnerability that must
also be remediated.
 Bucket logging disabled (BUCKET_LOGGING_DISABLED): This entry indicates that there
is a storage bucket that does not have logging enabled. This is a low-risk vulnerability that you
are not required to remediate in this scenario.
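The three findings above carry different priorities. As a minimal sketch of that triage logic (hypothetical data structures, not the Security Command Center API), sorting findings by severity mirrors the order in which you would remediate them:

```python
# Hypothetical finding records mirroring the three module IDs above;
# illustrative data only, not output from Security Command Center.
SEVERITY_RANK = {"HIGH": 0, "MEDIUM": 1, "LOW": 2}

findings = [
    {"module": "BUCKET_LOGGING_DISABLED", "severity": "LOW"},
    {"module": "PUBLIC_BUCKET_ACL", "severity": "HIGH"},
    {"module": "BUCKET_POLICY_ONLY_DISABLED", "severity": "MEDIUM"},
]

# Order the remediation queue so higher-risk findings come first.
queue = sorted(findings, key=lambda f: SEVERITY_RANK[f["severity"]])
for f in queue:
    print(f["severity"], f["module"])
# HIGH PUBLIC_BUCKET_ACL
# MEDIUM BUCKET_POLICY_ONLY_DISABLED
# LOW BUCKET_LOGGING_DISABLED
```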

Note: If the Public bucket ACL or Bucket policy only disabled findings are not listed or don't
display as active, you may have to wait a few minutes and refresh. Wait until these
vulnerabilities display active findings before continuing.

Next, run a compliance report that confirms the vulnerability issues.

3. In the Security Command Center menu, click Compliance. The Compliance page
opens.
4. In the Google Cloud compliance standards section, click View details in the CIS
Google Cloud Platform Foundation 2.0 tile. The CIS Google Cloud Platform
Foundation 2.0 report opens.
5. Click on the Findings column to sort the findings and display the active findings at the
top of the list.
Which of the following rules in the report have active findings for the Cloud Storage bucket?
Select all that apply.

VMs should not be assigned public IP addresses

Bucket policy only should be Enabled

Firewall rules should not allow connections from all IP addresses on TCP or SCTP port 22

Cloud Storage buckets should not be anonymously or publicly accessible

Submit

Task 2. Remediate the security vulnerabilities


In this task, you’ll remediate the security vulnerabilities identified in the previous task. Then,
you’ll check the security status of the Cloud Storage bucket in the report to confirm that the
issues have been remediated.
1. In the Google Cloud console, from the Navigation menu ( ), select Cloud
Storage > Buckets.
2. Under the Filter section, click the Name link of the bucket for your project
(BUCKET_NAME). The Bucket details page opens.
3. Click the Permissions tab. The Permissions section lists all the permissions provided for
the bucket.
First, remove the public access to the Cloud Storage bucket.

4. Under the Permissions section, click the View by Roles tab.


5. Expand the Storage Object Viewer role and select the checkbox for allUsers.
6. Click Remove Access.
7. A pop-up will appear asking you to confirm the access removal. Ensure Remove
allUsers from the role Storage Object Viewer on this resource is selected and
click Remove.
Next, switch the access control to uniform. This will enforce a single (uniform) set of
permissions for the bucket and its objects.

8. On the Access control tile, click Switch to uniform.


9. On the Edit access control dialog, select Uniform.
10. Click Save.
Finally, run a compliance report to confirm that the vulnerability issues have been remediated.

11. In the Google Cloud console, from the Navigation menu ( ),


select Security > Compliance.
12. In the CIS Google Cloud Platform Foundation 2.0 tile, click View details to view
the report again.
The number of active findings for the rules Cloud Storage buckets should not be anonymously
or publicly accessible and Bucket policy only should be Enabled should now be 0. This
indicates that the Public bucket ACL and Bucket policy only disabled vulnerabilities for the
Cloud Storage bucket have been remediated.

Note: If the active findings for the Public bucket ACL or Bucket policy only disabled don't
display as 0 (zero) after you have successfully remediated the vulnerabilities, you may have to
wait a few minutes and refresh.

Click Check my progress to verify that you have completed this task correctly.

Remediate the security vulnerabilities


Check my progress
CONCLUSION

Throughout this lab, you have gained practical experience in identifying and prioritizing threats
using the Security Command Center. You also remediated the vulnerabilities identified for your
project, and generated a report to confirm that the vulnerabilities have been remediated.

By remediating the vulnerabilities and ensuring the compliance status of the Cloud Storage
bucket, you’ve helped your organization to prevent data breaches, unauthorized access, and data
loss.
Exp. no: 03 Create a role in Google Cloud IAM

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The lab Sign in page opens in a new browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username 1 below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username 1"



You can also find the Google Cloud username 1 in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.
Important: You must use the credentials from the left panel. Do not use your Google Cloud
credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Cloud Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Create a custom role


Applying the principle of least privilege is integral to IAM. It ensures that users are only given
the permissions they need to perform their tasks. Custom roles provide a way to tailor
permissions to an organization's needs, making sure that users do not have broad and excessive
permissions.

In this task, you'll create a custom role for the audit team at Cymbal. You'll then grant the custom
role restricted access for viewing the database contents.

1. In the Google Cloud console, in the Navigation menu ( ), click IAM &
Admin > Roles. The Roles page opens.
2. On the Explorer bar, located near the top of the Roles page, click + Create Role.
3. In the Create Role dialog, specify the following settings and leave the remaining settings
as their defaults:
4. Click the + Add permissions. The Add permissions dialog box opens.
5. In the Filter permissions by role field, type Firebase Realtime.
6. In the results drop-down field, select the Firebase Realtime Database Viewer checkbox.
7. Click OK.
8. Under Filter, select
the firebase.clients.list and firebasedatabase.instances.list checkboxes to add these
permissions to the custom role.
9. Click Add.
10. In the Create Role dialog, click Create.
Property: Value (type or select)

Title: Audit Team Reviewer
Description: Custom role, allowing the audit team to conduct its review activities. This role grants read-only access to Firebase database resources.
ID: CustomRole
Role launch stage: General Availability

Each custom role can be given a role launch stage which reflects the different phases of a role's
development, testing, and deployment. These stages help users understand the current state of a
role and its suitability for various use cases.

There are several launch stages in Google Cloud. The three primary role launch stages you
should know about are:

Alpha: Roles in the Alpha stage are typically experimental and may undergo significant changes.
They are not recommended for production environments. Users can provide feedback on alpha
roles to influence their development.

Beta: Roles in the Beta stage are more mature than alpha roles but might still receive updates
and improvements based on user feedback. They are considered suitable for certain non-
production scenarios but may not be fully stable.

General Availability (GA): Roles that have reached General Availability have undergone
thorough development, testing, and refinement. They are considered stable, reliable, and suitable
for widespread use in production environments. GA roles have been extensively reviewed and
are intended to provide consistent and dependable behavior.
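The role configured through the Create Role dialog can also be pictured declaratively. The sketch below mirrors the dialog fields as a plain dictionary (an illustrative structure only, not a live IAM API response) and checks the least-privilege property the lab emphasizes:

```python
# A sketch of the custom role above; keys mirror the Create Role dialog
# fields (illustrative only, not an IAM API response).
custom_role = {
    "title": "Audit Team Reviewer",
    "description": (
        "Custom role, allowing the audit team to conduct its review "
        "activities. This role grants read-only access to Firebase "
        "database resources."
    ),
    "id": "CustomRole",
    "stage": "GA",  # General Availability launch stage
    "includedPermissions": [
        "firebase.clients.list",
        "firebasedatabase.instances.list",
    ],
}

# Least privilege: every permission in the role is a list (read-only) action.
assert all(p.endswith(".list") for p in custom_role["includedPermissions"])
print(custom_role["title"], custom_role["stage"])  # Audit Team Reviewer GA
```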
The new role should now be created and added to the existing roles in the project.

Click Check my progress to verify that you have completed this task correctly.

Create a custom role


Check my progress

Task 2. Grant a role to a user


In this task, you'll assign the custom role you created in Task 1 to an existing user.

Note: For the purposes of this lab, you’ll grant the new role to Google Cloud username
2 provided in the Lab Details panel.

1. In the Google Cloud console, in the Navigation menu ( ), click IAM &
Admin > IAM. The IAM page opens.
2. On the View By Principals tab, click Grant access. The Grant access dialog window
will open.

The Grant access dialog box is a crucial component of the IAM system in Google Cloud. It
provides you with the ability to precisely define and manage permissions for users, groups,
and service accounts.
3. Copy the Google Cloud username 2: Username 2 and paste it into the New
principals field.

4. Expand the Select a role drop-down menu, select Custom, and then select Audit Team
Reviewer. This is the role you created in the previous task.

5. Click Save.
The custom role should now be assigned to the user.
Click Check my progress to verify that you have completed this task correctly.

Grant a role to a user


Check my progress

Task 3. Verify the role


So far, you've created a custom role with the appropriate permissions and granted the role to the
user. Now, you'll need to check your work to verify that the user has been assigned the role you
created. Ensuring that you've correctly configured settings is an integral part of your workflow
as a cloud security analyst.

In this task, you'll use Google Cloud's Policy Analyzer to create a query to check the roles
granted to the user.

1. In the Google Cloud console, in the Navigation menu ( ), click IAM &
Admin > Policy Analyzer. The Policy Analyzer page opens.
2. In the Analyze policies section, on the Custom Query tile, click Create custom query.
A pop-up may appear at the top left Google Cloud menu ( ) with the text “Click on the
menu anytime to find solutions for your business”. Select Got it and proceed to the next
step.
3. In the Set the query parameters section, expand the Parameter 1 drop-down menu and
select Principal.
4. Copy the Google Cloud username 2: Username 2 and paste it into the Principal field.

5. Click Continue.
6. In the Advanced options for query results section, select the List resources within
resource(s) matching your query checkbox.
7. Click Analyze and then select Run query in the drop-down menu.
The results should return the role granted to the user. Use the results to answer the following
question(s).

Which role has been granted to the user?

BigQuery Admin

Storage Admin

Pub/Sub Admin

Audit Team Reviewer

Submit

CONCLUSION

Great work! You have successfully utilized IAM to create a custom role, grant access to a user
for that role, and verified the permissions within Google Cloud. Cymbal Bank's audit team can
now begin working on their database audit using the custom role you created.

IAM defines who has access to which resources based on their role. It is critical for managing
digital identities in an organization's environment and will be integral in your work as a cloud
security analyst.

By using IAM services, you are well on your way to effectively managing access and
permissions to storage resources.
Exp. no: 04 Access a firewall and create a rule

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.
Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left
Task 1. Create a firewall rule
In this task, you'll create a firewall rule that allows HTTP and SSH connectivity. You will also
specify a target tag for the newly created firewall rule.

In Google Cloud, firewall rules must specify targets to define which VM instances they apply
to. Target tags can be used to apply a firewall rule to a specific group of VMs, helping simplify
the management of firewall rules. You'll use target tags to enable this firewall rule to the web
server only.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select VPC Network > Firewall. The Firewall policies page displays.
Note: If a message is displayed stating that you don't have the required permissions to view the
firewall policies inherited by this project, you can disregard it and continue with the next steps.

3. On the toolbar, click + Create Firewall Rule. The Create a firewall rule dialog
displays.
4. Specify the following, and leave the remaining settings as their defaults:

Field: Value

Name: allow-http-ssh
Logs: On
Network: vpc-net
Targets: Specified target tags
Target tags: http-server
Source filter: IPv4 ranges
Source IPv4 ranges: 0.0.0.0/0
Protocols and ports: Select Specified protocols and ports, select the TCP checkbox, and in the Ports field enter 80, 22

5. Click Create.
Note: Wait until the Successfully created firewall rule "allow-http-ssh" message displays
before continuing.
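Note that the source range 0.0.0.0/0 used in this rule matches all of IPv4. A quick sketch with Python's ipaddress module (the client address 198.51.100.23 is an arbitrary example) shows why this rule admits HTTP and SSH traffic from any host on the internet:

```python
import ipaddress

# The rule's source range 0.0.0.0/0 matches every IPv4 address, so the
# allowed ports (80 and 22) accept traffic from any internet host.
open_range = ipaddress.ip_network("0.0.0.0/0")
client = ipaddress.ip_address("198.51.100.23")  # arbitrary example client

print(open_range.num_addresses == 2 ** 32)  # True: all of IPv4
print(client in open_range)                 # True: any client matches
```

This is acceptable for a short-lived lab environment, but in production you would normally restrict the source ranges, especially for SSH on port 22.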

Click Check my progress to verify that you have completed this task correctly.

Create a firewall rule


Check my progress

Task 2. Generate HTTP network traffic

In this task, you'll generate HTTP network traffic to the web server by visiting its external IP
address. The network traffic you generate will then be recorded as logs that you can analyze in
the Logs Explorer.

First, you need to generate network traffic.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select Compute Engine > VM instances. The VM instances page opens.
3. For web-server, click on the External IP link to access the server.

(Alternatively, you can open http://EXTERNAL_IP/, replacing EXTERNAL_IP with the External IP
value, in a new browser window or tab.) A default web page should display.

Next, you need to find the IP address of the computer you’re using.
4. Find your IP address by visiting whatismyip.com. It will display your IP directly.

Note: Ensure that the IP address only contains numerals (IPv4) and is not represented in
hexadecimal (IPv6).

5. Copy the IP address and save it in a notepad. You’ll need to use this in the next task.
Click Check my progress to verify that you have completed this task correctly.

Generate HTTP network traffic


Check my progress

Task 3. Analyze the web server Flow Logs

In this task, you'll access and analyze the VPC Flow Logs for the web server using the Logs
Explorer.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select Logging > Logs Explorer. The Logs Explorer page opens. (You may need to
expand the More Products drop-down menu within the Navigation menu and locate
Logging under Operations.)
3. On the left side of the Logs Explorer page, the Log fields pane is presented.
The Resource type and Severity sections are available. Under the Resource
type section, select Subnetwork.

Entries from the subnetwork logs will display on the Query results pane to the right of the Log
fields pane.
4. On the Log fields pane, in the Log name section,
select compute.googleapis.com/vpc_flows to access the VPC Flow logs for the network.
If this option doesn’t display, wait a few minutes for this log type to show up.
Once selected, entries from the VPC Flow Logs display on the Query results pane.

5. In the Query builder at the top of the page, at the end of line 2, press ENTER to create a
new line.
6. On line 3, enter the following:

jsonPayload.connection.src_ip=YOUR_IP

Your query should resemble the following:

resource.type="gce_subnetwork"
log_name="projects/PROJECT_ID/logs/compute.googleapis.com%2Fvpc_flows"
jsonPayload.connection.src_ip=YOUR_IP
7. Replace YOUR_IP with the IP address you saved in Task 2. This query searches for
the network traffic logs that originated from your IP address when you accessed the
web server in the previous task.
8. Click Run query. The query results should display on the Query results pane.

Note: If the vpc_flows filter option doesn’t display or if there are no logs, you might have to wait
a few minutes and refresh. If after a couple of minutes, the vpc_flows filter option still doesn’t
display, navigate to the Compute Engine page and click on the External IP of the web server a
few times to generate more traffic and check back on the vpc_flows filter option.
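The three-line filter above can be assembled programmatically. The sketch below illustrates its structure in Python; the project ID and source IP are placeholders (assumptions), so substitute your lab's actual values:

```python
# A small helper that assembles the three-line Logs Explorer filter shown
# above. The project ID and source IP are placeholders (assumptions);
# substitute your lab's actual values.
def vpc_flows_filter(project_id, src_ip):
    lines = [
        'resource.type="gce_subnetwork"',
        # The log name embeds the project ID; %2F is a URL-encoded slash.
        f'log_name="projects/{project_id}/logs/'
        'compute.googleapis.com%2Fvpc_flows"',
        f'jsonPayload.connection.src_ip={src_ip}',
    ]
    return "\n".join(lines)
```

Each line of the filter is ANDed together by Logs Explorer, so the query only matches VPC Flow Log entries from your source IP.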

9. In the Query results pane, expand one of the log entries.


10. Within the entry, expand jsonPayload by clicking the expand arrow >. Then, expand
the connection field.

Here you can examine the details about the network connection to the web server:

 dest_ip - This is the destination IP address of the web server.


 dest_port - This is the destination port number of the web server which is HTTP port 80.
 protocol - The protocol is 6 which is the IANA protocol for TCP traffic.
 src_ip - This is the source IP address of your computer.
 src_port - This is the source port number that's assigned to your computer. According to Internet
Assigned Numbers Authority (IANA) standards, this is typically a random port number between
49152-65535.
After analyzing the details of this log entry, you should notice that the network traffic you
generated (on HTTP port 80) was allowed due to the firewall rule allow-http-ssh you created
previously. This rule allowed incoming traffic on ports 80 and 22.
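The connection fields above can be read with a small parser. The log entry values below are hypothetical stand-ins for your own entry, not actual lab output:

```python
# Hypothetical VPC Flow Log entry, shaped like the jsonPayload.connection
# fields examined above (the actual values vary per lab session).
entry = {
    "jsonPayload": {
        "connection": {
            "dest_ip": "10.1.3.2",    # web server's internal IP
            "dest_port": 80,          # HTTP
            "protocol": 6,            # IANA protocol number for TCP
            "src_ip": "203.0.113.7",  # your machine (example value)
            "src_port": 51234,        # ephemeral client port
        }
    }
}

IANA_PROTOCOLS = {1: "ICMP", 6: "TCP", 17: "UDP"}

def summarize(log_entry):
    """Return a one-line summary of a flow-log connection record."""
    conn = log_entry["jsonPayload"]["connection"]
    proto = IANA_PROTOCOLS.get(conn["protocol"], str(conn["protocol"]))
    # IANA reserves 49152-65535 for ephemeral (dynamic) source ports.
    ephemeral = 49152 <= conn["src_port"] <= 65535
    return (f'{conn["src_ip"]}:{conn["src_port"]} -> '
            f'{conn["dest_ip"]}:{conn["dest_port"]} ({proto})'
            + (" [ephemeral src port]" if ephemeral else ""))
```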
According to the log entries, what is the IP address of the web server?

0.0.0.0

10.1.3.2

127.0.0.1

255.255.255.255

Submit

Task 4. Create a firewall rule to deny HTTP traffic

In this task, you'll create a new firewall rule that denies traffic from port 80.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select VPC network > Firewall. The Firewall policies page displays.
3. On the toolbar, click + Create Firewall Rule.
4. In the Create a firewall rule dialog, specify the following, and leave the remaining
settings as their defaults:

Field Value

Name deny-http

Logs On

Network vpc-net
Action on match Deny

Targets Specified target tags

Target tags http-server

Source filter IPv4 ranges

Source IPv4 ranges 0.0.0.0/0

Protocols and ports Select Specified protocols and ports; select the TCP checkbox; in the Ports field, enter 80

5. Click Create.
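The rule configured above can be modeled as a simple data structure plus a matching check. This is a rough sketch of the rule's semantics only, not how Cloud Firewall evaluates rules internally:

```python
# A minimal model of the deny-http rule configured above. Field names
# mirror the console form; the matching logic is a simplified sketch.
DENY_HTTP_RULE = {
    "name": "deny-http",
    "action": "DENY",
    "direction": "INGRESS",
    "target_tags": {"http-server"},
    "source_range": "0.0.0.0/0",  # any IPv4 source
    "protocol": "tcp",
    "ports": {80},
}

def rule_matches(rule, packet):
    """Check whether an incoming packet is covered by the rule."""
    return (packet["protocol"] == rule["protocol"]
            and packet["dest_port"] in rule["ports"]
            and bool(packet["instance_tags"] & rule["target_tags"]))

http_packet = {"protocol": "tcp", "dest_port": 80, "instance_tags": {"http-server"}}
ssh_packet = {"protocol": "tcp", "dest_port": 22, "instance_tags": {"http-server"}}
```

An HTTP packet on port 80 matches and is denied, while SSH on port 22 does not match, which is why you can still reach the VM over SSH after creating this rule.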
Click Check my progress to verify that you have completed this task correctly.

Create a firewall to deny HTTP traffic


Check my progress

Task 5. Analyze the firewall logs


In this task, you'll test the deny-http firewall rule that you created in the previous task.

First, attempt to connect to the web server.

1. Click the Navigation menu ( ).


2. Select Compute Engine > VM instances. The VM instances page opens.
3. For web-server, click on the External IP link to access the server.
The page should fail to load, and the browser should display a connection error. This error
occurs because of the deny-http firewall rule you created in the previous task. To
verify this, access the Logs Explorer to analyze the firewall logs for the web server.

4. In the Google Cloud console, click the Navigation menu ( ).


5. Select Logging > Logs Explorer. The Logs Explorer page opens. (You may need to
expand the More Products drop-down menu within the Navigation menu and locate
Logging under Operations.)
6. Under the Resource type section, select Subnetwork.
7. On the Log fields pane, in the Log name section,
select compute.googleapis.com/firewall to access the firewall logs for the network.
8. In the Query builder at the top of the page, at the end of line 2, press ENTER to create a
new line.
9. On line 3, enter the following:

jsonPayload.connection.src_ip=YOUR_IP DENIED

Replace YOUR_IP with the IP address you saved from Task 2. This query will search for
firewall logs that denied your IP address connection to the web server. Your query should
resemble the following:

resource.type="gce_subnetwork"
log_name="projects/PROJECT_ID/logs/compute.googleapis.com%2Ffirewall"
jsonPayload.connection.src_ip=YOUR_IP DENIED
10. Click Run query. The query results should display on the Query results pane.
11. In the Query results pane, expand one of the log entries.
12. Within the log entry, expand the jsonPayload field by clicking the expand arrow >.
Then, expand the connection field. You can examine the details about the network
connection to the web server to verify if the firewall rule was successfully triggered:

 dest_ip - This is the destination IP address of the web server which is 10.1.3.2.
 dest_port - This is the destination port number of the web server which is HTTP port 80.
 protocol - The protocol is 6 which is the IANA protocol for TCP traffic.
 src_ip - This is the source IP address of your computer.
 src_port - This is the source port number that's assigned to your computer.
 disposition - This field indicates whether the connection was allowed or denied. Here,
it's denied which indicates that the connection to the server was denied.
13. Within the log entry, expand the rule_details field by clicking the expand arrow >. You
can examine the details about the firewall rule. Additionally, you can extract more
information from the following fields in the log entry by expanding them:
 action - The action taken by the rule, DENY in this case.
 direction - The rule's traffic direction can be either ingress or egress, here it is INGRESS which
means the action will apply to incoming traffic.
 ip_port_info - The protocol and ports this rule controls.
The ip_protocol and port_range lists TCP port 80.
 source_range - The traffic sources that the firewall rule is applied to. Here it is 0.0.0.0/0.
 target_tag - This lists all the target tags that the firewall rule applies to. Here, it is http-server,
the target tag you added to the firewall rule in the previous task.
By examining the details of this firewall log entry, you should notice that the deny-http
firewall rule you set up to deny HTTP traffic was successfully triggered. This rule denied
incoming network traffic on port 80.
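The source_range value of 0.0.0.0/0 seen in the log covers every IPv4 address, which is why the rule applied to your traffic regardless of origin. A quick check with Python's ipaddress module (the sample addresses are arbitrary):

```python
import ipaddress

# 0.0.0.0/0 is the CIDR range covering all of IPv4; any source IP falls
# inside it, so a rule with this source_range applies to all ingress traffic.
ANY_IPV4 = ipaddress.ip_network("0.0.0.0/0")

def in_source_range(src_ip, source_range=ANY_IPV4):
    """Return True if src_ip falls inside the firewall rule's source range."""
    return ipaddress.ip_address(src_ip) in source_range
```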

Click Check my progress to verify that you have completed this task correctly.

Analyze the firewall logs


Check my progress

CONCLUSION

You now have practical experience in creating and testing firewall rules for a web server in a
cloud environment. By creating firewall rules and analyzing log entries, you have a familiarity
with the intricacies of perimeter protection. This is useful for monitoring and analyzing potential
security incidents or threats, which is an essential part of a security analyst's role.

You’re well on your way to understanding how to modify firewall rules to ensure maximum
network security.
Exp. no: 05 Identify vulnerabilities and remediation techniques

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials.

Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Launch a virtual machine


In this task, you’ll create a static IP address and launch the virtual machine to run the vulnerable
application.

1. On the Google Cloud console title bar, click the Activate Cloud Shell ( ) icon. If
prompted, click Continue.
2. Copy the following command into the Cloud Shell terminal:
gcloud compute addresses create xss-test-ip-address --region="REGION"

This command creates a static IP address named xss-test-ip-address in the REGION region. This
static IP will be used for scanning the vulnerable web application.

3. Press ENTER.
If prompted, click Authorize.
4. Copy the following command into the Cloud Shell terminal:
gcloud compute addresses describe xss-test-ip-address \
--region="REGION" --format="value(address)"

This command returns the static IP address you generated.

5. Press ENTER.
6. Copy the IP address from the output and save it in a notepad. You’ll need to use this in a
later task.
7. Copy the following command into the Cloud Shell terminal:
gcloud compute instances create xss-test-vm-instance --address=xss-test-ip-address --no-service-
account \
--no-scopes --machine-type=e2-micro --zone="ZONE" \
--metadata=startup-script='apt-get update; apt-get install -y python3-flask'

This command creates a VM instance to run the vulnerable application.

8. Press ENTER.
Note: The startup script installs python3-flask, a web application framework used to run a
simple Python application. This application demonstrates a cross-site scripting (XSS)
vulnerability, a common web application security flaw.

Click Check my progress to verify that you have completed this task correctly.

Launch a virtual machine


Check my progress

Task 2. Set up and run the vulnerable application


In this task, you’ll download and extract the web application files for the vulnerable application,
and then deploy the application in the SSH-in-browser.

First, you’ll create a firewall rule that will allow Web Security Scanner to access the vulnerable
application.

1. Copy the following command into the Cloud Shell terminal:


gcloud compute firewall-rules create enable-wss-scan \
--direction=INGRESS --priority=1000 \
--network=default --action=ALLOW \
--rules=tcp:8080 --source-ranges=0.0.0.0/0

This command creates a firewall rule that allows access to the web application from any source
IP address. This allows the Web Security Scanner to access the vulnerable application and
perform a scan.

2. Press ENTER.
Next, use an SSH connection to connect to the VM instance.

3. In the Google Cloud console, click the Navigation Menu ( ).


4. Select Compute Engine > VM instances.
5. On the VM instances page, in the Connect column, click on the SSH button next to your
test instance.
This will open an SSH connection to your VM instance in a new browser window.

6. A pop-up may appear asking you to allow SSH in-browser to connect to VMs.
Click Authorize.
Now, extract the web application files.

7. Copy the following command into the SSH-in-browser page (not in Cloud Shell):
gsutil cp gs://cloud-training/GCPSEC-ScannerAppEngine/flask_code.tar . && tar xvf
flask_code.tar

This command downloads and extracts the vulnerable web application files.

8. Press Enter.
9. Finally, copy the following command into the SSH-in-browser page:

python3 app.py

This command starts the application.

10. Press ENTER.


A message should indicate that the application is up and running.
Note: Since this is a web application that was installed for use in development, there may be
vulnerabilities associated with the configuration file. It is important to test any application prior
to use on a public-facing network.

Note: Do not close the SSH-in-browser page when performing the next task, as the application
must continue to run.

Click Check my progress to verify that you have completed this task correctly.

Set up and run the vulnerable application


Check my progress

Task 3. Access the vulnerable application


In this task, you’ll test your application for a vulnerability known as cross-site scripting (XSS).
XSS vulnerabilities can be exploited by malicious scripts, such as HTML code, in content that is
then served to web browsers.

1. While the application is running, open a new browser window.


2. Copy the URL below into the browser tab, and replace <YOUR_EXTERNAL_IP> with
the static IP address of the VM you saved in a notepad in Task 1:
http://<YOUR_EXTERNAL_IP>:8080

A Cymbal Bank corporate banking portal with a web form should appear.

3. Copy the following HTML code including the script tags into the web form:
<script>alert('This is an XSS Injection to demonstrate one of OWASP vulnerabilities')</script>

This code demonstrates an XSS injection, one of the vulnerabilities described by OWASP®.


4. Click POST.
The injected code displays a message back to the browser. This action by itself is not malicious;
however, attackers can inject malicious code into an exploitable application to either steal
data from it or implant malware onto the user's device.

The alert window opens with the following message: “This is an XSS Injection to demonstrate
one of OWASP vulnerabilities”.
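The difference between reflecting raw input and escaping it can be sketched in Python. The render functions below are illustrative stand-ins, not the lab application's actual code:

```python
import html

# The same script payload injected into the Cymbal Bank form in this task.
payload = ("<script>alert('This is an XSS Injection to demonstrate "
           "one of OWASP vulnerabilities')</script>")

def render_unsafe(user_input):
    # Vulnerable: input is inserted into the page verbatim, so the
    # browser parses the <script> tag and executes it.
    return f"<p>You posted: {user_input}</p>"

def render_safe(user_input):
    # Remediated: special characters are escaped, so the browser renders
    # the payload as plain text instead of executing it.
    return f"<p>You posted: {html.escape(user_input)}</p>"
```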

Which security vulnerability is associated with actions performed in this task?

Cross-Site Scripting (XSS)

SQL Injection

Cross-Site Request Forgery (CSRF)

Ransomware

Submit

Task 4. Scan the application


In this task, you’ll scan the application for vulnerabilities using the Web Security Scanner.

First, enable the Web Security Scanner API.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select APIs & Services > Enabled APIs and services. The APIs & Services page
displays.
3. Click + Enable APIs and services.
4. In the search field, type Web Security Scanner, and press ENTER.
5. Select Web Security Scanner API.
6. Click Enable.
Now, scan the application for vulnerabilities.

7. In the Google Cloud console, click the Navigation menu > View All Products ( ).
8. Select Security > Web Security Scanner.
If the Web Security Scanner API is enabled then the Cloud Web Security Scanner page
displays the Scan configs details.

9. In the Cloud Web Security Scanner toolbar, click + New scan.


10. In the Name section, name the scan Cross-Site Scripting scan.

In the Starting URLs section, the Starting URLs 1 field should be pre-populated with your
static IP address.

11. Add a colon and the port number 8080 at the end of the IP address. The Starting URL
1 should resemble the following:
http://<EXTERNAL_IP>:8080

12. If present, delete Starting URL 2.


13. In the Excluded URLs section, verify that Authentication is set to None,
and Schedule set to Never. Leave all other fields unchanged.
14. Click Save to create the scan.
15. Click Run Scan to start the scan.
16. Return to the SSH-in-browser window.
In the SSH-in-browser window, you should view logs being generated as Web Security Scanner
tests all possible URLs for potential vulnerabilities.

17. When the scan is complete, return to the Google Cloud console.
Note: The scan might take 5-10 minutes to complete.

The Results tab should indicate the cross-site scripting vulnerabilities, demonstrating how Web Security
Scanner can detect an XSS vulnerability.

The vulnerabilities can also be found on the Vulnerabilities tab in the Security Command
Center.

Click Check my progress to verify that you have completed this task correctly.

Scan the application


Check my progress

Task 5. Remediate the vulnerabilities


In this task, you'll remediate the application's XSS vulnerability and re-run the application with
the new fix.

The recommendation for fixing the current vulnerabilities is to validate and escape untrusted
user-supplied data, which also points to the corresponding OWASP® rules.

You will do this by editing the code of the vulnerable application to include lines of code that
validate and escape the user-supplied data.

1. Return to the SSH-in-browser page connected to your VM instance.


2. Press CTRL + C to stop the running application. Alternatively, you can click the Send
key combination icon on the top right corner of the SSH-in-browser window to input
the CTRL + C key combination.
Now, edit the app.py file using the nano editor.

3. Copy the following command into the SSH-in-browser page:


nano app.py

This command opens the nano code editor.

4. Press ENTER.
5. To fix the XSS vulnerability, validate and escape the output string variable. The output
string is the processed output of the user-supplied web form input.

The application should not treat user input as HTML code; instead, it should escape special
characters in the input. To do this, locate the two lines that set the output
string:

# output_string = "".join([html_escape_table.get(c, c) for c in input_string])


output_string = input_string

6. Remove the # symbol from the first line, and add it to the beginning of the next line
(ensure that you indent your code properly.) The final lines must resemble the following:
@app.route('/output')
def output():
output_string = "".join([html_escape_table.get(c, c) for c in input_string])
# output_string = input_string
return flask.render_template("output.html", output=output_string)
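The uncommented line builds the output by looking up each character in html_escape_table. The table's exact contents aren't shown in the lab files, so the version below is an assumption about what it plausibly contains:

```python
# A plausible html_escape_table like the one referenced in app.py (the
# lab file's actual table contents are not shown; this is an assumption).
html_escape_table = {
    "&": "&amp;",
    "<": "&lt;",
    ">": "&gt;",
    '"': "&quot;",
    "'": "&#39;",
}

def escape(input_string):
    # Same expression as the uncommented line in app.py: characters found
    # in the table are replaced, everything else passes through unchanged.
    return "".join([html_escape_table.get(c, c) for c in input_string])
```

With this in place, a submitted `<script>` tag is rendered as literal text rather than executed by the browser.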

7. Press CTRL + X to exit nano, then Y to confirm, and then ENTER to save
your changes.
8. Copy the following command into the SSH-in-browser terminal:
python3 app.py

This command re-runs the application.

9. Press ENTER.
Click Check my progress to verify that you have completed this task correctly.

Remediate the vulnerabilities


Check my progress

Task 6. Re-scan the web application


In this task, you’ll re-scan the application to make sure there are no vulnerabilities.

1. Return to the Cloud Web Security Scanner page in the Google Cloud console.
2. Click Run to re-run the scan.
Note: The scan might take 5-10 minutes to complete.

The Results tab should now indicate that there are no vulnerabilities found.

Click Check my progress to verify that you have completed this task correctly. Be sure you wait
until the scan completes to get credit for completing this task.

Re-scan the web application


Check my progress

CONCLUSION

Through this lab, you gained practical experience in scanning for application vulnerabilities. You
learned the importance of a security analyst's ability to scan for application vulnerabilities, which
is essential for helping identify and address potential weaknesses, managing risks, meeting
compliance requirements, and ultimately, maintaining a robust security posture to protect an
organization’s assets and reputation.

By closing security gaps and addressing weaknesses, you can help prevent potential exploitation,
minimize the impact of security incidents, and maintain compliance with industry regulations.

In this lab, you completed one of the fundamental aspects of proactive cybersecurity strategies.

Exp. no: 06 Change firewall rules using Terraform and Cloud Shell

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.
"Google Cloud username"

You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials.

Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Clone the Terraform repo


In this task, you'll clone the Terraform example repository using the Cloud Shell terminal. The
Terraform example contains the configuration file, which you'll use to provision the firewall
rules.

1. In the Google Cloud console, click the Activate Cloud Shell ( ) icon.


2. Click Continue.
It should only take a few moments to provision and connect to the Cloud Shell environment.

3. Copy the following command into the Cloud Shell terminal:


cloudshell_open --repo_url "https://github.com/terraform-google-modules/docs-examples.git" --
print_file "./motd" --dir "firewall_basic" --page "editor" --tutorial "./tutorial.md" --
open_in_editor "main.tf" --force_new_clone

This command clones the Terraform example directory.

4. Press ENTER.
This command performs the following actions:

 Clones the terraform-google-modules.


 Prints the motd file name.
 Switches to the firewall_basic directory.
 Checks the cloned files, for example tutorial.md.
 Opens main.tf in Cloud Shell Editor.
Once the cloning is complete, you’ll be at
the ~/cloudshell_open/docs-examples/firewall_basic location in the terminal. Your Cloud Shell
prompt should display similar output to the following example:

student_01_c2e095df84e2@cloudshell:~/cloudshell_open/docs-examples/firewall_basic
(qwiklabs-gcp-04-fde36f013e65)$
5. Copy the following command into the Cloud Shell terminal to list the contents of the
directory:
ls

You should notice that several files in the directory have been
downloaded: backing_file.tf, main.tf, motd, and tutorial.md.

6. Copy the following command into the Cloud Shell terminal to analyze the configuration
of the firewall rule:
cat main.tf

7. Press ENTER.
The main.tf file is the configuration file that defines the resources Terraform will create.
Two resources are defined: a google_compute_firewall rule named test-firewall-$
{local.name_suffix}, which allows ICMP traffic and TCP traffic on ports 80, 8080, and 1000-
2000; and a google_compute_network VPC network named test-network-${local.name_suffix}.
The variable ${local.name_suffix} is a local variable that automatically generates unique names
for resources.
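The allow rules described above (ICMP, plus TCP on ports 80, 8080, and 1000-2000) can be modeled with a simple port check. This sketch mirrors the configuration only and plays no part in what Terraform actually provisions:

```python
# The TCP allow list from main.tf, in the same "port or range" string
# form that Terraform firewall rules use.
ALLOWED_TCP_PORTS = ["80", "8080", "1000-2000"]

def tcp_port_allowed(port, allowed=ALLOWED_TCP_PORTS):
    """Return True if a TCP destination port is covered by the allow list."""
    for spec in allowed:
        if "-" in spec:
            # A range like "1000-2000" covers both endpoints inclusively.
            low, high = map(int, spec.split("-"))
            if low <= port <= high:
                return True
        elif port == int(spec):
            return True
    return False
```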

Which one of the following protocols are being modified (allow/deny) on the firewall using the
Terraform main.tf file in Cloud Shell Editor?

icmp, web

test-firewall, test-network

web, test-network

icmp, tcp

Submit

Task 2. Deploy the VPC network and firewall

In this task, you'll deploy a new VPC network and a new firewall rule. This task provides hands-on
experience with provisioning network resources using Terraform.

Note: Run the following commands in sequence in the Cloud Shell terminal.

1. Copy the following command into the Cloud Shell terminal.


export GOOGLE_CLOUD_PROJECT=Project ID

This command sets the project ID.

2. Press ENTER.
3. Copy the following command into the Cloud Shell terminal:
terraform init
Copied!

content_copy

This command initializes the Terraform script.

4. Press ENTER.
The output should return a message stating that the Terraform has been successfully initialized.
Take a moment to examine the output. You'll notice that Terraform will create a new firewall
and VPC network:

5. Once the initialization is complete, copy the following command into the Cloud Shell
terminal:
terraform apply

This command applies the changes and deploys the Terraform script.

6. Press ENTER.
Note: If an Authorize Cloud Shell dialog box appears, click Authorize to grant permission to
use your credentials for the gcloud command.

7. When prompted to Enter a value, type "yes" and press ENTER.
This starts creating the VPC network and firewall rules.

Once it’s completed, the output should return the following message:

Apply complete! Resources: 3 added, 0 changed, 0 destroyed.


This means that the VPC and firewall have been successfully deployed.

Click Check my progress to verify that you have completed this task correctly.

Check my progress

Task 3. Verify the deployment of the resources


In this task, you'll verify that the newly created VPC and firewall rules have been successfully
deployed.

1. In the Google Cloud console, from the Navigation menu ( ), select VPC network >
VPC networks. The VPC networks page opens.
2. You should notice two VPC networks, default and the newest one you just created, test-
network. Click test-network to access the VPC network details.
3. Click Firewalls. Use the expand arrow to expand vpc-firewall-rules. Under Protocols
and ports and Action you should notice the firewall rules are the same rules as defined
in the configuration file: Allow and tcp:80, 1000-2000, 8080 icmp.
Note: To ensure that resource names are unique, both the test-network and test-firewall names
will be dynamically appended with a unique identifier. For example, test-network-curly-
penguin. This unique identifier is generated automatically by the ${local.name_suffix} local
variable, which is defined in the configuration file. This helps prevent resource naming conflicts
and ensures the proper organization of infrastructure components.

CONCLUSION

You've successfully built a VPC network and firewall rule using Terraform and Cloud Shell. This
lab provides the foundation for developing advanced automated solutions that system
administrators can use with Terraform.

By creating the VPC network and firewall, you have gained a better understanding of how it
enables you to automate the process of provisioning and modifying firewall rules. This helps
establish consistency across various environments, while also helping reduce the chance of
human error.
Exp. no: 07 Create symmetric and asymmetric keys

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab
Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials.

Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.
Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Create a symmetric key


In this task, you’ll delve into the intricate process of crafting a symmetric key, complete with
considerations for its designated region and the crucial aspect of its protection level. You'll begin
by generating a symmetric key with carefully tailored parameters.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select Security > Key Management.
3. On the Key Rings tabbed page, click + Create Key Ring.

Now, specify the key details.

4. For Key ring name, enter demo-key-ring.


5. For the Location type category, select Region.
6. Expand the Region drop-down menu, and select REGION.

Note: Be sure to select the specified region, otherwise you get an error.

7. Click Create.
8. In the Name and protection level category, in the Key name field, enter demo-key.

The Protection level should be set to Software by default, if not, select it now.

9. Click Continue. The Key material category expands.


10. For Key material, select Generated key.
11. Click Continue. The Purpose and algorithm category expands.
12. For Purpose, select Symmetric encrypt/decrypt.
13. Click Continue. The Versions category expands.
14. For Key rotation period, select 90 days.
15. For Starting on, leave as the default value.
16. Click Continue. No additional settings are needed.
17. Click Create.
Once the key is created, it can be used for a variety of implementations such as data encryption
and decryption.

Symmetric keys are commonly used to encrypt sensitive data before storage or transmission.
When data needs to be accessed or shared, the same symmetric key is used to decrypt the
encrypted content, ensuring that only authorized parties can access the original information.
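The symmetric property, where one key both encrypts and decrypts, can be illustrated with a toy XOR cipher. This is for intuition only; Cloud KMS software keys use real algorithms such as AES-256, and XOR must never be used for actual security:

```python
import itertools

# TOY ILLUSTRATION ONLY: XOR is its own inverse, so applying it twice with
# the SAME key recovers the plaintext -- the defining property of symmetric
# encryption. Real symmetric keys (e.g., AES-256 in Cloud KMS) are far
# stronger; never use XOR for actual security.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Repeat the key to cover the full message length, then XOR bytewise.
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

key = b"demo-key"
ciphertext = xor_cipher(b"account balance: 1024", key)
plaintext = xor_cipher(ciphertext, key)  # same key decrypts
```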

Click Check my progress to verify that you have completed this task correctly.

Create a symmetric key


Check my progress

Task 2. Create an asymmetric key

In this task, you'll create an asymmetric key with specific settings, including that of its algorithm
and protection level.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select Security > Key Management. The Key Rings tabbed page opens, listing the
newly-created key.
3. Under Name, click the link for the key you created in the previous task: demo-key-ring.
The Key ring details page opens.
4. In the Keys tabbed page, click + Create Key.

Now, specify the key details.

5. For Key name, enter demo-asymmetric-key.


6. For Protection Level, select Software.
7. Click Continue. The Key material category expands.
8. For Key Material, select Generated key.
9. Click Continue. The Purpose and algorithm category expands.
10. For Purpose, select Asymmetric decrypt.
11. For Algorithm, leave as the default value.
12. Click Continue.
13. For Versions, no settings are required.
14. Click Continue. No additional settings are needed.
15. Click Create.

The asymmetric key for decryption should now be created.

Asymmetric keys can also be used for digital signatures. Digital signatures help verify the
authenticity and integrity of messages, files, or software, ensuring that they have not been
tampered with during transmission. Digital signatures use two keys, one for signing which
involves the user's private key, and one for verifying signatures which involves the user's public
key. The output of the signature process is called the digital signature.
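As with the symmetric key, this console flow has a gcloud equivalent. The following is a sketch, assuming the demo-key-ring from Task 1 and one common RSA decryption algorithm (the console default may differ); REGION is a placeholder:

```shell
# Create an asymmetric decryption key on the existing key ring
gcloud kms keys create demo-asymmetric-key \
  --keyring=demo-key-ring \
  --location=REGION \
  --purpose=asymmetric-encryption \
  --default-algorithm=rsa-decrypt-oaep-2048-sha256

# Download the public half of version 1; anyone can encrypt with it,
# but only Cloud KMS can decrypt with the private half
gcloud kms keys versions get-public-key 1 \
  --key=demo-asymmetric-key \
  --keyring=demo-key-ring \
  --location=REGION \
  --output-file=public-key.pem
```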

Click Check my progress to verify that you have completed this task correctly.

Create an asymmetric key


Check my progress

CONCLUSION
Great work! Through this lab activity, you have gained practical experience in creating both
symmetric and asymmetric keys, which play a crucial role in ensuring secure data and
communication over networks.

Having created both types of keys, you now have a better understanding of their significance in
cryptography. Your newfound ability to create these keys allows you to assist customers in
securely storing large amounts of data.
Exp. no: 08 Determine the difference between normal activity and an
incident
Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog. Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a temporary
account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Grant permissions to an external account


In this task, you’ll grant project owner rights to an external gmail account. Granting owner rights
to an external account will trigger the Event Threat Detection IAM detectors. Granting project
owner rights to an external account is considered anomalous behavior or potentially malicious
activity. Event Threat Detection will identify this activity as a threat and generate findings which
you'll examine in the upcoming tasks.

1. In the Google Cloud console, in the Navigation menu ( ), click IAM & Admin > IAM.
The IAM page opens.
On the View By Principals tab, note the two student users that have been automatically
configured for the qwiklabs.net organization. These two users are also the same users listed in
the Lab details panel as Google Cloud username 1 and Google Cloud username 2.

These two users have automatically been granted owner roles to the lab project by a service
account as part of a normal provisioning process. This will trigger an alert finding or incident
because an external principal has an owner role. However, because both users belong to the
qwiklabs.net organization this alert is considered normal activity. You will examine this alert
finding later.

2. On the View By Principals tab, click Grant Access. The Grant access dialog displays.
3. Under the Add principals section, in the New principals field,
type [email protected].
4. Expand the Select a role drop-down menu, select Basic, and then select Owner.
5. Click Save.
You have now assigned the owner role to the external user [email protected]. This
will trigger a finding in SCC because this user is outside of the qwiklabs.net organization.
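The grant you just made through the console is equivalent to the following gcloud command. This is a sketch; EXTERNAL_USER_EMAIL stands in for the gmail address you entered in step 3:

```shell
# Grant the project Owner role to the external account;
# this is the action that triggers the Event Threat Detection finding
gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \
  --member=user:EXTERNAL_USER_EMAIL \
  --role=roles/owner
```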

Click Check my progress to verify that you have completed this task correctly.

Grant permissions to an external account


Check my progress

Task 2. Access the Event Threat Detection findings


In this task, you’ll access the Event Threat Detection findings in the Security Command Center.

1. In the Google Cloud console, in the Navigation menu ( ), click Security > Findings.
The Findings page opens.
You should notice three findings with high severities listed in the Finding query results panel.
In this lab, you’ll examine two Persistence: IAM anomalous grant findings to determine
whether the finding is normal activity or whether it is malicious.

Note: If the Persistence: IAM anomalous grant findings are not listed, you may have to wait a few
minutes and refresh. Wait until both these active findings display before continuing.

The Persistence: IAM anomalous grant finding indicates that an anomalous IAM grant was
detected. This means that a user or service account was granted access to a resource that they
should not have had access to. This could indicate a malicious actor attempting to gain
unauthorized access to your environment.

Next, filter the findings to display a list of Persistence: IAM anomalous grant category
findings.

2. In the Quick filters panel, in the Category section, select the checkbox for the Persistence: IAM
anomalous grant category.
Note: Selecting attributes with quick filters automatically adds them to the query. Notice that the Query
preview is updated with the Persistence: IAM anomalous grant category you selected. You can locate
specific findings or groups of findings by editing the findings query.

The filter returns two Persistence: IAM anomalous grant findings.

3. Click the Event time column header to sort the findings so that the earliest
finding is at the top.
Task 3. Analyze the findings
In this task, you'll examine these findings to determine which is normal activity and which is a
genuine incident.

1. In the Findings query results panel, in the Category column, click the Persistence:
IAM Anomalous Grant finding with the earliest event time. The Persistence: IAM
Anomalous Grant dialog opens on the Summary tab, which displays the finding
summary.
2. Find the Principal email row. This is the user account that granted the owner role to the
user. Notice that the service account belongs to the qwiklabs.net organization. With this
information, you can establish that this finding represents normal and expected activity.
3. Click the Source Properties tab, and expand properties > sensitiveRoleGrant >
members. Again, the email address listed for principalEmail is the user that granted the
owner role, and the email address(es) listed for members is the user that was granted the
owner role.

Which user was granted the owner role in the earliest Persistence: IAM Anomalous Grant finding record?

The external user [email protected]

A Compute Engine service account

None of these options

A user belonging to the qwiklabs.net organization

Submit

Next, you'll locate the malicious activity associated with the external user account you had
granted access to: [email protected].

4. Click the close (X) button to return to the Findings page.


5. In the Findings query results panel, in the Category column, click on the Persistence: IAM
Anomalous Grant findings record with the latest event time.
6. Note the value on the Principal email row. This is the user account email address that granted
the owner role to the user.
7. Click Source Properties tab, and expand properties > sensitiveRoleGrant > members. You
should notice the user account [email protected], which is an external user account.
With this information, you can establish that this finding is associated with an unauthorized and
malicious actor.
Which user was granted the owner role in the Persistence: IAM Anomalous Grant finding with the latest
event time?

The default Compute Engine service account

None of these options

A student user belonging to the qwiklabs.net organization

The external [email protected] user


Submit

Which Persistence: IAM Anomalous Grant finding is a genuine incident?

The finding with the latest event time

The finding with the earlier event time

None of these options

Both findings are genuine incidents

Submit

Task 4. Access the findings in Cloud Logging


In this task, you’ll access the events related to the Security Command Center findings in Cloud
Logging.

1. In the Google Cloud console, in the Navigation menu ( ) click Logging > Logs Explorer.
The Logs Explorer page opens. (You may need to click More Products to expand
the Navigation menu options and locate Logging under Operations.)
2. Copy the following query into the Query builder at the top of the page:
protoPayload.authorizationInfo.permission="resourcemanager.projects.setIamPolicy"
protoPayload.methodName="InsertProjectOwnershipInvite"

This query filters the IAM logs.

3. Click Run query. The query results should display on the Query results pane.
4. In the Query results pane, expand the audit log listed for your project.
5. Click Expand nested fields. All the nested fields contained in the log are made visible.
You can now examine the details of the anomalous request event including information such as:

 authenticationInfo: The email of the user who made the request.


 request: The email identity of the user the anomalous grant was made to.
 requestMetadata: The IP address of the system where the request was made, and the browser user agent of
the web browser that was used.
This information can be vital when investigating whether an event is normal activity or an actual
threat event.
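The same filtered entries can also be read from Cloud Shell instead of the Logs Explorer UI. This sketch uses the identical filter (adjacent lines in a Logging filter are implicitly ANDed):

```shell
# Read the project-ownership-invite audit entries as JSON
gcloud logging read '
  protoPayload.authorizationInfo.permission="resourcemanager.projects.setIamPolicy"
  protoPayload.methodName="InsertProjectOwnershipInvite"' \
  --limit=5 --format=json
```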

Which user account made the request to grant the project owner role to the [email protected] user?

A Google Cloud IAM service account

None of these options


An external @gmail.com account

A student user belonging to the qwiklabs.net organization

Submit

Task 5. Fix the finding


In this task, you’ll remediate the malicious Persistence: IAM Anomalous Grant finding by
removing the project owner role that you had previously assigned to the external user.

1. In the Google Cloud console, in the Navigation menu ( ), click IAM & Admin > IAM.
The IAM page opens.
2. Next to the [email protected] user, click the Edit principal ( ) icon. The Edit
permissions page opens.
3. Click the Delete ( ) icon to delete the owner role.
4. Click Save.
The policy will be updated, and the owner role removed from the [email protected]
user.
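The console remediation above can also be scripted, which helps when responding to many findings at once. A sketch; EXTERNAL_USER_EMAIL stands in for the gmail address you granted access to earlier:

```shell
# Remove the Owner role binding from the external account
gcloud projects remove-iam-policy-binding $DEVSHELL_PROJECT_ID \
  --member=user:EXTERNAL_USER_EMAIL \
  --role=roles/owner
```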

Click Check my progress to verify that you have completed this task correctly.

Fix the finding


Check my progress

CONCLUSION
Great work! Through this lab activity, you have gained practical experience in analyzing a
security alert to determine whether it is a genuine malicious activity.

You did this by granting permissions to an external user, viewing the Event Threat Detection
findings in the Security Command Center, and accessing the findings in Cloud Logging. Finally,
you remediated the finding by removing the project owner role from the external user.

As a security analyst, these are skills that can enable you to quickly take steps to contain,
mitigate, and remediate any threats.
Exp. no: 09 Explore false positives through incident detection

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window)
if you are running the Chrome browser. The lab Sign in page opens in a new browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username 1 below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username 1"



You can also find the Google Cloud username 1 in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials from the left panel. Do not use your Google Cloud credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Cloud Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.

Task 1. Create a service account


Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.

In this task, you’ll create a service account and grant it permissions sufficient to trigger an
anomalous threat finding in SCC.

1. In Google Cloud console, in the Navigation menu ( ), click IAM & Admin > Service
Accounts.
2. In the action bar, click + Create Service Account.
3. In the Service account details section:

 In the Service account name field, type test-account.

Notice the Service account ID automatically populates.

 Click Create and Continue.

Notice the pop-up message “Service account created”.

4. In the Grant this service account access to project section, expand the Select a
role drop-down menu, select Basic, and then select Owner.
5. Click Continue, and then click Done.

Notice the test-account service account listed in the Service accounts list.
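For reference, the same service account and role grant can be created from Cloud Shell. This sketch is not a graded lab step:

```shell
# Create the service account
gcloud iam service-accounts create test-account \
  --display-name="test-account"

# Grant it the project Owner role (the over-broad grant this lab relies on)
gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \
  --member=serviceAccount:test-account@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com \
  --role=roles/owner
```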

Click Check my progress to verify that you have completed this task correctly.

Create a service account


Check my progress

Task 2. Create a JSON authentication key for your service account


Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.

In this task, you’ll create and download a JSON authentication key for the new service account
you created in the previous task. You’ll then use Cloud Shell to upload that key to your Google
Cloud account. This will trigger a threat finding in SCC.

1. Still on the Service Accounts page, inline with the test-account service account,
click Actions ( ) > Manage keys. The test-account page opens.
2. In the Keys section, click Add Key > Create new key.
3. In the Create private key dialog, set the Key type to JSON.
4. Click Create.

The console prompts you to download the key to your local device. Once downloaded,
you’ll use Cloud Shell to upload the key to your Google Cloud (student) account.

5. On your local device, navigate to the key file you just downloaded and rename it test-
account.
6. In the Google Cloud console, click the Activate Cloud Shell ( ) icon.
7. Click Continue.

It should only take a few moments to provision and connect to the Cloud Shell
environment.

8. In the Cloud Shell title bar, click More ( ) > Upload > Choose Files.
9. Navigate to and select the file on your local machine, and then in the Upload dialog,
click Upload.
10. Copy the following command into the Cloud Shell terminal:

ls
This command lists the key file you just uploaded.

11. Press ENTER.


In the test-account page, in the Key list, notice the key you just created with the Key creation
date as the current date.
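The download-and-upload round trip above mimics how user-managed keys end up on workstations. If you only need the key file in Cloud Shell, this one-command sketch creates it there directly (and still produces the same user-managed-key finding):

```shell
# Create a user-managed JSON key directly in the Cloud Shell home directory
gcloud iam service-accounts keys create test-account.json \
  --iam-account=test-account@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com
```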

Click Check my progress to verify that you have completed this task correctly.

Create a JSON authentication key for your service account


Check my progress

Task 3. Trigger the false positive finding


Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.

In this task, you’ll reconfigure the Cloud Shell environment to use the new test-account service
account that you created in Task 1. This will trigger a threat finding in SCC. Then, you’ll assign
excessive permissions to the lab project.

1. Copy the following commands into the Cloud Shell terminal:

export PROJECT_ID=$(gcloud info --format='value(config.project)')
export SA_NAME="test-account@${PROJECT_ID}.iam.gserviceaccount.com"
gcloud auth activate-service-account ${SA_NAME} --key-file=test-account.json

These commands activate the new service account.

2. Press ENTER.
3. Copy the following command into the Cloud Shell terminal:

gcloud auth list

This command confirms that you activated the service account, and that gcloud is using this
service account.

4. Press ENTER.

In the output, the following confirms the service account is active:


Output:

ACTIVE: *
ACCOUNT: test-account@"Google Cloud project ID".iam.gserviceaccount.com
5. Copy the following commands into the Cloud Shell terminal:

export STUDENT2="Google Cloud username 2"
gcloud projects add-iam-policy-binding $PROJECT_ID --member user:$STUDENT2 --role roles/editor

These commands grant the editor role to user 2 so that you can access and remediate the false
positive finding in the next task.

6. Press ENTER.
Click Check my progress to verify that you have completed this task correctly.

Assign excessive permissions to trigger threat detection


Check my progress

Task 4. Sign in as the second user


You'll need to switch Google Cloud accounts by logging into the Google Cloud console using
the second user account provided in the Lab Details panel. You will use this user account to
perform the remaining tasks.

1. In the Google Cloud console, click on the user icon in the top-right corner of the screen,
and then click Add account.
2. Navigate back to the Lab Details panel, copy the Google Cloud username 2: Google
Cloud username 2 and password. Then, paste the username and password into the
Google Cloud console Sign in dialog.

Task 5. View the threat finding in SCC

Note: Make sure you are on the username 2: Google Cloud username 2 Google Cloud console.

In this task, you’ll locate and examine the SCC finding generated by the service Event Threat
Detection. This finding is a false positive that was triggered by the activity you generated in
Tasks 1-3.

To view the Event Threat Detection finding in SCC:

1. In the Navigation menu ( ), click Security > Findings.


2. In the Quick filters pane, locate the Category section, then select User managed
service account key. If necessary, click View more to find it.

The Findings query results panel updates to display only the selected finding category.

3. In the Findings query results panel, display the details of the finding by clicking the
most recent (see Event time) User managed service account key in
the Category column. The details panel for the finding opens and displays
the Summary tab.

Leave the User managed service account key page open to answer the following questions.

1. What is the severity of the alert?

High

Critical

Medium

Low

Submit

2. What is the threat finding class for the alert?

Vulnerability

Misconfiguration

Observation

Threat

Submit

3. When is it important to monitor for threats?

Whenever you access the corporate network on your tablet or smartphone

Whenever your device is on

Whenever your device is connected to the internet

Whenever your device is on site

Submit
4. Which tab in the User managed service account key page provides compliance standards,
explanation of the threat, and a recommendation on how to handle the threat?

Summary

Source Properties

JSON

Submit

Task 6. Fix the finding


Note: Make sure you are on the username 2: Google Cloud username 2 Google Cloud console.

In this task, you'll remediate the false positive by deleting the JSON authentication key for
the test-account service account.

1. In Google Cloud console, in the Navigation menu ( ), click IAM & Admin > Service
Accounts.
2. On the Service accounts page, click the email address of the test-account service
account.
3. Click the Keys tab.
4. From the list of keys, click the Delete service account key ( ) icon to delete the key.
A pop-up will appear asking you to confirm the action. Click Delete.
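The same remediation can be done with gcloud. In this sketch, KEY_ID is a placeholder for the key ID shown by the list command:

```shell
# List the user-managed keys; note the KEY_ID to remove
gcloud iam service-accounts keys list \
  --iam-account=test-account@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com

# Delete the flagged (here: false positive) key
gcloud iam service-accounts keys delete KEY_ID \
  --iam-account=test-account@$DEVSHELL_PROJECT_ID.iam.gserviceaccount.com
```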

Click Check my progress to verify that you have completed this task correctly.

Delete the key


Check my progress

Conclusion
You have completed this lab! You used SCC to investigate a false positive and took action to
remediate it. As a cloud security analyst, you'll likely encounter false positive alerts. It's
important to understand how and why false positive alerts are triggered and how you can take
action to remediate them.

Exp. no: 10 Analyze audit logs using Big Query

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.
2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud
account credentials. Note: Using your own Google Cloud account for this lab may incur extra
charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking
the Navigation menu at the top-left.
Activate Cloud Shell

Cloud Shell is an online development and operations environment accessible anywhere with your
browser. Cloud Shell provides command-line access to your Google Cloud resources.

1. Click Activate Cloud Shell ( ) at the top right of the Google Cloud console. You may
be asked to click Continue.
After Cloud Shell starts up, you'll see a message displaying your Google Cloud Project ID for
this session:

Your Cloud Platform project in this session is set to YOUR_PROJECT_ID


The command-line tool for Google Cloud, gcloud, comes pre-installed on Cloud Shell and
supports tab-completion. In order to access Google Cloud, you'll have to first authorize gcloud.

2. List the active account name with this command:


gcloud auth list

3. A pop-up will appear asking you to Authorize Cloud Shell. Click Authorize.
4. Your output should now look like this:

Output:

ACTIVE: *
ACCOUNT: [email protected]
To set the active account, run:
$ gcloud config set account `ACCOUNT`
5. List the project ID with this command:
gcloud config list project

Example output:

[core]
project = qwiklabs-gcp-44776a13dea667a6
Note: For full documentation of gcloud, in Google Cloud, refer to the gcloud CLI overview
guide.

Task 1. Generate account activity


Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.

In this task, you'll create and delete cloud resources to generate account activity which you'll
access as Cloud Audit Logs.

1. Copy the following commands into the Cloud Shell terminal:

gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID

echo "this is a sample file" > sample.txt

gcloud storage cp sample.txt gs://$DEVSHELL_PROJECT_ID

gcloud compute networks create mynetwork --subnet-mode=auto

export ZONE=$(gcloud compute project-info describe \


--format="value(commonInstanceMetadata.items[google-compute-default-zone])")

gcloud compute instances create default-us-vm \


--machine-type=e2-micro \
--zone=$ZONE --network=mynetwork

gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID



2. Press ENTER.
Click Check my progress to verify that you have completed this task correctly.

Generate account activity


Check my progress

Task 2. Export the audit logs


Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.

The activity you generated in the previous task was recorded as audit logs. In this task you'll
export these logs to a BigQuery dataset for further analysis.
1. In the Google Cloud console, in the Navigation menu ( ) click Logging > Logs
Explorer. The Logs Explorer page opens. (You may need to click More Products to
expand the Navigation menu options and locate Logging under Operations.)
2. When exporting logs, the current filter will be applied to what is exported. Copy the
following query into the Query builder:
logName = ("projects/Project ID/logs/cloudaudit.googleapis.com%2Factivity")

3. Click Run query. The query results should display on the Query results pane. This
query filters for Cloud Audit logs within your project.
4. Under the Query editor field, click More actions > Create sink. The Create logs
routing sink dialog opens.
Note: If your browser window is narrow, the UI may display More instead of More actions.

5. In the Create logs routing sink dialog, specify the following settings and leave all other
settings at their defaults:

 Sink details: For Sink name, enter AuditLogsExport, and then click Next.
 Sink destination: For Select sink service, choose BigQuery dataset. For Select BigQuery
dataset, choose Create new BigQuery dataset. The Create dataset dialog opens.
 Create dataset: For Dataset ID, enter auditlogs_dataset, and then click Create Dataset.
The Create dataset dialog closes, and you'll return to the Sink destination dialog.
Click Next.
 Sink destination: Uncheck the Use Partitioned Tables checkbox, if it is already selected,
and click Next.
 Choose logs to include in sink: Notice the pre-filled Build inclusion filter:
logName=("projects/[PROJECT ID]/logs/cloudaudit.googleapis.com%2Factivity")
Click Next, and then click Create Sink.

Return to the Logs Explorer page.

6. In the Logging navigation pane, click Log Router to view the AuditLogsExport sink in
the Log Router Sinks list.

7. Inline with the AuditLogsExport sink, click More actions ( ) > View sink details to
view information about the AuditLogsExport sink you created. The Sink details dialog
opens.
8. Click Cancel to close the Sink details dialog when you're done viewing the sink
information.
All future logs will now be exported to BigQuery, and the BigQuery tools can be used to perform
analysis on the audit log data. The export does not export existing log entries.
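The sink you built in the dialog can also be created with a single gcloud command. A sketch with PROJECT_ID as a placeholder; the destination dataset must already exist:

```shell
# Route matching audit-log entries to the BigQuery dataset
gcloud logging sinks create AuditLogsExport \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/auditlogs_dataset \
  --log-filter='logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"'
```

Note that when a sink is created this way, you must still grant the sink's writer identity access to the dataset, a step the console flow performs for you.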

Click Check my progress to verify that you have completed this task correctly.

Export the audit logs


Check my progress

Task 3. Generate more account activity

Note: Make sure you are on the username 1: Google Cloud username 1 Google Cloud console.

In this task, you'll create and delete cloud resources to generate additional account activity which
you'll then access in BigQuery to extract additional insights from the logs.

1. Copy the following commands into the Cloud Shell terminal:

gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID


gcloud storage buckets create gs://$DEVSHELL_PROJECT_ID-test

echo "this is another sample file" > sample2.txt

gcloud storage cp sample2.txt gs://$DEVSHELL_PROJECT_ID-test

export ZONE=$(gcloud compute project-info describe \


--format="value(commonInstanceMetadata.items[google-compute-default-zone])")

gcloud compute instances delete --zone=$ZONE \


--delete-disks=all default-us-vm

These commands generate more activity to view in the audit logs exported to BigQuery.

2. Press ENTER.
When prompted, enter Y, and press ENTER. Notice you created two buckets and deleted a
Compute Engine instance.

3. When the prompt appears after a few minutes, continue by entering the following
commands into the Cloud Shell terminal:
gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID

gcloud storage rm --recursive gs://$DEVSHELL_PROJECT_ID-test



4. Press ENTER.
Notice you deleted both buckets.

Click Check my progress to verify that you have completed this task correctly.

Generate more account activity


Check my progress

Task 4. Sign in as the second user


You'll need to switch Google Cloud accounts by logging into the Google Cloud console using
the second user account provided in the Lab Details panel. You will use this user account to
analyze the logs.
1. In the Google Cloud console, click on the user icon in the top-right corner of the screen,
and then click Add account.
2. Navigate back to the Lab Details panel, copy the Google Cloud username 2: Google
Cloud username 2 and password. Then, paste the username and password into the
Google Cloud console Sign in dialog.
Task 5. Analyze the Admin Activity logs
Note: Make sure you are on the username 2: Google Cloud username 2 Google Cloud console.

In this task, you'll review the Admin activity logs generated in the previous task. Your goal is to
identify and apply filters to isolate logs that may indicate suspicious activity. This will enable
you to export this subset of logs and streamline the process of analyzing them for potential
issues.

Admin Activity logs record the log entries for API calls or other administrative actions that
modify the configuration or metadata of resources. For example, the logs record when VM
instances and App Engine applications are created, and when permissions are changed.

Note: You can view audit log entries in the Logs Viewer, Cloud Logging, and in the Cloud SDK.
You can also export audit log entries to Pub/Sub, BigQuery, or Cloud Storage.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select Logging > Logs Explorer. The Logs Explorer page opens. (You may need to
expand the More Products drop-down menu within the Navigation menu and locate
Logging under Operations.)
3. Ensure that the Show query toggle button is activated. This opens the Query
builder field.
4. Copy and paste the following command into the Query builder field. Notice your
Google Cloud project ID in the command.
logName = ("projects/"PROJECT_ID"/logs/cloudaudit.googleapis.com%2Factivity")

5. Click Run query.


6. In the Query results, locate the log entry indicating that a Cloud Storage bucket was deleted; it will contain the storage.buckets.delete summary field. Summary fields are included in the log results to highlight important information about the log entry.
This entry refers to storage.googleapis.com, which calls the storage.buckets.delete method to
delete a bucket. The bucket name is the same name as your project id: PROJECT_ID.

7. Within this entry, click on the storage.googleapis.com text, and select Show matching entries. The Query results should now display only six entries related to created and deleted Cloud Storage buckets.
8. In the Query editor field, notice that the protoPayload.serviceName="storage.googleapis.com" line was added to the Query builder. This filters your query to only entries matching storage.googleapis.com.
9. Within those query results, click storage.buckets.delete in one of the entries, and
select Show matching entries.
Notice another line was added to the Query builder text:

logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.serviceName="storage.googleapis.com"
protoPayload.methodName="storage.buckets.delete"
The Query results should now display all entries related to deleted Cloud Storage buckets. You
can use this technique to easily locate specific events.
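The way the Query builder grows, with one clause appended per refinement and all clauses combined with an implicit AND, can be sketched in Python (the helper name and project ID below are illustrative, not part of the lab):

```python
# Sketch of how Logs Explorer combines filter clauses: each line in the
# Query builder is implicitly AND-ed with the others.
def build_activity_filter(project_id, service=None, method=None):
    """Build an Admin Activity log filter, optionally narrowed by
    service name and method name (hypothetical helper)."""
    clauses = [
        f'logName="projects/{project_id}/logs/'
        'cloudaudit.googleapis.com%2Factivity"'
    ]
    if service:
        clauses.append(f'protoPayload.serviceName="{service}"')
    if method:
        clauses.append(f'protoPayload.methodName="{method}"')
    # Newline-separated clauses behave as an implicit AND in Logs Explorer.
    return "\n".join(clauses)

print(build_activity_filter(
    "my-project",
    service="storage.googleapis.com",
    method="storage.buckets.delete",
))
```

Each call to Show matching entries corresponds to passing one more argument to a helper like this: the base logName clause plus the service and method refinements reproduce the three-line query shown above.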

10. In the Query results, expand a storage.buckets.delete event by clicking the expand
arrow > next to the line:

11. Expand the authenticationInfo field by clicking the expand arrow > next to the line:

Notice the principalEmail field, which displays the email address of the user account that performed this action: the user 1 account you used to generate the user activity.

Task 6. Use BigQuery to analyze the audit logs

Note: Make sure you are using the Google Cloud console as Google Cloud username 2.

You've generated and exported logs to a BigQuery dataset. In this task, you'll analyze the logs
using the Query editor.

Note: When you export logs to a BigQuery dataset, Cloud Logging creates dated tables to hold
the exported log entries. Log entries are placed in tables whose names are based on the entries'
log names.
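The dated-table naming can be illustrated with a small sketch: non-alphanumeric characters in the log name become underscores and a YYYYMMDD suffix is appended (an approximation of the convention for illustration, not an official API):

```python
from datetime import date

def dated_table_name(log_name, day):
    """Approximate the BigQuery table name Cloud Logging uses for
    exported entries of log_name on a given day (illustrative only)."""
    sanitized = (log_name.replace("%2F", "/")
                         .replace(".", "_")
                         .replace("/", "_"))
    return f"{sanitized}_{day.strftime('%Y%m%d')}"

# The Admin Activity log lands in tables such as:
print(dated_table_name("cloudaudit.googleapis.com/activity", date(2024, 3, 15)))
# cloudaudit_googleapis_com_activity_20240315
```

This is why the queries later in this task use a wildcard table name ending in an underscore: it matches every dated table for that log.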

1. In the Google Cloud console, click the Navigation menu ( ).


2. Click BigQuery.
Note: The Welcome to BigQuery in the Cloud Console message box appears providing links to
the quickstart guide and release notes for UI updates. Click Done to proceed.

3. In the Explorer pane, click the expand arrow beside your project, Google Cloud project
ID. The auditlogs_dataset dataset is displayed.
Note: If auditlogs_dataset is not displayed, reload your browser window.

Next, verify that the BigQuery dataset has appropriate permissions to allow the export writer to
store log entries.

4. Click the auditlogs_dataset dataset.


5. In the auditlogs_dataset toolbar, click the Sharing dropdown menu, and
select Permissions.
6. On the Share permission for "auditlogs_dataset" page, expand the BigQuery Data
Editor section.
7. Confirm that the service account used for log exports is a listed permission. The service
account is similar to: [email protected]

This permission is assigned automatically when log exports are configured so this is a
useful way to check that log exports have been configured.

8. Click Close to close the Share Dataset window.


9. In the Explorer pane, click the expander arrow next to the auditlogs_dataset dataset to view the cloudaudit_googleapis_com_activity table. This table contains your exported logs.
10. Select the cloudaudit_googleapis_com_activity table. The table schema displays. Take a moment to review the table schema and details.
11. Expand the Query drop-down menu and select In new tab.

12. In the Untitled tab of the query builder, delete any existing text and copy and paste the
following command:
SELECT
timestamp,
resource.labels.instance_id,
protopayload_auditlog.authenticationInfo.principalEmail,
protopayload_auditlog.resourceName,
protopayload_auditlog.methodName
FROM
`auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND
CURRENT_DATE()
AND resource.type = "gce_instance"
AND operation.first IS TRUE
AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
ORDER BY
timestamp,
resource.labels.instance_id
LIMIT
1000;

This query returns the users that deleted virtual machines in the last 7 days.

13. Click Run.


After a couple of seconds, BigQuery will return each time a user deleted a Compute Engine
virtual machine within the past 7 days. You should notice one entry, which is the activity you
generated in the previous tasks as user 1. Remember, BigQuery shows only the activity that
occurred after you created the export.
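The _TABLE_SUFFIX predicate in the query restricts the wildcard scan to dated tables from the past week. The same window can be computed outside SQL; this Python sketch mirrors the DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) to CURRENT_DATE() logic (the function name is illustrative):

```python
from datetime import date, timedelta

def table_suffix_window(today, days=7):
    """Return the (start, end) _TABLE_SUFFIX strings equivalent to
    DATE_SUB(CURRENT_DATE(), INTERVAL days DAY) .. CURRENT_DATE()."""
    start = today - timedelta(days=days)
    return start.strftime("%Y%m%d"), today.strftime("%Y%m%d")

lo, hi = table_suffix_window(date(2024, 3, 15))
print(lo, hi)  # 20240308 20240315
```

Narrowing the suffix range like this limits how many dated tables BigQuery scans, which keeps query cost down as the export accumulates history.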

14. Replace the previous query in the Untitled tab with the following:
SELECT
timestamp,
resource.labels.bucket_name,
protopayload_auditlog.authenticationInfo.principalEmail,
protopayload_auditlog.resourceName,
protopayload_auditlog.methodName
FROM
`auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND
CURRENT_DATE()
AND resource.type = "gcs_bucket"
AND protopayload_auditlog.methodName = "storage.buckets.delete"
ORDER BY
timestamp,
resource.labels.bucket_name
LIMIT
1000;

This query returns the users that deleted Cloud Storage buckets in the last 7 days. You should notice two entries, which correspond to the activity you generated in the previous tasks as user 1.

15. Click Run.


The ability to analyze audit logs in BigQuery is very powerful. In this activity, you viewed just
two examples of querying audit logs.

Click Check my progress to verify that you have completed this task correctly.


Conclusion

Great work! You have successfully run queries in the Logs Explorer. You then exported logs and created a dataset that you analyzed in BigQuery.

You have shown how you can filter audit logs for types of potentially malicious activity and then further analyze those logs in BigQuery to investigate threats.

Exp. no: 11 Recover VMs with Google Backup and DR Service

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.

Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Task 1. Connect to the Backup and DR console


Before you can begin implementing recovery actions, you'll first need to connect to the Backup
and DR console.

1. In the Google Cloud console, click the Navigation menu ( ) > Backup and DR. (You
will have to click More Products and then scroll down to find Backup and DR in
the Operations section).
2. From the left navigation pane, click Management console.
3. In the Log in to the management console section, click Log in to the management
console.
4. If asked to Choose an account, click your Google Cloud Username: USERNAME.
5. Skip the Welcome to Google Backup and DR! tour. The Backup and DR management
console opens.
6. In the Backup and DR management console titlebar, click Manage > Appliances.
If the management server and the Backup and Recovery server are successfully installed, the
Connectivity status has a green check.

Note: If the Update Status is Pending (yellow exclamation point), an update is waiting for
installation. You can ignore this and continue to your next task.

Task 2. Create a backup plan template

In this task, you’ll create a backup plan template.


Backup plan templates are composed of backup policies. In policies, you define when to run a
backup, how frequently to run a backup, and how long to retain the backup image for
— Days, Weeks, Months, or Years.
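As a mental model, a continuous policy like the one you are about to create (a snapshot every 2 hours) simply produces snapshots at a fixed interval. A minimal Python sketch, with illustrative class and field names rather than the Backup and DR API:

```python
from datetime import datetime, timedelta

class SnapshotPolicy:
    """Toy model of a continuous snapshot backup policy."""

    def __init__(self, name, every_hours):
        self.name = name
        self.interval = timedelta(hours=every_hours)

    def next_runs(self, start, count):
        """Return the next `count` snapshot times after `start`."""
        return [start + self.interval * i for i in range(1, count + 1)]

policy = SnapshotPolicy("Daily VM snapshot", every_hours=2)
for t in policy.next_runs(datetime(2024, 1, 1, 0, 0), 3):
    print(t)  # 02:00, 04:00, 06:00 on Jan 1, 2024
```

Retention (Days, Weeks, Months, or Years) would then govern how long each of these snapshot images is kept before expiry.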

1. In the Backup and DR management console titlebar, click Backup Plans > Templates,
and then click +Create Template.
2. In the Template field, set the template name to vm-backup.
Note: Template names are text strings. The only allowed special characters are spaces,
underscores (_), and dashes (-).

3. In the Description field, type Virtual Machine Backups.


4. In the Policies box, next to Snapshot, click + Add to add a production-to-snapshot backup policy.
Note: If the Policies box is not displayed, scroll to the right or expand the browser window.

The Production to Snapshot dialog opens.

5. In the Create/Edit Policy section, set the following fields and leave all other settings at
their defaults:

Field         Value
Policy Name   Daily VM snapshot
Scheduling    Continuous
Every         2 Hour(s)

Note: The Scheduling policy type can be either Windowed or Continuous. The default
is Windowed:

• Windowed defines a discrete snapshot backup schedule adhering to a specific frequency and time window.
• Continuous defines a continuous snapshot backup schedule.

6. Click Create Policy.


7. Click Save Template.
8. Click Okay to acknowledge template creation Success.
Keep the Backup and DR management console open in a new tab for the entire lab.

Click Check my progress to verify that you have completed this task correctly.


Task 3. Validate the backup and recovery appliance service account permissions

In this task, you'll view the required IAM roles of the backup/recovery appliance to verify that it has the correct IAM roles.

An appliance is a hardware or software device that is designed to perform a specific task.


Security appliances are used to protect networks from unauthorized access, attacks, and data
breaches.

Every appliance has a dedicated service account attached to it—that was created during
appliance deployment in the project where the appliance was deployed. For appliances installed
on version 11.0.2 and higher, a corresponding cloud credential for this service account is
automatically created at the time of an appliance deployment.

The name of the cloud credential is based on the appliance name followed by the suffix -sa. For example, if the name of the backup/recovery appliance is bur-appliance-us-east1, then the name of the appliance's corresponding cloud credential is bur-appliance-us-east1-sa.
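That naming convention is simple enough to express directly (a trivial sketch of the rule stated above; the function name is illustrative):

```python
def credential_name(appliance_name):
    """Derive the cloud credential name for an appliance: the
    appliance name followed by the -sa suffix."""
    return appliance_name + "-sa"

print(credential_name("bur-appliance-us-east1"))  # bur-appliance-us-east1-sa
```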

To view and verify the required IAM role:

1. Return to the Google Cloud console, in the Navigation menu ( ), click IAM &
Admin > IAM.
2. In the Name column, find the service account attached to your backup appliance, the
service account's name should be Service account for backup and recovery appliance.
3. In the Role column, notice that the Backup and DR Cloud Storage Operator role is
already assigned.
Task 4. Discover and add Compute Engine instances to the management console

In this task, you’ll use the onboarding wizard to onboard your Compute Engine instances.
Onboarding an instance means you attach the template to the instance.

1. Return to the Backup and DR management console.


2. From the titlebar, click Backup and Recover > Back Up.
3. In the Google Cloud section, click Compute Engine.
4. Under Credential, select backup, and click Next.
The Project ID and Zone drop-down options are populated with details from the appliance that
maps to the workflow credential.

5. Click Search.
The results are listed in the search results. You may have to scroll down to view them:

 lab-vm
 qwiklabs-appliance
6. Select the lab-vm Compute Engine instance for backup, and then click Next.
Note: If no instances or only one instance appears, ensure that the selected zone matches the zone where your Compute Engine instance (lab-vm) is located or running.

7. In the Enable backups for Compute Engine VM instances? page, select the lab-
vm and then set the following:

 Action: From the drop-down menu, select Apply a backup template.


 Backup template: From the drop-down menu, select vm-backup.
8. Click OK.
9. Click Next.
A Summary of changes screen appears and provides the following information:

 Instance Name: lab-vm


 Appliance: qwiklabs-appliance
 Action: Apply a backup template

10. Click Finish to complete the onboarding process. This triggers the backup of the selected Compute Engine instances based on the Policy Template you attached.
11. Click Finish to confirm your intent to finish.
After onboarding is complete the Status is a green check. This means the policy template is
attached to the selected VM.

Note: Backup and DR ensures that the chosen Compute Engine instances get backed up at the
frequency you set in the backup policy.

13. In the Backup and DR management console titlebar, click Monitor > Jobs.
You can monitor the progress of the backup job. When the job is finished, you have an image
that you can restore if needed.

If the jobs list is empty, the backup job has either not started or is already completed. Use
different filter options to populate the jobs list, for example Succeeded or All filter options.
Filter results are listed in the Jobs list.

Note: The job may take five minutes or longer to finish.

Click Check my progress to verify that you have completed this task correctly.


Task 5. Restore a Compute Engine instance

Now that you have an image of your Compute Engine instance, in this task, you’ll create a brand
new Compute Engine instance using the backup image that you created in the previous task.

1. From the Backup and DR management console titlebar, click Backup &
Recover > Recover.
2. Click the name of the Compute Engine instance you want to recover (lab-vm) to select it.
Click Next.
3. In the action bar, click Table. In the Images list, one image is displayed because only one backup image has been created.
4. Select the image and click Mount.
Note: Typically, the Mount panel has many selection choices that allow you to choose where
and how to restore an image. In this lab, you may have only one timeline option as you just
created the first backup.

5. Under Mount, select Mount as new GCE instance.


6. Review the configuration options and then update the following:

 Region: Change this to REGION


 Zone: Change this to ZONE
 Instance name: lab-vm-recovered
7. Scroll to the bottom of the page and click Mount.
8. On the Success dialog, click Go to Job Monitor.
9. In the filter pane, in the Status section, uncheck Running. Two jobs are displayed: an earlier one with a Succeeded status, and the one you just started with a Running status.
When both jobs have a Succeeded status, you have recovered the Compute Engine instance.

Note: The job may take five minutes or longer depending on the region you selected.
To view the recovered VM, go to the Google Cloud console, in Navigation menu ( ),
click Compute Engine > VM instances to view three VM instances:

 lab-vm
 lab-vm-recovered
 qwiklabs-appliance
Click Check my progress to verify that you have completed this task correctly.


Task 6. Restore a Compute Engine instance to an alternate project


In this task, you'll restore a Compute Engine instance using the backup template you created, but this time to a different project.

You can also create a brand new Compute Engine instance in a different project from backup
images.

Note: Before you set the default service account as a Principal in a different project, you must
add the default service account as a Principal in the target project.

To restore a Compute Engine instance to an alternate project, you first add the service account of project 1 as a principal to Google Cloud project 2 and then recover the instance in Google Cloud project 2:

1. In the Google Cloud console, in the Navigation menu ( ), click IAM & Admin > IAM.
2. In the list of principals, find and copy the email of the Service account for backup and
recovery appliance to use in Step 6. The email is similar to the following: qwiklabs-
[email protected].
3. In the Google Cloud console, click the Project selection drop-down. If the project lists
only one project, click All to open the All tab.
4. Search for Google Cloud project ID 2: PROJECT_ID_2 and then click to select that project ID. You are now on the Permissions page for Google Cloud project ID 2: PROJECT_ID_2.
5. Click Grant access.
6. In the Add principals section, in the New principals field, paste the email address of the service account of Google Cloud project 1, named Service account for backup and recovery appliance. It should still be in your clipboard.
7. In the Assign roles section:

 Click Select a role and assign the Backup and DR > Backup and DR Compute
Engine Operator role.
 Click +Add Another Role.
 Click Select a role and assign the Backup and DR > Backup and DR Cloud
Storage Operator role.
8. Click Save.
You've added the service account of Google Cloud project 1 as a principal to Google Cloud project 2. You can now recover the instance in Google Cloud project 2.
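The two roles granted above are exactly what the cross-project restore needs. As a sketch, a pre-flight check could compare the roles granted to the appliance service account in the target project against that required set. The role IDs below are assumptions inferred from the console role names "Backup and DR Compute Engine Operator" and "Backup and DR Cloud Storage Operator", not verified identifiers:

```python
# Assumed role IDs for the two console roles granted in step 7.
REQUIRED_ROLES = {
    "roles/backupdr.computeEngineOperator",
    "roles/backupdr.cloudStorageOperator",
}

def missing_roles(granted_roles):
    """Return the required roles not yet granted to the principal."""
    return REQUIRED_ROLES - set(granted_roles)

# If only the storage role were granted, the compute role would be flagged.
print(missing_roles({"roles/backupdr.cloudStorageOperator"}))
```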

9. From the Backup and DR management console, navigate to Backup &


Recover > Recover.
10. Select lab-vm (the Compute Engine instance you want to recover), and click Next.
Note: The green check next to the instance means that instance has been backed up, the red X
means it hasn't.

11. In the action bar, click Table.


12. In the Images list, select the top image and then click Mount.
13. Under Mount, select Mount as new GCE instance.
14. Review the configuration and update the following options:

 Project: Change this to PROJECT_ID_2 to simulate recovering to a different project in Google Cloud.
 Instance name: Notice you can use the same instance name because you are in a
different Google Cloud project. Update the instance name to lab-vm-project2.
 Region: Change this to REGION.
 Zone: Change this to ZONE.
15. Select Mount at the bottom of the panel. A Mount job starts. On the Success dialog,
click Go to Job Monitor to monitor the status of this current job. The job may take five
minutes or longer depending on what region you selected.
16. To view the recovered Compute Engine instance in the Google Cloud console of Google
Cloud project 2, in the Navigation menu, click Compute Engine > VM instances.
Note: The job may take five minutes or longer to finish.

Click Check my progress to verify that you have completed this task correctly.


Conclusion

Great work! You successfully used Google Backup and DR Service to create a backup template
and then applied it to two Compute Engine instances.
You have shown how to prepare for issues with VMs and the service. When a device malfunctions, you can use Backup and DR Service to restore malfunctioning devices across multiple Google Cloud projects.

Exp. no: 12 Respond and recover from a data breach

Date:

How to start your lab and sign in to the Google Cloud console

1. Click the Start Lab button. On the left is the Lab Details panel with the following:

 Time remaining
 The Open Google Cloud console button
 The temporary credentials that you must use for this lab
 Other information, if needed, to step through this lab

Note: If you need to pay for the lab, a pop-up opens for you to select your payment
method.

2. Click Open Google Cloud console (or right-click and select Open Link in Incognito
Window) if you are running the Chrome browser. The Sign in page opens in a new
browser tab.

Tip: You can arrange the tabs in separate, side-by-side windows to easily switch between
them.

Note: If the Choose an account dialog displays, click Use Another Account.

3. If necessary, copy the Google Cloud username below and paste it into the Sign
in dialog. Click Next.

"Google Cloud username"



You can also find the Google Cloud username in the Lab Details panel.

4. Copy the Google Cloud password below and paste it into the Welcome dialog.
Click Next.
"Google Cloud password"

You can also find the Google Cloud password in the Lab Details panel.

Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.

Note: Using your own Google Cloud account for this lab may incur extra charges.

5. Click through the subsequent pages:

 Accept the terms and conditions


 Do not add recovery options or two-factor authentication (because this is a
temporary account)
 Do not sign up for free trials
After a few moments, the Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Task 1. Analyze the data breach and gather information

One morning, the security team detects unusual activity within their systems. Further
investigation into this activity quickly reveals that the company has suffered a massive security
breach across its applications, networks, systems, and data repositories. Attackers gained
unauthorized access to sensitive customer information, including credit card data, and personal
details. This incident requires immediate attention and thorough investigation. The first step
towards understanding the scope and impact of this breach is to gather information and analyze
the available data.
In this task, you'll examine the vulnerabilities and findings in Google Cloud Security Command
Center to determine how the attackers gained access to the data, and which remediation steps to
take.

Important: The vulnerabilities listed in this section rely on specific security checks being run
beforehand. If some checks haven't run yet, the related vulnerabilities might not appear in the
Security Command Center when you complete the steps in this section. Don't worry though! You
can still use the information provided in this task to analyze the available findings and proceed
with the remediation steps in the tasks that follow.

First, navigate to the Security Command Center to view an overview of the active
vulnerabilities.

1. In the Google Cloud console, in the Navigation menu ( ), click Security > Risk
Overview. The Security Command Center Overview page opens.
2. Scroll down to Active vulnerabilities. This provides an overview of current security
vulnerabilities or issues that need attention within the Google Cloud environment.
3. Select the Findings By Resource Type tab. The security findings or vulnerabilities are organized based on the type of cloud resource affected (e.g., instances, buckets, databases). By reviewing active vulnerabilities and findings by resource type, you can prioritize and address security issues effectively.

You'll note that there are both high and medium severity findings relating to the Cloud Storage
bucket, the Compute Instance virtual machine, and the firewall.

Which three resource types are listed with high severity findings?

Bucket, Subnetwork, and ServiceAccountKey


Bucket, compute.Instance, and Firewall

Network, Subnetwork, and compute.Instance

Network, Firewall, and Bucket


Next, navigate to the PCI DSS report.

4. In the Security Command Center menu, click Compliance. The Compliance page
opens.
5. In the Google Cloud compliance standards section, click View details in the PCI DSS
3.2.1 tile. The PCI DSS 3.2.1 report opens.
6. Click on the Findings column to sort the findings and display the active findings at the
top of the list.
Note: Make sure to follow these steps to assess the PCI report, and do not refresh the page, as
the required filters will be removed, and the correct information won't be displayed.

The Payment Card Industry Data Security Standard (PCI DSS) is a set of security requirements
that organizations must follow to protect sensitive cardholder data. As a retail company that
accepts and processes credit card payments, Cymbal Retail must also ensure compliance with the
PCI DSS requirements, to protect cardholder data.

As you examine the PCI DSS 3.2.1 report, notice that it lists the rules that are non-compliant,
which relate to the data breach:

 Firewall rule logging should be enabled so you can audit network access: This medium
severity finding indicates that firewall rule logging is disabled, meaning that there is no record of
which firewall rules are being applied and what traffic is being allowed or denied. This is a
security risk as it makes it difficult to track and investigate suspicious activity.
 Firewall rules should not allow connections from all IP addresses on TCP or UDP port
3389: This high severity finding indicates that the firewall is configured to allow Remote
Desktop Protocol (RDP) traffic for all instances in the network from the whole internet. This is a
security risk as it allows anyone on the internet to connect to the RDP port on any instance in the
network.
 Firewall rules should not allow connections from all IP addresses on TCP or SCTP port 22:
This high severity finding indicates that the firewall is configured to allow Secure Shell (SSH)
traffic to all instances in the network from the whole internet. SSH is a protocol that allows
secure remote access to a computer. If an attacker can gain access to a machine through SSH,
they could potentially steal data, install malware, or disrupt operations.
 VMs should not be assigned public IP addresses: This high severity finding indicates that a
particular IP address is actively exposed to the public internet and is potentially accessible to
unauthorized individuals. This finding is considered a potential security risk because it could
allow attackers to scan for vulnerabilities or launch attacks on the associated resource.
 Cloud Storage buckets should not be anonymously or publicly accessible: This high severity
finding indicates that there is an Access Control List (ACL) entry for the storage bucket that is
publicly accessible which means that anyone on the internet can read files stored in the bucket.
This is a high-risk security vulnerability that needs to be prioritized for remediation.
 Instances should not be configured to use the default service account with full access to all
Cloud APIs: This medium severity finding indicates that a particular identity or service account
has been granted full access to all Google Cloud APIs. This finding is considered a significant
security risk because it grants the identity or service account the ability to perform any action
within the Google Cloud environment, including accessing sensitive data, modifying
configurations, and deleting resources.
Since you're focusing on identifying and remediating the issues related to the security incident,
please disregard the following findings as they do not relate to the remediation tasks you’re
completing:

 VPC Flow logs should be Enabled for every subnet VPC Network: There are a number of
low severity findings for Flow Logs disabled. This indicates that Flow Logs are not enabled for a
number of subnetworks in the Google Cloud project used for this lab. This is a potential security
risk because Flow Logs provide valuable insights into network traffic patterns, which can help
identify suspicious activity and investigate security incidents.
Note: Enabling logging for cloud resources is important in maintaining observability. However,
you will not remediate this finding in this lab activity as the subnetworks are part of this lab
environment. As a result, this finding will still be visible on the report after you have completed
the remediation tasks.

 Basic roles (Owner, Writer, Reader) are too permissive and should not be used: This
medium severity finding indicates that primitive roles are being used within the Google Cloud
environment. This is a potential security risk because primitive roles grant broad access to a wide
range of resources.
 An egress deny rule should be set: This low severity finding indicates that no egress deny rule
is defined for the monitored firewall. This finding raises potential security concerns because it
suggests that outbound traffic is not restricted, potentially exposing sensitive data or allowing
unauthorized communication.
The following table pairs the rules listed in the report with their corresponding findings category.
This will assist you when examining the findings according to resource type later:

Findings category | Rule
Firewall rule logging disabled | Firewall rule logging should be enabled so you can audit network access
Open RDP port | Firewall rules should not allow connections from all IP addresses on TCP or UDP port 3389
Open SSH port | Firewall rules should not allow connections from all IP addresses on TCP or SCTP port 22
Public IP address | VMs should not be assigned public IP addresses
Public bucket ACL | Cloud Storage buckets should not be anonymously or publicly accessible
Full API access | Instances should not be configured to use the default service account with full access to all Cloud APIs
Flow logs disabled | VPC Flow logs should be Enabled for every subnet VPC Network
Primitive roles used | Basic roles (Owner, Writer, Reader) are too permissive and should not be used
Egress deny rule not set | An egress deny rule should be set

Overall, these findings indicate a critical lack of security controls and non-compliance with
essential PCI DSS requirements; they also point to the vulnerabilities associated with the data
breach.

Next, navigate to the Security Command Center, and filter the findings for further examination
and analysis of the vulnerabilities in the Google Cloud environment.
7. In the Google Cloud console, in the Navigation menu ( ), click Security > Findings.
The Findings page opens.
8. In the Quick filters panel, in the Resource Type section, select the checkbox for
the Google Cloud storage bucket resource type.
The following active findings pertaining to the storage bucket should be listed:

 Public bucket ACL: This finding is listed in the PCI DSS report, and indicates that anyone with
access to the internet can read the data stored in the bucket.
 Bucket policy only disabled: This indicates that there is no explicit bucket policy in place to
control who can access the data in the bucket.
 Bucket logging disabled: This indicates that there is no logging enabled for the bucket, so it will
be difficult to track who is accessing the data.
These findings indicate that the bucket is configured with a combination of security settings that
could expose the data to unauthorized access. You'll need to remediate these findings by
removing the public access control list, preventing public access to the bucket, and enabling
uniform bucket-level access.

Note: Enabling logging for cloud resources is important in maintaining observability. However,
you will not remediate the Bucket logging disabled finding in this lab activity as this would
require working with multiple projects. As a result, this finding will still be visible after you have
completed the remediation tasks.

9. In the Quick filters panel, in the Resource Type section, uncheck Google Cloud
storage bucket, and select the checkbox for the Google compute instance resource type.
The following active findings that pertain to the virtual machine named cc-app-01 should be
listed:

 Malware bad domain: This finding indicates that a domain known to be associated with
malware was accessed from the google.compute.instance named cc-app-01. Although this
finding is considered to be of low severity, it indicates that malicious activity has occurred on the
virtual machine instance and that it has been compromised.
 Compute secure boot disabled: This medium severity finding indicates that secure boot is
disabled for the virtual machine. This is a security risk as it allows the virtual machine to boot
with unauthorized code, which could be used to compromise the system.
 Default service account used: This medium severity finding indicates that the virtual machine is
using the default service account. This is a security risk as the default service account has a high
level of access and could be compromised if an attacker gains access to the project.
 Public IP address: This high severity finding is listed in the PCI DSS report and indicates that
the virtual machine has a public IP address. This is a security risk as it allows anyone on the
internet to connect to the virtual machine directly.
 Full API access: This medium severity finding is listed in the PCI DSS report, and indicates that
the virtual machine has been granted full access to all Google Cloud APIs.
These findings indicate that the virtual machine was configured in a way that left it very
vulnerable to attack. To remediate these findings, you'll shut down the original VM (cc-app-01)
and create a new VM (cc-app-02) from a clean snapshot of the disk. The new VM will have the
following settings in place:

 No compute service account


 Firewall rule tag for a new rule for controlled SSH access
 Secure boot enabled
 Public IP address set to None

10. In the Time range field, expand the drop-down, and select Last 30 days. This will
ensure the list includes findings for the last 30 days.
11. In the Quick filters panel, in the Resource Type section, uncheck Google compute
instance, and select the checkbox for the Google compute firewall resource type.
The following active findings should be listed that pertain to the firewall:

 Open SSH port: This high severity finding indicates that the firewall is configured to allow
Secure Shell (SSH) traffic to all instances in the network from the whole internet.
 Open RDP port: This high severity finding indicates that the firewall is configured to allow
Remote Desktop Protocol (RDP) traffic to all instances in the network from the whole internet.
 Firewall rule logging disabled: This medium severity finding indicates that firewall rule
logging is disabled. This means that there is no record of which firewall rules are being applied
and what traffic is being allowed or denied.
These findings are all listed in the PCI DSS report and highlight a significant security gap in the
network's configuration. The lack of restricted access to RDP and SSH ports, coupled with
disabled firewall rule logging, makes the network highly vulnerable to unauthorized access
attempts and potential data breaches. You'll need to remediate these by removing the existing
overly broad firewall rules, and replacing them with a firewall rule that allows SSH access only
from the addresses that are used by Google Cloud's IAP SSH service.

Now that you have analyzed the security vulnerabilities, it’s time to work on remediating the
report findings.
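The same findings can also be examined from Cloud Shell instead of the console's Quick filters. The sketch below is a minimal, illustrative example, not part of the lab's required steps: the organization ID is a hypothetical placeholder, and it assumes the Security Command Center API is enabled for your organization.

```shell
# Hypothetical organization ID -- substitute your own numeric org ID.
ORG_ID="123456789012"

# List every active finding in the organization, across all sources.
gcloud scc findings list "organizations/${ORG_ID}" \
  --filter='state="ACTIVE"'

# Narrow the list to a single Security Health Analytics category,
# e.g. the public bucket ACL finding called out in the PCI DSS report.
gcloud scc findings list "organizations/${ORG_ID}" \
  --filter='state="ACTIVE" AND category="PUBLIC_BUCKET_ACL"'
```

Filtering by category is often the quickest way to confirm, after remediation, that a specific finding has moved out of the active state.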

Which of the following findings are listed as high severity findings?

Public bucket ACL, Public IP address, Open SSH port, and Open RDP port

Public IP address, Default service account used, Full API access, and Firewall rule logging
disabled

Bucket policy only disabled, Bucket logging disabled, Malware bad domain, and Compute
secure boot disabled

Firewall rule logging disabled, Compute secure boot disabled, Public IP address, and Bucket
logging disabled

Submit
Task 2. Fix the Compute Engine vulnerabilities

In this task, you'll shut down the vulnerable VM cc-app-01, and create a new VM from a
snapshot taken before the malware infection. VM snapshots are effective in restoring a system
to a clean state, and ensure that the new VM will not be infected with the same malware that
compromised the original VM.

1. In the Google Cloud console, click the Navigation menu ( ).


2. Select Compute Engine > VM instances. The VM instances page opens.
The current VM cc-app-01 should be listed under VM instances. This is the vulnerable VM that
has been compromised and must be shut down.

3. Select the checkbox for the cc-app-01 VM.


4. Click Stop.
5. A pop-up will appear asking you to confirm that the VM should be stopped; click Stop.
Click Check my progress to verify that you have completed this task correctly.

Shut down the vulnerable VM


Check my progress

Next, create a new VM from a snapshot. This snapshot has already been created as part of
Cymbal Retail's long term data backup plan.

6. In the action bar, click + Create instance.


7. In the Name field, type cc-app-02.
8. In the Machine type section, expand the drop-down, select Shared-core, and then
select e2-medium.
9. Click OS and storage.
10. In the Operating system and storage section, click Change. The Boot disk dialog
opens.
11. Select the Snapshots tab.
12. Expand the Snapshot drop-down menu, and select cc-app01-snapshot.
13. Click Select.
14. Click Security.
15. In the Identity and API access section, expand the Service accounts drop-down menu,
and select Qwiklabs User Service Account.
16. Click Networking.
17. In the Network tags field, type cc. You'll use this tag to apply firewall rules to this
specific VM.
18. In the Network interfaces section, expand the default network.
19. Expand the External IPv4 address drop-down menu, and select None.
20. Click Create.
The new VM cc-app-02 should now be created from the cc-app01-snapshot. (It may take a few
minutes for the new VM to be created.)
Now, turn Secure Boot on for the new VM cc-app-02 to address the Secure Boot
disabled finding.

21. Select the checkbox for the cc-app-02 VM.


22. Click Stop.
23. A pop-up will appear asking you to confirm that the VM should be stopped; click Stop.
Wait for the cc-app-02 VM to be stopped before you continue.

24. In the VM instances section, click the cc-app-02 link. The cc-app-02 page opens.
25. In the cc-app-02 toolbar, click Edit. The Edit cc-app-02 instance page opens.
26. Scroll down to the Security and access section, and under Shielded VM, select the
checkbox for the Turn on Secure Boot option. This will address the Compute secure
boot disabled finding.
27. Click Save.
28. In the Compute Engine menu, select VM instances.
29. Select the checkbox for the cc-app-02 VM.
30. Click Start/Resume.
31. A pop-up will appear asking you to confirm that the VM should be started; click Start.
The cc-app-02 VM instance will restart and the Compute secure boot disabled finding will be
remediated.
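The console steps in this task can also be scripted with gcloud. The following is a sketch under stated assumptions: the zone (us-central1-a) is illustrative, the VM and snapshot names mirror the lab values, and the service-account selection from step 15 is omitted (it would be set with the --service-account flag). Verify each value against your own environment before running anything like this.

```shell
# Stop the compromised VM (zone is an illustrative placeholder).
gcloud compute instances stop cc-app-01 --zone=us-central1-a

# Create the replacement VM from the clean snapshot: no external IP,
# the "cc" network tag for the new firewall rule, and Secure Boot on.
gcloud compute instances create cc-app-02 \
  --zone=us-central1-a \
  --machine-type=e2-medium \
  --source-snapshot=cc-app01-snapshot \
  --tags=cc \
  --no-address \
  --shielded-secure-boot

# Once the new VM is verified, delete the compromised one.
gcloud compute instances delete cc-app-01 --zone=us-central1-a --quiet
```

Creating the instance with --shielded-secure-boot from the start avoids the extra stop/edit/restart cycle the console requires.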

Click Check my progress to verify that you have completed this task correctly.

Create a new VM from existing snapshot


Check my progress

Challenge: Delete the compromised VM

Delete the compromised VM cc-app-01.

Click Check my progress to verify that you have completed this task correctly.

Delete the compromised VM


Check my progress

By following these steps, you have effectively created a new VM from the snapshot, ensuring it
is free from malware and misconfigurations. You also deleted the compromised VM, eliminating
the source of the security breach.

Task 3. Fix Cloud Storage bucket permissions


In this task, you'll revoke public access to the storage bucket and switch to uniform bucket-level
access control, significantly reducing the risk of data breaches. By removing all user permissions
from the storage bucket, you can prevent unauthorized access to the data stored within.

1. In the Navigation menu ( ), select Cloud Storage > Buckets. The Buckets page opens.
2. Click the project_id_bucket storage bucket link. The Bucket details page opens.
You'll note there is a myfile.csv file in the publicly accessible bucket. This is the file that
contains the sensitive information that was dumped by the malicious actor. Perform the
following steps to address the Public bucket ACL finding.

3. Click the Permissions tab.


4. In the Public access tile, click Prevent public access.
5. Click Confirm.

Challenge: Modify storage bucket access

Switch the access control to uniform and remove permissions for the allUsers principal from
the storage bucket to enforce a single set of permissions for the bucket and its objects. You'll also
need to ensure that users who rely on basic project roles to access the bucket won't lose their
access.
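Under the hood, the console actions in this task map to a few gcloud storage commands. This is a minimal sketch, assuming a hypothetical bucket name and that the role bound to allUsers is roles/storage.objectViewer; check the bucket's actual IAM policy for the real binding before removing it.

```shell
# Hypothetical bucket name -- in this lab it is derived from the project ID.
BUCKET="gs://my-project-id-bucket"

# Enforce public access prevention on the bucket.
gcloud storage buckets update "${BUCKET}" --public-access-prevention

# Switch from fine-grained ACLs to uniform bucket-level access.
gcloud storage buckets update "${BUCKET}" --uniform-bucket-level-access

# Remove the public grant (the role shown is illustrative; remove
# whichever role is actually bound to allUsers in your bucket).
gcloud storage buckets remove-iam-policy-binding "${BUCKET}" \
  --member=allUsers \
  --role=roles/storage.objectViewer

# Preserve access for users who rely on basic project roles by adding
# a convenience binding (project ID shown is a placeholder).
gcloud storage buckets add-iam-policy-binding "${BUCKET}" \
  --member="projectViewer:my-project-id" \
  --role=roles/storage.legacyBucketReader
```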

Click Check my progress to verify that you have completed this task correctly.

Modify storage bucket access.


Check my progress

By following these steps, you have effectively prevented public access to the bucket, switched to
uniform bucket-level access control, and removed all user permissions, addressing the Public
bucket ACL and Bucket policy only disabled findings.

Task 4. Limit firewall ports access

In this task, you'll restrict access to RDP and SSH ports to only authorized source networks to
minimize the attack surface and reduce the risk of unauthorized remote access.

Exercise extreme caution before modifying overly permissive firewall rules. The rules may be
allowing legitimate traffic, and improperly restricting it could disrupt critical operations. In this
lab, ensure the Compute Engine virtual machine instances tagged with target tag "cc" remain
accessible via SSH connections from the Google Cloud Identity-Aware Proxy address range
(35.235.240.0/20). To maintain uninterrupted management access, create a new, limited-access
firewall rule for SSH traffic before removing the existing rule allowing SSH connections from
any address.

Challenge: Restrict SSH access

Create a new firewall rule. This rule must restrict SSH access to only authorized IP addresses
from the source network 35.235.240.0/20 to compute instances with the target tag cc.
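A firewall rule meeting these requirements can be sketched with gcloud as follows. The rule name limit-ports matches the rule referenced later in this lab; the network name (default) is taken from the lab environment, and the other values come directly from the challenge text.

```shell
# Allow SSH (TCP port 22) only from Google Cloud's IAP address range,
# and only to instances carrying the "cc" network tag.
gcloud compute firewall-rules create limit-ports \
  --network=default \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:22 \
  --source-ranges=35.235.240.0/20 \
  --target-tags=cc
```

Creating this rule before deleting default-allow-ssh preserves uninterrupted management access, as the caution above requires.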

Click Check my progress to verify that you have completed this task correctly.

Restrict SSH access


Check my progress

Task 5. Fix the firewall configuration

In this task, you'll delete three specific VPC firewall rules that are responsible for allowing
unrestricted access to certain network protocols, namely ICMP, RDP, and SSH, from any source
within the VPC network. Then, you'll enable logging on the remaining firewall rules.

Challenge: Customize firewall rules

Delete the default-allow-icmp, default-allow-rdp, and default-allow-ssh firewall rules. These


rules are overly broad and by deleting them, you'll allow for a more secure and controlled
network environment.

By deleting these rules, you have restricted access to these protocols, limiting the potential for
unauthorized access attempts and reducing the attack surface of your network.

Customize firewall rules


Check my progress

Challenge: Enable logging

Enable logging for the remaining firewall rules limit-ports (the rule you created in a previous
task) and default-allow-internal.

Enabling logging allows you to track and analyze the traffic that is allowed by this rule, which is
likely to be internal traffic between instances within your VPC.
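Both challenges in this task can be sketched as gcloud commands. The rule names below are the defaults named in the challenge text plus the limit-ports rule from the earlier task; as always, confirm the names in your own project first.

```shell
# Delete the overly broad default rules in one call.
gcloud compute firewall-rules delete \
  default-allow-icmp default-allow-rdp default-allow-ssh --quiet

# Turn on logging for the remaining rules.
gcloud compute firewall-rules update limit-ports --enable-logging
gcloud compute firewall-rules update default-allow-internal --enable-logging
```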

Click Check my progress to verify that you have completed this task correctly.

Enable logging
Check my progress

By customizing firewall rules and enabling logging, you've addressed the Open SSH port, Open
RDP port, and Firewall rule logging disabled findings. The new firewall rule better protects
the network and improves network visibility.

Task 6. Verify compliance

After diligently addressing the vulnerabilities identified in the PCI DSS 3.2.1 report, it's crucial
to verify the effectiveness of your remediation efforts. In this task, you'll run the report again to
ensure that the previously identified vulnerabilities have been successfully mitigated and no
longer pose a security risk to the environment.

1. In the Security Command Center menu, click Compliance. The Compliance page
opens.
2. In the Google Cloud compliance standards section, click View details in the PCI DSS
3.2.1 tile. The PCI DSS 3.2.1 report opens.
3. Click on the Findings column to sort the findings and display the active findings at the
top of the list.
All major vulnerabilities are now resolved.

Note: While you addressed the high and medium severity vulnerabilities, the flow logs remain
disabled for a number of subnetworks. This finding will still be visible on the report after you
have completed the remediation tasks, as this relates to this lab environment.

CONCLUSION

You have helped the security team at Cymbal Bank to mitigate the impact of the data breach,
address the identified vulnerabilities, and significantly enhanced the security posture of Cymbal
Bank’s Google Cloud environment.

First, you examined and analyzed the vulnerabilities and findings in Google Cloud Security
Command Center.

Next, you shut the old VM down and created a new VM from a snapshot taken before the
malware infection.

Then, you fixed the cloud storage permissions by revoking public access to the storage bucket
and switching to uniform bucket-level access control. You also removed all user permissions
from the storage bucket.

Next, you fixed the firewall rules by deleting the default-allow-icmp, default-allow-rdp, and
default-allow-ssh firewall rules, and enabling logging for the remaining firewall rules.
Finally, you ran a compliance report to confirm that the vulnerability issues had been
remediated.

Remember, as a security analyst it is crucial to maintain regular security audits and implement
ongoing monitoring practices for continued protection against evolving threats and
vulnerabilities.
