Cloud Computing Unit-V
Standards and Applications
Mahfooz Alam
Assistant Professor
Department of MCA
G. L. Bajaj College of Technology
and Management, Greater Noida
Outlines
1. Security in Clouds:
• Cloud security challenges
• Software as a Service Security
2. Common Standards:
• The Open Cloud Consortium
• The Distributed Management Task Force
• Standards for Application Developers
• Standards for Messaging
• Standards for Security
• End user access to cloud computing,
• Mobile Internet devices and the cloud.
• Hadoop
• MapReduce
• Virtual Box
• Google App Engine
• Programming Environment for Google App Engine
Security Challenges in Cloud Computing
There is no doubt that cloud computing provides various advantages, but there are
also some security issues in cloud computing. Some security issues in cloud
computing are as follows:
1. Data Loss: It is one of the issues faced in cloud computing. This is also known as
data leakage. Since our sensitive data is in the hands of somebody else and we do
not have full control over our database, if the security of the cloud service is
breached by hackers, they may gain access to our sensitive data or personal files.
2. Interference of Hackers and Insecure APIs: It is important to protect the interfaces
and APIs that are used by external users. In cloud computing, some services are
exposed in the public domain, which makes them a vulnerable part of cloud
computing because they may be accessed by third parties. With the help of these
services, hackers can more easily compromise or harm our data.
3. User Account Hijacking: Account hijacking is among the most serious security issues
in cloud computing. If the account of a user or an organization is hijacked by a
hacker, the hacker gains full authority to perform unauthorized activities.
Security Challenges in Cloud Computing
4. Changing Service Provider: Vendor lock-in is also an important security issue in
cloud computing. Many organizations face problems while shifting from one vendor
to another. For example, if an organization wants to shift from AWS Cloud to
Google Cloud Services, it must migrate all of its data, and because the two cloud
services use different techniques and functions, it faces problems on that account
as well. In addition, the charges of AWS may differ from those of Google Cloud.
5. Lack of Skill: Working in the cloud, shifting to another service provider, needing an
extra feature, or not knowing how to use a feature are common problems for IT
companies that do not have skilled employees. Working with cloud computing
therefore requires skilled staff.
6. Denial of Service (DoS) Attack: This type of attack occurs when the system
receives too much traffic. DoS attacks mostly target large organizations such as
the banking sector, government sector, etc. When a DoS attack occurs, data may be
lost, and recovering it requires a great amount of money as well as time.
What are Cloud Security Standards?
Because of the various security dangers facing the cloud, it was essential to establish
guidelines for how work is done there. Cloud security standards offer a thorough
framework for how cloud security is upheld with regard to both the user and the
service provider.
• Cloud security standards provide a roadmap for businesses transitioning from a
traditional approach to a cloud-based approach by providing the right tools,
configurations, and policies required for security in cloud usage.
• It helps to devise an effective security strategy for the organization.
• It also supports organizational goals like privacy, portability, security, and
interoperability.
• Certification with cloud security standards increases trust and gives businesses a
competitive edge.
Need for Cloud Security Standards
• Ensure cloud computing is an appropriate environment: Organizations need to
make sure that cloud computing is the appropriate environment for the
applications as security and mitigating risk are the major concerns.
• To ensure that sensitive data is safe in the cloud: Organizations need a way to
make sure that the sensitive data is safe in the cloud while remaining compliant
with standards and regulations.
• No existing clear standard: Cloud security standards are essential because earlier
there were no clear standards defining what constitutes a secure cloud
environment, making it difficult for cloud providers and cloud users to determine
what needs to be done to ensure a secure environment.
Common Cloud Security Standards
1. NIST (National Institute of Standards and Technology)
NIST is a federal organization in the US that creates metrics and standards to boost
competition in the scientific and technology industries. NIST developed the
Cybersecurity Framework to help organizations comply with US regulations such as the
Federal Information Security Management Act (FISMA) and the Health Insurance
Portability and Accountability Act (HIPAA).
2. ISO-27017
This standard offers further direction in the cloud computing information security
field. Its purpose is to supplement the advice provided in ISO/IEC 27002 and various
other ISO27k standards, such as ISO/IEC 27018 on the privacy implications of cloud
computing and ISO/IEC 27031 on business continuity.
3. ISO-27018
The protection of personally identifiable information (PII) in public clouds acting as
PII processors is covered by this standard. Although this standard is especially aimed
at public-cloud service providers such as AWS or Azure, it is also relevant to any
organization that entrusts PII to such providers.
Common Cloud Security Standards [Cont…]
4. CIS Controls
Organizations can secure their systems with the help of the Center for Internet Security
(CIS) Controls, which are open-source, consensus-based policies. To easily access a list
of evaluations for cloud security, consult the CIS Benchmarks customized for particular
cloud service providers.
5. FISMA
In accordance with the Federal Information Security Management Act (FISMA), all
federal agencies and their contractors are required to safeguard information systems
and assets. Under FISMA, NIST was given authority to define the framework's security
standards, which it does through NIST SP 800-53.
6. Cloud Architecture Framework
These frameworks, which frequently cover operational effectiveness, security, and cost-
value factors, can be viewed as best-practice standards for cloud architects. One such
framework, the Well-Architected Framework developed by Amazon Web Services, aids
architects in designing workloads and applications on the Amazon cloud.
7. General Data Protection Regulation (GDPR)
The GDPR is the European Union's law governing data protection and privacy. Even
though this law only applies to the European Union, you should keep it in mind if you
store or otherwise handle any personal information of EU residents.
Common Cloud Security Standards [Cont…]
8. SOC Reporting
A “System and Organization Controls 2” (SOC 2) report is a form of audit of the
operational processes used by IT businesses offering any service. SOC 2 reporting is a
worldwide standard for cybersecurity risk management systems. The SOC 2 audit report
shows that your company’s policies, practices, and controls are in place to meet the five
trust principles: security, availability, processing integrity, confidentiality, and privacy.
If you offer software as a service, potential clients might request proof that you adhere
to SOC 2 standards.
9. PCI DSS
The PCI DSS (Payment Card Industry Data Security Standard) provides a set of security
criteria for all merchants that accept credit or debit cards and for businesses that handle
cardholder data. The PCI DSS specifies fundamental technological and operational
criteria for safeguarding cardholder data and is intended to protect cardholders from
identity theft and credit card fraud.
Common Cloud Security Standards [Cont…]
10. HIPAA
The Health Insurance Portability and Accountability Act (HIPAA), passed by the US
Congress to safeguard individual health information, also has parts specifically dealing
with information security. Businesses that handle medical data must abide by HIPAA
law. In terms of information security, the most relevant part is the HIPAA Security Rule
(HSR), which specifies rules for protecting people’s electronic personal health
information that a covered entity generates, acquires, uses, or maintains.
11. CIS AWS Foundations v1.2
Any business that uses Amazon Web Services cloud resources can help safeguard
sensitive IT systems and data by adhering to the CIS AWS Foundations Benchmark.
12. ACSC Essential Eight
ACSC Essential Eight (which expands on the earlier ASD Top 4) is a list of eight
cybersecurity mitigation strategies for small and large firms. The Australian Signals
Directorate (ASD) and the Australian Cyber Security Centre (ACSC) developed the
Essential Eight in order to improve security controls, protect businesses’ computer
resources and systems, and protect data from cybersecurity attacks.
Open Cloud Consortium
The Open Cloud Consortium (OCC):
• supports the development of standards for cloud computing and frameworks for
interoperating between clouds;
• supports the development of benchmarks for cloud computing;
• supports open-source software for cloud computing;
• manages a testbed for cloud computing called the Open Cloud Testbed;
• sponsors workshops and other events related to cloud computing.
DMTF: Distributed Management Task Force
The DMTF is a non-profit standards development organization. It develops open
management standards that span a wide range of emerging and long-established IT
infrastructures and services, including virtualization, networking, cloud computing,
servers, and data storage, and it promotes interoperability in support of business
ventures and Internet environments. The head office of the DMTF is situated in
Portland, Oregon.
The corporations, whose representatives came together as a board of directors of DMTF,
comprise:
• Broadcom Inc.
• Intel Corporation
• Dell Technologies
• Hitachi, Ltd.
• NetApp
• HP Inc.
• Cisco
• Lenovo
• Hewlett Packard Enterprise
Evolution of DMTF
History
• In 1992, the DMTF was first time established as the "Desktop Management Task
Force".
• In 1999, the name of the organization was changed to "Distributed Management
Task Force" because the organization had progressed to dealing with distributed
management through further standards.
DMTF Standards
Some of the standards of DMTF comprise:
• CADF, an abbreviation of Cloud Auditing Data Federation.
• DASH, an abbreviation of Desktop and Mobile Architecture for System Hardware.
• MCTP, an abbreviation of Management Component Transport Protocol.
• NC-SI, an abbreviation of Network Controller Sideband Interface.
• OVF, an abbreviation of Open Virtualization Format.
• PLDM, an abbreviation of Platform Level Data Model.
• SMASH, an abbreviation of Systems Management Architecture for Server Hardware.
DMTF Advantages
• The members of the corporations get front-line access to the information concerning
DMTF standards and tools.
• The members of the corporations get a chance to take part in the design of these
DMTF standards.
• The members of the corporations get a chance to work together on tasks in the
company of implementers of these DMTF standards and tools.
Hadoop
Hadoop is an open-source framework from Apache used to store, process, and analyze
data that is very huge in volume. Hadoop is written in Java and is not OLAP (online
analytical processing); it is used for batch/offline processing. It is used by Facebook,
Yahoo, Google, Twitter, LinkedIn, and many more. Moreover, it can be scaled up just by
adding nodes to the cluster.
Modules of Hadoop
1. HDFS: Hadoop Distributed File System. Google published its GFS paper, and HDFS
was developed on the basis of it. It states that files will be broken into blocks and
stored on nodes over the distributed architecture.
2. YARN: Yet Another Resource Negotiator is used for job scheduling and managing the
cluster.
3. MapReduce: This is a framework which helps Java programs do parallel computation
on data using key-value pairs. The Map task takes input data and converts it into a
data set which can be computed as key-value pairs. The output of the Map task is
consumed by the Reduce task, and the output of the reducer gives the desired result
(a minimal code sketch follows this list).
4. Hadoop Common: These Java libraries are used to start Hadoop and are used by
other Hadoop modules.
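To make the key-value flow concrete, below is a minimal word-count sketch using the
standard Hadoop MapReduce Java API. The class names WordCountMapper and
WordCountReducer are illustrative, not part of Hadoop itself.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Map task: turn each input line into (word, 1) key-value pairs.
class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        for (String token : line.toString().split("\\s+")) {
            if (!token.isEmpty()) {
                word.set(token);
                context.write(word, ONE);   // emit (word, 1)
            }
        }
    }
}

// Reduce task: sum the counts emitted for each word.
class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable count : counts) {
            sum += count.get();
        }
        context.write(word, new IntWritable(sum)); // emit (word, total)
    }
}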
Hadoop Architecture
The Hadoop architecture is a package of the file system, MapReduce engine, and the
HDFS (Hadoop Distributed File System). The MapReduce engine can be MapReduce/MR1
or YARN/MR2.
A Hadoop cluster consists of a single master and multiple slave nodes. The master node
includes Job Tracker, Task Tracker, NameNode, and DataNode whereas the slave node
includes DataNode and TaskTracker.
Hadoop Distributed File System (HDFS)
The Hadoop Distributed File System (HDFS) is a distributed file system for Hadoop. It
contains a master/slave architecture. This architecture consists of a single NameNode
that performs the role of master, and multiple DataNodes performing the role of a slave.
Both NameNode and DataNode are capable enough to run on commodity machines. The
Java language is used to develop HDFS. So any machine that supports Java language can
easily run the NameNode and DataNode software.
NameNode
o It is a single master server that exists in the HDFS cluster.
o As it is a single node, it may become the reason for single-point failure.
o It manages the file system namespace by executing an operation like opening,
renaming, and closing the files.
o It simplifies the architecture of the system.
DataNode
o The HDFS cluster contains multiple data nodes.
o Each DataNode contains multiple data blocks.
o These data blocks are used to store data.
o It is the responsibility of the DataNode to serve read and write requests from the file
system's clients.
Hadoop Distributed File System (HDFS)
o It performs block creation, deletion, and replication upon instruction from the
NameNode.
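As a rough illustration of how a client interacts with HDFS through the NameNode and
DataNodes, the sketch below uses Hadoop's Java FileSystem API; the NameNode address
and file path are hypothetical.

import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsClientSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode address; in practice this comes from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode-host:9000");
        FileSystem fs = FileSystem.get(conf);

        Path file = new Path("/user/demo/hello.txt"); // hypothetical path

        // Write: the client asks the NameNode for block locations,
        // then streams the data to the DataNodes.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("Hello, HDFS!".getBytes(StandardCharsets.UTF_8));
        }

        // Read: blocks are fetched from the DataNodes holding replicas.
        try (FSDataInputStream in = fs.open(file)) {
            IOUtils.copyBytes(in, System.out, 4096, false);
        }

        fs.close();
    }
}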
Job Tracker
o The role of Job Tracker is to accept the MapReduce jobs from a client and process the
data by using NameNode.
o In response, NameNode provides metadata to Job Tracker.
Task Tracker
o It works as a slave node for Job Tracker.
o It receives tasks and code from Job Tracker and applies that code to the file. This
process can also be called a Mapper.
MapReduce Layer
MapReduce processing comes into existence when the client application submits a
MapReduce job to the Job Tracker. In response, the Job Tracker forwards the request to
the appropriate Task Trackers. Sometimes a TaskTracker fails or times out; in such a
case, that part of the job is rescheduled.
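For reference, a minimal job-submission driver for the word-count classes sketched
earlier might look as follows. Note that modern Hadoop submits the job to YARN rather
than a JobTracker, and the input and output paths here are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");

        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(WordCountMapper.class);     // map phase
        job.setReducerClass(WordCountReducer.class);   // reduce phase
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // Hypothetical HDFS input and output paths.
        FileInputFormat.addInputPath(job, new Path("/user/demo/input"));
        FileOutputFormat.setOutputPath(job, new Path("/user/demo/output"));

        // Submit the job and wait; failed tasks are rescheduled automatically.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}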
Advantages of Hadoop
o Fast: In HDFS the data is distributed over the cluster and mapped, which helps in
faster retrieval. Even the tools to process the data are often on the same servers,
thus reducing processing time. Hadoop is able to process terabytes of data in
minutes and petabytes in hours.
o Scalable: Hadoop cluster can be extended by just adding nodes in the cluster.
o Cost Effective: Hadoop is open source and uses commodity hardware to store data so
it is really cost-effective as compared to traditional relational database management
systems.
o Resilient to failure: HDFS can replicate data over the network, so if one node goes
down or some other network failure happens, Hadoop uses another copy of the data.
Normally, data is replicated three times, but the replication factor is configurable
(see the sketch below).
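As a small illustration of the configurable replication factor, the Java snippet below
reads and changes the replication of a hypothetical HDFS file using the same FileSystem
API as before; the cluster-wide default can also be set via the dfs.replication property in
hdfs-site.xml.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Cluster-wide default replication factor (normally set in hdfs-site.xml).
        conf.setInt("dfs.replication", 3);

        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/demo/hello.txt"); // hypothetical path

        // Inspect the current replication factor of an existing file.
        FileStatus status = fs.getFileStatus(file);
        System.out.println("Current replication: " + status.getReplication());

        // Raise the replication factor of this file to 5; the NameNode will
        // instruct DataNodes to create the extra block copies.
        fs.setReplication(file, (short) 5);

        fs.close();
    }
}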
History of Hadoop
Hadoop was started by Doug Cutting and Mike Cafarella in 2002. Its origin was the
Google File System (GFS) paper published by Google.
End User Access to Cloud Computing
Testing: End-user computing (EUC) is an optimum environment for rapid prototyping and
testing. Organizations can offer new tools to their workers to test how effective and
user-friendly they are on a large scale. EUC allows you to change design and structure to
rapidly carry out testing campaigns.
Collaboration: Teams using EUC services can securely collaborate on projects, with their
content stored in the cloud. You can also share project information with external users
for cross-organizational collaboration, with changes made in real time.