Cloud Computing Module I
What is Cloud?
The term Cloud refers to a network or the Internet. In other words, the Cloud is something that is present at a remote location. The Cloud can provide services over public and private networks, i.e., WAN, LAN, or VPN.
Basic Concepts
Deployment models define the type of access to the cloud, i.e., how the cloud is located.
Public clouds are the most common deployment model, in which the necessary IT infrastructure is established by a third-party service provider who makes it available to any consumer on a subscription basis. The public cloud allows systems and services to be easily accessible to the general public. The public cloud may be less secure because of its openness.
The private cloud (operated by a single organization) allows systems and services to be accessible within an organization. It is more secure because of its private nature. The use of cloud-based in-house solutions is also driven by the need to keep confidential information within the organization’s premises. Institutions such as governments and banks, with high security, privacy, and regulatory concerns, prefer to build and use their own private or enterprise cloud.
Cloud computing is based on service models. These are categorized into three basic service models:
● Infrastructure-as-a-Service (IaaS)
● Platform-as-a-Service (PaaS)
● Software-as-a-Service (SaaS)
Advantages
1) Back-up and restore data
Once the data is stored in the cloud, it is easier to back up and restore that data using the cloud.
2) Improved collaboration
3) Excellent accessibility
5) Mobility
Cloud computing allows us to easily access all our cloud data via mobile devices.
Cloud offers us a huge amount of storage capacity for storing important data such as documents, images, audio, and video in one place.
8) Data security
Data security is one of the biggest advantages of cloud computing. The cloud offers many advanced security features and ensures that data is stored and handled securely.
Challenges
● Technical challenges arise for cloud service providers in the management of large computing infrastructures.
● Security in terms of confidentiality, secrecy, and protection of data in a cloud environment.
● Legal issues may arise when cloud computing infrastructures span diverse geographical locations (different countries may potentially create disputes on what rights third parties have over your data).
HISTORICAL DEVELOPMENTS
3. Web 2.0
➔ The Web is the primary interface through which cloud computing delivers its services.
➔ A set of technologies and services that facilitate interactive
information sharing, collaboration, user-centered design, and
application composition.
➔ Web 2.0 brings interactivity and flexibility into Web pages, providing
enhanced user experience by gaining Web-based access to all the
functions that are normally found in desktop applications.
➔ These capabilities are obtained by integrating a collection of standards and technologies such as XML, Asynchronous JavaScript and XML (AJAX), Web Services, and others.
➔ Ex: Google Documents, Google Maps, Flickr, Facebook, Twitter, YouTube, Blogger, and Wikipedia.
In Windows Azure, each role hosts a portion of the application’s logic. Currently, there are three types of roles: Web role, worker role, and virtual machine role.
★ The Web role is designed to host a Web application,
★ the worker role is used to perform workload processing, and
★ the virtual machine role provides a virtual environment in which the computing stack can be fully customized, including the operating system.
4. Hadoop
★ Apache Hadoop is an open-source framework suited for processing large data sets on commodity hardware.
★ Hadoop is an implementation of the MapReduce programming model (a word-count sketch follows).
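To make the MapReduce idea concrete, the following is a minimal, in-memory Python sketch of the map and reduce phases; it is an illustration only, not the Hadoop API, and the sample documents and function names are made up for the example.

from collections import defaultdict

def map_phase(document):
    # Map: emit (word, 1) pairs for every word in one input split
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/reduce: group the intermediate pairs by key and sum the counts
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

if __name__ == "__main__":
    docs = ["the cloud delivers services", "the cloud scales on demand"]
    intermediate = [pair for doc in docs for pair in map_phase(doc)]
    print(reduce_phase(intermediate))

In Hadoop itself, the same map and reduce functions run as distributed tasks across the cluster rather than over in-memory lists, which is what makes the model suitable for large data sets on commodity hardware.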
Eras of Computing
● The two fundamental and dominant models of computing are sequential and parallel.
● The four key elements of computing developed during these eras were architectures, compilers, applications, and problem-solving environments.
LEVELS OF PARALLELISM
Levels of parallelism are decided based on the lumps of code (grain size) that can be potential candidates for parallelism.
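As a rough illustration of coarse-grained (task-level) parallelism, the Python sketch below splits independent chunks of work across processes; the workload and the choice of four chunks are assumptions made only for the example.

from multiprocessing import Pool

def process_chunk(chunk):
    # Coarse-grained task: each worker processes a whole chunk independently
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # four large-grained units of work
    with Pool(processes=4) as pool:
        partial_sums = pool.map(process_chunk, chunks)
    print(sum(partial_sums))

Finer-grained parallelism (at the level of individual loops or instructions) would divide the same work into much smaller units, with correspondingly higher coordination overhead.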
DEFINITIONS:
A distributed system is a collection of independent computers connected through a network, communicating and coordinating their actions only by passing messages. It appears to its users as a single coherent system.
1. Client/server
In the thin-client model, the client handles only the presentation, while the server absorbs the other two tiers (application logic and data storage). In the fat-client model, the client encapsulates presentation and most of the application logic, and the server is principally responsible for data storage and maintenance.
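A minimal Python sketch of the client/server interaction described above, with the server holding the data and a thin client that only presents the reply; the address, port, and messages are illustrative.

import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5000   # illustrative address and port

def server():
    # Server: holds the data and answers each client request
    data = {"balance": "100"}
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            key = conn.recv(1024).decode()
            conn.sendall(data.get(key, "unknown").encode())

def thin_client():
    # Thin client: sends a request and only presents the reply
    with socket.socket() as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"balance")
        print("server replied:", cli.recv(1024).decode())

if __name__ == "__main__":
    t = threading.Thread(target=server)
    t.start()
    time.sleep(0.2)   # give the server time to start listening
    thin_client()
    t.join()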
2. Peer-to-peer
● The peer-to-peer model introduces a symmetric architecture in which all the components, called peers, play the same role and incorporate both client and server capabilities of the client/server model.
● Each peer acts as a server when it processes requests from other
peers and as a client when it issues requests to other peers.
● Whereas the client/server model partitions the responsibilities of the IPC between server and clients, the peer-to-peer model attributes the same responsibilities to each component (see the sketch below).
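The symmetry of the peer-to-peer model can be sketched in Python as follows: every peer runs the same code, acting as a server when answering a request and as a client when issuing one; the ports and messages are illustrative.

import socket
import threading
import time

def serve(port):
    # Server side of a peer: answer whichever peer connects
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.sendall(f"hello from peer on port {port}".encode())

def request(port):
    # Client side of a peer: issue a request to another peer
    with socket.socket() as cli:
        cli.connect(("127.0.0.1", port))
        print(cli.recv(1024).decode())

if __name__ == "__main__":
    # Two peers, each with both server and client capabilities
    threading.Thread(target=serve, args=(6001,)).start()
    threading.Thread(target=serve, args=(6002,)).start()
    time.sleep(0.2)        # let both peers start listening
    request(6002)          # the first peer acts as a client of the second
    request(6001)          # the second peer acts as a client of the first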