Cloud Computing
1. Resource Pooling
Resource pooling means that the cloud provider pools its computing resources to serve multiple customers through a multi-tenant model. Different physical and virtual resources are dynamically assigned and reassigned according to customer demand.
The customer generally has no control over, or knowledge of, the exact location of the provided resources, but may be able to specify location at a higher level of abstraction (e.g., country, region, or data center).
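The pooling-and-reassignment idea above can be sketched in a few lines. This is a hypothetical illustration, not a real cloud API; the class and method names are invented for the example.

```python
# Hypothetical sketch of a multi-tenant resource pool: physical capacity is
# pooled, then assigned and reassigned per tenant as demand changes.
class ResourcePool:
    def __init__(self, total_units):
        self.total_units = total_units   # pooled physical capacity
        self.allocations = {}            # tenant -> units currently held

    def available(self):
        return self.total_units - sum(self.allocations.values())

    def assign(self, tenant, units):
        """Assign pooled capacity to a tenant if the pool can cover it."""
        if units > self.available():
            return False
        self.allocations[tenant] = self.allocations.get(tenant, 0) + units
        return True

    def release(self, tenant, units):
        """Reassignment starts by returning capacity to the shared pool."""
        held = self.allocations.get(tenant, 0)
        self.allocations[tenant] = max(0, held - units)

pool = ResourcePool(total_units=100)
pool.assign("tenant-a", 60)
pool.assign("tenant-b", 30)
print(pool.available())        # 10 units left in the shared pool
pool.release("tenant-a", 40)   # demand dropped; capacity returns to the pool
print(pool.available())        # 50
```

The tenants never see which physical host backs their allocation, which mirrors the "no control over location" property described above.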
2. On-Demand Self-Service
This is one of the most valuable features of cloud computing: users can provision computing capabilities, such as server time and network storage, as needed, without requiring human interaction with the provider. Users can also continuously monitor server uptime, computing capabilities, and allotted network storage.
3. Easy Maintenance
Servers are easy to maintain and downtime is very low; in some cases there is no downtime at all. Cloud computing improves gradually through frequent updates. Updates are more compatible with devices and perform faster than older versions, and they fix bugs along the way.
4. Broad Network Access
The user can access data in the cloud, or upload data to the cloud, from anywhere using only a device and an internet connection. These capabilities are available over the network and are accessed through the internet.
5. Availability
The capabilities of the cloud can be modified to match usage and extended considerably. The service analyzes storage usage and lets the user buy extra cloud storage, if needed, for a very small fee.
6. Automatic System
Cloud computing automatically analyzes the resources needed and supports a metering capability at some level of service. Usage can be monitored, controlled, and reported, providing transparency for both the host and the customer.
7. Economical
It is the one-time investment as the company (host) has to buy the storage and a small part of it can be provided
to the many companies which save the host from monthly or yearly costs. Only the amount which is spent is on
the basic maintenance and a few more expenses which are very less.
8. Security
Cloud security is one of the best features of cloud computing. Providers create snapshots of the stored data so that data is not lost even if one of the servers is damaged.
The data is stored within storage devices that are difficult for unauthorized parties to access or misuse. The storage service is quick and reliable.
9. Pay as You Go
In cloud computing, the user pays only for the service or the space actually used. There are no hidden or extra charges, the service is economical, and some space is often allotted for free.
Cloud resource usage is monitored, measured, and reported by the service provider, which supports charge-per-use billing. This means that resource usage, such as the virtual server instances running in the cloud, is metered by the provider, and the pay-as-you-go bill varies with the consuming organization's actual consumption.
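The charge-per-use model described above can be sketched as a tiny metering/billing function. The rates and meter names below are made-up illustrations, not any provider's real pricing.

```python
# Illustrative pay-as-you-go billing: usage is metered per resource and the
# bill reflects only actual consumption. Rates are invented numbers ($/unit).
RATES = {"vm_hours": 0.05, "storage_gb": 0.02, "egress_gb": 0.09}

def bill(usage):
    """Charge-per-use: multiply each metered quantity by its unit rate."""
    return round(sum(RATES[k] * v for k, v in usage.items()), 2)

monthly_usage = {"vm_hours": 300, "storage_gb": 50, "egress_gb": 10}
print(bill(monthly_usage))  # 15.00 + 1.00 + 0.90 = 16.9
```

If a meter reads zero, the customer pays nothing for that resource, which is the essential difference from flat-rate hosting.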
Types of Cloud
Cloud computing is Internet-based computing in which a shared pool of
resources is available over broad network access, these resources can be
provisioned or released with minimum management efforts and service-provider
interaction.
1. Public cloud
2. Private cloud
3. Hybrid cloud
4. Community cloud
5. Multicloud
Public Cloud
Public clouds are managed by third parties that provide cloud services over the internet to the public; these services are available on pay-as-you-go billing models.
They offer solutions for minimizing IT infrastructure costs and are a good option for handling peak loads on the local infrastructure. Public clouds are the go-to option for small enterprises, which can start their businesses without large upfront investments by relying completely on public infrastructure for their IT needs.
The fundamental characteristic of public clouds is multitenancy. A public cloud is meant to serve multiple users, not a single customer, so each user requires a virtual computing environment that is separated, and most likely isolated, from other users.
Advantages of using a Public cloud are:
1. High Scalability
2. Cost Reduction
3. Reliability and flexibility
4. Disaster Recovery
Disadvantages of using a Public cloud are:
1. Loss of control over data
2. Data security and privacy
3. Limited Visibility
4. Unpredictable cost
Private cloud
Private clouds are distributed systems that work on private infrastructure and
provide the users with dynamic provisioning of computing resources. Instead of
a pay-as-you-go model in private clouds, there could be other schemes that
manage the usage of the cloud and proportionally billing of the different
departments or sections of an enterprise. Private cloud providers are HP Data
Centers, Ubuntu, Elastic-Private cloud, Microsoft, etc.
Hybrid cloud:
A hybrid cloud is a heterogeneous distributed system formed by combining
facilities of the public cloud and private cloud. For this reason, they are also
called heterogeneous clouds.
A major drawback of private deployments is the inability to scale on demand and to efficiently address peak loads; this is where public clouds are needed. A hybrid cloud therefore takes advantage of both public and private clouds.
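The "burst to public when private capacity runs out" behavior described above can be sketched as a simple placement policy. The capacities and workload names below are invented for illustration.

```python
# Minimal "cloud bursting" sketch for a hybrid cloud: run workloads on the
# private cloud while it has capacity, and burst overflow to the public cloud.
PRIVATE_CAPACITY = 80  # units the private infrastructure can serve (assumed)

def place_workloads(workloads):
    """Return (private, public) placement lists for (name, demand) pairs."""
    used, private, public = 0, [], []
    for name, demand in workloads:
        if used + demand <= PRIVATE_CAPACITY:
            used += demand
            private.append(name)       # fits on private infrastructure
        else:
            public.append(name)        # peak load bursts to the public cloud
    return private, public

jobs = [("web", 50), ("batch", 40), ("analytics", 20)]
print(place_workloads(jobs))  # (['web', 'analytics'], ['batch'])
```

Real hybrid deployments add data-locality and compliance constraints to this decision, but the capacity check is the core of it.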
Community cloud:
Community clouds are distributed systems created by integrating the services of different clouds to address the specific needs of an industry, a community, or a business sector. However, sharing responsibilities among the participating organizations can be difficult.
Multicloud
Multicloud is the use of cloud services from two or more providers at the same time, for example to avoid vendor lock-in or to pick the best-suited service for each workload.
2. Storage Virtualization
Storage virtualization works by gathering and merging multiple physical storage arrays and presenting them as a single storage location to the user over a network. It is typically employed by organizations and individuals looking to scale and maintain their systems' storage without investing in additional physical storage devices.
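The "many arrays, one logical store" idea can be sketched with plain dicts standing in for physical arrays. This is a toy model, not a real storage controller; the class name and placement policy are assumptions.

```python
# Hypothetical sketch of storage virtualization: several physical "arrays"
# (plain dicts here) are presented to the user as one logical store.
class VirtualStore:
    def __init__(self, arrays):
        self.arrays = arrays  # list of backing physical stores

    def write(self, key, blob):
        # Simple placement policy: hash the key to pick a backing array.
        self.arrays[hash(key) % len(self.arrays)][key] = blob

    def read(self, key):
        # The caller sees one namespace regardless of physical location.
        for array in self.arrays:
            if key in array:
                return array[key]
        raise KeyError(key)

store = VirtualStore([{}, {}, {}])
store.write("report.pdf", b"...")
print(store.read("report.pdf"))  # the caller never learns which array held it
```

Because reads search the whole pool, the user can keep writing as arrays are added, which is how virtualization lets storage scale without the user managing devices.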
Application Virtualization
Application virtualization refers to the process of deploying a computer application over a network (the cloud).
The deployed application is installed on a server, and when a user requests it, an instance of the application is displayed to them. The user can then engage with that application as if it were installed on their own system.
Application virtualization is a powerful concept that takes away most of the drawbacks of installing
applications locally.
Using this, users can access a plethora of applications in real-time without having to allocate too much storage
to all of them.
Users can also run applications not supported by their devices’ operating systems.
It also eliminates the need for IT teams to manage and update multiple copies of applications across different operating systems.
5. Desktop Virtualization
Desktop virtualization is similar to application virtualization, but the apps are now replaced with whole desktop
environments.
The desktop environments, also called virtual machines (VMs), are housed on powerful servers that can host
several desktop sessions concurrently. Users can access these VMs on their devices as and when required,
regardless of the specifications of their devices.
Desktop virtualization is especially useful for enterprises as it offers a consistent desktop experience to all
employees.
IT teams responsible for managing a company’s devices can now manage and issue updates centrally.
Virtual desktops also minimize the security risks associated with employees storing the company data locally.
And, since most of the data is stored on servers, device failure will not result in any major loss.
6. Data Virtualization
Data virtualization is a solution to the data management problem of analyzing data from different sources
collectively and at a much faster pace. It enables organizations to centrally manage and alter data from several
sources, such as excel files, google analytics reports, HubSpot reports, etc., while offering a holistic view
(single view) of the data.
Data virtualization works by separating the collected data from its underlying data logic. A virtualization layer,
called a data virtualization tool, acts as a mediator between the source and the front-end usage of the data.
Virtualizing data enables users to collectively view heterogeneous data sets via a single interface as well as
access the source of the collected data in real-time.
Data virtualization is primarily used as a part of data integration in areas such as BI (business intelligence),
Cloud computing and of course, data management.
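The mediating-layer idea above can be sketched with two toy sources standing in for a spreadsheet export and an API feed. The source contents and field names are invented for illustration.

```python
# Sketch of a data virtualization layer: heterogeneous sources (a CSV-like
# list of rows and a JSON-like dict here) are queried through one interface
# without first copying everything into a warehouse.
csv_rows = [{"customer": "Acme", "revenue": 1200},
            {"customer": "Globex", "revenue": 800}]
api_data = {"Acme": {"tickets": 3}, "Globex": {"tickets": 7}}

def unified_view():
    """Join the two sources on the fly into a single holistic view."""
    for row in csv_rows:
        name = row["customer"]
        # Merge fields from both sources; the caller sees one record shape.
        yield {**row, **api_data.get(name, {})}

for record in unified_view():
    print(record)
# {'customer': 'Acme', 'revenue': 1200, 'tickets': 3}
# {'customer': 'Globex', 'revenue': 800, 'tickets': 7}
```

Because the join happens at query time, a change in a source is visible on the next call, which is the "access the source in real time" property described above.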
What is mobile cloud computing? Write applications of Mobile cloud computing.
MCC stands for Mobile Cloud Computing, defined as a combination of mobile computing, cloud computing, and wireless networks that come together to provide rich computational resources to mobile users, network operators, and cloud computing providers. Mobile cloud computing is meant to make it possible for rich mobile applications to be executed on a wide range of mobile devices. In this technology, data processing and data storage happen outside the mobile device. Mobile cloud computing applications leverage this IT architecture to generate the following advantages:
1. Extended battery life.
2. Improvement in data storage capacity and processing power.
3. Improved synchronization of data due to “store in one place, accessible from anywhere ”
platform theme.
4. Improved reliability and scalability.
5. Ease of integration.
There are two types of applications of mobile cloud computing (MCC) that are almost
similar. These are as follows:
What is the difference between Mobile Cloud Computing and Cloud Computing?
Cloud Service Models
There are the following three types of cloud service models -
Characteristics of IaaS
There are the following characteristics of IaaS -
Characteristics of PaaS
There are the following characteristics of PaaS -
Example: AWS Elastic Beanstalk, Windows Azure, Heroku, Force.com, Google App
Engine, Apache Stratos, Magento Commerce Cloud, and OpenShift.
Characteristics of SaaS
There are the following characteristics of SaaS -
Mobile Computing
Mobility has become a very popular word and a rapidly growing part of today's computing area. Incredible growth has appeared in the development of mobile devices such as smartphones, PDAs, GPS navigation units, and laptops, with a variety of mobile computing, networking, and security technologies. In addition, with the development of wireless technologies like WiMAX, ad hoc networks, and Wi-Fi, users can surf the Internet much more easily, no longer limited by cables as before. Thus, mobile devices have been accepted by more and more people as their first choice for work and entertainment in their daily lives. So, what is mobile computing exactly? Wikipedia describes it as a form of human-computer interaction in which a computer is expected to be transported during normal usage.
Mobile computing is based on three major concepts: hardware, software, and communication. Hardware covers mobile devices, such as smartphones and laptops, and their mobile components. Software covers the numerous mobile applications on the devices, such as mobile browsers, anti-virus software, and games. Communication covers the infrastructure of mobile networks, protocols, and data delivery in their use; these must be transparent to end users. The framework of mobile computing is shown in Fig. 1 below.
Cloud Computing
Cloud Computing has become a popular phrase since 2007. However, there is no consensus on the definition of cloud computing or a cloud computing system, because dozens of developers and organizations have described it from different perspectives. C. Hewitt [4] states that the major function of a cloud computing system is storing data on cloud servers, with clients using cache memory technology to fetch the data; those clients can be PCs, laptops, smartphones, and so on. R. Buyya [5] gives a definition from a marketing perspective: cloud computing is a parallel and distributed computing system composed of a group of internally linked virtual machines, which dynamically offers computing resources from service providers to customers according to their Service Level Agreement (SLA). However, some authors note that cloud computing is not a completely new concept. L. Youseff [6] from UCSB argues that cloud computing merely combines many existing and a few new concepts from many research fields, such as distributed and grid computing, Service-Oriented Architectures (SOA), and virtualization.
Data Center Layer
This layer provides the hardware facility and infrastructure for clouds. In the data center layer, a number of servers are linked with high-speed networks to provide services for customers. Typically, data centers are built in less populated places with a stable power supply and a low risk of disaster.
Infrastructure Layer
This layer includes computing and storage resources. Here, physical devices and hardware, such as servers and storage, are virtualized as a resource pool that provides computing, storage, and network services to users, so that they can install operating systems (OS) and run software applications. It is therefore denoted Infrastructure as a Service (IaaS). IaaS enables the provisioning of storage, hardware, servers, and networking components. The client typically pays on a per-use basis and thus saves cost, because payment is based only on how much resource is really used. Infrastructure can be expanded or shrunk dynamically as needed. Services in this layer include Amazon's Elastic Compute Cloud (EC2) and Simple Storage Service (S3).
Platform Layer
This layer is considered the core layer of the cloud computing system. It includes the environment for parallel programming, distributed storage and management systems for structured mass data, distributed file systems for mass data, and other system management tools for cloud computing. Program developers are the major clients of the platform layer. Platform resources such as program testing, running, and maintenance are provided directly by the platform rather than to end users. This type of service in the platform layer is therefore called Platform as a Service (PaaS). PaaS offers an advanced integrated environment for building, testing, and deploying custom applications. Typical services are Google App Engine, Microsoft Azure, and Amazon MapReduce/Simple Storage Service.
Application Layer
This layer provides software and applications, as well as customer interfaces, to end users; this type of service in the application layer is therefore called Software as a Service (SaaS). SaaS supports software distribution with specific requirements. In this layer, users can access an application and information remotely via the Internet and pay only for what they use. Users call services from providers through client software or a browser over the Internet, and pay costs according to the utility business model (like water or electricity). The earliest SaaS is Customer Relationship Management (CRM) from Salesforce, which was developed on Force.com (a PaaS within Salesforce). Other services, such as Google's online office suite (documents, spreadsheets, presentations), are also SaaS. Although the cloud computing architecture can be divided into four layers, as shown in Fig. 2, this does not mean that the top layer must be built on the layer directly below it. For example, a SaaS application can be deployed directly on IaaS instead of PaaS, and some services can be considered part of more than one layer: data storage, for instance, can be viewed as belonging to either IaaS or PaaS. Given this architectural model, users can use the services flexibly and efficiently.
Nowadays, both the hardware and software of mobile devices have improved greatly; smartphones such as the iPhone 4S and the Android, Windows Mobile, and BlackBerry series are no longer just traditional mobile phones for conversation, SMS, email, and web browsing, but daily necessities for users. Meanwhile, these smartphones include various sensing modules, such as navigation, optics, gravity, and orientation sensors, bringing a convenient and intelligent mobile experience to users.
3-Tier Architecture of Mobile computing
A 3-tier architecture is an application program that is organized into three major parts, comprising:
● The decision to offload computation is made dynamically based on factors such as network conditions, device resources, and application requirements.
● The system continuously monitors parameters like network latency, bandwidth, device battery level, and available computational resources.
Resource-aware Offloading:
● Depending on the current state of the mobile device and the network, tasks can be offloaded to remote servers or executed locally on the device.
● Offloading decisions may consider factors like the complexity of the computation, data transfer costs, and the availability of suitable remote resources.
Context Awareness:
● Adaptive offloading systems often take into account the context of the user and the application.
● Contextual information may include the user's location, preferences, and the nature of the application (e.g., real-time tasks, background tasks).
Energy Efficiency:
● Offloading computationally intensive tasks to more powerful remote servers can potentially save energy on the mobile device.
● However, the energy consumed by data transfer and communication must also be considered, as excessive offloading can increase network usage and energy consumption.
Quality of Service (QoS):
● Maintaining a satisfactory level of service is crucial. Adaptive offloading mechanisms need to balance the trade-off between offloading to improve performance and avoiding situations where network delays compromise the user experience.
Machine Learning and Predictive Analytics:
● Some adaptive offloading systems use machine learning algorithms and predictive analytics to anticipate future conditions and make more informed decisions.
● By learning from historical data and adapting to changing patterns, the system can improve its decision-making capabilities over time.
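A minimal sketch of such an adaptive offloading decision is shown below. The cost model, device speeds, and battery threshold are all illustrative assumptions, not values from any real system.

```python
# Hedged sketch of an adaptive offloading decision: compare the estimated cost
# of local execution with remote execution plus data transfer, using the
# monitored parameters described above. All constants are invented.
def should_offload(cycles, data_mb, bandwidth_mbps, latency_ms,
                   local_mips=1_000, remote_mips=10_000, battery_pct=100):
    """Offload when remote time (compute + transfer + latency) beats local."""
    local_time = cycles / local_mips
    transfer_time = (data_mb * 8) / bandwidth_mbps + latency_ms / 1000
    remote_time = cycles / remote_mips + transfer_time
    if battery_pct < 20:                 # low battery biases toward offloading
        return remote_time < local_time * 1.5
    return remote_time < local_time

# Heavy computation, small payload, decent network: offloading wins.
print(should_offload(cycles=50_000, data_mb=2, bandwidth_mbps=20, latency_ms=50))
```

Note how the same task flips to local execution when bandwidth collapses, which is exactly the trade-off between computation complexity and data transfer cost described in the bullets above.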
Explain the concept of mobile data offloading using opportunistic communication
Mobile data offloading using opportunistic communication is a strategy employed in mobile networking to alleviate network congestion and improve data transfer efficiency. It involves redirecting data traffic away from traditional cellular networks to alternative communication channels or nodes when opportunities arise. This concept exploits transient network connections, intermittent connectivity, and proximity-based communication to offload data in a decentralized manner. Here is a breakdown of the key components and concepts involved:
Opportunistic Communication:
Opportunistic communication refers to the ability of devices to establish ad-hoc connections opportunistically, leveraging short-range wireless technologies like Bluetooth, Wi-Fi Direct, or even device-to-device communication protocols. Devices communicate opportunistically when they come into proximity with each other, forming temporary networks or connections to exchange data.
Mobile Data Offloading:
Mobile data offloading involves diverting data traffic away from traditional cellular networks (like 3G, 4G, or 5G) to alternative networks or communication channels to reduce congestion and improve performance. Offloading can occur to Wi-Fi networks, small cells, or even directly to other mobile devices.
Advantages of Opportunistic Communication for Offloading:
Reduction in Cellular Network Load: By offloading data onto alternative channels, cellular network congestion can be reduced, leading to better overall network performance.
Improved Data Transfer Efficiency: Opportunistic communication can utilize available resources more efficiently by exploiting direct device-to-device communication or nearby Wi-Fi networks.
Enhanced Coverage and Connectivity: In scenarios where traditional network coverage is limited, opportunistic communication can extend connectivity by leveraging nearby devices as relays or access points.
Challenges and Considerations:
Intermittent Connectivity: Opportunistic communication relies on intermittent connections, which may result in delays or disruptions in data transfer.
Security and Privacy: Since data is often relayed through multiple devices or networks, ensuring the security and privacy of transmitted data becomes crucial.
Resource Management: Efficiently managing the resources of participating devices (e.g., battery power, bandwidth) is essential to avoid excessive resource consumption or depletion.
Applications and Use Cases:
Disaster Recovery: In disaster scenarios where traditional communication infrastructure is damaged, opportunistic communication can enable data exchange between nearby devices to coordinate relief efforts.
Crowdsourced Networking: Opportunistic communication can be used to create crowdsourced networks, where users share resources and connectivity to extend network coverage in areas with poor infrastructure.
Public Transportation: In crowded environments such as buses or trains, opportunistic communication can facilitate local data sharing among passengers without relying solely on cellular networks.
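The delay-tolerant core of opportunistic offloading can be sketched as a toy simulation: data waits in a queue until a nearby peer appears, and falls back to cellular only after a deadline. The proximity trace and deadline below are invented for illustration.

```python
# Toy simulation of opportunistic offloading: a device queues outgoing data
# and sends it via short-range contacts (e.g., Wi-Fi Direct peers) only when
# one is in range, falling back to cellular after a waiting deadline.
def offload(trace, deadline=3):
    """trace: per-timestep booleans (is a peer in proximity?)."""
    sent_via = []
    waiting = 0
    for peer_nearby in trace:
        if peer_nearby:
            sent_via.append("opportunistic")   # exploit the transient contact
            waiting = 0
        elif waiting >= deadline:
            sent_via.append("cellular")        # deadline forces a fallback
            waiting = 0
        else:
            sent_via.append("queued")          # tolerate delay, keep waiting
            waiting += 1
    return sent_via

print(offload([False, True, False, False, False, False, True]))
```

The deadline parameter captures the intermittent-connectivity trade-off noted above: a longer deadline offloads more traffic opportunistically but increases delivery delay.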