Cloud Computing
Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over a network (typically the Internet). The name comes from the use of a cloud-shaped symbol as an abstraction for the complex infrastructure it contains in system diagrams. Cloud computing entrusts remote services with a user's data, software and computation. There are many types of public cloud computing:[1]
Infrastructure as a service (IaaS)
Platform as a service (PaaS)
Software as a service (SaaS)
Storage as a service (STaaS)
Security as a service (SECaaS)
Data as a service (DaaS)
Test environment as a service (TEaaS)
Desktop as a service (DaaS)
API as a service (APIaaS)
In the software-as-a-service business model, users rent application software and databases, while the cloud providers manage the infrastructure and platforms on which the applications run.
End users access cloud-based applications through a web browser or a light-weight desktop or mobile app while the business software and user's data are stored on servers at a remote location. Proponents claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to more rapidly adjust resources to meet fluctuating and unpredictable business demand.[2][3] Cloud computing relies on sharing of resources to achieve coherence and economies of scale similar to a utility (like the electricity grid) over a network.[4] At the foundation of cloud computing is the broader concept of converged infrastructure and shared services.
History
The origin of the term cloud computing is obscure, but it appears to derive from the practice of using drawings of stylized clouds to denote networks in diagrams of computing and communications systems. The word cloud is used as a metaphor for the Internet, based on the standardized use of a cloud-like shape to denote a network on telephony schematics, and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents. The cloud symbol was used to represent the Internet as early as 1994.[5][6] In the 1990s, telecommunications companies, which previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to use their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider and that which was the responsibility of the users. Cloud computing extends this boundary to cover servers as well as the network infrastructure.[7] The underlying concept of cloud computing dates back to the 1950s, when large-scale mainframes became available in academia and corporations, accessible via thin clients / terminal computers.
Because it was costly to buy a mainframe, it became important to find ways to get the greatest return on the investment in them, allowing multiple users to share both physical access to the computer from multiple terminals and the CPU time, eliminating periods of inactivity, a practice that became known in the industry as time-sharing.[8] As computers became more prevalent, scientists and technologists explored ways to make large-scale computing power available to more users through time sharing, experimenting with algorithms to provide the optimal use of the infrastructure, platform and applications with prioritized access to the CPU and efficiency for the end users.[9] John McCarthy opined in the 1960s that "computation may someday be organized as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry, and the use of public, private, government, and community forms were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility. Other scholars have shown that cloud computing's roots go all the way back to the 1950s, when scientist Herb Grosch (the author of Grosch's law) postulated that the entire world would operate on dumb terminals powered by about 15 large data centers.[10] Because of the expense of these powerful computers, many corporations and other entities could avail themselves of computing capability through time sharing, and several organizations, such as GE's GEISCO, IBM subsidiary The Service Bureau Corporation (SBC, founded in 1957), Tymshare (founded in 1966), National CSS (founded in 1967 and bought by Dun & Bradstreet in 1979), Dial Data (bought by Tymshare in 1968), and Bolt, Beranek and Newman (BBN), marketed time sharing as a commercial venture. The development of the Internet from being document-centric via semantic data towards more and more services was described as the "Dynamic Web".[11] This contribution focused in particular on the need for better metadata able to describe not only implementation details but also conceptual details of model-based applications. The ubiquitous availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, autonomic, and utility computing, have led to tremendous growth in cloud computing.[12][13][14] After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernizing their data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes.
Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" (teams small enough to be fed with two pizzas) could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.[15][16] In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds.[17] In the same year, efforts were focused on providing quality of service guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment.[18] By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them"[19] and observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to computing... will result in dramatic growth in IT products in some areas and significant reductions in other areas."[20] On March 1, 2011, IBM announced the Smarter Computing framework to support Smarter Planet.[21] Among the various components of the Smarter Computing foundation, cloud computing is a critical piece.
Similar
Autonomic computing - Computer systems capable of self-management.[22]
Client-server model - Client-server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).[23]
Grid computing - "A form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."
Mainframe computer - Powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, police and secret intelligence services, enterprise resource planning, and financial transaction processing.[24]
Utility computing - The "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."[25][26]
Peer-to-peer - Distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client-server model).
Cloud gaming - Also known as on-demand gaming, this is a way of delivering games to computers. The gaming data will be stored in the provider's server, so that gaming will be independent of client computers used to play the game.
Characteristics
Cloud computing exhibits the following key characteristics:
Agility improves with users' ability to re-provision technological infrastructure resources.
Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs.
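A REST-based cloud API of the kind mentioned above typically maps operations on resources (virtual machines, storage volumes) onto HTTP methods and URLs. The following sketch illustrates the idea only: the endpoint `cloud.example.com`, the `instances` resource, and the field names are invented for illustration and do not belong to any real provider's API. The client merely composes the requests; actually sending them over the network is omitted so the sketch stays self-contained.

```python
import json
from urllib.parse import urljoin, urlencode

class CloudAPIClient:
    """Minimal sketch of a REST-style cloud API client (hypothetical API)."""

    def __init__(self, base_url, api_token):
        self.base_url = base_url.rstrip("/") + "/"
        # A bearer token is one common way cloud APIs authenticate callers.
        self.headers = {
            "Authorization": "Bearer " + api_token,
            "Content-Type": "application/json",
        }

    def create_instance(self, image, size):
        # POST to the collection URL creates a new resource (a VM here).
        url = urljoin(self.base_url, "instances")
        body = json.dumps({"image": image, "size": size})
        return ("POST", url, body)

    def list_instances(self, **filters):
        # GET with a query string lists resources, optionally filtered.
        url = urljoin(self.base_url, "instances")
        if filters:
            url += "?" + urlencode(filters)
        return ("GET", url, None)

client = CloudAPIClient("https://cloud.example.com/v1", "secret-token")
method, url, body = client.create_instance(image="ubuntu-22.04", size="small")
```

Because the resources are plain URLs and the payloads plain JSON, any HTTP-capable machine can drive the cloud this way, which is what makes such APIs suitable for automation.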
Cost is claimed to be reduced, and in a public cloud delivery model capital expenditure is converted to operational expenditure.[27] This is purported to lower barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for in-house implementation.[28] The e-FISCAL project's state-of-the-art repository[29] contains several articles looking into cost aspects in more detail, most of them concluding that cost savings depend on the type of activities supported and the type of infrastructure available in-house.
Device and location independence[30] enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third party) and accessed via the Internet, users can connect from anywhere.[28]
Virtualization technology allows servers and storage devices to be shared and utilization to be increased. Applications can be easily migrated from one physical server to another.
Multitenancy enables sharing of resources and costs across a large pool of users thus allowing for:
Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for highest possible load levels)
Utilisation and efficiency improvements for systems that are often only 10-20% utilised.[15]
Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.[31]
Scalability and elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time,[32] without users having to engineer for peak loads.[33][34]
Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.[28]
Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels.[35] Security is often as good as or better than other traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford.[36] However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer and can be accessed from different places.
On-demand self-service
See also: Self-service provisioning for cloud computing services and Service catalogs for cloud computing services
On-demand self-service allows users to obtain, configure and deploy cloud services themselves using cloud service catalogues, without requiring the assistance of IT.[37][38] This feature is listed by the National Institute of Standards and Technology (NIST) as a characteristic of cloud computing.[39] The self-service requirement of cloud computing prompts infrastructure vendors to create cloud computing templates, which are obtained from cloud service catalogues. Manufacturers of such templates or blueprints include Hewlett-Packard (HP), which names its templates HP Cloud Maps,[40] RightScale,[41] and Red Hat, which names its templates CloudForms.[42]
The templates contain predefined configurations used by consumers to set up cloud services. The templates or blueprints provide the technical information necessary to build ready-to-use clouds.[41] Each template includes specific configuration details for different cloud infrastructures, with information about servers for specific tasks such as hosting applications, databases, websites and so on.[41] The templates also include predefined web service, operating system, database, security and load balancing configurations.[42] Cloud consumers use cloud templates to move applications between clouds through a self-service portal. The predefined blueprints define all that an application requires to run in different environments. For example, a template could define how the same application could be deployed in cloud platforms based on Amazon Web Services, VMware or Red Hat.[43] The user organization benefits from cloud templates because the technical aspects of cloud configurations reside in the templates, letting users deploy cloud services with a push of a button.[44][45] Cloud templates can also be used by developers to create a catalog of cloud services.[46]
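As a rough illustration of what such a template carries, the sketch below models a blueprint as a plain data structure. Every field name here is invented for the example; no vendor's actual template format is implied.

```python
# A hypothetical cloud template ("blueprint"): field names are invented
# for illustration, not taken from any vendor's product.
web_app_template = {
    "name": "three-tier-web-app",
    "servers": [
        {"role": "load-balancer", "image": "haproxy",      "count": 1},
        {"role": "app",           "image": "app-runtime",  "count": 2},
        {"role": "database",      "image": "postgres",     "count": 1},
    ],
    "security": {"open_ports": [80, 443], "encrypt_disks": True},
    # The clouds this one blueprint could be deployed to:
    "targets": ["aws", "vmware", "openstack"],
}

def instances_required(template):
    """Total number of servers the template would provision."""
    return sum(server["count"] for server in template["servers"])
```

Because the servers, security settings and target clouds all live in the template, a self-service portal only needs to hand the structure to a deployment engine, which is what makes "push of a button" deployment plausible.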
Service models
Cloud computing providers offer their services according to three fundamental models:[4][47] Infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS) where IaaS is the most basic and each higher model abstracts from the details of the lower models.
Infrastructure as a service (IaaS)
See also: Category:Cloud infrastructure
In this most basic cloud service model, cloud providers offer computers, as physical or more often as virtual machines, and other resources. The virtual machines are run as guests by a hypervisor, such as Xen or KVM. Management of pools of hypervisors by the cloud operational support system gives the ability to scale to support a large number of virtual machines. Other resources in IaaS clouds include images in a virtual machine image library, raw (block) and file-based storage, firewalls, load balancers, IP addresses, virtual local area networks (VLANs), and software bundles.[48] (Amies, Alex; Sluiman, Harm; Tong, Qiang Guo (July 2012). "Infrastructure as a Service Cloud Concepts". Developing and Hosting Applications on the Cloud. IBM Press. ISBN 978-0-13-306684-5.) IaaS cloud providers supply these resources on demand from their large pools installed in data centers. For wide-area connectivity, the Internet can be used or, in carrier clouds, dedicated virtual private networks can be configured. To deploy their applications, cloud users then install operating system images on the machines as well as their application software. In this model, it is the cloud user who is responsible for patching and maintaining the operating systems and application software. Cloud providers typically bill IaaS services on a utility computing basis; that is, cost reflects the amount of resources allocated and consumed. IaaS refers not to a machine that does all the work, but simply to a facility given to businesses that offers users the leverage of extra storage space in servers and data centers. Examples of IaaS include: Amazon CloudFormation (and underlying services such as Amazon EC2), Rackspace Cloud, Terremark, Windows Azure Virtual Machines, Google Compute Engine, and Joyent.
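The utility-computing billing just described can be sketched in a few lines: the provider meters how much of each resource a customer consumed and multiplies by a per-unit rate. The resource names and prices below are invented for illustration only.

```python
def iaas_bill(usage_hours, rates):
    """Utility-style IaaS billing sketch: cost reflects resources consumed.

    usage_hours maps a resource name to hours used; rates maps the same
    names to a (hypothetical) price per hour.
    """
    return sum(hours * rates[resource] for resource, hours in usage_hours.items())

# Hypothetical prices, for illustration only.
rates = {"vm_small": 0.05, "block_storage_gb_month": 0.10, "load_balancer": 0.02}

# One small VM and one load balancer, each running for a 720-hour month.
usage = {"vm_small": 720, "load_balancer": 720}
monthly_cost = iaas_bill(usage, rates)
```

The point of the sketch is the shape of the model, not the numbers: releasing a resource stops the meter, which is what distinguishes utility billing from buying the hardware outright.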
Platform as a service (PaaS)
Main article: Platform as a service
See also: Category:Cloud platforms
In the PaaS model, cloud providers deliver a computing platform, typically including an operating system, programming language execution environment, database, and web server. Application developers can develop and run their software solutions on a cloud platform without the cost and complexity of buying and managing the underlying hardware and software layers. With some PaaS offerings, the underlying computer and storage resources scale automatically to match application demand, so that the cloud user does not have to allocate resources manually. Examples of PaaS include: Amazon Elastic Beanstalk, Cloud Foundry, Heroku, Force.com, EngineYard, Mendix, Google App Engine, Windows Azure Compute and OrangeScape.
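The automatic scaling mentioned above boils down to a control decision a platform makes on the user's behalf: given the current load and the capacity of one instance, how many instances should run? The sketch below shows one simple form of that decision; the capacity figure and the bounds are illustrative assumptions, not any platform's documented policy.

```python
import math

def desired_instances(current_load, capacity_per_instance,
                      min_instances=1, max_instances=20):
    """Sketch of a PaaS autoscaling decision: run just enough instances
    to cover the current load, clamped to fixed lower and upper bounds."""
    needed = math.ceil(current_load / capacity_per_instance)
    return max(min_instances, min(max_instances, needed))

# e.g. 950 requests/s against instances that each handle 100 requests/s
count = desired_instances(950, capacity_per_instance=100)
```

Real platforms add smoothing (so the fleet does not thrash up and down on every load spike), but the clamp-to-bounds structure is the core of the technique.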
Software as a service (SaaS)
Main article: Software as a service
In this model, cloud providers install and operate application software in the cloud, and cloud users access the software from cloud clients. The cloud users do not manage the cloud infrastructure and platform on which the application is running. This eliminates the need to install and run the application on the cloud user's own computers, simplifying maintenance and support. What makes a cloud application different from other applications is its scalability. This can be achieved by cloning tasks onto multiple virtual machines at run-time to meet changing work demand.[49] Load balancers distribute the work over the set of virtual machines. This process is transparent to the cloud user, who sees only a single access point. To accommodate a large number of cloud users, cloud applications can be multitenant, that is, any machine serves more than one cloud-user organization. It is common to refer to special types of cloud-based application software with a similar naming convention: desktop as a service, business process as a service, test environment as a service, communication as a service. The pricing model for SaaS applications is typically a monthly or yearly flat fee per user,[50] so the price is scalable and adjustable if users are added or removed at any point.[51] Examples of SaaS include: Google Apps, innkeypos, Quickbooks Online, Successfactors Bizx, Limelight Video Platform, Salesforce.com, Microsoft Office 365 and Onlive.
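The load balancing described above, where identical VM clones sit behind a single access point, can be sketched with the simplest distribution policy, round-robin. The VM names are placeholders; a real balancer would forward network traffic rather than return a tuple.

```python
import itertools

class RoundRobinBalancer:
    """Sketch of SaaS load balancing: requests are spread over a pool of
    identical VM clones, while clients see only one access point."""

    def __init__(self, vms):
        self._cycle = itertools.cycle(vms)   # endless rotation over the pool

    def route(self, request):
        vm = next(self._cycle)               # pick the next VM in rotation
        return (vm, request)                 # a real system would forward it

balancer = RoundRobinBalancer(["vm-1", "vm-2", "vm-3"])
assignments = [balancer.route(f"req-{i}")[0] for i in range(6)]
```

Round-robin is only one policy; least-connections or load-aware schemes are common too, but all of them preserve the property the text emphasizes: the distribution is invisible to the cloud user.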
Cloud clients
See also: Category:Cloud clients
Users access cloud computing using networked client devices, such as desktop computers, laptops, tablets and smartphones. Some of these devices - cloud clients - rely on cloud computing for all or a majority of their applications, so that they are essentially useless without it. Examples are thin clients and the browser-based Chromebook. Many cloud applications do not require specific software on the client and instead use a web browser to interact with the cloud application. With Ajax and HTML5, these web user interfaces can achieve a look and feel similar to, or even better than, native applications. Some cloud applications, however, support specific client software dedicated to these applications (e.g., virtual desktop clients and most email clients). Some legacy applications (line-of-business applications that until now have been prevalent in thin-client Windows computing) are delivered via a screen-sharing technology.
Deployment models
Public cloud
Public cloud applications, storage, and other resources are made available to the general public by a service provider. These services are free or offered on a pay-per-use model. Generally, public cloud service providers like Amazon AWS, Microsoft and Google own and operate the infrastructure and offer access only via the Internet (direct connectivity is not offered).[28]
Community cloud
Community cloud shares infrastructure between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third party and hosted internally or externally. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud computing is realized.[4]
Hybrid cloud
Hybrid cloud is a composition of two or more clouds (private, community or public) that remain unique entities but are bound together, offering the benefits of multiple deployment models.[4] By utilizing "hybrid cloud" architecture, companies and individuals are able to obtain degrees of fault tolerance combined with locally immediate usability without dependency on Internet connectivity. Hybrid cloud architecture requires both on-premises resources and off-site (remote) server-based cloud infrastructure. Critics argue that hybrid clouds lack the flexibility, security and certainty of in-house applications;[52] proponents counter that hybrid cloud combines the flexibility of in-house applications with the fault tolerance and scalability of cloud-based services.
Private cloud
Private cloud is cloud infrastructure operated solely for a single organization, whether managed internally or by a third party and hosted internally or externally.[4] Undertaking a private cloud project requires a significant degree of engagement to virtualize the business environment, and it will require the organization to reevaluate decisions about existing resources. When done right, it can have a positive impact on a business, but every step in the project raises security issues that must be addressed in order to avoid serious vulnerabilities.[53] Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from less hands-on management,[54] essentially "[lacking] the economic model that makes cloud computing such an intriguing concept".[55][56]
Architecture
Cloud architecture,[57] the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue. Elastic provision implies intelligence in the use of tight or loose coupling as applied to mechanisms such as these and others.
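The loose coupling via a messaging queue can be illustrated with an in-process sketch: a producer posts work items to a queue and a worker consumes them, with neither component holding a direct reference to the other. In a real cloud the queue would be a distributed messaging service rather than Python's in-memory `queue.Queue`, but the decoupling pattern is the same.

```python
import queue
import threading

# Two loosely coupled components: the producer only knows the queue,
# and the worker only knows the queue - never each other.
tasks = queue.Queue()
results = []

def worker():
    while True:
        item = tasks.get()
        if item is None:            # sentinel value: shut the worker down
            break
        results.append(item.upper())  # stand-in for real processing
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()

for message in ["resize image", "send email"]:
    tasks.put(message)              # producer enqueues work and moves on

tasks.put(None)                     # tell the worker to stop
t.join()
```

Because the producer never blocks on the worker, either side can be scaled, replaced, or restarted independently, which is the elasticity benefit the architecture paragraph describes.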
The Intercloud
Main article: Intercloud The Intercloud[58] is an interconnected global "cloud of clouds"[59][60] and an extension of the Internet "network of networks" on which it is based.[61][62][63]
Cloud engineering
Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to the high-level concerns of commercialisation, standardisation, and governance in conceiving, developing, operating and maintaining cloud computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such as systems, software, web, performance, information, security, platform, risk, and quality engineering.
Issues
Privacy
The cloud model has been criticised by privacy advocates for the greater ease with which the companies hosting the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA program, working with AT&T and Verizon, which recorded over 10 million phone calls between American citizens, cause uncertainty among privacy advocates about the greater powers it gives to telecommunication companies to monitor user activity.[64] Using a cloud service provider (CSP) can complicate privacy of data because of the extent to which virtualization for cloud processing (virtual machines) and cloud storage are used to implement cloud services.[65] In CSP operations, customer or tenant data may not remain on the same system, in the same data center, or even within the same provider's cloud; this can lead to legal concerns over jurisdiction. While there have been efforts (such as US-EU Safe Harbor) to "harmonise" the legal environment, providers such as Amazon still cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones."[66] Cloud computing poses privacy concerns because the service provider may access the data on the cloud at any time, and could accidentally or deliberately alter or even delete information.[67] Postage and delivery services company Pitney Bowes launched Volly, a cloud-based digital mailbox service, to leverage its communication management assets. The company also faced the technical challenge of providing strong data security and privacy, and was able to address this concern by applying customized, application-level security, including encryption.[68]
Compliance
In order to obtain compliance with regulations including FISMA, HIPAA, and SOX in the United States, the Data Protection Directive in the EU, and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment modes that are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA"[69][70] and Rackspace Cloud or QubeSpace are able to claim PCI compliance.[71] Many providers also obtain a SAS 70 Type II audit, but this has been criticised on the grounds that the hand-picked set of goals and standards determined by the auditor and the auditee are often not disclosed and can vary widely.[72] Providers typically make this information available on request, under non-disclosure agreement.[73][74] Customers in the EU contracting with cloud providers outside the EU/EEA have to adhere to the EU regulations on export of personal data.[75] U.S. federal agencies have been directed by the Office of Management and Budget to use a process called FedRAMP (Federal Risk and Authorization Management Program) to assess and authorize cloud products and services. Federal CIO Steven VanRoekel issued a memorandum to federal agency Chief Information Officers on December 8, 2011 defining how federal agencies should use FedRAMP. FedRAMP consists of a subset of NIST Special Publication 800-53 security controls specifically selected to provide protection in cloud environments. A subset has been defined for the FIPS 199 low categorization and the FIPS 199 moderate categorization. The FedRAMP program has also established a Joint Accreditation Board (JAB) consisting of Chief Information Officers from DoD, DHS and GSA. The JAB is responsible for establishing accreditation standards for third-party organizations that will perform the assessments of cloud solutions. The JAB will also review authorization packages and may grant provisional authorization to operate; the federal agency consuming the service retains final authority to operate.[76]
Legal
As can be expected with any revolutionary change in the landscape of global computing, certain legal issues arise, ranging from trademark infringement and security concerns to the sharing of proprietary data resources.
Open source
See also: Category:Free software for cloud computing Open-source software has provided the foundation for many cloud computing implementations, prominent examples being the Hadoop framework[77] and VMware's Cloud Foundry.[78] In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended to close a perceived legal loophole associated with free software designed to be run over a network.[79]
Open standards
See also: Category:Cloud standards Most cloud providers expose APIs that are typically well-documented (often under a Creative Commons license[80]) but also unique to their implementation and thus not interoperable. Some vendors have adopted others' APIs and there are a number of open standards under development, with a view to delivering interoperability and portability.[81]
Security
Main article: Cloud computing security
As cloud computing achieves increased popularity, concerns are being voiced about the security issues introduced through adoption of this new model. The effectiveness and efficiency of traditional protection mechanisms are being reconsidered, as the characteristics of this innovative deployment model can differ widely from those of traditional architectures.[82] An alternative perspective on the topic of cloud security is that this is but another, although quite broad, case of "applied security", and that similar security principles that apply in shared multi-user mainframe security models apply with cloud security.[83] The relative security of cloud computing services is a contentious issue that may be delaying its adoption.[84] Physical control of private cloud equipment is more secure than having the equipment off site and under someone else's control, and physical control and the ability to visually inspect data links and access ports are required in order to ensure data links are not compromised. Issues barring the adoption of cloud computing are due in large part to the private and public sectors' unease surrounding the external management of security-based services. It is the very nature of cloud computing-based services, private or public, that promotes external management of provided services. This delivers great incentive to cloud computing service providers to prioritize building and maintaining strong management of secure services.[85] Security issues have been categorised into sensitive data access, data segregation, privacy, bug exploitation, recovery, accountability, malicious insiders, management console security, account control, and multi-tenancy issues. Solutions to various cloud security issues vary, from cryptography, particularly public key infrastructure (PKI), to use of multiple cloud providers, standardisation of APIs, and improving virtual machine support and legal support.[82][86][87] Cloud computing offers many benefits, but it is also vulnerable to threats. As the use of cloud computing increases, it is likely that more criminals will try to find new ways to exploit vulnerabilities in the system. There are many underlying challenges and risks in cloud computing that increase the threat of data being compromised. To help mitigate the threat, cloud computing stakeholders should invest in risk assessment to ensure that the system encrypts data, establishes a trusted foundation to secure the platform and infrastructure, and builds higher assurance into auditing to strengthen compliance. Security concerns must be addressed in order to establish trust in cloud computing technology.
Sustainability
Although cloud computing is often assumed to be a form of "green computing", there is as yet no published study to substantiate this assumption.[88] Siting the servers affects the environmental effects of cloud computing: in areas where climate favors natural cooling and renewable electricity is readily available, the environmental effects will be more moderate. (The same holds true for "traditional" data centers.) Thus countries with favorable conditions, such as Finland,[89] Sweden and Switzerland,[90] are trying to attract cloud computing data centers. Energy efficiency in cloud computing can result from energy-aware scheduling and server consolidation.[91] However, in the case of distributed clouds over data centers with different sources of energy, including renewable sources, a small compromise on energy consumption reduction could yield a large reduction in carbon footprint.[92]
[edit]Abuse
As with privately purchased hardware, customers can purchase the services of cloud computing for nefarious purposes. This includes password cracking and launching attacks using the purchased services.[93] In 2009, a banking trojan illegally used the popular Amazon service as a command and control channel that issued software updates and malicious instructions to PCs that were infected by the malware.[94]
[edit]IT governance
Main article: Corporate governance of information technology
The introduction of cloud computing requires an appropriate IT governance model to ensure a secure computing environment and to comply with all relevant organizational information technology policies.[95][96] As such, organizations need a set of capabilities that are essential to effectively implementing and managing cloud services, including demand management, relationship management, data security management, application lifecycle management, and risk and compliance management.[97]
[edit]Research
Many universities, vendors and government organizations are investing in research around the topic of cloud computing:[98][99]
In October 2007, the Academic Cloud Computing Initiative (ACCI) was announced as a multi-university project designed to enhance students' technical knowledge to address the challenges of cloud computing.[100]
In April 2009, UC Santa Barbara released the first open source platform-as-a-service, AppScale, which is capable of running Google App Engine applications at scale on a multitude of infrastructures.
In April 2009, the St Andrews Cloud Computing Co-laboratory was launched, focusing on research in the important new area of cloud computing. Unique in the UK, StACC aims to become an international centre of excellence for research and teaching in cloud computing and will provide advice and information to businesses interested in using cloud-based services.[101]
In October 2010, the TClouds (Trustworthy Clouds) project was started, funded by the European Commission's 7th Framework Programme. The project's goal is to research and inspect the legal foundations and architectural design needed to build a resilient and trustworthy cloud-of-clouds infrastructure. The project also develops a prototype to demonstrate its results.[102]
In December 2010, the TrustCloud research project [103][104] was started by HP Labs Singapore to address transparency and accountability of cloud computing via detective, data-centric approaches[105] encapsulated in a five-layer TrustCloud Framework. The team identified the need for monitoring data life cycles and transfers in the cloud,[103] leading to the tackling of key cloud computing security issues such as cloud data leakages, cloud accountability and cross-national data transfers in transnational clouds.
In July 2011, the High Performance Computing Cloud (HPCCLoud) project was kicked off, aiming to find out how performance can be enhanced in cloud environments while running scientific applications, through the development of the HPCCLoud Performance Analysis Toolkit. The project was funded by the CIM-Returning Experts Programme under the coordination of Prof. Dr. Shajulin Benedict.
In June 2011, the Telecommunications Industry Association developed a Cloud Computing White Paper, to analyze the integration challenges and opportunities between cloud services and traditional U.S. telecommunications standards.[106]
In 2011, FEMhub launched NCLab, a free SaaS application for science, technology, engineering and mathematics (STEM). NCLab has more than 10,000 users as of July 2012.
Cloud computing is not only cost effective, but using it also helps cut back on global waste. It is environmentally friendly since the infrastructure is shared by multiple users; downtime is cut in half and resources are stretched further.
Flexible
There is a high degree of flexibility when using cloud computing because people can opt out of using it whenever they want to. This is also one of the main reasons people love to use this method. Service level agreements are what cover the costs in this case: if the agreed quality is not provided, the provider has to pay a penalty cost.
Device Diversity
The cloud computing method can be accessed through various different electronic devices that have access to the internet. These devices include an iPad, smartphone, laptop or desktop computer.
Lots of Storage Space
When you use the internet with cloud services, your company will have much more room to store the files and data it needs to store.
Customize Settings
Last but not least, you will enjoy the fact that cloud computing allows you to customize your business applications. This is a great benefit because the world of online business is very competitive. Now these are the 10 cloud computing advantages.
About the Author: Global business expert Laurel Delaney is the founder of GlobeTrade.com (a Global TradeSource, Ltd. company). She is also the creator of Borderbuster, an e-newsletter, and The Global Small Business Blog, all highly regarded for their global small business coverage. You can reach Delaney at [email protected] or follow her on Twitter @LaurelDelaney.
One of the benefits of cloud computing is increased efficiency; services are rapidly deployed and ready for use in a matter of minutes versus the weeks or months it traditionally takes. But there is more to cloud computing than just getting your compute resources, storage capacity or application as a service within minutes. Based on personal experience with cloud consumers, here are the top five business benefits beyond efficiency.
Business agility
Getting the compute resources you need when you need them tends to shorten IT projects, resulting in fewer FTEs to deliver the project and a quicker and more predictable time-to-market. Being able to deliver results faster, cheaper and with higher quality might just give your business a competitive edge and make it more nimble on its feet. I have seen a data analytics project reduced from 4 months to just 3 weeks, cutting the project's time-to-market and overall cost significantly.
New business models
It has become much easier to start business innovation initiatives, often enabled by readily available cloud services. Utilizing or combining these services can result in new and innovative business models, generating new value propositions and resulting in new revenue streams. There are even companies that are building entirely new business models and value propositions solely using cloud services. I see this last category especially in small and medium enterprises, but also think of Spotify and BitCasa.
Fewer operational issues
Utilizing standardized services can significantly reduce issues and defects. This increases business continuity and reduces the time spent on operational issues, leaving more focus for the things that matter. Cloud computing allows you to deploy the same service or topology of services repeatedly, with the same result every time. This allows organizations to predictably deploy pre-built server images, application services or entire application landscapes defined using design patterns.
Better use of resources
On the other side of the business agility model, more efficient projects and fewer operational issues allow your employees to spend their time on other, more useful activities that may offer greater potential value to your business. This benefit is different for every organization and harder to quantify, but people are an organization's biggest asset, and this allows you to utilize that asset better. Another take on better resource usage rests on the principle of economies of scale: cloud service providers, in general, utilize physical resources more efficiently and reduce energy consumption in contrast to a traditional IT approach.
Less capital expense
There is some debate about the value of shifting from a capital expense (CapEx) model to an operational expense (OpEx) model.
Overall sentiment is that, specifically for short- and mid-term projects, the OpEx model is more attractive because there are no long-term financial commitments. In the OpEx model zero upfront investment is required, which allows organizations to start projects faster but also to end them without losing any investment in the cloud services. As you can see, there is much more to cloud computing than technology alone. The true power of cloud is what the technology, implementing rapidly deployed services in the cloud, can mean for your business.
Edwin Schouten is the Cloud Services Leader for IBM Global Technology Services in the Benelux region (Belgium, Netherlands and Luxembourg). He has 14 years of experience in IT, of which the last 8 years are in IT architecture. He is an Open Group Certified IT Architect, an Expert-level IBM IT Architect, and holds a Master of Science degree in IT architecture. He is an evangelist for cloud computing, both inside and outside of IBM, and an active participant in cloud computing standardization organizations in the Benelux.
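The CapEx-versus-OpEx trade-off described above can be made concrete with a simple break-even calculation. All the figures below are illustrative assumptions, not taken from the text: owning hardware only pays off once the cumulative cloud bill exceeds the purchase price plus running costs.

```python
def breakeven_months(capex, monthly_onprem_opex, monthly_cloud_cost):
    """Months after which owning hardware becomes cheaper than renting.

    Cumulative on-premises cost: capex + m * monthly_onprem_opex
    Cumulative cloud cost:       m * monthly_cloud_cost
    Returns None if renting never becomes the more expensive option.
    """
    saving_per_month = monthly_cloud_cost - monthly_onprem_opex
    if saving_per_month <= 0:
        return None  # cloud is at least as cheap every single month
    return capex / saving_per_month

# $60,000 of hardware with $1,500/month running costs vs $4,000/month
# in the cloud: owning only pays off after 24 months, so a short
# project favours the OpEx model, as the text argues.
print(breakeven_months(60000, 1500, 4000))  # prints 24.0
```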
Advantage
Elasticity
Our cloud offers you both horizontal and vertical scalability. Horizontal scalability allows you to rapidly change the number of separate instances to match changes in demand. Vertical scalability allows you to rapidly change the size of the instances themselves in a flexible way. Critically, the elasticity associated with Infrastructure-as-a-Service (IaaS) allows you to downscale quickly as well as upscale. This means you can meet peaks in demand but also avoid paying for excess capacity when you don't need it. An ideal example is mobile application hosting with its explosive potential growth.
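The horizontal side of this elasticity boils down to simple arithmetic: provision just enough instances to cover current demand, and release them when demand falls. A minimal sketch, with illustrative capacity figures:

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance, min_instances=1):
    """Horizontal scaling: how many identical instances cover current demand.

    Rounding up means the fleet never runs below demand; recomputing as
    demand falls is what lets you stop paying for excess capacity.
    """
    return max(min_instances, math.ceil(requests_per_sec / capacity_per_instance))

# If each instance handles 200 req/s, a spike from 400 to 1,200 req/s
# grows the fleet from 2 to 6 instances; when the peak passes, the
# same formula shrinks it back down.
peak_fleet = instances_needed(1200, 200)
```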
Quality of Service
We understand that the quality of our service is paramount for all our customers. That is why we are able to offer a 100% service level agreement (SLA) with a x50 compensation rate for qualifying service interruptions. Read our comprehensive cloud computing SLA.
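To make the compensation arithmetic concrete, here is a minimal sketch of how an N-times service credit might be computed. The 30-day month, the cap at the fee paid, and the exact formula are assumptions for illustration only, not the actual SLA terms:

```python
def sla_credit(monthly_fee, outage_minutes, multiplier=50):
    """Service credit under an N-times compensation SLA (illustrative).

    One common reading of "x50 compensation": the credit equals 50 times
    the pro-rata fee for the time the service was down, capped at the
    fee actually paid. These terms are assumptions, not a real contract.
    """
    minutes_per_month = 30 * 24 * 60  # 43,200 minutes in a 30-day month
    pro_rata_fee = monthly_fee * outage_minutes / minutes_per_month
    return min(multiplier * pro_rata_fee, monthly_fee)

# A 90-minute outage on a $100/month service: the pro-rata fee is about
# $0.21, so a x50 multiplier yields a credit of roughly $10.42.
credit = sla_credit(100, 90)
```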
Cost Reduction
Significant cost savings can be achieved by transitioning existing infrastructure to cloud infrastructure. By not owning physical hardware, users benefit from having no hardware maintenance costs or depreciation. Likewise, scalability and pay-per-usage allow tight cost controls and avoid over-purchase of capacity. This makes cloud hosting an ideal platform for cPanel reseller hosting and Plesk reseller hosting.
By removing the need to purchase physical infrastructure, cloud computing allows solutions to be developed and new products brought to market more quickly. This relates not only to the lead times and management overhead associated with actually purchasing physical hardware and software, but also to the internal decision-making processes. By using cloud computing the up-front investment is vastly reduced and the user isn't locked into longer-term contracts that represent a significant commitment.
Virtualisation
Virtualisation is the key technological advance that supports the new world of cloud computing. Virtualisation serves to mask the technological complexity of the new cloud computing platforms and provides the user with its benefits. By hiding the underlying complexity of virtualising hardware, users can benefit from increased ease of use and be relieved of the overhead of managing and maintaining physical hardware. By not being tied to a particular capacity, users of cloud computing can vary their resource consumption in a way that is much more adaptable and flexible. Finally, by using virtual instead of physical computing resources, location becomes unimportant and invisible. Resources from many different physical locations can be managed as one resource. A user needing additional capacity in New York, for example, can add capacity instantly whilst at the same time managing capacity in London and Tokyo on the same system. If physical hardware were used, such flexibility and speed of deployment would be impossible.
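The idea of managing capacity in New York, London and Tokyo as one resource can be sketched with a trivial pool abstraction. The class and the slot counts below are illustrative, not any real provider's API:

```python
class ResourcePool:
    """Aggregate virtual capacity across physical locations (a sketch).

    To the user the pool is one resource; where the slots physically
    live is tracked internally but invisible to capacity planning.
    """
    def __init__(self):
        self.capacity = {}  # location -> available instance slots

    def add_capacity(self, location, slots):
        self.capacity[location] = self.capacity.get(location, 0) + slots

    def total(self):
        return sum(self.capacity.values())

pool = ResourcePool()
pool.add_capacity("london", 5)
pool.add_capacity("tokyo", 8)
pool.add_capacity("new-york", 10)
pool.add_capacity("new-york", 4)  # add capacity in New York instantly
```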
Automation
Cloud computing resources are generally controllable via an API. One of the key benefits of an API is its ability to grant the user meaningful control of their resources from anywhere. Likewise, it is possible for users to write code and scripts to automate and manage their computing resources based on their requirements. Such arrangements with physical hardware are impossible or require expensive sophisticated software management systems. CloudSigma has such a full API interface allowing management and control of resources remotely and if required in an automated fashion.
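As a sketch of the kind of automation script described here, the following function decides a new instance count from average CPU utilisation. The thresholds and limits are arbitrary illustrative values, and the API calls named in the trailing comment are hypothetical, not a real provider's interface:

```python
def autoscale_decision(cpu_utilisation, current_count,
                       low=0.30, high=0.75, min_count=1, max_count=20):
    """Decide a new instance count from average CPU utilisation.

    Scale out one instance at a time above the high-water mark, scale
    in below the low-water mark, and hold steady in between. A real
    script would read these thresholds from configuration.
    """
    if cpu_utilisation > high and current_count < max_count:
        return current_count + 1   # scale out
    if cpu_utilisation < low and current_count > min_count:
        return current_count - 1   # scale in
    return current_count           # hold steady

# Such a function is typically run on a schedule against the provider's
# API (the calls below are hypothetical names, for illustration only):
#   metrics = api.get_metrics(group_id)
#   api.set_instance_count(group_id,
#       autoscale_decision(metrics.cpu, metrics.count))
```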
Life is easier with your head in the clouds, right? Perhaps. But does the same hold true for your company's data? Are there risks to implementing this progressive solution into your business? The answer is yes. Like most things in life, the benefits come with risks; it's just a matter of knowing whether the benefits outweigh the risks and vice versa. Let's take a look at some grounded facts about cloud computing to help you decide if you are ready to go up, up, and away.
Benefit 1: Flexibility
Network dependency may mean dependence on the internet, but it also means independence from the office. Employees are now able to access data from servers outside the office rather than hardwired in-house servers, creating a more flexible and mobile work lifestyle for organizations.
Not only does cloud computing provide flexibility for workers, it provides flexibility in implementing changes and new technologies without high risk and cost. Because organizations aren't bound to a hard-wired IT infrastructure that cost billions to create in the first place, they have room to experiment and change things, with the ability to just as easily revert to their original system if things do not work out.
Risk 3: Centralization
Organizations usually outsource data and application services to a centralized provider. In cloud computing, we know that network dependency is a drawback due to outages. Centralized data can certainly add another risk to cloud computing: if the provider's service goes down, all clients are affected.
Benefit 3: Reliability
While internet connectivity and the provider itself being subject to outages is a scary fact of the nature of cloud computing, there is still more reliability compared to in-house systems because of economies of scale. The vendor is better able to provide 24/7 technical support and highly trained, experienced staff to keep the infrastructure in its best condition, and the benefits reach all their clients. Compare this to each organization having a team of on-site IT people with varied skill sets.