AWS Overview
October 2012
Introduction
Managing the unique and groundbreaking changes in both technology and business over the past decade has created an ongoing IT infrastructure challenge for many senior technology executives. Over the past ten years, the typical business application architecture has evolved from desktop-centric installations to client/server solutions, and now to loosely coupled web services and service-oriented architectures (SOA). Each evolutionary step has built on the previous one while adding new challenges, dimensions, and opportunities for IT departments and their business partners. Recently, virtualization has become a widely accepted way to reduce operating costs and increase the reliability of enterprise IT, and grid computing has made possible a whole new class of analytics, data-crunching, and business intelligence tasks that were previously cost and time prohibitive. Along with these technology changes, the speed of innovation and the unprecedented acceleration in the introduction of new products have fundamentally changed the way markets work. Together with the wide acceptance of software as a service (SaaS) offerings, these changes have paved the way for the latest IT infrastructure challenge: cloud computing.
To understand the impact and promise of cloud computing, it helps to first consider the significance of, and the lessons learned from, business outsourcing. Focusing on a core competency and shifting peripheral tasks to other organizations is a proven business strategy. Today, organizations outsource business functions such as logistics, human resources (HR), payroll, and facilities, and many have used IT outsourcing to move some capabilities out of their internal organization altogether. Superficially, at least, cloud computing resembles this trend because it offers the same benefits of leveraging the expertise of others and being cost efficient. However, cloud computing also provides flexibility, scalability, elasticity, and reliability. These additional benefits are why enterprise organizations see cloud computing as a powerful next step in their IT infrastructure evolution.
Flexible
The first key difference between AWS and other IT models is flexibility. Using traditional models to deliver IT solutions often requires large investments in new architectures, programming languages, and operating systems. Although these investments are valuable, the time that it takes to adapt to new technologies can also slow down your business and prevent you from quickly responding to changing markets and opportunities. When the opportunity to innovate arises, you want to be able to move quickly, not be held back by legacy infrastructure and applications or protracted procurement processes. In contrast, the flexibility of AWS allows you to keep the programming models, languages, and operating systems that you are already using, or choose others that are better suited for your project. You don't have to learn new skills. Flexibility means that migrating legacy applications to the cloud is easy and cost-effective; instead of re-writing applications, you can move them to the AWS cloud and tap into advanced computing capabilities. Building applications on AWS is very much like building applications using existing hardware resources. Since AWS provides a flexible, virtual IT infrastructure, you can use the services together as a platform or separately for specific needs. AWS can run almost anything, from full web applications to batch processing to offsite data backups.
In addition, you can move existing SOA-based solutions to the cloud by migrating discrete components of legacy applications. Typically, these components benefit from high availability and scalability, or they are self-contained applications with few internal dependencies. Larger organizations typically run in a hybrid mode, where pieces of the application run in their data center and other portions run in the cloud. Once these organizations gain experience with the cloud, they begin transitioning more of their projects to the cloud, and they begin to appreciate many of the benefits outlined in this document. Ultimately, many organizations see the unique advantages of the cloud and AWS and make it a permanent part of their IT mix. Finally, AWS provides you flexibility when provisioning new services. Instead of the weeks and months it takes to plan, budget, procure, set up, deploy, operate, and hire for a new project, you can simply sign up for AWS and immediately deploy the equivalent of 1, 10, 100, or 1,000 servers in the cloud. Whether you want to prototype an application or host a production solution, AWS makes it simple for you to get started and be productive. Many customers find the flexibility of AWS to be a great asset in improving time to market and overall organizational productivity.
Cost-Effective
Cost is one of the most complex elements of delivering contemporary IT solutions. For every advance that promises to save money, there is often a commensurate investment needed to realize those savings. For example, developing and deploying an e-commerce application can be a low-cost effort, but a successful deployment can increase the need for hardware and bandwidth. Furthermore, owning and operating your own infrastructure incurs considerable costs, including power, cooling, real estate, and staff. In contrast, the cloud provides an on-demand IT infrastructure that lets you consume only the amount of resources that you actually need. You are not limited to a set amount of storage, bandwidth, or computing resources. It is often difficult to predict requirements for these resources; as a result, you might provision too few, which hurts customer satisfaction, or provision too many and miss the opportunity to maximize return on investment (ROI) through full utilization. The cloud provides the flexibility to strike the right balance. AWS requires no up-front investment, long-term commitment, or minimum spend. You can get started through a completely self-service experience online, scale up and down as needed, and terminate your relationship with AWS at any time. You can access new resources almost instantly. The ability to respond quickly to changes, no matter how large or small, means that you can take on new opportunities and meet business challenges that could drive revenue and reduce costs. If you want deeper technical discussions, the AWS sales and solutions architecture teams are available for consultation.
Secure
AWS delivers a scalable cloud-computing platform designed to provide end-to-end security and privacy. AWS builds security into its services in accordance with security best practices and documents how to use the security features. It is important that you leverage these features and best practices to design an appropriately secure application environment. Ensuring the confidentiality, integrity, and availability of your data is of the utmost importance to AWS, as is maintaining your trust and confidence. AWS takes the following approaches to secure the cloud infrastructure:

Certifications and accreditations. AWS has successfully completed multiple SAS 70 Type II audits in the past and now publishes a Service Organization Controls 1 (SOC 1) report under both the SSAE 16 and ISAE 3402 professional standards. In addition, AWS has achieved ISO 27001 certification and has been validated as a Level 1 service provider under the Payment Card Industry (PCI) Data Security Standard (DSS). In the public sector, AWS has received authorization from the U.S. General Services Administration to operate at the FISMA Moderate level and is also the platform for applications with Authorities to Operate (ATOs) under the Defense Information Assurance Certification and Accreditation Program (DIACAP). AWS will continue to obtain the appropriate security certifications and conduct audits to demonstrate the security of its infrastructure and services.

Physical security. Amazon has many years of experience designing, constructing, and operating large-scale data centers. The AWS infrastructure is located in Amazon-controlled data centers throughout the world. Knowledge of the location of these data centers is limited to those within Amazon who have a legitimate business reason for this information, and the data centers are physically secured in a variety of ways to prevent unauthorized access.

Secure services. Each service in the AWS cloud is architected to be secure, with capabilities that restrict unauthorized access or usage without sacrificing the flexibility that customers demand.

Data privacy. You can encrypt personal and business data in the AWS cloud, and AWS publishes backup and redundancy procedures for its services so that you can protect your data and keep your applications running.
For more information on security policies and procedures for AWS, consult the AWS Security Center at aws.amazon.com/security.
Experienced
AWS provides a low-friction path to cloud computing by design. Nevertheless, as with any IT project, the move to the AWS cloud should be done thoughtfully. You should hold your cloud-computing partner to the same high standards that you would expect of any hardware or software vendor; the trust that you place in that vendor becomes critical as your organization grows and your customers continue to expect the best experience. The AWS cloud provides levels of scale, security, reliability, and privacy that are often cost-prohibitive for individual organizations to meet or exceed. AWS has built an infrastructure based on lessons learned from over sixteen years' experience managing the multi-billion-dollar Amazon.com business, and AWS customers benefit as Amazon continues to hone its infrastructure management skills and capabilities. Today Amazon.com runs a global web platform serving millions of customers and managing billions of dollars' worth of commerce every year. AWS has been operating since 2006 and today serves hundreds of thousands of customers worldwide. Moreover, AWS has a demonstrated track record of listening to its customers and delivering highly innovative new features at a rapid pace, and these new releases meet the same high standards of security and reliability demonstrated in all existing AWS infrastructure services. In addition to new services, AWS constantly hones its operational expertise to ensure ongoing dependability and continues to incorporate both industry best practices and proprietary advances into the cloud infrastructure. Choosing AWS as a cloud-computing provider allows you to take advantage of these ongoing investments.
Compute
Amazon Elastic Compute Cloud (Amazon EC2)
Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides resizable compute capacity in the cloud. It is designed to make web-scale computing easier for developers and system administrators. Amazon EC2's simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon's proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use. Amazon EC2 also provides developers and system administrators with the tools to build failure-resilient applications and isolate themselves from common failure scenarios.
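To make this concrete, the following is a minimal sketch of launching and terminating an instance with the AWS SDK for Python (boto3, a newer SDK than the tools available when this paper was written). The AMI ID and instance type are placeholders, and AWS credentials and a default region are assumed to be configured.

```python
import boto3

ec2 = boto3.client("ec2")

# Launch a single instance from a placeholder AMI; capacity is available
# in minutes rather than the weeks a hardware purchase would take.
response = ec2.run_instances(
    ImageId="ami-12345678",    # hypothetical AMI ID for your region
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)

# Terminate the instance when it is no longer needed; billing for the
# instance stops once it is terminated.
ec2.terminate_instances(InstanceIds=[instance_id])
```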
Auto Scaling
Auto Scaling allows you to scale your Amazon EC2 capacity up or down automatically according to conditions you define. With Auto Scaling, you can ensure that the number of Amazon EC2 instances youre using increases seamlessly during demand spikes to maintain performance, and decreases automatically during demand lulls to minimize costs. Auto Scaling is particularly well suited for applications that experience hourly, daily, or weekly variability in usage. Auto Scaling is enabled by Amazon CloudWatch and available at no additional charge beyond Amazon CloudWatch fees.
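As a rough illustration, the sketch below creates a small Auto Scaling group and a scale-out policy using boto3. The launch configuration name, AMI ID, group name, and Availability Zones are assumptions chosen for the example.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# A launch configuration describes what each instance in the group looks like.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-launch-config",  # hypothetical name
    ImageId="ami-12345678",                       # hypothetical AMI ID
    InstanceType="t2.micro",
)

# Keep between 2 and 10 instances running across two Availability Zones.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    LaunchConfigurationName="web-launch-config",
    MinSize=2,
    MaxSize=10,
    AvailabilityZones=["us-east-1a", "us-east-1b"],
)

# A policy that adds one instance each time it is triggered, for example
# by a CloudWatch alarm on average CPU utilization.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="scale-out-on-cpu",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=1,
)
```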
Storage
Amazon Simple Storage Service (Amazon S3)
Amazon S3 is storage for the Internet. It is designed to make web-scale computing easier for developers. Amazon S3 provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. The container for objects stored in Amazon S3 is called an Amazon S3 bucket. Amazon S3 gives any developer access to the same highly scalable, reliable, secure, fast, inexpensive infrastructure that Amazon uses to run its own global network of websites. The service aims to maximize benefits of scale and to pass those benefits on to developers.
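A minimal sketch of the storage model with boto3 follows; the bucket and key names are hypothetical, and the bucket name must be globally unique.

```python
import boto3

s3 = boto3.client("s3")

# A bucket is the container for objects; bucket names are globally unique.
s3.create_bucket(Bucket="example-backup-bucket-2012")  # hypothetical name

# Store an object under a key, then read it back over the same interface.
s3.put_object(
    Bucket="example-backup-bucket-2012",
    Key="reports/2012-10.csv",
    Body=b"date,total\n2012-10-01,42\n",
)
obj = s3.get_object(Bucket="example-backup-bucket-2012", Key="reports/2012-10.csv")
print(obj["Body"].read().decode())
```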
Amazon Glacier
Amazon Glacier is an extremely low-cost storage service that provides secure and durable storage for data archiving and backup. In order to keep costs low, Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable. With Amazon Glacier, customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month, a significant savings compared to on-premises solutions. Companies typically over-pay for data archiving. First, they're forced to make an expensive upfront payment for their archiving solution (which does not include the ongoing cost for operational expenses such as power, facilities, staffing, and maintenance). Second, since companies have to guess what their capacity requirements will be, they understandably over-provision to make sure they have enough capacity for data redundancy and unexpected growth. This set of circumstances results in under-utilized capacity and wasted money. With Amazon Glacier, you pay only for what you use. Amazon Glacier changes the game for data archiving and backup because you pay nothing up front, pay a very low price for storage, and can scale your usage up or down as needed, while AWS handles all of the operational heavy lifting required to do data retention well. It only takes a few clicks in the AWS Management Console to set up Amazon Glacier, and then you can upload any amount of data you choose.
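As a sketch of the archive-and-retrieve workflow in boto3 (the vault name and local file are hypothetical, and retrieval jobs typically take several hours to complete):

```python
import boto3

glacier = boto3.client("glacier")

# A vault is the container for archives.
glacier.create_vault(vaultName="monthly-backups")  # hypothetical vault name

# Upload an archive; Glacier returns an archive ID used for later retrieval.
with open("backup-2012-10.tar.gz", "rb") as f:     # hypothetical local file
    archive = glacier.upload_archive(
        vaultName="monthly-backups",
        archiveDescription="October 2012 backup",
        body=f,
    )
print("Archive ID:", archive["archiveId"])

# Retrieval is asynchronous: initiate a job, then download the job output
# once Glacier reports the job as complete (typically hours later).
glacier.initiate_job(
    vaultName="monthly-backups",
    jobParameters={"Type": "archive-retrieval",
                   "ArchiveId": archive["archiveId"]},
)
```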
AWS Import/Export
AWS Import/Export accelerates moving large amounts of data into and out of AWS using portable storage devices for transport. AWS transfers your data directly onto and off of storage devices using Amazons high-speed internal network and bypassing the Internet. For significant data sets, AWS Import/Export is often faster than Internet transfer and more cost effective than upgrading your connectivity. AWS Import/Export supports importing and exporting data into and out of Amazon S3 buckets in the US Standard, US West (Oregon), US West (Northern California), EU (Ireland), and Asia Pacific (Singapore) regions. The service also supports importing data into Amazon Elastic Block Store (Amazon EBS) snapshots in the US East (N. Virginia), US West (Oregon), and US West (Northern California) regions.
Content Delivery
Amazon CloudFront
Amazon CloudFront is a web service for content delivery. It integrates with other Amazon Web Services to give developers and businesses an easy way to distribute content to end users with low latency, high data transfer speeds, and no commitments. Amazon CloudFront can be used to deliver your entire website, including dynamic, static, and streaming content, using a global network of edge locations. Requests for objects are automatically routed to the nearest edge location, so content is delivered with the best possible performance. Amazon CloudFront is optimized to work with other Amazon Web Services, like Amazon S3 and Amazon EC2, and it also works seamlessly with any origin server, which stores the original, definitive versions of your files. Like other Amazon Web Services, there are no contracts or monthly commitments for using Amazon CloudFront; you pay only for as much or as little content as you actually deliver through the service.
Database
Amazon Relational Database Service (Amazon RDS)
Amazon Relational Database Service (Amazon RDS) is a web service that makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while managing time-consuming database administration tasks, freeing you up to focus on your applications and business. Amazon RDS gives you access to the capabilities of a familiar MySQL, Oracle, or SQL Server database. This means that the code, applications, and tools you already use today with your existing databases can be used with Amazon RDS. Amazon RDS automatically patches the database software and backs up your database, storing the backups for a retention period that you define and enabling point-in-time recovery. You benefit from the flexibility of being able to scale the compute resources or storage capacity associated with your relational database instance by using a single API call. In addition, Amazon RDS makes it easy to use replication to enhance availability and reliability for production databases and to scale out beyond the capacity of a single database deployment for read-heavy database workloads.
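The sketch below shows the single-API-call provisioning and scaling described above, using boto3; the instance identifier, instance class, and credentials are placeholder assumptions.

```python
import boto3

rds = boto3.client("rds")

# Create a managed MySQL instance; AWS handles patching and automated backups.
rds.create_db_instance(
    DBInstanceIdentifier="app-db",            # hypothetical identifier
    Engine="mysql",
    DBInstanceClass="db.t3.micro",            # placeholder instance class
    AllocatedStorage=20,                      # GiB
    MasterUsername="admin",
    MasterUserPassword="change-me-please",    # placeholder credential
    BackupRetentionPeriod=7,                  # days of automated backups
)

# Scaling storage later is a single API call on the same instance.
rds.modify_db_instance(
    DBInstanceIdentifier="app-db",
    AllocatedStorage=100,
    ApplyImmediately=True,
)
```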
Amazon DynamoDB
Amazon DynamoDB is a fully managed NoSQL database service that provides fast and predictable performance with seamless scalability. With a few clicks in the AWS Management Console, customers can launch a new Amazon DynamoDB database table, scale up or down their request capacity for the table without downtime or performance degradation, and gain visibility into resource utilization and performance metrics. Amazon DynamoDB enables customers to offload the administrative burdens of operating and scaling distributed databases to AWS, so customers don't have to worry about hardware provisioning, setup and configuration, replication, software patching, or cluster scaling. Amazon DynamoDB is designed to address the core problems of database management, performance, scalability, and reliability. Developers can create a database table that can store and retrieve any amount of data, and serve any level of request traffic. DynamoDB automatically spreads the data and traffic for the table over a sufficient number of servers to handle the request capacity specified by the customer and the amount of data stored, while maintaining consistent, fast performance. All data items are stored on solid state drives (SSDs) and are automatically replicated across multiple Availability Zones in a Region to provide built-in high availability and data durability. Amazon DynamoDB enables customers to offload the administrative burden of operating and scaling a highly available, distributed database cluster while only paying a low variable price for the resources they consume.
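A minimal sketch of the table-and-item model in boto3; the table name, key, and attribute values are hypothetical, and the provisioned throughput numbers are arbitrary.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Create a table keyed by a single hash key, with request capacity
# provisioned up front; DynamoDB spreads data and traffic automatically.
dynamodb.create_table(
    TableName="GameScores",  # hypothetical table name
    AttributeDefinitions=[{"AttributeName": "PlayerId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "PlayerId", "KeyType": "HASH"}],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)
dynamodb.get_waiter("table_exists").wait(TableName="GameScores")

# Store and retrieve items by key.
dynamodb.put_item(
    TableName="GameScores",
    Item={"PlayerId": {"S": "alice"}, "HighScore": {"N": "1200"}},
)
item = dynamodb.get_item(TableName="GameScores", Key={"PlayerId": {"S": "alice"}})
print(item["Item"])
```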
Amazon ElastiCache
Amazon ElastiCache is a web service that makes it easy to deploy, operate, and scale an in-memory cache in the cloud. The service improves the performance of web applications by allowing you to retrieve information from a fast, managed, in-memory caching system, instead of relying entirely on slower disk-based databases. Amazon ElastiCache is protocol-compliant with Memcached (a widely adopted memory object caching system) so code, applications, and popular tools that you use today with existing Memcached environments will work seamlessly with the service. Amazon ElastiCache simplifies and offloads the management, monitoring, and operation of in-memory cache environments, enabling you to focus on the differentiating parts of your applications. Using Amazon ElastiCache, you can add an in-memory cache to your application architecture in a matter of minutes. With a few clicks of the AWS Management Console, you can launch a Cache Cluster consisting of a collection of Cache Nodes, each node running Memcached software. You can then scale the amount of memory associated with your Cache Cluster in minutes by adding or deleting Cache Nodes to meet the demands of your changing workload. In addition, Amazon ElastiCache automatically detects and replaces failed Cache Nodes, providing a resilient system that mitigates the risk of overloaded databases, which slow website and application load times. Through integration with Amazon CloudWatch, Amazon ElastiCache provides enhanced visibility into key performance metrics associated with your Cache Nodes.
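As a rough sketch, the example below creates and later resizes a Memcached cluster with boto3 (the cluster ID and node type are placeholder assumptions); the application then connects to the cluster's nodes with a standard Memcached client.

```python
import boto3

elasticache = boto3.client("elasticache")

# Launch a two-node Memcached cluster; each node runs standard Memcached,
# so existing client libraries and tools work unchanged.
elasticache.create_cache_cluster(
    CacheClusterId="web-session-cache",  # hypothetical cluster ID
    Engine="memcached",
    CacheNodeType="cache.t3.micro",      # placeholder node type
    NumCacheNodes=2,
)

# Scale the cluster later by changing the node count.
elasticache.modify_cache_cluster(
    CacheClusterId="web-session-cache",
    NumCacheNodes=4,
    ApplyImmediately=True,
)
```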
Amazon CloudWatch
Amazon CloudWatch provides monitoring for AWS cloud resources and the applications customers run on AWS. Developers and system administrators can use it to collect and track metrics, gain insight, and react immediately to keep their applications and businesses running smoothly. Amazon CloudWatch monitors AWS resources such as Amazon EC2 and Amazon RDS DB Instances, and can also monitor custom metrics generated by a customer's applications and services. With Amazon CloudWatch, you gain system-wide visibility into resource utilization, application performance, and operational health. Amazon CloudWatch provides a reliable, scalable, and flexible monitoring solution that you can start using within minutes. You no longer need to set up, manage, or scale your own monitoring systems and infrastructure. Using Amazon CloudWatch, you can easily monitor as much or as little metric data as you need. Amazon CloudWatch lets you programmatically retrieve your monitoring data, view graphs, and set alarms to help you troubleshoot, spot trends, and take automated action based on the state of your cloud environment.
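A brief sketch of the two sides of the service in boto3: publishing a custom metric and alarming on a built-in EC2 metric. The namespace, metric name, instance ID, and threshold are assumptions for the example.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Publish a custom application metric.
cloudwatch.put_metric_data(
    Namespace="MyApp",  # hypothetical namespace
    MetricData=[{"MetricName": "SignupCount", "Value": 12, "Unit": "Count"}],
)

# Alarm on sustained high CPU for an EC2 instance; the alarm can notify
# operators or trigger an Auto Scaling policy.
cloudwatch.put_metric_alarm(
    AlarmName="high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
)
```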
AWS Elastic Beanstalk
Most existing application containers and platform-as-a-service solutions, while reducing the amount of programming required, significantly diminish developers' flexibility and control. Developers are forced to live with all the decisions predetermined by the vendor and have little to no opportunity to take back control over various parts of their application's infrastructure. With Elastic Beanstalk, however, you retain full control over the AWS resources powering your application. If you decide you want to take over some (or all) of the elements of your infrastructure, you can do so seamlessly by using Elastic Beanstalk's management capabilities. To ensure easy portability of your application, Elastic Beanstalk is built using familiar software stacks such as the Apache HTTP Server for PHP and Python, Apache Tomcat for Java, and Microsoft IIS for .NET web applications. There is no additional charge for Elastic Beanstalk; you pay only for the AWS resources you need to store and run your applications.
AWS CloudFormation
AWS CloudFormation gives developers and systems administrators an easy way to create and manage a collection of related AWS resources, provisioning and updating them in an orderly and predictable fashion. You can use AWS CloudFormation's sample templates or create your own templates to describe the AWS resources, and any associated dependencies or runtime parameters, required to run your application. You don't need to figure out the order in which AWS services need to be provisioned or the subtleties of how to make those dependencies work; AWS CloudFormation takes care of this for you. Once deployed, you can modify and update the AWS resources in a controlled and predictable way. This allows you to version control your AWS infrastructure in the same way as you version control your software. You can deploy and update a template and its associated collection of resources (called a stack) using the AWS Management Console, the AWS CloudFormation command line tools, or the CloudFormation API. AWS CloudFormation is available at no additional charge, and you pay only for the AWS resources needed to run your applications.
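As a sketch, the example below creates a stack from a deliberately tiny template describing a single EC2 instance; the stack name and AMI ID are assumptions, and real templates typically declare many interdependent resources.

```python
import json
import boto3

cloudformation = boto3.client("cloudformation")

# A minimal template: one EC2 instance. CloudFormation works out the
# provisioning order and dependencies from the template itself.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-12345678",   # hypothetical AMI ID
                "InstanceType": "t2.micro",
            },
        }
    },
}

# A stack is the running collection of resources created from a template;
# updating the template later updates the stack in a controlled way.
cloudformation.create_stack(
    StackName="demo-stack",               # hypothetical stack name
    TemplateBody=json.dumps(template),
)
```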
Application Services
Amazon Simple Queue Service (Amazon SQS)
Amazon Simple Queue Service (Amazon SQS) offers a reliable, highly scalable, hosted queue for storing messages as they travel between computers. By using Amazon SQS, developers and administrators can simply move data between distributed components of their applications that perform different tasks, without losing messages or requiring each component to be always available. Amazon SQS makes it easy to build an automated workflow, working in close conjunction with Amazon EC2 and the other AWS infrastructure web services. Amazon SQS works by exposing Amazon's web-scale messaging infrastructure as a web service. Any computer on the Internet can add or read messages without any installed software or special firewall configurations. Components of applications using Amazon SQS can run independently, and do not need to be on the same network, developed with the same technologies, or running at the same time.
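A short sketch of the producer/consumer pattern in boto3; the queue name and message body are hypothetical.

```python
import boto3

sqs = boto3.client("sqs")

# Create a queue for passing work between decoupled components.
queue_url = sqs.create_queue(QueueName="order-events")["QueueUrl"]  # hypothetical name

# A producer enqueues a message without knowing which component will process it.
sqs.send_message(QueueUrl=queue_url, MessageBody='{"order_id": 42}')

# A consumer polls for messages, processes them, and deletes them so they
# are not delivered again.
response = sqs.receive_message(
    QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10
)
for message in response.get("Messages", []):
    print("processing", message["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```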
Amazon CloudSearch
Amazon CloudSearch is a fully managed search service in the cloud that allows customers to easily integrate fast and highly scalable search functionality into their applications. With a few clicks in the AWS Management Console, developers create a search domain and upload the data they want to make searchable, and Amazon CloudSearch automatically provisions the required technology resources and deploys a highly tuned search index. Amazon CloudSearch seamlessly scales as the amount of searchable data increases or as the query rate changes. Developers can change search parameters, fine-tune search relevance, and apply new settings at any time without having to upload the data again. Amazon CloudSearch enables customers to offload the administrative burden of operating and scaling a search platform; customers don't have to worry about hardware provisioning, data partitioning, or software patches.
Networking
Amazon Virtual Private Cloud (Amazon VPC)
Amazon Virtual Private Cloud (Amazon VPC) lets you provision a private, isolated section of the AWS cloud where you can launch AWS resources in a virtual network that you define. With Amazon VPC, you can define a virtual network topology that closely resembles a traditional network that you might operate in your own data center. You have complete control over your virtual networking environment, including selection of your own IP address range, creation of subnets, and configuration of route tables and network gateways. You can easily customize the network configuration for your Amazon VPC. For example, you can create a public-facing subnet for your web servers that has access to the Internet, and place your backend systems, such as databases or application servers, in a private subnet with no Internet access. You can leverage multiple layers of security (including security groups and network access control lists) to help control access to Amazon EC2 instances in each subnet. Additionally, you can create a hardware virtual private network (VPN) connection between your corporate data center and your VPC and leverage the AWS cloud as an extension of your corporate data center.
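The sketch below builds the two-subnet topology described above with boto3: a VPC, a public subnet routed to an Internet gateway, and a private subnet with no Internet route. The CIDR blocks are example values.

```python
import boto3

ec2 = boto3.client("ec2")

# A VPC with a public and a private subnet, mirroring a traditional
# two-tier network topology.
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
public_subnet_id = ec2.create_subnet(
    VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]
private_subnet_id = ec2.create_subnet(
    VpcId=vpc_id, CidrBlock="10.0.2.0/24")["Subnet"]["SubnetId"]

# Only the public subnet gets a route to the Internet through a gateway;
# the private subnet keeps the VPC's default local-only routing.
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)

route_table_id = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=route_table_id,
                 DestinationCidrBlock="0.0.0.0/0",
                 GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=route_table_id, SubnetId=public_subnet_id)
```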
Amazon Route 53
Amazon Route 53 is a highly available and scalable Domain Name System (DNS) web service. It is designed to give developers and businesses an extremely reliable and cost-effective way to route end users to Internet applications by translating human-readable names, such as www.example.com, into the numeric IP addresses, such as 192.0.2.1, that computers use to connect to each other. Route 53 effectively connects user requests to infrastructure running in AWS, such as an EC2 instance, an elastic load balancer, or an Amazon S3 bucket. Route 53 can also be used to route users to infrastructure outside of AWS. Amazon Route 53 is designed to be fast, easy to use, and cost effective. It answers DNS queries with low latency by using a global network of DNS servers. Queries for your domain are automatically routed to the nearest DNS server, and thus are answered with the best possible performance. With Route 53, you can create and manage your public DNS records with the AWS Management Console or with an easy-to-use API. It's also integrated with other Amazon Web Services. For instance, by using the AWS Identity and Access Management (IAM) service with Route 53, you can control who in your organization can make changes to your DNS records. Like other Amazon Web Services, there are no long-term contracts or minimum usage requirements for using Route 53; you pay only for managing domains through the service and the number of queries that the service answers.
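A minimal sketch of managing a record through the API with boto3; the hosted zone ID is a placeholder for a zone you already manage, and the name and address come from the example above.

```python
import boto3

route53 = boto3.client("route53")

# Point www.example.com at a web server's IP address by upserting an A record.
route53.change_resource_record_sets(
    HostedZoneId="Z1234567890ABC",  # hypothetical hosted zone ID
    ChangeBatch={
        "Comment": "Point the www record at the web tier",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "A",
                "TTL": 300,
                "ResourceRecords": [{"Value": "192.0.2.1"}],
            },
        }],
    },
)
```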
As you evaluate these services for your own infrastructure, apply the lenses of flexibility, cost effectiveness, scalability, elasticity, and security. Taking advantage of Amazon Web Services will allow you to focus on your core competencies and leverage the resources and experience Amazon provides.