
Serverless Computing: Challenges and Road Ahead

Manan Sharma, Deepak Saini, Utpal Chaudhary, Riteek Raj Modi, Vani Gupta,
Monika

Chandigarh University, Mohali, Punjab, India

{20BCS4130,20BCS4066,20BCS4160,20BCS4115,20BCS4068,
20BCS4070}@cuchd.in

Abstract – With the advent of cloud computing, application building and deployment have evolved into new dimensions, and serverless is one such dimension. Serverless computing, as the name suggests, enables a user to deploy an application without worrying about servers; users are billed only for execution time rather than for the resources allocated to them. Cloud programming models have been evolving continuously, and it is this continuous effort that has resulted in serverless computing. The model evolved as a way to achieve optimum cost, minimize configuration overhead, and increase an application's ability to scale in the cloud. In this paper, we discuss how serverless computing has evolved in recent years and explore various challenges, or roadblocks, in serverless computing with a practical implementation on AWS. Further, we look into the future applications of serverless computing and see how much more it can evolve.

Keywords: Cloud Computing, Virtual Machines, Containers, Serverless, AWS Lambda, Amazon DynamoDB, Cold Start Problem

1. Introduction
Cloud computing is the on-demand delivery of computing resources, such as computing power and storage, usually over the public network. It relies on sharing resources among its users and typically uses the 'pay-as-you-go' model, which means users pay only OPEX (operational expense) rather than CAPEX (capital expense). This has certainly helped reduce capital expenses, but it has also led to unexpected operating expenses for users.

AWS was one of the first companies to commercialize cloud computing. It was launched in July 2002 with the aim of enabling innovative and entrepreneurial applications on the cloud. By 2010, other vendors such as Microsoft Azure and Google Cloud Platform had come into the picture. After June 2015, the term 'serverless' began to gain popularity following the release of the AWS service called AWS Lambda. At present, there are many cloud providers besides AWS, such as Microsoft Azure, Google Cloud, Rackspace, IBM, and Oracle.

With so many vendors offering the same product, it becomes difficult to choose the right one. A business should consider factors such as cost, performance, limits, programming language, programming model, composability, deployment, security, accounting, monitoring, and debugging before choosing a vendor.

AWS dominates the market when it comes to cloud computing; Fig. 1.1 gives a pictorial representation of the different cloud vendors and their shares of the market.

Fig. 1.1. Different Cloud Providers and their Market Share

For many years in cloud computing, the entire machine, down to the hardware layers, was virtualized with the help of a hypervisor. In this architecture, a pool of virtual machines, each carrying its own copy of the operating system, was created on top of the physical machine. In short, virtual machines create a copy of a hardware or computer system, making it possible for a single machine to host several virtual machines that act as if they were separate pieces of hardware.

Fig. 1.2. Architecture for Virtual Machines



These virtual machines did not interact with other virtual machines on the same host; they were self-contained units containing copies of everything from the operating system to libraries and databases. Over time, however, this turned out not to be the most effective architecture. Whereas the old approach virtualized down to the hardware level, in the newer architecture virtualization is limited to the software levels above the operating system. This architecture makes use of containers instead of virtual machines. According to IBM, "Containers are a lighter-weight, more agile way of handling virtualization — since they don't use a hypervisor, you can enjoy faster resource provisioning and speedier availability of new applications." So, instead of virtualizing the hardware, containers virtualize the operating system: each individual container holds only the application and its libraries and dependencies. Containers deliver a much higher level of abstraction than virtual machines and also provision resources faster.

Fig. 1.3. Architecture for Containers

Containers are integral elements of many PaaS (Platform as a Service) offerings. However, containers aren't limited to PaaS; organizations often use them in local data centers for the exact same benefits, enabling users to easily deploy and spawn applications within containers. Although containers are far more efficient and faster than virtual machines, their growth is still constrained by a basic infrastructural element: servers. This gives rise to a new computing architecture known as serverless computing.

Although it is called serverless, this does not mean that no servers are involved; rather, users can develop their applications without thinking about servers, because cloud providers manage the servers on their behalf. Serverless computing is a programming model and cloud architecture in which the application is decomposed into 'triggers' (events) and 'actions' (functions), where small code snippets are executed in the cloud without any control over the resources. Users are billed not for the resources allocated but for the execution time. The application consumes resources only at execution time and releases them once execution is over. The pricing model covers only the time during which the resources were in use, and the application developer need not pay for resources until they are used; hence the term 'serverless'.

Fig. 1.4. Architecture for Serverless computing

Since users are charged only for execution time and the whole process revolves around functions, this model is also known as FaaS (Function as a Service). FaaS takes the simple, familiar approach of writing functions to reuse code in later programs and applies it to the cloud: developers register functions on the various cloud service platforms, which support a variety of languages such as C++, Java, Python, and JavaScript. Users can then define the event that triggers each function.

The core capability of a serverless platform is that of an event-processing system, as depicted in Fig. 1.5. The service must manage a set of user-defined functions; accept events sent over HTTP or received from event sources; decide which function(s) to dispatch each event to; locate an existing instance of the function or create a new one; send the event to the function instance; wait for a response; collect execution logs; make the response available to the user; and stop the function when it is no longer required. Implementing such a capability while taking factors like cost, scalability, and fault tolerance into account is difficult. The platform must launch a function and handle its input rapidly and effectively.

The platform also needs to queue events and, based on the state of the queues and the arrival rate of events, schedule the execution of functions and stop and deallocate resources for idle function instances. Moreover, cloud providers need to consider scalability and failure management in the cloud, provide heterogeneous hardware support, and explore support for a variety of programming languages.
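The dispatch loop described above can be sketched in a few lines of Python. This is a toy model only; all class and function names are illustrative and not part of any real platform:

```python
# Toy sketch of a serverless event-processing loop: map an incoming event to a
# registered function, reuse a "warm" instance when one exists, create one otherwise.
class ToyServerlessPlatform:
    def __init__(self):
        self.functions = {}   # name -> user-defined function
        self.instances = {}   # name -> warm instance (here: just the function itself)

    def register(self, name, fn):
        """Register a user-defined function under a trigger name."""
        self.functions[name] = fn

    def dispatch(self, event):
        name = event["trigger"]
        if name not in self.instances:              # no warm instance: "cold start"
            self.instances[name] = self.functions[name]
        return self.instances[name](event["payload"])  # invoke and collect the response

platform = ToyServerlessPlatform()
platform.register("greet", lambda payload: f"hello {payload}")
print(platform.dispatch({"trigger": "greet", "payload": "world"}))  # hello world
```

A real platform would additionally queue events, run each instance in an isolated container, collect logs, and reap idle instances, which is exactly where the engineering difficulty described above lies.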

Serverless computing has limitless opportunities in various fields, such as machine learning, image and video processing, IoT software, e-commerce, and banking. We are only at the onset of serverless computing; its affordable scalability, ease, and simplicity compared to physical servers are some of the main reasons for its popularity.

2. Literature Review

Dahanayake A. (2017) found in their research that serverless architecture helps reduce operational costs in comparison with the traditional solution of constantly keeping a server running. As a result of the study of Acharya B. (2020), a fully serverless, scalable application was designed and implemented on a serverless database with a newly built API. An effort was made to demonstrate how writing functions, watching DynamoDB streams, the events subsystem, the API, and the storage pattern should be carried out. Finally, that study discusses the advantages of next-generation serverless technology, which frees developers from thinking about scaling and managing servers.

Utomo P. (2020) deep-dived into serverless computing, discussing how GitHub provides an integrated environment for developing and publishing websites. With GitHub Pages, static websites can be hosted easily, quickly, and for free. It helps developers integrate the development and deployment process, because GitHub Pages is integrated with the GitHub environment, which supports continuous integration and continuous delivery, and it also acts as a content delivery network (CDN) as part of the JAMstack building blocks.

Vergadia P. (2020) discussed how Compute Engine offers the scale, performance, and value that allow you to easily launch large compute clusters on Google's infrastructure. For security and authentication, numerous techniques provided by AWS were discovered, including the use of a Single Sign-On session, by Dani A. (2020). OTP-based authentication was found to be more reliable and secure than HTTP-based methods. Cloud implications relating to infrastructure elasticity, load balancing, provisioning variation, infrastructure, and memory reservation size were studied.

Shelar N. (2021) talked about creating a simple serverless website that lets users request feedback forms for college surveys. Serverless here refers to the services, practices, and strategies used to build a website so as to innovate and iterate faster. Serverless computing takes over infrastructure management tasks such as capacity provisioning and patching.

Krishnan H. (2021) found that with serverless computing, the time required to reach the market is low, the approach is cost-effective, and it provides higher efficiency. In serverless cloud computing, the developer need not worry about owning, managing, and maintaining servers, as this is carried out by the cloud service provider.

Anand K. (2021) eloquently discussed the technologies used to construct a serverless web application on the AWS platform, using services such as AWS Lambda, Amazon API Gateway, AWS Amplify, Amazon DynamoDB, and Amazon Cognito. Amazon DynamoDB serves as the database for the web application; AWS Lambda functions read from and write to the database; Amazon API Gateway exposes a REST API that allows the web application to interact with the backend; Amazon S3 hosts the web application; and Amazon CloudFront delivers it from the location nearest to the user.

Summary of the reviewed studies (author, year, technologies used, limitations, advantages):

Ajantha Dahanayake (2017)
Technologies used: AWS (Amazon Web Services) Lambda, API Gateway, CloudWatch.
Limitations: The development of the serverless prototype application is limited to the Amazon Web Services (AWS) environment.
Advantages: Shows how serverless architecture helps reduce operational costs in comparison with the traditional solution of constantly keeping a server running.

Prayudi Utomo (2020)
Technologies used: GitHub platform.
Limitations: The usage of GitHub as web hosting and JAMstack as an approach for developing the web is not very popular among developers.
Advantages: GitHub Pages hosts static websites easily, quickly, and for free, with integrated continuous integration and delivery and CDN support.

Bibek Acharya (2020)
Technologies used: AWS Lambda, API Gateway, S3, Amazon DynamoDB, SNS, and CloudWatch.
Limitations: Complexity in debugging and handling stateless functions; separate tools and an IDE might help.
Advantages: Employing Lambda as the serverless logic, faster, event-driven, cost-effective, and secure applications can be built while meeting every compliance concern at every slab.

A. Dani (2020)
Technologies used: AWS Lambda along with other existing AWS services such as S3, DynamoDB, and CloudWatch.
Advantages: The study helps implement an abstraction for serverless architecture that enables efficient cross-server data management, resource allocation, and isolation, among other things. Large internet companies like Amazon, Netflix, and LinkedIn deploy big multistage applications in the cloud that can be developed, tested, deployed, scaled, operated, and upgraded independently.

Priyanka Vergadia (2020)
Technologies used: Google Compute Engine, for deploying websites on Google Cloud.
Limitations: If a website becomes very popular and traffic grows from hundreds to millions of users, the application must be able to gracefully handle peaks and dips in traffic.
Advantages: Compute Engine offers the scale, performance, and value to easily launch large compute clusters on Google's infrastructure. There are no upfront investments, and you can run thousands of virtual CPUs on a system designed to be fast and to offer strong consistency of performance.

Hari Krishnan Andi (2021)
Technologies used: AWS Lambda, Azure, and Google Cloud; basic components such as a serverless API gateway, FaaS (Function as a Service), and BaaS (Backend as a Service).
Limitations: Control over the infrastructure is lost; since the cloud service provider takes care of it, no changes can be made to the infrastructure.
Advantages: The developer need not worry about owning, managing, and maintaining servers, as this is carried out by the cloud service provider.

Nikita Shelar (2021)
Technologies used: GitHub platform.
Limitations: Security concerns for backend-phase developers.
Advantages: A cloud service platform that offers compute power, database storage, content delivery, and other functionality to help developers.

3. Triggering a Serverless Application on AWS

AWS Lambda is a computing service that lets you run code without provisioning or managing servers. With AWS Lambda, you can run code for virtually any type of application or backend service, with zero administration.

AWS Lambda manages all the administration, including:

1. Provisioning and capacity of its compute fleet, which offers a balance of memory, CPU, network, and other resources.

2. Server and OS maintenance.

3. High availability and automatic scaling.

4. Monitoring fleet health.

5. Deploying your code.

6. Monitoring and logging your Lambda function.

AWS Lambda runs your code on a high-availability compute infrastructure.

Principles of any basic development

• Strong data consistency

• Data normalization

• Transactions

• Fast immediate responses

• High-performance infrastructure that rarely fails

You can still use these principles, but some of them are not ideal for serverless computation. Strong data consistency, data normalization, and transactions are the basis for SQL databases, and SQL databases are structured and do not scale easily. Therefore, a NoSQL database is what you need if you need a highly scalable solution. In the serverless world, data frequently moves across the system in reaction to a particular occurrence; in this case, the data is consistent only eventually. Transactions are also not the greatest option in this situation. Data is frequently duplicated and denormalized, since databases need to be arranged for quick reading in order to perform at their peak. Containers are where the serverless system's components, glue, and core run. You can control how powerful your hardware is with a memory option ranging from 128 MB to 3 GB. Even though that is not much, most functions do not need more. If you need multiple functions at the same time, you get a new container and new capabilities.
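As a sketch of what such a NoSQL setup might look like with boto3, the following builds a DynamoDB table definition; the table name `uploads` and its key attribute are assumptions for illustration, not taken from the implementation below:

```python
# Build a DynamoDB table definition as a plain dict, so it can be inspected
# (or tested) without AWS credentials before being passed to boto3.
def uploads_table_spec(table_name="uploads"):
    return {
        "TableName": table_name,
        "KeySchema": [{"AttributeName": "key", "KeyType": "HASH"}],
        "AttributeDefinitions": [{"AttributeName": "key", "AttributeType": "S"}],
        # On-demand billing: pay per request, no capacity planning, in the same
        # scale-to-zero spirit as Lambda's execution-time billing.
        "BillingMode": "PAY_PER_REQUEST",
    }

# With real credentials, this spec would be passed to the AWS SDK:
#   import boto3
#   boto3.client("dynamodb").create_table(**uploads_table_spec())
```

The denormalized, read-optimized design discussed above would then be reflected in what is stored per item, not in joins across tables.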

Implementing a Serverless Database Application

1. Log in to your AWS account.

2. Search for IAM in the services and features column.

3. Create a new IAM role for accessing DynamoDB.

Fig. 3.1. Adding permissions for DynamoDB

4. Create a policy for that role granting full access to DynamoDB and the S3 bucket.

5. Search for S3 and create an S3 bucket.

6. Connect your S3 bucket to the database you created in DynamoDB.

7. Search for AWS Lambda, create a new Lambda function, and connect it to the bucket created in S3, following the guided steps.

Fig. 3.2. Lambda Function

8. On the final page of the Lambda setup, once the function is created, add your S3 bucket as the trigger.

9. Scroll up the page and you will see the editor where code can be written, deployed, and managed.

10. In the code editor, write the code that invokes the database when the S3 bucket trigger fires.
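A minimal sketch of the trigger code in step 10 might look like the following. The table name `uploads` and the item layout are assumptions for illustration, and the event-parsing logic is kept in its own function so it can be exercised without AWS credentials:

```python
import urllib.parse

def parse_s3_record(record):
    """Extract the bucket name and object key from one S3 event record.

    S3 event notifications URL-encode the object key (spaces become '+'),
    so the key is decoded before use.
    """
    bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    return {"bucket": bucket, "key": key}

def lambda_handler(event, context):
    # boto3 is available by default in the AWS Lambda Python runtime; it is
    # imported lazily here so the parsing logic above stays testable offline.
    import boto3
    table = boto3.resource("dynamodb").Table("uploads")  # assumed table name
    for record in event.get("Records", []):
        table.put_item(Item=parse_s3_record(record))  # one item per uploaded object
    return {"statusCode": 200}
```

Uploading any object to the connected bucket would then invoke `lambda_handler` with an event containing one or more `Records`, each of which becomes a DynamoDB item.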

Fig. 3.3. Lambda Code

Fig. 3.4. DynamoDB Table Creation

Fig. 3.5. Triggering the Application



4. Challenges

Lack of Control: You don't own or have any control over the server infrastructure
when using serverless computing. This means that problems with your server's
hardware, software, or other components could have a significant negative influence
on your business.

Possible compatibility issues: You may wish to employ cloud services from multiple suppliers, which can present compatibility problems. Although this is theoretically possible, you can run into compatibility issues if you use one provider for your serverless architecture but want to combine its capabilities with other cloud services.

Potential effects on performance: A function is terminated after a period of inactivity. This results in a temporary slowdown in the time it takes the code to execute when it is next called, a condition known as a cold start, which may have an impact on your business operations.

What is the cold start problem?

The capability of serverless platforms to scale to zero during periods of inactivity is a key selling point. The resources of a function are spun down if it is not actively being used, which frees platform capacity and removes the cost to the user of reserving those components. From a financial standpoint, this is ideal, because users are charged only for the time and resources their code actually uses.

The drawback is that there is a known delay the next time the function needs to execute after its resources have fully spun down. Running the function requires reallocating those resources, which takes time. You end up with one set of performance characteristics for recently used "hot" functions and another profile for "cold" functions that the platform must first recreate before executing them.
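The hot/cold distinction can be illustrated with the execution model of the Lambda Python runtime, where module-level code runs once per container while the handler runs on every invocation. The sleep below is only a stand-in for real initialization work (loading libraries, creating SDK clients), and all names are illustrative:

```python
import time

# Module-level code: in Lambda this runs once per container, i.e. only on a
# cold start. Heavy initialization placed here is paid by cold invocations only.
_init_started = time.perf_counter()
time.sleep(0.2)  # stand-in for expensive one-time setup
COLD_INIT_SECONDS = time.perf_counter() - _init_started

def handler(event, context=None):
    """Per-invocation work: runs on every call, hot or cold."""
    start = time.perf_counter()
    result = {"echo": event}  # trivial request handling
    result["handler_seconds"] = time.perf_counter() - start
    return result

# A cold invocation's caller waits for COLD_INIT_SECONDS plus the handler time;
# a warm invocation pays only the (much smaller) handler time.
warm = handler({"n": 1})
print(COLD_INIT_SECONDS > warm["handler_seconds"])  # True
```

This is why keeping initialization light, or keeping functions warm with periodic pings or provisioned capacity, is a common mitigation for the cold start problem.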

Serverless Security Risks and Challenges:

Unsafe configuration
Cloud service providers offer a variety of options and features. Unattended incorrect settings or configurations can pose serious security risks and may serve as a point of entry for attacks against serverless infrastructures.

Overly powerful function permissions

Each autonomous function within the serverless ecosystem has its own services and responsibilities for a specific purpose. Users should not have access to more than they need, so the functions' rights and permissions must be configured correctly. Otherwise, functions can become overprivileged, posing a security risk.

Injection of event data

Injection flaws are among the most frequent security threats in applications. Besides untrusted inputs in application calls, these can be triggered by events in cloud storage, NoSQL databases, code modifications, and other sources. Careful assessment is required of the various input kinds that may come from untrusted event sources. This diverse collection of event sources expands the possible attack surface.
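One common mitigation is to validate event payloads against an allow-list before using them. The sketch below assumes hypothetical field names (`bucket`, `key`) purely for illustration:

```python
# Validate an untrusted event payload before it reaches business logic:
# reject unexpected fields and non-string values outright.
ALLOWED_KEYS = {"bucket", "key"}

def validate_event(payload):
    """Return the payload unchanged if it passes validation, else raise ValueError."""
    if not isinstance(payload, dict):
        raise ValueError("event payload must be an object")
    unexpected = set(payload) - ALLOWED_KEYS
    if unexpected:
        raise ValueError(f"unexpected fields: {sorted(unexpected)}")
    for name, value in payload.items():
        if not isinstance(value, str):
            raise ValueError(f"field {name!r} must be a string")
    return payload
```

Because events can arrive from many sources (storage notifications, queues, HTTP), applying such validation at every function entry point narrows the attack surface described above.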

Insufficient function logging and monitoring

The early warning signs of an attack may go unnoticed, because serverless solutions may not offer sufficient security features for logging and monitoring applications.

Reliance on unreliable third parties

Serverless applications integrate back-end cloud services, database services, and other third-party dependencies. If these have weaknesses, they can open the door to exploitation. Although the cloud service provider is in charge of protecting every aspect of the cloud, including data centers, networks, servers, operating systems, and configurations, this does not imply that developers play no part in strengthening security. Security is a shared responsibility: the application developer is still in charge of the application's logic, code, data, and application-layer configurations.

Poorly handled exceptions and verbose error messages

Line-by-line debugging services are frequently scarce in a serverless architecture, so some developers opt for verbose error messages and turn on debugging mode to make things simpler. When migrating apps to production, developers may then neglect to clean up the code, leaving the verbose error messages in place. These can reveal details about serverless operations and the logic behind them.

Other serverless challenges include:

Security: As perimeters evaporate, serverless architecture calls for new security paradigms and best practices. It is exceedingly challenging to implement security standards reliably throughout the entire application, because each container or serverless workload provider uses its own security frameworks.

Observability: Monitoring and troubleshooting contemporary applications with outdated techniques is challenging. In addition to the fact that outdated metrics are no longer useful, it is difficult to set up serverless applications for monitoring agents. In any scenario, distributed asynchronous tracing must be implemented in addition to conventional logs.

Cost: The costs of serverless computing may be both direct and indirect. For instance, heavy reliance on API calls can increase expenses and lead to hard-to-fix performance bottlenecks.

5. Future of Serverless Computing


The benefits of serverless computing and the use of AWS Lambda for serverless applications have been discussed in this paper. Future iterations of the application may explore the use of alternative NoSQL databases in place of DynamoDB, as well as cross-platform integration of AWS Lambda with other well-known suppliers of serverless technology, such as Google Cloud and Microsoft Azure Functions. Serverless computing is here to stay for a very long time and will change the way we design, test, deploy, and manage our apps, with the aid of AWS and other comparable cloud services and their decreasing resource costs.

Cost, security, effort, time to market, and other factors play a big part in the IT sector. The amount of time needed to host a project is minimal when using cloud computing, but security and maintenance costs cannot be ignored. Serverless cloud computing, however, reduces execution time and maintenance costs and delivers strong security. Its advantages include the fact that the cloud service provider handles maintenance, server ownership, and resource allocation instead of the developer. As pricing is determined by how much time is spent using the program or resource, maintenance costs are low.

One difficulty with serverless cloud computing is that it cannot be used for processes that take a long time to complete, because cost is determined by how long the code takes to execute. Given that vendors are in charge of the entire backend, there are certain security concerns as well.

6. Conclusion
Serverless computing is a growing market, and it continues to evolve as cloud providers come up with new solutions for its drawbacks and challenges. Serverless computing allows developers to focus on developing applications and business logic while the provider takes care of the various resources.
Sadly, there has been a lack of interest from the research community and academics in this area. It seems that the task of solving the wide variety of technically challenging and intellectually deep problems in this space, ranging from infrastructure issues such as optimizations for the cold start problem to the design of a composable programming model, has been left to big cloud providers or future generations. We strongly feel that if more academics get involved, it will benefit not only the cloud industry but also humanity, since proper and efficient usage of servers can lower the carbon footprint to a significant level.

References
1. Prayudi Utomo, Falahah Suprapto, on the GitHub platform. Online; accessed 2020.

2. Nikita Shelar, Siddhi Kumbhare, Abhishek Gorde, on the AWS Cloud Platform. Online; accessed 2021.

3. Priyanka Vergadia, on deploying websites on Google Cloud using Google Compute Engine. Online; accessed 2020.

4. Hari Krishnan Andi, on AWS Lambda, Azure, and Google Cloud. Online; accessed 2021.

5. Prof. Ajantha Dahanayake, Dr. Antti Knutas, on AWS CloudWatch. Online; accessed 2017.

6. Karan Anand, Mr. Ganeshan M, using Lambda functions. Online; accessed 2021.

7. Bibek Acharya, using AWS Lambda, API Gateway, DynamoDB. Online; accessed 2020.

8. A. Dani, C. Pophale, A. Gutte, B. Choudhary, and S. S. Sonawani, using services like S3, CloudWatch. Online; accessed 2020.
