
UNIT-3

What is Cloud Computing, Essential Characteristics, Cloud security challenges, Software as a Service security, Secure Software Development Life Cycle, data usage, data privacy, identity and access management, physical security.

What Is Cloud Computing?


Cloud computing means storing and accessing data and programs on remote servers that are hosted on the internet instead of on the computer's hard drive or a local server. Cloud computing is also referred to as Internet-based computing: it is a technology in which resources are provided as a service to the user through the Internet. The stored data can be files, images, documents, or any other kind of storable content.
The following are some of the Operations that can be performed with Cloud Computing
 Storage, backup, and recovery of data
 Delivery of software on demand
 Development of new applications and services
 Streaming videos and audio

Fig – Cloud computing at a glance: delivery models, deployment models, infrastructure, and resources
 Delivery models: Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS)
 Deployment models: Public cloud, Private cloud, Community cloud, Hybrid cloud
 Infrastructure: distributed, massive, built from autonomous systems
 Defining attributes: resource virtualization, utility computing (pay-per-usage), accessibility via the Internet, elasticity
 Resources: compute and storage servers, networks, services, applications

Architecture Of Cloud Computing


Cloud computing architecture refers to the components and sub-components required for
cloud computing. These components typically refer to:
 Front end ( Fat client, Thin client)
 Back-end platforms ( Servers, Storage )
 Cloud-based delivery and a network ( Internet, Intranet, Intercloud )
Front End ( User Interaction Enhancement )
The user interface of cloud computing consists of two categories of clients. Thin clients use web browsers, making access portable and lightweight, while fat clients are full-featured applications that offer a richer user experience.
Back-end Platforms ( Cloud Computing Engine )
The core of cloud computing lies in the back-end platforms, which consist of servers for storage and processing. Servers manage the application logic, while storage provides effective data handling. Together, these back-end platforms supply the processing power and the capacity to manage and store data behind the cloud.

Cloud-Based Delivery and Network


On-demand access to computing resources is provided over the Internet, an Intranet, or the Intercloud. The Internet offers global accessibility, the Intranet supports internal communication of services within the organization, and the Intercloud enables interoperability across various cloud services. This dynamic network connectivity is an essential component of cloud computing architecture, guaranteeing easy access and data transfer.

Cloud computing structure


The following are the types of Cloud Computing:
 Infrastructure as a Service (IaaS)
 Platform as a Service (PaaS)
 Software as a Service (SaaS)
Infrastructure as a Service ( IaaS )
 Flexibility and Control: IaaS provides virtualized computing resources such as VMs, storage, and networks, giving users control over the operating system and applications.
 Reduced Hardware Expenses: IaaS saves businesses money by eliminating investments in physical infrastructure, making it cost-effective.
 Scalability of Resources: The cloud allows hardware resources to be scaled up or down on demand, enabling optimal performance and cost efficiency (a short provisioning sketch follows below).
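
For illustration only, the short sketch below uses the AWS SDK for Python (boto3) to provision a single virtual machine, which is the kind of self-service IaaS operation described above. The AMI ID, region, and instance type are placeholders, not recommendations, and would need to be replaced with values valid for your own account.

# Hypothetical IaaS provisioning sketch using the AWS SDK for Python (boto3).
# The AMI ID, region, and instance type below are placeholders for illustration.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request one small virtual machine; the provider supplies the underlying hardware.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.micro",           # small, low-cost instance size
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance", instance_id)

# Scaling down is equally simple: terminate the instance when it is no longer needed.
ec2.terminate_instances(InstanceIds=[instance_id])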

Platform as a Service ( PaaS )


 Simplifying Development: Platform as a Service supports application development by abstracting away the underlying infrastructure. Developers can focus entirely on application logic (code), while background operations are managed by the platform provider, such as AWS.
 Enhancing Efficiency and Productivity: PaaS lowers the complexity of infrastructure management, speeds up execution, and brings updates to market more quickly by streamlining the development process.
 Automation of Scaling: PaaS manages resource scaling automatically, guaranteeing efficient handling of the program's workload.
SaaS (Software as a Service)
 Collaboration and Accessibility: Software as a Service (SaaS) lets users access applications easily without requiring local installations. The software is fully managed by the provider and delivered as a service over the internet, encouraging effortless cooperation and ease of access.
 Automation of Updates: SaaS providers handle software maintenance and apply the latest updates automatically, ensuring users always have the newest features and security patches.
 Cost Efficiency: SaaS acts as a cost-effective solution by reducing IT support overhead and eliminating the need for individual software licenses.

Cloud Deployment Models


 Private Deployment Model: Provides enhanced protection and customization by dedicating cloud resources to particular specified requirements. It is well suited to companies with strict security and compliance needs.
 Public Deployment Model: Offers cloud resources to numerous users on a pay-as-you-go basis with high scalability and accessibility. It ensures cost-effectiveness while providing enterprise-grade services.
Hybrid Deployment Model
It combines elements of both private and public clouds, providing seamless data and application processing between environments. It offers flexibility in optimizing resources, for example keeping sensitive data in the private cloud and highly scalable applications in the public cloud.

Top Reasons to switch from On-premise to Cloud Computing


Reduces cost
The cost-cutting ability of businesses that utilize cloud computing over time is one of the main advantages of this technology. On average, companies can save about 15% of their total cost by migrating to the cloud. By using cloud servers, businesses save money because they no longer need to employ a staff of technical support personnel to address server issues. There are many good business examples of the cost-cutting benefits of cloud servers, such as the Coca-Cola and Pinterest case studies.

More storage
For software and applications to execute as quickly and efficiently as possible, the cloud provides more servers, storage space, and computing power. Many cloud storage tools are available, such as Dropbox, OneDrive, Google Drive, and iCloud Drive.

Employees Better Work Life Balance


Cloud computing can directly improve both the work and personal lives of an enterprise's employees. With on-premises servers, employees may have to work even on holidays to keep the server secure, maintained, and functioning properly. With cloud storage this is not the case: employees get ample time for their personal lives, and the workload is comparatively lower.

Top leading Cloud Computing companies


Amazon Web Services (AWS)
 One of the most successful cloud-based businesses is Amazon Web Services (AWS), an Infrastructure as a Service (IaaS) offering in which customers rent virtual computers on Amazon's infrastructure.
Microsoft Azure Cloud Platform
 Microsoft created the Azure platform, which enables .NET Framework applications to run over the internet as an alternative platform for Microsoft developers. This is the classic Platform as a Service (PaaS).
Google Cloud Platform ( GCP )
 Google has built a worldwide network of data centers to serve its search engine, which captures a large share of the world's advertising revenue. Using that revenue, Google offers free software to users on top of that infrastructure. This is called Software as a Service (SaaS).
Advantages of Cloud Computing
The following are the main advantages of Cloud Computing:
 Cost Efficiency: Cloud computing provides flexible, pay-as-you-go pricing to users. It helps lessen capital expenditure on infrastructure, particularly for small and medium-sized businesses.
 Flexibility and Scalability: Cloud services facilitate the scaling of resources based on demand. This lets businesses handle varying workloads efficiently without large hardware investments during periods of low demand.
 Collaboration and Accessibility: Cloud computing provides easy access to data and applications from anywhere over the internet. This encourages collaborative team participation from different locations through shared documents and projects in real time, resulting in higher-quality and more productive outputs.
 Automatic Maintenance and Updates: Cloud providers such as AWS take care of infrastructure management and automatically apply updates as new software versions are released. This guarantees that companies always have access to the newest technologies and can focus completely on business operations and innovation.

Limitations of Cloud Computing


The following are the main disadvantages of Cloud Computing:
 Security Concerns: Storing sensitive data on external servers raises security concerns, which is one of the main drawbacks of cloud computing.
 Downtime and Reliability: Even though cloud services are usually dependable, they may suffer unexpected interruptions and downtime. These can be caused by server problems, network issues, or maintenance disruptions at the cloud provider, which negatively affect business operations and create issues for users accessing their apps.
 Dependency on Internet Connectivity: Cloud computing services rely heavily on internet connectivity. Users need a stable, high-speed internet connection to access and use cloud resources. In regions with limited connectivity, users may face challenges in accessing their data and applications.
 Cost Management Complexity: The pay-as-you-go pricing model is a key benefit of cloud services, but it also brings cost management complexity. Without careful monitoring and resource optimization, organizations may end up with unexpected costs as their usage scales. Understanding and controlling cloud usage requires ongoing attention.

Cloud security
Cloud security is a collection of procedures and technology designed to address external and
internal threats to business security. Organizations need cloud security as they move toward
their digital transformation strategy and incorporate cloud-based tools and services as part of
their infrastructure.

Cloud security | Traditional system security
Quick, accessible | Slow, procedural access
Efficient resource utilization | Less application efficiency
Usage-based cost | Higher cost
Third-party data centres | Owned data centres
Reduced time to market | Longer time to market
Less infrastructure cost | High infrastructure costs

The main challenges of cloud security are as follows:


1. Data Breaches:
One of the primary concerns in cloud security is the risk of data breaches. Storing sensitive
data in a shared environment poses a potential threat, as unauthorized access to the cloud
infrastructure could lead to data theft or leakage. Organizations must implement robust
encryption, access control, and data loss prevention measures to protect their data from
unauthorized access.
2. Insider Threats:
The risk of insider threats in a cloud environment cannot be overlooked. Employees or third-
party service providers with authorized access to cloud resources could misuse their privileges
and compromise the security of the system. Implementing strong identity and access
management controls is essential to mitigate the risk of insider threats.
3. Vulnerabilities in APIs:
Application Programming Interfaces (APIs) are essential for cloud-based applications to
communicate and interact with cloud services. However, vulnerabilities in APIs can be
exploited by attackers to gain unauthorized access to data or resources. Organizations must
conduct regular security assessments of APIs and ensure that they are robust and secure.
4. DDoS Attacks:
Distributed Denial of Service (DDoS) attacks can disrupt cloud services by overwhelming the
network or infrastructure with a flood of traffic. CSPs and organizations must implement DDoS
protection measures to mitigate the impact of such attacks and ensure the availability of their
cloud services.
5. Data Loss and Recovery:
The risk of data loss in the cloud, whether due to accidental deletion, hardware failure, or other
factors, is a concern for organizations. Implementing data backup and recovery plans is critical
to ensure that data is protected and can be recovered in the event of a loss.
6. Lack of Visibility and Control: Because cloud resources are hosted and operated by a third party, organizations often lack direct visibility into the underlying infrastructure. They must implement robust security monitoring and management tools to gain visibility into their cloud environment and maintain control over security policies and configurations.
7. Shared Responsibility Model: The provider is responsible for the security of the underlying infrastructure, while the customer is responsible for securing their data and applications. This presents a challenge in determining where responsibility lies for different aspects of security and in ensuring that all areas are adequately covered.
8. Dynamic nature: It can be difficult to maintain visibility and control over an ever-changing
environment, and traditional security measures may struggle to keep pace with the rate of
change. Additionally, misconfigurations or oversights in the dynamic environment can lead to
security vulnerabilities that can be exploited by malicious actors.
9. Multi-tenancy: Multi-tenancy allows multiple customers to share the same physical resources, such as servers and storage, within the cloud environment. Proper isolation of resources and strong access controls are critical to mitigating these risks in a multi-tenant environment.
10. Complexity: The increasing complexity of cloud environments also contributes to security challenges. This complexity makes it more difficult to maintain a comprehensive view of the security posture across all components and increases the likelihood of oversight or misconfiguration.
11. Evolving Threat Environment: Cloud environments are attractive targets for cyber threats due to the volume of valuable data and resources they host. Cybercriminals are constantly evolving their tactics, making it challenging for organizations to stay ahead of emerging threats.

Software as a Service (SaaS) Security

Securing Software as a Service (SaaS) applications and data is a critical concern for organizations that rely on cloud-based services. SaaS security encompasses a range of measures and techniques designed to protect the confidentiality, integrity, and availability of SaaS applications and the data they handle.

Types of SaaS Security Software:


1. Identity and Access Management (IAM) Solutions:
IAM solutions are crucial for controlling and managing user access to SaaS applications. These
tools enable administrators to set access policies, enforce multi-factor authentication, and
monitor user activities to prevent unauthorized access and data breaches.
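
As a small illustration of the multi-factor authentication such tools enforce, the sketch below verifies a time-based one-time password (TOTP) using the third-party pyotp library. The library choice and the way the secret is stored are assumptions made for this example, not features of any particular IAM product.

# Hypothetical second-factor check using the pyotp library (pip install pyotp).
# How the shared secret is stored and retrieved is simplified for this sketch.
import pyotp

# Enrolment: generate a per-user secret once and store it with the user record.
user_secret = pyotp.random_base32()
print("Provision this secret in the user's authenticator app:", user_secret)

def verify_second_factor(secret: str, submitted_code: str) -> bool:
    """Return True only if the submitted 6-digit code is currently valid."""
    return pyotp.TOTP(secret).verify(submitted_code)

# Login: after the password check, require the one-time code as the second factor.
if verify_second_factor(user_secret, input("Enter one-time code: ")):
    print("Second factor accepted")
else:
    print("Access denied")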
2. Data Loss Prevention (DLP) Software:
DLP software helps organizations prevent the unauthorized transmission of sensitive data from
SaaS applications. These tools allow businesses to monitor, detect, and block the inadvertent
or malicious sharing of confidential information, thereby reducing the risk of data leaks.
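
As a simplified illustration of what DLP tools do under the hood, the sketch below scans an outgoing message for patterns that look like credit card or US Social Security numbers. The patterns and the blocking policy are assumptions for this example and are far less sophisticated than a real DLP product, which would also use classification labels, fingerprinting, and context.

# Simplified, hypothetical DLP-style check: flag messages that appear to contain
# credit card or US Social Security numbers before they leave the organization.
import re

SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def contains_sensitive_data(text: str) -> list[str]:
    """Return the names of any sensitive patterns found in the text."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]

outgoing = "Please charge card 4111 1111 1111 1111 for the membership renewal."
matches = contains_sensitive_data(outgoing)
if matches:
    print("Blocked outgoing message, detected:", ", ".join(matches))
else:
    print("Message allowed")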
3. Encryption Tools:
Encryption plays a vital role in SaaS security by ensuring that data stored and transmitted
within SaaS applications is protected from unauthorized access. Encryption tools use
algorithms to encode data, making it unreadable without the appropriate decryption key,
thereby safeguarding sensitive information from cyber threats.
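
A minimal sketch of application-level symmetric encryption is shown below, using the widely used third-party cryptography package. The key handling here (keeping the key in a local variable) is deliberately simplified for illustration; a real deployment would keep keys in a key management service or hardware security module, never alongside the data.

# Minimal symmetric-encryption sketch using the "cryptography" package
# (pip install cryptography). Key handling is simplified for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # random key, base64-encoded
cipher = Fernet(key)

plaintext = b"customer_id=42; card_last4=1111"
token = cipher.encrypt(plaintext)      # authenticated encryption of the data
print("Stored ciphertext:", token[:24], b"...")

# Only holders of the key can recover the original data.
assert cipher.decrypt(token) == plaintext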

4. Cloud Access Security Broker (CASB) Solutions:


CASB solutions offer comprehensive security controls for SaaS applications by providing
visibility into user activities, enforcing security policies, and detecting and remediating security
threats. These tools help organizations extend their security posture to the cloud and enforce
consistent security policies across multiple SaaS applications.
5. Threat Intelligence Platforms:
Threat intelligence platforms provide organizations with actionable insights into emerging
cyber threats and vulnerabilities that may impact their SaaS environments. By leveraging threat
intelligence, businesses can enhance their incident response capabilities and proactively protect
their SaaS applications from evolving security risks.
6. Security Information and Event Management (SIEM) Systems:
SIEM systems collect and analyze security event data from SaaS applications to identify and
respond to security incidents. These platforms offer real-time monitoring, threat detection, and
automated response capabilities, allowing organizations to mitigate security risks and ensure
compliance with regulatory requirements.
7. Vulnerability Management Tools:
Vulnerability management tools help organizations identify and remediate security weaknesses
within their SaaS applications. These tools facilitate regular vulnerability assessments, patch
management, and security updates to minimize the risk of exploitation by cyber adversaries.
SaaS security is a multifaceted endeavor that requires a combination of robust security
measures and specialized software tools to protect cloud-based applications and data.

Characteristics of Cloud Computing


There are many characteristics of Cloud Computing here are few of them :
1. On-demand self-service: Cloud computing services do not require human administrators; users themselves are able to provision, monitor, and manage computing resources as needed.
2. Broad network access: Computing services are provided over standard networks and to heterogeneous devices.
3. Rapid elasticity: Computing services have IT resources that can scale out and in quickly, on an as-needed basis. Resources are provided whenever the user requires them and are scaled back in as soon as the requirement ends.
4. Resource pooling: IT resources (e.g., networks, servers, storage, applications, and services) are pooled and shared across multiple applications and tenants. Multiple clients are served from the same physical resources.
5. Measured service: Resource utilization is tracked for each application and tenant, providing both the user and the resource provider with an account of what has been used. This supports monitoring, billing, and the effective use of resources.
6. Multi-tenancy: Cloud computing providers can support multiple tenants (users or
organizations) on a single set of shared resources.
7. Virtualization: Cloud computing providers use virtualization technology to abstract
underlying hardware resources and present them as logical resources to users.
8. Resilient computing: Cloud computing services are typically designed with redundancy
and fault tolerance in mind, which ensures high availability and reliability.
9. Flexible pricing models: Cloud providers offer a variety of pricing models, including pay-
per-use, subscription-based, and spot pricing, allowing users to choose the option that
best suits their needs.
10. Security: Cloud providers invest heavily in security measures to protect their users’ data
and ensure the privacy of sensitive information.
11. Automation: Cloud computing services are often highly automated, allowing users to
deploy and manage resources with minimal manual intervention.
12. Sustainability: Cloud providers are increasingly focused on sustainable practices, such as
energy-efficient data centers and the use of renewable energy sources, to reduce their
environmental impact.
Fig – characteristics of cloud computing

What is a Secure Software Development Life Cycle (SSDLC)?


A Secure SDLC requires adding security testing at each software development stage, from
design, to development, to deployment and beyond. Examples include designing applications
to ensure that your architecture will be secure, as well as including security risk factors as part
of the initial planning phase.
Security is an important part of any application that encompasses critical functionality. This
can be as simple as securing your database from attacks by nefarious actors or as complex as
applying fraud processing to a qualified lead before importing them into your platform.
Security applies at every phase of the software development life cycle (SDLC) and needs to be
at the forefront of your developers’ minds as they implement your software’s requirements. In
this article, we’ll explore ways to create a secure SDLC, helping you catch issues in
requirements before they manifest as security problems in production.
With dedicated effort and the right security solutions, security issues can be addressed in the
SDLC pipeline well before deployment to production. This reduces the risk of finding security
vulnerabilities in your app and works to minimize the impact when they are found.
Secure SDLC’s aim is not to completely eliminate traditional security checks, such as
penetration tests, but rather to include security in the scope of developer responsibilities and
empower them to build secure applications from the outset.
Why Is Secure SDLC Important?
Secure SDLC is important because application security is important. The days of releasing a
product into the wild and addressing bugs in subsequent patches are gone. Developers now
need to be cognizant of potential security concerns at each step of the process. This requires
integrating security into your SDLC in ways that were not needed before. As anyone can
potentially gain access to your source code, you need to ensure that you are coding with
potential vulnerabilities in mind. As such, having a robust and secure SDLC process is critical
to ensuring your application is not subject to attacks by hackers and other nefarious users. New
tools such as application security posture management can help to provide a holistic view of
the components of your application security setup, as well as provide context about
vulnerabilities.
Software Development Lifecycle (SDLC) describes how software applications are built. It
usually contains the following phases:
 Requirements gathering
 Analysis of the requirements to guide design
 Design of new features based on the requirements
 Development of new capabilities (writing code to meet requirements)
 Testing and verification of new capabilities—confirming that they do indeed meet the
requirements
 Deployment of the new project
 Maintenance and evolution of these capabilities once the release goes out the door
Phases of SDLC
The Waterfall model is one of the earliest and best-known SDLC methodologies, which laid
the groundwork for these SDLC phases. Developed in 1970, these phases largely remain the
same today, but there have been tremendous changes in software engineering practices that
have redefined how software is created.
Traditionally, software was written for highly specialized applications, and software programs
developed using the Waterfall methodology often took years to release. Modern-day practices
now focus on increasing the pace of innovation while continuing to build well-functioning
software applications. Companies have moved on from Waterfall, with most using some form
of the agile SDLC, first published in 2001.
Agile development advocates for splitting up large monolithic releases into multiple mini-
releases, each done in two- or three-week-long sprints, and uses automation to build and verify
applications. This allows companies to iterate much more quickly. Instead of the infrequent,
monolithic deployments characteristic of Waterfall-driven applications, agile development
often focuses on releasing new functionality multiple times a day, building software
incrementally instead of all at once.
What are the Secure Software Development Life Cycle Processes?
Implementing SDLC security affects every phase of the software development process. It
requires a mindset that is focused on secure delivery, raising issues in the requirements and
development phases as they are discovered. This is far more efficient—and much cheaper—
than waiting for these security issues to manifest in the deployed application. Secure software
development life cycle processes incorporate security as a component of every phase of the
SDLC.
While building security into every phase of the SDLC is first and foremost a mindset that
everyone needs to bring to the table, security considerations and associated tasks will actually
vary significantly by SDLC phase.

5 phases of Secure Software Development Life Cycle


Each phase of the SDLC must contribute to the security of the overall application. This is done
in different ways for each phase of the SDLC, with one critical note: Software development
life cycle security needs to be at the forefront of the entire team’s minds. Let’s look at an
example of a secure software development life cycle for a team creating a membership renewal
portal:
Phase 1: Requirements
In this early phase, requirements for new features are collected from various stakeholders. It’s
important to identify any security considerations for functional requirements being gathered
for the new release.
 Sample functional requirement: user needs the ability to verify their contact information
before they are able to renew their membership.
 Sample security consideration: users should be able to see only their own contact
information and no one else’s.
Phase 2: Design
This phase translates in-scope requirements into a plan of what this should look like in the
actual application. Here, functional requirements typically describe what should happen, while
security requirements usually focus on what shouldn’t.
 Sample functional design: page should retrieve the user’s name, email, phone, and
address from CUSTOMER_INFO table in the database and display it on screen.
 Sample security concern: we must verify that the user has a valid session token before
retrieving information from the database. If absent, the user should be redirected to the
login page.
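
To make the security concern above concrete, here is a small, hypothetical Python sketch of the design: the page handler refuses to touch the CUSTOMER_INFO table unless a valid session token is presented, and it only ever returns the row belonging to that session's user. The in-memory session store and table layout are invented for this example.

# Hypothetical sketch of the design above: contact details are returned only for a
# valid session token, and only for the user who owns that session. The in-memory
# session store and the table layout are invented for illustration.
import sqlite3

SESSIONS = {"token-abc123": 42}   # session token -> user_id (stand-in for a real session store)

def get_contact_info(db: sqlite3.Connection, session_token: str) -> dict:
    user_id = SESSIONS.get(session_token)
    if user_id is None:
        # No valid session: per the design, send the user to the login page.
        raise PermissionError("redirect to login")
    row = db.execute(
        "SELECT name, email, phone, address FROM CUSTOMER_INFO WHERE user_id = ?",
        (user_id,),               # the user can only ever see their own row
    ).fetchone()
    return dict(zip(["name", "email", "phone", "address"], row))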
Phase 3: Development
When it’s time to actually implement the design and make it a reality, concerns usually shift to making sure the code is well written from a security perspective. There are usually established secure coding guidelines, as well as code reviews that double-check that these guidelines have been followed correctly. These code reviews can be either manual or automated using technologies such as static application security testing (SAST).
That said, modern application developers can’t be concerned only with the code they write,
because the vast majority of modern applications aren’t written from scratch. Instead,
developers rely on existing functionality, usually provided by free open source components to
deliver new features and therefore value to the organization as quickly as possible. In fact,
90%+ of modern deployed applications are made of these open-source components. These
open-source components are usually checked using Software Composition Analysis (SCA)
tools.
Secure coding guidelines, in this case, may include:
 Using parameterized, read-only SQL queries to read data from the database and
minimize chances that anyone can ever commandeer these queries for nefarious
purposes
 Validating user inputs before processing data contained in them
 Sanitizing any data that’s being sent back out to the user from the database
 Checking open source libraries for vulnerabilities before using them
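
The first two guidelines can be illustrated with a short, hypothetical sketch: the same lookup written once with string formatting (injectable) and once with input validation plus a parameterized query. The table name and the validation rule are assumptions for the example, not part of any real schema.

# Hypothetical sketch of the first two secure coding guidelines above.
# The table name and validation rule are invented for illustration.
import re
import sqlite3

def find_member_unsafe(db: sqlite3.Connection, email: str):
    # BAD: string formatting lets input like "x' OR '1'='1" become part of the SQL.
    return db.execute(f"SELECT id, name FROM members WHERE email = '{email}'").fetchone()

def find_member_safe(db: sqlite3.Connection, email: str):
    # Validate the user input before processing it...
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        raise ValueError("not a valid email address")
    # ...and use a parameterized query so the value is never interpreted as SQL.
    return db.execute("SELECT id, name FROM members WHERE email = ?", (email,)).fetchone()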
Phase 4: Verification
The Verification phase is where applications go through a thorough testing cycle to ensure they
meet the original design & requirements. This is also a great place to introduce automated
security testing using various technologies. The application is not deployed unless these tests
pass. This phase often includes automated tools like CI/CD pipelines to control verification
and release.
Verification at this phase may include:
 Automated tests that express the critical paths of your application
 Automated execution of application unit tests that verify the correctness of the
underlying application
 Automated deployment tools that dynamically swap in application secrets to be used in
a production environment
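
As one example of the automated checks that might run in this phase, the hedged sketch below uses pytest to assert that the hypothetical get_contact_info function from the earlier design sketch rejects an invalid session token. The module name is assumed for the example.

# Hypothetical verification-stage test (pytest): an invalid session token must
# never reach the CUSTOMER_INFO table. The module name is assumed for this sketch.
import sqlite3
import pytest

from membership_portal import get_contact_info   # assumed module from the design sketch

def test_invalid_session_is_rejected():
    db = sqlite3.connect(":memory:")              # no table needed for this code path
    with pytest.raises(PermissionError):
        get_contact_info(db, session_token="not-a-real-token")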
Phase 5: Maintenance and Evolution
The story doesn’t end once the application is released. In fact, vulnerabilities that slipped
through the cracks may be found in the application long after it’s been released. These
vulnerabilities may be in the code developers wrote, but are increasingly found in the
underlying open-source components that comprise an application. This leads to an increase in
the number of “zero-days”—previously unknown vulnerabilities that are discovered in
production by the application’s maintainers.
These vulnerabilities then need to be patched by the development team, a process that may in
some cases require significant rewrites of application functionality. Vulnerabilities at this stage
may also come from other sources, such as external penetration tests conducted by ethical
hackers or submissions from the public through what’s known as “bug bounty” programs.
Addressing these types of production issues must be planned for and accommodated in future
releases.
The Benefits of SSDLC
Secure SDLC is the ultimate example of what’s known as a “shift-left” initiative, which refers
to integrating security checks as early in the SDLC as possible.
Doing so helps development teams properly plan releases, making it easier to catch and address
issues that arise that could affect the release timeline. This is most certainly preferable to
receiving an unpleasant surprise once the application deploys to production. SSDLC, therefore,
helps keep releases on track.
What’s more, SSDLC at its core has the security efforts being led by the development team
itself. This allows the issues to be fixed by the domain experts who wrote the software rather
than having a different team fix the bugs as an afterthought. This empowers developers to take
ownership of the overall quality of their applications, which leads to more secure applications
being deployed to production.
While all the extra effort of security testing within the SDLC process may sound like a lot of
work and expensive to build, today, the vast majority of it is being automated. This is
particularly true for development operations or DevOps (more on this as follows). The secure
SDLC environment requires frequent collaboration between DevOps and the engineers
implementing the application’s functionality, and this collaboration needs to be incorporated
into the SDLC itself.
By fixing these issues early in the process, development teams can reduce the total cost of ownership of their applications. Discovering issues late in the SDLC can result in a 100-fold increase in the development cost needed to fix those issues.
Secure SDLC Best Practices
1. Educate Your Developers
Secure SDLC goes hand in hand with multiple related initiatives, including:
 Creating secure coding guidelines
 Providing developers with security awareness and secure coding training
 Setting clear expectations around how quickly issues discovered in production need to
be addressed (also known as remediation SLAs).
Not all of these need to happen for an effective SSDLC implementation, but much like a jigsaw
puzzle, you’ll need to put enough pieces together before you can see the big picture.
2. Have Clear Requirements
Whatever you create, it should be easy to understand. Development teams need clear
requirements that are easy to act upon. This applies to all security advice, recommendations,
and guidelines. Any vulnerabilities discovered in tests need to be easy to act on. It’s key that
all people, processes, and tools involved bring solutions to the table instead of just pointing out
problems.
3. Maintain a Growth Mindset
Since SSDLC will change how multiple teams work and interact, it’s important for everyone
to go into this experience with an open mind, and for the security team to have the mindset of
empowering developers to secure their own applications.
4. Tie Implementation to Other Initiatives
For well-established applications and teams, it may often be easier to implement SSDLC
changes when it’s tied to another modernization effort, such as a cloud transformation, a
DevOps initiative, or its more security-conscious variation, DevSecOps.
5. Tackle the Big Problems First
Focus on the most important issues and actionable fixes rather than addressing every
vulnerability found. While it may be possible for newer or smaller applications to fix every
security issue that exists, this won’t necessarily work in older and larger applications. A triage
approach can also be helpful. This focuses on not only preventing security issues from making
it into production, but also ensuring existing vulnerabilities are triaged and addressed over time.
SSDLC and DevSecOps
It’s important to discuss the relationship between SSDLC and DevSecOps. They are sometimes
used interchangeably, which can lead to confusion. While SSDLC and DevSecOps are closely
linked, they are actually complementary practices. Both SSDLC and DevSecOps focus on
empowering developers to have more ownership of their application, ensuring they are doing
more than just writing and testing their code to meet functional specifications.
Secure SDLC is focused on how the application is designed and built; DevSecOps seeks to
shift ownership of the production environment for each application away from traditional IT
teams and into the hands of the developers. This lets developers focus on automating build,
test, and release processes as much as possible.
DevOps and DevSecOps have started a revolution in redefining the role of software developers.
This has of course been aided by other major changes, such as cloud transformations. But while
empowering developers and accelerating security testing is key to success for most modern
organizations, it would be a mistake to view application security as just an automation
challenge. Instead, it’s important to drive cultural and process changes that help raise security
awareness and considerations early in the development process. This must permeate all parts
of the software development life cycle, regardless of whether one calls it SSDLC or
DevSecOps.
SSDLC allows you to shift security risks left, addressing the origin of security issues at the
requirements phase instead of having to backtrack from the maintenance phase. By focusing
on security at every stage of development, you can rest assured your application will be far
more secure as a result.
What is data in use?

Data in use includes all data that is accessed, processed, and regularly modified by applications,
users, or devices. It is the state where data is most vulnerable to security risks due to the
numerous threat vectors present when it is accessed or shared.

Examples of data in use


1. Files shared between employees: When multiple users are simultaneously editing a document
stored in the cloud, the document data is in use.
2. Online banking transactions: Every time a user logs in to their online banking account, checks
their balance, or makes a transaction, their financial data is in use.
3. Real-time analytics: When real-time data is actively queried, analyzed, and processed to gain
insights into customer behavior or market trends, the data being processed is in use.
4. Database queries: When a software application queries a database for specific information, the
retrieved data is in use while it's being processed by the application.

Three states of data


Data in use is one of the three types of data states, the other two being data at rest and data in
transit. To apply the right security controls, it is important to understand the flow of data in use
along with where and to whom it is exposed. Data at rest is data that is stored in hard drives,
servers, or cloud storage and lies dormant within organizations' repositories. Data in transit is
data that is actively moving between two endpoints within or outside of the organization.

Data in use vs. data in transit

Each state of data differs in terms of its vulnerability to attacks, threats, and the security controls
that can be applied.

State
- Data in use: Actively being processed or manipulated, residing in memory or on devices.
- Data in transit: Actively moving between source and destination points over the internal network or internet.

Potential vulnerabilities
- Data in use: Malicious insiders, malware affecting applications, and data leakage from user actions.
- Data in transit: Unsecure communication protocols and unencrypted data.

Potential threats
- Data in use: Unauthorized access, malware, memory scraping, insider threats, and data leakage.
- Data in transit: Unauthorized interception, man-in-the-middle attacks, and data tampering.

Security controls
- Data in use: User authorization and authentication, stringent user permissions management, and securing file sharing methods.
- Data in transit: Secure communication protocols (e.g., HTTPS, VPNs), encryption, and network security controls.
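
The "secure communication protocols" control for data in transit can be illustrated with a few lines of standard-library Python: the default SSL context verifies the server's certificate and hostname before any data is exchanged. The URL is a placeholder for this sketch.

# Minimal sketch of the data-in-transit control above: HTTPS with certificate
# verification using only the standard library. The URL is a placeholder.
import ssl
import urllib.request

context = ssl.create_default_context()   # verifies certificates and hostnames by default
with urllib.request.urlopen("https://example.com/", context=context, timeout=10) as resp:
    body = resp.read()
print("Fetched", len(body), "bytes over an encrypted, authenticated channel")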

Threats to data in use

Some of the threats to data in use include:

 Unauthorized access: Unauthenticated users can gain access to sensitive data during
processing, leading to data breaches and leaks.
 Malware and malicious code: Infected applications or devices can compromise data in use,
potentially leading to data corruption or theft.
 Memory scraping: Sophisticated attackers can exploit vulnerabilities to extract data from an application's memory.
 Insider threats: Employees or collaborators with access to data in use can misuse it intentionally
or inadvertently.
 Data leakage: Inadequate controls can result in unintended data exposure, such as through
copy-paste operations or screen captures.

Protecting data in use

You can protect data in use by safeguarding it where it is used the most, usually within the
organization. Approach data security from a 360-degree perspective to close as many security
backdoors as possible:
 Implement sound user authentication and authorization controls, like enforcing multi-factor
authentication to minimize the chances of user credentials being stolen by hackers.
 Periodically review and resolve user permissions for permission inheritance issues, such as
excess privileges to user roles that don't require them. Tools like a security permission
analyzer can help identify effective user permissions.
 Get notified about crucial file events for files classified as restricted, sensitive, or confidential
by a data classification tool.
 Look for sudden spikes in file modifications or deletions that can indicate a ransomware attack.
Deploy a file integrity monitoring solution to track real-time file changes.
 Keep your endpoints secure by monitoring outbound emails, USB activity, potential web
uploads, and more using data leak prevention software.
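
A very small sketch of the file integrity monitoring idea mentioned above is shown below: it records SHA-256 hashes of watched files and reports any file whose hash changes on the next pass. The watched paths and the polling approach are assumptions for this example; real FIM products track changes in real time.

# Tiny, hypothetical file integrity monitoring sketch: hash the watched files,
# then compare against the previous baseline to spot unexpected modifications.
# The watched paths are placeholders; real FIM tools monitor changes in real time.
import hashlib
from pathlib import Path

WATCHED_FILES = [Path("config/app.ini"), Path("data/members.db")]   # placeholders

def snapshot(paths: list[Path]) -> dict[str, str]:
    """Return {path: sha256 digest} for every watched file that currently exists."""
    return {str(p): hashlib.sha256(p.read_bytes()).hexdigest() for p in paths if p.exists()}

def detect_changes(baseline: dict[str, str], current: dict[str, str]) -> list[str]:
    """Names of files that were modified, added, or deleted since the baseline."""
    return [path for path in baseline.keys() | current.keys()
            if baseline.get(path) != current.get(path)]

baseline = snapshot(WATCHED_FILES)
# ... later, for example on a schedule ...
changed = detect_changes(baseline, snapshot(WATCHED_FILES))
if changed:
    print("Integrity alert, files changed:", ", ".join(changed))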

Data Privacy Meaning

Data privacy is the right of people to control their own personal data. When it comes to data
privacy, there are two major types of information:

 Personal Information: This includes any identifying information about a person, such
as your name, home address, phone number, etc.
 Sensitive Personal Information: This includes any information that is related to an
individual’s sexual orientation or health history.

Different jurisdictions have different requirements for data privacy. For example, under the General Data Protection Regulation (GDPR), the privacy of minors is prioritized, as is users' explicit consent to the collection of information while they use a website. In the context of medical records, health care professionals in the United States must abide by HIPAA, the Health Insurance Portability and Accountability Act, a set of guidelines that all practitioners must follow to protect the privacy of patients.

Why do we have data privacy?

Data privacy is necessary because it ensures that our personal information stays private. Data
privacy is important so that we don’t have to worry about our data being used in malicious
ways against us. It also helps ensure the integrity of businesses, as well as governments. If
companies and organizations didn’t have data privacy, they could use the information they
gather about you in any way they want.

What to expect in the future

The future of data privacy is hard to predict. The laws are changing rapidly and they will
continue to change as time goes on. It’s unclear what the future of data privacy will look like
and how it will affect our lives in the coming years, but we can make some educated guesses
about what it could be like in the next few years.

As technology improves and evolves, so does the way we communicate, share information, and work, which means we are constantly putting more personal information out into the world, making it easier for hackers to steal that information. We have seen many large data breaches occur over the last few years, from Target to Equifax. These incidents pose a significant risk for companies that store their customers' sensitive data online, because when that information gets hacked, it can lead to massive identity theft cases across the country.

Data Privacy Basics

The key components of data privacy include:

- Data confidentiality. This means that all data collected is only shared between the consenting
parties.

- Data security. This ensures that the data collected is housed somewhere secure and that the
proper precautions are taken to prevent it from being misused or accessed maliciously.

- Transparency in data usage. The terms and conditions laid out between both parties are clear, understood, and represent the full picture of how the data will be used.

- Compliance. Ensuring compliance with applicable legislation, depending on the geographic location, the data in question, and the role of the parties involved.
Examples

Examples of data privacy include:

 Ensuring that sensitive data is only accessed by authorized personnel.


 Encrypting data to prevent unauthorized access.
 Limiting the collection and use of personal data to only what is necessary.
 Providing users with control over their personal data, such as the ability to delete or
modify their data.
 Complying with relevant laws and regulations around data privacy, such as GDPR or
CCPA.

Data privacy is a crucial issue in today's world of increasing data breaches and cyber attacks.
It refers to the protection of personal information and ensuring that it is not misused or accessed
without authorization. One example of data privacy is ensuring that sensitive data, such as
financial information or medical records, is only accessed by authorized personnel. This can
be achieved through access control measures, such as usernames and passwords, or biometric
authentication.

Encrypting data is another example of data privacy. This means encoding sensitive information
so that it cannot be read by unauthorized individuals. Encryption is commonly used for data
transmitted over the internet, such as online banking transactions or email correspondence.

Limiting the collection and use of personal data to only what is necessary is another key aspect
of data privacy. This means that organizations should only collect and use personal information
that is needed for a specific purpose, and not collect more data than necessary. For example, a
retailer may ask for a customer's name and email address to send promotional emails, but
should not ask for sensitive information such as their social security number.

Providing users with control over their personal data is also important for data privacy. This
means giving users the ability to delete or modify their data, such as their personal information
or search history. Users should also be able to control who has access to their data and how it
is used.

Finally, complying with relevant laws and regulations around data privacy, such as GDPR or
CCPA, is crucial for protecting personal information. These regulations require organizations
to inform users about how their data is collected and used, and to obtain explicit consent before
collecting or sharing personal information.

Identity and access management (IAM) is the discipline of managing user accounts and IT permissions in an organization. In simple terms, IAM ensures that users can access the resources they need, while protecting sensitive data from unauthorized access. When done right, IAM helps companies save time, improve their cybersecurity, and comply with laws and industry standards.

What Is Identity and Access Management?

Identity and access management refers to the administration of user accounts (identities) and their permissions and privileges (access). Permissions in IT systems, such as NTFS permissions on Windows file servers, govern which files users can open, which applications they can use, and which areas of the network they can access. Therefore, assigning the correct privileges to each user is a key requirement for a safe and productive IT environment.

The more users, devices and applications are part of a network, the more difficult it becomes
to manage accounts and permissions by hand. Once your organization reaches a size where user
management and access rights management become an ongoing challenge, you should consider
dedicated identity and access management software to automate and centralize the process. An
IAM system ensures the efficient, accurate and secure administration of accounts and privileges
across your entire digital infrastructure.

Identity Management vs. Access Management

While identity management is concerned with creating, updating and deleting accounts as part
of the user lifecycle, access management deals with the specific permissions of each account,
including access to unstructured data stored on file servers, SharePoint and similar platforms.

Why Is Identity and Access Management So Important?

Without the right accounts and permissions, there is almost nothing employees can get done in
a modern workplace. Users depend on their Windows accounts, email accounts, cloud
accounts, accounts in third-party apps and more. At the same time, businesses store more and
more sensitive data in these different applications, and the threat of data breaches and cyber
attacks continues to rise.

Identity and access management ensures that only the right people can access IT resources.
This way, IAM protects critical data from both external attacks and insider threats such
as employee data theft. At the same time, IAM helps your IT department save valuable time by
automating routine tasks like user provisioning and permission audits, while providing self-
service features for end users.
Advantages of Identity and Access Management

While identity and access management is critical to data security, this is far from the only
advantage IAM brings to organizations. Managing accounts and permissions through a central,
automated platform drastically reduces the administration workload for your IT staff. Even
common helpdesk tickets like password resets or access requests can be outsourced to the IAM
system, allowing your admins to focus on more important tasks and long-term projects.

Meanwhile, IAM helps your end users get to the data they need faster and easier since they can
request new permissions directly through the platform instead of sending out tickets or emails.
That means fewer delays in the day-to-day operations. And with every change documented
automatically, the business is always in control of who has access to what.

Benefits of IAM:

 Accurate access for every user


 Automatic provisioning and deprovisioning
 Compliant user management and auditing
 Protection from internal and external threats
 Clear overview of effective permissions
 Fewer support tickets
 Fewer delays for end users
 Less time wasted for admins

Is Identity and Access Management Mandatory?

There are various laws, industry norms and security standards that require organizations to
restrict access to sensitive data and actively manage accounts and permissions. It is not
mandatory to use an identity and access management system to meet these requirements.
However, there is no practical way to comply with these regulations without the support of an
IAM system.

Without IAM software, it would take astronomical effort to enforce least privilege access,
perform regular user access reviews and document all changes to accounts and permissions. So
while identity and access management is not explicitly mandatory, there is simply no way
around IAM for organizations with more than a few dozen IT users.

How Does Identity and Access Management Work?

Identity and access management lets organizations control which users have access to which
files and systems. On a technical level, there are two separate steps that govern access to IT
resources:

Authentication: Users verify their identity by entering their credentials (username and
password) and completing multi-factor authentication. If the verification is completed
successfully, the user is allowed into the system. In the case of central identity providers
like Active Directory and Azure AD, they are also logged into various connected apps (single
sign-on).
Authorization: Once a user authenticates their identity, they are authorized to perform specific
actions. What a user is allowed to do depends on which permissions have been assigned to
them by the admin. In Windows environments, authorization is determined by checking an
account’s security identifier against the access control list of an object.
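
The two steps can be sketched in a few lines of Python. The user store, the password hashing scheme, and the permission table below are invented stand-ins for a real directory service such as Active Directory, and the single shared salt is a simplification.

# Hypothetical sketch of the two steps above. The user store, hashing scheme, and
# permission table are invented stand-ins for a real directory service; a real
# system would use a per-user salt and a central identity provider.
import hashlib
import hmac
import os

SALT = os.urandom(16)

def hash_password(password: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT, 100_000)

USERS = {"alice": hash_password("correct horse battery staple")}
PERMISSIONS = {"alice": {r"\\fileserver\sales": {"read", "write"}}}

def authenticate(username: str, password: str) -> bool:
    """Step 1: verify the user's identity from their credentials."""
    stored = USERS.get(username)
    return stored is not None and hmac.compare_digest(stored, hash_password(password))

def authorize(username: str, resource: str, action: str) -> bool:
    """Step 2: check the authenticated user's permissions for the resource."""
    return action in PERMISSIONS.get(username, {}).get(resource, set())

if authenticate("alice", "correct horse battery staple"):
    print("write allowed:", authorize("alice", r"\\fileserver\sales", "write"))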

Account Provisioning

When an admin creates a new account, the IAM system automatically assigns the correct
permissions, groups and organizational units. Identity and access management determines
which privileges to give to which user through an access control model known as role-based
access control: First, organizations define the intended permissions for different business roles,
such as people working in different departments.

The identity and access management software then provides new accounts with the permissions
matching their role: employees in the sales department receive the permissions of the sales role,
employees in marketing receive marketing permissions and so on. This process also applies
when the role of a user changes: when a new role is assigned, the person is given all privileges
of their new role and loses all permissions associated with their old role.
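
A compact sketch of that role-based provisioning logic is shown below. The role names and permission sets are invented examples; a real IAM system would propagate these assignments to directory groups, file servers, and connected applications rather than keeping them in a dictionary.

# Hypothetical role-based access control (RBAC) provisioning sketch. Role names
# and permission sets are invented; a real IAM system would push these
# assignments out to AD groups, file servers, and SaaS applications.
ROLE_PERMISSIONS = {
    "sales":     {"crm:read", "crm:write", "share:sales"},
    "marketing": {"cms:publish", "share:marketing"},
}

accounts: dict[str, set[str]] = {}

def provision(username: str, role: str) -> None:
    """Create (or re-provision) an account with exactly its role's permissions."""
    accounts[username] = set(ROLE_PERMISSIONS[role])

def change_role(username: str, new_role: str) -> None:
    """On a role change, grant the new role's permissions and revoke the old ones."""
    provision(username, new_role)

provision("bob", "sales")
change_role("bob", "marketing")
print(accounts["bob"])   # only the marketing permissions remain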

PHYSICAL SECURITY

Physical security has an important role to play in protecting critical information and data. With shifts in work and collaboration paradigms, new types of security threats arise. The physical security structure consists of three main components: access control, permanent active surveillance, and testing. The success of an organisation's physical security program can often be attributed to how each of these components is implemented, improved, and maintained.

Physical Security Definition


Physical security aims to protect people, property, and physical assets from any action or event that could lead to loss or damage. Physical security is crucial, and physical and IT security teams must work together to ensure the security of digital assets as well.

Why is Physical Security important?

Physical security keeps your employees, facilities, and assets safe from real-world threats. These threats can arise from internal or external intruders that put data security at risk. Physical attacks can involve breaking into a secure area or invading a restricted part of a site. An attacker can easily damage or steal critical IT assets, install malware on systems, or leave a remote access point on the network.

It is important to have strict physical security to protect against external threats, as well as
equally effective measures to avoid the risks of any internal intruder. The key is to understand
that physical security refers to the entire space, and it should not be restricted only to the front
door, but to the entire building. Any area that is left unprotected – such as the smoking area
(with doors for example facing the outside of the building, without the main entrance controls)
or the entrance to the car park, can pose a risk.

Security experts refer to this form of protection as a deep or layered protection, since there are
several control points in the physical infrastructures. Physical damage is as harmful as digital
loss, and therefore strict physical security measures must be taken.

Key components of physical security include:

• Access control and monitoring of physical access should cover the entire area, using
sophisticated physical security tools such as biometric and ID card restrictions. However, it is
important to understand the pros and cons of each measure and how these access controls can
be forged.

• Surveillance, containing burglar alarms, guards, and CCTV that keeps a complete record of
the entire movement. High-risk areas may have sophisticated detectors to ensure a more holistic
view.

The general principles of physical security measures should respond to:

• Physical Security Perimeter


• Physical Input Controls
• Security of Offices, Rooms, and Facilities
• Protection against External and Environmental Threats
• Working in Safe Areas
• Public Access, Loading and Unloading Areas
• Protection and Disposal of Equipment

IoT and AI bring Physical Security to the digital world

Traditionally, physical and digital security were two distinct fields. Today organisations are increasingly dependent on IoT and its integrations, which in itself increases the need to improve their digital and physical security controls (network, servers, data, etc.). Virtual machines and applications, even if they are in the cloud, are only as secure as the physical servers that host them. With technology constantly evolving, integrations with AI are increasingly popular. With regard to physical security, these integrations will continue to evolve, for example by allowing:

• Real-time analysis of video surveillance with detection of possible anomalies.


• Intelligent access control systems that enable a more reactive approach.
• Patrols by robots and automated, proactive drones searching for potential anomalies and threats.
• Crowd monitoring, allowing facial recognition and behavioural analysis.

What are the main threats to Physical Security?

Physical security focuses on keeping your facilities, people, and assets safe from real-world
threats. Currently, there are multiple attack vectors, and these can have a focus not only from
a physical and technological point of view, but also exploring weaknesses specific to the human
condition (social engineering).

Physical security also focuses on rules and controls that allow the protection of persons and
property in the event of natural disasters or catastrophes.

Some of the most common and most difficult attacks to mitigate are focused on Social
Engineering, psychologically manipulating people to perform actions or disclose confidential
information. Examples:

• Tailgating: The attacker manages to follow an authorized person to a reserved area.


• Piggybacking: The attacker tricks an authorized person into granting them access to reserved areas.

How can we protect Physical Security?

Your physical assets might get stolen, and that could be a major threat. In the following list,
we find some of the most commonly used controls for protection with regard to physical safety:

• Remote management: Allows facilities to be monitored and controlled remotely through applications.


• Gates: Help form the outermost physical security layer, making it impossible, or at least harder, to access the infrastructure hastily.
• Surveillance: Provides a visual and historical record.
• Alarm systems: Reactive layer on capturing historical events.
• Access controls: Control and record the movement of people and vehicles.
• Indicated lighting: Good indoor and outdoor lighting may be sufficient to prevent
unauthorized access, especially at night.
• Regular audits: All security checks should be regularly audited to ensure that everything is
working as expected.
• Incident Response: Organisations should be prepared to handle incidents, ensuring rapid,
organised, and efficient responses.
• Backups: Be sure to backup your device’s data constantly.
