Unit-5: Software Testing

Software Quality Assurance

Software Quality

 Quality means that a product or service is "fit for use and purpose."
 Quality is all about meeting the needs and expectations of customers concerning the functionality, design, reliability, durability, and price of the product.

Software Assurance

 Assurance is a positive declaration about a product or service; it expresses confidence that the product will work well.
 It provides a guarantee that the product will work without problems, in accordance with expectations and requirements.

Software Quality Control


 Quality Control is a software engineering process used to ensure that the approaches, techniques, methods, and processes designed for the project are followed correctly. Quality control activities verify that the application meets the defined quality standards.
 It focuses on examining the quality of the end product and the final outcome rather than the processes used to create the product.
 It is a reactive process and is detection-oriented in nature.
 These activities monitor and verify that the project deliverables meet the defined quality standards.

Quality Assurance (QA)

 Quality Assurance is also known as QA testing. QA is defined as an activity that ensures an organization is providing the best possible product or service to its customers.
 Software Quality Assurance may seem to be only about evaluating software for functionality, performance, and adaptability; however, it goes beyond the quality of the software itself and also covers the quality of the process used to develop, test, and release the software.
 Software Quality Assurance spans the software development lifecycle, including requirements management, software design, coding, testing, and release management.
 Quality Assurance is the set of activities that defines the procedures and standards for developing the product.

Software Quality Assurance Attributes

1. Functionality

2. Reliability
3. Usability

4. Efficiency

5. Maintainability

6. Portability

Software Quality Assurance components:

1. Pre-project Plan

The pre-project plan ensures that the resources, schedule, and budget required for the project are clearly defined, and that the plans for development and for ensuring quality have been determined.

Its components are:

o Required Resources (Hardware and Human resources)

o Development plan

o Schedules

o Risk evaluation

o Quality plan

o Project methodology
2. Project lifecycle component

A project lifecycle usually comprises two stages:

1. Development Stage

In the development stage, the Software Quality Assurance components help to identify design and programming errors. They are divided into the following sub-classes: reviews, expert opinions, and software testing.

2. Operation Maintenance Stage

In the operation-maintenance stage, the Software Quality Assurance components include the development-stage components along with specialized components whose aim is to improve the maintenance tasks.

3. Infrastructure error prevention and improvement components

The aim of these components is to prevent software faults and minimize the rate of errors.

These components include:


o Procedure and work instructions

o Templates and Checklists

o Staff Training, Retraining, and Certification

o Preventive and Corrective Actions

o Configuration Management

o Documentation Control

4. Software Quality Management Components

This class of components controls development and maintenance activities. These components establish managerial control over software development projects. The management components aim to prevent the project from going over budget and falling behind schedule.

The management components include:

o Project Progress Control

o Software Quality Metrics

o Software Quality Costs

5. Standardization, Certification, and SQA assessment components

The aim of these components is to implement international managerial and professional standards within the organization. They help to improve coordination among the organizational quality systems and establish standards for the project process. These components include:

o Quality management standards

o Project process standard

6. Organizing for Software Quality Assurance


Software Quality Assurance Tools

Various tools help with quality assurance, and different tools are required for different purposes. Comprehensive software quality assurance typically relies on a combination of tools, collectively referred to as QA software.

o Infrastructure

o Release Management

o Source Control

o Code Reviews

o Automated Code Analysis

o Peer Code Reviews

o Testing

o Test management

o Bug and Issue Tracking

o Browser, Device and OS Testing

o Usability Testing

o Load Testing

o Automated Testing and Continuous Integration

o Monitoring and Analytics

o Availability Monitoring

o Business Analytics

o Exception Handling

o Log Monitoring

o Performance Monitoring

o Security Testing and Monitoring


o Customer Support

How to do Quality Assurance

The overall process of quality assurance follows a cycle called the PDCA cycle.

Phases of this cycle are as:

o Plan

o Do

o Check

o Act

Plan: The organization should plan and establish process-related objectives and determine the processes required to deliver a high-quality end product.

Do: Develop and test the processes and, where necessary, change the methods.

Check: Monitor the processes and check whether they meet the predetermined objectives.

Act: Implement the actions necessary to achieve improvements in the process.

An organization must use quality assurance to ensure that the product is designed and implemented with correct procedures. This helps to reduce problems and errors in the final product.
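The PDCA cycle can be pictured as a simple loop. Below is a minimal, purely illustrative sketch in Python; the plan/do/check/act functions and the defect-rate objective are hypothetical placeholders, not part of any real QA tool or standard.

```python
# Minimal sketch of a PDCA-style improvement loop (illustrative only).
# The functions and the objective below are hypothetical placeholders.

def plan():
    # Establish a process-related objective, e.g. a target defect rate.
    return {"max_defects_per_kloc": 2.0}

def do(objectives):
    # Develop and test the process; here we just simulate a measurement.
    return {"defects_per_kloc": 3.1}

def check(objectives, results):
    # Compare the results against the predetermined objective.
    return results["defects_per_kloc"] <= objectives["max_defects_per_kloc"]

def act():
    # Adjust the process (e.g. tighten review criteria) and repeat the cycle.
    print("Objective not met; adjusting the process and repeating the cycle.")

if __name__ == "__main__":
    objectives = plan()
    results = do(objectives)
    if not check(objectives, results):
        act()
```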

There are five types of quality assurance functions:

1. Technology transfer
2. Validation: a validation master plan is prepared for the entire system, and resources for executing the validation plan are arranged.
3. Documentation
4. Assuring the quality of products
5. Quality improvement plans

Difference between Quality Assurance (QA) and Quality Control (QC)

Objective: QA focuses on providing assurance that the requested quality will be achieved; QC focuses on fulfilling the requested quality.

Technique: QA is the technique of managing quality; QC is the technique of verifying quality.

Phase of involvement: QA is involved during the development phase; QC is not involved during the development phase.

Program execution: QA does not include execution of the program; QC always includes execution of the program.

Type of tool: QA is a managerial tool; QC is a corrective tool.

Process/product orientation: QA is process oriented; QC is product oriented.

Aim: Quality assurance aims to prevent defects in the system; quality control aims to identify and fix defects or bugs in the system.

Order of execution: Quality assurance is performed before quality control; quality control is performed after the quality assurance activity is done.

Technique type: QA is a preventive technique; QC is a corrective technique.

Measure type: QA is a proactive measure; QC is a reactive measure.

Activity level: QA is a low-level activity that identifies errors and mistakes that QC cannot; QC is a high-level activity that identifies errors that QA cannot.

Focus: QA's main focus is on the intermediate process; QC's primary focus is on the final product.

Team: All team members of the project are involved in QA; generally, only the testing team is involved in QC.

Time consumption: QA is a less time-consuming activity; QC is a more time-consuming activity.

Statistical technique: Statistical Process Control (SPC) is applied to quality assurance; Statistical Quality Control (SQC) is applied to quality control (a small SPC example follows this comparison).

Example: Verification is an example of QA; validation is an example of QC.
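As a small illustration of the Statistical Process Control technique mentioned in the comparison above, the following sketch computes 3-sigma control limits from a baseline of per-build defect counts and flags later builds that fall outside them. The defect counts are invented for the example.

```python
# Minimal Statistical Process Control (SPC) sketch: 3-sigma control limits
# computed from a baseline of per-build defect counts, then used to flag
# later builds. All data values here are made up for illustration.
from statistics import mean, pstdev

def control_limits(baseline):
    """Return (lower, centre, upper) 3-sigma control limits."""
    centre = mean(baseline)
    sigma = pstdev(baseline)
    return max(0.0, centre - 3 * sigma), centre, centre + 3 * sigma

baseline_defects = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]   # historical, in-control builds
recent_defects = [5, 6, 18, 4]                       # new builds to check

lcl, cl, ucl = control_limits(baseline_defects)
flagged = [d for d in recent_defects if d < lcl or d > ucl]

print(f"Centre line {cl:.1f}, control limits [{lcl:.1f}, {ucl:.1f}]")
print("Builds needing investigation:", flagged)   # -> [18]
```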

Software Process Assessment


 Software process assessment is a disciplined and organized examination of the software process used by an organization, based on a process model.
 A software process assessment covers several areas, such as identifying and characterizing current practices, evaluating the ability of those practices to control or avoid significant causes of poor quality, cost overruns, and schedule slips, and identifying strengths and weaknesses of the software process.

Methods of SEI CMM

There are two methods of SEI CMM:

Capability Evaluation: Capability evaluation provides a way to assess the software process
capability of an organization.

Software Process Assessment: Software process assessment is used by an organization to


improve its process capability. Thus, this type of evaluation is for purely internal use.

Types of Software Process Assessment:

 Self-assessment: conducted internally by the organisation's own people.
 Second-party assessment: conducted by an external team, or by the organisation's own people supervised by an external team.
 Third-party assessment: conducted by an independent external party, typically for certification or compliance purposes.

In an ideal case, a software process assessment should be performed in a transparent, open, and collaborative environment.

1. Purpose of Software Process Assessment

 Understanding Current Processes: Gain insights into how current software


development processes are being executed.
 Benchmarking: Compare processes against industry standards or best practices.
 Identifying Improvements: Highlight areas that need enhancement to improve quality,
efficiency, and productivity.
 Compliance: Ensure processes comply with organizational policies and regulatory
requirements.

2. Common Software Process Assessment Models

Capability Maturity Model Integration (CMMI)

 Maturity Levels: Ranges from Level 1 (Initial) to Level 5 (Optimizing).


 Process Areas: Includes specific areas such as project management, engineering, and
support.
 Assessment: Evaluates processes against predefined maturity levels.

ISO/IEC 15504 (SPICE - Software Process Improvement and Capability Determination)

 Process Capability Levels: Ranges from Level 0 (Incomplete) to Level 5 (Optimizing).


 Process Dimensions: Includes process performance and capability.
 Assessment: Measures the capability of processes in different dimensions.

ISO/IEC 12207
 Process Framework: Defines processes, activities, and tasks that are required for
developing and maintaining software.
 Assessment: Evaluates compliance with the defined framework.

3. Assessment Steps

Planning

 Defining Objectives: Establish what you want to achieve with the assessment.
 Selecting a Framework: Choose an appropriate assessment model (e.g., CMMI,
SPICE).
 Identifying Scope: Determine which processes and areas will be assessed.

Data Collection

 Interviews: Conduct interviews with stakeholders and team members.


 Document Review: Analyze process documentation, project plans, and reports.
 Observations: Observe the processes in action to gather real-time data.

Evaluation

 Gap Analysis: Compare current processes against the chosen model’s criteria (see the sketch after this list).
 Strengths and Weaknesses: Identify what is working well and what needs improvement.
 Rating: Assign maturity or capability levels based on the findings.
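As a sketch of the gap-analysis step, the example below compares the practices an organization currently follows against a hypothetical checklist of practices expected by the chosen model. The practice statements are invented for illustration and are not taken from CMMI or SPICE.

```python
# Minimal gap-analysis sketch: compare current practices against the
# practices expected by the chosen assessment model.
# All practice statements below are hypothetical examples.

expected_practices = {
    "requirements are baselined and change-controlled",
    "peer reviews are held for all design documents",
    "defects are tracked to closure",
    "project progress is measured against the plan",
}

current_practices = {
    "defects are tracked to closure",
    "project progress is measured against the plan",
}

gaps = expected_practices - current_practices        # expected but not practised
strengths = expected_practices & current_practices   # expected and practised

print("Strengths:")
for p in sorted(strengths):
    print("  +", p)
print("Gaps to address:")
for p in sorted(gaps):
    print("  -", p)
```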

Reporting

 Documentation: Prepare a detailed report outlining the findings, including strengths,


weaknesses, and recommendations.
 Presentation: Share the results with stakeholders, including senior management and
development teams.

Improvement Planning

 Action Plans: Develop actionable plans to address identified weaknesses.


 Prioritization: Prioritize improvements based on impact and feasibility.
 Implementation: Implement the improvement plans and monitor progress.

4. Benefits of Software Process Assessment

 Enhanced Quality: Leads to higher quality software products by improving processes.


 Increased Efficiency: Streamlines processes, reducing time and effort.
 Risk Mitigation: Identifies potential risks and provides strategies to address them.
 Better Decision-Making: Provides data-driven insights for strategic decisions.
 Continuous Improvement: Encourages a culture of ongoing process improvement.

5. Challenges

 Resource Intensive: Can be time-consuming and require significant resources.


 Resistance to Change: Teams may resist changes suggested by the assessment.
 Complexity: Assessments can be complex and require expertise in the chosen model.
Levels of the SEI Capability Maturity Model (CMM)

Level 1: Initial

Ad hoc activities characterize a software development organization at this level. Very few or no processes are described and followed. Since software production processes are not defined, different engineers follow their own processes and, as a result, development efforts become chaotic. Therefore, it is also called the chaotic level.

Level 2: Repeatable

At this level, the fundamental project management practices like tracking cost and schedule are
established. Size and cost estimation methods, like function point analysis, COCOMO, etc. are
used.

Level 3: Defined

At this level, the methods for both management and development activities are defined and
documented. There is a common organization-wide understanding of operations, roles, and
responsibilities. Although the ways of working are defined, the process and product qualities are not yet measured. ISO 9000 aims at achieving this level.

Level 4: Managed

At this level, the focus is on software metrics. Two kinds of metrics are collected.

Product metrics measure the features of the product being developed, such as its size,
reliability, time complexity, understandability, etc.

Process metrics measure the effectiveness of the process being used, such as the average defect correction time, productivity, the average number of defects found per hour of inspection, the average number of failures detected during testing per LOC, etc. (A small example of these metrics follows.)
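A small worked example of such process metrics, using invented project data, might look like this:

```python
# Minimal sketch computing a few process metrics from hypothetical data.
# The numbers below are invented for illustration only.

correction_times_hours = [4.0, 7.5, 2.0, 6.5]   # time to fix each defect
inspection_hours = 12.0                          # total inspection effort
defects_found_in_inspection = 30
failures_in_testing = 18
size_loc = 24_000                                # lines of code

avg_correction_time = sum(correction_times_hours) / len(correction_times_hours)
defects_per_inspection_hour = defects_found_in_inspection / inspection_hours
failures_per_kloc = failures_in_testing / (size_loc / 1000)

print(f"Average defect correction time: {avg_correction_time:.1f} hours")
print(f"Defects found per hour of inspection: {defects_per_inspection_hour:.2f}")
print(f"Failures detected during testing per KLOC: {failures_per_kloc:.2f}")
```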

Level 5: Optimizing
At this level, process and product metrics are collected. Process and product measurement data
are evaluated for continuous process improvement.

Key Process Areas (KPA) of a software organization

Except for SEI CMM level 1, each maturity level is characterized by several Key Process Areas (KPAs) that identify the areas an organization should focus on to improve its software process to the next level.

Software Process Framework


 A Software Process Framework is a structured approach that defines the
steps, tasks, and activities involved in software development.
 This framework serves as a foundation for software engineering, guiding the
development team through various stages to ensure a systematic and efficient
process.
 A Software Process Framework helps in project planning, risk management,
and quality assurance by detailing the chronological order of actions.
1. Tasks: A task focuses on a small, specific objective.
2. Actions: An action is a set of tasks that produces a major work product.
3. Activities: Activities are groups of related tasks and actions performed to achieve a major objective.

1. Communication

Definition: Communication involves gathering requirements from customers and


stakeholders to determine the system’s objectives and the software’s requirements.
Activities:
 Requirement Gathering: Engaging with consumers and stakeholders
through meetings, interviews, and surveys to understand their needs and
expectations.
 Objective Setting: Clearly defining what the system should achieve based
on the gathered requirements.
Explanation: Effective communication is essential to understand what the users need
from the software. This phase ensures that all stakeholders are on the same page
regarding the goals and requirements of the system.
2. Planning
Definition: Planning involves establishing an engineering work plan, describing
technical risks, listing resource requirements, and defining a work schedule.
Activities:
 Work Plan: Creating a detailed plan that outlines the tasks and activities
needed to develop the software.
 Risk Assessment: Identifying potential technical risks and planning how to
mitigate them.
 Resource Allocation: Determining the resources (time, personnel, tools)
required for the project.
 Schedule Definition: Setting a timeline for completing different phases of
the project.
Explanation: Planning helps in organizing the project and setting clear expectations.
It ensures that the development team has a roadmap to follow and that potential
challenges are anticipated and managed.
3. Modeling
Definition: Modeling involves creating architectural models and designs to better
understand the problem and work towards the best solution.
Activities:
 Analysis of Requirements: Breaking down the gathered requirements to
understand what the system needs to do.
 Design: Creating architectural and detailed designs that outline how the
software will be structured and how it will function.
Explanation: Modeling translates requirements into a visual and structured
representation of the system. It helps in identifying the best design approach and
serves as a blueprint for development.
4. Construction
Definition: Construction involves creating code, testing the system, fixing bugs, and
confirming that all criteria are met.
Activities:
 Code Generation: Writing the actual code based on the design models.
 Testing: Running tests to ensure the software works as intended, identifying
and fixing bugs.
Explanation: This phase is where the actual software is built. Testing is crucial to
ensure that the code is error-free and that the software meets all specified
requirements.
5. Deployment
Definition: Deployment involves presenting the completed or partially completed
product to customers for evaluation and feedback, then making necessary
modifications based on their input.
Activities:
 Product Release: Delivering the software to users, either as a full release or
in stages.
 Feedback Collection: Gathering feedback from users about their
experience with the software.
 Product Improvement: Making changes and improvements based on user
feedback to enhance the product.
Popular Software Process Frameworks
 Waterfall
 Agile (Scrum, Kanban, Extreme Programming)
 Spiral model
 V-Model
 Rational Unified Process (RUP)
 DevOps

Assessment process principles:

1. Goal Orientation:
o Align assessment activities with organizational goals and business objectives.
o Ensure that the assessment focuses on aspects of the process that directly impact
the organization's ability to meet its goals.
2. Process Focus:
o Concentrate on the processes rather than individual performance.
o Evaluate how well processes are defined, managed, and executed.
3. Objectivity:
o Use objective, measurable criteria to evaluate processes.
o Ensure that the assessment is based on data and factual information rather than
opinions.
4. Systematic Approach:
o Follow a structured and well-defined methodology for conducting the assessment.
o Ensure consistency and repeatability in the assessment process.
5. Stakeholder Involvement:
o Involve all relevant stakeholders, including management, developers, and end-
users.
o Ensure that different perspectives are considered in the assessment.
6. Continuous Improvement:
o View assessment as a continuous process rather than a one-time activity.
o Use assessment findings to drive ongoing process improvement efforts.
7. Tailoring:
o Adapt the assessment approach to the specific context and needs of the
organization.
o Consider the size, complexity, and maturity of the organization when designing
the assessment.
8. Confidentiality and Sensitivity:
o Handle assessment data with confidentiality and sensitivity.
o Ensure that participants feel safe to provide honest and accurate information.
9. Cultural Awareness:
o Be aware of and respect the organizational culture.
o Consider cultural factors that might impact the effectiveness of processes and
their assessment.
10. Focus on Key Process Areas:
o Identify and assess critical process areas that have a significant impact on
software quality and project success.
o Prioritize areas for improvement based on their importance and impact.
11. Benchmarking:
o Compare the organization's processes against industry standards and best
practices.
o Identify gaps and opportunities for improvement based on these benchmarks.
12. Actionable Recommendations:
o Provide clear, practical, and actionable recommendations for process
improvement.
o Ensure that recommendations are feasible and aligned with the organization's
capabilities and resources.
13. Feedback and Communication:
o Ensure regular feedback and communication throughout the assessment process.
o Keep stakeholders informed about findings, progress, and next steps.

Conduct a software process assessment:


1. Preparation

 Define Objectives: Clearly outline the goals of the assessment.


 Select a Framework: Choose a standard framework or model (e.g., CMMI, ISO/IEC
15504 (SPICE), Agile methodologies).
 Form Assessment Team: Assemble a team of assessors with the necessary skills and
experience.
 Plan the Assessment: Develop a detailed plan, including scope, timeline, and resources.

2. Data Collection

 Document Review: Gather and review existing process documentation, project artifacts,
and other relevant materials.
 Surveys and Questionnaires: Distribute surveys to gather input from stakeholders.
 Interviews: Conduct interviews with key personnel (developers, managers, QA, etc.) to
gain insights into the processes.
 Observation: Observe the processes in action to understand the current practices.

3. Analysis

 Gap Analysis: Compare current processes against the chosen framework or standards to
identify gaps.
 Root Cause Analysis: Identify the underlying causes of any deficiencies or issues.
 SWOT Analysis: Assess strengths, weaknesses, opportunities, and threats related to the
current processes.

4. Assessment Report

 Findings: Document the findings from the data collection and analysis phases.
 Recommendations: Provide actionable recommendations for process improvement.
 Prioritization: Prioritize the recommendations based on impact and feasibility.
 Draft Report: Prepare a draft assessment report for review by stakeholders.
5. Review and Validation

 Stakeholder Review: Present the draft report to stakeholders for feedback.


 Validation: Validate findings and recommendations with the assessment team and
stakeholders.
 Finalize Report: Incorporate feedback and finalize the assessment report.

6. Action Plan

 Develop Action Plan: Create a detailed action plan to implement the recommendations.
 Assign Responsibilities: Assign tasks and responsibilities to appropriate team members.
 Set Deadlines: Establish timelines and milestones for implementing improvements.

7. Implementation

 Execute Action Plan: Implement the process improvements as outlined in the action
plan.
 Monitor Progress: Track the progress of implementation activities and make
adjustments as needed.

8. Follow-Up

 Periodic Reviews: Schedule follow-up assessments to ensure ongoing process


improvement.
 Continuous Improvement: Foster a culture of continuous improvement by regularly
revisiting and refining processes.

9. Documentation and Communication

 Maintain Records: Keep detailed records of the assessment process, findings, and
improvements.
 Communicate Results: Ensure that all stakeholders are informed about the assessment
outcomes and progress of improvements.
10. Evaluation

 Measure Impact: Evaluate the impact of the implemented changes on the software
development process.
 Lessons Learned: Document lessons learned and best practices for future assessments.

Software Testing Implementation and Considerations

Implementation Steps

1. Define Testing Objectives:


o Determine the goals of the testing effort (e.g., finding defects, ensuring
performance, validating requirements).
2. Develop a Test Plan:
o Outline the scope, approach, resources, and schedule for the testing activities.
o Include details on test items, features to be tested, testing tasks, responsibilities,
and deliverables.
3. Select Testing Types and Levels:
o Decide on the types of testing required (e.g., unit testing, integration testing,
system testing, acceptance testing).
o Define the levels of testing (e.g., component level, integration level, system
level).
4. Design Test Cases:
o Create detailed test cases based on requirements and design documents.
o Ensure test cases cover all functional and non-functional requirements.
5. Set Up the Test Environment:
o Prepare the hardware and software environment where testing will be conducted.
o Ensure the environment simulates the production environment as closely as
possible.
6. Execute Test Cases:
o Run the test cases and document the results.
o Use automated testing tools where applicable to increase efficiency and
repeatability (a minimal example follows this list).
7. Track and Manage Defects:
o Record defects found during testing in a defect tracking system.
o Prioritize and assign defects for resolution based on their severity and impact.
8. Perform Regression Testing:
o Re-test the software after defects are fixed to ensure that changes have not
introduced new defects.
o Use automated regression tests to streamline this process.
9. Evaluate Test Results:
o Analyze the outcomes of the testing efforts to assess the quality of the software.
o Use metrics and key performance indicators (KPIs) to measure test effectiveness.
10. Prepare Test Reports:
o Document the results of testing activities, including defect summaries, test
coverage, and test case pass/fail rates.
o Share test reports with stakeholders to provide visibility into the testing process
and outcomes.
11. Conduct Test Review and Retrospective:
o Review the testing process and outcomes with the team.
o Identify lessons learned and areas for improvement in future testing efforts.
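To illustrate steps 4 and 6 above (designing and executing test cases), here is a minimal sketch using Python's built-in unittest module. The discount_price function, the requirement it encodes, and the test data are hypothetical, chosen only to show how test cases derived from a requirement can be executed automatically.

```python
# Minimal test-case sketch using Python's unittest module.
# The function under test and the requirement it reflects are hypothetical.
import unittest

def discount_price(price: float, customer_years: int) -> float:
    """Requirement (hypothetical): customers with 2+ years get a 10% discount."""
    if price < 0:
        raise ValueError("price must be non-negative")
    return round(price * 0.9, 2) if customer_years >= 2 else price

class TestDiscountPrice(unittest.TestCase):
    def test_new_customer_pays_full_price(self):
        self.assertEqual(discount_price(100.0, 0), 100.0)

    def test_loyal_customer_gets_discount(self):
        self.assertEqual(discount_price(100.0, 2), 90.0)

    def test_boundary_just_below_threshold(self):
        self.assertEqual(discount_price(100.0, 1), 100.0)

    def test_negative_price_is_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(-1.0, 3)

if __name__ == "__main__":
    unittest.main()   # run with: python test_discount.py
```

Running the module executes every test case and reports pass/fail results, which can then feed the defect-tracking and test-reporting steps.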

Considerations

1. Test Coverage:
o Ensure comprehensive coverage of all functional and non-functional
requirements.
o Use traceability matrices to map test cases to requirements (see the sketch after this list).
2. Test Data Management:
o Create and manage test data that reflects real-world scenarios.
o Ensure test data is secure and compliant with data protection regulations.
3. Automation:
o Identify repetitive and time-consuming test cases for automation.
o Choose appropriate automation tools and frameworks.
o Maintain and update automated test scripts as the software evolves.
4. Performance Testing:
o Include performance testing to assess the software’s responsiveness, stability, and
scalability.
o Use performance testing tools to simulate user load and measure performance
metrics.
5. Security Testing:
o Conduct security testing to identify vulnerabilities and ensure the software is
secure against potential threats.
o Use tools and techniques such as penetration testing, static code analysis, and
vulnerability scanning.
6. Usability Testing:
o Assess the software’s usability to ensure it meets user expectations for ease of use
and accessibility.
o Gather feedback from real users to identify usability issues.
7. Continuous Integration and Continuous Testing:
o Integrate testing into the continuous integration (CI) pipeline to ensure ongoing
quality.
o Use CI tools to automatically trigger tests with each code commit.
8. Test Environment Management:
o Ensure the test environment is stable, consistent, and reflects the production
environment.
o Use virtualization and containerization to manage test environments effectively.
9. Team Collaboration:
o Foster collaboration between developers, testers, and other stakeholders.
o Use collaboration tools and practices such as daily stand-ups, code reviews, and
pair programming.
10. Compliance and Standards:
o Adhere to industry standards and regulatory requirements relevant to the software
being tested.
o Ensure that testing practices comply with standards such as ISO, IEEE, and
others.
11. Risk-Based Testing:
o Prioritize testing activities based on the risk and impact of potential defects.
o Focus on critical areas of the application that have the highest risk.
12. Documentation:
o Maintain thorough documentation of test plans, test cases, test results, and defect
reports.
o Ensure documentation is clear, concise, and accessible to all stakeholders.
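As a sketch of the traceability matrix mentioned under Test Coverage above, the example below maps hypothetical requirement IDs to test-case IDs and reports requirements that are not yet covered.

```python
# Minimal traceability-matrix sketch: map requirements to test cases and
# report uncovered requirements. All IDs below are hypothetical.

requirements = ["REQ-001", "REQ-002", "REQ-003", "REQ-004"]

traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-004": [],          # defined but has no test case yet
}

for req in requirements:
    cases = traceability.get(req, [])
    status = ", ".join(cases) if cases else "NOT COVERED"
    print(f"{req}: {status}")

uncovered = [r for r in requirements if not traceability.get(r)]
print("Requirements without test coverage:", uncovered)  # REQ-003, REQ-004
```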

Software Quality Management System


Software Quality Management System contains the methods that are used by the
authorities to develop products having the desired quality.
Some of the methods are:
 Managerial Structure: the quality system is responsible for managing the structure as a whole; every organization has a managerial structure.
 Individual Responsibilities: each individual in the organization has responsibilities that are reviewed by top management and must be taken seriously.
 Quality System Activities: the activities that every quality system must include are:
o Project auditing
o Review of the quality system
o Development of methods and guidelines
Evolution of Quality Management System
Quality systems have evolved over the years. The evolution of a quality management system is a four-step process.
1. Inception: product inspection provided the first instrument for quality control (QC).
2. Quality Control: the main task of quality control is to detect defective items and find the causes that lead to the defects; it also helps in correcting them.
3. Quality Assurance: quality assurance helps an organization make good-quality products and improve product quality by passing the products through quality checks.
4. Total Quality Management (TQM): TQM checks and assures that all procedures are continuously and regularly improved through process measurements.

Software Quality Assurance Plan

A Quality Assurance Plan (QAP) is a document or set of documents that outlines the
systematic processes, procedures, and standards for ensuring the quality of a product
or service. It is a key component of quality management and is used in various
industries to establish and maintain a consistent level of quality in deliverables. For
a software product or service, an SQA plan will be used in conjunction with the
typical development, prototyping, design, production, and release cycle. An SQA
plan will include several components, such as purpose, references, configuration and
management, tools, code controls, testing methodology, problem reporting and
remedial measures, and more, for easy documentation and referencing.
Importance of Software Quality Assurance Plan

 Quality Standards and Guidelines: The SQA Plan lays out the
requirements and guidelines to make sure the programme satisfies
predetermined standards for quality.
 Risk management: It is the process of recognizing, evaluating and
controlling risks in order to reduce the possibility of errors and other
problems with quality.
 Standardization and Consistency: The strategy guarantees consistent
methods, processes, and procedures, fostering a unified and well-
structured approach to quality assurance.
 Customer Satisfaction: The SQA Plan helps to ensure that the finished
product satisfies customer needs, which in turn increases overall customer
satisfaction.
 Resource optimization: It is the process of defining roles, responsibilities,
and procedures in order to maximize resource utilization and minimize
needless rework.
 Early Issue Detection: SQA Plans help identify problems early on, which
lowers the expense and work involved in fixing them.
Objectives And Goals of Software Quality Assurance Plan
 The objectives and goals of a Quality Assurance Plan (QAP) are to ensure that
the products or services meet specified quality standards and requirements.
 The plan serves as a roadmap for implementing quality assurance processes
throughout a project or organizational activity.
 The specific objectives and goals can vary depending on the nature of the
project or industry, but they share common elements.
Consideration of Software Quality Assurance Plan:
 Overhead in Small Projects: The cost of developing and upholding a
detailed SQA plan may be excessive for small projects compared to their
scope and complexity.
 Opposition to Change: An SQA strategy may involve changes that teams
accustomed to their current workflows find difficult to accept, which could
result in a time of adjustment and resistance.
 Documentation Complexity: SQA plans with high documentation
requirements run the risk of adding complexity and coming off as
bureaucratic to teams, which makes it difficult to keep documentation up
to date.
 Reliance on Human Elements: An SQA plan’s performance depends on
human elements like procedure adherence and attention to detail, which
makes human error a possibility.
Verification and Validation Testing
Verification testing

 Verification testing includes activities such as reviews of business requirements, system requirements, and designs, and code walkthroughs carried out while developing a product.

 It is also known as static testing, where we ensure that "we are building the product right", i.e., that the work products conform to the specified requirements given by the client.
Validation testing

 Validation testing is testing in which the tester performs functional and non-functional testing. Functional testing includes unit testing (UT), integration testing (IT), and system testing (ST), and non-functional testing includes user acceptance testing (UAT).

 Validation testing is also known as dynamic testing, where we ensure that "we are building the right product", i.e., that the software meets the business needs of the client. (A minimal example follows.)
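To make the static/dynamic distinction concrete, the sketch below pairs a review-style check that never executes the code (verification) with tests that actually run it (validation). The conversion function, the review checklist, and the expected values are hypothetical.

```python
# Minimal sketch contrasting verification (static) with validation (dynamic).
# The requirement, checklist, and function below are hypothetical.
import inspect

def fahrenheit_to_celsius(f: float) -> float:
    """Convert a temperature from Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

# Verification (static): review the work product without executing it,
# e.g. check that the function is documented and type-annotated.
source = inspect.getsource(fahrenheit_to_celsius)
assert '"""' in source, "review finding: missing docstring"
assert "->" in source, "review finding: missing return type annotation"

# Validation (dynamic): execute the code and check it meets the need,
# e.g. the freezing and boiling points of water.
assert abs(fahrenheit_to_celsius(32) - 0.0) < 1e-9
assert abs(fahrenheit_to_celsius(212) - 100.0) < 1e-9

print("Verification checks and validation tests passed.")
```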

Difference between verification and validation testing

Definition: Verification asks "Are we building the product right?"; validation asks "Are we building the right product?"

Testing type: Verification is also known as static testing; validation is also known as dynamic testing.

Methods: Verification includes methods such as inspections, reviews, and walkthroughs; validation includes testing such as functional testing, system testing, integration testing, and user acceptance testing.

Scope: Verification checks the work products (not the final product) of a development cycle to decide whether they meet the specified requirements; validation checks the software during or at the end of the development cycle to decide whether it meets the specified business requirements.

QA/QC: Quality assurance comes under verification; quality control comes under validation.

Code execution: Code is not executed during verification; code is executed during validation.

When bugs are found: Verification finds bugs early in the development phase; validation finds the bugs that were not caught during verification.

Who performs it: Verification is performed by the quality assurance team to make sure the product is being developed according to the customers' requirements; validation is performed by the testing team to test the application.

Order: Verification is done before validation; validation takes place after verification.

What is checked: In verification, we check whether the inputs lead to the expected outputs; in validation, we check whether the user accepts the product.
