SE 07 Software Process

The document discusses software processes and verification and validation (V&V). It describes common software process models such as the waterfall model, evolutionary development, and component-based development. It also covers V&V techniques such as inspections and testing, and the difference between verification and validation. The key V&V goal is to establish confidence that the software is fit for purpose and to discover defects; fit for purpose does not mean completely free of defects.


Munawar, PhD

Verification and Validation

SOFTWARE PROCESSES
The software process

 A structured set of activities required to develop a software system:
 Specification;
 Design;
 Validation;
 Evolution.
 A software process model is an abstract representation of a process. It presents a description of a process from some particular perspective.
Generic software process models

 The waterfall model
 Separate and distinct phases of specification and development.
 Evolutionary development
 Specification, development and validation are interleaved.
 Component-based software engineering
 The system is assembled from existing components.
 There are many variants of these models, e.g. formal development, where a waterfall-like process is used but the specification is a formal specification that is refined through several stages to an implementable design.
Waterfall model
Process iteration

 System requirements ALWAYS evolve in the course of a project, so process iteration, where earlier stages are reworked, is always part of the process for large systems.
 Iteration can be applied to any of the generic process models.
Process activities

 Software specification
 Software design and implementation
 Software validation
 Software evolution
Software specification

 The process of establishing what services are required and the constraints on the system’s operation and development.
 Requirements engineering process
 Feasibility study;
 Requirements elicitation and analysis;
 Requirements specification (MS3);
 Requirements verification (IA1, TA);
 Requirements validation (acceptance test).
The requirements engineering process
Verification and Validation

 Assuring that a software system meets a user's needs
Topics covered

 Verification and validation planning
 Software inspections
Verification vs validation

 Verification: "Are we building the product right?"
 The software should conform to its specification
 Validation: "Are we building the right product?"
 The software should do what the user really requires
The V & V process

 Is a whole life-cycle process: V & V must be applied at each stage in the software process.
 Has two principal objectives:
 The discovery of defects in a system;
 The assessment of whether or not the system is usable in an operational situation.
Static and dynamic verification

 Software inspections are concerned with analysis of the static system representation to discover problems (static verification)
 May be supplemented by tool-based document and code analysis (see the example below)
 Software testing is concerned with exercising and observing product behaviour (dynamic verification)
 The system is executed with test data and its operational behaviour is observed
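Tool-based code analysis can be as lightweight as enabling compiler diagnostics. As a minimal sketch (GCC is just one possible tool, and module.c is a hypothetical file name; the slide does not name a specific tool):

    gcc -Wall -Wextra -c module.c

Warnings about, for example, variables that may be used uninitialised are a form of static verification: they are produced without ever executing the program.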
Static and dynamic V&V

[Figure: static verification applies to the requirements specification, high-level design, formal specification, detailed design and program; dynamic validation applies to the prototype and the program.]
Program testing

 Can reveal the presence of errors, NOT their absence
 A successful test is a test which discovers one or more errors
 The only validation technique for non-functional requirements
 Should be used in conjunction with static verification to provide full V&V coverage
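To illustrate the sense in which a successful test is one that finds an error, here is a minimal C sketch (the function max_of and its off-by-one defect are invented for illustration). A test on a typical input passes and reveals nothing; a boundary-value test fails and thereby discovers the defect.

    #include <assert.h>

    /* Intended behaviour: return the largest element of a[0..n-1].   */
    /* The loop contains a deliberate off-by-one defect: it never     */
    /* examines the last element a[n-1].                              */
    static int max_of(const int a[], int n) {
        int max = a[0];
        for (int i = 1; i < n - 1; i++) {   /* defect: should be i < n */
            if (a[i] > max)
                max = a[i];
        }
        return max;
    }

    int main(void) {
        int typical[] = {7, 3, 5};
        assert(max_of(typical, 3) == 7);    /* passes: reveals no error */

        int boundary[] = {3, 5, 7};
        assert(max_of(boundary, 3) == 7);   /* fails: this is the "successful"
                                               test, it discovers the defect */
        return 0;
    }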
V & V goals

 Verification and validation should establish confidence that the software is fit for purpose
 This does NOT mean completely free of defects
 Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed
V & V planning

 Careful planning is required to get the most out of testing and inspection processes
 Planning should start early in the development process
 The plan should identify the balance between static verification and testing
 Test planning is about defining standards for the testing process rather than describing product tests
The V-model of development

[Figure: the requirements specification, system specification and system design feed the acceptance test plan, system integration test plan and sub-system integration test plan respectively; detailed design leads to module and unit code and test, followed by the sub-system integration test, system integration test and acceptance test, and finally service.]
The structure of a software test plan**

 The testing process
 Requirements traceability
 Tested items
 Testing schedule
 Test recording procedures
 Hardware and software requirements
 Constraints

**It is highly recommended that you begin writing the acceptance tests now – don’t wait until the test plan is due.
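In that spirit, here is a minimal sketch of what an early, executable acceptance test might look like, written in C; the requirement identifier R-3.2, the test identifier AT-01, the function greeting_for and its behaviour are all hypothetical placeholders, since real acceptance tests must trace to your own requirements specification.

    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical stand-in for the system interface that the real     */
    /* requirements specification would promise.                        */
    static const char *greeting_for(const char *username) {
        static char buffer[64];
        snprintf(buffer, sizeof buffer, "Hello, %s", username);
        return buffer;
    }

    /* Acceptance test AT-01, traced to hypothetical requirement R-3.2: */
    /* "The system shall greet a logged-in user by name."               */
    static void acceptance_test_AT_01(void) {
        assert(strcmp(greeting_for("alice"), "Hello, alice") == 0);
    }

    int main(void) {
        acceptance_test_AT_01();
        puts("AT-01 passed");
        return 0;
    }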
Software inspections

 Involve people examining the source representation with the aim of discovering anomalies and defects
 Do not require execution of a system, so may be used before implementation
 May be applied to any representation of the system (requirements, design, test data, etc.)
 Very effective technique for discovering errors
Inspection success

 Many different defects may be discovered in a single inspection. In testing, one defect may mask another, so several executions are required
 The reuse of domain and programming knowledge means that reviewers are likely to have seen the types of error that commonly arise
Inspections and testing

 Inspections and testing are complementary and not opposing verification techniques
 Both should be used during the V & V process
 Inspections can check conformance with a specification but not conformance with the customer’s real requirements
 Inspections cannot check non-functional characteristics such as performance, usability, etc.
Program inspections

 Formalised approach to document reviews
 Intended explicitly for defect DETECTION (not correction)
 Defects may be logical errors, anomalies in the code that might indicate an erroneous condition (e.g. an uninitialised variable), or non-compliance with standards
Inspection pre-conditions

 A precise specification must be available
 Team members must be familiar with the organisation standards
 Syntactically correct code must be available
 An error checklist should be prepared
 Management must accept that inspection will increase costs early in the software process
 Management must not use inspections for staff appraisal
The inspection process

[Figure: Planning → Overview → Individual preparation → Inspection meeting → Rework → Follow-up]
Inspection procedure

 System overview presented to inspection team
 Code and associated documents are distributed to inspection team in advance
 Inspection takes place and discovered errors are noted
 Modifications are made to repair discovered errors
 Re-inspection may or may not be required
Inspection teams

 Made up of at least 4 members
 Author of the code being inspected
 Inspector who finds errors, omissions and inconsistencies
 Reader who reads the code to the team
 Moderator who chairs the meeting and notes discovered errors
 Other roles are Scribe and Chief moderator
Inspection checklists

 Checklist of common errors should be used to drive the inspection
 Error checklist is programming language dependent
 The 'weaker' the type checking, the larger the checklist
 Examples: initialisation, constant naming, loop termination, array bounds, etc.
Inspection checks

Fault class: Data faults
 Are all program variables initialised before their values are used?
 Have all constants been named?
 Should the lower bound of arrays be 0, 1, or something else?
 Should the upper bound of arrays be equal to the size of the array or Size - 1?
 If character strings are used, is a delimiter explicitly assigned?

Fault class: Control faults
 For each conditional statement, is the condition correct?
 Is each loop certain to terminate?
 Are compound statements correctly bracketed?
 In case statements, are all possible cases accounted for?

Fault class: Input/output faults
 Are all input variables used?
 Are all output variables assigned a value before they are output?

Fault class: Interface faults
 Do all function and procedure calls have the correct number of parameters?
 Do formal and actual parameter types match?
 Are the parameters in the right order?
 If components access shared memory, do they have the same model of the shared memory structure?

Fault class: Storage management faults
 If a linked structure is modified, have all links been correctly reassigned?
 If dynamic storage is used, has space been allocated correctly?
 Is space explicitly de-allocated after it is no longer required?

Fault class: Exception management faults
 Have all possible error conditions been taken into account?
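As a small worked example of the kind of code an inspector would flag against this checklist, here is a deliberately faulty C fragment (the function and its defects are invented for illustration; the comments mark what the checklist should catch):

    #include <stdio.h>

    void print_squares(int values[], int count) {
        int sum;                              /* data fault: sum is used below
                                                 before it is initialised      */
        for (int i = 0; i <= count; i++) {    /* data fault: reads values[count],
                                                 one past the upper bound       */
            sum += values[i] * values[i];
            printf("%d\n", values[i] * values[i]);
        }
        printf("sum = %d\n", sum);
    }

Both defects are visible without running the code, which is exactly the point of static verification.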
Inspection rate

 500 statements/hour during overview
 125 source statements/hour during individual preparation
 90-125 statements/hour can be inspected during the inspection meeting
 Inspection is therefore an expensive process
 Inspecting 500 lines costs about 40 man-hours of effort = £2800
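A rough reconstruction of those figures, assuming a four-person team and a fully loaded cost of about £70 per person-hour (both are assumptions, not stated on the slide): preparing 500 statements at 125 statements/hour takes each member about 4 hours (16 person-hours); inspecting at roughly 100 statements/hour takes the team of four about 5 hours (20 person-hours); the overview adds about an hour each (4 person-hours). That gives approximately 40 person-hours, and 40 × £70 ≈ £2800.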
Key points

 Verification and validation are not the same thing.
 Verification shows conformance with specification
 Validation shows that the program meets the customer’s needs
 Test plans should be drawn up to guide the testing process.
 Static verification techniques involve examination and analysis of the program for error detection
Key points

 Program inspections are very effective in discovering errors
 Program code in inspections is checked by a small team to locate software faults
