Introduction
Defect Reduction Techniques
Review
Testing
Formal verification
Development process
Systematic methodologies
Why Test?
How Do You Test a Program?
What’s So Hard About Testing?
Consider int proc1(int x, int y)
Assuming a 64-bit computer:
◦ Input space = 2^128
Testing Facts
Consumes the largest effort among all phases
◦ Requires the largest manpower among all
development roles
◦ Implies more job opportunities
About 50% development effort
◦ But 10% of development time?
◦ How?
Testing Facts
Testing is getting more complex and
sophisticated every year:
◦ Larger and more complex programs
◦ Newer programming paradigms
Overview of Testing Activities
Errors, Faults, and Failures
A failure is a manifestation of an
error (also called a defect or bug).
◦ The mere presence of an error may not
lead to a failure.
Pesticide Effect
Errors that escape a fault detection technique:
◦ Cannot be detected by further applications of that
technique.
Pesticide Effect
Assume we use 4 fault detection
techniques on a program with 1000 bugs:
◦ Each technique detects only 70% of the bugs
◦ How many bugs would remain?
◦ 1000 × (0.3)^4 = 8.1 ≈ 8 bugs
Fault Model
Types of faults possible in a
program.
Some types can be ruled out:
◦ Concurrency-related problems in a
sequential program
Fault Model of an OO Program
OO faults fall into two categories:
◦ Structural faults
◦ Algorithmic faults
Hardware Fault-Model
Simple:
◦ Stuck-at 0
◦ Stuck-at 1
◦ Open circuit
◦ Short circuit
Simple ways exist to test for the presence of each
Hardware testing is fault-based testing
Software Testing
Each test case typically tries to establish
correct working of some functionality
◦ Executes (covers) some program elements
◦ For restricted types of faults, fault-based
testing exists.
Test Cases and Test Suites
Test software using a set of
carefully designed test cases:
◦ The set of all test cases is
called the test suite
Test Cases and Test Suites
A test case is a triplet [I,S,O]
◦ I is the data to be input to the
system,
◦ S is the state of the system at which
the data will be input,
◦ O is the expected output of the
system.
Aim of Testing
The aim of testing is to identify all
defects in a software product.
However, in practice even after
thorough testing:
◦ one cannot guarantee that the
software is error-free.
Aim of Testing
The input data domain of most
software products is very large:
◦ it is not practical to test the
software exhaustively with each
input data value.
Aim of Testing
Testing does, however, expose many
errors:
◦ testing provides a practical way of
reducing defects in a system
◦ increases the users' confidence in a
developed system.
Aim of Testing
Testing is an important development
phase:
◦ requires the maximum effort among all
development phases.
In a typical development organization:
◦ maximum number of software engineers can be
found to be engaged in testing activities.
Aim of Testing
Many engineers have the wrong
impression that:
◦ testing is a secondary activity
◦ it is intellectually not as
stimulating as the other
development activities, etc.
Aim of Testing
Testing a software product is in fact:
◦ as challenging as initial
development activities such as
specification, design, and coding.
Also, testing involves a lot of
creative thinking.
Levels of Testing
Software products are tested at three
levels:
◦ Unit testing
◦ Integration testing
◦ System testing
Unit testing
During unit testing, modules are tested
in isolation:
◦ If all modules were to be tested together:
it may not be easy to determine which
module has the error.
Unit testing
Unit testing reduces debugging effort
severalfold.
◦ Programmers carry out unit testing
immediately after they complete the
coding of a module.
Integration testing
After different modules of a system
have been coded and unit tested:
◦ modules are integrated in steps according
to an integration plan
◦ the partially integrated system is tested at
each integration step.
System Testing
System testing involves:
◦ validating a fully developed system
against its requirements.
Verification versus Validation
Verification is the process of
determining:
◦ Whether the output of one phase of development
conforms to its previous phase.
Validation is the process of determining:
◦ Whether a fully developed system
conforms to its SRS document.
Verification versus Validation
Verification is concerned with
phase containment of errors,
◦ whereas the aim of validation is that
the final product be error-free.
Design of Test Cases
Exhaustive testing of any non-trivial
system is impractical:
◦ Input data domain is extremely large.
Design an optimal test suite:
◦ Of reasonable size and
◦ Uncovers as many errors as possible.
Design of Test Cases
If test cases are selected randomly:
◦ Many test cases would not contribute to the significance
of the test suite:
◦ they would not detect errors not already detected by
other test cases in the suite.
Number of test cases in a randomly selected test
suite:
◦ Not an indication of effectiveness of testing.
Design of Test Cases
Testing a system using a large number of
randomly selected test cases:
◦ Does not mean that many errors in the
system will be uncovered.
Consider the following example:
◦ Find the maximum of two integers x and
y.
Design of Test Cases
The code has a simple programming error
(the else branch should assign y):
if (x > y) max = x;
else max = x;
The test suite {(x=3,y=2); (x=2,y=3)} can detect the
error,
whereas the larger test suite {(x=3,y=2); (x=4,y=3);
(x=5,y=1)} does not detect it.
Design of Test Cases
Systematic approaches are required to
design an optimal test suite:
◦ Each test case in the suite should detect
different errors.
Design of Test Cases
There are essentially three main
approaches to design test cases:
◦ Black-box approach
◦ White-box (or glass-box) approach
◦ Grey-box (or model based) approach
Black-Box Testing
Test cases are designed using only the
functional specification of the software:
◦ Without any knowledge of the internal
structure of the software.
For this reason, black-box testing is also
known as functional testing.
Black-box Testing Techniques
There are many approaches to designing
black-box test cases:
◦ Equivalence class partitioning
◦ Boundary value analysis
◦ State table based testing
◦ Decision table based testing
◦ Cause-effect graph based testing
◦ Orthogonal array testing
◦ Positive-negative testing
White-box Testing
Designing white-box test cases:
◦ Requires knowledge about the
internal structure of software.
◦ White-box testing is also called
structural testing.
White-Box Testing Techniques
There exist several popular white-box testing
methodologies:
◦ Statement coverage
◦ Branch coverage
◦ Path coverage
◦ Condition coverage
◦ MC/DC coverage
◦ Mutation testing
◦ Data flow-based testing
Coverage-Based Testing Versus
Fault-Based Testing
Idea behind coverage-based testing:
◦ Design test cases so that certain program
elements are executed (or covered).
◦ Example: statement coverage, path coverage,
etc.
Idea behind fault-based testing:
◦ Design test cases that focus on discovering
certain types of faults.
◦ Example: Mutation testing.
Why Both BB and WB Testing?
Black-box limitations:
◦ Impossible to write a test case for every
possible set of inputs and outputs
◦ Some code parts may not be reachable
◦ Does not tell if extra functionality has been
implemented
White-box limitations:
◦ Does not address the question of whether or
not a program matches the specification
◦ Does not tell you if all of the functionality
has been implemented
◦ Does not discover missing program logic
Grey Box / Model Based Testing
In grey-box testing, test cases are
designed from design documents or
models, such as UML diagrams.
Grey-box testing is also called
model-based testing.
It is mainly used for testing of O-O systems.
Summary
Discussed importance of testing and the
basic concepts of testing.
Presented the levels of testing.
◦ Unit testing
◦ Integration testing
◦ System testing
Discussed the fundamentals of black-box
testing, white-box testing, and grey-box
testing.
Thank You