Software Testing and
Quality Assurance
Lecture 2:
Foundations of
Software Testing
SFE 4030
Terminology: Failure
“Deviation of the component or system from its expected delivery,
service or result”
“Manifested inability of a system
to perform required function”
Windows failure leads to “blue screen of death”.
Terminology: Defect / Fault
“Flaw in component or system
that can cause the component or system
to fail to perform its required function”
“A defect, if encountered during execution,
may cause a failure of the component or system”
Synonym: Fault
Fault in Apple’s Secure Socket Layer code
https://avandeursen.com/2014/02/22/gotofail-security/
Terminology: Error
“A human action
that produces an incorrect result”
Synonym: Mistake
Faults, Failures, and Bugs
• Failure: manifested inability of a system to perform a required function
• Defect (fault): missing or incorrect code; a "bug"
• Error (mistake): human action producing a fault
• And thus:
• Testing: attempt to trigger failures
• Debugging: attempt to find faults, given a failure
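The error, fault, failure chain can be illustrated with a short Python sketch; the `mean` function and its off-by-one fault are invented for illustration:

```python
# A human mistake (error) introduces incorrect code (a fault / defect).
def mean(values):
    # Fault: should divide by len(values); the "- 1" is the defect.
    return sum(values) / (len(values) - 1)

# Testing attempts to trigger a failure: a deviation of the system
# from its expected result.
result = mean([2, 4, 6])   # expected delivery: 4.0
print(result)              # prints 6.0: the fault manifests as a failure
```

Note that the fault exists in the code whether or not it is ever executed; the failure only appears once a test input reaches it.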
Principles of Testing #1
• Testing shows the presence of defects
• Testing does not show the absence of defects!
• “no test team can achieve 100% defect detection effectiveness” (Black et al.)
Principles of Testing #2:
Exhaustive Testing is ……
Principle #2: Exhaustive Testing is Impossible
• A simple program: 3 inputs, 1 output
• a, b, c: 32-bit integers
• A trillion test cases per second
[Timeline: tests done after ~2.5 billion years, by which time all oceans are dry and all plants dead]
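The "2.5 billion years" figure can be checked with a quick back-of-the-envelope calculation; the trillion-tests-per-second rate is the one assumed on the slide:

```python
# Exhaustive testing of three independent 32-bit inputs.
inputs = 2 ** 32                # values per 32-bit integer
total_cases = inputs ** 3       # all (a, b, c) combinations = 2**96
rate = 10 ** 12                 # a trillion test cases per second
seconds_per_year = 60 * 60 * 24 * 365

years = total_cases / rate / seconds_per_year
print(f"{years:.2e} years")     # roughly 2.5e9, i.e. ~2.5 billion years
```

Even this tiny program defeats brute force, which is why test design is about choosing a small, representative subset of inputs.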
Principles of Testing #3: Test Early
• Start testing as early as possible
• To let tests guide design
• To get feedback as early as possible
• To find bugs when they are cheapest to fix
• To find bugs when they have caused the least damage
Faults can be introduced at any moment in the software development process.
Finding faults in different development phases may require different types of testing.
Cost to Repair: Early Discovery Pays Off
[Chart: relative cost to repair a defect by phase (Requirements, Design, Unit test, Acceptance test, Post release), rising roughly 1 → 2 → 4 → 16]
Principles of Testing #4:
Defects are likely to be Clustered
• If you find a bug, keep on searching in its 'neighborhood'
• "Hot" components: frequent change, bad habits, poor developers, tricky logic, business uncertainty, innovation, size, …
• Pareto Principle / Law of the vital few: 80% of effects come from 20% of causes
• "Pareto Diagram" for an IBM system: 70% of defects were caused by 2 components
• Use this to focus test effort
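The Pareto pattern is easy to demonstrate with made-up defect counts per component; the component names and numbers below are invented, not the IBM data:

```python
# Invented defect counts per component, showing the clustering pattern.
defects = {"parser": 120, "ui": 15, "auth": 160, "db": 40, "logging": 25,
           "report": 20, "api": 10, "cache": 5, "search": 3, "export": 2}

total = sum(defects.values())
top_two = sorted(defects.values(), reverse=True)[:2]
share = sum(top_two) / total

print(f"{share:.0%} of defects in 2 of {len(defects)} components")
# prints: 70% of defects in 2 of 10 components
```

Sorting components by defect count like this is exactly what a Pareto diagram visualizes, and it tells you where to focus test effort.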
Principles of Testing #5:
Is there one best test method for my project?
The pesticide paradox:
Every method you use to prevent or find
bugs leaves a residue of subtler bugs
against which those methods are ineffectual.
• Re-running the same test suite again and again on
a changing program gives a false sense of security
• We need variation in testing
Principles of Testing #6:
Is there a single test method for any project?
• Testing is context-dependent
Principles of Testing #7:
Absence-of-errors Fallacy
There is more to success than absence of errors
Thorough understanding of business value is necessary
“Building the software right versus building the right software.”
“Finding and removing defects is not a way to improve the overall
quality or performance of a system” – Russ Ackoff
https://embeddedartistry.com/blog/2019/2/5/beyond-continual-improvement
Revisiting the Principles?
1. Testing cannot show absence of bugs
2. Exhaustive testing is impossible
3. Testing needs to start early
4. Defects tend to be clustered
5. Pesticide paradox: repeatedly applied test methods become ineffective
6. Testing is context-dependent
7. There is more to quality than absence of defects
Psychology of Testing
• We are all biased
• Independence of testing required:
• From self-testing to external approval
• Testers with different backgrounds
• Good knowledge of rigorous techniques and procedures required
Cognitive Bias
• “System 1”: Fast, instinctive,
emotional.
• "System 2”: slower, more
deliberative, and more logical.
• System 2 requires effort and is
happy to let System 1 do the
work
Example of Cognitive Bias
• “What You See is All There Is” (WYSIATI)
• Being satisfied with the evidence you see.
• Key problem in software engineering:
• “It works on my machine”
• “I’ve tried it, and it works”
• “All 100 tests pass, we can ship”
ACM Software Engineering Code of Ethics
1. PUBLIC - Software engineers shall act consistently with the public interest.
2. CLIENT AND EMPLOYER - Software engineers shall act in a manner that is in the best
interests of their client and employer consistent with the public interest.
3. PRODUCT - Software engineers shall ensure that their products and related
modifications meet the highest professional standards possible.
4. JUDGMENT - Software engineers shall maintain integrity and independence
in their professional judgment.
5. MANAGEMENT - Software engineering managers and leaders shall subscribe to and
promote an ethical approach to the management of software development and
maintenance.
6. PROFESSION - Software engineers shall advance the integrity and reputation of the
profession consistent with the public interest.
7. COLLEAGUES - Software engineers shall be fair to and supportive of their colleagues.
8. SELF - Software engineers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession.
https://www.acm.org/code-of-ethics
Question!
Which test level corresponds to testing whether individual components fulfil their required functionality?
A. Acceptance Testing
B. Integration Testing
C. System Testing
D. Unit Testing
Answer: D. Unit Testing
Software Life Cycle
• Period of time that
• begins when a software system is conceived
• ends when the system is no longer available for use.
• Phases: concept development, requirements, design,
implementation, test, installation, retirement
• Phases may overlap and be performed iteratively
Verification vs validation
Verification:
"Are we building the product right”.
The software should conform to its specification.
Validation:
"Are we building the right product”.
The software should do what the user really requires.
Chapter 8 Software testing 32
V & V confidence
Aim of V & V is to establish confidence that the system is ‘fit for purpose’.
Depends on system’s purpose, user expectations and marketing
environment
V & V confidence
• Software purpose
• The level of confidence depends on how critical the software is to an
organisation.
• User expectations
• Users may have low expectations of certain kinds of software.
• Marketing environment
• Getting a product to market early may be more important than finding
defects in the program.
Validation and defect testing
The first goal (showing that the software meets its requirements) leads to validation testing:
you expect the system to perform correctly using a given
set of test cases that reflect the system's expected use.
The second goal (discovering faults) leads to defect testing:
the test cases are designed to expose defects. The test
cases in defect testing can be deliberately obscure and need
not reflect how the system is normally used.
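The contrast can be sketched in a few lines; `parse_age` is a made-up function. The first assertions reflect expected use (validation testing), while the last input is deliberately odd, chosen to probe for defects (defect testing):

```python
# Made-up function under test.
def parse_age(text):
    value = int(text.strip())
    if not 0 <= value <= 150:
        raise ValueError("age out of range")
    return value

# Validation testing: test cases reflect the system's expected use.
assert parse_age("42") == 42
assert parse_age(" 7 ") == 7

# Defect testing: a deliberately obscure input, not normal usage.
try:
    parse_age("0x1F")          # hex string no normal user would type
    print("input accepted")
except ValueError:
    print("exception path exposed")   # behaviour to examine against the spec
```

A "successful" defect test is one that makes the system misbehave; whether the exposed exception path is a defect depends on what the specification says about malformed input.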
Chapter 8 Software testing/Sommerville 35
Testing Process Goals
Validation testing
To demonstrate to the developer and the system customer that
the software meets its requirements
A successful test shows that the system operates as intended.
Defect testing
To discover faults or defects in the software where its behavior is
incorrect or not in conformance with its specification
A successful test is a test that makes the system perform
incorrectly and so exposes a defect in the system.
Inspections and testing
Software inspections: concerned with analysis of the static system representation to discover problems (static verification)
May be supplemented by tool-based document and code analysis.
Software testing: concerned with exercising and observing product behaviour (dynamic verification)
The system is executed with test data and its operational behaviour is observed.
Inspections and testing
[Figure: inspections and testing as complementary V & V activities]
Software inspections
These involve people examining the source representation with the aim of discovering
anomalies and defects.
Inspections do not require execution of a system, so they may be used before implementation.
They may be applied to any representation of the system (requirements, design,
configuration data, test data, etc.).
They have been shown to be an effective technique for discovering program errors.
Advantages of Inspections
During testing, errors can mask (hide) other errors. Because inspection is a
static process, you don’t have to be concerned with interactions between
errors.
Incomplete versions of a system can be inspected without additional costs. If
a program is incomplete, then you need to develop specialized test harnesses
to test the parts that are available.
As well as searching for program defects, an inspection can also consider
broader quality attributes of a program, such as compliance with standards,
portability and maintainability.
Inspections and testing
Inspections and testing are complementary and not opposing
verification techniques.
Both should be used during the V & V process.
Inspections can check conformance with a specification but not
conformance with the customer’s real requirements.
Inspections cannot check non-functional characteristics such as
performance, usability, etc.
A model of the software testing process
[Figure: design test cases, prepare test data, run the program with the test data, compare results to the test cases]
Stages of testing
Development testing, where the system is tested during
development to discover bugs and defects.
Release testing, where a separate testing team test a complete
version of the system before it is released to users.
User testing, where users or potential users of a system test the
system in their own environment.
Test Levels
• Component (unit) testing: units in isolation
• Integration testing: interaction between units
• System testing: system-level properties
• Acceptance testing: focus on user needs
Test levels correspond to the right leg of the V model.
Graham, D., Black, R., & Van Veenendaal, E. (2019). Foundations of software testing. Cengage.
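A minimal illustration of the lowest level, component (unit) testing; `Cart` is an invented component, exercised in isolation from any other unit:

```python
# Invented component under test.
class Cart:
    def __init__(self):
        self.items = []

    def add(self, price):
        if price < 0:
            raise ValueError("price must be non-negative")
        self.items.append(price)

    def total(self):
        return sum(self.items)

# Unit tests: exercise the component on its own, no other units involved.
cart = Cart()
cart.add(2.50)
cart.add(1.25)
assert cart.total() == 3.75

try:
    Cart().add(-1)
    raise AssertionError("negative price accepted")
except ValueError:
    pass

print("unit tests passed")
```

Integration testing would then check how `Cart` interacts with, say, a checkout or payment unit; those interactions are deliberately out of scope here.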
Systems Thinking / Russ Ackoff
• The defining properties of a system are properties of the whole which
none of its parts have
• A system is not the sum of the behavior of its parts, it is a product of
their interactions
• The performance of a system depends on how the parts fit, not how
they act taken separately
https://embeddedartistry.com/blog/2019/2/5/beyond-continual-improvement
Test Types
Group of test activities
Aimed at testing a component or system
Focused on a specific test objective
Test Types
• Testing of Function
• Functional testing, black box testing
• Testing of software product characteristics
• Non-functional testing
• Testing of software structure / architecture
• Structural testing, white box testing
• Testing related to changes
• Confirmation vs Regression testing
Regression Testing
• Testing of a previously tested program, following modification
• To ensure that defects have not been introduced in unchanged areas of the software as a result of the changes made
• Performed when the software or its environment is changed
• Continuous delivery? Automate regression testing
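A minimal sketch of an automated regression check; `slugify` and its test cases are invented. The point is that the same known-good cases are re-run after every change, guarding behaviour in unchanged areas:

```python
# Function under test (illustrative).
def slugify(title):
    return title.strip().lower().replace(" ", "-")

# These cases passed before the latest change; re-running them after
# each modification catches regressions in previously working behaviour.
REGRESSION_CASES = [
    ("Hello World", "hello-world"),
    ("  Trimmed  ", "trimmed"),
    ("already-slugged", "already-slugged"),
]

def run_regression_suite():
    for given, expected in REGRESSION_CASES:
        assert slugify(given) == expected, f"regression on {given!r}"
    return True

print(run_regression_suite())  # prints True when no regression is found
```

In a continuous-delivery pipeline a suite like this runs on every commit, which is only practical because it is automated.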
Testing As Software Evolves
• The norm in software development!
• Many updates per day in modern web apps
• Modern cars (over-the-air updates)
• Formula 1 (every two weeks)
• Aircraft (737 MAX compatibility)
• Dependency and compatibility management are key
• Direct test efforts to changed software
• Confirm change works as expected, without regressions.
• Automated regression testing where possible
Maintenance Testing
• Testing after a system is stable and deployed
• Test changes to an operational system
• Test the impact of a changed environment on an operational system
(e.g. security updates of libraries used)
• Impact analysis:
• Determine which parts are affected by change
• Conduct regression testing for those.
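Impact analysis can be sketched as a reachability query over a dependency map; the modules and their dependencies below are invented:

```python
# Toy dependency map: each module lists the modules it depends on.
DEPENDS_ON = {
    "ui": ["api"],
    "api": ["auth", "db"],
    "auth": ["db"],
    "reports": ["db"],
}

def affected_by(changed):
    # Modules that directly or transitively depend on the changed module
    # are candidates for regression testing.
    hit, frontier = set(), {changed}
    while frontier:
        frontier = {m for m, deps in DEPENDS_ON.items()
                    if frontier & set(deps)} - hit
        hit |= frontier
    return hit

print(sorted(affected_by("db")))   # prints ['api', 'auth', 'reports', 'ui']
```

A change to `db` here ripples up to everything, while a change to `ui` affects nothing else, so its regression scope would be much smaller.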
Key Points
• Software development follows life cycle activities
• The V-Model helps reason about verification vs validation,
decomposition vs composition, and construction vs testing.
• We test at different levels
• Different test types target specific test objectives
• Iterative projects do continuous regression testing and thus maximize
test automation