
Software Requirements Engineering
Lecture 8
Introduction to Requirements Analysis and Specification
Today’s Agenda
 Introduction to Requirements Analysis
 Structuring Requirements
 Modeling

 Anatomy of a Good / Bad User Requirement


 Standard for Writing a Requirement
 Writing Pitfalls to Avoid
 A Few Simple Tests…
What is Requirements Analysis
 The process of studying and analyzing the customer and user needs to arrive at a definition of the problem domain and the system requirements
 Objectives
 Discover the boundaries of the new system (or software) and how it must interact with its environment within the
new problem domain
 Detect and resolve conflicts between (user) requirements
 Negotiate priorities of stakeholders
 Prioritize and triage requirements
 Elaborate system requirements, defined in the requirement specification document, such that managers can give
realistic project estimates and developers can design, implement, and test
 Classify requirements information into various categories and allocate requirements to sub-systems
 Evaluate requirements for desirable qualities

4
Questions
 We have seen how to specify requirements in terms of structure, standards, and writing rules, but:
 How to identify the real problems to solve in the elicitation results?
 How to detect conflicting aspects?
 How to negotiate to resolve conflicts?
 How to decide what is important and a priority?
 How to ensure that nothing is forgotten?
 How to validate that the findings of the analysis are good?
 How to use models in this context?

5
How to Find the Real Problems?
 Ask: Why?
 Root cause analysis
 Determine (recursively) the factors that contribute to the problem(s) found by stakeholders

 The causes do not all have the same impact nor the same weight
 Some may perhaps not deserve to be corrected, at least for the moment
 Goal-oriented modeling can help understand these causes and their relationships

 This analysis identifies problems that need to be solved

6
What is Requirements Specification?
 The invention and definition of the behavior of a new system (solution domain) such that it will produce the required effects in the problem domain
 Start from a knowledge of the problem domain and the required effects determined by elicitation and analysis
 Generally involves modeling
 Results in the specification document
 Precise expression of requirements, may include models

[Figure: the specification is a description of the solution (system) needed to satisfy the problem domain; the system sits at the interface with the problem domain, and its properties are realized by hardware and software.]

7
Requirements Analysis
 Problem analysis
 Development of product vision and project scope
 Analysis and elicitation feed each other
 Analysis goes hand-in-hand with modeling

[Figure: elicitation produces elicitation notes that feed analysis; analysis sends questions and points to consider back to elicitation; analysis results flow into the requirements specification.]

8
Requirements Modeling
 Elicitation/analysis and modeling are intermixed

Source: http://www.telelogic.com/download/paper/SystemsEngineeringSandwich.pdf
9
Requirements Modeling
 This is an essential task in specifying requirements
 Map elements obtained by elicitation to a more precise form
 Help better understand the problem
 Help find what is missing or needs further discussion

 Different modeling languages
 Informal: natural language
 Goal-oriented modeling (GRL)
 Functional modeling:
 UML (Unified Modeling Language)
 SDL (Specification and Description Language)
 Logic, e.g. Z, VDM (Vienna Development Method)
 UCM (Use Case Maps)
 ...

10
Requirements Verification and Validation
 Need to be performed at every stage during the (requirements) process
 Elicitation
 Checking back with the elicitation sources
 “So, are you saying that . . . . . ?”
 Analysis
 Checking that the domain description and requirements are correct
 Specification
 Checking that the defined system requirement will meet the user requirements under the
assumptions of the domain/environment
 Checking conformity to well-formedness rules, standards…

11
Requirements Classification
 In order to better understand and manage the large number of requirements, it is important to organize them in logical clusters
 It is possible to classify the requirements by the following categories (or any other clustering that appears to be convenient)
 Features
 Use cases
 Mode of operation
 User class
 Responsible subsystem

 This makes it easier to understand the intended capabilities of the product

 It is also more effective to manage and prioritize large groups rather than single requirements

12
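As a small illustration of such clustering, here is a minimal Python sketch; the requirement IDs and category names are invented for the example, and the grouping key could just as well be a use case, mode of operation, user class, or subsystem:

```python
from collections import defaultdict

# Hypothetical mini-catalogue of requirements, each tagged with a
# classification category (here: the feature it belongs to).
requirements = [
    {"id": "R1", "text": "Place a meal order", "category": "Order Meals"},
    {"id": "R2", "text": "Cancel a meal order", "category": "Order Meals"},
    {"id": "R3", "text": "Check spelling of menu text", "category": "Spell Check"},
]

def cluster(reqs, key="category"):
    """Group requirement IDs into logical clusters by the given key."""
    clusters = defaultdict(list)
    for r in reqs:
        clusters[r[key]].append(r["id"])
    return dict(clusters)

print(cluster(requirements))
# {'Order Meals': ['R1', 'R2'], 'Spell Check': ['R3']}
```

Changing the `key` argument re-clusters the same catalogue along a different dimension, which is exactly the point of keeping the classification separate from the requirement text.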
Requirements Classification – Features
 A Feature is
 a set of logically related (functional) requirements that provides a capability to the user
and enables the satisfaction of a business objective

 The description of a feature should include1


 Name of feature (e.g. Spell check)
 Description and Priority
 Stimulus/response sequences
 List of associated functional requirements

[1] Source: K.E. Wiegers


13
Requirements Classification – Feature
Example (1)
 3.1 Order Meals
 3.1.1 Description and Priority
 A cafeteria Patron whose identity has been verified may order meals either to be delivered to a specified company location or to be picked up in the
cafeteria. A Patron may cancel or change a meal order if it has not yet been prepared.
 Priority = High.
 3.1.2 Stimulus/Response Sequences
 Stimulus: Patron requests to place an order for one or more meals.
 Response: System queries Patron for details of meal(s), payment, and delivery instructions.
 Stimulus: Patron requests to change a meal order.
 Response: If status is “Accepted,” system allows user to edit a previous meal order.
 Stimulus: Patron requests to cancel a meal order.
 Response: If status is “Accepted,” system cancels a meal order.
 3.1.3 Functional Requirements
 3.1.3.1. The system shall let a Patron who is logged into the Cafeteria Ordering System place an order for one or more meals.
 3.1.3.2. The system shall confirm that the Patron is registered for payroll deduction to place an order.
 ......

Source: K.E. Wiegers


14
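Wiegers' feature template maps naturally onto a record type. The following is an illustrative Python sketch only (the class and field names are invented here), capturing the four parts of a feature description:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Feature:
    """One feature description, following Wiegers' template."""
    name: str
    description: str
    priority: str
    # (stimulus, response) pairs
    stimulus_response: List[Tuple[str, str]] = field(default_factory=list)
    # IDs of associated functional requirements
    functional_requirements: List[str] = field(default_factory=list)

order_meals = Feature(
    name="Order Meals",
    description="A verified Patron may order, change, or cancel meals.",
    priority="High",
    stimulus_response=[
        ("Patron requests to place an order for one or more meals",
         "System queries Patron for meal, payment, and delivery details"),
    ],
    functional_requirements=["3.1.3.1", "3.1.3.2"],
)
print(order_meals.priority)  # High
```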
Models
 According to B. Selic, a model is a reduced representation (simplified, abstract) of (one aspect of) a system used to:
 Help understand complex problems and / or solutions
 Communicate information about the problem / solution
 Direct implementation

 Qualities of a good model


 Abstract
 Understandable
 Accurate
 Predictive
 Inexpensive

15
Modeling Notations
 Natural language (English)  Semi-formal notation (URN, UML...)
 Syntax (graphics) well defined
 No special training required
 Partial common understanding, reasonably easy to
 Ambiguous, verbose, vague, obscure ...
learn
 No automation  Partial automation
 Meaning only defined informally
 Ad hoc notation (bubbles and arrows)
 Still a risk of ambiguities
 No special training required
 Formalnotation (Logic, SDL, Petri nets,
 No syntax formally defined
FSM ...)
 meaning not clear, ambiguous  Syntax & semantics defined
 No automation  Great automation (analysis and transformations)
 More difficult to learn & understand

16
Modeling notations (2)
 Informal language is better understood by all stakeholders
 Good for user requirements, contracts
 But natural language lacks precision
 Possibility for ambiguities
 Lack of tool support

 Formal languages are more precise


 Fewer possibilities for ambiguities
 Offer tool support (e.g. automated verification and transformation)
 Intended for developers

Source (for decision table): Easterbrook and Callahan, 1997


17
Modeling Structure
 Concepts of Entities and their Relationships. Use one of the following notations:
 ERD (Entity Relationship Diagram – the traditional version)
 UML class diagrams
 Relational tables

 Can be used for the following


 Model of the problem domain (called “domain model”)
 The two versions: existing and to-be
 Model of input and output data structures of system-to-be
 Model of the stored data (database)
 not necessarily an image of the domain data
 Additional data is introduced (e.g. user preferences)
 Architectural design of the system-to-be

18
Modeling inputs and outputs
 Nature of inputs and outputs:
 IO related to problem (problem data)
 Additional data related to solution (solution data)
 E.g., prompts, user options, error messages…
 Collected in Data Dictionary using
 Plain text (natural language)
 EBNF
 Code-like notations
 Logic (e.g., VDM)
 Structure charts
…

 Graphical output (screens, forms)


 Iconic (representational) drawings, prototype screens or forms, printouts produced by operational prototype

19
Modeling Dynamic Behavior
 Behavior modeling techniques
 Text (plain, function statements, use cases)
 Decision tables
 Activity Diagrams / Use Case Maps
 Finite state machines
 Simple state machines (FSM) : use state diagrams or transition table notation
 Extended state machines (e.g. UML State Machines – including SDL)
 Harel’s State Charts (concepts included in UML notation)
 Petri nets (allow for flexible concurrency, e.g. for data flow, similar to Activity Diagrams)
 Logic (e.g. Z, VDM) for describing input-output assertions and possibly the relationship to an internal object state that is updated by operations
 It is important to choose what best suits the problem

20
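As a minimal sketch of the simple-FSM option above, a state machine can be written directly as a transition table; the lamp-like states and events here are illustrative only:

```python
# A simple finite state machine as a transition table:
# (current_state, event) -> next_state.
TRANSITIONS = {
    ("off", "lever_down"): "on",
    ("on", "lever_up"): "off",
}

def step(state, event):
    """Return the next state; stay in place on events with no transition."""
    return TRANSITIONS.get((state, event), state)

state = "off"
for event in ["lever_down", "lever_down", "lever_up"]:
    state = step(state, event)
print(state)  # off
```

The same table can be read as the transition-table notation mentioned above; extended state machines add variables and guards on top of this basic structure.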
Model Analysis
 By construction
 We learn by questioning and describing the system
 By inspection
 Execute/analyze the model in our minds
 Reliable?

 By formal analysis
 Requires formal semantics (mathematical) and tools
 Reliable (in theory), but expensive (for certain modeling approaches)

 By testing
 Execution, simulation, animation, test...
 Requires well-defined semantics and execution/simulation tools
 More reliable than inspection for certain aspects
 Possible to interact directly with the model (prototype)

21
Typical Modeling Approaches
 Many approaches involve modeling to get a global picture of the requirements
 Structured Analysis (1970)
 Object-Oriented Analysis (1990)
 Problem Frames (1995)
 State Machine-Based Analysis
 Conflict Analysis
 E.g. with mis-use cases or with GRL/UCM models and strategies/scenarios
 It is important to distinguish between
 Notation used for defining the model
 Process defining a sequence of activities leading to a desired model

 Note: Analysis can be on individual requirements as well


 Remember tips and tricks on how to write better requirements

22
Requirements Specification Document (1)
 Clearly and accurately describes each of the essential requirements (functions, performance,
design constraints, and quality attributes) of the system / software and its external interfaces
 Defines the scope and boundaries of the system / software

 Each requirement must be described in such a way that it is feasible and objectively verifiable by
a prescribed method (e.g., by inspection, demonstration, analysis, or test)

 Basis for contractual agreements between contractors or suppliers and customers

 Elaborated from elicitation notes

23
Requirements Specification Document (2)
 Specifications are intended for a diverse audience
 Customers and users for validation, contract, ...
 Systems (requirements) analysts
 Developers, programmers to implement the system
 Testers to check that the requirements have been met
 Project Managers to measure and control the project

 Different levels of detail and formality are needed for each audience

 Different templates for requirements specifications


 e.g. IEEE 830

24
Example Specification (1)

[Figure: appearance of a lamp unit with a switch lever; a 12 cm dimension is indicated. The requirements below each relate a causal input (lever movement), a timing constraint, and an output (lamp state).]

 When the switch lever is moved down, then, within 0.1 seconds, the lamp illuminates.

 When the switch lever is moved up, then, within 0.2 seconds, the lamp goes out.

Source: Bray 2004

25
Example Specification (2)
 Extract from the requirements specification
 R1: The system shall provide illumination of at least 500 candela.
 R2: The system shall fit within a cube with maximum width of 15cm.
 R3: The illumination can be switched on and off by a human operator.
 R4: The system shall respond to operator input within 0.5 seconds.
 R5: The system shall have a built-in power supply which should be capable of maintaining
continuous illumination for at least 4 hours.
 etc . . . . . . .

 Several alternative designs could satisfy these requirements


Source: Bray 2004
26
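One way to keep each requirement objectively verifiable, as called for earlier, is to record its prescribed verification method alongside it. This is only an illustrative sketch; the R-numbers come from the extract above, and the method tags are an assumption of this example:

```python
# Each requirement pairs its text with a prescribed verification
# method (inspection, demonstration, analysis, or test).
requirements = [
    ("R1", "Illumination of at least 500 candela", "test"),
    ("R2", "Fits within a cube with maximum width of 15 cm", "inspection"),
    ("R4", "Responds to operator input within 0.5 seconds", "test"),
]

def unverifiable(reqs):
    """Flag requirements that lack a recognized verification method."""
    allowed = {"inspection", "demonstration", "analysis", "test"}
    return [rid for rid, _, method in reqs if method not in allowed]

print(unverifiable(requirements))  # []
```

An empty result means every listed requirement has a prescribed, recognized verification method; anything flagged would need rewording or a defined method before it can serve as a contractual basis.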
IEEE 830-1998 Standard
 Title of Standard
 « IEEE Recommended Practice for Software Requirements Specifications »

 Describes the content and qualities of a good software requirements


specification (SRS)

 Presents several sample SRS outlines


27
IEEE 830-1998 Standard – Objectives
 Help software customers to accurately describe what they wish to obtain

 Help software suppliers to understand exactly what the customer wants

 Help participants to:


 Develop a template (format and content) for the software requirements specification (SRS)
in their own organizations
 Develop additional documents such as SRS quality checklists or an SRS writer’s handbook

28
IEEE 830-1998 Standard – Benefits
 Establish the basis for agreement between the customers and the suppliers on what the software
product is to do

 Reduce the development effort


 Forced to consider requirements early  reduces later redesign, recoding, retesting
 Provide a basis for realistic estimates of costs and schedules

 Provide a basis for validation and verification


 Facilitate transfer of the software product to new users or new machines
 Serve as a basis for enhancement requests

29
IEEE 830-1998 Standard – Considerations
 Section 4 of IEEE 830 (how to produce a good SRS)
 Nature (goals) of SRS
 Functionality, interfaces, performance, qualities, design constraints
 Environment of the SRS
 Where does it fit in the overall project hierarchy
 Characteristics of a good SRS
 Generalization of the characteristics of good requirements to the document
 Evolution of the SRS
 Implies a change management process
 Prototyping
 Helps elicit software requirements and reach closure on the SRS
 Including design and project requirements in the SRS
 Focus on external behavior and the product, not the design and the production process (describe in a separate document)

30
IEEE 830-1998 Standard – Structure of the
SRS
 Section 5 of IEEE 830

 Contents of SRS
 Introduction
 General description of the software product
 Specific requirements (detailed)
 Additional information such as appendixes and index, if necessary

31
IEEE 830-1998 Standard – Section 1 of SRS
 Title
 Table of Contents
 1. Introduction
 1.1 Purpose
• Describe the purpose of this SRS
• Describe the intended audience
 1.2 Scope
• Identify the software product
• Enumerate what the system will and will not do
• Describe user classes and the benefits for each
 1.3 Definitions, Acronyms, and Abbreviations
• Define the vocabulary of the SRS (may reference appendix)
 1.4 References
• List all referenced documents including sources (e.g., Use Case Model and Problem Statement; experts in the field)
 1.5 Overview
• Describe the content of the rest of the SRS
• Describe how the SRS is organized
 2. Overall Description
 3. Specific Requirements
 Appendices
 Index
IEEE 830-1998 Standard – Section 2 of SRS
 Title
 Table of Contents
 1. Introduction
 2. Overall Description
 2.1 Product Perspective
• Present the business case and operational concept of the system
• Describe how the proposed system fits into the business context
• Describe external interfaces: system, user, hardware, software, communication
• Describe constraints: memory, operational, site adaptation
 2.2 Product Functions
• Summarize the major functional capabilities
• Include the Use Case Diagram and supporting narrative (identify actors and use cases)
• Include a Data Flow Diagram if appropriate
 2.3 User Characteristics
• Describe and justify the technical skills and capabilities of each user class
 2.4 Constraints
• Describe other constraints that will limit the developer’s options; e.g., regulatory policies; target platform, database, network software and protocols, development standards requirements
 2.5 Assumptions and Dependencies
 3. Specific Requirements
 4. Appendices
 5. Index
IEEE 830-1998 Standard – Section 3 of SRS (1)
• Specify software requirements in sufficient detail to enable designers to design a system that satisfies those requirements, and testers to verify them
• State requirements that are externally perceivable by users, operators, or externally connected systems
• Requirements should include, at a minimum, a description of every input (stimulus) into the system, every output (response) from the system, and all functions performed by the system in response to an input or in support of an output
• (a) Requirements should have the characteristics of high-quality requirements
• (b) Requirements should be cross-referenced to their source
• (c) Requirements should be uniquely identifiable
• (d) Requirements should be organized to maximize readability

 …
 1. Introduction
 2. Overall Description
 3. Specific Requirements
 3.1 External Interfaces
 3.2 Functions
 3.3 Performance Requirements
 3.4 Logical Database Requirements
 3.5 Design Constraints
 3.6 Software System Quality Attributes
 3.7 Object Oriented Models
 4. Appendices
 5. Index
IEEE 830-1998 Standard – Section 3 of SRS (2)
 …
 1. Introduction
 2. Overall Description
 3. Specific Requirements
 3.1 External Interfaces
• Detail all inputs and outputs (complement, not duplicate, information presented in section 2)
• Examples: GUI screens, file formats
 3.2 Functions
• Include detailed specifications of each use case, including collaboration and other diagrams useful for this purpose
 3.3 Performance Requirements
 3.4 Logical Database Requirements
• Include: a) Types of information used; b) Data entities and their relationships
 3.5 Design Constraints
• Should include: a) Standards compliance; b) Accounting & auditing procedures
 3.6 Software System Quality Attributes
 3.7 Object Oriented Models
• The main body of requirements, organized in a variety of possible ways: a) Architecture Specification; b) Class Diagram; c) State and Collaboration Diagrams; d) Activity Diagram (concurrent/distributed)
 4. Appendices
 5. Index
IEEE 830-1998 Standard – Templates
 Annex A of IEEE 830

 Section 3 (Specific Requirements) may be organized in many different ways based on


 Modes
 User classes
 Concepts (object/class)
 Features
 Stimuli
 Organizations

36
Relationship of IEEE 830 and ISO/IEC
12207 (1)
 12207
 Common framework for « Software life cycle processes »
 ISO/IEC 12207 = IEEE/EIA 12207

 IEEE 830-1998 and IEEE/EIA 12207.1-1997 both place requirements on documents


describing software requirements
 Annex B of IEEE 830 explains the relationship between the two sets of requirements for
those who want to produce documents that comply with both standards simultaneously
 Such compliance may be required by customers when requesting proposals or issuing calls for
tender

37
Relationship of IEEE 830 and ISO/IEC
12207 (2)

 Note: Table B.3 is more detailed and shows the correspondence between the two standards at the level of
requirements types

38
Writing Better Requirements
DRAFT
 The greatest challenge to any thinker is stating the problem
in a way that will allow a solution.1
[1] Bertrand Russell, 1872-1970
41

Anatomy of a Good User Requirement

The Online Banking System shall allow the Internet user to access her current account balance in less than 5 seconds.

• “The Online Banking System” – defines the system under discussion
• “shall” – verb with correct identifier (shall or may)
• “allow the Internet user to access her current account balance” – defines a positive end result
• “in less than 5 seconds” – quality criteria

 Identifies the system under discussion and a desired end result that is wanted within a specified time that is measurable

 The challenge is to seek out the system under discussion, end result, and success measure in every requirement
42
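Parts of this anatomy can be mechanized as a rough screening aid. The heuristics below are a sketch only; the regexes are deliberately simplistic (e.g., any digit counts as a "measure", and only shall/may are checked as identifiers):

```python
import re

def anatomy_check(req):
    """Crude heuristic checks for the anatomy of a user requirement:
    a shall/may identifier and a measurable (numeric) criterion."""
    return {
        "has_identifier": bool(re.search(r"\b(shall|may)\b", req)),
        "has_measure": bool(re.search(r"\d", req)),  # crude: any number
    }

good = ("The Online Banking System shall allow the Internet user "
        "to access her current account balance in less than 5 seconds.")
bad = "The Internet User quickly sees her current account balance."

print(anatomy_check(good))  # {'has_identifier': True, 'has_measure': True}
print(anatomy_check(bad))   # {'has_identifier': False, 'has_measure': False}
```

Such a check cannot judge whether the end result is the right one, or whether the sentence names the system rather than the user; it only flags requirements worth a closer human look.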

Example of a Bad User Requirement

The Internet User quickly sees her current account balance on the laptop screen. X

• “The Internet User” – cannot write a requirement on the user
• “sees” – no identifier for the verb
• “quickly” – vague quality criteria
• “on the laptop screen” – what versus how
43

Standard for Writing a Requirement


 Each requirement must first form a complete sentence
 Not a bullet list of buzzwords, list of acronyms, or sound bites on a slide

 Each requirement contains a subject and predicate


 Subject: a user type (watch out!) or the system under discussion
 Predicate: a condition, action, or intended result
 Verb in predicate: “shall” / “will” / “must” to show mandatory nature; “may” / “should” to show optionality

 The whole requirement provides the specifics of a desired end goal or result
 Contains a success criterion or other measurable indication of the quality
44

Standard for Writing a Requirement


 Look for the following characteristics in each requirement
 Feasible (not wishful thinking)
 Needed (provides the specifics of a desired end goal or result)
 Testable (contains a success criterion/other measurable indication of quality)
 Clear, unambiguous, precise, one thought
 Prioritized
 ID

 Note: several characteristics are mandatory (answer a need, verifiable, satisfiable)


whereas others improve communication
45

Writing Pitfalls to Avoid


 Never describe how the system is going to achieve something (over-specification),
always describe what the system is supposed to do
 Refrain from designing the system
 Danger signs: using names of components, materials, software objects, fields & records in the user or
system requirements
 Designing the system too early may possibly increase system costs
 Do not mix different kinds of requirements (e.g., requirements for users, system, and how the
system should be designed, tested, or installed)
 Do not mix different requirements levels (e.g., the system and subsystems)
 Danger signs: high level requirements mixed in with database design, software terms, or very technical
terms
46

Writing Pitfalls to Avoid


The system shall use Microsoft Outlook to send an email to the customer with the purchase confirmation. X

The system shall inform the customer that the purchase is confirmed.

 “What versus how” test
47

Writing Pitfalls to Avoid


 Never build in let-out or escape clauses
 Requirements with let-outs or escapes are dangerous because of problems that arise in testing
 Danger signs: if, but, when, except, unless, although
 These terms may however be useful when the description of a general case with exceptions is much clearer and
more complete than an enumeration of specific cases
 Avoid ambiguity
 Write as clearly and explicitly as possible
 Ambiguities can be caused by:
 The word “or” used to create a compound requirement
 Poor definition (giving only examples or special cases)
 The words “etc.”, “and so on” (imprecise definition)
48

Writing Pitfalls to Avoid


 Do not use vague indefinable terms
 Many words used informally to indicate quality are too vague to be verified
 Danger signs: user-friendly, highly versatile, flexible, to the maximum extent, approximately, as much as possible, minimal impact

The Easy Entry System shall be easy to use and require a minimum of training except for the professional mode. X
49

Writing Pitfalls to Avoid


 Do not make multiple requirements
 Keep each requirement as a single sentence
 Conjunctions (words that join sentences together) are danger signs: and, or, with, also
 Do not ramble
 Long sentences with arcane language
 References to unreachable documents

The Easy Entry Navigator module shall consist of order entry and communications, order processing, result processing, and reporting. The Order Entry module shall be integrated with the Organization Intranet System, and results are stored in the group’s electronic customer record. X
50

Writing Pitfalls to Avoid


 Do not speculate
 There is no room for “wish lists” – general terms about things that somebody probably wants
 Danger signs: vague subject type and generalization words such as usually, generally, often,
normally, typically
 Do not express suggestions or possibilities
 Suggestions that are not explicitly stated as requirements are invariably ignored by developers
 Danger signs: may, might, should, ought, could, perhaps, probably

 Avoid wishful thinking
 Wishful thinking means asking for the impossible (e.g., 100% reliable, safe, handles all failures, fully upgradeable, runs on all platforms)

The Easy Entry System may be fully adaptable to all situations and often require no reconfiguration by the user. X
51
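The danger signs from these slides can be collected into a crude lexical scanner. This sketch is heuristic only: the matching is naive whitespace matching, the category names are invented for the example, and a human still has to judge each hit (e.g., a justified exception clause is allowed, as noted above):

```python
# Danger-sign word lists drawn from the writing pitfalls above.
DANGER_SIGNS = {
    "escape clause": ["if", "but", "except", "unless", "although"],
    "vague term": ["user-friendly", "flexible", "approximately",
                   "as much as possible", "minimal impact"],
    "multiple requirements": ["and", "or", "with", "also"],
    "speculation": ["usually", "generally", "often", "typically",
                    "might", "perhaps", "probably"],
}

def scan(req):
    """Return the pitfall categories whose danger signs appear in req."""
    padded = f" {req.lower()} "
    hits = []
    for category, signs in DANGER_SIGNS.items():
        if any(f" {s} " in padded for s in signs):
            hits.append(category)
    return hits

print(scan("The system shall be easy to use and require minimal impact"))
# ['vague term', 'multiple requirements']
```

A clean, measurable requirement such as "The system shall confirm the order within 2 seconds." produces no hits, while wordy compound sentences light up several categories at once.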

A Few Simple Tests…(1)


 “What versus how” test discussed earlier
 Example: a requirement may specify an ordinary differential equation that must be
solved, but it should not mention that a fourth order Runge-Kutta method should be
employed

 “What is ruled out” test


 Does the requirement actually make a decision (if no alternatives are ruled out, then
no decision has really been made)
 Example: a requirement may be already covered by a more general one
Source: Spencer Smith, McMaster U.
52

A Few Simple Tests…(2)

 “Negation” test
 If the negation of a requirement represents a position that someone might argue for, then the original
decision is likely to be meaningful

The software shall be reliable. X

 “Find the test” test
 The requirement is problematic if no test can be found or the requirement can be tested with a test that
does not make sense

The car shall have an engine. X (Test: look, here it is!)

Source: Spencer Smith, McMaster U.
53
Rate these Requirements

The Order Entry system provides for quick, user-friendly and efficient entry and processing of all orders. X

Invoices, acknowledgments, and shipping notices shall be automatically faxed during the night, so customers can get them first thing in the morning. X

Changing report layouts, invoices, labels, and form letters shall be accomplished. X

The system shall be upgraded in one whack. X

The system has a goal that as much of the IS data as possible be pulled directly from the T&M estimate. X
Towards Good Requirements Specifications (1)
 Valid (or “correct”)
 Expresses actual requirements
 Complete (satisfiable)
 Specifies all the things the system must do (including contingencies)
 ...and all the things it must not do!
 Conceptual completeness (e.g., responses to all classes of input)
 Structural completeness (e.g., no TBDs!!!)
 Consistent
 Doesn’t contradict itself
 Uses all terms consistently
 Note: inconsistency can be hard to detect, especially in concurrency/timing aspects and condition logic
 Formal modeling can help
 Beneficial
 Has benefits that outweigh the costs of development

Source: Adapted from Blum 1992, pp164-5 and IEEE-STD-830-1993

54
Towards Good Requirements Specifications (2)
 Necessary
 Doesn’t contain anything that isn’t “required”
 Unambiguous
 Every statement can be read in exactly one way
 Clearly defines confusing terms (e.g., in a glossary)
 Uniquely identifiable
 For traceability and version control
 Verifiable
 A process exists to test satisfaction of each requirement
 “Every requirement is specified behaviorally”
 Understandable (clear)
 E.g., by non-computer specialists
 Modifiable
 Must be kept up to date!

Source: Adapted from Blum 1992, pp164-5 and IEEE-STD-830-1993

55
Typical Mistakes
 Noise = the presence of text that carries no relevant information to any feature of the problem
 Silence = a feature that is not covered by any text
 Over-specification = text that describes a feature of the solution, rather than the problem
 Contradiction = text that defines a single feature in a number of incompatible ways
 Ambiguity = text that can be interpreted in at least two different ways
 Forward reference = text that refers to a feature yet to be defined
 Wishful thinking = text that defines a feature that cannot possibly be validated
 Jigsaw puzzles = e.g., distributing requirements across a document and then cross-referencing
 Inconsistent terminology = inventing and then changing terminology
 Putting the onus on the development staff = making the reader work hard to decipher the intent
 Writing for the hostile reader (fewer of these exist than friendly ones)

Source: Steve Easterbrook, U. of Toronto

56

Key Questions and Characteristics


• Remember the key questions “Why?” or
“What is the purpose of this?”

• Feasible

• Needed

• Testable
Measures and Metrics
59

Non-Functional Requirements
 Non-Functional Requirements and Software Quality Attributes
 Software Quality
 Classifications of Non-Functional Requirements
 Quality Measures

 To measure is to know. If you can not measure it, you can not improve it. 1
[1] Lord Kelvin (1824 - 1907)
60

Software Quality (1)


 Most definitions require compliance with requirements

 “Conformance to explicitly stated functional and performance requirements,


explicitly documented development standards, and implicit characteristics that are
expected of all professionally developed software.”1

 Implication:
 We need to be able to explicitly quantify requirements and verify that any solution meets them
 We need measures

[1] Pressman, 1997


61

Software Quality (2)


 An interesting phenomenon:

Measurable objectives are usually achieved!

 Therefore, unless you have unrealistic values, requirements are usually met
 Important to know what measures exist!
 The chosen values, however, will have an impact on the amount of work during
development as well as the number of alternatives and architectural designs from which
developers may choose to meet the requirements
62

Quantification
 Non-functional requirements need to be measurable
 Avoid subjective characterization: good, optimal, better...

 Values are not just randomly specified


 Must have a rationale
 Stakeholder must understand trade-offs
 Important to rank and prioritize the requirements

 Precise numbers are unlikely to be known at the beginning of the requirement process
 Do not slow down your initial elicitation process
 Ensure that quality attributes are identified
 Negotiate precise values later during the process
63

Measures vs. Metrics


 We use measures in a generic way but there is actually a distinction
between measures and metrics

 For example, consider reliability


 Metric: mean time between failures
 Measure: number of failures in a period of time (an observation!)
 Note: Reading the Wikipedia text on software and performance metrics, one gets the impression that "metric" and "measure" mean the same thing. To define a measure, you have to define WHAT you measure (that is, the quality), the metric units used with the measurement values, and HOW you measure it. For the reliability example above, the first line defines the quality (WHAT?) and the second line defines a measurement method (HOW?). – G.v. B.
64

Some Relationships
[Diagram: a collection of qualities; each quality (WHAT?) is given a required value (with unit) and a measurement method (HOW?)]

Source: D. Firesmith, http://www.jot.fm/issues/issue_2003_09/column6/


65

Performance Measures (1)


 Lots of measures
 Response time, number of events processed/denied in some interval of time, throughput, capacity,
usage ratio, jitter, loss of information, latency...
 Usually with probabilities, confidence interval

 Can be modeled and simulated (mainly at the architectural level) – performance


prediction
 Queuing model (LQN), process algebra, stochastic Petri nets
 Arrival rates, distributions of service requests
 Sensitivity analysis, scalability analysis
66

Performance Measures (2)


 Examples of performance requirements
 The system shall be able to process 100 payment transactions per second in
peak load.
 In standard workload, the CPU usage shall be less than 50%, leaving 50% for
background jobs.
 Production of a simple report shall take less than 20 seconds for 95% of the
cases.
 Scrolling one page up or down in a 200 page document shall take at most 1
second.
67
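A percentile-based requirement like "less than 20 seconds for 95% of the cases" can be checked directly against measured response times. The sketch below is illustrative only; the helper names and thresholds are assumptions, not part of the slides.

```python
# Sketch: verifying a requirement of the form
# "production of a report shall take < 20 s for 95% of the cases"
# against a set of measured latencies (in seconds).

def percentile(samples, p):
    """Return the p-th percentile (0 <= p <= 100) by linear interpolation."""
    ordered = sorted(samples)
    k = (len(ordered) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(ordered) - 1)
    return ordered[lo] + (ordered[hi] - ordered[lo]) * (k - lo)

def meets_requirement(latencies_s, limit_s=20.0, p=95):
    """True if the p-th percentile of the measurements is below the limit."""
    return percentile(latencies_s, p) < limit_s
```

For example, a run with 19 fast reports and one 30-second outlier still meets a 95th-percentile limit of 20 seconds, while a run where every report takes 25 seconds does not.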

Reliability Measures (1)


 Measure degree to which the system performs as required
 Includes resistance to failure
 Ability to perform a required function under stated conditions for a specified period of time
 Very important for critical, continuous, or scientific systems

 Can be measured using


 Probability that system will perform its required function for a specified interval under stated conditions
 Mean-time to failure
 Defect rate
 Degree of precision for computations
68
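The first measure above, the probability that the system performs its function for a specified interval, can be sketched under the common simplifying assumption of a constant failure rate (exponential model); the model choice and function names here are assumptions, not from the slides.

```python
import math

def reliability(t_hours, mttf_hours):
    """Probability of running t hours without failure, assuming a constant
    failure rate (exponential model): R(t) = exp(-t / MTTF)."""
    return math.exp(-t_hours / mttf_hours)

def mttf_estimate(operating_hours, failures):
    """Simple point estimate of mean time to failure from observed data."""
    return operating_hours / failures

# E.g., 5 failures over 5000 operating hours give an estimated MTTF of
# 1000 hours; the chance of surviving a 1000-hour mission is then e^-1.
```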

Reliability Measures (2)


 Examples
 The precision of calculations shall be at least 1/10⁶.
 The system defect rate shall be less than 1 failure per 1000 hours of
operation.
 No more than 1 per 1000000 transactions shall result in a failure
requiring a system restart.
69

Availability Measures (1)


 Definition: Percentage of time that the system is up and running correctly
 Can be calculated based on Mean-Time Between Failures (MTBF) and Mean-Time to Repair (MTTR)
 MTBF : Length of time between failures
 MTTR : Length of time needed to resume operation after a failure
 Availability = MTBF/(MTBF+MTTR)

 May lead to architectural requirements


 Redundant components (higher MTBF)
 Modifiability of components (lower MTTR)
 Special types of components (e.g., self-diagnostic)

 Measurement: The mean time between failures and mean time to repair of critical components must be identified (typically measured) or estimated
 Modeling reliability and availability: e.g. Markov models
70
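The availability formula on this slide translates directly into code; a minimal sketch:

```python
def availability(mtbf_hours, mttr_hours):
    """Availability = MTBF / (MTBF + MTTR), as defined on the slide."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# A system that fails on average every 999 hours and takes 1 hour to
# repair is 99.9% available; shrinking MTTR (e.g., via self-diagnostic
# components) raises availability without touching MTBF.
```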

Availability Measures (2)


 Examples
 The system shall meet or exceed 99.99% uptime.
 The system shall not be unavailable more than 1 hour per 1000 hours of operation.
 Less than 20 seconds shall be needed to restart the system after a failure 95% of the time. (This is an MTTR requirement.)

 Availability Downtime
 90% 36.5 days/year
 99% 3.65 days/year
 99.9% 8.76 hours/year
 99.99% 52 minutes/year
 99.999% 5 minutes/year
 99.9999% 31 seconds/year
71
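The downtime column of the table above follows from the availability percentage and the number of minutes in a year; a small sketch to reproduce the figures (function name is illustrative):

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(availability_pct):
    """Expected downtime per year implied by an availability percentage."""
    return (1 - availability_pct / 100) * MINUTES_PER_YEAR

# 99.99% uptime allows about 52.6 minutes of downtime per year,
# matching the "52 minutes/year" row of the table.
```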

Security Measures (1)


There are at least two measures:
1. The ability to resist unauthorized attempts at usage
2. Continue providing service to legitimate users while under denial of service attack (resistance to DoS attacks)
 Measurement methods:
 Success rate in authentication
 Resistance to known attacks (to be enumerated)
 Time/efforts/resources needed to find a key (probability of finding the key)
 Probability/time/resources to detect an attack
 Percentage of useful services still available during an attack
 Percentage of successful attacks
 Lifespan of a password, of a session
 Encryption level
72

Security Measures (2)


 May lead to architectural requirements
 Authentication, authorization, audit
 Detection mechanisms
 Firewall, encrypted communication channels

 Can also be modeled (logic ...)

 Examples of requirements
 The application shall identify all of its client applications before allowing them to use its
capabilities.
 The application shall ensure that the name of the employee in the official human resource and
payroll databases exactly matches the name printed on the employee’s social security card.
 At least 99% of intrusions shall be detected within 10 seconds.
73

Usability Measures (1)


In general, usability concerns ease of use and ease of training for end users. The following more specific
measures can be identified:
 Learnability
 Proportion of functionalities or tasks mastered after a given training time
 Efficiency
 Acceptable response time
 Number of tasks performed or problems resolved in a given time
 Number of mouse clicks needed to get to information or functionality

 Memorability
 Number (or ratio) of learned tasks that can still be performed after not using the system for a given time
period
 Error avoidance
 Number of errors per time period and user class
 Number of calls to user support
74

Usability Measures (2)


 Error handling
 Mean time to recover from an error and be able to continue the task
 User satisfaction
 Satisfaction ratio per user class
 Usage ratio

 Examples
 Four out of five users shall be able to book a guest within 5 minutes after a 2-hour
introduction to the system.
 Novice users shall perform tasks X and Y in 15 minutes.
Experienced users shall perform tasks X and Y in 2 minutes.
 At least 80% of customers polled after a 3 months usage period shall rate their
satisfaction with the system at 7 and more on a scale of 1 to 10.
75

Maintainability Measures (1)


 Measures ability to make changes quickly and cost effectively
 Extension with new functionality
 Deleting unwanted capabilities
 Adaptation to new operating environments (portability)
 Restructuring (rationalizing, modularizing, optimizing, creating reusable components)

 Can be measured in terms of


 Coupling/cohesion metrics, number of anti-patterns, cyclomatic complexity
 Mean time to fix a defect, mean time to add new functionality
 Quality/quantity of documentation

 Measurement tools
 code analysis tools such as IBM Structural Analysis for Java
(http://www.alphaworks.ibm.com/tech/sa4j)
76
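A requirement such as "cyclomatic complexity must not exceed 7" can be checked with code analysis tools; as a rough sketch (not the cited IBM tool), Python's own `ast` module can approximate McCabe complexity by counting decision points, one per branch construct plus one:

```python
import ast

# AST node types treated as decision points (an approximation: chained
# boolean operators with more than two operands are undercounted).
_DECISIONS = (ast.If, ast.For, ast.While, ast.IfExp,
              ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source):
    """Map each top-level function name to its approximate complexity."""
    result = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            decisions = sum(isinstance(n, _DECISIONS)
                            for n in ast.walk(node))
            result[node.name] = decisions + 1
    return result

def violations(source, limit=7):
    """Functions exceeding the complexity limit from the requirement."""
    return {name: c for name, c in cyclomatic_complexity(source).items()
            if c > limit}
```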

Maintainability Measures (2)


 Examples of requirements
 Every program module must be assessed for maintainability according to
procedure xx. 70% must obtain “highly maintainable” and none “poor”.
 The cyclomatic complexity of code must not exceed 7.
No method in any object may exceed 200 lines of code.
 Installation of a new version shall leave all database contents and all
personal settings unchanged.
 The product shall provide facilities for tracing any database field to places
where it is used.
77

Testability Measures
Measures the ability to detect, isolate, and fix defects
 Time to run tests
 Time to setup testing environment (development and execution)
 Probability of visible failure in presence of a defect
 Test coverage (requirements coverage, code coverage…)

 May lead to architectural requirements


 Mechanisms for monitoring
 Access points and additional control

 Examples
 The delivered system shall include unit tests that ensure 100% branch coverage.
 Development must use regression tests allowing for full retesting in 12 hours.
78

Portability Measures
Measure ability of the system to run under different computing environments
 Hardware, software, OS, languages, versions, combination of these
 Can be measured as
 Number of targeted platforms (hardware, OS…)
 Proportion of platform specific components or functionality
 Mean time to port to a different platform

 Examples
 No more than 5% of the system implementation shall be specific to the operating system.
 The mean time needed to replace the current Relational Database System with another
Relational Database System shall not exceed 2 hours. No data loss shall ensue.
79

Integrability and Reusability Measures


Integrability
 Measures ability to make separated components work together
 Can be expressed as
 Mean time to integrate with a new interfacing system

Reusability
 Measures ability that existing components can be reused in new applications
 Can be expressed as
 Percentage of reused requirements, design elements, code, tests…
 Coupling of components
 Degree of use of frameworks
80

Robustness Measures
Measure ability to cope with the unexpected
 Percentage of failures on invalid inputs
 Degree of service degradation
 Minimum performance under extreme loads
 Active services in presence of faults
 Length of time for which system is required to manage stress conditions

 Examples
 The estimated loss of data in case of a disk crash shall be less than 0.01%.
 The system shall be able to handle up to 10000 concurrent users when satisfying all
their requirements and up to 25000 concurrent users with browsing capabilities.
81

Domain-specific Measures
The most appropriate quality measures may vary from one application domain to another, e.g.:

 Performance
 Web-based system:
Number of requests processed per second
 Video games:
Number of 3D images per second

 Accessibility
 Web-based system:
Compliance with standards for the blind
 Video games:
Compliance with age/content ratings systems (e.g., no violence)
82

Other Non-Functional Requirements


 What about NFRs such as “fun” or “cool” or “beautiful” or “exciting”?
 How can these be measured?
 The lists of existing quality attributes are interesting but they do not
include all NFRs.
 It is sometimes better to let customers do their brainstorming before
proposing the conventional NFR categories.
 In any case, we must also refine those goals into measurable
requirements.

You might also like