
SOFTWARE METRICS

People using the Software Engineering Institute's Capability Maturity Model for Software (CMM) often struggle with the apparent paradigm shift as they transition from levels 2 and 3 to levels 4 and 5. At levels 1 and 2, the measures are primarily status measures (for example, actual values vs. planned values). At level 3, defect measures are added. Then at level 4 there is a drastic change in measurement terminology, using terms such as process capability baselines and process performance baselines. People often interpret this terminology change to mean that a paradigm shift in the measurement process is needed in moving to level 4. There is confusion about how to make this transition without losing momentum. In this article the authors describe how the change in measurement approach across the CMM levels can be a natural evolution. At each successive level, the measurement practices should build on the previous levels and evolve to support the maturation of the processes. At each level, project and organizational measures from the previous levels should be refined and augmented, not replaced. The level 4 concepts of process capability baselines and process performance baselines provide a useful lens through which one can view the measurement practices at all levels. These measurement views, in turn, provide additional insight into the process improvements associated with each level.

Key words: CMM, CMMI, high maturity, maturity levels, measurement, measurement maturity, measurement program, process capability baselines, process improvement, process performance baselines

Measurement Maturity and the CMM: How Measurement Practices Evolve as Processes Mature
CHARLES WEBER AND BETH LAYMAN, TeraQuest Metrics

INTRODUCTION
The authors have been working closely with a number of mid- to high-maturity organizations since 1994. These are organizations assessed at level 3, level 4, and level 5 using the Software Engineering Institute's (SEI) Capability Maturity Model for Software (CMM), version 1.1 (Paulk et al. 1995), and more recently organizations using the CMM Integration (CMMI) models (SEI 2001). (Although this article specifically refers to the CMM for Software, the concepts presented here are applicable to any engineering CMM that uses the same model architecture. Where a specific measurement practice or KPA from the CMM for Software is referenced, a similar process area also exists in the CMMI.) These organizations are striving hard to implement the real intent of the CMM as well as achieve their targeted CMM level rating.

One thing the authors have noticed is the measurement confusion that organizations encounter as they improve from levels 2 and 3 to levels 4 and 5. This problem often seems to be related to the shift in measurement concepts and terminology at level 4 (that is, terms and concepts such as process capability baselines and process performance baselines). This article addresses the problem by demystifying some of these measurement concepts and terms, and by clarifying the natural evolution of measurement practices that should occur as organizations strive to improve their processes across all the CMM levels.

There are two audiences for this article. The first audience includes people from organizations at level 3, or just starting on level 4, who are staring at this measurement chasm and trying to bridge it. This article shows that the measurement concepts presented in the level 4 key process areas (KPAs) are not new, but simply an evolution of level 2 and level 3 concepts. The second audience includes people from organizations at level 1 and level 2. This article also shows that if one understands these measurement concepts and incorporates the natural evolution of the CMM measurement practices into his or her measurement program, one can avoid the confusion that others have encountered.

This article assumes that readers have an understanding of the CMM. For those who do not, the authors recommend reading Paulk et al. (1993) for an overview of the CMM or Paulk et al. (1995) for a detailed understanding. See the sidebar "Summary of the CMM" for a thumbnail sketch of the model.

Summary of the CMM


The CMM is a process model that provides guidance to organizations to improve their ability to produce software by improving their processes. It is also used as a reference model in assessing those processes, and it is the most widely used model of its kind in the world. It is a five-stage evolutionary improvement model that describes the essential practices that underlie the five stages. The five stages (or maturity levels) are:

Level 1 (Initial): The software process is characterized as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort and heroics. There are no key process areas (KPAs) at level 1.

Level 2 (Repeatable): Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications. The level 2 KPAs are requirements management, software project planning, software project tracking and oversight, software subcontract management, software quality assurance, and software configuration management.

Level 3 (Defined): The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software. The level 3 KPAs are organization process focus, organization process definition, training program, integrated software management, software product engineering, intergroup coordination, and peer reviews.

Level 4 (Managed): Detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled. The level 4 KPAs are quantitative process management and software quality management.

Level 5 (Optimizing): Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies. The level 5 KPAs are defect prevention, technology change management, and process change management (Paulk et al. 1995).

Capability Maturity Model and CMM are registered in the U.S. Patent and Trademark Office. Capability Maturity Model Integration and CMMI are service marks of Carnegie Mellon University.

FIGURE 1 Measurement and process maturity
Level 5 In Out

Level 4

In

Out

Level 3

In

Out

Level 2

In

Out

Source: Visibility into software process at each maturity level (Figure 2.2), Perdue in (Paulk et al., 1995)

CMM MEASUREMENT PHILOSOPHY


Measurement provides objective information about, and visibility into, project performance, process performance, process capability, and product and service quality. Use of measures and other information allows organizations to learn from the past in order to improve performance and achieve better predictability over time. The CMM certainly affirms this viewpoint and represents measurement practices as critical components of project, process, and quality management at all levels. Figure 1 shows the increased process and measurement visibility at each level. This increased visibility is a result of the more detailed definition of the processes and more sophisticated measurement practices.

There are practices involving the collection and use of measures throughout the CMM. Measurement practices specifically designed to monitor process status and effectiveness are built into every KPA in the model. In addition, estimates of project size, effort, cost, critical computer resources, and schedule are required at the earliest stage (level 2) and are used to establish the project's plans. Actual performance is tracked against the estimates and the plans. Historical data from past projects are expected to be used as a basis for deriving and validating estimates. Life-cycle defect measures are introduced at level 3. The key concept of level 4 is achieving predictability of results through a quantitative understanding of process performance. Level 5 assumes a quantitative basis for continuous process improvement and change management.

The concepts of an organization's process capability baselines (PCBs) and a project's process performance baselines (PPBs) are first introduced at level 4, and they are not mentioned at any other level. (In CMM Integration the terminology has changed. There are no explicit terms that correspond to the project's process performance baseline and the organization's process capability baseline. In one specific practice, the term "organizational process performance baseline" is used instead of the CMM term "organization's process capability baseline." In one of the CMMI process area goals, the phrase "expected process performance of the organization's set of standard processes" is used.) A careful reading of the CMM indicates that these are uniquely level 4 concepts.

When organizations begin working on level 4 improvements, these concepts often present a major challenge in understanding and implementing what appears to be a new measurement paradigm, particularly in how the measures are organized and analyzed. Though this is a valid interpretation, the authors have found it to be easier and more effective if this barrier between level 3 and level 4 is crushed and replaced with a more evolutionary interpretation. These terms are defined in the CMM as follows:

Process capability baseline (PCB): "a documented characterization of the range of expected results that would normally be achieved by following a specific process under typical circumstances. A process capability baseline is typically established at an organizational level" (Paulk et al. 1995). The essential aspect of PCBs is that they contain measures that can be used to predict the performance that projects can achieve. The measures are obtained from the organization's past performance (that is, performance data aggregated across multiple projects). These measures, along with other quantitative parameters (such as project size) and information (such as project and life-cycle characteristics, assumptions, and constraints), are made available so current and future projects can use them to plan, predict outcomes, and manage their efforts and results.

Process performance baseline (PPB): "a documented characterization of the actual results achieved by following a process, which is used as a benchmark for comparing actual process performance against expected process performance. A process performance baseline is typically established at the project level, although the initial process performance baseline will usually be derived from the organization's process capability baselines" (Paulk et al. 1995).

[Figure 2: Process capability baselines (PCBs) and process performance baselines (PPBs). A notional illustration: at the organization level, the organization's standard process has PCBs (schedule performance, defect distribution, effort distribution, cost performance, productivity, defect rate, turnaround time), with separate rows of values for each domain (new development - government, new development - commercial, maintenance projects; some measures are not applicable in some domains). At the project level, each project's defined process has its own set of PPBs for the same kinds of measures.]
The essential aspect of PPBs is that they contain measures of a single project's performance in dimensions of importance to that project, along with other quantitative parameters and information needed to replan, predict outcomes, and manage the project's efforts and results.

As the notional illustration in Figure 2 shows, there will be multiple PCBs for the organization, each describing a different attribute of performance that is of concern (for example, project productivity and defect density). Additionally, the organization's PCBs are typically separated by the different project types or domains that are represented in the organization. Similarly, each project will have multiple PPBs, and the set of PPBs selected for a project may differ from those of other projects depending on project type, project domain, project tailoring of the organization's standard processes, and what is important for the project to manage. Some people prefer to think of a single organizational PCB that includes the various dimensions, with a similar view of the project PPB. Both that view and the authors' view (that each of these dimensions is a separate PCB or PPB) are valid; the distinction is immaterial in this article. (Note that not all the artifacts and data structures shown in Figure 2 would exist at the lower levels.)

If one reads and reflects on the definitions of PCB and PPB provided here, one can see that there is nothing in these concepts that is unique to level 4. Though these terms are not used at the other levels, the underlying ideas behind PCBs and PPBs (that is, documenting expected and actual results) are present at all levels. Even level 2 measures fit within these definitions. There is a staged progression of these concepts from level 2 to level 3 to level 4 (where the concepts are fully elaborated) to level 5. In this article the authors look at how these concepts are represented at each level and how they evolve. The authors also look at how the quality of the measures improves and how the measures become more useful to the organization.
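To make the PCB idea concrete, here is a minimal sketch in Python of how an organization might derive a PCB entry from past project results. The domain names, measure names, and values are invented for illustration, and the characterization rule (mean plus or minus two standard deviations as the "range of expected results") is just one plausible choice, not something the CMM prescribes.

```python
import statistics
from collections import defaultdict

# Hypothetical per-project results by domain; all names and values are invented.
results = [
    ("new-dev-government", "productivity_ksloc_per_sm", 0.31),
    ("new-dev-government", "productivity_ksloc_per_sm", 0.27),
    ("new-dev-government", "productivity_ksloc_per_sm", 0.34),
    ("maintenance", "turnaround_days", 12.0),
    ("maintenance", "turnaround_days", 9.5),
    ("maintenance", "turnaround_days", 11.2),
]

groups = defaultdict(list)
for domain, measure, value in results:
    groups[(domain, measure)].append(value)

# One PCB entry per (domain, measure): the range of results a project
# following the standard process in that domain can expect to achieve.
for (domain, measure), values in sorted(groups.items()):
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    print(f"{domain}/{measure}: mean={mean:.2f}, "
          f"expected range={mean - 2 * sd:.2f} to {mean + 2 * sd:.2f}")
```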

[Figure 3: Level 2 measurement example (simple run chart). A run chart of effort allocation in staff-months (0 to 300), plotting planned vs. actual values by month from Nov 00 through July 02, with an annotation marking the start of build 1 testing on 30 Aug 01. Source: adapted from McGarry 2001.]

CMM LEVEL 2 PCBs AND PPBs


At level 2, the primary measures of concern are the size, cost, effort, critical computer resources, and schedule of the projects. In the software project planning (SPP) and software project tracking and oversight (SPTO) KPAs, the practices describe estimating, tracking and re-estimating, and recording these measures. These are the project's level 2 PPBs. Details about the size of each software component and the effort/duration of each task may not be known, so these measures are often monitored at fairly coarse levels, possibly on a phase-by-phase basis. Tracking involves monitoring actual performance against estimates and planned performance; this is often done using simple run charts showing planned vs. actual values over time (see Figure 3). Corrective action is reactive: actions are taken when actual performance begins to deviate significantly from estimates, and they usually involve revising the estimates and the plan. What is meant by "significant deviation" is subjective and depends on the experience and perspective of each manager. The deviation shown in Figure 3 may be significant to some managers and insignificant to others.

In the same two KPAs (SPP and SPTO), the practices describe the recording of these measures, along with assumptions and other associated information needed to reconstruct the estimates and assess their reasonableness, for use by ongoing and future projects. This makes up the level 2 PCBs for the organization. At level 2, there is no organization-level analysis or cross-project consistency expected for the measures in the PCBs; the PCBs consist of raw PPB data from each of the projects (see Figure 4).
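As a sketch of this kind of level 2 tracking, the following Python fragment compares planned and actual cumulative effort period by period and flags deviations that exceed a manager-chosen tolerance. The numbers and the 10 percent tolerance are invented; as noted above, what counts as a significant deviation at level 2 is a subjective call.

```python
# Planned vs. actual cumulative effort (staff-months) by reporting period;
# all numbers are invented for the sketch.
planned = [20, 45, 75, 110, 150]
actual = [22, 50, 85, 118, 160]

TOLERANCE = 0.10  # subjective: each manager picks his or her own

for period, (plan, act) in enumerate(zip(planned, actual), start=1):
    deviation = (act - plan) / plan
    flag = "review" if abs(deviation) > TOLERANCE else "ok"
    print(f"period {period}: planned={plan}, actual={act}, "
          f"deviation={deviation:+.1%} -> {flag}")
```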

[Figure 4: Level 2 baselines. At the organization level, the PCBs are simply the recorded estimates for size, effort, cost, and schedule from each project, together with the assumptions and notes behind them. At the project level, the PPBs hold the project's own estimates and corresponding values (for example, size and effort).]
The practices in the SPP KPA also describe the use of historical data (that is, use of the organization's PCBs).
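For instance, a project might derive and sanity-check a new effort estimate from the recorded size and effort actuals of past projects. A minimal Python sketch, with invented project names and numbers:

```python
# Recorded size and effort actuals from completed projects (level 2 PCB data);
# all names and values are invented.
historical = [
    {"project": "A", "size_ksloc": 40.0, "effort_sm": 200.0},
    {"project": "B", "size_ksloc": 25.0, "effort_sm": 140.0},
    {"project": "C", "size_ksloc": 60.0, "effort_sm": 270.0},
]

# Average productivity (KSLOC per staff-month) across the past projects.
productivity = sum(p["size_ksloc"] / p["effort_sm"] for p in historical) / len(historical)

new_size_ksloc = 35.0  # estimated size of the new project
estimated_effort = new_size_ksloc / productivity
print(f"Derived effort estimate: {estimated_effort:.0f} staff-months")
```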

CMM LEVEL 3 PCBs AND PPBs


At level 3, the organization establishes its standard processes and a standard set of measures that the projects collect and report (described in the organization process definition KPA). This standard set of measures is intended to establish consistent measures across projects. (Note that in version 1.1 of the CMM, the description of a standard set of measures is not explicit; it was clarified in the CMM Integration models.) The practices of the integrated software management KPA describe the use of the organization's software process database as a source to estimate, plan, track, and replan the project. The organization process definition KPA describes the recording of the project's measures in this database. The measures are reviewed by the organization to ensure their integrity, and they are organized and used to improve the organization's standard processes. There is an expectation that the measurement activities at level 3 are more coordinated than at level 2; this, along with the organization's analysis of data, moves the organization toward quantitative management.

However, the authors have noticed that, while the implementation of these level 3 measurement practices may be adequate for level 3, it is often a weak foundation for beginning work on level 4. The authors often see weaknesses in estimation and planning, data quality, data granularity, measurement automation and integration, and organizational analysis and use of measures. These weaknesses, in addition to the shift in measurement terminology at level 4, contribute to the measurement confusion as organizations push forward into higher maturity.

The project planning and tracking at level 3 is primarily addressed in the integrated software management KPA. At the project level, work is broken into more granular units than at level 2; planning and tracking is often performed at the work-package level (for example, the coding of an individual software component), consistent with the use of measurement techniques such as earned value.

[Figure 5: Level 3 measurement example. A run chart of monthly cost performance index (CPI) values from January through December, plotted against the project CPI, a goal line, and lower and upper threshold lines.]
Tracking at this level of granularity still involves monitoring actual performance against planned performance; however, techniques such as thresholds are also now used, as illustrated in Figure 5. Thresholds are derived by analyzing past project performance and are contained in the organization's PCBs. They represent in-process triggers for taking corrective action (that is, action is triggered when the difference between the actual value and the planned value exceeds the threshold). They represent reasonable action limits based on the experiences and performance results of past projects. An example of a threshold is: "Corrective action is required when the actual cumulative effort expended for the work performed exceeds the plan by 12 percent of the planned value." With thresholds and tracking at finer levels of granularity, projects can proactively work to bring performance back in line with plans, rather than just replanning the project, as is done at level 2.

Also at level 3, quality (that is, defect) measures are collected and analyzed, as described in the software product engineering and peer reviews KPAs. At level 2, the mechanisms for collecting defect data were established (that is, the recording of change requests and problem reports in the software configuration management KPA), but no real defect measurement collection or analysis is specifically identified in the CMM. Now at level 3, defect measures are collected and analyzed by all projects. Defect measures from peer reviews and testing, including characterization data such as type, severity, phase discovered, and so on, are collected and analyzed. Defect rates, thresholds, and distributions can be established and included in the organization's PCBs.
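A minimal Python sketch of threshold-based tracking follows. The deviation history and the derivation rule (mean plus one standard deviation of past peak deviations) are invented for illustration; an organization would choose its own rule, such as the fixed 12 percent in the example above.

```python
import statistics

# Peak effort deviations (actual vs. plan) observed on completed projects;
# values are invented for the sketch.
past_deviations = [0.04, 0.09, 0.11, 0.06, 0.15, 0.08]

# One possible derivation rule: mean + 1 standard deviation of past deviations.
threshold = statistics.mean(past_deviations) + statistics.stdev(past_deviations)

planned_effort, actual_effort = 120.0, 136.0  # cumulative staff-months
deviation = (actual_effort - planned_effort) / planned_effort
if deviation > threshold:
    print(f"Deviation {deviation:.1%} exceeds threshold {threshold:.1%}: "
          "corrective action required")
else:
    print(f"Deviation {deviation:.1%} is within threshold {threshold:.1%}")
```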

At level 3 the project's initial PPBs are derived from the organization's PCBs, as described in the organization process definition and integrated software management KPAs. These measures and other parameters are typically tailored to fit the project. They are revised with the project's data as the project proceeds. This is shown in Figure 6. In other words, the project uses the organization's PCB data for the initial estimates and plan, but may use the project's own PPB data for replanning. The project may also have project-specific measures that it includes in its PPBs. Similar to level 2, the project's PPBs include the estimates, actual values, and re-estimates of size, cost, effort, critical computer resources, and schedule, though these measures will be at a finer level of granularity compared to level 2. In addition, at level 3 the measures are more explicitly based on the defined life-cycle model and defined process, and they cover the significant attributes of all life-cycle phases (for example, defects found, rework, and effort expended).

CMM LEVEL 4 PCBs AND PPBs


At level 4, the characteristics of the organization's PCBs and the projects' PPBs are fully described in the CMM. Some of the measures may be the same as, or at least similar to, the measures used at levels 2 and 3. At level 4, the primary differences from the level 2 and level 3 measures and management parameters are:

1. There is a broader view of quality, not just defects. These quality dimensions include reliability, usability, maintainability, flexibility, and so on. The most important attributes of product quality are identified and managed quantitatively throughout the project's life cycle. Quality of services, although not specifically addressed in the CMM, should also be included.

2. Many of the measures are taken at a finer level of granularity: at the subprocess or process-step level (for example, an individual peer review or the design of an individual component).

3. There is an objective understanding of the goodness of the measures. Dirty data are cleaned up, and actions are taken to prevent recording of dirty data. (One obstacle that organizations commonly encounter at the beginning of their level 4 improvement is that they lack a sufficiently long history of clean, granular data needed to begin level 4 analysis. If this is addressed during the establishment of the measurement program at levels 2 and 3 through data audits and verification activities, the road to level 4 can be dramatically shortened.)

[Figure 6: Level 3 baselines. At the organization level, the PCBs collect consistent measures (size, effort, cost, schedule, defect density, productivity, and other characteristics) from projects A through H, organized by project type. Each project (for example, projects I, J, and K) tailors its defined process from the organization's standard process, bases its estimates on the organization's PCBs, and contributes its results back; the project's PPBs hold its estimates and actuals for size, effort, cost, and schedule.]
4. There is a statistical understanding of the process performance, process variation, and process stability of individual projects and of the organization (that is, the projects in aggregate). For example, this may be in the form of control charts.

5. There is an ability to quantitatively predict future process performance, as well as to predict the overall quality of the products and services. This typically involves the use of quantitative process models, such as a defect insertion/removal model or a reliability model. (The authors have learned a considerable amount about level 4 since the CMM was published in 1991. Although fundamentally the two KPAs are a correct description of level 4 practices, an effective implementation requires a good understanding of various statistical and process modeling concepts and an interpretation of these KPAs in that context (Curtis et al. 2002; Wheeler and Chambers 1992; Wheeler and Poling 1998).)

6. There are sets of measures that, taken together, represent a quantitative model of the overall life-cycle process performance and expected results, not just measures of individual subprocesses or work products. Examples include measures of the number of defects found after delivery and measures of the overall development cycle time.

At level 4, the organization's PCBs include:

• Measures that characterize the key attributes of the critical components of the organization's standard processes (for example, individual subprocesses, sequences of subprocesses, work products, and life-cycle phases). These are values (such as mean and variance) that projects can expect to achieve when they use a tailored version of the organization's standard processes.

• Measures that describe the important interrelationships among these process components (such as a defect insertion/removal model that uses defect measures from across the life cycle to predict the latent defects in a delivered product). These quantitative process models are calibrated and used by projects to estimate and predict the values of attributes that cannot be measured until later in the life cycle (for example, the number of latent defects in the delivered product).

[Figure 7: Level 4 baselines. The organization-level PCBs now characterize results by domain (new development - government, new development - commercial, maintenance projects) across measures such as size, effort, cost, schedule, defect density, productivity, schedule performance, defect distribution, effort distribution, cost performance, defect rate, turnaround time, tools, and other characteristics, with values contributed by projects A through H. Projects (I, J, K) tailor their defined processes from the organization's standard process, base estimates on the PCBs, and contribute results back; each project's PPBs cover measures such as schedule performance, defect distribution, effort distribution, cost performance, productivity, and defect rate.]
• The organization's quantitative goals for process performance and quality, along with the (usually quantitative) description of their relationships to the organization's business issues. At level 4 these process performance and quality goals are derived from the organization's historical performance; they describe what can be achieved with the current standard processes.

• Measured, actual detailed performance results achieved by the projects (for individual key attributes of critical components of the organization's standard processes).

• Measured, actual overall performance results (such as overall productivity, overall defect density, and overall product reliability) achieved by the projects, as measured against the organization's performance and quality goals (Curtis 2002a).

At level 4, the organization's PCBs, as described in the quantitative process management KPA, reflect the different results achieved for the different tailored variants of the organization's standard processes and for different types of projects (for example, different life-cycle models, customers, and application domains). Depending on the specifics of each measure, these PCB values may be expressed as expected values, control limits, prediction intervals, specification limits, threshold values, mean and variance, and so on (see Figure 7).

Similar to level 3, at level 4 the project's initial PPBs are derived from the organization's PCBs. Estimates, actual values, and re-estimates of size, cost, effort, critical computer resources, and schedule, as well as the defect measures from level 3, still form a part of the level 4 PPBs. During the project planning stage, processes and measures are still tailored to fit the project's domain and specific needs. The selection and establishment of PPBs are still (perhaps even more so) based on the project's critical issues and areas of concern that require close monitoring and control. A project may still establish project-specific measures and project-specific PPBs. During project execution, the project revises its PPBs with its actual performance data as the effort is under way. The additional focuses of the project's PPBs at level 4 are the measures needed to:

• Define the project's quantitative process performance and quality goals

• Establish a defined process and project plans that can achieve these goals. In other words, the goals need to be based on what can be achieved with the project's defined process and plans, and the project's defined process and plans need to be capable of achieving the goals.

• Understand and manage the variance in the project's process performance

• Track the status, and predict and manage the achievement, of the project's process performance and quality goals (Curtis et al. 2002)

The way the projects are planned at level 4 is quite different. To fully understand these differences requires not only an understanding of the planning practices of both of the level 4 KPAs, but a complete understanding of the entire KPAs and their intent. The project's defined process is constructed from organizational components that have known quantitative capability, so that the constructed defined process can achieve the goals. The projects are able to determine, up front, the performance and results that can be expected.
For example, there may be two peer-review elements available: one cheaper but less effective in finding defects, and one more expensive but more effective in finding defects. The selection is based on the project's business issues.
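As an illustration of the kind of quantitative process model mentioned above, here is a minimal defect insertion/removal sketch in Python that predicts latent defects at delivery. The phase structure, insertion rates, and removal-effectiveness values are all invented; a real model would be calibrated from the organization's PCB data.

```python
# Hypothetical calibration data: defects inserted per KSLOC in each phase,
# and the fraction of present defects removed by the review at phase end.
phases = [
    # (phase, defects inserted per KSLOC, removal effectiveness of its review)
    ("requirements", 2.0, 0.50),
    ("design", 4.0, 0.55),
    ("code", 12.0, 0.60),
]
test_effectiveness = 0.80  # fraction of remaining defects removed in test

size_ksloc = 35.0
remaining = 0.0
for phase, insert_rate, removal in phases:
    remaining += insert_rate * size_ksloc  # defects inserted during this phase
    remaining *= (1.0 - removal)           # defects escaping the phase review
remaining *= (1.0 - test_effectiveness)    # defects escaping test

print(f"Predicted latent defects in the delivered product: {remaining:.0f}")
```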

[Figure 8: Level 4 measurement examples. A problem report prediction model (actual vs. model counts of problem reports over time), and a control chart of average preparation time spent per reviewer (avg. hours per reviewer, 0.00 to 3.00) across code inspection IDs 1 through 17 for project A, with UCL, LCL, and X-bar lines.]

Different methods and measures are used to manage different aspects of the process (for example, managing variance of defects found in inspections vs. managing variance in cost estimation). In some cases control charts may be used, while in other cases regression analysis, histograms, or run charts with thresholds may be used (see Figure 8).
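For the control chart case, a minimal Python sketch of an individuals (XmR) chart in the style of Wheeler and Chambers (1992) follows; the preparation-time data are invented. The limits are the mean plus or minus 2.66 times the average moving range.

```python
import statistics

# Average preparation hours per reviewer for a series of code inspections
# (compare Figure 8); the data are invented for the sketch.
prep_hours = [1.8, 2.1, 1.6, 2.4, 1.9, 2.2, 1.7, 2.0, 2.9, 1.5]

center = statistics.mean(prep_hours)
moving_ranges = [abs(b - a) for a, b in zip(prep_hours, prep_hours[1:])]
mr_bar = statistics.mean(moving_ranges)

# Individuals (XmR) chart limits: center +/- 2.66 * average moving range.
ucl = center + 2.66 * mr_bar
lcl = max(0.0, center - 2.66 * mr_bar)

print(f"X-bar={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
outside = [i for i, x in enumerate(prep_hours, start=1) if not lcl <= x <= ucl]
print("out-of-control points:", outside or "none -- process appears stable")
```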

Measuring Overall Process Improvement


A level 5 organization understands its critical business issues or areas of concern (such as "our competition is consistently underbidding us"). It knows how to set quantitative business and improvement goals to address these business issues (for example, reduce rework to 20 percent of the overall development effort, and increase reuse to 35 percent of the product content). It knows what measures are needed to gauge its performance relative to these goals (for example, defect rates, amount of rework, and productivity by work element). A level 5 organization rigorously defines, collects, analyzes, tracks, and reports a fixed set of PCB measures to provide a clear picture of the organization's performance relative to these goals. The PCB measures are used by the organization to demonstrate overall improvement results over time and to calculate return on investment for the process improvement activities (see Figure 9). Similarly, the projects' PPBs provide the means to quantitatively measure the organization's performance and results against the goals. Of course, the organization's business environment and business issues will change over time, and the definitions of the PCB measures used to assess overall improvement will also have to evolve and change. These definitions of measures, however, should be as persistent as possible over time so that the organization can understand the cost and effects of the process improvements. Any changes to the set of measures should be understood (that is, the reasons for the changes and how the revised set relates to the set it replaces).
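As a sketch of the return-on-investment arithmetic, suppose the PCBs show rework falling from 35 percent to 20 percent of development effort after an improvement program. Every figure in this Python fragment is invented:

```python
# Illustrative ROI arithmetic for a process improvement program;
# all figures here are invented.
improvement_cost = 250_000.0    # training, tools, and pilot effort, in dollars
annual_dev_spend = 2_000_000.0  # total development spend per year, in dollars

# Rework fraction taken from the PCBs before and after the improvements.
rework_before, rework_after = 0.35, 0.20

annual_savings = annual_dev_spend * (rework_before - rework_after)
roi = (annual_savings - improvement_cost) / improvement_cost
print(f"Annual savings: ${annual_savings:,.0f}; first-year ROI: {roi:.0%}")
```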

CMM LEVEL 5 PCBs AND PPBs


On the surface, it seems that the use of the organization's PCBs and the projects' PPBs at level 5 is basically the same as at level 4, and this is true to some extent. However, their use at level 5 to benchmark, set business and process improvement goals, plan improvement efforts, pilot and evaluate candidate changes, track progress and results, and deploy process improvements complicates this picture. At level 5, there are basically three types of process improvement concerns:

• Improvement of the performance of individual processes (for example, improving the defect detection efficiency of the testing process)

• Continuous incremental improvement of business results from many small improvements (for example, gradually increasing productivity by 8 percent as a result of 22 process improvements)

• Quantum improvement in business results from individual planned improvement efforts (for example, improving the product mean time between failures by 70 percent, from 60 hours to 102 hours, as a result of a planned process improvement effort that addressed the life-cycle factors affecting reliability)

In the following sections, the authors discuss the primary aspects of a level 5 process and describe the role of the PCBs and PPBs in these level 5 activities. In implementing level 5, the authors usually recommend that organizations treat the process change management and technology change management KPAs as a single integrated improvement process. (See Curtis, Weber, and Paulk 2002 for an integrated view of these KPAs.) The defect prevention KPA is then a separate process, but note that the causal analysis practices, which are the heart of this KPA, are in fact needed to effectively implement level 4 (Curtis, Weber, and Paulk 2002).

Setting the Organization's Process Improvement Goals


The primary factors driving process improvement at level 5, and in turn affecting the PCBs and PPBs, are the organization's business issues and business goals. The business issues and business goals determine the business strategy; the business issues, business goals, and business strategy together determine the process improvement goals. The processes must serve the organization's business strategy and contribute their share to the business goals.

[Figure 9: Using PCBs to demonstrate improvements. Trend charts covering 1996 through 2001, broken out by domain A and domain B: overall productivity trends, life-cycle defect density, defect rate trends, estimation accuracy trends (effort variance), and asset reuse (percent reuse).]
(Setting business goals and process improvement goals is a complicated process and is beyond the scope of this article. See Curtis, Weber, and Paulk (2002) for information on setting goals.) Webster's New World College Dictionary, fourth edition, lists two definitions of the word goal: 1) "an object or end that one strives to attain"; 2) "the line or place at which a race, trip, and so on is ended." Both types of goals are useful at level 5. Goals that fit the first definition give the organization an overall improvement direction and are most relevant to opportunistic (or incremental) process improvement activities. Goals that fit the second definition establish requirements for process improvement efforts and are most relevant to planned process improvement activities.

In a level 5 organization, the process improvement goals should be expressed as target values for a unified set of PCB measures. Too often, process improvement goals are expressed as a single measure (for example, improve productivity by 20 percent). This may explain why some organizations continually improve but never get better: it is easy to achieve improvement in a single dimension if the other dimensions are ignored. When process improvement goals are expressed as a unified set of PCB measures, all the critical business dimensions are quantitatively represented. It becomes explicit when it is acceptable to sacrifice improvement in one dimension for improvement in another (for example, increasing development costs to reduce the number of defects in the delivered product). Process improvement suboptimization (if it occurs) is shown, and real process improvement (if it occurs) is also shown.

Planned Process Improvement Efforts


Planned process improvement efforts use PCBs and PPBs much as projects use them, as shown in Figure 10. A set of PCB values is defined as the requirements (goals) for the process improvement effort. These PCB values (goals) are used to select candidate changes and to evaluate them (for example, by piloting, quasi-experimental design, and other analysis techniques). The evaluations provide measures that make up the PPBs for the improvement effort. The PPBs are compared against the required PCB values (goals), and appropriate corrective actions (possibly including negotiating changes to the required PCBs) are taken. The improvement effort's development and evaluation phases are completed when the team demonstrates success against its required PCB values.

[Figure 10: Expanded PCB use at level 5. Business issues, customer feedback, and new opportunities drive business and improvement goals, expressed as target or hold values over the PCB measures (schedule performance, defect distribution, effort distribution, cost performance, productivity, defect rate, turnaround time). These goals are compared against the current capability (the PCBs by domain), and the gaps drive planned improvement projects against the organization's standard process.]

The resulting (possibly revised) PCBs from the improvement effort become the PCBs that are used to deploy the process changes into routine use in the organization.

Deploying Planned Process Improvements


In general, the changes from a single planned process improvement effort are deployed as a separate bundle and not bundled with other changes. A planned process improvement typically represents a substantial investment, and it typically has a significant effect on how work is done in the organization and on the results of the work. It is important to be able to recognize obstacles and resistance to the changes so they can be addressed. It is also important to be able to measure the effects of the changes throughout and following the deployment activities. Sometimes changes that worked well in the evaluation stages do not scale up when fully deployed, or they may work well in one project but not in others. Sometimes the full expected benefits are not realized when deployed, and occasionally there can be a negative effect on the results, and the improvement has to be changed or withdrawn. Quantitative understanding of these changes (the cost and the effects) is essential.

The PCBs from the evaluation phase of the improvement effort are the primary standard of measurement used by the deployment team to monitor the results of the improvements as they are deployed. The PPBs from the projects where the changes are deployed are compared against the target PCBs and other defined criteria. Depending on the results achieved in deployment, corrective actions may have to be taken. Depending on the changes and the situation of the projects and the organization, the changes may be deployed all at once or incrementally. In these cases, the deployment may be considered an extension of the evaluation phase, and the changes and the PCBs may be modified during the deployment. Once the changes are fully deployed, actual measured results from the projects (that is, the projects' PPBs) are used to revise the organization's PCBs.


Deploying Opportunistic Process Improvements


In general, opportunistic process improvements are bundled and deployed on a somewhat regular basis (for example, every six months, or when a certain number of changes are ready for deployment). The cost and effects of the individual improvements are typically not of concern; what is important is the overall improvement trend. Most often, the PCBs are not changed when the improvements are deployed, though the process and measurement groups may hypothesize what effects are expected. The projects' PPBs are monitored closely, both by the projects and by the organization, to measure the overall improvement. The organization's PCBs are revised from the projects' PPBs, and they are also monitored by the organization's process and measurement group and the organization's management.

SUMMARY AND CONCLUSION


In this article the authors used the concepts of process capability baselines and process performance baselines to describe the natural evolution of the measurement practices across the CMM levels. At each level, the project and organization measures from the previous levels are refined and augmented, not replaced. The authors expect that the concepts of the organization's PCBs and the projects' PPBs, as defined in the CMM and as applied across the levels, are now more clearly understood, and that some of the confusion people experience upon venturing into the language of the level 4 KPAs has been eliminated.

The authors believe that a better understanding of how these measurement concepts evolve across levels is useful, not only as a way to better understand and transition to level 4, but also in understanding how the measurement practices can be maximized at all levels. Understanding how measurement fits at the level at which one is working is not enough, however; one must also anticipate the measurement characteristics of the higher levels. At level 2, the definitions and use of the measures can differ from project to project. However, where possible (such as for new measures), the measures should be specified in anticipation of the need for common measures at level 3.

At level 3, one should develop an understanding of the measures one might want to use to quantitatively manage process performance and quality results at level 4; this may not be entirely clear when working on level 3. In particular, the focus should be on a good set of base measures (for example, product size, schedule, effort, and defect/quality) that can be combined in various ways and support the high maturity levels. The required granularity of these measures for level 4 analysis is likely to be finer than is needed at level 3. There may also be a need for a large sample of individual measures, possibly covering a medium to long time period. The projects and the organization also need to pay attention to the quality and cleanliness of the measures at level 3, or the large amount of data that they collect may be garbage. Another area where a look ahead is important is the projects' tailoring of the organization's standard processes. If too much tailoring flexibility is allowed, it will be difficult to construct meaningful PCBs for the organization; the variance in the measures will be too large to support meaningful analysis (Layman and Weber 2002).

At level 4, one should be aware of the organization's and projects' business issues and the areas in which the organization will want to make substantial improvements. The organization's PCBs should be defined, and measures should be collected from the projects and analyzed to characterize the baseline capability and performance. These baselines provide the quantitative basis for measuring the results of the process improvements at level 5.
ACKNOWLEDGMENTS
Much of this article is based on what the authors learned working with a number of TeraQuest clients, and particularly with their colleagues at TeraQuest, including Bill Curtis, Kent Johnson, and Joe Puffer. Their contributions are appreciated. The authors would also like to thank the anonymous reviewers of this article for their insightful comments, which significantly improved this article.

REFERENCES
Curtis, B., C. Weber, B. Layman, and J. Puffer. 2002. Solving the challenge of quantitative management. SEPG 2002 National Conference, Phoenix, February.

Curtis, B., C. Weber, and M. Paulk. 2002. What the authors intended at levels 4 and 5. SEPG 2002 National Conference, Phoenix, February.

Layman, B., and C. Weber. 2002. Growing a mature measurement program. Applications of Software Measurement 2002 Conference, Anaheim, Calif., February.

McGarry, J., et al. 2001. Practical software measurement: Objective information for decision makers. Reading, Mass.: Addison-Wesley.

Paulk, M. C., B. Curtis, M. B. Chrissis, and C. Weber. 1993. Capability maturity model, version 1.1. IEEE Software 10, no. 4: 18-27.

Paulk, M. C., C. V. Weber, B. Curtis, and M. B. Chrissis. 1995. The capability maturity model for software: Guidelines for improving the software process. Reading, Mass.: Addison-Wesley.

SEI. 2001. CMMI for systems engineering/software engineering, version 1.1 (staged representation) (CMU/SEI-2002-TR-004). Pittsburgh: Software Engineering Institute, Carnegie Mellon University.

Wheeler, D. J., and D. S. Chambers. 1992. Understanding statistical quality control, second edition. Knoxville, Tenn.: SPC Press.

Wheeler, D. J., and S. R. Poling. 1998. Building continual improvement: A guide for business. Knoxville, Tenn.: SPC Press.

BIOGRAPHIES
Charles Weber is a process improvement director with TeraQuest and works with clients in performing process assessments and in planning and managing process improvement programs. From 1989 to 2000 he worked for the Software Engineering Institute in several different roles. He has more than 25 years' experience in software and systems engineering and management, primarily with IBM Federal Systems Company. He is one of the primary authors of the Capability Maturity Model for Software, which has become a de facto standard for software process improvement and is used in countries around the world. He is also a coauthor of the CMM Integration models. He can be reached at [email protected].

Beth Layman has more than 20 years' experience in software and systems development as an individual contributor, manager, trainer, and consultant. A published author and speaker, Layman is an authority on software measurement and quality management. She is an SEI-Authorized SW-CMM Lead Assessor, an ASQ CSQE, and a principal author of Practical Software Measurement (PSM). As a process improvement director at TeraQuest, Layman provides software process improvement-related training, assessments, and consulting support to TeraQuest clients. She can be reached at [email protected].


© 2002 American Society for Quality. Reprinted with permission.

