Case # 09-2188
MTR090189
Carolyn Kahn
Sheila McGourty
In support of:
• FY 2009 Sponsor Performance Management
Investigations (POCs: K. Buck and M. Steinbach)
• FY 2009 CEM Stakeholder-Driven Performance
Management MOIE (POCs: L. Oakley-Bogdewic and
K. Buck)
April 2009
Abstract
Performance management supports and enables achievement of an organization and/or program's
strategic objectives. It connects activities to stakeholder and mission needs. Effective
performance management focuses an organization or program to achieve optimal value from the
resources that are allocated to achieving its objectives. It can be used to communicate
management efficiencies and to show transparency of goal alignment and resource targeting,
output effectiveness, and overall value of agency outcomes or progress toward those outcomes.
This paper investigates performance management from the perspective of research and
development (R&D) organizations and programs. R&D organizations face unique performance
management challenges, including difficulty in measuring performance, lack of timely data, and
the many unknowns associated with R&D efforts, including a lack of clarity in the initial scope of
many projects. The study team researched performance management for R&D organizations in
the commercial and government sectors. The paper provides insight into performance
management as it relates to R&D, science and technology (S&T), and intelligence communities.
It examines practices of commercial and government R&D organizations and programs and
presents eleven case studies, including five from commercial industry and six from government.
Organizations are increasingly using both quantitative and qualitative measures to manage
performance and improve sustainable value. The paper provides example metrics used by
commercial and government R&D organizations and programs. Many view R&D as an essential
means to achieving increased knowledge and innovation to provide a competitive advantage.
Government organizations should choose a suite of performance metrics consistent with their
specific missions and goals.
Performance management that is properly implemented with management support and active
employee involvement is a powerful tool for the enterprise. Internally, it cultivates a holistic,
long-term view of the organization. It helps an enterprise stay focused on the attributes of success
and failure in achieving the organization's goals and delivering meaningful results. Externally, it
communicates management efficiencies, transparency of goal alignment and resource targeting,
output effectiveness, and overall value of agency outcomes or progress toward those outcomes.
R&D organizations should be allowed flexibility to design and implement a performance
management process aligned with their mission, goals, and objectives that can be systematically
implemented with management support and active employee involvement to convey the true
value of performance to the enterprise.
Acknowledgements
The authors would like to thank Mary Przewlocki for her significant research investigations and
Kevin Buck, Frank Dello Russo, Lisa Oakley-Bogdewic, and Michele Steinbach for their
guidance and helpful reviews. Their efforts contributed to this study.
Table of Contents
1 Introduction ...................................................................................................................... 1
2 Performance Management .............................................................................................. 1
2.1 Performance Management for Research and Development ....................................... 2
2.2 Performance Management for Science and Technology ............................................ 3
2.3 Performance Management and the Intelligence Community ..................................... 3
3 Practices of R&D Organizations .................................................................................... 4
3.1 Commercial Industry Practices .................................................................................. 4
3.2 Government Experiences ........................................................................................... 7
3.2.1 Army Research Laboratory .................................................................................. 8
3.2.1.1 Peer Review ................................................................................................. 8
3.2.1.2 Customer Evaluation ................................................................................... 8
3.2.1.3 Performance Measures ................................................................................ 9
3.2.1.4 Conclusion ................................................................................................... 9
3.2.2 Office of Naval Research ..................................................................................... 9
3.2.3 Navy S&T........................................................................................................... 11
3.2.4 Department of Homeland Security ..................................................................... 14
3.2.5 Department of Energy ........................................................................................ 15
3.2.6 Federal Highway Administration ....................................................................... 15
4 R&D Metrics .................................................................................................................. 17
4.1 Commercial Industry Practices ................................................................................ 17
4.1.1 Example Efficiency Measures ............................................................................ 18
4.2 Government Experiences ......................................................................................... 19
4.2.1 Peer Review ........................................................................................................ 21
4.2.2 Example Efficiency Measures ............................................................................ 22
5 Conclusion ...................................................................................................................... 30
Appendix A The Three Dimensional Value Proposition Approach ................................ 31
Appendix B Acronym List ................................................................................................... 32
Bibliography ......................................................................................................................... 35
List of Figures
Figure 7. Transition from Basic Research to Meeting Requirements ......................................... 10
Figure 8. Navy R&D Process...................................................................................................... 10
Figure 9. History of Military Critical Technology Developments ............................................. 12
Figure 10. The TRL Vector ........................................................................................................ 15
Figure 11. 3DVP Approach ........................................................................................................ 31
List of Tables
Table 1. Relative Utility of Approaches - ARL Case Study ......................................................... 9
Table 2. RD&T Performance Management Framework............................................................. 16
Table 3. Commercial Industry Example Efficiency Measures ................................................... 18
Table 4. Example Efficiency Measures ...................................................................................... 22
1 Introduction
Performance management is conducted for many types of organizations. Yet programs differ by
purpose, design, administration, budget, goals, performance, and type. One useful definition of
R&D organizations comes from the Office of Management and Budget's (OMB's) Program
Assessment Rating Tool (PART): R&D consists of "programs that focus on knowledge creation
or its application to the creation of systems, methods, materials, or technologies." 1
The MITRE Corporation (MITRE) conducted an investigation of performance management from
the perspective of R&D organizations and programs. The study team researched performance
management for R&D organizations in the commercial and government sectors. Following this
introduction, Section 2 of the paper presents background information on performance
management as it relates to R&D, science and technology (S&T), and intelligence communities.
Section 3 describes practices of commercial and government R&D organizations and programs.
Eleven case studies are presented, including five from commercial industry and six from
government. Section 4 discusses example metrics for R&D organizations. Concluding remarks
are provided in Section 5.
2 Performance Management 2
1
"Guide to the Program Assessment Rating Tool (PART)," Office of Management and Budget, January 2008.
2
Dr. Oakley-Bogdewic, Lisa, Carolyn Kahn, and Kevin Buck, "Recommendations for the Program Assessment
Rating Tool (PART)," The MITRE Corporation, Stakeholder-Driven Performance Management Mission Oriented
Investigation and Experimentation (MOIE), April 2009.
3 Colin Talbot, "Performance Management," The Oxford Handbook of Public Management, UK: Oxford
Most of these potential shortfalls with performance management can be characterized as barriers
to productivity improvements that are pervasive in the public sector. 6 Effective performance
management minimizes such potential shortfalls and also brings them to the attention of
management so that the eventual indicators of results are meaningful to both the program and its
stakeholders. Indeed, these shortfalls should not undermine the importance of performance
evaluation at organizational, program, or individual levels.
Performance management supports and enables achievement of an organization and/or program's
strategic objectives. It connects activities to stakeholder and mission needs. Effective
performance management focuses an organization or program to achieve optimal value realized
in public good outcomes from the resources and programs that support this value. It can be used
to communicate management efficiencies and to show transparency of goal alignment and
resource targeting, output effectiveness, and overall value of agency outcomes or progress
toward those outcomes. The remainder of this section provides background information on
performance management as it relates to R&D (Section 2.1), S&T (Section 2.2), and the
Intelligence Community (Section 2.3).
7
Harman, Wayne, and Robin Staton, "Science and Technology Metrics and Other Thoughts," Naval Surface
Warfare Center, Dahlgren, Virginia, July 2006.
often quantifiable, and S&TI comprises many engineers and scientists comfortable with the
processes and statistics required by performance management.
It can help create an environment where researchers and research can flourish. Performance
management can benefit the IC by questioning current practices, identifying and optimizing
processes, removing inefficiencies, and increasing employee motivation and collaboration. IC
personnel are not rewarded by a truly merit-based compensation system. Daily activities and
operational decisions can be linked to results. More effective and efficient outcomes that are
aligned internally and externally can be selected, measured, and monitored. Without sufficient
planning and effective communication of performance management requirements and
delineation of roles/responsibilities, there is a real concern that performance management will be
viewed as another reporting requirement and not a true management tool.
10
Kirchhoff, Bruce, Steven Walsh, Matt Merges, and Joseph Morabito, "A Value Creation Model for Measuring and
Managing the R&D Portfolio," Engineering Management Journal, March 2001.
11
A triangular distribution is typically used as a subjective description of a population, often in cases when only
limited data is available. It is a continuous probability distribution with a lower limit (minimum), upper limit
(maximum), and modal (most likely) value.
12
Walsh, Steven, "Portfolio Management for the Commercialization of Advanced Technologies," Engineering
Management Journal, March 2001.
Like many large and mature technology-based firms, Xerox employs a budgeting process for
R&D. Its internal process uses a key planning metric for the coming year, which it calls "R&D
intensity." This metric is defined as the planned R&D investment divided by the anticipated
revenue. The R&D intensity metric is periodically compared to competing firms, and it is kept
relatively constant year over year.
Similar to other large technology firms, Xerox organizes its R&D budget into two main parts:
product development and research laboratories. Approximately 80% of the total R&D budget is
allocated to product development and managed by the business divisions. The remaining 20% is
distributed to its research laboratories and is controlled at the corporate level.
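As a rough illustration of the two budgeting conventions described above, the sketch below computes the R&D intensity ratio and the approximately 80/20 split between product development and the research laboratories. The figures and function names are hypothetical; the paper does not report Xerox's actual numbers.

```python
# Illustrative sketch of the R&D intensity metric and the product-development /
# research-laboratory split described above. All figures are hypothetical.

def rd_intensity(planned_rd_investment: float, anticipated_revenue: float) -> float:
    """R&D intensity = planned R&D investment / anticipated revenue."""
    return planned_rd_investment / anticipated_revenue


def split_rd_budget(total_rd_budget: float, product_dev_share: float = 0.80):
    """Split the R&D budget between product development (~80%) and
    corporate research laboratories (~20%), per the text above."""
    product_development = total_rd_budget * product_dev_share
    research_laboratories = total_rd_budget - product_development
    return product_development, research_laboratories


if __name__ == "__main__":
    revenue = 15_000_000_000   # hypothetical anticipated revenue, dollars
    rd_budget = 900_000_000    # hypothetical planned R&D investment, dollars
    print(f"R&D intensity: {rd_intensity(rd_budget, revenue):.1%}")
    print("Split (product development, laboratories):", split_rd_budget(rd_budget))
```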
The annual process for determining the next year's R&D budget coincides with the corporate-wide
budget activities and is tightly aligned with projected revenues and profits. The specific R&D
budget is created by categorizing those requests that fall under the scope of product development
versus research laboratories, being sure to identify the number of years anticipated for the
realization of the investments. Based on the current and anticipated economic environment,
updated financial targets are established for the following year, and costs across the corporation
are adjusted to meet the new values. Unlike the model at Lucent, which ties potential R&D
investments to strategic initiatives at the outset, these cost adjustments at Xerox are tactical; any
strategic implications are reconsidered at the beginning of the next planning cycle. Any decision to
increase R&D spending is usually tied to next year's anticipated revenue, with revisions possible
depending on short-term affordability. This approach implicitly assumes that the R&D budget
follows revenue and profit growth, rather than driving it. 13
3.1.3 Hewlett-Packard
Hewlett-Packard is the focus of a study on new product revenue and the link between product
innovation activities and revenue growth. The study defines the process by which a company
converts internal resources (e.g., labor, materials) into products, the products are consumed by its
customer base, the company earns revenue, and the revenue can then be reinvested into current
operations as well as future R&D for innovative and new product lines. The continual investment
in future product lines is a key activity that serves to establish a constant stream of revenue for
the company even as older product lines are phased out of production.
Three factors are identified that drive revenue growth: the fraction of revenue invested in product
innovation, new product revenue gain, and the behavior of revenue over time for a particular
business. Using a graph called a product vintage chart, a large company's revenue contributions
of a particular new-product year (or vintage) fall into a regular pattern over time, which enables a
company to determine mathematical relationships for revenue growth as a function of R&D
investment and new product revenue growth. In this way, senior managers can gain a
13
Hartmann, George, "Planning Your Firm's R&D Investment," Research Technology Management, March 2006.
clearer understanding of the interplay between product innovation, R&D investment, revenue
growth, and profitability over time. 14
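The vintage-chart idea above can be sketched with a toy model: each year's new products form a vintage whose revenue contribution decays as the products age, and the size of each new vintage depends on the fraction of revenue reinvested in product innovation. The parameters and decay form below are hypothetical illustrations, not the mathematical relationships from the Hewlett-Packard study itself.

```python
# Toy product vintage chart: each year's new products form a "vintage" whose
# revenue contribution decays with age. Parameters are hypothetical and do not
# come from the Hewlett-Packard study cited above.

def simulate_revenue(years: int = 8,
                     initial_revenue: float = 100.0,
                     innovation_fraction: float = 0.10,  # share of revenue reinvested in innovation
                     new_product_yield: float = 3.0,     # first-year revenue per dollar of R&D
                     decay: float = 0.7):                # year-over-year decay of a vintage's revenue
    vintages = [(initial_revenue, 0)]      # treat the existing product line as vintage 0
    revenue_history = [initial_revenue]
    for year in range(1, years + 1):
        rd_investment = innovation_fraction * revenue_history[-1]
        vintages.append((new_product_yield * rd_investment, year))  # this year's new vintage
        revenue = sum(first_year_revenue * decay ** (year - launch_year)
                      for first_year_revenue, launch_year in vintages)
        revenue_history.append(revenue)
    return revenue_history


if __name__ == "__main__":
    for year, revenue in enumerate(simulate_revenue()):
        print(f"year {year}: revenue {revenue:,.1f}")
```

In this toy model, raising the innovation fraction shifts the revenue curve upward over time, which is the qualitative relationship the vintage chart is used to expose.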
3.1.4 IBM
Many companies believe there is a strong correlation between future revenue growth and internal
investments made in R&D. Some argue that R&D spending should increase, regardless of the
specific investment. However, there is some level of R&D spending that will not yield additional
revenue return. John Armstrong, former Vice President of Research and Technology at IBM,
claims "you can spend too much on R&D." 15
In an attempt to quantify the value of its e-business initiatives, IBM established the Risk and
Opportunity Assessment process to assist in selecting and prioritizing e-business initiatives.
Within this process, IBM uses the Value Chain Modeling Tool to analyze and model the value
chain of its enterprise. This internal IBM approach has been successfully used to improve the
financial and operating performance of several of its business units.16
The Risk and Opportunity Assessment process includes the following stages:
1. Collection of data about the R&D initiative: Includes any background information on the
project as well as documented assumptions. This results in a data collection plan, definitions
of data requirements, and the actual collection of data.
2. Modeling and Analysis: A baseline model is built, defining key financial and operational
drivers for the initiative as well as highlighting various scenarios which could positively or
negatively impact the success of the project.
3. Development: A cost-benefit analysis is performed, potential solutions are prioritized, and a
final choice is made.
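A schematic sketch of these three stages follows, reduced to a simple risk-adjusted cost-benefit ranking. The data fields and scoring rule are hypothetical; they are not IBM's Value Chain Modeling Tool or its actual Risk and Opportunity Assessment criteria.

```python
# Schematic sketch of the three Risk and Opportunity Assessment stages listed
# above, reduced to a net-benefit ranking. Field names and the scoring rule are
# hypothetical illustrations, not IBM's actual tooling.
from dataclasses import dataclass


@dataclass
class Initiative:
    name: str
    expected_benefit: float   # Stage 1: collected data and documented assumptions
    expected_cost: float
    downside_benefit: float   # Stage 2: benefit under a pessimistic modeled scenario


def prioritize(initiatives, risk_weight: float = 0.5):
    """Stage 3: cost-benefit analysis and prioritization.
    The score blends expected and downside benefit, net of cost."""
    def score(i: Initiative) -> float:
        blended = (1 - risk_weight) * i.expected_benefit + risk_weight * i.downside_benefit
        return blended - i.expected_cost
    return sorted(initiatives, key=score, reverse=True)


if __name__ == "__main__":
    candidates = [
        Initiative("e-procurement portal", expected_benefit=12.0,
                   expected_cost=5.0, downside_benefit=6.0),
        Initiative("customer self-service site", expected_benefit=9.0,
                   expected_cost=3.0, downside_benefit=7.0),
    ]
    for choice in prioritize(candidates):
        print(choice.name)
```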
14
Patterson, Marvin L., "From Experience: Linking Product Innovation to Business Growth," Journal of Product
Innovation Management, 1998.
15
Hartmann, George, "Planning Your Firm's R&D Investment," Research Technology Management, March 2006.
16
Nassar, Ayman, "A System-Based Approach for Defining IT Operations Value Proposition," Management
Science and Engineering, October 2006.
3.2.1 Army Research Laboratory 17
The Army Research Laboratory (ARL) developed a performance measurement approach by
asking the question, "What information does the stakeholder really want to know from a
performance evaluation system, beyond what the ultimate outcomes and impacts of the research
will be?" ARL determined that its stakeholders want information that will aid the in answering
three questions:
1. Is the work relevant? Does anyone care about the effort? Is there a target or a goal, no
matter how distant, to which the sponsor can relate?
2. Is the program productive? Is the program moving toward a goal, or at least delivering a
product to its customer in a timely manner?
3. Is the work of the highest quality? Can we back up the claim to be a world-class research
organization doing world-class work?
To answer these questions, ARL used a combination of peer review, customer evaluation, and
performance measures.
3.2.1.1 Peer Review
According to the Organisation for Economic Co-operation and Development, peer review is the
judgment of scientific merit by other scientists working in, or close to, the field in question. It is
premised upon the assumption that only an expert - with scientific knowledge about the cognitive
development of the field, its research agenda, and the practitioners within it - is capable of
making certain decisions.
ARL established a peer review group called the ARL Technical Assessment Board (TAB). TAB
membership consists of 15 world-renowned scientists and engineers. Under the TAB, ARL has
six panels, each with six to seven members. The purposes of the TAB are three-fold: (1) to
review the scientific and technical quality of ARL's program; (2) to make an assessment on the
state of ARL's facilities and equipment; and (3) to appraise the preparedness of the technical
staff. The TAB assesses one-third of the ARL program each year with results forwarded to
senior management within the Army and the Department of Defense (DOD). The primary focus
of the peer review process is in answering the question, "Is the work of the highest quality?"
Data collection and analysis is in the form of an annual report based on the TAB review. The
qualitative nature of the review and validation of the data are a concern, but the independence of
the TAB from the ARL program minimizes any biases.
17
"Performance Measurement of Research and Development (R&D) Activities, Oak Ridge Associated Universities,
2005.
expectations, if they were delivered in a timely fashion, and if the products performed as needed.
For fundamental scientific research, in which the stakeholder is not clearly defined, the
laboratory director provides the needed feedback. ARL does not obtain feedback from senior
management since ARL does not deliver any tangible products to the Department of the Army
senior management. Instead, a Stakeholders' Advisory Board (SAB) meets once each year to
provide ARL with feedback needed to evaluate its performance. The SAB is chaired by the
Commanding General, and it determines the degree to which the ARL program is effective along
several dimensions (e.g., mission vs. customer funding, in-house vs. contractual work, and near-
term vs. far-term emphasis).
3.2.1.4 Conclusion
The following table depicts the relative utility of peer review, customer evaluation, and
performance measures in answering the three stakeholder questions for evaluating R&D
performance.
Table 1. Relative Utility of Approaches - ARL Case Study
Relevance Productivity Quality
Peer Review - * +
Customer Evaluation * * *
Performance Measures + + *
+ = Very Useful * = Somewhat Useful - = Less Useful
Overall, ARL gives performance measures the highest ratings, followed by customer evaluation,
and peer review.
18
Kostoff, Dr. Ronald N., "Science and Technology Metrics," Office of Naval Research.
Figure 7. Transition from Basic Research to Meeting Requirements
The Chief of Naval Research in 2004, Rear Admiral Jay M. Cohen, described ONR's basic
research investment strategy as “planting a thousand flowers, to get 100 projects, three
prototypes, and one profit-maker.” 19
Figure 8 illustrates the Navy's process of matching the capabilities to be developed through R&D
to those that meet end user requirements.
19
Interview quoted in Sea Power, February 2004, cited by Silberglitt, Richard, Lance Sherry, Carolyn Wong,
Michael Tseng, Emile Ettedgui, Aaron Watts, Geoffrey Stothard, "Portfolio Analysis and Management for Naval
Research and Development," RAND Corporation, 2004.
The RAND Corporation's PortMan R&D decision framework has been adapted to support ONR's
R&D decision-making. 20 This tool computes the expected value of an R&D project as the
product of three factors: value to the military of the capability sought through R&D, the extent
to which the performance potential matches the level required to achieve the capability, and the
project's transition probability. PortMan does not rely on the expected value as a point solution
but, rather, includes an estimate of the uncertainty in each factor and its estimated direction over time.
Evaluation is based on best current information and tracking over time. PortMan has been used
in a case study.
This approach incorporates anchored scales, which are scales that include explicit descriptions of
the requirements or thresholds for assigning particular values. By requiring the evaluators to
answer specific questions concerning capability, performance potential, and transition
probability, the PortMan framework collects and records information needed to analyze the
positive and negative aspects of each R&D project, and to facilitate discussion and analysis of
possible investment strategies.
The approach also incorporates uncertainty to estimate expected value components, including
capability, performance potential, and transition probability. R&D investment strategies attempt
to balance the risk that R&D projects will fail to meet their objectives against their potential payoff.
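A minimal Monte Carlo sketch of this expected-value idea follows, assuming triangular distributions (see footnote 11) for the three factors on a 0-1 anchored scale. The distribution parameters are hypothetical and the scoring is simplified; the actual PortMan anchored scales and questionnaires are not reproduced here.

```python
# Monte Carlo sketch of a PortMan-style expected value: project value =
# (military value of the capability) x (match of performance potential to the
# required level) x (transition probability). Triangular distributions (see
# footnote 11) stand in for evaluator uncertainty; the parameters below are
# hypothetical, not values from the RAND framework.
import random


def _reorder(triple):
    low, mode, high = triple
    return low, high, mode  # random.triangular takes (low, high, mode)


def project_value_samples(capability, performance_match, transition_prob, n=10_000):
    """Each argument is a (low, mode, high) triple on a 0-1 anchored scale."""
    return [random.triangular(*_reorder(capability))
            * random.triangular(*_reorder(performance_match))
            * random.triangular(*_reorder(transition_prob))
            for _ in range(n)]


if __name__ == "__main__":
    samples = sorted(project_value_samples(capability=(0.5, 0.7, 0.9),
                                           performance_match=(0.4, 0.6, 0.8),
                                           transition_prob=(0.2, 0.5, 0.7)))
    mean = sum(samples) / len(samples)
    p10, p90 = samples[len(samples) // 10], samples[9 * len(samples) // 10]
    print(f"expected value ~ {mean:.2f}; 10th-90th percentile: {p10:.2f} to {p90:.2f}")
```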
20
The purpose of PortMan is to evaluate a defined group of actual or proposed projects and to provide a means for
creating a portfolio from them that maximizes the value of R&D investments. It does not generate an absolute score
for the total portfolio that could be used to compare to portfolios of other projects or to proportionally allocate funds
between portfolios of different projects.
21
Harman, Wayne, and Robin Staton, "Science and Technology Metrics and Other Thoughts," Naval Surface
Warfare Center, Dahlgren, Virginia, July 2006.
Figure 9. History of Military Critical Technology Developments
The remainder of this section discusses further findings of the Navy S&T study team.
The loss of institutional technical competence leads to failure. For instance, the Space Shuttle
disaster has been attributed to the loss of technical in-house competence, which was contracted
out due to budgetary reasons.
S&T projects never go according to plan. The results of research cannot be placed on a time
schedule. Dollars invested in research in times of peace may mean the life of the nation when it
goes to war. Acquisition and operational managers often focus on current issues, not what the
future will need.
S&T metrics can be defined and collected in response to specific questions, but the program or
organization will likely get what it chooses to measure. For example, if the program or
organization chooses to measure the numbers of published papers and patent applications, then
there will surely be papers published and patents applied for, perhaps at the expense of other,
more useful results.
The immediate ROI for Navy S&T is its contribution to the quality and development of its
people, who will determine future success and failure. A competent government technical
workforce requires significant tasking and responsibility in similar disciplines over multiple
years. This supports in-house technical authority through continuity of core technical
competencies. The availability of "hands-on" S&T projects helps attract, recruit, and retain the
right talent. S&T experience is correlated with the probability that an individual is or will
become a senior technical manager or leader. Military preparedness is a continuous function.
The intelligence gained from S&T investment is available on-demand. In-house S&T enables
the recruiting, training, and retention of a technically competent scientific and engineering
workforce. Real-time connections between S&T workforce and acquisition programs enable
focused technology for the warfighter. The in-house workforce is highly responsive because of
its ability to innovate and its knowledge of Naval systems. The S&T workforce tends
to be agile and adaptive, bringing vision and innovation to future Naval capabilities. New and
emerging S&T applications can be focused to support Navy fleet needs. The in-house S&T
workforce can anticipate and provide capabilities to stay ahead of future threats and mitigate
risk.
Transitions and speed of transition are not significant measures of S&T performance. They may
be better measures of the entire Research, Development, Test, and Evaluation (RDT&E)
acquisition process. S&T provides enabling technologies. It is a shopping list. Acquisition
programs must provide funds for further development and integration of the technology into their
systems. S&T programs do not have such funding and, therefore, cannot control transitions.
A better measure of S&T is how well the Navy is addressing current and future Navy needs, and
how prepared the workforce is to address those needs. The size of the Navy S&T budget and in-
house workforce should be determined by what it needs to do, i.e., what Navy capabilities need
to be enabled.
The nature of the evaluation may change depending on organizational management. The Chief
of Naval Operations wants to know the Navy's benefit from its investment. The S&T Director at
Dahlgren Division, who manages the Division investments, is more concerned with the quality
and appropriateness of the project selections.
Return on the Navy's S&T investment is frequently requested. However, the report concluded
that ROI for S&T could not be quantified in fiscal terms because of the long delay between the
S&T effort and when the technology is actually incorporated into a Navy system, which could be
decades later.
Furthermore, the Navy does not accrue a financial benefit for a successful S&T investment,
unlike industry. Some may readily compare outcomes of S&T investments in a given year to the
$2 billion cost. In actuality, the ROI is difficult to quantify. S&T capability acts as an additional
form of deterrence against adversaries. The ability to learn faster than opponents may be the
only sustainable competitive advantage. ROI must incorporate the resulting state of readiness of
the technical workforce to respond to recognized capability gaps, solve specific technical
problems, and create entirely new capabilities. It also should incorporate risk reduction
throughout the acquisition process, including the following risks: technological surprise;
acquiring expensive, unreliable, maintenance intensive systems; delaying program execution due
to immature technologies; failing to recognize future threats or needed capabilities; and
development risk. A competent workforce is the near-term Navy ROI for S&T. The Navy
benefits from a robust S&T program through its vision to predict future Naval needs and risk
reduction. As the Defense Advanced Research Projects Agency (DARPA) notes:
None of the most important weapons transforming warfare in the 20th century -
the airplane, tank, radar, jet engine, helicopter, electronic computer, not even the
atomic bomb [or unmanned systems, stealth, global positioning system, and
Internet technologies] - owed its initial development to a doctrinal requirement or
request of the military… If they don't know what to ask for then someone has to
tell them what they need. This is ROI.
3.2.4 Department of Homeland Security22
The S&T Directorate of the Department of Homeland Security (DHS) functions as the nation's
homeland security research, development, test, and evaluation manager for S&T. The
Directorate allocates 10% of its S&T funding to higher-risk innovation which, if successful, will
provide potentially game-changing technologies and systems in one to five years - much quicker
and with greater impact than incremental improvement typical in most programs. Within this
portfolio it allocates about one-tenth (or 1% of its total S&T budget) to truly high-risk efforts,
which are likely to fail. If successful, they will have profound impacts, and even projects that
fail will often result in enhanced understanding to improve subsequent basic and applied research
efforts to lead to breakthrough and leap-ahead capabilities. Another 50% of the S&T Directorate's
funding is allocated to the transition of lower-risk projects dedicated to satisfying DHS customer-
defined capability needs, with spiral development, within three years. The remainder of the
annual S&T program includes specially mandated programs and projects.
To help meet real-world requirements and deliver effective and affordable technologies, the
Directorate developed a customer-focused and output-based risk analysis and requirements
assessment architecture. The strategy-to-task framework directly links programs and initiatives to
specific strategic goals and customer requirements. The Directorate's management and oversight
process tracks success of product transition in terms of three objective metrics: project cost,
schedule, and technological readiness.
For cost, the Directorate aims to accurately estimate and track the RDT&E cost of a technology
or system. It "weeds" out under-performing projects. For its schedule metric, the Directorate
establishes detailed timelines, action plans, and milestones to monitor each project's progress.
Frequent program reviews and internal assessments enable early correction of problems. The
Transition Office also formally elicits customer feedback from DHS components. For technology
readiness, the Directorate uses Technology Readiness Levels (TRLs) for systematic, periodic
assessment of the maturity of a specific technology or system, as shown in Figure 10. The TRLs
also support determining whether a capability solution is ready to be transitioned to the field or
should be modified or discarded.
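A minimal sketch of tracking the three transition metrics named above (cost, schedule, and technology readiness) is shown below. The record fields, variance formulas, and the TRL threshold used as a transition gate are hypothetical illustrations, not the Directorate's actual criteria.

```python
# Minimal sketch of tracking the three transition metrics named above: cost,
# schedule, and technology readiness. Field names, variance formulas, and the
# TRL gate value are hypothetical illustrations, not DHS criteria.
from dataclasses import dataclass


@dataclass
class ProjectStatus:
    name: str
    planned_cost: float
    actual_cost: float
    planned_months: int
    elapsed_months: int
    trl: int  # Technology Readiness Level, 1-9

    def cost_variance(self) -> float:
        return (self.actual_cost - self.planned_cost) / self.planned_cost

    def schedule_variance(self) -> float:
        return (self.elapsed_months - self.planned_months) / self.planned_months

    def ready_to_transition(self, min_trl: int = 7) -> bool:
        """Example gate: consider field transition once a TRL threshold is met."""
        return self.trl >= min_trl


if __name__ == "__main__":
    project = ProjectStatus("cargo screening prototype",
                            planned_cost=4.0, actual_cost=4.6,
                            planned_months=24, elapsed_months=26, trl=6)
    print(f"cost variance {project.cost_variance():+.0%}, "
          f"schedule variance {project.schedule_variance():+.0%}, "
          f"ready to transition: {project.ready_to_transition()}")
```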
22
"Science and Technology for a Safer Nation," US Department of Homeland Security, March 2008.
Figure 10. The TRL Vector
It is critical to maintain outreach to scientists, engineers, and managers worldwide to help meet
critical needs and support innovative S&T approaches.
23
"Performance Management," US Department of Transportation, Federal Highway Administration.
The National Cooperative Highway Research Program (NCHRP) concluded that "different types
of evaluation methods are appropriate for different types of research projects" and organizations.
The RD&T Performance Management Framework chart below identifies existing performance
measures and assessment mechanisms used by unit managers. These measures and mechanisms
are integrated across management functions, enabling RD&T to manage, analyze, and integrate
information obtained from a variety of sources. The RD&T Leadership Council uses the
framework as a tool to assess unit performance measurement activities and to identify
measurement gaps.
Table 2. RD&T Performance Management Framework
Corporate Management Strategy: Leadership
Definition: Leadership focuses on how senior leaders guide the organization. It describes how leaders set direction and high-performance expectations, project a strong customer focus, and communicate clear and visible values to employees.
Related RD&T Performance Measures: Leadership Effectiveness Inventory (LEI) results; action items completed; performance plan items fulfilled; self-assessment score.
Methodology: 360-degree feedback; action agenda; performance plans; quality self-assessments.

Corporate Management Strategy: Strategic Planning
Definition: Strategic planning examines how the organization sets strategic goals and develops key action plans.
Related RD&T Performance Measures: Action items completed; self-assessment score; progress made on goals established.
Methodology: Performance plans and action agenda; quality self-assessment; lab assessments.

Corporate Management Strategy: Customer/Partner Focus
Definition: Customer and partner focus examines how the organization determines customer and market requirements and expectations.
Related RD&T Performance Measures: Percent of satisfaction with RD&T products and services; number of technology facilitation plans in place; self-assessment score; lab assessment results (to be determined (TBD)); RD&T customer survey results (TBD).
Methodology: American Customer Satisfaction Index (ACSI); Technology Innovation Network (TIN); Technology Facilitation Action Plan (TFAP); quality self-assessments; lab assessments; customer surveys.

Corporate Management Strategy: Information and Analysis
Definition: Information and analysis examines the management, effective use, and analysis of data and information to support key organization processes, to include the organization's objectives.
Related RD&T Performance Measures: Performance measurement framework; response level and content of feedback mechanisms; self-assessment score; lab assessment results (TBD).
Methodology: Performance measurement framework; ACSI, TIN; quality self-assessments; lab assessments.

Corporate Management Strategy: Human Resource Development
Definition: Human resource development and management examines how the organization enables its workforce to develop to its full potential and how the workforce is aligned with the organization's objectives.
Related RD&T Performance Measures: Self-assessment score; percent of employee satisfaction survey rating; percent of payroll spent on training and development; number of Individual Development Plans (IDPs) in place and in Learning and Development System (LAD); number of "priority 1" training needs met; number of vacancies filled; number of days positions are vacant; number of student interns (number of Grant for Research Fellowships (GRF), Summer Transportation Intern Program for Diverse Groups (STIPDG), etc.); number of outreach activities.
Methodology: Quality self-assessments; employee satisfaction survey; LADS.

Corporate Management Strategy: Process Management
Definition: Process management examines aspects of how key production, delivery, and support processes are designed, managed, and improved.
Related RD&T Performance Measures: Number of process improvements documented; lab assessment (TBD); number of contracts on time and on budget; TIN (TBD); SBIR (TBD).
Methodology: Quality self-assessments; lab assessments; project tracking system; ACSI.

Corporate Management Strategy: Business Results
Definition: Business results show the organization's performance and improvement in its key business areas: customer satisfaction, financial and marketplace performance, human resources, supplier and partner performance, and operational performance. The category also examines how the organization performs relative to competitors.
Related RD&T Performance Measures: Percent of project completion; number of success stories; research benefit (TBD).
Methodology: Track project and services delivery; RD&T success stories; pilot and case studies.
RD&T benefit assessments are largely retrospective analyses and require data collection
throughout the product development and delivery cycles to produce meaningful conclusions.
4 R&D Metrics
Section 4 provides example R&D metrics used by commercial industry and government
organizations and programs.
• Alignment with corporate strategies
• Cost
• Schedule
• Organizational flexibility
• Intellectual Property factors
• Market Lifecycle factors
• Fit with existing product portfolio
• Market Share
Since R&D projects are unique compared to other corporate spending projects, many firms have
expanded the list of metrics used to measure R&D performance instead of relying on the
traditional use of short-term financial metrics alone.
Table 3. Commercial Industry Example Efficiency Measures
Agency or Organization    Efficiency Measure
Alcoa Return-on-investment calculation:
Variable cost improvement
Margin impact from organic growth
Capital avoidance
Cost avoidance
Annual impact of these four metrics over 5-year period becomes numerator; denominator is total
R&D budget
Metric is used most often to evaluate overall value of R&D program and current budget focus
(Atkins 2007)
Alcoa Time (Atkins 2007)
Alcoa Cost (Atkins 2007)
Alcoa Customer demand (Atkins 2007)
Alcoa Risk (Atkins 2007)
Alcoa Impact on business (Atkins 2007)
Alcoa Impact on customers (Atkins 2007)
Alcoa Location (Atkins 2007)
Alcoa Intellectual property (Atkins 2007)
Alcoa Aggregate R&D expenditures by laboratory group or by identifiable programs and publish value
capture or “success rate” for each on annual basis (Atkins 2007)
Alcoa ROI on R&D spending; success rate of launched products (Atkins 2007)
Dow Chemical Publications; participation and leadership in scientific community (collaborative research efforts;
trade associations; ILSI-HESI; external workshops; adjunct faculty positions, journal or book
editors, professional societies) (Bus 2007)
24
"Evaluating Research Efficiency in the U.S. Environmental Protection Agency," National Research Council,
2008.
IBM ROI on Summer Internship Program and Graduate Fellowship Program: what percentage of
participants return as regular IBM research employees?
IBM “Bureaucracy Busters” Initiative to reduce bureaucracy in laboratory support, information-
technology support, HR processes, and business processes (Kenney 2007)
IBM Tracking of patent-evaluation process (Kenney 2007)
IBM Customer-satisfaction surveys for support functions to evaluate effect of service reductions
(Kenney 2007)
IBM Measurement of response time and turnaround for external contracts (Kenney 2007)
IBM Measurement of span of responsibility for secretarial support (Kenney 2007)
Procter & Time saved in product development (Daston 2007)
Gamble
Procter & Increased confidence about safety (Daston 2007)
Gamble
Procter & External relations benefits (although not quantifiable) (Daston 2007)
Gamble
Table 4. Example Efficiency Measures
Agency or Organization    Program    Year    Efficiency Measure
EPA Endocrine Disruptors 2004 (OPPTS) Cost per labor hour of contracted validation studies
(combined EPA PART) (EPA, unpublished material, April 23, 2007)
EPA EPA Human Health 2005 Average time (in days) to process research-grant proposals from
Research RFA closure to submittal to EPA's Grants Administration
Division while maintaining a credible and efficient competitive
merit-review system (as evaluated by external expert review)
(EPA, unpublished material, April 23, 2007)
EPA Land Protection and 2006 Average time (in days) for technical support centers to process
Restoration Research and respond to requests for technical document review,
statistical analysis, and evaluation of characterization and
treatability study plans (EPA, unpublished material, April 23,
2007)
EPA Water Quality Research 2006 Number of peer reviewed publications per FTE (EPA,
unpublished material, April 23, 2007)
EPA Human Health Risk 2006 Average cost to produce Air Quality Criteria/Science
Assessment Program Assessment documents (EPA, unpublished material, April 23,
2007)
EPA EPA Ecological 2007 Percentage variance from planned cost and schedule (approved
Research 3/13/07) (EPA, unpublished material, April 23, 2007)
EPA Drinking Water 2007 Percentage variance from planned cost and schedule (approved
Research 3/13/07) (EPA, unpublished material, April 23, 2007)
25
"Evaluating Research Efficiency in the U.S. Environmental Protection Agency," National Research Council,
2008.
EPA PM Research 2007 Percentage variance from planned cost and schedule (approved
3/13/07) (EPA, unpublished material, April 23, 2007)
EPA Global Change 2007 Percentage variance from planned cost and schedule (approved
Research 3/13/07) (EPA, unpublished material, April 23, 2007)
EPA Pollution Prevention 2007 Percentage variance from planned cost and schedule (approved
Research 3/13/07) (EPA, unpublished material, April 23, 2007)
DOD Defense Basic Research 2002 Long-term measure: portion of funded research
chosen on basis of merit review; reduce non-merit-
reviewed and determined projects by half in 2
years (from 6.0% to 3.0%) (OMB 2007)
DOE Advanced Simulation and 2002 Annual average cost per teraflops of delivering,
Computing operating, and managing all Stockpile Stewardship
Program (SSP) production systems in given fiscal
year (OMB 2007)
DOE Coal Energy Technology 2005 Administrative costs as percentage of total
program costs (OMB 2007)
DOE Advanced Fuel Cycle Initiative 2003 Program direction as percentage of total R&D
program funding (OMB 2007)
DOE Generation IV Nuclear Energy 2003 Program direction as percentage of total R&D
Systems Initiative program funding (OMB 2007)
DOE National Nuclear Security 2005 Cumulative percentage of active research projects
Administration: Nonproliferation for which independent R&D peer assessment of
and Verification Research and project's scientific quality and mission relevance
Development has been completed during second year of effort
(and again in each later 3-year period for projects
found to be of merit) (OMB 2007)
DOE Nuclear Power 2010 2003 Program direction as percentage of total R&D
program funding (OMB 2007)
DOE Basic Energy Sciences/ 2006 Average achieved operation time of scientific user
Biological and Environmental facilities as percentage of total scheduled annual
Research operation time; cost-weighted mean percentage
variance from established cost and schedule
baselines for major construction, upgrade, or
equipment procurement projects (cost variance
listed first) (OMB 2007)
DOE Hydrogen Program 2003 In 2003, EERE Hydrogen Program had about 130
fuel-cell and hydrogen production research
projects that were subject to in-progress peer
review by independent experts
For all reviewed projects, reviewers provided
written comments and numerical ratings on a scale of 1-4, with 4 being highest,
with resulting scores ranging from 2.2 to 3.9
Program used review results to make important
decisions to continue or discontinue projects
Research efficiency = 1 − [no. of projects discontinued / (total no. of projects
reviewed − no. of projects judged as completed − earmark projects)] (Beschen 2007)
DOI U.S. Geological Survey – 2005 Average cost per gigabyte of data available
Biological Information through servers under program control (EPA,
Management and Delivery unpublished material, 2006)
DOI U.S. Geological Survey – 2005 Average cost per sample for selected high-priority
Biological Research & environmentally available chemical analyses
Monitoring (EPA, unpublished material, 2006)
DOI U.S. Geological Survey – 2007 Average cost of systematic analysis or
Energy Resource Assessments investigation (dollars in millions) (EPA,
unpublished material, 2006)
DOI U.S. Geological Survey – 2003 Average cost of systematic analysis or
Mineral Resource Assessment investigation; average cost per analysis allows
comparisons among projects to determine how
efficiencies can be achieved (EPA, unpublished
material, 2006)
DOI U.S. Geological Survey – Water 2004 Average cost per analytic result, adjusted for
Resources Research inflation, is stable or declining over 5-year period
(EPA, unpublished material, 2006)
DOI U.S. Geological Survey – Water 2004 Percentage of daily stream flow measurement sites
Information Collection and with data that are converted from provisional to
Dissemination final status within 4 months of day of collection
(EPA, unpublished material, 2006)
DOI U.S. Geological Survey – 2005 Percentage improvement in detectability limits for
Biological Research & selected high-priority environmentally available
Monitoring chemical analytes (EPA, unpublished material,
2006)
DOI U.S. Geological Survey – 2003 Percentage of total cost saved through partnering
Geographic Research, for data collection of high-resolution imagery
Investigations, and Remote (EPA, unpublished material, 2006)
Sensing
DOI Bureau of Reclamation – 2003 Each year, increase in R&D cost-sharing per
Science and Technology reclamation R&D program dollar will contribute
Program toward achieving long-term goal of 34%
cumulative increase over 6-year period (OMB
2007)
DOT Highway Research and 2004 Annual percentage of all research projects
Development/Intelligent completed within budget (OMB 2007)
Transportation Systems
DOT Highway Research and 2004 Annual percentage of research-project deliverables
Development/Intelligent completed on time (OMB 2007)
Transportation Systems
DOT Railroad Research and 2004 Organizational Excellence: Percentage of projects
Development completed on time (OMB 2007)
Department National Assessment for 2003 Timeliness of NAEP data for Reading and
of Education Educational Progress Mathematics Assessment in support of President's
No Child Left Behind initiative (time from end of
data collection to initial public release of results
for reading and mathematics assessments) (EPA,
unpublished material, 2006)
Department National Center for Education 2003 NCES will release information from surveys
of Education Statistics within specified times; NCES collected baseline
information in 2005, examining time-to-release for
31 recent surveys ( National Assessment of
Educational Progress releases not included in these
figures) (EPA, unpublished material, 2006)
DHHS National Center for Health 2005 Number of months for release of data as measured
Statistics by time from end of data collection to data release
on Internet (OMB 2007)
DHHS NIH Extramural Research By 2013, provide greater functionality and more
Programs streamlined processes in grant administration by
continuing to develop NIH Electronic Research
Administration System (eRA)
(FY 2004) Develop plan to integrate OPDIVs into
eRA
(FY 2005) Integrate 50% of eligible DHHS
OPDIVs as eRA users for administration of
research grants
(FY 2006) Integrate 100% of eligible DHHS
OPDIVs as eRA users for administration of
research grants
Conversion of business processes
(FY 2005) 25% of business processes done
electronically
(FY 2006) 40%
(FY 2007) 55%
(FY 2008) 80% ( Duran 2007)
DHHS NIH Intramural Research 2005 Reallocation of laboratory resources based on
Program extramural reviews by Boards of Scientific
Counselors (OMB 2007)
DHHS Bioterrorism: CDC Intramural 2006 Decrease annual costs for personnel and materials
Research development with development and continuous
improvement of budget and performance
integration information system tools (OMB 2007)
DHHS NIOSH 2004 Percentage of grant award or funding decisions
made available to applicants within 9 months of
application receipt or deadline date while
maintaining credible and efficient two-level peer-
review system (OMB 2007)
DHHS NIOSH Not used Determine future human capital resources needed
currently to support programmatic strategic goals, focusing
on workforce development or training and
succession planning (Sinclair 2007)
DHHS NIOSH 2007 Percentage of grant award or funding decisions
made available to applicants within 9 months of
application receipt or deadline date while
maintaining credible and efficient two-level peer-
review system (Sinclair 2007)
DHHS Extramural Construction By 2010, achieve average annual cost savings of
managing construction grants by expanding use of
electronic project-management tools that enhance
oversight and 20-year use monitoring
(Each FY) Achieve average annual cost of
managing construction grants (Duran 2007)
DHHS HIV/AIDS Research By 2010, use enhanced AIDS Research
Information System (ARIS) database to more
efficiently conduct portfolio analysis to invest in
priority AIDS research
(FY 2005) Improve existing ARIS by converting
its mainframe system into Web-based system
designed by OAR and IC representatives in
consultation with a contractor
(FY 2006, FY 2007, FY 2008) Track, monitor, and
budget for trans-NIH AIDS research, using
enhanced ARIS database, to more efficiently
conduct portfolio analysis of 100% of expiring
grants to determine reallocation of resources for
priority research (Duran 2007)
DHHS Research Training Program 2006 By 2012, ensure that 100% of trainee appointment
forms are processed electronically, to enhance
program management (OMB 2007)
NASA Human Systems Research and 2005 Time between solicitation and selection in NASA
Technology Research Announcements (OMB 2007)
NASA Solar System Exploration 2006 Percentage of budget for research projects
allocated through open peer-reviewed competition
(OMB 2007)
NASA Solar System Exploration 2006 Number of days within which NASA Research
Announcement research grants for program are
awarded, from proposal due date to selection, with
goal of 130 days (OMB 2007)
NASA Original Uniform Measures Complete all development projects within 110% of
cost and schedule baseline
Peer-review and competitively award at least 80%,
by budget, of research projects
Reduce time within which 80% of NRA research
grants are awarded, from proposal due date to
selection, by 5% per year, with goal of 130 days
Deliver at least 90% of scheduled operating hours
for all operations and research facilities (Pollitt
2007)
NASA 2007 Year-to-year reduction in Space Shuttle sustaining
engineering workforce for flight hardware and
software while maintaining safe flight
Reduction in ground operations cost (through
2012) of Constellation Systems based on
comparison with Space Shuttle Program
Number of financial processing steps and time to
perform year-end closing
Number of hours required for NASA personnel to
collect, combine, and reconcile data of contract-
management type for external agency reporting
purposes (Pollitt 2007)
NASA 2007 On-time availability and operation of Aeronautics
Test Program ground test facilities in support of
research, development, test, and engineering
milestones of NASA and DOD programs from
both schedule and cost perspectives
Operational cost per minute of Space Network
support of missions
Ratio of Launch Services Program cost per mission
to total spacecraft cost
Number of people reached via e-education
technologies per dollar invested (Pollitt 2007)
NOAA Climate Program 2004 Volume of data taken in annually and placed into
archive (terabytes) (EPA, unpublished material,
2006)
NOAA Ecosystem Research 2005 Cost per site characterization (OMB 2007)
NOAA Ecosystem Research 2005 Percentage of grants awarded on time (OMB 2007)
NSF Fundamental Science and 2005 Percentage of award decisions made available to
Engineering Research applicants within 6 months of proposal receipt or
deadline date while maintaining credible and
efficient competitive merit-review system as
evaluated by external experts (OMB 2007)
NSF Research on Biocomplexity in 2004 Percentage of award decisions made available to
the Environment applicants within 6 months of proposal receipt or
deadline date while maintaining credible and
efficient competitive merit-review system as
evaluated by external experts (OMB 2007)
NSF Construction and Operations of 2003 Percentage of construction acquisition and upgrade
Research Facilities projects with negative cost and schedule variances
of less than 10% of approved project plan (EPA,
unpublished material, 2006)
NSF Polar Research Tools, Facilities 2004 Percentage of construction cost and schedule
and Logistics variances of major projects as monitored by
earned-value management (OMB 2007)
NSF Support for Research Institutions 2004 Percentage of award decisions made available to
applicants within 6 months of proposal receipt or
deadline date while maintaining credible and
efficient competitive merit-review system as
evaluated by external experts (OMB 2007)
NSF Support for Small Research 2004 Percentage of award decisions made available to
Collaborations applicants within 6 months of proposal receipt or
deadline date while maintaining credible and
efficient competitive merit-review system as
evaluated by external experts (OMB 2007)
NSF Construction and Operations of 2003 Percentage of operational facilities that keep
Research Facilities scheduled operating time lost to less than 10%
(OMB 2007)
NSF Federally Funded Research and 2005 Percentage of operational facilities that keep
Development Centers scheduled operating time lost to less than 10%
(OMB 2007)
NSF Information Technology Qualitative assessment by external experts that
Research there have been significant research contributions
to software design and quality, scalable
information infrastructure, high-end computing,
workforce, and socioeconomic impacts of IT
(EPA, unpublished material, 2006)
NSF Polar Research Tools, Facilities Percentage of person-days planned for Antarctic
and Logistics research for which program is able to provide
necessary research support (EPA, unpublished
material, 2006)
NSF Polar Research Facilities and Research facilities: keep construction cost and
Support schedule variances of major polar facilities
projects as monitored by earned-value
management at 8% or less. Research support:
provide necessary research support for Antarctic
researchers at least 90% of time (OMB 2007)
NSF Support for Individual External validation of "significant achievement" in
Researchers attracting and preparing U.S. students to be highly
qualified members of global S&E workforce (EPA,
unpublished material, 2006)
NSF Science and Engineering Centers 2006 Percentage of decisions on preproposals that are
Program merit-reviewed and available to Centers Program
applicants within 5 months of preproposal receipt
or deadline date (OMB 2007)
NSF Time to decision for proposals: for 70% of
proposals submitted to National Science
Foundation, inform applicants about funding
decisions within 6 months of proposal receipt or
deadline date or target date, whichever is later
(Tsuchitani 2007)
NSF Facilities cost, schedule, and operations: keep
negative cost and schedule variances at less than
10% of approved project plan for 90% of facilities;
keep loss of operating time due to unscheduled
downtime to less than 10% of total scheduled
operating time for 90% of operational facilities
(Tsuchitani 2007)
USDA USDA Research: Economic Percentage of construction acquisition and upgrade
Opportunities for Producers projects with negative cost variance of less than
10% of approved project plan (EPA, unpublished
material, 2006)
USDA Economic Opportunities for 2004 Cumulative dollars saved for grant review (OMB
Producers 2007)
USDA Economic Opportunities for 2004 Proposal review time in days (OMB 2007)
Producers
USDA Research on Protection and 2005 Additional research funds leveraged from external
Safety of Agricultural Food sources (OMB 2007)
Supply
USDA Economic Research Service 2005 Index of ERS product releases per staff year (OMB
2007)
28
USDA Grants for Economic 2006 Cumulative dollars saved for grant review: dollars
Opportunities and Quality of saved reflect average salary saved by calculating
Life for Rural America number of calendar days saved annually between
receipt of proposal and date funding awarded for
competitively reviewed proposals, then multiplied
by average daily salary for CSREES employees
(OMB 2007)
USDA In-House Research for Natural 2006 Relative increase in peer-reviewed publications
Resource Base and Environment (OMB 2007)
USDA In-House Research for Nutrition 2006 Relative increase in peer-reviewed publications
and Health (OMB 2007)
29
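Several of the NSF facilities metrics above are expressed as earned-value variance thresholds (for example, keeping negative cost and schedule variances below 10% of the approved project plan). The short sketch below illustrates how such a threshold check can be computed from standard earned-value quantities; it is an illustrative example only, and the function names and project figures are hypothetical rather than drawn from any agency system.

    # Illustrative sketch (hypothetical figures): checks whether negative cost and
    # schedule variances stay within 10% of the approved project plan, using
    # standard earned-value management (EVM) quantities.

    def variance_percentages(planned_value, earned_value, actual_cost):
        """Return (cost variance %, schedule variance %) from standard EVM definitions."""
        cost_variance = earned_value - actual_cost          # CV = EV - AC
        schedule_variance = earned_value - planned_value    # SV = EV - PV
        cv_pct = 100.0 * cost_variance / earned_value       # CV% relative to work performed
        sv_pct = 100.0 * schedule_variance / planned_value  # SV% relative to work planned
        return cv_pct, sv_pct

    def within_plan(cv_pct, sv_pct, limit_pct=10.0):
        """True if neither negative variance exceeds the stated percentage threshold."""
        return cv_pct >= -limit_pct and sv_pct >= -limit_pct

    # Hypothetical project: $10.0M of work planned, $9.2M performed, at $9.5M actual cost.
    cv, sv = variance_percentages(planned_value=10.0, earned_value=9.2, actual_cost=9.5)
    print(f"CV% = {cv:.1f}, SV% = {sv:.1f}, within 10% threshold: {within_plan(cv, sv)}")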
5 Conclusion
R&D organizations and programs face unique challenges in managing performance. An R&D
environment generally supports more open-ended creativity, longer-term visions, and more
exploratory work. In this type of environment, performance is generally harder to measure,
available data is often less timely, and more unknowns exist.
However, R&D organizations and programs can establish a valuable performance management
process. Organizations must look introspectively to identify the measures of performance that will
help them achieve their goals and meet stakeholder objectives. Example metrics from other organizations
can provide a starting point for brainstorming. Those metrics that will best assess performance
and motivate effectiveness and efficiency will be specific to each organization. Peer review will
likely be a valuable process for offering performance feedback. Technical knowledge, flexibility
in allowable outcomes and timeframes, ongoing support, and true integration within an
organization's processes and culture are important attributes of effective performance management
in an R&D organization.
Based on the case studies examined in this report, organizations are increasingly using both
quantitative and qualitative measures to manage performance and improve sustainable value.
While some companies (e.g., Lucent, Hewlett-Packard) believe that R&D spending drives value
and growth, others (e.g., Xerox) manage R&D spending so that it lags value and growth creation.
Still other businesses (e.g., IBM) consider R&D spending to be uncorrelated with value and growth at
some level. Government organizations have more multi-dimensional goals than commercial
companies focused on profit. Therefore, a good performance management process is critical to
assessing and driving value in the government sector. Government R&D organizations are
evolving their own performance management processes based on goals and needs. Many (e.g.,
Navy S&T, DHS) view R&D as an essential means of achieving increased knowledge and
innovation to provide a competitive advantage over adversaries. For-profit companies rely more
heavily on financial metrics of performance, but are expanding to include other quantitative and
qualitative metrics. Government organizations choose a suite of performance metrics (e.g.,
programmatic, organizational, workforce, activity, outcome, impact, value) consistent with their
specific missions and goals.
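As a purely notional illustration of how such a suite might be organized (the categories follow the list above; the individual metric names are hypothetical examples, not drawn from any specific agency):

    # Notional metrics suite grouped by category; all metric names are hypothetical examples.
    metrics_suite = {
        "programmatic":   ["percent of milestones met on schedule", "cost variance vs. approved plan"],
        "organizational": ["percent of projects with completed peer reviews"],
        "workforce":      ["percent of staff with advanced degrees", "voluntary attrition rate"],
        "activity":       ["proposals reviewed within 6 months of receipt"],
        "outcome":        ["peer-reviewed publications per staff year"],
        "impact":         ["technologies transitioned to operational use"],
        "value":          ["external research funds leveraged per appropriated dollar"],
    }

    for category, metrics in metrics_suite.items():
        print(f"{category}: {', '.join(metrics)}")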
Performance management that is properly implemented with management support and active
employee involvement is a powerful tool for the enterprise. Internally, it cultivates a systematic,
long-term view of the organization. It helps an enterprise stay focused on the attributes of success
and failure in achieving the organization's goals and delivering meaningful results. Externally, it
communicates management efficiencies, transparency of goal alignment and resource targeting,
output effectiveness, and overall value of agency outcomes or progress toward those outcomes.
R&D organizations should be given the flexibility to design a performance management process
aligned with their mission, goals, and objectives, and to implement it systematically, with
management support and active employee involvement, so that it conveys the true value of
performance to the enterprise.
Appendix A The Three Dimensional Value Proposition Approach
The three-dimensional value proposition (3DVP) approach provides a process for measuring the
attributes of an IT operations organization, assessing the conditions under which it operates, and
examining the time frame in which the value proposition is defined. The purely technical
characteristics of a project (e.g., server uptime, network latency) and/or its financial metrics (e.g.,
return on investment, total cost of ownership) do not convey the true value of the IT investment
to the enterprise (Nassar 2006).
As illustrated in Figure 11, the 3DVP approach is based on the idea that the value of any system
depends on three main types of variables: internal variables (the attributes and characteristics of
the system), external variables (conditions and external factors) impacting the system, and a
temporal effect represented by a varying time interval. In other words, the value of a system is a
function of a set of measurable attributes of the system under different conditions at a specific
point in time, and these variables are relevant only to a particular group of stakeholders
(Nassar 2006).
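Stated in notional form (an illustrative paraphrase of this idea, not a formula taken from the cited paper), the value that a stakeholder group g derives from a system S over an evaluation interval t can be written as

    V_g(S, t) = f_g(A(S), C(S, t), t)

where A(S) denotes the measurable internal attributes of the system, C(S, t) denotes the external conditions and factors acting on it during the interval, and f_g is a valuation function specific to stakeholder group g.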
Appendix B Acronym List
SBIR Small Business Innovation Research
S&T Science and Technology
S&TI Science and Technical Intelligence
STIPDG Summer Transportation Intern Program for Diverse Groups
TAB Technical Assessment Board
TBD To Be Determined
TCO Total Cost of Ownership
TFAP Technology Facilitation Action Plan
TIN Technology Innovation Network
TRL Technology Readiness Level
UFR Unfunded Requirements
USDA United States Department of Agriculture
VCM Value Creation Model
Bibliography
Atkins, Daniel L., "Performance Management and the Intelligence Community," Department
of the Air Force, Air Command and Staff College, Air University, Maxwell Air Force Base, AL,
April 2008.
Carter, Charlie, "Methodology for Evaluating Energy R&D," US Department of Energy,
Office of Science & Technology Policy, 23 April 1997.
"Guide to the Program Assessment Rating Tool (PART)," Office of Management and Budget,
January 2008.
Harman, Wayne, and Robin Staton, "Science and Technology Metrics and Other Thoughts,"
Naval Surface Warfare Center, Dahlgren, Virginia, July 2006.
Hartmann, George, "Planning Your Firm's R&D Investment," Technology Management, March
2006.
Kirchhoff, Bruce, Steven Walsh, Matt Merges, and Joseph Morabito, "A Value Creation Model
for Measuring and Managing the R&D Portfolio," Engineering Management Journal, March
2001.
Kostoff, Ronald N., "Science and Technology Metrics," Office of Naval Research.
Nassar, Ayman, "A System-Based Approach for Defining IT Operations Value Proposition,"
Management Science and Engineering, October 2006.
Oakley-Bogdewic, Lisa, Carolyn Kahn, and Kevin Buck, "Recommendations for the
Program Assessment Rating Tool (PART)," The MITRE Corporation, Stakeholder-Driven
Performance Management Mission Oriented Investigation and Experimentation (MOIE), April
2009.
Patterson, Marvin L., "From Experience: Linking Product Innovation to Business Growth,"
Journal of Product Innovation Management, 1998.
"Performance Measurement of Research and Development (R&D) Activities, Oak Ridge
Associated Universities, 2005.
Phillips, Lucy, "Performance Management at RBS Looks Beyond Financial Results," 22 April
2009.
"The Research Value Mapping Project," Office of Basic Energy Sciences, Department of
Energy, 19 February 1999.
"Science and Technology for a Safer Nation," US Department of Homeland Security, March
2008.
Silberglitt, Richard, Lance Sherry, Carolyn Wong, Michael Tseng, Emile Ettedgui, Aaron Watts,
and Geoffrey Stothard, "Portfolio Analysis and Management for Naval Research and Development,"
RAND Corporation, 2004.
Talbot, Colin, "Performance Management," The Oxford Handbook of Public Management, UK:
Oxford University Press, 2005.
Winston, Rebecca, "Risk Management for R&D Projects? Why Bother?" PM World Today,
October 2006.