Shyu White Paper July 15 2024
Paul Solomon
Note: This revision adds a best practice from the GAO Schedule Assessment Guide and recommendations
from the Defense Business Board Business Transformation Advisory Subcommittee report, Creating a DoD
Digital Ecosystem. This revision also includes justification to eliminate the DFARS EVMS clause which is a
barrier to the digital ecosystem’s data needs and information flows regarding true schedule, technical, and
cost performance.
DoDD 5000.01, The Defense Acquisition System (DAS), includes policies to speed up delivery of products that
work as planned, e.g., products that meet the documented capability needs. However, several DoD
instructions and guides should be revised to better enable achievement of DAS objectives. Revisions will
benefit program managers (PMs) of programs with the following characteristics:
1. Use the embedded software path to develop software embedded in weapon systems.
2. Employ digital engineering (DE) metrics.
3. Employ model-based systems engineering (MBSE).
To speed up delivery of products that work, PMs need timely and accurate schedule status and situational
awareness of program execution for proactive resolution of issues impacting cost, schedule, and technical
achievement of program objectives. PMs also need situational awareness of the degree of product quality as
measured by functional completeness.
Per the DoD DE Strategy (DE Strat), expected benefits of DE include better informed decision-
making/greater insight through enhanced transparency and increased efficiency in acquisition practices.
This evolution will require engaging contracting and legal teams to streamline business and contracting
practices.
Per DoDI 5000.97, Digital Engineering:
a. The DoD will use DE methodologies, technologies, and practices across the life cycle of defense acquisition programs… engineering, and management activities.
b. As specified in DoDI 5000.88, certain programs must include a DE implementation plan in the SE plan.
2.7. DOD COMPONENT HEADS WITH ACQUISITION AUTHORITY.
(2) Provide guidance and support for program managers (PMs) to develop, validate, and maintain:
(a) Credible and coherent authoritative sources of truth (ASOT) shared with stakeholders.
(b) Digital models that accurately reflect the architecture, attributes, and behaviors of the system they
represent.
However, the current set of instructions and guides focuses on engineering, not program management, and is insufficient to enable rapid, better-informed decisions based on insight into the base measures of schedule and progress. To enhance transparency, the following documents should be revised to address a PM’s information needs for authoritative DE metrics of schedule, progress, quality, technical debt, and technical performance:
1. DE Strat
2. DAS
3. DoD Instruction 5000.87, Operation of the Software Acquisition Pathway (5000.87)
4. DoD Instruction 5000.88, Engineering of Defense Systems (5000.88)
5. DoD Instruction 5000.89, Test and Evaluation (5000.89)
6. DoD Directive 5000.59 - DoD Modeling and Simulation (M&S) Management
7. DoD Systems Engineering Guidebook (SE Guidebook)
8. DoD SE Plan Outline version 4 (SEP)
9. DoD Integrated Master Plan (IMP) and Integrated Master Schedule (IMS) Preparation and
Use Guide (IMP/IMS)
10. DoD Integrated Program Management Data and Analysis Report Implementation & Tailoring Guide
(IPMDAR Guide)
11. DOD MIL-HDBK-245E, DOD Handbook, Preparation of Statement of Work (SOW Handbook).
These DE metrics should give the PM insight into:
1. If the definitions of the technical baselines (functional, allocated, product) and, if applicable, Minimum Viable Products (MVP) and Minimum Viable Capability Releases (MVCR), will be completed on schedule.
2. If the needed capabilities, features, and functions will be delivered on schedule.
3. If the software engineering processes mitigate cost and schedule risks by identifying and removing
software-related technical debt early in development (SE Guidebook).
4. If technical performance is being assessed at all levels: component, subsystem, integrated product,
and external interfaces.
5. If the intermediate goals for tracking technical performance measures (TPM) are achieved on
schedule.
6. If the Modular Open Systems Approach (MOSA) objective of module interfaces defined by widely supported standards is achieved on schedule.
Mr. Andrew Hunter is Assistant Secretary of the Air Force for Acquisition, Technology and Logistics. In his
response to Senate Armed Services Committee (SASC) Advance Policy Questions (APQ) as nominee for that
post, on Oct. 5, 2021, he stated that, if confirmed:
I would also work closely with the Program Executive Officers to ensure all acquisition programs are on
track to meet cost, schedule, and performance criteria, and take appropriate actions where needed
when this is not the case.
I will perform active and close oversight of the B-21 program….to ensure the B-21 program cost,
schedule, and performance stays on track.
I will review the Presidential Aircraft Replacement program in detail…to ensure the program is,
and remains, on track to meet cost, schedule, and performance criteria.
I will work with the acquisition workforce leadership to continue emphasizing the pivot to DE and
modern software development by leveraging commercial practices and standards.
In his response, he also stated that “I believe that digital acquisition practices such as DE, open systems
architecture, and agile software development are best practices in these areas...If confirmed, I will
ensure the acquisition community is closely engaged with operators in pursuing technology and continues
to employ best practices as we develop capability to meet evolving threats.”
On March 22, 2022, the Hon. William LaPlante appeared before the SASC as nominee for Under Secretary of Defense for Acquisition and Sustainment. In his response to APQs, he stated his positions and commitments regarding EVM, iterative development approaches including minimum viable capabilities (MVCs), and DE. Excerpts from the APQ statement follow.
EVM
The earned value management system (EVMS) is used to assess the cost, schedule, and technical
performance of major capability acquisitions for proactive course correction. However, the Section 809
Panel reported that EVM does not measure product quality and concluded, “EVM has been required on
most large software programs but has not prevented cost, schedule, or performance issues.” In 2009
DoD reported to the committee that “a program could perform ahead of schedule and under cost
according to EVM metrics but deliver a capability that is unusable by the customer” and stated the
program manager should ensure that the EVM process measures the quality and technical maturity of
technical work products instead of just the quantity of work performed.
51. If confirmed, what steps would you take, if any, to require contractors to report valid measures of
cost, schedule, and technical performance for all acquisition pathways?
If confirmed, I will work across the Department and with the industrial base— current and emerging—to
validate, improve, or establish appropriate metrics across the acquisition pathways. … I plan to
continue open communications to ensure transparency and allow individual programs to continually
improve and tailor approaches to best meet the warfighter need.
52. If confirmed, what steps would you take, if any, to require contractors that employ the DOD DE
Strategy to maintain valid information in the digital authoritative data source that is sufficient for
program managers to make informed and timely decisions to manage cost, schedule, performance, and
risk?
If confirmed, I would seek to engage with our industry partners and Service representatives to better
understand how they are currently employing DE and how we can work in partnership to better
collaborate within and outside of the Department… A combination of strong data, tool and modeling
standards and environments, training of our Acquisition Corps, and proper contract and data rights
guidance are foundational to enabling successful adoption of DE to feed the right cost, schedule,
performance and risk data to our acquisition decision makers.
40. What is your opinion on the merits of DOD incorporating iterative development
approaches centered on fielding minimum viable capabilities?
Best practices in software development focus on rapidly fielding a minimum viable capability to get into
the hands of users to accelerate learning, capture feedback, and use the insights to shape requirements,
design, and strategies. … Iterative development can reduce cycle times and be more responsive to
changing technologies, operations, and threats. If confirmed, I would seek to promote the DoD’s use of
this leading industry practice.
41. To what extent do you believe DOD has broadly implemented commercial best practice
agile development approaches adequately for software and hardware systems?
… I also understand DoD has taken important steps such as issuing the new Software Acquisition Pathway
which is purpose-built to implement best commercial agile approaches and enable modern software
practices for both applications and embedded software. DoD is still in the early stages of effectively
implementing agile and modern software approaches with progress in software intensive systems that
can be leveraged for application to more of our hardware systems. If confirmed, software acquisition will
be a high priority.
The DE metrics should also be sufficient to demonstrate that past and pending DoD commitments to
Congress, regarding cost and schedule reporting, will be met. Examples follow.
• Provision in NDAA for FY 2022 Sec. 1650 Review of EMD Contract for Ground-Based Strategic
Deterrent Program (GBSD)
Congress is concerned with the implementation of DE as a best practice. The NDAA for FY 2022 includes
a provision that specifically addresses the implementation of DE; Sec. 1650, Review of EMD Contract for
Ground-Based Strategic Deterrent Program (GBSD). That provision requires a review of DE with concern
about the AF’s ability to implement DE best practices and to leverage DE. Excerpts follow.
My recommendations for improving the cost, schedule, and program management of the EMD phase and the effectiveness of DE are covered in Tables 1 and 3 below.
• Ensure that Integrated Test and Evaluation is integrated with Modeling and Simulation to assess
attainment of technical performance parameters and to confirm performance against documented
capability needs.
• Ensure that programs using the embedded software path align test and integration with the testing
and delivery schedules of the overarching system in which the software is embedded, including the
testing and delivery schedules of MVPs and MVCRs.
• 2009 DoD Report to Congress regarding EVM (paraphrased excerpts):
1 SE and cost and schedule performance should be integrated and not stove-piped.
2 The PM should ensure that the cost and schedule performance process measures the quality and
technical maturity of technical work products instead of just the quantity of work performed.
3 Cost and schedule performance reporting can be an effective program management tool only if it is
integrated with technical performance, if the …processes are augmented with a rigorous SE process,
and if the SE products are costed and included in cost and schedule performance tracking.
4 If good TPMs are not used, programs could report (schedule performance) as 100 percent
complete even though behind schedule in validating requirements, completing the preliminary
design, meeting the weight targets, or delivering software.
• 2014 Report to Congress on Performance Assessments and Root Cause Analyses (PARCA)
Finally, the PARCA EVM Division will identify, document, and publish specific methods for relating
technical performance to earned value performance. The goal is to provide more accurate joint,
program office, and contractor situational awareness of the program execution. PARCA believes that
earned value metrics and technical metrics such as TPMs should be consistent with program progress.
Earned Value focuses on the completion of a set of tasks to mature the design. It should be consistent
with the set of metrics that indicate the actual design maturity.
In 2018, the Section 809 Report of the Advisory Panel on Streamlining and Codifying Acquisition
Regulations (Sec. 809 Report) reiterated issues in the DoD reports to Congress. The Panel reported that
“another substantial shortcoming of EVM is that it does not measure product quality. A program could
perform ahead of schedule and under cost according to EVM metrics but deliver a capability that is
unusable by the customer…Traditional measurement using EVM provides less value to a program than
an Agile process in which the end user continuously verifies that the product meets the requirement.”
• 2022 GAO Report: Congressional Need for Performance Metrics (Cost and Schedule)
In February 2022, GAO released GAO-22-104687 DEFENSE ACQUISITIONS Additional Actions Needed
to Implement Proposed Improvements to Congressional Reporting. Per the report, “DOD has yet to
decide what information to include in acquisition reports to Congress, including performance metrics
for each Adaptive Acquisition Framework pathway … for example, the extent to which a program is
meeting its baseline cost and schedule estimates.”
In March 2022, GAO released GAO-22-104513 LEADING PRACTICES Agency Acquisition Policies Could
Better Implement Key Product Development Principles. GAO found that DOD policies only partially
implement a key sub-principle for product development, used by leading commercial companies, to
“Use Iterative Design and Testing to Identify a Minimum Marketable Product.”
GAO reviewed policies for provisions requiring development of a MVP or initial capability to be
improved by subsequent or evolving releases. “GAO found that DOD Directive 5000.01 implies
iterative design followed by successive updates, but there is no reference to a minimum product prior
to developing successive updates. By comparison, the software policy requires program officials to
“use an iterative, human-centered design process to define the MVP recognizing that an MVP’s
definition may evolve as user needs become better understood.” The software policy is limited to
software efforts using the software pathway and does not include hardware acquisitions or programs
using other pathways.
In January 2022, DOT&E assessed Block 4 software development on the F-35 program and discussed
the MVP. DOT&E stated:
“Although the program designed C2D2 around commercial “agile software” development concepts,
it does not adhere to the published best practices that include clear articulation of the capabilities
required in the MVP, focused testing, comprehensive characterization of the product, and full
delivery of the specified operational capabilities. The program did not deliver programmed
capabilities to operational units, as defined in the Air Systems Playbook.”
• Report to Accompany the SASC NDAA for FY 2023, sec. 801, Middle Tier Authority (MTA),
with regard to the test plan.
Modifications to MTA. Sec. 801:
The committee is concerned that the desire for speed in these programs could lead to the omission of key
elements of good program management. Therefore, the committee believes that MTA programs and
the associated stakeholders would benefit from a … test plan.
• 2022 SE Guidebook:
Finding: Existing policies and guidance do not support DOD oversight of non-software pathway weapon programs using agile. Without the use of outcome-based metrics and continually assessing the value of what was delivered against user needs, a program using Agile software development might deliver capabilities and features that are not essential to the customer and that could contribute to schedule and cost overruns.
1: Incorporate Agile principles into requirements policy and guidance for all programs using
Agile for software development. This should include a Capability Needs Statement and User
Agreement.
2: Incorporate oversight of Agile development of software into acquisition policy and guidance for
all programs using Agile. This should include use of metrics, including outcome-based metrics, and
continually assessing the value of capability delivered to support iterative software development.
• Provision in NDAA for FY 2021 SEC. 836. DIGITAL MODERNIZATION OF ANALYTICAL AND
DECISION-SUPPORT PROCESSES FOR MANAGING AND OVERSEEING DEPARTMENT OF DEFENSE
ACQUISITION PROGRAMS.
Excerpts:
• Iteratively develop and integrate advanced digital data management and analytics
capabilities, consistent with private sector best practices, that—
o integrate all aspects of the defense acquisition system, including …acquisition,
management,
o enable the use of such data to inform further development, acquisition, management and
oversight of such systems, including portfolio management; and
o include software capabilities to collect, transport, organize, manage, make available, and
analyze relevant data throughout the life cycle of defense acquisition programs, including any data
needed to support individual and portfolio management of acquisition programs.
• Supply data to DE models for use in the defense acquisition, sustainment, and portfolio
management processes;
• Move supporting processes and the data associated with such processes from analog to
digital format, including planning and reporting processes;
Excerpts follow:
• Programs should employ both automated (e.g., static code analysis scans) and manual (e.g.,
opportunities for developers to add technical debt items to the backlog and tag them as technical
debt when intentionally taking on debt or identify technical debt in design reviews) mechanisms for
identifying technical debt.
• Programs should track technical debt items on the backlog separate from other types of items,
such as vulnerabilities and defects.
• Programs should allocate appropriate effort during iteration capacity planning for resolving
technical debt items, and they must ensure that this effort is protected from the pressure to focus
on new capabilities.
• Program roadmaps should include the effort for managing technical debt to ensure that it is
planned and that effort is allocated to it over time.
Takeaway: Include technical debt in DoDI 5000.88, Engineering of Defense Systems and the Engineering
of Defense Systems Guidebook as shown in Table 3.
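As a concrete, illustrative reading of the technical-debt practices quoted above, the sketch below tags backlog items as technical debt, tracks them apart from features and defects, and protects a share of iteration capacity for paying debt down. The item fields, identifiers, and the 20 percent capacity figure are hypothetical assumptions, not from any DoD guide.

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    item_id: str
    title: str
    points: int
    item_type: str  # "feature", "defect", "vulnerability", or "tech_debt"

# Hypothetical backlog: technical-debt items are tagged so they can be
# tracked separately from features, defects, and vulnerabilities.
backlog = [
    BacklogItem("F-101", "Add sensor fusion mode", 8, "feature"),
    BacklogItem("TD-17", "Refactor message bus shim added for demo", 5, "tech_debt"),
    BacklogItem("D-33", "Fix altitude rounding defect", 3, "defect"),
    BacklogItem("TD-18", "Replace hard-coded interface stubs", 8, "tech_debt"),
]

def plan_iteration(items, capacity_points, debt_share=0.20):
    """Reserve a protected share of iteration capacity for technical-debt items."""
    debt_budget = int(capacity_points * debt_share)
    debt_items, other_items = [], []
    spent_debt = spent_other = 0
    for item in sorted(items, key=lambda i: i.points):
        if item.item_type == "tech_debt" and spent_debt + item.points <= debt_budget:
            debt_items.append(item)
            spent_debt += item.points
        elif item.item_type != "tech_debt" and spent_other + item.points <= capacity_points - debt_budget:
            other_items.append(item)
            spent_other += item.points
    return debt_items, other_items

debt, other = plan_iteration(backlog, capacity_points=30)
print("Protected technical-debt work:", [i.item_id for i in debt])
print("Other planned work:", [i.item_id for i in other])
```

The protected debt budget models the guidance that debt-reduction effort must be shielded from pressure to work only on new capabilities.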
• 2024 GAO Report, GAO-24-105503 Navy Shipbuilding Increased Use of Leading Design Practices Could
Improve Timeliness of Deliveries, May 2024
How programs measured their achievement of design maturity varied but typically reflected
percentages of design drawings or design-specific contract deliverables expected to be submitted at key
milestones before construction. Navy shipbuilders noted that using this type of metric does not
necessarily provide a clear understanding of overall design maturity. For example, the metrics may
overstate design completeness by giving builders credit for submitting design-related documentation
without fully accounting for the quality or completeness of associated design. Drawings that appear
complete could include design placeholders that lack necessary vendor-furnished information (VFI) for
key equipment and, consequently, mask design uncertainties and remaining design work. Further, Navy
officials noted cases where builders submitted blank design products, which met the submittal deadline
to the Navy but did not contribute to advancing design maturity.
• 2024 Report of the Defense Business Board (DBB) Business Transformation Advisory Subcommittee,
Creating a DoD Digital Ecosystem, DBB FY24-03.
The Subcommittee was tasked by the Deputy Secretary of Defense to evaluate the need for lifecycle
digitalization and to provide recommendations on creating a digital ecosystem with industry partners.
Defense Digital Transformation
The immediate and rapid development of a Defense Digital Ecosystem must become a top national
security priority if the United States is to maintain its military advantage over the pacing threat from
adversaries, including the People’s Republic of China, who are aggressively transforming their defense
production processes. In this rapidly evolving threat environment, the establishment of a Defense Digital
Ecosystem across weapon system development, acquisition, sustainment, and operations is essential to
ensuring the agility and ability to deliver disruptive capability to the warfighter “at the speed of relevancy.”
• DoD must establish new best practices that can be rapidly replicated in a broader transformation.
... recognition that digital transformation will impact a wide array of functions and processes,
including but not limited to engineering, tech infrastructure, contracting, sustainment and logistics,
budget, legal, and personnel.
• Ensure sustainment and performance data are connected via digital threads. Progressive efforts
must include expertise from all phases of the Acquisition process to account for interrelated
processes, data needs, and information flows.
• Digitalization is not merely turning analog processes into digital (i.e., making paper drawings into
digital artifacts), rather it is the breaking down of organizational, process, and production silos using
an open digital ecosystem and access to a common set of data.
• Changing DoD’s prevailing risk-averse culture and inefficient business processes is essential for the
success of any enterprise-level digital initiative.
Recap of Reports
The Sec. 809 Report’s assessment indicates that DoD’s EVM commitments to Congress in 2009 and 2014
have not been met. PARCA’s goal of accurate joint, program office, and contractor situational awareness of
the program execution is relevant to development programs, including those with no EVM requirements, but
that goal is unmet. There is a need to integrate DE with program management. For successful
implementation of the DE Strat and to meet DAS goals, additional guidance is needed to ensure that the PM
measures schedule and progress towards meeting the requirements of the technical baseline.
Recommendations
Recommendations are provided herein that define the PM’s information needs and the DE metrics that meet
those needs. Authoritative Sources of Truth (ASOT) for selecting DE metrics and recommended DE
artifacts/work products that may be used as base measures of DE metrics are included in Appendices A and
B.
The documents cited above can be improved to better define the information needs of PMs for effective
program technical planning and management, configuration and change management, and software
engineering.
The PM needs accurate schedule status and situational awareness of program execution for proactive
resolution of issues impacting cost, schedule, and technical achievement of program objectives. The
technical achievement criteria are defined in the technical baselines. The PM also needs situational
awareness of the degree of product quality as measured by functional completeness.
Finally, the exchange of schedule status information via model exchanges and automated transformations will eliminate the manual entry of estimated schedule performance, such as the percent of work complete used with EVM. The estimated percent of work complete, such as of drawings or code, may fail to indicate the true status of validating requirements, completing the preliminary design, meeting weight targets, or delivering software, and may fail to properly account for rework.
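To make this point concrete, the sketch below derives work-package status from counts of requirements that have actually reached a verified state in the authoritative source, rather than from a manually estimated percent complete. The requirement records, work-package identifiers, and state names are assumptions for illustration only.

```python
# Hypothetical export from a requirements model: each requirement carries the
# lifecycle state recorded in the authoritative source of truth (ASOT).
requirements = [
    {"id": "SYS-001", "work_package": "WP-110", "state": "verified"},
    {"id": "SYS-002", "work_package": "WP-110", "state": "designed"},
    {"id": "SYS-003", "work_package": "WP-110", "state": "verified"},
    {"id": "SYS-004", "work_package": "WP-120", "state": "allocated"},
]

# States that count as completed work for schedule-status purposes.
DONE_STATES = {"verified"}

def work_package_status(reqs):
    """Percent complete per work package, driven by requirement states, not estimates."""
    status = {}
    for r in reqs:
        wp = status.setdefault(r["work_package"], {"total": 0, "done": 0})
        wp["total"] += 1
        wp["done"] += r["state"] in DONE_STATES
    return {wp: round(100 * v["done"] / v["total"], 1) for wp, v in status.items()}

print(work_package_status(requirements))  # {'WP-110': 66.7, 'WP-120': 0.0}
```

Because the status comes from recorded requirement states rather than a subjective estimate, the same calculation also surfaces rework: a requirement that regresses from a verified state lowers the reported completion.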
Per GAO-24-105503 Navy Shipbuilding Increased Use of Leading Design Practices Could Improve Timeliness
of Deliveries, May 2024, several Navy shipbuilding programs set thresholds for the degree of design maturity
that reflected percentages of design drawings expected to be submitted at key milestones. However, Navy
shipbuilders noted that using this type of metric does not necessarily provide a clear understanding of overall
design maturity. For example, the metrics may overstate design completeness by giving builders credit for
submitting design-related documentation without fully accounting for the quality or completeness of
associated design. Drawings that appear complete could include design placeholders that lack necessary VFI
for key equipment and, consequently, mask design uncertainties and remaining design work.
Common DE Specifications and Standards for Model Exchanges and Automated Transformations
DoD recently established the new position of Chief Digital and Artificial Intelligence Officer (CDAO). The
CDAO should be responsible for addressing the DE Strategy statement that “DoD will need to encourage
commonality in terminology, develop a shared understanding of concepts, and ensure consistency and rigor
in implementing DE across engineering activities…by evaluating current policy, guidance,
specifications, and standards to determine what changes are necessary to implement DE.”
The evaluation should include providing specifications and standards for exchanging data between the engineering requirements management database (such as DOORS), the ASOT, and the program cost and schedule reports such as the Integrated Program Management Data and Analysis Report (IPMDAR). The IPMDAR’s components include the Contract Performance Dataset (CPD), which provides performance/execution data from the contractor’s existing management systems, and the Schedule (comprised of both the Native Schedule File and the Schedule Performance Dataset (SPD)), which provides data from the contractor’s Integrated Master Schedule.
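The sketch below shows, in rough form, how requirement verification status held in a requirements database export might be transformed into JSON task records keyed to a control account (CA) or work package (WP). The field names are illustrative assumptions, not the actual IPMDAR data item description schema or the DOORS export format.

```python
import json
from datetime import date

# Hypothetical rows pulled from a requirements management database export.
doors_rows = [
    {"req_id": "SS-041", "wp_id": "WP-110", "baseline_finish": "2024-09-30", "verified_on": "2024-09-12"},
    {"req_id": "SS-042", "wp_id": "WP-110", "baseline_finish": "2024-10-15", "verified_on": None},
]

def to_schedule_tasks(rows, status_date=date(2024, 9, 30)):
    """Map requirement verification status into schedule-performance task records."""
    tasks = []
    for row in rows:
        tasks.append({
            "task_id": f"VER-{row['req_id']}",
            "work_package_id": row["wp_id"],      # ties the task to the corresponding CA/WP in the cost report
            "baseline_finish": row["baseline_finish"],
            "actual_finish": row["verified_on"],
            "complete": row["verified_on"] is not None,
            "status_date": status_date.isoformat(),
        })
    return tasks

print(json.dumps(to_schedule_tasks(doors_rows), indent=2))
```

Keying each task record to a work package is what allows the schedule data and the cost data to be analyzed together, which is the intent of the integrated cost/schedule encoding described above.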
The Practical Software and Systems Measurement (PSM) DE Measurement Framework Version 1.1, published
by the DoD Digital Engineering Working Group (DEWG), provides guidance to use Model-Based Systems
Engineering (MBSE) practice to:
1. Fully integrate system data and models with engineering, program management, and other domains
and disciplines.
2. Collect data directly from DE modeling tools and record results in team tracking tools, such as the
schedule.
The schedule and technical performance data collected from DE modeling tools are recorded in the schedule without manual intervention, manipulation, or elimination, in contrast with earned value, thus preserving their truth and management value.
DoD Directive 5000.59 - DoD Modeling and Simulation Management should be revised to assign
responsibility to the CDAO for developing specifications and standards. Budget should also be requested to develop the specifications and standards.
Action Plan
It is recommended that the documents cited above be revised, as specified in Table 3. It is also recommended
that the DEWG develop and publish metrics specifications for DE and MBSE that support the information
needs of PMs. The metrics specifications should be used as digital ASOTs for three PM responsibilities.
1. Develop the time-phased schedule to complete the requirements definitions. It should reside in an automatically linked scheduling system.
2. Assess the schedule progress of defining and completing requirements. Schedule progress should also reside in an automatically linked scheduling system.
3. Use digital artifacts from the ASOT as base measures of DE metrics. These digital artifacts are authoritative evidence that SE work products have been completed, such as:
• Requirement definitions including approved technical performance measures (TPM), verification
methods, and completion criteria in the functional and allocated baselines.
• Trade studies
• Completed products in the product baseline including the MVP and MVCR baselines, if
applicable
• Test artifacts (e.g., test cases, plans, deficiencies, and results)
With MBSE, the record of authority shifts away from documents to the digital model. Digital modeling provides an analytical tool, a coverage metric, to evaluate the current state of the model. In addition to calculating statistics of how many requirements are covered by test cases (Verify relationship) or design elements (Satisfy relationship), every metric records a time stamp. Periodically calculating the same metric allows the user to monitor changes in a specific aspect of the model over time.
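A minimal sketch of the coverage metric just described follows: it counts requirements covered by Satisfy (design) and Verify (test case) relationships in a model export and stamps each calculation with the time it was taken, so repeated runs show how coverage changes. The relationship data and element names are hypothetical.

```python
from datetime import datetime, timezone

# Hypothetical model export: relationships between requirements and other model elements.
relationships = [
    {"source": "REQ-1", "target": "BLK-10", "type": "Satisfy"},
    {"source": "REQ-1", "target": "TC-7",  "type": "Verify"},
    {"source": "REQ-2", "target": "BLK-11", "type": "Satisfy"},
]
requirement_ids = ["REQ-1", "REQ-2", "REQ-3"]

def coverage_metric(req_ids, rels, rel_type):
    """Fraction of requirements covered by at least one relationship of rel_type, time-stamped."""
    covered = {r["source"] for r in rels if r["type"] == rel_type}
    return {
        "relationship": rel_type,
        "covered": sum(1 for rid in req_ids if rid in covered),
        "total": len(req_ids),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

print(coverage_metric(requirement_ids, relationships, "Satisfy"))  # 2 of 3 covered
print(coverage_metric(requirement_ids, relationships, "Verify"))   # 1 of 3 covered
```

Storing each time-stamped result lets the PM trend verification and design coverage period over period instead of relying on a single snapshot.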
The EVMS DFARS clause should be revoked. It is an impediment to achieving DBB’s objectives such as:
• Use digital threads to account for interrelated processes, data needs, and information flows
(regarding measuring schedule, technical and cost performance based on ASoTs).
• Break down organizational, process and production silos using an open digital ecosystem and access
to a common set of data.
• Overcome bureaucratic inertia and risk-averse culture…significant barriers to success (in holding program managers and contractors accountable for program failures).
• Change DoD’s prevailing inefficient business processes (for measuring cost, schedule, and technical performance and for providing early warning of pending failures) to enable the success of any enterprise-level digital initiative.
The pertinent DAS overarching policies and objectives are ASOTs for the purposes of the recommendations
herein. They are in Table 1.
Table 1. DAS Excerpts (by section)
1.2.a Deliver Performance at the Speed of Relevance.
The DAS will: (d) Conduct data driven analysis.
1.2.k Employ Performance-Based Acquisition Strategies
To maximize competition, innovation, and interoperability, acquisition managers will
consider and employ performance-based strategies for acquiring and sustaining
products and services. “Performance-based strategy” means a strategy that supports an
acquisition approach structured around the results to be achieved as opposed to the
manner by which the work is to be performed.
1.2.o Conduct Integrated Test and Evaluation (T&E)
(1) T&E will be integrated throughout the defense acquisition process. Test and
evaluation will be structured to provide essential information to decision makers, assess
attainment of technical performance parameters, and determine whether systems are
operationally effective, suitable, survivable, and safe for intended use.
(2) The conduct of T&E, integrated with M&S will:
(b) Assess technology maturity and interoperability.
(d) Confirm performance against documented capability needs and adversary
capabilities.
The recommended document modifications herein pertain to the following information categories and measurable concepts in PSM. See Table 2 and Appendix C.
The proposed metrics specifications and DE artifacts support the objectives of and are consistent with
documents that, in my opinion, are ASOT for DE. The documents follow.
• DoD Instruction (DoDI) 5000.80, Middle Tier of Acquisition
• DoD Instruction (DoDI) 5000.85, Major Capability Acquisition
• DoDI 5000.87, Software Acquisition
• DoDI 5000.88, Engineering of Defense Systems
• DoDI 5000.89, Test and Evaluation
• DoDI 5000.97 DIGITAL ENGINEERING (DE)
• DoD DE Strat
• DoD Software Modernization Strategy (SW Modernization)
• DoD OSD Best Practices for Using SE Standards (ISO/IEC/IEEE 15288, IEEE 15288.1, and IEEE 15288.2) on
Contracts for DOD Acquisition Programs (15288BP)
• SEI Blog Posts by Natalia Shevchenko: Requirements in MBSE, Feb. 22, 2021, and Benefits and Challenges of MBSE, July 2021
• DoD SE Plan Outline version 4 (SEP)
• DoD Risk, Issue, and Opportunity (RIO) Management Guide for Defense Acquisition Programs, 2023
• DOT&E
• DoD IMP/IMS
• Engineering of Defense Systems Guidebook
• GAO-20-590G GAO Agile Assessment Guide (GAO Agile)
• GAO Schedule Assessment Guide (GAO Schedule)
• Defense Business Board Business Transformation Advisory Subcommittee report, Creating a DoD Digital
Ecosystem
• NDIA Integrated Program Management Division, A Guide to Managing Programs Using Predictive
Measures, March 26, 2021 Rev. 3 (Predictive Measures).
• PSM DE measurement framework
• SE Guidebook
• International Council on SE (INCOSE) SE Leading Indicators Guide (SELI)
• Solomon, Paul. INCOSE International Symposium paper, “Using Earned Value to Track Requirement Progress” July
2006 (INCOSE Track)
• Systems Engineering Research Center (SERC), Task Order WRT-1001: Digital Engineering Metrics, Technical Report SERC-2020-TR-002 (SERC)
• Solomon, Paul. SEI Technical Note CMU/SEI-2002-TN-016, Oct. 2002 "Using CMMI® to Improve
EVM” (SEI-EVM)
Note: Despite its title, the technical note is applicable to any project, including projects that do not use EVM, because it focuses on the base measures of work unit progress.
• Solomon, Paul and Young, Ralph. Performance-Based Earned Value, IEEE Computer Society/John
Wiley and Sons, 2007. (PB-EV)
• 2018 DoD Defense Science Board (DSB) Report, Design and Acquisition of Software for Defense Systems (See Appendix F)
• 2019 NDIA SE Div. Input to DSB (See Appendix F)
• DoD Agile Metrics Guide: Strategy Considerations and Sample Metrics for Agile Development Solutions, Version 1.2, 11 November 2020 (Agile Metrics)
• PSM
Recommended revisions to DAS, DoDI 5000.80, DoDI 5000.87, DoDI 5000.88, DoDI 5000.89, DE Strat, SEP, and IPMDAR Guide are included in Table 3.
Table 3. Recommended Document Revisions

DoDI 5000.88, 3.4.b, Technical Baseline Management
Current text: "The PM will implement and describe in the SEP a technical baseline management process as a mechanism to manage technical maturity, to include a mission, concept, functional, allocated, and product baseline. If practicable, the PM will establish and manage the technical baseline as a digital ASOT."
Recommended revision: After "product baseline," add "including, if needed, MVP and MVCR baselines."

DoDI 5000.88, 3.4.c, Configuration and Change Management
Current text: "The LSE, under the direction of the PM, will implement a digital CM approach and automated tools to establish, control, and curate product attributes and technical baselines across the total system life-cycle. The CM approach will:
(1) Identify, document, audit, and control schedule, cost, functional, physical, and performance characteristics of the system design.
(2) Specifically, track any changes (e.g., a dynamic change log for in and out of scope changes, formal engineering change proposals) and provide an audit trail of program design decisions and design modifications.
(3) Provide for traceability of mission capability to system requirements to performance and execution metrics."
Recommended revision: Insert "technical" before "performance" in paragraphs (1) and (3); in paragraph (3), after "metrics," add "including DE metrics for schedule progress and quality."

DoDI 5000.88, 3.6, Specialty Engineering, 3.6.a(2)(a)6
Current text: "Metrics identification, tracking, and reporting to address software technical performance, development process, and quality."
Recommended revision: Insert "schedule progress," before "technical performance."

DoDI 5000.88, 3.6, Specialty Engineering, 3.6.a(2)(a)
Recommended revision: Insert a new item, "9. technical debt."

DoDI 5000.88, 3.6.a(2)(b)
Current text: "The program may automate collection of metrics as much as possible."
Recommended revision: After "metrics," insert ", including DE metrics for schedule progress and quality,".

DoDI 5000.89, 3.1.i
Current text: "As part of the DE strategy… tools...must provide authoritative sources of models, data, and test artifacts (e.g. test cases, plans, deficiencies, and results)."
Recommended revision: After "results," insert ", including DE metrics for schedule progress and quality,".

IMP/IMS Guide, 2.2.4, Software Acquisition
Current text: "Although an IMS typically would not include Level of Effort (LOE) activities, the program should schedule MVP and post MVCR sprints in the IMS. Programs should work closely with their software development team to ensure the IMP structure matches the structure of Agile elements. For example, features or capabilities from an Agile perspective often correlate to the Criteria level of a project’s IMP."
Recommended revision: After "in the IMS," insert "as IMP events." Also, delete "Although an IMS typically would not include Level of Effort (LOE) activities," which is irrelevant to embedded software.

DE Strat, 1.3
Current text: "Exchange of information between technical disciplines or organizations should take place via model exchanges and automated transformations."
Recommended revision: After "information," insert ", including DE metrics for schedule progress and quality,".

DE Strat, 2.3
Current text: "Use the digital ASOT as the technical baseline. Stakeholders should use the ASOT to make informed and timely decisions to manage cost, schedule, performance, and risk. For example, contract deliverables should be traced and validated from the ASOT."
Recommended revision: Insert "technical" before "performance"; after "contract deliverables," insert "that report schedule progress and product quality (functional completeness)".

IPMDAR Guide, 1.2
Current text: "IPMDAR consists of the following three components: … The IPMDAR requirement is comprised of three components: the Contract Performance Dataset (CPD), the Schedule (to include Native Schedule and Schedule Performance Dataset (SPD)),"
Recommended revision: Change "three" to "four" components in both places and add a fourth component: "and the DE artifacts that are created from the standards, rules, tools, and infrastructure within a DE ecosystem, including schedules."

IPMDAR Guide, 1.2.2, Schedule
Current text: "Schedule (Comprised of both the Native Schedule File and the Schedule Performance Dataset (SPD)). Provides data from the contractor’s Integrated Master Schedule (IMS). The Native Schedule submission is a direct export from the contractor’s scheduling tool. The SPD is a collection of JSON encoded data tables capturing the detailed task and schedule metrics, task relationships, and resource assignments tables. Since the CPD data report is now required at the CA or WP levels, the task definitions within the SPD must now be correctly encoded against the CA or WP data included in the corresponding CPD submission. This critical improvement enhances the ability to support integrated cost/schedule analysis."
Recommended revision: Add "For software that is embedded in weapon systems, the contractor’s IMS includes milestones and schedule performance from the DE artifacts that are created from the standards, rules, tools, and infrastructure within a DE ecosystem."

IPMDAR Guide, 3.4, Applying the IPMDAR DID When EVMS DFARS Clause is not Applicable
Current text: "The Government may apply the Schedule (comprised of both the Native Schedule File and/or the Schedule Performance Dataset (SPD)) deliverable of the IPMDAR DID when the DFARS 252.234-7002 EVM requirement is not on contract. The Schedule is applied to all development, major modification, and low rate initial production efforts."
Recommended revision: Add "or when the DFARS 252.234-7002 EVM requirement is not on the software that is embedded in a weapon system contract."
The NDIA Predictive Measures includes predictive indicators that can be used to develop and implement
effective mitigation plans. Excerpts from the Sections, Requirements Completion Metrics and Technical
Performance Measures (TPM), follow.
2. The expected count of requirements analyzed from the system level to be
eventually allocated to the system elements (configuration items).
• Requirements Planned - the time-phased profile count of total requirements fully articulated
given resource capability and capacity. This value might come from Control Account Plans
for completion of specifications.
• Requirements Completed – the count of completed requirements as determined from work
package level status reports or system requirements data base.
NDIA TPM
TPM involves predicting the future values of a key technical performance parameter of the higher level
end product under development based on current assessments of products lower in the system
structure. A good TPM has the element of traceability of the technical requirements to WBS to TPMs to
EVM Control Accounts. In the Control Account, a description of the TPM and its allowed range of values
for the Period of Performance of that Control Account should be defined.
The Systems Engineering Management Plan (SEMP) and the resulting SE architectural documents are used
to further define the TPMs and to set threshold values.
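As a simple illustration of the TPM practice quoted above, the sketch below checks a technical performance parameter against the allowed range defined for a control account's period of performance. The parameter (air vehicle weight), control account identifier, dates, and threshold values are all hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TpmAllocation:
    control_account: str
    parameter: str
    period_start: date
    period_end: date
    allowed_min: float
    allowed_max: float   # threshold not to be exceeded in this period

def assess_tpm(allocation, measured_value, as_of):
    """Report whether the measured TPM value is within the range allowed for the period."""
    if not (allocation.period_start <= as_of <= allocation.period_end):
        return "outside period of performance"
    in_range = allocation.allowed_min <= measured_value <= allocation.allowed_max
    return "within allowed range" if in_range else "TPM breach - corrective action needed"

weight_tpm = TpmAllocation("CA-2100", "air vehicle weight (kg)",
                           date(2024, 7, 1), date(2024, 12, 31), 0.0, 5200.0)
print(assess_tpm(weight_tpm, measured_value=5180.0, as_of=date(2024, 9, 30)))
print(assess_tpm(weight_tpm, measured_value=5315.0, as_of=date(2024, 9, 30)))
```

Tying the allowed range to a specific control account and period is what makes the TPM usable as a schedule and cost indicator rather than a standalone engineering measurement.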
Digital Artifacts
Typical artifacts that should be the base measures of schedule performance are outputs from the measurement and verification processes in OSD Best Practices for Using SE Standards (ISO (International Organization for Standardization)/IEC (International Electrotechnical Commission)/IEEE (Institute of Electrical and Electronics Engineers) 15288, IEEE 15288.1, and IEEE 15288.2) on Contracts for DOD Acquisition Programs (15288BP), GAO Agile, PB-EV, and CMMI® for Development, Version 1.3 (CMMI-DEV, V1.3).
These outputs are ASOTs for PMs. When DE is employed, the digital versions of these artifacts should be
automatically transferred from the engineering to the program management organizations.
Per SE Guidebook, “software development activities should employ automation across all aspects of the
software factory and project management components to eliminate tedious, manual steps to the
maximum degree practicable, enabling higher velocity, consistency, and overall better-quality software
components.”
Typical DE artifacts are included in Appendices A and B. The primary source of the artifacts in PB-EV is the
technical note, SEI-EVM, published in 2002. In 2010, SEI published information regarding Agile methods in
CMMI-DEV, V1.3. Excerpts from CMMI-DEV, V1.3, including the processes, Requirements Development,
Configuration Management, and Quantitative Project Management, are in Appendix E.
Appendix A ASOT for Selecting DE Metrics and Typical DE Artifacts
SELI 1. Requirements Validation Trends
2. Requirements Verification Trends
3. Technical Measurement Trends
INCOSE Track
Requirements management status:
• Defined
• Validated
• Verification method determined
• Approved
• Allocated
• Traced to verification document (test procedure)
• Designed
• Implemented
• Tested
• Verified
With MBSE, the record of authority shifts away from the documents to the
digital model.
SW Modernization
3 Unifying Principles
Resilient software must be defined first by execution stability, quality, and
dependable cyber-survivability. These attributes can be achieved at speed by
aggressively adopting modern software development practices that effectively
integrate performance and security throughout the software development
lifecycle.
More Than Code - Software modernization is more than just code development.
It includes the many policies, processes, and standards that take a concept from
idea to reality. Considerations such as contracting and intellectual property
rights, as well as transition from development to fielding, are often overlooked
and underappreciated. These policies, processes, and standards must not hinder,
but empower the vision of this strategy.
SEP Introduction:
• The SEP should include a digital ecosystem implementation plan that
addresses the DE Strat goals and defines six key digital engineering
ecosystem attributes … Applied elements of these attributes
(requirements, models, digital artifacts, …) will be evident in the
planning of the digital ecosystem implementation that results in the
(ASoT) for the program
• The SEP will describe a data management approach consistent with the
DoD DE Strat. The approach should support maximizing the technical
coherency of data as it is shared across engineering disciplines …
Additional approaches to data management should at a minimum
describe:
o Digital artifact generation for reporting and distribution purposes
SEP 2.1 Requirements Development
Program should trace all requirements from the highest level (JCIDS or
equivalent requirements sources) to the lowest level (e.g., component
specification or user story). This traceability should be captured and
maintained in digital requirements management tools or within model(s).
The system Requirements Traceability Matrix (RTM) should be a model
output that can be embedded in or attached to the SEP, or the SEP should
contain a tool reference location. …The matrix should include the
verification method for each of the identified requirements and an indication
whether each requirement is expected to change over the life of the
program.
SEP 2.3 Specialty Engineering (SpEng)
PSM DE measurement framework
2. MAJOR CONCEPTS
Because DE processes help to define the capabilities of the eventual system, DE measures can serve as useful leading indicators for other product related measures.
CYCLE TIME
The elapsed time from when development work is started until the time
development work has been completed and is ready for deployment. This
time includes activities such as planning, requirements analysis, design,
implementation, and testing.
IMP/IMS 2.4 Digital Engineering Guidance
Project schedules are digital models and should be integrated with other
digital models of the project to support the project’s DE effort.
SE Guidebook
2.2.4 Software Engineering
Properly planned software engineering processes can mitigate cost and
schedule risks by allowing DoD programs to identify and remove software-
related technical debt early in development. This early action can increase
acquisition efficiency and lead to higher success rates during operational
testing and during operations and sustainment.
SE Guidebook
Schedule Management
Include metrics to assess both schedule health,…associated completeness of the
Work Breakdown Structure and the risk register. A healthy, complete and risk-
enabled schedule forms the technical basis for the EVMS. Strong schedule metrics
are paramount for accurate EVMS data.
Software Quality
Metrics should address software technical performance and quality (e.g., defects,
rework) evaluating the software’s ability to meet user needs
SE Role in Contracting
To adopt commercial best practices and advances, Program Management Offices
(PMOs) should use the DoDI 5000.87 for software acquisition
Incentive fees and penalties such as award fee may be tied to program performance
…evaluated during technical reviews,
PB-EV Maintain bi-directional traceability of product and product component
requirements among the project plans, work packages, planning
packages, and work products. Requirements traceability is a necessary
activity of mapping customer needs to the system requirements and
tracking how the system requirements are met throughout the development
process—in the design, to system component development, through testing
and system documentation, including for validation, verification, as well as
to the project plans, and work products. CMMI® requires bi-directional
traceability, that is, that evidence of an association between a requirement
and its source requirement, its implementation, and its verification is
established from the source requirement to its lower-level requirements,
and from the lower-level requirements back to their source. A requirements
traceability matrix is used to track the requirements.
DoDI 5000.87 (4) …define the MVP recognizing that an MVP’s definition may evolve as user needs
become better understood. Insights from MVPs help shape scope, requirements,
and design.
(11) Each program will develop and track a set of metrics to assess and manage the
performance, (schedule) progress, speed, cybersecurity, and quality of the software
development, its development teams, and ability to meet users’ needs. Metrics
collection will leverage automated tools to the maximum extent practicable. The
program will continue to update its cost estimates and cost and software data
reporting from the planning phase throughout the execution phase.
Agile Metrics 5.1.1 Story Points
5.1.7 Release Burnup Charts
… measure the amount of work completed for a given release based on the total
amount of work planned for the release. Usually, story points are used as the unit
of measure to show planned and completed work.
Additional Context
Conceptually, release burnup could be measured using requirements or user
stories as the unit of measure as well. From the user perspective, understanding
how many requirements are completed and how many remain might be a better
way of communicating progress than story points. Additionally, like burndown
charts, burnup charts can be applied to other scopes of work beyond releases (e.g.,
sprint burnup and product burnup).
Variations
• The number of requirements completed provides insight to users on
requirements completed and requirements remaining.
• The number of user stories completed is similar in concept to the metric showing
the number of requirements completed.
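To close Appendix A, a brief sketch follows of a release burnup computed with requirements, rather than story points, as the unit of measure, as the variation quoted above suggests. The sprint names, counts, and planned total are made up for illustration.

```python
# Hypothetical per-sprint counts of requirements completed toward a release.
planned_requirements = 40
completed_per_sprint = {"Sprint 1": 6, "Sprint 2": 9, "Sprint 3": 7}

def release_burnup(planned_total, completed_by_sprint):
    """Cumulative requirements completed versus total planned for the release."""
    cumulative, rows = 0, []
    for sprint, done in completed_by_sprint.items():
        cumulative += done
        rows.append((sprint, cumulative, planned_total, round(100 * cumulative / planned_total)))
    return rows

for sprint, done, planned, pct in release_burnup(planned_requirements, completed_per_sprint):
    print(f"{sprint}: {done}/{planned} requirements complete ({pct}%)")
```

Reporting burnup in requirements completed, rather than story points, communicates progress in terms the user and the PM already understand: how much of the needed capability has actually been delivered.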
Appendix B PB-EV Typical SE/DE work products/artifacts
Measurement and Analysis: Specifications of base and derived measures
Decision Analysis and Resolution: Results of evaluating alternate solutions
Appendix C PSM DE measurement framework Artifacts
Appendix D
Excerpts from DOD INSTRUCTION 5000.97 DIGITAL ENGINEERING, December 21, 2023
Glossary:
DE: An integrated digital approach that uses authoritative sources of systems' data and models as a continuum across
disciplines to support lifecycle activities from concept through disposal.
DE Ecosystem: The interconnected infrastructure, environment, and methodology (process, methods, and tools) used to store,
access, analyze, and visualize evolving systems' data and models to address the needs of the stakeholders.
1.2. POLICY.
a. The DoD will conduct a comprehensive engineering program for defense systems, pursuant to DoD Instruction (DoDI)
5000.88. In support of that effort, the DoD will use DE methodologies, technologies, and practices across the life cycle of
defense acquisition programs,… engineering, and management activities.
b. DoDI 5000.88: certain programs must include a DE implementation plan in the SE plan.
Appendix E Excerpts from CMMI-DEV, V1.3
Requirements Development
Configuration Management
Quantitative Project Management
Appendix F page 1 of 2
Excerpts from 2019 NDIA SE Div. Input to the 2018 DoD Defense Science Board (DSB) Report, Design and Acquisition of Software for Defense Systems, and from the DSB Report
DSB Excerpts:
Background
NDIA Excerpts:
NDIA, in collaboration with the International Council on SE (INCOSE) and PSM has volunteered to provide input to
USD(A&S) and USD(R&E) representing the “industry perspective” on implementation of the DSB recommendations.
While the DSB report focuses primarily on SOFTWARE design and acquisition using continuous and iterative methods,
NDIA believes that the scope must be expanded to focus on SYSTEM design and acquisition using continuous and
iterative methods.
Steering at lower levels is integrated with roadmap updates and MVP/Next Viable Product (NVP) planning.
• Contracts defined by MVP: Contracting approach includes mechanisms for flexibly defining and approving MVP/NVP
capabilities.
Appendix F page 2 of 2
NDIA Excerpts continued:
Appendix G, page 1 of 2
2006 INCOSE International Symposium paper, “Using Earned Value to Track Requirement Progress,” by
Paul Solomon, July 2006
Copyright © 2006 by Paul Solomon. Published and used by INCOSE with permission.
Note: A PDF of this paper may be downloaded from www.pb-ev.com, at the “White Papers” tab.
Excerpts:
It is necessary to track the status of each requirement as it moves through engineering life cycle activities.
Measures that reflect the status of the requirements are essential to monitor program status and serve as
a scorecard to indicate that requirements are being implemented on schedule. This paper provides
guidance to use the tools of requirements traceability to plan and measure the progress of the
requirements management activities. The requirements traceability matrix (RTM) can be used as a
scheduling source and as a set of base measures of Earned Value (EV). Finally, the importance and value of
comparing the schedule variances of the requirements management and tracing activities with the
variances of other project activities is discussed.
Progress.
It is important to quantify the progression of requirements from concept to formulation to design to test.
Peter Baxter discusses assessing these requirements to ensure that your product contains all required
functionality. Baxter’s advice addresses software requirements but is also applicable to the system
requirements: It is advisable to measure the number of requirements that each software process generates
or accepts. Measure the number of system or top-level software requirements (i.e. features or
capabilities), as well as the decomposition of system requirements into more detailed requirements. In
order to track differences between developed and planned requirements, it is necessary to also measure
the status of each requirement as it moves through life cycle activities. A typical requirement status could
be: defined, approved, allocated, designed, implemented, tested, and verified. A measure that shows the
status of all requirements is essential in monitoring program status and acts as a scorecard to illustrate
that requirements are being implemented. Early in the program schedule, ensure that requirements
become defined, approved, and allocated as the system architecture is finalized. Near the end of the
program schedule, you should see requirements move from implemented status, to tested, then to verified
status (Baxter 2002). Measuring the status of each requirement as it moves through life cycle activities is
an essential control tool for effective project management.
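Assuming a small, hypothetical set of requirements, the sketch below produces the kind of status scorecard Baxter describes: a count of requirements in each lifecycle state, which can be compared period over period to confirm that requirements are progressing toward verified status on schedule. The requirement identifiers and statuses are invented for illustration.

```python
from collections import Counter

# Ordered lifecycle states a requirement moves through (a subset of the statuses listed below).
LIFECYCLE = ["defined", "approved", "allocated", "designed", "implemented", "tested", "verified"]

# Hypothetical snapshot of requirement statuses at one reporting period.
snapshot = {
    "SYS-001": "verified",
    "SYS-002": "tested",
    "SYS-003": "designed",
    "SYS-004": "allocated",
    "SYS-005": "approved",
}

def scorecard(statuses):
    """Count of requirements at each lifecycle state, in lifecycle order."""
    counts = Counter(statuses.values())
    return {state: counts.get(state, 0) for state in LIFECYCLE}

print(scorecard(snapshot))
# {'defined': 0, 'approved': 1, 'allocated': 1, 'designed': 1, 'implemented': 0, 'tested': 1, 'verified': 1}
```

Early in the program the counts should concentrate in the defined, approved, and allocated states; near the end they should shift toward tested and verified, mirroring the progression described above.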
Recommended Requirements Statuses
To recap, a recommended set of requirements management statuses is:
• Defined
• Validated
• Verification method determined
• Approved
• Allocated
• Traced to verification document (test procedure)
• Designed
• Implemented
• Tested
• Verified
Appendix G, page 2 of 2
When determining which project activities and work products should be discretely scheduled and tracked,
PMs regard the RTM as a tool, not as a work product. They propose that populating the RTM with data is
a support activity to the real work products of engineering development (designs, test articles, test results
etc.). They also argue that the actual completion of many of activities listed above, as well as the associated
documents, is the responsibility of other engineers, not the requirements management engineers. They
then point to those who are actually doing the designing or testing or making related decisions.
Consequently, the requirements engineers conclude that, if the allocated requirements have not been
implemented into the design on schedule, or the test procedure does not yet include all necessary test
cases, or the verification of requirements is behind schedule, it’s not their fault. Therefore, they propose,
their activities should be measured as LOE. It is recommended that, regardless of accountability, the
progress of requirements, as they progress through the engineering life cycle, should be scheduled and
measured against a plan. Of course, discrete earned value techniques should be used for management
control. Even though the budget for the requirements engineers may be relatively small, as compared with
the budgets for all other engineers, the earned value taken in control accounts or work packages for
requirements management activities can be the most important indicator of project schedule
performance. The schedule status of the set of requirements reveals more about the health of the project
than any other schedule performance indicator in the Performance Measurement Baseline (PMB).
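As a worked illustration of that point, the sketch below computes schedule variance (SV = EV − PV) and schedule performance index (SPI = EV / PV) for a hypothetical requirements-management control account alongside other accounts, so the PM can see whether the requirements work is the lagging indicator. The account names and budget values are invented.

```python
# Hypothetical control accounts: planned value (PV) and earned value (EV) to date.
control_accounts = {
    "Requirements management": {"PV": 120.0, "EV": 84.0},
    "Preliminary design":      {"PV": 400.0, "EV": 396.0},
    "Software build 1":        {"PV": 250.0, "EV": 242.0},
}

def schedule_metrics(accounts):
    """Schedule variance and schedule performance index per control account."""
    return {
        name: {"SV": vals["EV"] - vals["PV"], "SPI": round(vals["EV"] / vals["PV"], 2)}
        for name, vals in accounts.items()
    }

for name, m in schedule_metrics(control_accounts).items():
    print(f"{name}: SV = {m['SV']:+.1f}, SPI = {m['SPI']}")
```

In this made-up example, the requirements-management account shows an SPI of 0.70 while the larger accounts appear nearly on plan; per the argument above, that small account is the early warning that deserves the PM's attention first.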
Conclusions
If the requirements management and traceability activities are behind schedule, it is an early warning that
the rest of the project is or will be in trouble. We recommend that a PM look at the progress and schedule
variance of these activities early in any review. The requirements management and traceability activities
should be discretely planned and measured. If these activities are realistically planned, they provide a valid
basis for outcome-based metrics (published as “Performance-Based EV”) and give the PM insight into
progress of the total program.
Appendix H PSM Excerpts
Many of the measurable benefits of DE are associated with the use of both data and validated digital models as a
“source of truth” across life cycle activities.
Page 3
Thus, DE has three interrelated concerns: the transformation of engineering activities to fully digital infrastructure,
artifacts, and processes; the use of authoritative sources of data and models to improve the efficiency and
productivity of engineering practice; and the use of MBSE practice to fully integrate system data and models with
engineering, program management, and other domains and disciplines.
Page 9
DE measures can serve as useful leading indicators for other product related measures. DE can produce additional
products in support of delivered data, hardware, and software products such as digital twins or other model- or
simulation-based executable systems.
Page 54
In a DE environment products are model-driven, providing additional opportunities to cost-effectively incorporate
changes to digital models that are directly traceable to the implemented and tested work products, some of which
can be automatically generated.
Page 59
Model-based work products such as requirements, architecture, design, use cases and other views or modeling
artifacts can be automatically generated and published directly from modeling tools, at significant savings in effort
relative to traditional documentation-centric approaches. Model-driven automation based on an Authoritative
Source of Truth (ASoT) can lead to process efficiencies, labor reductions, shorter cycle times, less rework, and
earlier verification and validation of solutions.