
Integrating the Embedded Software Path, Model-Based Systems Engineering, MOSA, and

Digital Engineering with Program Management July 15, 2024

Paul Solomon

Note: This revision adds a best practice from the GAO Schedule Assessment Guide and recommendations
from the Defense Business Board Business Transformation Advisory Subcommittee report, Creating a DoD
Digital Ecosystem. This revision also includes justification to eliminate the DFARS EVMS clause, which is a
barrier to the digital ecosystem’s data needs and information flows regarding true schedule, technical, and
cost performance.

DoDD 5000.01, The Defense Acquisition System (DAS), includes policies to speed up delivery of products that
work as planned, e.g., products that meet the documented capability needs. However, several DoD
instructions and guides should be revised to better enable achievement of DAS objectives. Revisions will
benefit program managers (PMs) of programs with the following characteristics:
1. Use the embedded software path to develop software embedded in weapon systems.
2. Employ digital engineering (DE) metrics.
3. Employ model-based systems engineering (MBSE).

To speed up delivery of products that work, PMs need timely and accurate schedule status and situational
awareness of program execution for proactive resolution of issues impacting cost, schedule, and technical
achievement of program objectives. PMs also need situational awareness of the degree of product quality as
measured by functional completeness.

Per the DoD DE Strategy (DE Strat), expected benefits of DE include better informed decision-
making/greater insight through enhanced transparency and increased efficiency in acquisition practices.
This evolution will require engaging contracting and legal teams to streamline business and contracting
practices.

Per DODI 5000.97 DIGITAL ENGINEERING (DE), December 21, 2023:

DoD will use DE methodologies, technologies, and practices across the life cycle of defense acquisition
programs… engineering, and management activities.
b. As specified in DoDI 5000.88, certain programs must include a DE implementation plan in the SE plan.
2.7. DOD COMPONENT HEADS WITH ACQUISITION AUTHORITY.
(2) Provide guidance and support for program managers (PMs) to develop, validate, and maintain:
(a) Credible and coherent authoritative sources of truth (ASOT) shared with stakeholders.
(b) Digital models that accurately reflect the architecture, attributes, and behaviors of the system they
represent.

Pertinent excerpts from DODI 5000.97 are in Appendix D.

Information Needs of Program Managers

However, the current set of instructions and guides focuses on engineering, not program management, and is
insufficient to enable rapid decisions based on better-informed insight into the base measures of schedule
and progress. To enhance transparency, the following documents should be revised to address a PM's
information needs for authoritative DE metrics of schedule, progress, quality, technical debt, and technical
performance:

1. DE Strat
2. DAS
3. DoD Instruction 5000.87 Operation of the Software Acquisition Pathway (5000.87)
4. DoD Instruction 5000.88, Engineering of Defense Systems (5000.88)
5. DoD Instruction 5000.89, Test and Evaluation (5000.89)
6. DoD Directive 5000.59 - DoD Modeling and Simulation (M&S) Management
7. DoD Systems Engineering Guidebook (SE Guidebook)
8. DoD SE Plan Outline version 4 (SEP)
9. DoD Integrated Master Plan (IMP) and Integrated Master Schedule (IMS) Preparation and
Use Guide (IMP/IMS)
10. DoD Integrated Program Management Data and Analysis Report Implementation & Tailoring Guide
(IPMDAR Guide)
11. DOD MIL-HDBK-245E, DOD Handbook, Preparation of Statement of Work (SOW Handbook).

The metrics are needed to inform the PM:

1. If the definitions of the technical baselines (functional, allocated, product) and, if applicable, the Minimum
Viable Product (MVP) and Minimum Viable Capability Release (MVCR), will be completed on schedule.
2. If the needed capabilities, features, and functions will be delivered on schedule.
3. If the software engineering processes mitigate cost and schedule risks by identifying and removing
software-related technical debt early in development (SE Guidebook).
4. If technical performance is being assessed at all levels: component, subsystem, integrated product,
and external interfaces.
5. If the intermediate goals for tracking technical performance measures (TPM) are achieved on
schedule.
6. If the Modular Open Systems Approach (MOSA) interfaces between modules, which are defined by
widely supported standards, are achieved on schedule.

Information Needs of Asst. Sec. of the AF (AT&L)

Mr. Andrew Hunter is Assistant Secretary of the Air Force for Acquisition, Technology and Logistics. In his
response to Senate Armed Services Committee (SASC) Advance Policy Questions (APQ) as nominee for that
post, on Oct. 5, 2021, he stated that, if confirmed:

I would also work closely with the Program Executive Officers to ensure all acquisition programs are on
track to meet cost, schedule, and performance criteria, and take appropriate actions where needed
when this is not the case.

I will perform active and close oversight of the B-21 program…to ensure the B-21 program cost,
schedule, and performance stays on track.

I will review the Presidential Aircraft Replacement program in detail…to ensure the program is,
and remains, on track to meet cost, schedule, and performance criteria.

I will work with the acquisition workforce leadership to continue emphasizing the pivot to DE and modern
software development by leveraging commercial practices and standards.

In his response, he also stated that “I believe that digital acquisition practices such as DE, open systems
architecture, and agile software development are best practices in these areas...If confirmed, I will
ensure the acquisition community is closely engaged with operators in pursuing technology and continues
to employ best practices as we develop capability to meet evolving threats.”

Information Needs of USD(A&S)

On March 22, 2022, the Hon. William LaPlante appeared before the SASC as nominee for Under Secretary of
Defense for Acquisition and Sustainment. In his response to APQs, he stated his positions and commitments
regarding EVM, iterative development approaches including MVCs, and DE. Excerpts from the APQ
statement follow.

EVM

The earned value management system (EVMS) is used to assess the cost, schedule, and technical
performance of major capability acquisitions for proactive course correction. However, the Section 809
Panel reported that EVM does not measure product quality and concluded, “EVM has been required on
most large software programs but has not prevented cost, schedule, or performance issues.” In 2009
DoD reported to the committee that “a program could perform ahead of schedule and under cost
according to EVM metrics but deliver a capability that is unusable by the customer” and stated the
program manager should ensure that the EVM process measures the quality and technical maturity of
technical work products instead of just the quantity of work performed.

51. If confirmed, what steps would you take, if any, to require contractors to report valid measures of
cost, schedule, and technical performance for all acquisition pathways?

If confirmed, I will work across the Department and with the industrial base— current and emerging—to
validate, improve, or establish appropriate metrics across the acquisition pathways. … I plan to
continue open communications to ensure transparency and allow individual programs to continually
improve and tailor approaches to best meet the warfighter need.

52. If confirmed, what steps would you take, if any, to require contractors that employ the DOD DE
Strategy to maintain valid information in the digital authoritative data source that is sufficient for
program managers to make informed and timely decisions to manage cost, schedule, performance, and
risk?

If confirmed, I would seek to engage with our industry partners and Service representatives to better
understand how they are currently employing DE and how we can work in partnership to better
collaborate within and outside of the Department… A combination of strong data, tool and modeling
standards and environments, training of our Acquisition Corps, and proper contract and data rights
guidance are foundational to enabling successful adoption of DE to feed the right cost, schedule,
performance and risk data to our acquisition decision makers.

Iterative Development Approaches

40. What is your opinion on the merits of DOD incorporating iterative development approaches centered on
fielding minimum viable capabilities?

Best practices in software development focus on rapidly fielding a minimum viable capability to get into
the hands of users to accelerate learning, capture feedback, and use the insights to shape requirements,
design, and strategies. … Iterative development can reduce cycle times and be more responsive to
changing technologies, operations, and threats. If confirmed, I would seek to promote the DoD’s use of
this leading industry practice.

41. To what extent do you believe DOD has broadly implemented commercial best practice
agile development approaches adequately for software and hardware systems?

… I also understand DoD has taken important steps such as issuing the new Software Acquisition Pathway
which is purpose-built to implement best commercial agile approaches and enable modern software
practices for both applications and embedded software. DoD is still in the early stages of effectively
implementing agile and modern software approaches with progress in software intensive systems that
can be leveraged for application to more of our hardware systems. If confirmed, software acquisition will
be a high priority.

Information Also Needed for Congressional Oversight

The DE metrics should also be sufficient to demonstrate that past and pending DoD commitments to
Congress, regarding cost and schedule reporting, will be met. Examples follow.

• Provision in NDAA for FY 2022 Sec. 1650 Review of EMD Contract for Ground-Based Strategic
Deterrent Program (GBSD)

Congress is concerned with the implementation of DE as a best practice. The NDAA for FY 2022 includes
a provision that specifically addresses the implementation of DE; Sec. 1650, Review of EMD Contract for
Ground-Based Strategic Deterrent Program (GBSD). That provision requires a review of DE with concern
about the AF’s ability to implement DE best practices and to leverage DE. Excerpts follow.

Excerpts of NDAA provision:

The Sec. of the AF shall conduct a review…include the following:


1. An analysis of the ability of the AF to implement industry best practices regarding DE during
the EMD phase.
2. An assessment of the opportunities offered by the adoption by the AF of DE processes and of
the challenges the AF faces in implementing such industry best practices.
3. A review of the ability of the AF to leverage DE during such EMD phase.
4. Recommendations to improve the cost, schedule, and program management of the EMD
phase.

My recommendations for improving the cost, schedule, and program management of the EMD phase and
the effectiveness of DE are covered in Tables 1 and 3 below.

• Ensure that Integrated Test and Evaluation is integrated with Modeling and Simulation to assess
attainment of technical performance parameters and to confirm performance against documented
capability needs.
• Ensure that programs using the embedded software path align test and integration with the testing
and delivery schedules of the overarching system in which the software is embedded, including the
testing and delivery schedules of MVPs and MVCRs.

• 2009 DoD Report to Congress Required by WSARA


DoD has unfinished acquisition reform tasks to satisfy its commitments in a 2009 report to Congress, DoD
EVM: Performance, Oversight & Governance Report. The report, which was required by WSARA, applies to EVM
but is relevant to major acquisitions for which reporting of cost and schedule performance is required even if
there is no requirement to comply with EIA-748. For easier reading, “EVM” was replaced by “cost and
schedule performance” in the following excerpts from the report.

1 SE and cost and schedule performance should be integrated and not stove-piped.

2 The PM should ensure that the cost and schedule performance process measures the quality and
technical maturity of technical work products instead of just the quantity of work performed.

3 Cost and schedule performance reporting can be an effective program management tool only if it is
integrated with technical performance, if the …processes are augmented with a rigorous SE process,
and if the SE products are costed and included in cost and schedule performance tracking.

4 If good TPMs are not used, programs could report (schedule performance) as 100 percent
complete even though behind schedule in validating requirements, completing the preliminary
design, meeting the weight targets, or delivering software.

• 2014 Report to Congress on Performance Assessments and Root Cause Analyses (PARCA)

Finally, the PARCA EVM Division will identify, document, and publish specific methods for relating
technical performance to earned value performance. The goal is to provide more accurate joint,
program office, and contractor situational awareness of the program execution. PARCA believes that
earned value metrics and technical metrics such as TPMs should be consistent with program progress.
Earned Value focuses on the completion of a set of tasks to mature the design. It should be consistent
with the set of metrics that indicate the actual design maturity.

• 2018 Section 809 Report

In 2018, the Section 809 Report of the Advisory Panel on Streamlining and Codifying Acquisition
Regulations (Sec. 809 Report) reiterated issues in the DoD reports to Congress. The Panel reported that
“another substantial shortcoming of EVM is that it does not measure product quality. A program could
perform ahead of schedule and under cost according to EVM metrics but deliver a capability that is
unusable by the customer…Traditional measurement using EVM provides less value to a program than
an Agile process in which the end user continuously verifies that the product meets the requirement.”

• 2022 GAO Report: Congressional Need for Performance Metrics (Cost and Schedule)

In February 2022, GAO released GAO-22-104687 DEFENSE ACQUISITIONS Additional Actions Needed
to Implement Proposed Improvements to Congressional Reporting. Per the report, “DOD has yet to
decide what information to include in acquisition reports to Congress, including performance metrics
for each Adaptive Acquisition Framework pathway … for example, the extent to which a program is
meeting its baseline cost and schedule estimates.”

• 2022 GAO Report: Leading Practices

In March 2022, GAO released GAO-22-104513 LEADING PRACTICES Agency Acquisition Policies Could
Better Implement Key Product Development Principles. GAO found that DOD policies only partially
implement a key sub-principle for product development, used by leading commercial companies, to
“Use Iterative Design and Testing to Identify a Minimum Marketable Product.”

GAO reviewed policies for provisions requiring development of a MVP or initial capability to be
improved by subsequent or evolving releases. “GAO found that DOD Directive 5000.01 implies
iterative design followed by successive updates, but there is no reference to a minimum product prior
to developing successive updates. By comparison, the software policy requires program officials to
“use an iterative, human-centered design process to define the MVP recognizing that an MVP’s
definition may evolve as user needs become better understood.” The software policy is limited to
software efforts using the software pathway and does not include hardware acquisitions or programs
using other pathways.

• 2022 DOT&E Report: DOT&E FY 2021 Annual Report, MVP (DOT&E)

In January 2022, DOT&E assessed Block 4 software development on the F-35 program and discussed
the MVP. DOT&E stated:

“Although the program designed C2D2 around commercial “agile software” development concepts,
it does not adhere to the published best practices that include clear articulation of the capabilities
required in the MVP, focused testing, comprehensive characterization of the product, and full
delivery of the specified operational capabilities. The program did not deliver programmed
capabilities to operational units, as defined in the Air Systems Playbook.”

• Report to Accompany the SASC NDAA for FY 2023, sec. 801, Middle Tier Authority (MTA),
with regard to the test plan.
Modifications to MTA. Sec. 801:
The committee is concerned that the desire for speed in these programs could lead to the omission of key
elements of good program management. Therefore, the committee believes that MTA programs and
the associated stakeholders would benefit from a … test plan.

• 2022 SE Guidebook:

2.2.4 Software Engineering


To adopt commercial best practices and advances, Program Management Offices should use DoDI 5000.87 for
software acquisition.

• 2023 GAO Report: DEFENSE SOFTWARE ACQUISITIONS: Changes to Requirements, Oversight, and Tools
Needed for Weapon Programs, GAO-23-105867, July 2023

Finding: Existing policies and guidance do not support DOD oversight of non-software pathway
weapon programs using agile. Without the use of outcome-based metrics and continually assessing
the value of what was delivered against user needs, a program using Agile software development
might deliver capabilities and features that are not essential to the customer and that could
contribute to schedule and cost overruns.

Recommendations to Sec. Def:

1: Incorporate Agile principles into requirements policy and guidance for all programs using
Agile for software development. This should include a Capability Needs Statement and User
Agreement.

2: Incorporate oversight of Agile development of software into acquisition policy and guidance for
all programs using Agile. This should include use of metrics, including outcome-based metrics, and
continually assessing the value of capability delivered to support iterative software development.

3: Establish an overarching plan—which identifies associated resources—to enable the adoption of
modern engineering tools, across all programs. This should include (1) mission engineering, (2) SE,
and (3) software engineering.

• Provision in NDAA for FY 2021 SEC. 836. DIGITAL MODERNIZATION OF ANALYTICAL AND
DECISION-SUPPORT PROCESSES FOR MANAGING AND OVERSEEING DEPARTMENT OF DEFENSE
ACQUISITION PROGRAMS.
Excerpts:
• Iteratively develop and integrate advanced digital data management and analytics
capabilities, consistent with private sector best practices, that—
o integrate all aspects of the defense acquisition system, including …acquisition,
management,
o enable the use of such data to inform further development, acquisition, management and
oversight of such systems, including portfolio management; and
o include software capabilities to collect, transport, organize, manage, make available, and
analyze relevant data throughout the life cycle of defense acquisition programs, including any data
needed to support individual and portfolio management of acquisition programs.
• Supply data to DE models for use in the defense acquisition, sustainment, and portfolio
management processes;
• Move supporting processes and the data associated with such processes from analog to
digital format, including planning and reporting processes;

• Carnegie Mellon University Software Engineering Institute (SEI), SEI-2023-TR-003, Report to the
Congressional Defense Committees on National Defense Authorization Act (NDAA) for Fiscal Year 2022
Section 835 Independent Study on Technical Debt in Software-Intensive Systems, November 2023

Excerpts follow:
• Programs should employ both automated (e.g., static code analysis scans) and manual (e.g.,
opportunities for developers to add technical debt items to the backlog and tag them as technical
debt when intentionally taking on debt or identify technical debt in design reviews) mechanisms for
identifying technical debt.
• Programs should track technical debt items on the backlog separate from other types of items,
such as vulnerabilities and defects.
• Programs should allocate appropriate effort during iteration capacity planning for resolving
technical debt items, and they must ensure that this effort is protected from the pressure to focus
on new capabilities.
• Program roadmaps should include the effort for managing technical debt to ensure that it is
planned and that effort is allocated to it over time.

Takeaway: Include technical debt in DoDI 5000.88, Engineering of Defense Systems and the Engineering
of Defense Systems Guidebook as shown in Table 3.

• 2024 DoD PPBE Implementation Plan


The Plan includes “Operationalize understanding of best practices within private sector.” Guidance to
adopt commercial best practices and advances for software acquisition is in the DoD SE Guidebook.
• 2018 DoD Defense Science Board (DSB) Report Design and Acquisition of Software for Defense
Systems (See Appendix F)
• 2019 NDIA SE Div. Input to DSB (See Appendix F)
• 2006 INCOSE International Symposium paper, “Using Earned Value to Track Requirement Progress”
July 2006 (INCOSE Track) (See Appendix G)
• 2024 GAO Report NAVY FRIGATE Unstable Design Has Stalled Construction and Compromised Delivery
Schedules GAO-24-106546, May 2024
“While the Navy tracks design progress, its process to calculate design stability hinges largely on the
quantity—rather than the quality—of completed design documents. The focus on quantity obscures
functional design progress and how much design work remains.”

• 2024 GAO Report, GAO-24-105503 Navy Shipbuilding Increased Use of Leading Design Practices Could
Improve Timeliness of Deliveries, May 2024
How programs measured their achievement of design maturity varied but typically reflected
percentages of design drawings or design-specific contract deliverables expected to be submitted at key
milestones before construction. Navy shipbuilders noted that using this type of metric does not
necessarily provide a clear understanding of overall design maturity. For example, the metrics may
overstate design completeness by giving builders credit for submitting design-related documentation
without fully accounting for the quality or completeness of associated design. Drawings that appear
complete could include design placeholders that lack necessary vendor-furnished information (VFI) for
key equipment and, consequently, mask design uncertainties and remaining design work. Further, Navy
officials noted cases where builders submitted blank design products, which met the submittal deadline
to the Navy but did not contribute to advancing design maturity.

• 2024 REPORT OF THE COMMITTEE ON ARMED SERVICES, HOUSE OF REPRESENTATIVES, ON H.R. 8070,
REPORT 118–529, May 31, 2024
DoD Technical Debt
The committee recognizes that technical debt is a known challenge for the agile acquisition of both
software intensive systems and networking hardware infrastructure. … The committee recognizes that
addressing technical debt in software is only part of the equation, and technical debt in hardware must
also be addressed to be able to effectively use software and new applications like artificial intelligence.
Therefore, the committee encourages the Chief Information Officer of the Department of Defense, the
Director of the Defense Information Systems Agency, and the Chief Information Officer of each military
service to prioritize the reduction of technical debt in software-intensive systems and hardware systems
upon which software-intensive systems operate.

• 2024 Report of the Defense Business Board (DBB) Business Transformation Advisory Subcommittee,
Creating a DoD Digital Ecosystem, DBB FY24-03.
The Subcommittee was tasked by the Deputy Secretary of Defense to evaluate the need for lifecycle
digitalization and to provide recommendations on creating a digital ecosystem with industry partners.
Defense Digital Transformation
The immediate and rapid development of a Defense Digital Ecosystem must become a top national
security priority if the United States is to maintain its military advantage over the pacing threat from
adversaries, including the People’s Republic of China, who are aggressively transforming their defense
production processes. In this rapidly evolving threat environment, the establishment of a Defense Digital
Ecosystem across weapon system development, acquisition, sustainment, and operations is essential to
ensuring the agility and ability to deliver disruptive capability to the warfighter “at the speed of relevancy.”

• DoD must establish new best practices that can be rapidly replicated in a broader transformation.
... recognition that digital transformation will impact a wide array of functions and processes,
including but not limited to engineering, tech infrastructure, contracting, sustainment and logistics,
budget, legal, and personnel.

• Ensure sustainment and performance data are connected via digital threads. Progressive efforts
must include expertise from all phases of the Acquisition process to account for interrelated
processes, data needs, and information flows.

• Digitalization is not merely turning analog processes into digital (i.e., making paper drawings into
digital artifacts), rather it is the breaking down of organizational, process, and production silos using
an open digital ecosystem and access to a common set of data.

• A combination of longstanding bureaucratic inertia; a culture known to be highly risk-averse;
workforce gaps; and resource availability present significant barriers to success.

• Changing DoD’s prevailing risk-averse culture and inefficient business processes is essential for the
success of any enterprise-level digital initiative.

Recap of Reports

The Sec. 809 Report’s assessment indicates that DoD’s EVM commitments to Congress in 2009 and 2014
have not been met. PARCA’s goal of accurate joint, program office, and contractor situational awareness of
the program execution is relevant to development programs, including those with no EVM requirements, but
that goal is unmet. There is a need to integrate DE with program management. For successful
implementation of the DE Strat and to meet DAS goals, additional guidance is needed to ensure that the PM
measures schedule and progress towards meeting the requirements of the technical baseline.

Recommendations

Recommendations are provided herein that define the PM’s information needs and the DE metrics that meet
those needs. Authoritative Sources of Truth (ASOT) for selecting DE metrics and recommended DE
artifacts/work products that may be used as base measures of DE metrics are included in Appendices A and
B.

The pertinent overarching DAS policies and objectives are:


1. Deliver Performance at the Speed of Relevance using data driven analysis.
2. Employ Performance-Based Acquisition Strategies that are structured around the results to be
achieved as opposed to the manner by which the work is to be performed.
3. Conduct Integrated Test and Evaluation (T&E), integrated with modeling and simulation (M&S), to assess attainment
of technical performance parameters and to confirm performance against documented
capability needs.

The documents cited above can be improved to better define the information needs of PMs for effective
program technical planning and management, configuration and change management, and software
engineering.

The PM needs accurate schedule status and situational awareness of program execution for proactive
resolution of issues impacting cost, schedule, and technical achievement of program objectives. The
technical achievement criteria are defined in the technical baselines. The PM also needs situational
awareness of the degree of product quality as measured by functional completeness.

Finally, the exchange of schedule status information via model exchanges and automated transformations will
eliminate the manual entry of estimated schedule performance such as the percent of work complete used
with EVM. The estimated percent of work complete, such as drawings or code, may fail to be an indicator of
the true status of validating requirements, completing the preliminary design, meeting the weight targets,
or delivering software and may fail to properly account for rework.

Per GAO-24-105503 Navy Shipbuilding Increased Use of Leading Design Practices Could Improve Timeliness
of Deliveries, May 2024, several Navy shipbuilding programs set thresholds for the degree of design maturity
that reflected percentages of design drawings expected to be submitted at key milestones. However, Navy
shipbuilders noted that using this type of metric does not necessarily provide a clear understanding of overall
design maturity. For example, the metrics may overstate design completeness by giving builders credit for
submitting design-related documentation without fully accounting for the quality or completeness of
associated design. Drawings that appear complete could include design placeholders that lack necessary VFI
for key equipment and, consequently, mask design uncertainties and remaining design work.

Common DE Specifications and Standards for Model Exchanges and Automated Transformations

DoD recently established the new position of Chief Digital and Artificial Intelligence Officer (CDAO). The
CDAO should be responsible for addressing the DE Strategy statement that “DoD will need to encourage
commonality in terminology, develop a shared understanding of concepts, and ensure consistency and rigor
in implementing DE across engineering activities…by evaluating current policy, guidance,
specifications, and standards to determine what changes are necessary to implement DE.”

The evaluation should include providing specifications and standards for exchanging data between the
engineering requirements management database (such as DOORS), the ASOT, and the program cost and
schedule reports such as the Integrated Program Management Data and Analysis Report (IPMDAR). The
IPMDAR's components include the Contract Performance Dataset (CPD), which provides
performance/execution data from the contractor's existing management systems, and the Schedule
(comprised of both the Native Schedule File and the Schedule Performance Dataset (SPD)), which provides
data from the contractor's Integrated Master Schedule.
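
To illustrate the kind of exchange specification needed, the short Python sketch below packages one
requirement's verification status for transfer from the requirements database to the schedule datasets. The
field names and the function are illustrative assumptions made for this paper, not an excerpt from the
IPMDAR DID or any published standard.

# Illustrative sketch only: hypothetical field names, not the IPMDAR DID schema.
import json
from datetime import date

def requirement_status_record(req_id, wbs_element, task_id, verified, as_of):
    """Package one requirement's verification status for exchange with the
    scheduling tool (e.g., for alignment with a Schedule Performance Dataset task)."""
    return {
        "requirement_id": req_id,      # key in the requirements database (e.g., a DOORS object ID)
        "wbs_element": wbs_element,    # links the requirement to a control account or work package
        "schedule_task_id": task_id,   # task in the IMS / Schedule Performance Dataset
        "verified": verified,          # True only when the verification method has been satisfied
        "as_of": as_of.isoformat(),    # status date for time-phased reporting
    }

record = requirement_status_record("SYS-REQ-0042", "WBS 1.2.3", "TASK-0981", True, date(2024, 7, 15))
print(json.dumps(record, indent=2))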

The Practical Software and Systems Measurement (PSM) DE Measurement Framework Version 1.1, published
by the DoD Digital Engineering Working Group (DEWG), provides guidance to use Model-Based Systems
Engineering (MBSE) practice to:
1. Fully integrate system data and models with engineering, program management, and other domains
and disciplines.
2. Collect data directly from DE modeling tools and record results in team tracking tools, such as the
schedule.

Pertinent excerpts from PSM are in Appendix H.

The schedule and technical performance data collected from DE modeling tools are recorded in the schedule
without manual intervention, manipulation, or elimination, as compared with earned value, thus preserving
their truth and management value.
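
A minimal sketch of that automated flow follows, assuming a hypothetical export structure from the modeling
tool; it is not the interface of any actual DE modeling or scheduling tool. Schedule status for a task is derived
directly from the verification status of the requirements linked to it, so no manually estimated percent
complete is entered.

# Sketch of automated status flow from a DE model export to the schedule.
# The export format is an assumption, not a real tool API.

def task_percent_complete(model_export, task_id):
    """Derive schedule status for one task from the requirements linked to it,
    counting only requirements whose verification evidence exists in the model."""
    reqs = [r for r in model_export["requirements"] if r["schedule_task_id"] == task_id]
    if not reqs:
        return 0.0
    verified = sum(1 for r in reqs if r["verified"])
    return 100.0 * verified / len(reqs)

model_export = {
    "requirements": [
        {"id": "SYS-REQ-0042", "schedule_task_id": "TASK-0981", "verified": True},
        {"id": "SYS-REQ-0043", "schedule_task_id": "TASK-0981", "verified": False},
    ]
}
print(task_percent_complete(model_export, "TASK-0981"))  # 50.0, passed to the scheduling tool unchanged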

DoD Directive 5000.59 - DoD Modeling and Simulation Management should be revised to assign
responsibility to the CDAO for developing specifications and standards. Of course, budget should be
requested to develop the specifications and standards.

Action Plan

It is recommended that the documents cited above be revised, as specified in Table 3. It is also recommended
that the DEWG develop and publish metrics specifications for DE and MBSE that support the information
needs of PMs. The metrics specifications should be used as digital ASOTs for three PM responsibilities.

1. Develop the time-phased schedule to complete the requirements definitions. It should reside in an
automatically linked scheduling system.
2. Assess the schedule progress of defining and completing requirements. Schedule progress
should also reside in an automatically linked scheduling system.
3. Use digital artifacts from the ASOT as base measures of DE metrics. These digital artifacts are the
ASOT evidence that SE work products are completed, such as:
• Requirement definitions including approved technical performance measures (TPM), verification
methods, and completion criteria in the functional and allocated baselines.
• Trade studies
• Completed products in the product baseline including the MVP and MVCR baselines, if
applicable
• Test artifacts (e.g., test cases, plans, deficiencies, and results)

With MBSE, the record of authority shifts away from the documents to the digital model. Digital modeling
provides an analytical tool, a coverage metric, to evaluate a current state of the model. In addition to
calculating statistics of how many requirements are covered by test cases (Verify relationship) or design
elements (Satisfy relationship), every metric records a time stamp. Periodically calculating the same metric
allows the user to monitor changes of a specific aspect of the model in time.
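
A minimal sketch of such a coverage metric follows, assuming a simple in-memory list of requirements and of
Verify/Satisfy relationships; the data structure is illustrative, not a particular MBSE tool's interface.

# Coverage metric sketch: what fraction of requirements have Verify (test case)
# or Satisfy (design element) relationships, stamped with the calculation time.
from datetime import datetime, timezone

def coverage_metric(requirements, relationships, relation):
    """Fraction of requirements that are the target of at least one link of the
    given kind ("verify" or "satisfy"), with a time stamp for trend tracking."""
    covered = {target for (kind, _source, target) in relationships if kind == relation}
    fraction = len(covered & set(requirements)) / len(requirements)
    return {"relation": relation, "coverage": fraction,
            "timestamp": datetime.now(timezone.utc).isoformat()}

requirements = ["R1", "R2", "R3", "R4"]
relationships = [("verify", "TC-01", "R1"), ("verify", "TC-02", "R2"), ("satisfy", "Blk-A", "R1")]
print(coverage_metric(requirements, relationships, "verify"))   # coverage 0.5
print(coverage_metric(requirements, relationships, "satisfy"))  # coverage 0.25

Recalculating the metric at each status date, and keeping the time stamps, yields the trend described above.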

The EVMS DFARS clause should be revoked. It is an impediment to achieving DBB’s objectives such as:

• Use digital threads to account for interrelated processes, data needs, and information flows
(regarding measuring schedule, technical and cost performance based on ASoTs).
• Break down organizational, process and production silos using an open digital ecosystem and access
to a common set of data.
• Overcome bureaucratic inertia and risk-averse culture…significant barriers to success (in holding
program managers and contractors accountable for program failures).
• Change DoD's prevailing inefficient business processes (for measuring cost, schedule, and
technical performance and for providing early warning of pending failures), which is essential for the
success of any enterprise-level digital initiative.

The pertinent DAS overarching policies and objectives are ASOTs for the purposes of the recommendations
herein. They are in Table 1.

Table 1 ASOT for DE Metrics Specifications

DAS Section and Excerpts
1.2.a Deliver Performance at the Speed of Relevance.
The DAS will: (d) Conduct data driven analysis.
1.2.k Employ Performance-Based Acquisition Strategies
To maximize competition, innovation, and interoperability, acquisition managers will
consider and employ performance-based strategies for acquiring and sustaining
products and services. “Performance-based strategy” means a strategy that supports an
acquisition approach structured around the results to be achieved as opposed to the
manner by which the work is to be performed.
1.2.o Conduct Integrated Test and Evaluation (T&E)
(1) T&E will be integrated throughout the defense acquisition process. Test and
evaluation will be structured to provide essential information to decision makers, assess
attainment of technical performance parameters, and determine whether systems are
operationally effective, suitable, survivable, and safe for intended use.
(2) The conduct of T&E, integrated with M&S will:
(b) Assess technology maturity and interoperability.
(d) Confirm performance against documented capability needs and adversary
capabilities.

The recommended document modifications herein pertain to the following information categories and
measurable concepts in PSM. See Table 2 and Appendix C.

Table 2 PSM Information Categories and Measurable Concepts

Information Category | Measurable Concept
Schedule and Progress | Work Unit Progress, Deployment Lead Time (a)
Product Quality | Functional Completeness (Traceability)

(a) Deployment Lead Time is a measure of how rapidly authorized requests for system capabilities and work
products can be engineered, developed, and delivered for use in their intended operational environment.
The proposed metrics specifications and DE artifacts support the objectives of and are consistent with
documents that, in my opinion, are ASOT for DE. The documents follow.
• DoD Instruction (DoDI) 5000.80, Middle Tier of Acquisition
• DoD Instruction (DoDI) 5000.85, Major Capability Acquisition
• DoDI 5000.87, Software Acquisition
• DoDI 5000.88, Engineering of Defense Systems
• DoDI 5000.89, Test and Evaluation
• DoDI 5000.97 DIGITAL ENGINEERING (DE)
• DoD DE Strat
• DoD Software Modernization Strategy (SW Modernization)
• DoD OSD Best Practices for Using SE Standards (ISO/IEC/IEEE 15288, IEEE 15288.1, and IEEE 15288.2) on
Contracts for DOD Acquisition Programs (15288BP)
• SEI Blog Posts by Natalia Shevchenko: Requirements in MBSE, Feb. 22, 2021, and Benefits and Challenges of
MBSE, July 2021
• DoD SE Plan Outline version 4 (SEP)
• DoD Risk, Issue, and Opportunity (RIO) Management Guide for Defense Acquisition Programs, 2023
• DOT&E
• DoD IMP/IMS
• Engineering of Defense Systems Guidebook
• GAO-20-590G GAO Agile Assessment Guide (GAO Agile)
• GAO Schedule Assessment Guide (GAO Schedule)
• Defense Business Board Business Transformation Advisory Subcommittee report, Creating a DoD Digital
Ecosystem
• NDIA Integrated Program Management Division, A Guide to Managing Programs Using Predictive
Measures, March 26, 2021 Rev. 3 (Predictive Measures).
• PSM DE measurement framework
• SE Guidebook
• International Council on Systems Engineering (INCOSE) SE Leading Indicators Guide (SELI)
• Solomon, Paul. INCOSE International Symposium paper, “Using Earned Value to Track Requirement Progress” July
2006 (INCOSE Track)
• Systems Engineering Research Center (SERC) Task Order WRT-1001: Digital Engineering Metrics, Technical
Report SERC-2020-TR-002 (SERC)
• Solomon, Paul. SEI Technical Note CMU/SEI-2002-TN-016, Oct. 2002 "Using CMMI® to Improve
EVM” (SEI-EVM)
Note: Despite its title, this technical note is applicable to any project, including projects that do not use EVM.
SEI focuses on the base measures of work unit progress.
• Solomon, Paul and Young, Ralph. Performance-Based Earned Value, IEEE Computer Society/John
Wiley and Sons, 2007. (PB-EV)
• 2018 DoD Defense Science Board (DSB) Report Design and Acquisition of Software for Defense Systems
(See Appendix F)
• 2019 NDIA SE Div. Input to DSB (See Appendix F)
• DoD Agile Metrics Guide Strategy Considerations and Sample Metrics for Agile Development Solutions Version
1.2, 11 November 2020 (Agile Metrics)
• PSM

Recommended revisions to DAS, DoDI 5000.80, DoDI 5000.87, DoDI 5000.88, DoDI 5000.89, DE Strat, the
IMP/IMS Guide, the SEP, the SE Guidebook, the SOW Handbook, and the IPMDAR Guide are included in Table 3.

Table 3 Recommended Revisions to Authoritative Sources of Truth
for Embedded Software and DE Metrics Specifications

DAS (DoDD 5000.01)
Excerpt: g. Employ a Disciplined Approach. (2) Program goals for cost, schedule, and performance parameters
(or alternative quantitative management controls) will describe the program over its life cycle. Approved
program baseline parameters will serve as control objectives.
Revision: At "performance," insert "technical objectives, including the product baseline and, if appropriate,
the MVP and MVCR baselines."

DoDI 5000.80
Excerpt: f. CAEs will ensure that MTA program names and budget reporting clearly and discretely indicate the
scope of the effort being conducted under the MTA pathway, especially when the MTA program is a
subprogram of a larger program or is a program spiral, increment, or block upgrade. USD(A&S) will maintain
the authoritative list of MTA programs for the Department.
Revision: After "Department," add "Scope includes functional, allocated, and product baseline. (See DoDI
5000.88)"

DoDI 5000.87
Excerpt: 3.2 f. Test Strategy. (1) The test strategy defines the streamlined processes by which capabilities,
features, user stories, use cases, etc., will be tested and evaluated to satisfy developmental test and
evaluation criteria and to demonstrate operational effectiveness, suitability, interoperability, and
survivability, including cyber survivability for operational test and evaluation. The strategy will:
(f) Programs using the embedded software path will align test and integration with the testing and delivery
schedules of the overarching system in which the software is embedded, including aligning resources and
criteria for transitioning from development to test and operational environments.
Revision: After "embedded," insert "including the testing and delivery schedules of MVPs and MVCRs."

DoDI 5000.87
Excerpt: 3b(11) Each program will develop and track a set of metrics to assess and manage the performance,
progress, speed, cybersecurity, and quality of the software development, its development teams, and ability
to meet users' needs. Metrics collection will leverage automated tools to the maximum extent practicable.
The program will continue to update its cost estimates and cost and software data reporting from the
planning phase throughout the execution phase.
Revision: Insert "technical" before "performance." After "collection," add ", including collection of DE metrics
of schedule progress towards the MVP and MVCR."


DoDI 5000.88
Excerpt: 3.4 b. Technical Baseline Management. The PM will implement and describe in the SEP a technical
baseline management process as a mechanism to manage technical maturity, to include a mission, concept,
functional, allocated, and product baseline. If practicable, the PM will establish and manage the technical
baseline as a digital ASOT.
Revision: After "product baseline," add "including, if needed, MVP and MVCR baselines."


DoDI 5000.88
Excerpt: 3.4. PROGRAM TECHNICAL PLANNING AND MANAGEMENT. a. SEP. (3) For MDAPs, ACAT II, and ACAT
III programs, the SEP will contain these elements, unless waived by the SEP approval authority:
Revision: Add "(u) DE metrics of schedule progress will be ASOT for tracking and reporting metrics for
technical performance, schedule progress, and quality."

DoDI 5000.88
Excerpt: 3.4. PROGRAM TECHNICAL PLANNING AND MANAGEMENT. a. SEP. (3) For MDAPs, ACAT II, and ACAT
III programs, the SEP will contain these elements, unless waived by the SEP approval authority: (b) The
engineering management approach to include technical baseline management; requirements traceability;
CM; risk, issue, and opportunity management; and technical trades and evaluation criteria.
Revision: After "traceability," insert "including automated traceability to completion criteria in the schedule,"

DoDI 5000.88
Excerpt: 3.4. PROGRAM TECHNICAL PLANNING AND MANAGEMENT. a. SEP. (3) For MDAPs, ACAT II, and ACAT
III programs, the SEP will contain these elements, unless waived by the SEP approval authority: (c) The
software development approach to include architecture design considerations; software unique risks;
software obsolescence; inclusion of software in technical reviews; identification, tracking, and reporting of
metrics for software technical performance, process, progress, and quality; software system safety and
security considerations; and software development resources.
Revision: "progress," should be "schedule progress,"

DoDI 5000.88
Excerpt: 3.4. PROGRAM TECHNICAL PLANNING AND MANAGEMENT. a. SEP. (3) For MDAPs, ACAT II, and ACAT
III programs, the SEP will contain these elements, unless waived by the SEP approval authority: (r) The MOSA
and program interdependencies with other programs and components, to include standardized interfaces
and schedule dependencies.
Revision: At "interfaces and schedule dependencies," delete "and" and add ", schedule dependencies, and
collection of DE metrics of schedule progress towards developing and verifying the MOSA interdependencies
and standardized interfaces."


DoDI 5000.88
Excerpt: 3.4.c. Configuration and Change Management. The LSE, under the direction of the PM, will implement
a digital CM approach and automated tools to establish, control, and curate product attributes and technical
baselines across the total system life-cycle. The CM approach will: (1) Identify, document, audit, and control
schedule, cost, functional, physical, and performance characteristics of the system design. (2) Specifically,
track any changes (e.g., a dynamic change log for in and out of scope changes, formal engineering change
proposals) and provide an audit trail of program design decisions and design modifications. (3) Provide for
traceability of mission capability to system requirements to performance and execution metrics.
Revision: Insert "technical" before each "performance." After "metrics," add "including DE metrics for
schedule progress and quality."

DoDI 5000.88
Excerpt: 3.6 Specialty Engineering. 3.6.a(2)(a)6. Metrics identification, tracking, and reporting to address
software technical performance, development process, and quality.
Revision: After "technical performance," insert "schedule progress,"

DoDI 5000.88
Excerpt: 3.6 Specialty Engineering. 3.6.a(2)(a)
Revision: Insert a new item, "9 technical debt"


DoDI 5000.88
Excerpt: 3.6.a(2)(b) The program may automate collection of metrics as much as possible.
Revision: After "metrics," insert ", including DE metrics for schedule progress and quality,"

DoDI 5000.89
Excerpt: 3.1.i As part of the DE strategy… tools...must provide authoritative sources of models, data, and test
artifacts (e.g. test cases, plans, deficiencies, and results)
Revision: After "results," insert ", including DE metrics for schedule progress and quality,"

IMP/IMS Guide
Excerpt: 2.2.4 Software Acquisition. Although an IMS typically would not include Level of Effort (LOE) activities,
the program should schedule MVP and post MVCR sprints in the IMS. Programs should work closely with their
software development team to ensure the IMP structure matches the structure of Agile elements. For
example, features or capabilities from an Agile perspective often correlate to the Criteria level of a project's
IMP.
Revision: After "in the IMS," insert "as IMP events." Also, delete "Although an IMS typically would not include
Level of Effort (LOE) activities"; it is irrelevant to embedded software.

DE Strat
Excerpt: 1.3 Exchange of information between technical disciplines or organizations should take place via
model exchanges and automated transformations.
Revision: After "information," insert ", including DE metrics for schedule progress and quality,"

DE Strat
Excerpt: 2.3 Use the digital ASOT as the technical baseline. Stakeholders should use the ASOT to make
informed and timely decisions to manage cost, schedule, performance, and risk. For example, contract
deliverables should be traced and validated from the ASOT.
Revision: Insert "technical" before "performance." After "deliverables," insert "that report schedule progress
and product quality (functional completeness)"

IPMDAR Guide
Excerpt: 1.2. IPMDAR consists of the following three components: … The IPMDAR requirement is comprised of
three components: the Contract Performance Dataset (CPD), the Schedule (to include Native Schedule and
Schedule Performance Dataset (SPD),
Revision: Change "three components" to "four components" in both sentences, and add "and the DE artifacts
that are created from the standards, rules, tools, and infrastructure within a DE ecosystem, including
schedules."

IPMDAR Guide
Excerpt: 1.2.2 Schedule (Comprised of both the Native Schedule File and the Schedule Performance Dataset
(SPD)). Provides data from the contractor's Integrated Master Schedule (IMS). The Native Schedule submission
is a direct export from the contractor's scheduling tool. The SPD is a collection of JSON encoded data tables
capturing the detailed task and schedule metrics, task relationships, and resource assignments tables. Since
the CPD data report is now required at the CA or WP levels, the task definitions within the SPD must now be
correctly encoded against the CA or WP data included in the corresponding CPD submission. This critical
improvement enhances the ability to support integrated cost/schedule analysis.
Revision: Add "For software that is embedded in weapon systems, the contractor's IMS includes milestones
and schedule performance from the DE artifacts that are created from the standards, rules, tools, and
infrastructure within a DE ecosystem."


IPMDAR Guide
Excerpt: 1.3 IPMDAR Outline. 1.3.2 Data reported shall reflect the output of the contractor's Earned Value
Management System (EVMS)
Revision: Add "and the DE artifacts that are created from the standards, rules, tools, and infrastructure within
a DE ecosystem."


IPMDAR Guide
Excerpt: 3.4. Applying the IPMDAR DID When EVMS DFARS Clause is not Applicable. The Government may
apply the Schedule (comprised of both the Native Schedule File and/or the Schedule Performance Dataset
(SPD)) deliverable of the IPMDAR DID when the DFARS 234.252-7002 EVM requirement is not on contract. The
Schedule is applied to all development, major modification, and low rate initial production efforts.
Revision: Add "or when the DFARS 234.252-7002 EVM requirement is not on the software that is embedded
in a weapon systems contract."


SEP
Excerpt: 3.2.2 TPMs. A set of TPMs covering a broad range of core categories, rationale for tracking,
intermediate goals, and the plan to achieve them with as-of dates.
Revision: After "categories," insert (from the RIO Guide): "at all levels including component, subsystem,
integrated product, external interfaces."

SEP
Excerpt: 3.2.2 TPMs. (2) empirically forecast the impact on program cost, schedule, and performance
Revision: Insert "technical" before "performance."

SEP
Excerpt: 3.2.2 Expectation. Program should use measures
Revision: Insert "technical" before "measures."

SEP
Excerpt: 3.2.9 Config. and Change Management. Technical Baseline Artifacts – …At a minimum, describe the
artifacts of the concept, functional, allocated, and product baselines and when each technical baseline has
been or will be established and verified. If practicable, the PM will establish and manage the technical
baseline as a digital authoritative source of truth. (See SE Guidebook (forthcoming) Configuration
Management Process, for additional guidance)
Revision: After "verified," add "The product baseline includes the sequential set of MVP/MVCR baselines as
appropriate." Delete "forthcoming."

SE Guidebook
Excerpt: 2.5 Another area to which incentives are tied…work products
Revision: Add "Reduction of technical debt in software-intensive systems and hardware systems upon which
software-intensive systems operate."

SE Guidebook
Excerpt: 2.5 Another area to which incentives are tied is the EVMS. The PM should ensure that the EVMS, tied
to any incentive, measures the quality and technical maturity of technical work products instead of just the
quantity of work.
Revision: Replace "the EVMS. The PM should ensure that the EVMS, tied to any incentive, measures the
quality and technical maturity of technical work products instead of just the quantity of work" with "a set of
metrics to assess and manage technical performance, schedule progress, speed, cybersecurity, and quality of
the development, its development teams, and ability to meet users' needs. Metrics collection will leverage
automated tools to the maximum extent practicable. Those metrics will be used to update cost estimates and
cost and software data reporting from the planning phase throughout the execution phase. Metrics should
address software technical performance and quality (e.g., defects, rework) evaluating the software's ability to
meet user needs." (Source: DoDI 5000.87)


SOW Handbook
Excerpt: APPENDIX A WORK WORDS/PRODUCT WORDS. Product Scope (the features and functions that
characterize a product, service, or result)
Revision: Add "Also called Product Baseline."


NDIA Predictive Measures

The NDIA Predictive Measures guide includes predictive indicators that can be used to develop and implement
effective mitigation plans. Excerpts from the sections Requirements Completion Metrics and Technical
Performance Measures (TPM) follow.

NDIA Requirements Completion Metrics

Predictive Nature: Unfavorable differences in requirements completion metrics indicate a threat to timely
delivery of a capable system that satisfies stakeholders' needs. The metric indicates progress in eliciting and
documenting all the requirements necessary for a final, completed systems design.
The base measures are:

• Total Requirements consisting of:


1. The physical count of system level requirements statements at the transition from
the systems requirements phase to preliminary design.

2. The expected count of requirements analyzed from the system level to be
eventually allocated to the system elements (configuration items).

• Requirements Planned - the time-phased profile count of total requirements fully articulated
given resource capability and capacity. This value might come from Control Account Plans
for completion of specifications.
• Requirements Completed – the count of completed requirements as determined from work
package level status reports or the system requirements database.

The basic algorithms are:
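
As an illustration only, the sketch below shows a common way to combine the base measures into status
indicators; it is an assumption for this paper, not a calculation copied from the Predictive Measures guide.

# Assumed illustration of requirements completion indicators; not quoted from the NDIA guide.

def requirements_completion_status(total, planned_to_date, completed_to_date):
    """Compare requirements completed against the time-phased plan."""
    percent_complete = 100.0 * completed_to_date / total
    percent_planned = 100.0 * planned_to_date / total
    variance = completed_to_date - planned_to_date  # negative means behind plan
    return percent_complete, percent_planned, variance

# Example: 400 total requirements, 250 planned by the status date, 220 completed.
print(requirements_completion_status(400, 250, 220))  # (55.0, 62.5, -30)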

NDIA TPM

TPM involves predicting the future values of a key technical performance parameter of the higher level
end product under development based on current assessments of products lower in the system
structure. A good TPM has the element of traceability of the technical requirements to WBS to TPMs to
EVM Control Accounts. In the Control Account, a description of the TPM and its allowed range of values
for the Period of Performance of that Control Account should be defined.

The Systems Engineering Management Plan (SEMP) and the resulting SE architectural documents are used
to further define the TPMs and to set threshold values.
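
A minimal sketch of tracking a single TPM against its planned value and its threshold follows; the weight
figures and field names are illustrative assumptions, not program data.

# TPM tracking sketch with illustrative values; the figures are assumptions.

def tpm_status(current_estimate, planned_value, threshold):
    """Compare a TPM's current estimate (rolled up from lower-level products)
    against its planned value and its not-to-exceed threshold."""
    margin = threshold - current_estimate        # remaining margin to the threshold
    variance = current_estimate - planned_value  # positive is worse than plan for a weight TPM
    return {"margin": margin, "variance": variance,
            "within_threshold": current_estimate <= threshold}

# Example: a vehicle weight TPM traced to the control account for the airframe WBS element.
print(tpm_status(current_estimate=10450.0, planned_value=10300.0, threshold=10600.0))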

Digital Artifacts

Typical artifacts that should be the base measures of schedule performance are outputs from the
measurement and verification processes in OSD Best Practices for Using SE Standards (ISO (International
Organization for Standardization)/IEC (International Electrotechnical Commission)/IEEE (Institute of Electrical
and Electronics Engineers) 15288, IEEE 15288.1, and IEEE 15288.2) on Contracts for DOD Acquisition Programs
(15288BP), GAO Agile, PB-EV, and CMMI® for Development, Version 1.3 (CMMI-DEV, V1.3).

These outputs are ASOTs for PMs. When DE is employed, the digital versions of these artifacts should be
automatically transferred from the engineering to the program management organizations.

Per SE Guidebook, “software development activities should employ automation across all aspects of the
software factory and project management components to eliminate tedious, manual steps to the
maximum degree practicable, enabling higher velocity, consistency, and overall better-quality software
components.”

Typical DE artifacts are included in Appendices A and B. The primary source of the artifacts in PB-EV is the
technical note, SEI-EVM, published in 2002. In 2010, SEI published information regarding Agile methods in
CMMI-DEV, V1.3. Excerpts from CMMI-DEV, V1.3, including the processes, Requirements Development,
Configuration Management, and Quantitative Project Management, are in Appendix E.

Appendix A ASOT for Selecting DE Metrics and Typical DE Artifacts



Doc. Excerpts
5000.89 As part of the DE strategy...tools...must provide authoritative sources of models,
data, and test artifacts (e.g. test cases, plans, deficiencies, and results)
15288BP 6.3.5.4 Requirements Traceability Mapping
1) Includes full bi-directional traceability between the requirements source and
the system requirements down to their lowest level.
15288BP 6.3.7.4 Measurement process outputs
c) Measurement data with the following attributes:
1) Provides data on established TPMs for use in project assessment and control
to support the assessment of the system technical performance, and for an
assessment of risk in achieving the measures of effectiveness or measures of
performance and associated operational requirements.
NOTE—TPMs are a subset of measures that evaluate technical progress (i.e.,
product maturity) and support evidence-based decisions at key decision points
such as technical reviews or milestone decisions.
2) Provides technical project measurement data for use in project assessment
and control to support the assessment of technical progress toward fulfilling
system requirements.
15288BP 6.4.9.4 Verification process outputs
a) Planned system verification with the following attributes:
1) Quantitatively verifies that each system product …meets all of its
requirements and design constraints in accordance with the verification
method for each requirement or constraint in the allocated baseline.
b) Verification results with the following attributes:
1) Verify required performance of all critical characteristics by demonstration or
test.
2) Verify risks identified in the Risk Management process are mitigated to levels
acceptable for continued development of the system as planned.
d) Acceptance verification data with the following attributes:
1) Verifies that each delivered hardware product, each constituent product of a
delivered hardware product, and each system product that is used to
manufacture, verify, integrate, or deploy end products that are to be
delivered meets each of its requirements …in the maintained, allocated, or
product baselines in accordance with the applicable verification method or
verification requirements.
GAO Agile Data from Agile artifacts enables contract oversight
Programs should also collect actual data associated with the program’s releases,
features, and capabilities to enable contract oversight and hold contractors
accountable for producing quality deliverables.
GAO Schedule Best Practice 1: Capturing All Activities
Is the IMS maintained in scheduling software and linked to external, detailed
project schedules?

SELI 1. Requirements Validation Trends
2. Requirements Verification Trends
3. Technical Measurement Trends
INCOSE Tracking Requirements management status:
• Defined
• Validated
• Verification method determined
• Approved
• Allocated
• Traced to verification document (test procedure)
• Designed
• Implemented
• Tested
• Verified

EVM The purpose of Requirements Management is to manage the requirements of
the project’s products and product components and to identify inconsistencies
between those requirements and the project’s plans and work products.
• The project plans, activities, and work products are reviewed for consistency
with the product requirements and the changes made to them.
SEI Digital modeling provides us with another analytical tool--a coverage metric,
which allows us to evaluate a current state of the model. In addition to
calculating statistics of how many requirements are covered by test cases
(Verify relationship) or design elements (Satisfy relationship), every metric
records a time stamp. Periodically calculating the same metric allows the user to
monitor changes of a specific aspect of the model in time.

With MBSE, the record of authority shifts away from the documents to the
digital model.
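
As an illustration of the coverage metric described above, the following Python sketch computes the percentage of requirements covered by a given relationship type and stamps each sample with the time of calculation so successive samples show a trend. The data structures, field names, and relationship labels are assumptions for illustration; an actual MBSE tool would supply these relationships from the model.

from datetime import datetime, timezone

def coverage_metric(requirement_ids, relationships, relation_type):
    """Percentage of requirements covered by at least one relationship of the
    given type (e.g., "Verify" to test cases or "Satisfy" to design elements),
    recorded with a timestamp so repeated samples show the trend over time."""
    covered = {
        rel["requirement"]
        for rel in relationships
        if rel["type"] == relation_type and rel["requirement"] in requirement_ids
    }
    total = len(requirement_ids)
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "relation": relation_type,
        "covered": len(covered),
        "total": total,
        "percent": round(100 * len(covered) / total, 1) if total else 0.0,
    }

# Illustrative data: two requirements, one verified by a test case.
reqs = ["REQ-001", "REQ-002"]
rels = [{"requirement": "REQ-001", "type": "Verify", "target": "TC-017"}]
print(coverage_metric(reqs, rels, "Verify"))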
SW Modernization 3 Unifying Principles
Resilient software must be defined first by execution stability, quality, and
dependable cyber-survivability. These attributes can be achieved at speed by
aggressively adopting modern software development practices that effectively
integrate performance and security throughout the software development
lifecycle.

More Than Code - Software modernization is more than just code development.
It includes the many policies, processes, and standards that take a concept from
idea to reality. Considerations such as contracting and intellectual property
rights, as well as transition from development to fielding, are often overlooked
and underappreciated. These policies, processes, and standards must not hinder,
but empower the vision of this strategy.

SEP Introduction:
• The SEP should include a digital ecosystem implementation plan that
addresses the DE Strat goals and defines six key digital engineering
ecosystem attributes … Applied elements of these attributes
(requirements, models, digital artifacts, …) will be evident in the
planning of the digital ecosystem implementation that results in the
(ASoT) for the program
• The SEP will describe a data management approach consistent with the
DoD DE Strat. The approach should support maximizing the technical
coherency of data as it is shared across engineering disciplines …
Additional approaches to data management should at a minimum
describe:
o Digital artifact generation for reporting and distribution purposes
SEP 2.1 Requirements Development
Program should maximize traceability and the use of models as an integral
part of the mission, concept, and technical baseline to trace measures of
effectiveness, measures of performance, and all requirements throughout
the life cycle from JCIDS (or equivalent requirements authoritative
source(s)) into a verification matrix, equivalent artifact, or tool that provides
contiguous requirements traceability digitally.

Program should trace all requirements from the highest level (JCIDS or
equivalent requirements sources) to the lowest level (e.g., component
specification or user story). This traceability should be captured and
maintained in digital requirements management tools or within model(s).
The system Requirements Traceability Matrix (RTM) should be a model
output that can be embedded in or attached to the SEP, or the SEP should
contain a tool reference location. …The matrix should include the
verification method for each of the identified requirements and an indication
whether each requirement is expected to change over the life of the
program.
SEP 2.3 Specialty Engineering (SpEng)
As part of the program’s digital engineering approach, describe how
models, simulations, the digital ecosystem, and digital artifacts will be
used as part of an integrated approach to supporting SpEng activities and
deliverables.
SEP 3.2.2 TPMs
Technical Assessment Process … should include … a set of TPMs
covering a broad range of core categories, rationale for tracking,
intermediate goals, and the plan to achieve them with as-of dates (Table
3.2-2). (a) This table was erroneously numbered “3.2-2.” It should be
“3.2.1.”

PSM DE measurement framework 2. MAJOR CONCEPTS
Because DE processes help to define the capabilities of the eventual system, DE
measures can serve as useful leading indicators for other product related
measures.

8.7 DEPLOYMENT LEAD TIME
Deployment Lead Time is a measure of how rapidly authorized requests for
system capabilities and work products can be engineered, developed, and
delivered for use in their intended operational environment.

CYCLE TIME
The elapsed time from when development work is started until the time
development work has been completed and is ready for deployment. This
time includes activities such as planning, requirements analysis, design,
implementation, and testing.

Base Measures 1: Completed Date: timestamp when authorized work
completes development (design, implementation, integration, testing) and is
authorized for deployment.
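
The following Python sketch shows how cycle time could be derived from the base measures defined above (start and completion timestamps of authorized work items). The item structure and field names are illustrative assumptions, not the PSM specification's data format.

from datetime import datetime
from statistics import mean

def cycle_time_days(work_items):
    """Elapsed days from start of development work to completion (ready for
    deployment) for each authorized work item, plus the average across items.
    The 'started' and 'completed' field names are illustrative."""
    durations = []
    for item in work_items:
        started = datetime.fromisoformat(item["started"])
        completed = datetime.fromisoformat(item["completed"])
        durations.append((completed - started).total_seconds() / 86400)
    return {"per_item_days": durations, "average_days": mean(durations)}

# Illustrative data for two completed work items.
items = [
    {"id": "CAP-12", "started": "2024-03-01", "completed": "2024-03-22"},
    {"id": "CAP-13", "started": "2024-03-10", "completed": "2024-04-02"},
]
print(cycle_time_days(items))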
RIO 2.4.1
Ensure risk mitigation plans are reflected in the IMP, IMS, TPMs, and the EVM
baseline.
3.2.1 Risk Identification Methodologies
Assess technical performance at all levels: component, subsystem, integrated
product, external interfaces.
3.4.5 develop a risk burn-down plan for all high and moderate risks and for
selected low risks.
A.4.2 Typical Contractor Responsibilities
• Synthesize and correlate new and ongoing risk elements in the IMS, risk
mitigation plans, estimates at completion, technical status documentation, and
program updates and reviews.
5.5.1.2 Develop Strategies
Establish effective metrics to monitor and manage the program. Planned
metrics should consider recommendations for agile metrics per the DoD
Agile Metrics Guide
5.5.2.5 Iterate
• Review and update the risk register and backlog before each iteration.
Reprioritize with the user based on feedback from previous iteration(s) and
track accumulation of technical debt.
DOT&E
…commercial “agile software” development … published best practices …
include clear articulation of the capabilities required in the MVP, focused
testing, comprehensive characterization of the product, and full delivery of
the specified operational capabilities.

IMP/IMS 2.4 Digital Engineering Guidance
Project schedules are digital models and should be integrated with other
digital models of the project to support the project’s DE effort.
SE Guidebook 2.2.4 Software Engineering
Properly planned software engineering processes can mitigate cost and
schedule risks by allowing DoD programs to identify and remove software-
related technical debt early in development. This early action can increase
acquisition efficiency and lead to higher success rates during operational
testing and during operations and sustainment.
SE Guidebook Schedule Management
Include metrics to assess both schedule health … associated completeness of the
Work Breakdown Structure and the risk register. A healthy, complete and risk-
enabled schedule forms the technical basis for the EVMS. Strong schedule metrics
are paramount for accurate EVMS data.

Software Quality
Metrics should address software technical performance and quality (e.g., defects,
rework), evaluating the software’s ability to meet user needs.

SE Role in Contracting
To adopt commercial best practices and advances, Program Management Offices
(PMOs) should use the DoDI 5000.87 for software acquisition

Incentive fees and penalties such as award fee may be tied to program performance
…evaluated during technical reviews,
PB-EV Maintain bi-directional traceability of product and product component
requirements among the project plans, work packages, planning
packages, and work products. Requirements traceability is a necessary
activity of mapping customer needs to the system requirements and
tracking how the system requirements are met throughout the development
process—in the design, to system component development, through testing
and system documentation, including for validation, verification, as well as
to the project plans, and work products. CMMI® requires bi-directional
traceability, that is, that evidence of an association between a requirement
and its source requirement, its implementation, and its verification is
established from the source requirement to its lower-level requirements,
and from the lower-level requirements back to their source. A requirements
traceability matrix is used to track the requirements.
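
A minimal Python sketch of the kind of bi-directional traceability check described above follows. It assumes the RTM is available as a simple list of records with identifiers and parent links; the field names and levels are illustrative, not drawn from PB-EV or CMMI.

def traceability_gaps(rtm):
    """Flag source requirements with no lower-level children (forward gap) and
    lower-level requirements with no valid parent (backward gap). The RTM is
    assumed to be a list of dicts with 'id', 'level', and optional 'parent' keys."""
    ids = {row["id"] for row in rtm}
    parents_in_use = {row["parent"] for row in rtm if row.get("parent")}

    forward_gaps = [row["id"] for row in rtm
                    if row.get("level") == "source" and row["id"] not in parents_in_use]
    backward_gaps = [row["id"] for row in rtm
                     if row.get("level") == "lower" and row.get("parent") not in ids]
    return {"no_children": forward_gaps, "no_valid_parent": backward_gaps}

# Illustrative RTM rows.
rtm = [
    {"id": "SYS-1", "level": "source"},
    {"id": "SYS-2", "level": "source"},
    {"id": "SW-10", "level": "lower", "parent": "SYS-1"},
    {"id": "SW-11", "level": "lower", "parent": "SYS-9"},  # orphan: parent not in RTM
]
print(traceability_gaps(rtm))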

DoDI 5000.87 (4) …define the MVP recognizing that an MVP’s definition may evolve as user needs
become better understood. Insights from MVPs help shape scope, requirements,
and design.

(11) Each program will develop and track a set of metrics to assess and manage the
performance, (schedule) progress, speed, cybersecurity, and quality of the software
development, its development teams, and ability to meet users’ needs. Metrics
collection will leverage automated tools to the maximum extent practicable. The
program will continue to update its cost estimates and cost and software data
reporting from the planning phase throughout the execution phase.
Agile Metrics 5.1.1 Story Points
5.1.7 Release Burnup Charts
… measure the amount of work completed for a given release based on the total
amount of work planned for the release. Usually, story points are used as the unit
of measure to show planned and completed work.

Additional Context
Conceptually, release burnup could be measured using requirements or user
stories as the unit of measure as well. From the user perspective, understanding
how many requirements are completed and how many remain might be a better
way of communicating progress than story points. Additionally, like burndown
charts, burnup charts can be applied to other scopes of work beyond releases (e.g.,
sprint burnup and product burnup).

Variations
• The number of requirements completed provides insight to users on
requirements completed and requirements remaining.
• The number of user stories completed is similar in concept to the metric showing
the number of requirements completed.
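
As an illustration of the burnup variations above, the following Python sketch accumulates the count of requirements completed per sprint against the total planned for a release. The sprint names and counts are illustrative assumptions.

def release_burnup(planned_total, completions_by_sprint):
    """Cumulative count of requirements completed per sprint against the total
    planned for the release; inputs are illustrative counts, not a mandated format."""
    completed = 0
    rows = []
    for sprint, count in completions_by_sprint:
        completed += count
        rows.append({"sprint": sprint, "completed": completed,
                     "remaining": planned_total - completed})
    return rows

# Illustrative release of 40 requirements, burned up over four sprints.
for row in release_burnup(40, [("S1", 6), ("S2", 9), ("S3", 11), ("S4", 8)]):
    print(row)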

5.2 Agile Quality Metrics


5.2.1 Recidivism
Recidivism describes stories that are returned to the team for various reasons.

5.3 Agile Capability Delivery Metrics


Agile capability delivery metrics measure delivery progress over time in alignment
to desired outcomes (measured by value).
5.3.1 Delivered Features (or Delivered Capabilities)
The count of delivered features measures the business-defined features accepted
and delivered.

Appendix B PB-EV Typical SE/DE work products/artifacts

PB-EV Table E-1: Typical SE/DE Work Products/Artifacts in CMMI


CMMI Process Area Typical Work Products/Artifacts
Requirements Development:
Customer requirements
Derived requirements
Product requirements
Product-component requirements
Interface requirements
Functional architectures
Activity diagrams and use cases
Object-oriented analyses with services identified
Technical performance measures
Records of analysis methods and results

Results of requirements validation
Technical Solution:
Product component operational concepts, scenarios, and environments
Use cases
Documented relationships between requirements and product
components
Product architectures
Product-component designs
Technical data packages
Allocated requirements
Product component descriptions
Key product characteristics
Required physical characteristics and constraints
Interface requirements
Material requirements
Verification criteria used to ensure requirements have been achieved
Conditions of use (environments) and operating/usage scenarios,
modes, and states for operations, support, training, and
verifications throughout the life cycle
Interface design specifications
Interface control documents
Implemented design
Product support documentation (training materials, users manual,
maintenance manual, online help.)
Requirements Management:
Requirements traceability matrix
Validation:
Validation results
Verification:
Exit and entry criteria for work products
Verification results

Measurement and Analysis:
Specifications of base and derived measures
Decision Analysis and Resolution:
Results of evaluating alternate solutions

PB-EV Table F-1 Trade Study Plan: Typical Work Products/Artifacts

Activity Trade Study Work Product/Artifacts


1. Generate trade study plan Trade study plan (based on time stamps of
planned completion dates)
2. Establish objectives Trade objectives
3. Establish evaluation criteria Evaluation criteria
4. Define baseline candidates Candidate definition:
Include performance characteristics and/or models, engineering drawings,
schematics, flow diagrams, equations, etc.

5. Establish candidate evaluation methods: Evaluation methods
Approaches include preliminary design, analysis/evaluations, prototyping,
simulation, analytical modeling, lessons learned, analysis
6. Establish interpretation guidelines Interpretation guidelines
7. Trade study stakeholder review Stakeholder review report
8. Evaluate candidates Results of performing evaluation
9. Prioritize according to best fit Trade study recommendations
10. Establish refinement criteria (if necessary): Refinement criteria and methods
Accommodate new information

Appendix C PSM DE measurement framework Artifacts



Artifact: Source Functional Requirement
Description: Statement that identifies what results a product … shall produce; a function that a system or
system component shall perform.
Source: 8.1 ARCHITECTURE COMPLETENESS AND VOLATILITY
Function: A task, action, or activity that must be accomplished to achieve a desired outcome. A function may
originate from source functional requirements, use cases, or functional decomposition.

Artifact: Source Element
Description: The base model elements defined per DE model from which other model elements shall be derived
or to which they shall be allocated, e.g., stakeholder needs.
Source: 8.2 MODEL TRACEABILITY
The usefulness and quality of a digital model depends on the completeness and integrity of the relationships
among model elements. Traceability between elements, such as requirements allocation and flow down to
architectural, design, and implementation components, assures that the system solution is complete and
consistent. Gaps in bi-directional traceability between the artifacts of two models might indicate where further
analysis or refinement is needed.
The traceability concepts and indicators in this specification
are representative examples of more general traceability
mappings and reports across the development life cycle, such
as:
• Traceability between stakeholder needs, system
requirements, and allocated or derived requirements at each
level of the system hierarchy
• Traceability and flow down of requirements to the logical or
physical solution domain (e.g., design, implementation,
integration, verification, validation)
• Allocation and traceability of performance measures or
parameters, such as Measures of Effectiveness (MOEs) or Key
Performance Parameters (KPPs)
• Traceability of system interfaces.
Copyright Notice: General Use: Permission to reproduce, use this document or parts thereof, and to prepare
derivative works from this document is granted, with attribution to the participating organizations and the original
author(s), provided this copyright notice is included with all reproductions and derivative works.

Appendix D

Excerpts from DOD INSTRUCTION 5000.97 DIGITAL ENGINEERING, December 21, 2023
Glossary:
DE: An integrated digital approach that uses authoritative sources of systems' data and models as a continuum across
disciplines to support lifecycle activities from concept through disposal.
DE Ecosystem: The interconnected infrastructure, environment, and methodology (process, methods, and tools) used to store,
access, analyze, and visualize evolving systems' data and models to address the needs of the stakeholders.

1.2. POLICY.
a. The DoD will conduct a comprehensive engineering program for defense systems, pursuant to DoD Instruction (DoDI)
5000.88. In support of that effort, the DoD will use DE methodologies, technologies, and practices across the life cycle of
defense acquisition programs,… engineering, and management activities.
b. DoDI 5000.88: certain programs must include a DE implementation plan in the SE plan.

2.7. DOD COMPONENT HEADS WITH ACQUISITION AUTHORITY.


(2) Provide guidance and support for program managers to develop, validate, and maintain:
(a) Credible and coherent authoritative sources of truth (ASOT) shared with stakeholders.
(b) Digital models that accurately reflect the architecture, attributes, and behaviors of the system they represent.
3.1 DE
c. Uses computer systems for the development, verification, validation, use, curation, configuration management, and
maintenance of technically accurate digital models in support of system life-cycle activities. These models capture system
representations and, together with their underlying data, provide an authoritative source of truth (ASOT).
d. Moves the primary means of communicating system information from documents to digital models and their
underlying data.
3.2 DE CAPABILITY.
b. DE Capability Elements.
(3) Digital Threads.
(b) The digital thread allows different audiences with different perspectives to extract data from and adjust usage of models to
carry out different activities, including, but not limited to:
1. Requirements analysis.
2. Architecture development.
3. Design evaluation and optimization.
4. System, subsystem, and component definition and integration.
5. Cost estimating.
6. Training aids and devices development.
7. Developmental and operational tests.
(4) Digital Artifacts.
Digital artifacts are the digital products and views that can be dynamically generated directly from digital models. These
artifacts are created from the standards, rules, tools, and infrastructure within a DE ecosystem. Some common examples of
digital artifacts include, but are not limited to:
(a) Design specifications.
(b) Technical drawings (e.g., authorization boundaries, data flows).
(c) Design documents.
(d) Interface management documents.
(e) Analytical results.
(f) Bills of material.
(g) Software source code.
(h) Work breakdown structure.
(i) Production or machining instructions.
(j) Test planning and cases.
(k) Schedules.
3.4. IMPLEMENTATION OF DIGITAL ENGINEERING.
b. The PM will identify and require digital models, artifacts, and data sets as deliverables in the contract through
contract data requirements lists and data item descriptions.
3.5. PROCEDURES FOR MAINTAINING DIGITAL MODELS AND AUTHORITATIVE DATA SOURCES.
a. Digital Models.
(1) Programs will identify and maintain model-centric baselines, approaches, and applications in a digital form that
integrates the technical data and associated digital artifacts that stakeholders generate throughout the system life
cycle.
b. Authoritative Data.
Programs should develop and implement plans to establish current, consistent, enduring, and authoritative sources of
truth for digital models and data.

Appendix E Excerpts from CMMI-DEV, V1.3

Requirements Development

Configuration Management

Quantitative Project Management

Appendix F page 1 of 2
Excerpts from 2019 NDIA SE Div. Input to the 2018 DoD Defense Science Board (DSB) Report, Design and
Acquisition of Software for Defense, and from the DSB Report

DSB Excerpts:
Background

NDIA Excerpts:
NDIA, in collaboration with the International Council on SE (INCOSE) and PSM, has volunteered to provide input to
USD(A&S) and USD(R&E) representing the “industry perspective” on implementation of the DSB recommendations.

While the DSB report focuses primarily on SOFTWARE design and acquisition using continuous and iterative methods,
NDIA believes that the scope must be expanded to focus on SYSTEM design and acquisition using continuous and
iterative methods.

Steering at lower levels is integrated with roadmap updates and MVP/Next Viable Product (NVP) planning.
• Contracts defined by MVP: Contracting approach includes mechanisms for flexibly defining and approving MVP/NVP
capabilities.

DSB #1: Software Factory Picture of Success (end-state):


Soft link all of the tools in the value stream to deliver software. Review that all of the tools are soft linked.
• Requirements Tools
• Product Backlog
• Master Schedule
• Models
• Repository
• Test Tools
• Deployment Tools that demonstrate end-to-end traceability

Appendix F page 2 of 2
NDIA Excerpts continued:

Appendix G, page 1 of 2
2006 INCOSE International Symposium paper, “Using Earned Value to Track Requirement Progress,” by
Paul Solomon, July 2006
Copyright © 2006 by Paul Solomon. Published and used by INCOSE with permission.
Note: A PDF of this paper may be downloaded from www.pb-ev.com, at the “White Papers” tab.
Excerpts:
It is necessary to track the status of each requirement as it moves through engineering life cycle activities.
Measures that reflect the status of the requirements are essential to monitor program status and serve as
a scorecard to indicate that requirements are being implemented on schedule. This paper provides
guidance to use the tools of requirements traceability to plan and measure the progress of the
requirements management activities. The requirements traceability matrix (RTM) can be used as a
scheduling source and as a set of base measures of Earned Value (EV). Finally, the importance and value of
comparing the schedule variances of the requirements management and tracing activities with the
variances of other project activities is discussed.
Progress.
It is important to quantify the progression of requirements from concept to formulation to design to test.
Peter Baxter discusses assessing these requirements to ensure that your product contains all required
functionality. Baxter’s advice addresses software requirements but is also applicable to the system
requirements: It is advisable to measure the number of requirements that each software process generates
or accepts. Measure the number of system or top-level software requirements (i.e. features or
capabilities), as well as the decomposition of system requirements into more detailed requirements. In
order to track differences between developed and planned requirements, it is necessary to also measure
the status of each requirement as it moves through life cycle activities. A typical requirement status could
be: defined, approved, allocated, designed, implemented, tested, and verified. A measure that shows the
status of all requirements is essential in monitoring program status and acts as a scorecard to illustrate
that requirements are being implemented. Early in the program schedule, ensure that requirements
become defined, approved, and allocated as the system architecture is finalized. Near the end of the
program schedule, you should see requirements move from implemented status, to tested, then to verified
status (Baxter 2002). Measuring the status of each requirement as it moves through life cycle activities is
an essential control tool for effective project management.
Recommended Requirements Statuses
To recap, a recommended set of requirements management statuses is:
• Defined
• Validated
• Verification method determined
• Approved
• Allocated
• Traced to verification document (test procedure)
• Designed
• Implemented
• Tested
• Verified

Appendix G, page 2 of 2
When determining which project activities and work products should be discretely scheduled and tracked,
PMs regard the RTM as a tool, not as a work product. They propose that populating the RTM with data is
a support activity to the real work products of engineering development (designs, test articles, test results
etc.). They also argue that the actual completion of many of the activities listed above, as well as the associated
documents, is the responsibility of other engineers, not the requirements management engineers. They
then point to those who are actually doing the designing or testing or making related decisions.
Consequently, the requirements engineers conclude that, if the allocated requirements have not been
implemented into the design on schedule, or the test procedure does not yet include all necessary test
cases, or the verification of requirements is behind schedule, it’s not their fault. Therefore, they propose,
their activities should be measured as LOE. It is recommended that, regardless of accountability, the
progress of requirements, as they progress through the engineering life cycle, should be scheduled and
measured against a plan. Of course, discrete earned value techniques should be used for management
control. Even though the budget for the requirements engineers may be relatively small, as compared with
the budgets for all other engineers, the earned value taken in control accounts or work packages for
requirements management activities can be the most important indicator of project schedule
performance. The schedule status of the set of requirements reveals more about the health of the project
than any other schedule performance indicator in the Performance Measurement Baseline (PMB).
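
To illustrate how the RTM can serve as a set of base measures of earned value, the following Python sketch spreads a requirements-management budget evenly across the requirements in the RTM and credits each requirement according to the furthest life-cycle status it has reached. The status weights and dollar figures are illustrative assumptions, not values from the paper.

# Illustrative weights: cumulative credit as a requirement moves through the
# life-cycle statuses recommended above (values are assumptions, not from the paper).
STATUS_WEIGHT = {"Defined": 0.1, "Allocated": 0.2, "Designed": 0.4,
                 "Implemented": 0.6, "Tested": 0.8, "Verified": 1.0}

def requirements_earned_value(rtm, budget_at_completion):
    """Earned value for requirements work: the budget is spread evenly across
    requirements and credited per the furthest status each has reached."""
    per_req_budget = budget_at_completion / len(rtm)
    earned = sum(per_req_budget * STATUS_WEIGHT.get(r["status"], 0.0) for r in rtm)
    return round(earned, 2)

# Illustrative RTM: four requirements, $100K budgeted for requirements work.
rtm = [{"id": "R1", "status": "Verified"}, {"id": "R2", "status": "Tested"},
       {"id": "R3", "status": "Designed"}, {"id": "R4", "status": "Defined"}]
print(requirements_earned_value(rtm, 100_000))  # prints 57500.0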
Conclusions
If the requirements management and traceability activities are behind schedule, it is an early warning that
the rest of the project is or will be in trouble. We recommend that a PM look at the progress and schedule
variance of these activities early in any review. The requirements management and traceability activities
should be discretely planned and measured. If these activities are realistically planned, they provide a valid
basis for Outcome-based metrics (published as “Performance-based EV”) and give the PM insight into
progress of the total program.

Appendix H PSM Excerpts

Many of the measurable benefits of DE are associated with the use of both data and validated digital models as a
“source of truth” across life cycle activities.

Page 3
Thus, DE has three interrelated concerns: the transformation of engineering activities to fully digital infrastructure,
artifacts, and processes; the use of authoritative sources of data and models to improve the efficiency and
productivity of engineering practice; and the use of MBSE practice to fully integrate system data and models with
engineering, program management, and other domains and disciplines.

Page 9
DE measures can serve as useful leading indicators for other product related measures. DE can produce additional
products in support of delivered data, hardware, and software products such as digital twins or other model- or
simulation-based executable systems.

Page 54
In a DE environment products are model-driven, providing additional opportunities to cost-effectively incorporate
changes to digital models that are directly traceable to the implemented and tested work products, some of which
can be automatically generated.
Page 59
Model-based work products such as requirements, architecture, design, use cases and other views or modeling
artifacts can be automatically generated and published directly from modeling tools, at significant savings in effort
relative to traditional documentation-centric approaches. Model-driven automation based on an Authoritative
Source of Truth (ASoT) can lead to process efficiencies, labor reductions, shorter cycle times, less rework, and
earlier verification and validation of solutions.
