3 SYSTEM DESIGN
Purpose
List of Processes
Figure 3-1 The processes of System Design (such as Define Technical Architecture and Define System Standards) shown in context with the processes of the preceding System Requirements Analysis phase (such as Determine Business Requirements and Define Process Model) and the subsequent System Construction phase (such as Refine System Standards and Build, Test and Validate (BTV)).
List of Roles
List of Deliverables
Figure 3-2
Purpose
Description
and that the team has access to the data repository that will be used throughout design efforts. A key activity will be the definition of the mechanisms and processes to be followed for creating and maintaining all System Design related materials, similar to the repository that was utilized for all System Requirements work products and deliverables.
During this phase, the Project Team’s focus moves from business and functional areas to technical issues. As a result, there is often less involvement on the part of those project participants more closely aligned with the organization’s functional and operational needs (the Stakeholders, Customer, Consumer, and Project Sponsor). These parties may begin to feel isolated or removed from the project due to their reduced involvement and they may not be immediately aware of much of the progress of the Project Team. While they are likely to play an active role in the discussions surrounding test planning and data conversion, they usually have limited involvement in the identification of the technical architecture and standards and the development of Technical Specifications. This situation poses a challenge for the Project Manager, since these individuals will ultimately be profoundly affected by all of these activities. The Project Manager must maintain effective communications with these individuals throughout this phase to ensure that they understand the implications of the technical decisions being made.
Business area experts, such as Customer Representatives, typically have much less
involvement in System Design than in the System Requirements Analysis phase. Those
areas of System Design in which they are most often involved include:
■ Reviewing iterations of the prototype and user interface design.
■ Defining detailed business-related algorithms that were not specified during System
Requirements Analysis.
■ Approving plans for converting from old system(s) to the new one.
■ Validating user security schemes and authorizations.
Periodic design reviews conducted at key points during System Design often provide a way to
batch user comments so that they are most useful to the software designers.
Purpose
Description
Obviously, the Technical Lead/Architect is crucial throughout this process. Keys to the Technical Lead’s success are familiarity with and a background in multiple technologies, and the ability to assess the pros and cons of these technologies as they apply to the system at hand.
As Project Manager, you need to ensure that the Technical Lead has access to additional expert-
ise and resources, if needed.
Deliverable
Agency
Project Name
Project Sponsor
Project Manager
Document Date
Prepared By
Enter the name of the Agency for which the system is being developed.
Enter the Project Name, and the names of the Project Manager and the Project Sponsor.
Enter the Date as of which this document is current.
Enter the names of the Project Team members by whom the document was Prepared.
Technical Architecture
TABLE OF CONTENTS
The goal of this Technical Architecture is to define the technologies, products, and tech-
niques necessary to develop and support the system, and to ensure that the system com-
ponents are compatible and comply with the enterprise-wide standards and direction
defined by the Agency.
The Document Scope narrative also provides an overview of the efforts conducted to understand the existing technical environment and IT strategic direction and to determine how the system’s proposed technical architecture fits into them.
The System Architecture Context Diagram provides the “big picture” view of the system’s
architecture, and puts it in context with the rest of the Performing Organization’s systems
portfolio, illustrating how the system’s hardware and software platforms fit into the existing
environment.
For each System Architecture Component, the narrative describes specific Component
Functions, requirements and other Technical Considerations that were used in the decision-
making process, as well as any specific Products selected to implement this component. The
Selection Rationale identifies any other products that may have been considered, and pro-
vides rationale for the decision. Architecture Risks identifies any potential risks associated
with the architecture element.
The System Construction Environment section details the various environments necessary
to enable system construction and testing.
4.2 QA Environment
Purpose
Description
Deliverable
Agency
Project Name
Project Sponsor
Project Manager
Document Date
Prepared By
Enter the name of the Agency for which the system is being developed.
Enter the Project Name, and the names of the Project Manager and the Project Sponsor.
Enter the Date as of which this document is current.
Enter the names of the Project Team members by whom the document was Prepared.
System Standards
TABLE OF CONTENTS
All deviations from the <Agency> standards are annotated and explained.
This document addresses standards for the following areas:
● Graphical User Interface
● Reporting
● Application Navigation
● Error Prevention and Correction
● Programming
● Documentation
● Naming Conventions
● Database Access and Views
● Data Creation and Updating
● Stored Procedures
The Document Scope narrative also provides an overview of the efforts conducted to understand the existing standards in the organization, and to research those areas for which no appropriate standards exist.
2.2 Reporting
2.5 Programming
2.6 Documentation
Miscellaneous standards address any other Technical Development areas that are not covered in the sections above.
3.2 QA Environment
Release Management Standards detail how source code, compiled applications, and data
will be migrated among the various environments (Development, QA, and Acceptance).
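What these standards look like in practice varies widely from one organization to another. As one hedged illustration only (the environment names, release identifier, and approval roles below are assumptions, not part of this Guidebook's templates), a team might capture its promotion path in a small script such as the following Python sketch, which refuses to let a release skip an environment on its way from Development through QA to Acceptance.

from typing import Optional

# Hypothetical sketch of a release-promotion rule: source code, compiled
# applications, and data move through the environments in a fixed order.
PROMOTION_PATH = ["Development", "QA", "Acceptance"]

def next_environment(current: str) -> Optional[str]:
    """Return the environment a release may be promoted to next, or None
    if it is already in the final (Acceptance) environment."""
    index = PROMOTION_PATH.index(current)
    return PROMOTION_PATH[index + 1] if index + 1 < len(PROMOTION_PATH) else None

def promote(release_id: str, current: str, approved_by: str) -> str:
    """Record a promotion, refusing to skip environments."""
    target = next_environment(current)
    if target is None:
        raise ValueError(f"{release_id} is already in {current}; nothing to promote.")
    print(f"{release_id}: {current} -> {target} (approved by {approved_by})")
    return target

if __name__ == "__main__":
    env = "Development"
    env = promote("release-1.4.0", env, approved_by="QA Lead")           # Development -> QA
    env = promote("release-1.4.0", env, approved_by="Project Manager")   # QA -> Acceptance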
For each kind of testing performed (Unit, Integration/System and Acceptance), define
Testing Standards and suggested approaches to setting up test cases and conducting tests,
and identify and describe Testing Tools that should be utilized in that testing cycle.
Purpose
Description
Data views are an effective way to manage the presentation of data to the user as well
as to accommodate many of the security needs of the system. Sometimes, data views
are overlooked until late in the project, often defined and created during the construction or
testing phases in response to security or performance issues. This is a clear case of “you can
pay me now or pay me more later”, with the costs associated with implementing these views
late in the project often exceeding what they would have been had data views been a focus
in the early design efforts.
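As a brief, hedged illustration of the point (the table, column, and view names below are invented for the example, not drawn from any particular system), a data view can expose only the columns and rows a given audience should see without duplicating the underlying data; the sketch uses Python's built-in sqlite3 module.

import sqlite3

# Illustrative sketch: a view restricts which columns and rows a reporting
# user can see, without copying the underlying data.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employee (
        emp_id      INTEGER PRIMARY KEY,
        name        TEXT,
        department  TEXT,
        salary      NUMERIC,      -- sensitive: excluded from the view
        active_flag INTEGER
    )
""")
conn.execute("INSERT INTO employee VALUES (1, 'A. Smith', 'Finance', 82000, 1)")
conn.execute("INSERT INTO employee VALUES (2, 'B. Jones', 'Audit',   76000, 0)")

# The view presents only active employees and omits the sensitive column.
conn.execute("""
    CREATE VIEW v_active_employee AS
    SELECT emp_id, name, department
    FROM employee
    WHERE active_flag = 1
""")

for row in conn.execute("SELECT * FROM v_active_employee"):
    print(row)   # (1, 'A. Smith', 'Finance')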
Deliverable
Purpose
Description
Another benefit to prototyping is that by actively involving the Customers in the design of the system, a sense of ownership and buy-in is created that might not otherwise be possible, or that certainly could be more difficult to achieve if the system were designed without their input.

In addition, there are advantages to engaging the Application Developers early in System Design. While these developers will make their primary contribution to the project during System Construction, involvement during System Design will enhance their overall understanding of the system, the business objectives, and the rationale behind many of the design decisions, all of which will contribute towards a stronger final product.
Deliverable
Purpose
Description
Operational Requirements (impacting Operations and Support) — System Performance, Data Archival, Audit and Controls, System Administration, SQA, and Business Continuity — are addressed through the Create Physical Database and Prototype System Components processes, producing:
✓ Transaction processing and reporting architecture.
✓ Performance optimization strategies.
✓ System management and administration capabilities.
✓ Audit, archival, and quality assurance processes.

Transitional Requirements (impacting Implementation) — Data Conversion, Release Validation, Documentation, Training, and Deployment — are addressed through the Produce Technical Specifications process, producing:
✓ Data extract, cleansing, import, and validation utilities.
✓ System test strategies, plans, and utilities.
✓ Training and documentation strategies, outlines, curriculum, and prototypes.
✓ Application deployment strategies and utilities.
Data Conversion
Much of the data required by the system will be entered through normal day-to-day business operations, whether through manual input or through automated mechanisms. Additional data may be required, however, before the system can effectively begin operation, such as:
◆ Historical data, typically found on existing legacy systems, that may need to be migrated to the new system to provide a basis for future calculations, historical or trend reports, etc.
◆ Reference data, also known as lookup data, which can be used to populate common tables upon which other data is dependent (e.g., system codes that might be referenced across multiple tables). This information may or may not reside on existing systems, depending upon how the eventual design of the new system maps to the legacy environments.
◆ New data, essential to the initial operation of the system being built, that may not be available on any legacy systems.
Whether or not the data the new system requires exists on legacy systems (or in spreadsheets, or on scraps of paper, etc.), the Project Manager must ensure that the Project Schedule includes the steps needed to obtain all required data in a format compatible with the new environment. This often necessitates the development of conversion and migration software modules, to support and ensure successful completion of the data conversion. Research may also be needed to determine whether data is valid, and cooperation between multiple organizations may be required as attempts are made to identify and resolve conflicting data.
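The shape of these conversion and migration modules depends entirely on the legacy sources and the new data model. The following Python sketch is a purely hypothetical illustration (the file layout, field names, and validation rules are assumptions) of the common extract-validate-load pattern, in which rows that fail validation are set aside for research rather than loaded silently.

import csv
from datetime import datetime

# Hypothetical sketch of a data conversion step: read legacy records,
# validate and cleanse them, and set aside rows that need manual research.
def convert_legacy_rows(legacy_file: str):
    converted, rejected = [], []
    with open(legacy_file, newline="") as f:
        for row in csv.DictReader(f):
            try:
                record = {
                    "account_id": int(row["ACCT_NO"]),
                    "opened_on": datetime.strptime(row["OPEN_DT"], "%m/%d/%Y").date(),
                    "status": row["STATUS"].strip().upper(),
                }
                if record["status"] not in {"ACTIVE", "CLOSED"}:
                    raise ValueError(f"unknown status {record['status']!r}")
                converted.append(record)
            except (KeyError, ValueError) as err:
                # Conflicting or "dummy" legacy values land here for Customer
                # review rather than being loaded into the new system silently.
                rejected.append({"row": row, "reason": str(err)})
    return converted, rejected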
Because data is often only as good as the source from which it originated, you need to ensure that you involve your Customers in evaluating and validating the information that may eventually be loaded into your system. Often there are historical implications or nuances embedded in the information that may not immediately be evident to someone unfamiliar with the data. The data itself may also imply business rules that may not have been captured during System Requirements Analysis. Historical data often contains “dummy” or otherwise invalid data to flag the existence of an exception situation. Without planning for direct and active involvement of your Customers during the data conversion process, the risk of missing or mishandling critical system data is greatly increased.
into the new system. If there is a need to run the new and legacy systems in parallel for some period of time to allow for validation of the new system, there may be additional data conversion implications that must be addressed. All identified impacts should be captured in the Project Implementation and Transition Plan and the Organizational Change Management Plan, both defined in the Project Planning phase of the Project Management Lifecycle.
Testing
Test plans created in the Produce Technical Specifications process define the overall strategy for validating the functionality of the system being developed, as well as the individual test cases that will be performed in the execution of this strategy. Additionally, the environments in which these tests will be executed must be defined in detail.
Unit, Integration, System, and Acceptance testing proceed in sequence along the Project Timeline.
Often one of the most difficult aspects of testing an application is defining and creating the appropriate set of test data needed to validate system functionality. This is especially true in environments that require special processing of data at the end of specific time periods (monthly, quarterly, annually, etc.), or that need to manage data across multiple fiscal years. Preparation of this data can be very time consuming, and it is in System Design that the scope and responsibilities for data preparation must be clearly defined.

Also, while the creation of representative or “dummy” test data may be acceptable for tests performed internally by the Application Developers, real or meaningful data should be employed in any testing that involves Customer Representatives.
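As a small, hypothetical illustration of why period-end test data takes deliberate preparation (the fiscal-year-end date and naming rule below are assumptions, not a prescribed standard), the following sketch generates transactions that straddle a fiscal-year boundary so that year-end processing paths can be exercised.

from datetime import date, timedelta

# Hypothetical sketch: build test transactions around a fiscal-year boundary
# (assumed here to be March 31) so that year-end logic gets exercised.
FISCAL_YEAR_END = date(2024, 3, 31)

def fiscal_year(d: date) -> int:
    # Assumed rule: the fiscal year is named for the calendar year in which it ends.
    return d.year if d <= date(d.year, 3, 31) else d.year + 1

def boundary_test_transactions():
    rows = []
    for offset in (-2, -1, 0, 1, 2):          # days around the boundary
        txn_date = FISCAL_YEAR_END + timedelta(days=offset)
        rows.append({"date": txn_date, "amount": 100.00, "fy": fiscal_year(txn_date)})
    return rows

for row in boundary_test_transactions():
    print(row["date"], "FY", row["fy"])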
Deployment Planning
By this point, the Project Manager and Project Team have
determined what needs to be built, how to build it, and how the
Performing Organization is going to use it. The one remaining
piece of the puzzle is to identify how the system being created
is going to be made available for use once testing has been
completed.
Deliverable
Agency
Project Name
Project Sponsor
Project Manager
Document Date
Prepared By
Enter the name of the Agency for which the system is being developed.
Enter the Project Name, and the names of the Project Manager and the Project Sponsor.
Enter the Date as of which this document is current.
Enter the names of the Project Team members by whom the document was Prepared.
Technical Specifications
TABLE OF CONTENTS
The goal of this Technical Specifications document is to define the system and its development and testing strategies in enough detail to enable the Application Developers to construct and test the system with minimal need for further explanation.

This document places the system in its context from Technical Architecture, Customer Interface, and System Development perspectives, provides detailed Module Specifications for all its components, details Unit, Integration and System Test Plans, and outlines Deployment and Transition plans.
2.0 SYSTEM ARCHITECTURE
The System Architecture section provides the “big picture” view of the system from technical, functional and development perspectives, and puts it in context with the rest of the organization’s systems portfolio. This section repeats some of the information from Section 2 (Overall Technical Architecture) of the Technical Architecture document, and from Section 2 (General Functional Specifications) of the Functional Specification document, with their contents refined as a result of the prototyping and other System Design efforts.
The Refined System Context Diagram shows how this system integrates into the Agency’s
application portfolio. All external dependencies and influences should be noted, as well as all
data sources and outputs.
The Refined System Architecture Context Diagram shows how the system’s hardware/software platform fits into the existing environment.
The Refined System Architecture Model represents the various architecture components
that comprise the System, and shows their interrelationships. This model presents the view of
the system from the technical architecture perspective, as opposed to the Consumer-driven
perspective of the System Interface Diagram.
The Refined Business Flow Diagram shows how the Customer and Consumer business
units will interface with the system.
The Refined System Interface Diagram shows the application structure (menu structure and navigation of the online application) and batch/utility structure (organization and flow of reporting and other interfaces), which is a refinement of the System Interface Diagram from the Functional Specification document. The System Interface Diagram presents the view of the system from the Consumer perspective.
The System Development Diagram shows the system from the development perspective,
according to how the components of the system will be coded, tested, and integrated.
Beginning with the application structure established in the Functional Specification, a modified
list of sub-systems will be identified based on development efficiencies and testing and
deployment considerations. Each sub-system consists of a number of Modules, which will
be assigned to individual Application Developers for coding and unit-testing; each sub-system
constitutes a work packet that will be assigned to a group of Application Developers for
construction and integration testing.
3.1 Sub-System A
Depending upon how the system has been decomposed into sub-systems, these Sub-System
sections contain specifications for all Modules comprising the sub-systems. Sub-systems may
be defined by functional area (e.g., security, reports, etc.), or by business focus (e.g., accounts
receivable, payroll, etc.).
3.1.1.5 Business Requirement(s)
Business Requirements provides a tie-back to the Business Requirements Document.
3.1.1.6 Inputs
Inputs details all data sources, Consumer input, etc. that will provide data to the module.
3.1.1.7 Interfaces
Interfaces details how the Consumers will interact with the module’s interface components,
and how those components will behave in all circumstances.
3.1.1.10 Outputs
Outputs details all data stores, displays, etc. created or modified as a result of the module’s
execution.
Unit Test Plan details how the module will be tested, once developed.
Unit Test Case Number allows quick reference to test case; should be based on module
identification.
Unit Test Case Name provides a brief description of the condition/scenario being tested.
Purpose of Test Case identifies those functions that the test is intended to validate.
Unit Test Data identifies data values (or conditions) that need to be set in order to conduct the
test case.
Navigation provides a sequence of activities that need to be performed to set up and execute
the test.
Expected Results provides a comprehensive description of how the module is expected to react to the test case, and/or what data values (or conditions) are expected as a result of the test.
Comments provides additional considerations for the test (expected Fail conditions, etc.)
Unit Test Results allows the tester to record the results of the unit test.
Tester enters his/her Name, and Date and Time of the test.
Tester certifies the test as Passed or Failed, and provides a Justification for that certification.
In the event of a failure, and depending upon how defects are being captured and tracked, this justification may be a description of the problem encountered, or may simply contain a reference to the defect log, where a detailed description of the error would be maintained.
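Teams record unit test cases in whatever form their tools support. As one hedged illustration (the record structure and sample values are assumptions, though the field names follow the template described above), a test case and its result might be captured as simple records like these.

from dataclasses import dataclass, field
from datetime import datetime

# Illustrative sketch of the Unit Test Plan fields described above, captured
# as plain records; nothing here is a prescribed format.
@dataclass
class UnitTestCase:
    number: str            # e.g., based on the module identification
    name: str              # condition/scenario being tested
    purpose: str           # functions the test is intended to validate
    test_data: str         # data values or conditions to set up
    navigation: str        # steps to set up and execute the test
    expected_results: str
    comments: str = ""

@dataclass
class UnitTestResult:
    tester: str
    passed: bool
    justification: str     # problem description or defect-log reference
    tested_at: datetime = field(default_factory=datetime.now)

case = UnitTestCase(
    number="MOD-3.1.1-UT-01",
    name="Reject blank account number",
    purpose="Validate required-field editing on the account entry screen",
    test_data="Account number left blank; all other fields valid",
    navigation="Open account entry screen; leave account number blank; press Save",
    expected_results="Error message displayed; record not saved",
)
result = UnitTestResult(tester="J. Doe", passed=True,
                        justification="Behaved as expected; no defect log entry required")
print(case.number, "-", "Passed" if result.passed else "Failed")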
Sub-system modules are organized into Integration Packets to facilitate integration testing.
The same module (or a series of modules) can be included in different, smaller or larger,
Integration Packets depending on the aspects of the system being integrated and tested.
Integration Test Case Number allows quick reference to test case; should be based on module identification.
Integration Test Case Name/Purpose provide a brief description of the scenario being tested.
Module List identifies system modules included in the Packet.
Integration Test Data identifies data values (or conditions) that need to be set in order to conduct the test case.
Navigation provides a sequence of activities that need to be performed to set up and execute
the test.
Expected Results provides a comprehensive description of how the Packet is expected to
react to the test case, and/or what data values (or conditions) are expected as a result of the
test.
Comments provides additional considerations for the test (expected Fail conditions, etc.)
Integration Test Results allows the tester to record the results of the test.
Tester enters his/her Name, and Date and Time of the test, certifies the test as Passed or
Failed, and provides a Justification for that certification. As with Unit testing, this justification
may contain descriptive text, or may refer to an entry in the project’s defect log.
Verifier verifies that the Integration Test was conducted as described, and produced the reported results.
Sub-systems and system modules are organized into System Test Packets to facilitate system testing. The same packet, or the system as a whole, may be tested numerous times to verify different aspects of its operation.
System Test Results:
Tester:
Name:
Date: Time:
Results:
Passed: ______ Failed: ______
Justification:
Verifier:
Name:
Date: Time:
System Test Case Number allows quick reference to test case; should be based on module
identification.
System Test Case Name/Purpose provide a brief description of the scenario being tested.
Module List identifies system modules included in the Packet.
System Test Data identifies data values (or conditions) that need to be set in order to conduct
the test case.
Navigation provides a sequence of activities that need to be performed to set up and execute
the test.
Expected Results provides a comprehensive description of how the Packet is expected to
react to the test case, and/or what data values (or conditions) are expected as a result of the
test.
Comments provides additional considerations for the test (expected Fail conditions, etc.)
System Test Results allows the tester to record the results of the test.
Tester enters his/her Name, and Date and Time of the test, certifies the test as Passed or Failed, and provides a Justification for that certification. In the event of a failure, and depending upon how defects are being captured and tracked, this justification may be a description of the problem encountered, or may simply contain a reference to the defect log, where a detailed description of the error would be maintained.
Verifier verifies that the System Test was conducted as described, and produced the reported results.
Modules, groups of modules and sub-systems are organized into Acceptance Test Packets to
facilitate Customer Representative testing of the system.
Acceptance Test Data Preparation:
Data Preparer:
Data Sources and Values:
Acceptance Case Description:
Business Rules, Requirements and Conditions being tested:
Navigation directions:
Expected Results:
Narrative
Comments:
Additional Testing Consideration
Acceptance Test Results:
Tester:
Name:
Date: Time:
Results:
Passed: ______ Failed: ______
Justification:
Defect Resolution:
Application Developer:
Resolved Date:
Re-Tester:
Name:
Date: Time:
Results:
Passed: ______ Failed: ______
Justification:
Approval:
Name:
Date: Time:
Acceptance Test Case Number allows quick reference to test case; should be based on module identification.
Acceptance Test Case Name/Purpose provide a brief description of the condition/scenario
being tested.
Module List identifies system modules included in the Packet.
Acceptance Test Data Preparation describes how the Data Preparer will prime Data Sources
with Values that will provide realistic and understandable test scenarios for Customer
Representatives.
Navigation Directions provide a guide for the Customer Representative testing the Packet on
a proper sequence of activities to set up and execute the test.
Expected Results provides a comprehensive description of how the Packet is expected to
react to the test case, and/or what data values (or conditions) are expected as a result of the
test.
Comments provides additional considerations for the test (expected Fail conditions, etc.)
Acceptance Test Results allows the tester(s) to record the results of the test.
Tester enters his/her Name, and Date and Time of the test, certifies the test as Passed or Failed, and provides a Justification for that certification.
In case of a Defect, the Packet is passed to an Application Developer for Resolution; the
Date of resolution is recorded, and the Packet is passed back for further Acceptance Testing.
Re-Tester enters his/her Name, and Date and Time of the test, certifies the test as Passed or Failed, and provides a Justification for that certification. In the event of a failure, and depending upon how defects are being captured and tracked, this justification may be a description of the problem encountered, or may simply contain a reference to the defect log, where a detailed description of the error would be maintained.
A Customer Decision-Maker (or Representative) approves the test results by entering his/her
Name, Date and Time of the Approval.
Consumer Training and Deployment deals with training and preparing Consumers for system
deployment.
Data Preparation deals with plans for data conversion, data cleansing, and data migration in
preparation for system deployment.
Software Migration outlines an approach for migrating developed software to Production, and
making it available to Consumers.
Production Start-up considers all other activities (outside of data preparation and software migration) necessary to prepare and start up the System in Production.
Production Verification deals with all the tasks that need to be performed to make sure the
version of the System migrated to Production is functioning properly.
Performing Organization Training and Transition outlines plans for training and turning over
system support responsibilities to the Performing Organization.
Measurements of Success
Figure 3-8
Prepare for System Design
■ Do all team members have experience with (or training on) the tools that will be used in this phase?
■ Is the team comfortable with the process defined for managing the deliverable repository?

Define Technical Architecture
■ Has the proposed architecture been reviewed by an independent third-party subject matter expert?
■ Do your Customers understand the potential impact that the proposed architecture may have on their operations, and agree that the defined architecture supports both their immediate and long-term needs?

Define System Standards
■ Have the technical and configuration management standards been reviewed and approved by the agency’s SQA Administrator or equivalent?
■ Have standards been defined and accepted that address the strategy for managing future releases of the system?

Create Physical Database
■ Were the Performing Organization’s data administration policies and standards considered in creating the database?
■ Was the database created using scripts from an automated tool to ensure consistency, repeatability, and maintainability of future builds of the database?
■ Has an independent third-party subject matter expert reviewed the physical database design?

Prototype System Components
■ Has the Customer been involved in defining which aspects of the system would be prototyped and reviewed?
The one thing that pulls it all together – the key to putting the “dream” in your dream home – is that clear, shared vision. Your dream home’s worst enemies? Indecisiveness and impatience. These factors combined can have the same effect on your budget as fast food has on your cholesterol. In other words … it’s gonna go up, and it’s gonna go up fast.
But how does this relate to building your new dream system?
And that’s when the walls start to crumble. It seems that while you convinced yourself that you had devised a clever testing strategy, you were so focused on making sure that you could validate the system functionality that you may have overlooked one little detail … the hardware needed to support all of the testing environments. And now that you’re neck deep in people looking for results, you’ve got to explain why testing will be delayed and why you’ve got to now find a way to obtain (translation – beg, borrow, steal, BUY?) the hardware to support integration testing. And user acceptance testing. And QA testing. Suffice it to say, there goes the parking spot.
Of course, this could all have been avoided by looking at the full testing picture. This includes not only defining how you plan to confirm that the system performs to expectations, but also that you’ve got the hardware, data, and all other resources needed to execute the tests. The time to think about this is now, during System Design.
How can the Project Team determine that all security concerns have been addressed?
And as for those flashy promises… as the poet said, you’ll have
miles to go before you sleep when you have promises to keep.
Well, there is that old story about the Tower of Babel that sort
of makes the point about why a common language (“standards”
in the system development parlance) is preferable.
Ah yes, the old school of “we’ll cross that bridge when we come to it.” As Project Managers, we would rather go by the “measure twice, cut once” theory. As per the project management