
For instructions on using this template, please see Notes to Author/Template Instructions on page 29.

Notes on accessibility: This template has been tested and is best accessed with JAWS 11.0 or higher.
For questions about using this template, please contact CMS IT Governance. To request changes to the
template, please submit an XLC Process Change Request (CR).

Centers for Medicare & Medicaid Services


CMS eXpedited Life Cycle (XLC)

<Project Name / Acronym>


Performance Test Plan and Results Template
Version X.X
MM/DD/YYYY

Document Number: <document’s configuration item control number>


Contract Number: <current contract number of company maintaining document>


Table of Contents
1. Executive Summary...................................................................................................1
1.1 Overview: Project Background and Scope........................................................1
2. Performance Requirements and Planning...............................................................2
2.1 Performance Requirements..............................................................................2
2.1.1 Load Model..........................................................................................2
2.2 Performance Test Approach.............................................................................3
2.2.1 Assumptions, Constraints, and Risks..................................................4
2.2.2 Milestones............................................................................................5
2.2.3 Test Organization.................................................................................6
2.2.4 Performance Test Script Steps............................................................7
2.2.5 Performance Test Data Planning.........................................................7
2.2.6 Performance Test Scenario Runs........................................................8
2.2.7 Performance Test Environment.........................................................11
2.3 Performance Test Monitoring Tools and Metrics............................................13
3. Execution and Analysis...........................................................................................16
3.1 Performance Test Results and Analysis.........................................................16
3.1.1 Test Run 1 – Baseline Two Hour – Peak Hour Load Test................16
3.1.2 Test Run 2.........................................................................................17
3.1.3 Test Run 3.........................................................................................17
3.1.4 Test Run 4.........................................................................................17
3.1.5 Test Run 5.........................................................................................18
3.1.6 Test Run 6.........................................................................................18
Appendix A: Test Sign-off Sheet............................................................................2
Appendix B: Record of Changes............................................................................3
Appendix C: Acronyms............................................................................................4
Appendix D: Glossary..............................................................................................5
Appendix E: Referenced Documents.....................................................................6
Appendix F: Notes to the Author / Template Instructions...................................7
Appendix G: XLC Template Revision History.......................................................8
Appendix H: Additional Appendices......................................................................9


List of Figures
Figure 1: Logical Test Environment.................................................................................12
Figure 2: Physical Test Environment...............................................................................13

List of Tables
Table 1: Load Model..........................................................................................................3
Table 2: Change Requests (CRs)......................................................................................3
Table 3: Assumptions........................................................................................................4
Table 4: Constraints...........................................................................................................4
Table 5: Risks....................................................................................................................5
Table 6: Schedule of Milestones........................................................................................5
Table 7: Test Organization................................................................................................6
Table 8: Performance Test (Script 1 Steps)......................................................................7
Table 9: Performance Test (Script 2 Steps)......................................................................7
Table 10: Performance Test Scenarios.............................................................................8
Table 11: Performance Test Scenario Runtime Settings..................................................9
Table 12: Performance Test Environment Hardware......................................................11
Table 13: Tools................................................................................................................13
Table 14: Application Server Tier....................................................................................14
Table 15: Database Server Tier.......................................................................................14
Table 16: Business Process/Transactions Goals............................................................16
Table 17: Business Transactions Statistics Summary....................................................16
Table 18: Record of Changes............................................................................................3
Table 19: Acronyms...........................................................................................................4
Table 20: Glossary.............................................................................................................5
Table 21: Referenced Documents.....................................................................................6
Table 22: XLC Template Revision History.........................................................................8


1. Executive Summary
The Enterprise Testing Center (ETC) supports the commercial off-the-shelf (COTS)
product HP Performance Center (PC) to conduct performance tests and analyze results
from executed tests. This Test Plan must be completed by vendors seeking to test
their applications in the ETC. Performance tests help determine a system's and
application's limitations, as well as the maximum number of active users the
application can support across its servers.

The Performance Test Plan and Results is a combined document designed to more
closely integrate performance test planning and reporting. This document prescribes the
performance requirements, load model, performance test approach, assumptions,
issues, risks, constraints, milestone/schedule, test organization, performance test script
steps, performance test data planning, performance test scenario runs, performance
test environment, performance test monitoring tools and metrics, performance test
results, and analysis steps for Performance Testing (PT).

The objective of this document is to summarize the tasks that have been performed by
each system and/or department supporting the Performance Test (PT) phase.

1.1 Overview: Project Background and Scope


Instructions: Provide the objective of the Application/Project and what it is intended to do
in support of the Government Agency.

Briefly describe the changes or new modernization efforts that this Application/Project
will include. Examples of changes or new modernization include new software code,
enhanced capabilities, or hardware component changes.


2. Performance Requirements and Planning

2.1 Performance Requirements


List the performance requirements in the Load Model format located in Section 2.1.1.
Write any other details, factors, or performance attributes related to the Performance
Requirements. The baseline performance attributes are Business Process, Service
Level Agreement (SLA)/Response Time, and Transactions per hour.

The Performance Requirements should define the expected performance standard for
the Application and Infrastructure that supports this project. The Performance
Requirements should specifically define the transaction rate per Business Process and
the expected SLAs/response times per Business Process.

The Business Processes of interest within the Application should be defined based on
their level of importance to the application execution.

2.1.1 Load Model


Complete the Load Model detailed in the table below. Add as many load models as
needed to support the Performance Test. Add columns to the Load Model if there is
more performance attribute information to share.

The Load Model is used as a repository for the Performance Requirements and the
workloads that drive Demand Management. Demand Management is the measurement
of the demand for workloads that process on infrastructure. The Load Model includes
the following performance attributes: Business Processes, SLA/Response Times, and
Transactions per Hour. The Load Model allows for mapping all of the performance
attributes together to make specific and measurable Performance Requirements. The
Load Model and Performance Requirements complement each other as the Load Model
is a repository for the Performance Requirements.

The Load Model can be adjusted depending on what the key business processes are
for the application. Several Load Models can be placed in this section to explain
different workloads. For example, there could be a Production Load Model here
covering this project's application along with other regression workloads that will
share the infrastructure in Production; a Load Model for this application's workload
only; or a Load Model for what will be executed in a Performance Test environment
that is 50% of Production.

The base Load Model table is presented below:


Table 1: Load Model

Business Transactions | SLA/Response Time | Transactions per Hour
Business Process 1 | 4 seconds | 1500
Business Process 2 | 4 seconds | 1900
Business Process 3 | 1 second | 1400
Business Process 4 | 3 seconds | 1350
Business Process 5 | xxxx | xxxx
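Working from a load model like Table 1, the number of virtual users needed to drive each target rate can be estimated with Little's Law (concurrent users ≈ arrival rate × time per iteration). A minimal sketch follows; the think-time and pacing values are assumptions for illustration, not values prescribed by the template:

```python
# Estimate the virtual users needed to drive each business process at its
# target rate, using Little's Law: users = arrival_rate * iteration_time.
# Think time and pacing are assumed example values, not template defaults.

load_model = [
    # (business process, SLA/response time in seconds, transactions per hour)
    ("Business Process 1", 4, 1500),
    ("Business Process 2", 4, 1900),
    ("Business Process 3", 1, 1400),
    ("Business Process 4", 3, 1350),
]

THINK_TIME_S = 10  # assumed think time per iteration
PACING_S = 6       # assumed pacing between iterations

def vusers_needed(sla_s, tph, think_s=THINK_TIME_S, pacing_s=PACING_S):
    tps = tph / 3600.0                        # target arrival rate (tx/sec)
    iteration_s = sla_s + think_s + pacing_s  # one user's time per iteration
    return tps * iteration_s                  # concurrent users required

for name, sla_s, tph in load_model:
    print(f"{name}: ~{vusers_needed(sla_s, tph):.1f} Vusers for {tph} tx/hr")
```

The same arithmetic, run in reverse against actual run results, also checks whether a scenario's Vuser count and pacing can reach the transactions-per-hour goal at all.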

2.2 Performance Test Approach


Write the high-level approach for performance-testing the Application and Infrastructure.

Document the Performance Test Approach by covering five key areas:

1. Review the System Design Document (SDD) if available, or collect project
background information on the application and infrastructure.

2. Leverage a Production Load Model, if available, to determine what regression load
should also be considered for execution during the performance tests.

3. Determine what load test types should be executed; examples include Peak Hour
Load Tests, Stress Tests, and 4–24-hour Endurance Tests.

4. Determine what key performance metric areas will be important to monitor or will
define the pass criteria. The Performance Requirements are strongly recommended to
drive the pass/fail criteria, but comparisons against previous results can also be
considered in the pass/fail decision process.

5. Based on the Application capabilities or Infrastructure changes, determine what
scope will be performance tested. Are there any Change Requests (CRs) related to this
activity?

Table 2: Change Requests (CRs)


Task ID Description Project Affected
CR10000 TBD XXXXXX
CR10001 TBD XXXXXX
CR10002 TBD XXXXXX


2.2.1 Assumptions, Constraints, and Risks

2.2.1.1 Assumptions
Assumptions should be documented concerning the available release software, test
environment, dependencies, tools, and test schedule associated with the performance
test. Examples are shown below.

Table 3: Assumptions

No. | Assumption
1 | Release x.x code will be fully functional, having passed Functional and Automated Pass I and II before Performance Testing begins.
2 | All Performance Center tool software has been installed and configured.
3 | The fully deployed, installed, and configured Web tier, middleware tier, and database servers must be operational in order for the performance testing shake-out to begin.
4 | xxxxxxx

2.2.1.2 Constraints
Constraints should be documented concerning the available release software, test
environment, dependencies, tools, test schedule, and other items pertaining to the
performance test. Examples are shown below.

Table 4: Constraints

No. | Constraint | Impact
1 | The Performance Test environment has 50% of the servers that Production has. | The scaling factor of Performance Test to Production is 50%. All Production Load Models executed in Performance Test should be run at 50% of the full Production Load Model to represent a 100% Load Test in the AJ Test environment.
2 | The Performance Test environment does not have some of the older data that Production has, which limits some of the data scenarios that can be simulated. | The data in Production has not been purged since 2009; searches in Production intermingle with older data than Performance Test searches can. This could limit the capability of reproducing some Production issues.
3 | The Performance Test team does not have a commercial tool or an approved Wireshark-like tool that allows for measuring network response times using packet captures. | The impact of network response times will not be measurable as we determine what areas within the Architecture are responsible for transaction response time cost. This constraint will leave network response time cost-related questions unanswered.
4 | xxxx | xxxx

2.2.1.3 Risks
Risks should be documented concerning the test schedule, release software,
dependencies, tools, test approach, test environment, and other items pertaining to the
performance test. Examples are shown below.

Table 5: Risks

No. | Risk | Impact | Action/Mitigation | Assigned To
1 | If functional errors from validation testing occur and prevent the creation of performance test scripts or performance test execution, execution of performance test project tasks will be delayed until the functional errors can be addressed. | HIGH | The team will start Performance Test execution once environment certification, test script validation, and data staging efforts are completed. | Project Manager
2 | If a performance-tuning effort is conducted in the middle of the performance test execution schedule and, as a result, configuration or code changes are made to the environment, any tests executed prior to the performance-tuning changes should be re-executed. | HIGH | It is recommended that any tests that were executed before the performance-tuning changes be re-executed after the performance-tuning changes. | Project Manager, Performance Engineering
3 | Xx | Xx | Xx | xx

2.2.2 Milestones
Key milestones are listed in the table below. Each of the milestones represents a group
of tasks on which completion of Performance Testing is dependent. If any of the
milestones are listed as “At Risk”, the milestones that follow it will most likely be delayed
as well.


Table 6: Schedule of Milestones

ID | % Done | At Risk | Task | Due Date | Interface
1 | 0-100 | Yes or No | Preliminary Project Plan submitted | xx/xx/xxxx | Project Management
2 | 0-100 | Yes or No | Final Project Plan submitted | xx/xx/xxxx | Project Management
3 | 0-100 | Yes or No | Performance Requirements and Production Load Model reviewed and verified | xx/xx/xxxx | Requirements Management and Performance Engineer
4 | 0-100 | Yes or No | Environment Planning | xx/xx/xxxx | Environment Team and Project Management
5 | 0-100 | Yes or No | Test Plan | xx/xx/xxxx | Performance Engineer
6 | 0-100 | Yes or No | Script Development and Data Planning | xx/xx/xxxx | Performance Engineer and Vendor Project Team
7 | 0-100 | Yes or No | Environment Certification and Test Script Validation | xx/xx/xxxx | Project Management and Environment Team
8 | 0-100 | Yes or No | Data Staging and Setup | xx/xx/xxxx | Performance Engineer and Vendor Project Team
9 | 0-100 | Yes or No | Performance Monitoring Configuration | xx/xx/xxxx | Environment Team and Performance Engineer
10 | 0-100 | Yes or No | Test Execution and Analysis | xx/xx/xxxx | Performance Engineer, Monitoring Tool Administrators, and Development

2.2.3 Test Organization


Document the test organization and any other departments that will be supporting the
Performance Test Phase.

Table 7: Test Organization

Name | Functional Role | Responsibilities
Name | Project Manager | Facilitates and coordinates all schedules related to SDLC phases and infrastructure
Name | Performance Engineering Lead | Manages schedules and activities related to Performance Testing projects
Name | Performance Engineer | Prepares for performance test execution, executes performance tests, analyzes performance tests, and tracks problem reports
Name | Performance Engineer | Prepares for performance test execution, executes performance tests, analyzes performance tests, and tracks problem reports
Name | Monitoring Support | Monitors performance tests using performance monitors
Name | Application Support | Supports performance test execution as configuration or application issues are found
Name | Performance Test Environment Support | Supports and maintains the Performance Test environment

2.2.4 Performance Test Script Steps


In this section, the performance test scripts that need to be developed are detailed by
user action step as shown in the tables below. For each key Business Process within
the Application under Test, a Performance Test script needs to be developed.

Performance Test scripts will be built using Performance Center’s Virtual User
Generator (VuGen) component. The following performance test script steps map back
to the Load Model defined earlier in Section 2.1.1. The performance test scripts are
designed to simulate the Business Processes/Transactions in the Load Model.

Develop performance test scripts that simulate all of the actions in the Business
Processes/Transactions documented in the Load Model.

Table 8: Performance Test (Script 1 Steps)

Application Name: Pecos (for example)
Business Process Name: Submit Health Information (for example)

1 | Login
2 | Click link
3 | Enter personal ID
4 | Fill out PDF form and submit
5 | xxxxx
6 | xxxxx

Table 9: Performance Test (Script 2 Steps)

Application Name: Pecos (for example)
Business Process Name: Search for Health Information (for example)

1 | Login
2 | Click link
3 | Search
4 | Search
5 | xxxxx
6 | xxxxx
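The step tables above translate directly into instrumented test scripts: each user action becomes a named transaction whose duration is measured. In Performance Center's VuGen this is done with lr_start_transaction/lr_end_transaction around the recorded protocol steps; the Python sketch below shows the same structure with placeholder actions (the step names and sleep times are illustrative assumptions, not recorded traffic):

```python
# Sketch: run a business process as a sequence of named, timed steps,
# mirroring how each row of the script-step tables becomes a transaction.
# The actions here are placeholders standing in for recorded protocol steps.
import time

def run_business_process(steps):
    """steps: list of (step name, zero-arg callable). Returns {name: seconds}."""
    timings = {}
    for name, action in steps:
        start = time.perf_counter()   # lr_start_transaction equivalent
        action()                      # the recorded user action
        timings[name] = time.perf_counter() - start  # lr_end_transaction
    return timings

timings = run_business_process([
    ("01_Login", lambda: time.sleep(0.01)),
    ("02_Click_Link", lambda: time.sleep(0.01)),
    ("03_Enter_Personal_ID", lambda: time.sleep(0.01)),
    ("04_Submit_Form", lambda: time.sleep(0.01)),
])
for name, seconds in timings.items():
    print(f"{name}: {seconds:.3f}s")
```

Naming each step as its own transaction is what lets the results in Section 3 report response times per user action rather than per whole script.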

2.2.5 Performance Test Data Planning


Provide a summary of the test data that will be needed for the Performance Test phase.
Provide the details of what data will be needed to support the execution of the
performance test scripts for several iterations per the Load Model. Dynamic login data
and variable data may be needed so that iterations are not static. The performance test
execution iterations should be varied, and the data drives this process.

There could be a need to research the problematic data flows in Production if the
application under test is already deployed in Production. Determine if there is a need for
performance test data to be pre-loaded in the Performance Test Environment database.
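One common way to keep iterations from being static is to stage a parameter data file before the run and have each virtual user draw a fresh row per iteration. A hedged sketch follows; the file name, column names, and ID formats are illustrative assumptions, not template requirements:

```python
# Generate a parameter data file of unique logins so performance-test
# iterations are not static. Names, formats, and counts are assumptions.
import csv
import random

def build_login_data(path, count=1000, seed=42):
    rng = random.Random(seed)  # fixed seed keeps staged data reproducible
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["login_id", "personal_id"])  # columns used by scripts
        for i in range(count):
            writer.writerow([f"perfuser{i:05d}",
                             rng.randint(10_000_000, 99_999_999)])
    return count

if __name__ == "__main__":
    build_login_data("perf_logins.csv")  # one row per script iteration
```

A file like this maps onto script parameters, so refreshing test data between cycles reduces to regenerating or re-staging the file.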

2.2.5.1 Data Preparation


Define the procedure that will be used to prepare the test data for Performance Test.
Define the procedures needed to create the test data. Some of the procedures include,
but are not limited to:

 Testing the data and database to ensure that they are ready for the current test
stage and align with test requirements.

 Identifying/defining any tools needed to create, modify, and manipulate the test
data.

 Developing a strategy for obtaining and refreshing test data for every cycle or
pass conducted in performance testing. During the performance test, the volume
of the test data is large relative to other test stages.


2.2.6 Performance Test Scenario Runs


The table below provides a short summary of each of the Performance Test scenario
runs.

Instructions: In the short paragraphs below the table, describe the objective of each of
the Performance Test scenarios to be executed. Examples are included.

Table 10: Performance Test Scenarios

Test Run | Date | Execution
Shake-out test | To Be Determined (TBD) | Shake out performance test scripts and monitors
Run 1 | TBD | Baseline – 2-Hour Load Test (Peak Hour Load)
Run 2 | TBD | Repeat Baseline – 2-Hour Load Test (Peak Hour Load)
Run 3 | TBD | 1-Hour Stress Test (150–300% of Peak Hour Load)
Run 4 | TBD | Repeat 1-Hour Stress Test (150–300% of Peak Hour Load)
Run 5 | TBD | 4–24-Hour Endurance Test (Average Hour or Varied Load)
Run 6 | TBD | Repeat 4–24-Hour Endurance Test (Average Hour or Varied Load)

Table 11: Performance Test Scenario Runtime Settings

Test Run | Pacing between Iterations | Think Times
Run 1 | 6 seconds | 10 seconds
Run 2 | 6 seconds | 10 seconds
Run 3 | 3 seconds | 10 seconds
Run 4 | 3 seconds | 10 seconds
Run 5 | 12 seconds | 10 seconds
Run 6 | 12 seconds | 10 seconds

2.2.6.1 Shake-out Test


The Shake-out test is designed to ensure that the performance test scripts are working
in the Performance Test Environment. The Shake-out test is also used to make sure
the Performance Monitors that are configured for metrics collection are operating as
expected. The Shake-out test can also be used to run a 1–5-user test to determine how
long it takes for transaction steps to complete; this measurement is valuable for setting
the pacing in the runtime settings of the test.


2.2.6.2 Test Run One, Baseline 2 Hour - Peak Hour Load Test (REPEAT in
Run 2)
Test Run One is a 2-hour peak hour baseline test that will exercise the top workload
transactions in the Application under Test, which are documented above in Table 1
(Load Model) and Tables 8–9 (Performance Test Script Steps). Peak hour transaction
processing will be under examination to determine if the system can maintain response
times under the highest anticipated load. Test Run One is designed to collect
performance metrics on transaction throughput, response times, and system resource
utilization, in comparison to the Performance Requirements.

Test Run Two is a repeat of Test Run One.

2.2.6.3 Test Run Three, 1 Hour Stress Test (150-300% of Peak Hour Load)
(REPEAT in Run 4)
Test Run Three is a 1-hour stress test that will exercise the top workload transactions in
the Application under Test, which are documented above in Table 1 (Load Model) and
Tables 8–9 (Performance Test Script Steps). The system will be stressed to examine
how the Application and Infrastructure scale if this Application's workload grows in the
near future, and whether response times can be maintained. Test Run Three is
designed to collect performance metrics on transaction throughput, response times,
and system resource utilization, in comparison to the Performance Requirements.

Test Run Four is a repeat of Test Run Three.
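The 150–300% range can be executed as a stepped ramp rather than a single flat load. A small sketch of how the step levels might be derived from the peak-hour virtual-user count; the step count and the example peak of 40 Vusers are assumptions for illustration:

```python
# Sketch: derive stepped Vuser levels for the stress test, ramping from
# 150% to 300% of the peak-hour load in equal increments.
def stress_schedule(peak_vusers, steps=4, start_pct=150, end_pct=300):
    pcts = [start_pct + i * (end_pct - start_pct) / (steps - 1)
            for i in range(steps)]
    return [round(peak_vusers * p / 100) for p in pcts]

print(stress_schedule(40))  # peak of 40 Vusers -> [60, 80, 100, 120]
```

Stepping the load this way makes it easier to spot the level at which response times first breach the SLA, rather than only observing behavior at the maximum.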

2.2.6.4 Test Run Five, 4-24 Hour Endurance Test (Average Hour or Varied
Load) (REPEAT in Run 6)
Test Run Five is a 4–24-hour endurance test that will exercise the top workload
transactions in the Application under Test, which are documented above in Table 1
(Load Model) and Tables 8–9 (Performance Test Script Steps). This Endurance test will
determine if the system resources are recycled for reuse while processing transactions
over long periods. Proper recycling of memory, CPU, and other system utilization
resources is healthy for performance. Test Run Five is designed to collect performance
metrics on transaction throughput, response times, and system resource utilization, in
comparison to the Performance Requirements.

Test Run Six is a repeat of Test Run Five.


2.2.7 Performance Test Environment


Instructions: Document the Performance Test Environment components based on the
categories listed in the table below.

Performance testing will be performed in the XXXX Test environment. The table below
lists an example of the components within the Performance Test environment that
XXXX Application will use.

Table 12: Performance Test Environment Hardware

Server Name | Environment Tier | Server Vendor | Hardware Version | OS | Memory (GB) | CPU Count | Total Drive Space
CM5687 | Web Service | Dell | M620 | Linux | 32 GB | 8 cores | 512 GB
CM5688 | Web Service | Dell | M620 | Linux | 32 GB | 8 cores | 512 GB
CM5689 | Middleware | Dell | M620 | Linux | 32 GB | 8 cores | 512 GB
CM5690 | Middleware | Dell | M620 | Linux | 32 GB | 8 cores | 512 GB
CM5683 | Middleware | Dell | M820 | Linux | 32 GB | 16 cores | 1 TB
CM5685 | Database | Dell | M820 | Linux | 32 GB | 16 cores | 1 TB
xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx
xxx | xxx | xxx | xxx | xxx | xxx | xxx | xxx

Using the factors listed below, describe the scaling factor between the Production
environment that will support the Application under Test and the Performance Test
environment that will support the Application under Test.

The Scaling factors are as follows:

1. Is the hardware in the same vendor family?

2. Does the hardware have the same number of CPUs (processors)?

3. Does the hardware have the same version?

4. Is the OS the same platform (UNIX, Linux, or Windows)?

5. Are the configuration files in Test a match to Production or pulled from Production?

6. Is data populated in the Database to the same level as in Production? If not, what
is the ratio?
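When the Performance Test environment is a known fraction of Production (50% in the constraint example earlier), the Production load model can be scaled by that factor before execution. A minimal sketch under that assumption; the function name and example rates are illustrative:

```python
# Sketch: scale Production transaction rates by the environment scaling
# factor (e.g., a test environment with 50% of Production's servers runs
# 50% of the Production load to represent a 100% load test).
def scale_load_model(production_tph, factor):
    return {bp: round(tph * factor) for bp, tph in production_tph.items()}

production = {"Business Process 1": 1500, "Business Process 2": 1900}
print(scale_load_model(production, 0.5))
```

Recording the factor used alongside the results keeps the scaled numbers traceable back to the full Production load model.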


2.2.7.1 Server Software


Instructions: Provide information on the software used on the Performance Test
environment components.

Example:

The software used in the performance test environment is:

 WebSphere Application Server (WAS) software

 DB2 database

2.2.7.2 Logical Test Environment


Instructions: Provide a Logical Test Environment diagram to illustrate where
transactions will process.

Figure 1: Logical Test Environment


2.2.7.3 Physical Test Environment


Instructions: Provide a Physical Test Environment diagram to illustrate what
components will be used in the Performance Test Environment.

Figure 2: Physical Test Environment

2.3 Performance Test Monitoring Tools and Metrics


Instructions: List and describe the tools to be used to support monitoring of the
performance tests.

The tools listed below will be used to support Performance Testing efforts. They will be
used to capture performance metrics as needed, as described in the table below.

An example is given below.

Table 13: Tools

Tool | Purpose
HP Performance Center | Build performance test scripts; execute performance test scenarios; collect transaction response times
CA Wily | Measure physical server-level utilization statistics and performance metrics
HP Deep Diagnostics | Measure physical server-level utilization statistics and performance metrics; measure utilization statistics at the code level (classes and methods)
SiteScope | Measure physical server-level utilization statistics and performance metrics

The two tables below describe examples of the various performance metrics that can be
captured during the Performance Test stage to view resource usage trends.

Table 14: Application Server Tier


Metrics | Value Measured
CPU utilization | CPU utilization
Physical Memory Percentage used | Physical Memory Percentage used
Memory | Memory utilization
Java Virtual Machine (JVM) Runtime/Total Memory | Total memory in the JVM runtime
JVM Runtime/Free Memory | Free memory in the JVM runtime
JVM Runtime/Used Memory | Used memory in the JVM runtime
JDBC Connections/Concurrent Waiters | Number of threads that are currently waiting for connections
JDBC DB Connections/Percent used | Average percent of the pool that is in use
JDBC DB Connections/Percent maxed | Average percent of the time that all connections are in use
Thread Creates | Total number of threads created
Thread Destroys | Total number of threads destroyed
Thread Pool/Active Threads | Number of concurrently active threads
Thread Pool/Pool Size | Average number of threads in pool
Thread Pool/Percent Maxed | Average percent of the time that all threads are in use
Heap size | Amount of heap allocated
Memory | Memory utilization: processes in run queue (procs r), user time (cpu us), system time (cpu sy), idle time (cpu id), context switching (cs), interrupts
Disk I/O | Disk I/O utilization: reads/writes per second (r/s, w/s), percentage busy (%b), service time (svc_t)
Network | Collisions (Collis), output packets (Opkts), input errors (Ierrs), input packets (Ipkts)
Queue Depth | Measurement of queue depths during the test execution
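The OS-level rows above (processes in run queue, user/system/idle CPU time, context switches) correspond to fields reported by utilities such as vmstat. As an illustration of how raw monitor output maps onto these metrics, here is a small parsing sketch; the sample text is hypothetical, with field positions following the common Linux vmstat layout.

```python
# Hypothetical sample of `vmstat 5` output (not captured from a real run).
SAMPLE = """\
procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812344  10432 501234    0    0     1     3  120  240 12  5 82  1  0
 5  0      0 810112  10432 501300    0    0     0     8  150  310 35 10 54  1  0
"""

def parse_vmstat(text):
    """Extract run-queue depth and CPU metrics from vmstat-style text."""
    rows = []
    for line in text.splitlines():
        fields = line.split()
        if fields and fields[0].isdigit():  # skip the two header lines
            rows.append({
                "procs_r": int(fields[0]),   # processes in run queue
                "cs": int(fields[11]),       # context switches per second
                "cpu_us": int(fields[12]),   # user time %
                "cpu_sy": int(fields[13]),   # system time %
                "cpu_id": int(fields[14]),   # idle time %
            })
    return rows

samples = parse_vmstat(SAMPLE)
print(samples[1]["procs_r"], samples[1]["cpu_us"])  # 5 35
```

In practice the monitoring tools listed earlier collect and trend these values automatically; the sketch only shows what the table's abbreviations refer to.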


Table 15: Database Server Tier


Metrics | Value Measured
CPU utilization | CPU utilization
Physical Memory Percentage used | Physical Memory Percentage used
Memory | Memory utilization: processes in run queue (procs r), user time (cpu us), system time (cpu sy), idle time (cpu id), context switching (cs), interrupts
Disk I/O | Disk I/O utilization: reads/writes per second (r/s, w/s), percentage busy (%b), service time (svc_t)
Network | Collisions (Collis), output packets (Opkts), input errors (Ierrs), input packets (Ipkts)


3. Execution and Analysis


Instructions: In this section, compile the Performance Test run log details and the
Performance Test results. Also provide an overall analysis of the Application and
Infrastructure being examined.

An example is provided below.

3.1 Performance Test Results and Analysis


3.1.1 Test Run 1 – Baseline Two Hour – Peak Hour Load Test
Performance Requirement
Each Business Process should achieve its transactions-per-hour goal while processing
within the allocated SLA response times.

Table 16: Business Process/Transactions Goals


Business Transactions | SLA/Response Times | Transactions per Hour
Business Process 1 | 4 seconds | 1500
Business Process 2 | 4 seconds | 1900
Business Process 3 | 1 second | 1400
Business Process 4 | 3 seconds | 1350
Business Process 5 | xxxx | xxxx
Observations
This load test was executed with 35 virtual users processing the load model. The server
processing times for all transactions except Business Process 4 were within their SLA
response times. Business Process 4 had a 90th percentile measurement of 9.28
seconds during this test run. Numerous exceptions were noticed during this test run,
related to two users obtaining the same record in the queue.

Pass/Fail Analysis Summary


Business Process 4 failed during Performance Test execution. It has been determined
that a code change is needed to improve Business Process 4. A decision will be made
as to whether the application can be deployed with this issue or whether more time will
be allocated for the fix to be developed and tested.

Table 17: Business Transactions Statistics Summary


Business Transactions | SLA/Response Times Requirement | SLA/Response Times Actual | Transactions per Hour Requirement | Transactions per Hour Actual
Business Process 1 | 4 seconds | 3.6 secs | 1500 | 1500
Business Process 2 | 4 seconds | 2.4 secs | 1900 | 1900
Business Process 3 | 1 second | 0.6 secs | 1400 | 1400
Business Process 4 | 3 seconds | 9.28 secs | 1350 | 1214
Business Process 5 | 5 seconds | 3.9 secs | xxxx | xxxx
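The pass/fail determination shown above amounts to checking each business process against both of its goals: the response-time SLA and the transactions-per-hour target. A minimal sketch, using the example figures from Table 17 (Business Process 5's placeholder values are omitted):

```python
# Example figures from Table 17:
# (SLA seconds, actual p90 seconds, trans/hour goal, trans/hour actual)
RESULTS = {
    "Business Process 1": (4.0, 3.60, 1500, 1500),
    "Business Process 2": (4.0, 2.40, 1900, 1900),
    "Business Process 3": (1.0, 0.60, 1400, 1400),
    "Business Process 4": (3.0, 9.28, 1350, 1214),
}

def verdict(sla, actual, tph_goal, tph_actual):
    """A process passes only if it meets both its response-time SLA
    and its transactions-per-hour goal."""
    return "Pass" if actual <= sla and tph_actual >= tph_goal else "Fail"

for name, row in RESULTS.items():
    print(f"{name}: {verdict(*row)}")
# Business Process 4 fails on both criteria (9.28 s > 3 s; 1214 < 1350)
```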

3.1.2 Test Run 2

Performance Requirement

Observations

Pass/Fail Analysis Summary

3.1.3 Test Run 3

Performance Requirement

Observations

Pass/Fail Analysis Summary


3.1.4 Test Run 4

Performance Requirement

Observations

Pass/Fail Analysis Summary

3.1.5 Test Run 5

Performance Requirement

Observations

Pass/Fail Analysis Summary

3.1.6 Test Run 6

Performance Requirement

Observations

Pass/Fail Analysis Summary


Appendix A: Test Sign-off Sheet

Test Sign-Off Sheet


This sign-off sheet indicates that the signing reviewers agree with the final test deliverables in
relation to release <XX.XX> of <Project>. The signatures attest that all deliverables have
been completed and all required test activities have been successfully accomplished and
verified, thereby acknowledging the completion of the test stage.

Additional comments:

Concur:

Government Task Lead (GTL) Signature Date


Concur:

Project/Release Manager Signature Date


Concur:

Project/Release Manager Signature Date


Concur:

Performance Test Lead Signature Date


Approve:

Performance Engineer Signature Date


Appendix B: Record of Changes


Instructions: Provide information on how the development and distribution of the
Performance Test Plan and Results Template will be controlled and tracked. Use the
table below to provide the version number, the date of the version, the author/owner of
the version, and a brief description of the reason for creating the revised version.

Table 18: Record of Changes


Version Number | Date | Author/Owner | Description of Change


Appendix C: Acronyms
Instructions: Provide a list of acronyms and associated literal translations used within
the document. List the acronyms in alphabetical order using a tabular format as
depicted below.

Table 19: Acronyms


Acronym Literal Translation


Appendix D: Glossary
Instructions: Provide clear and concise definitions for terms used in this document that
may be unfamiliar to readers of the document. Terms are to be listed in alphabetical
order.

Table 20: Glossary


Term Definition


Appendix E: Referenced Documents


Instructions: Summarize the relationship of this document to other relevant documents.
Provide identifying information for all documents used to arrive at and/or referenced
within this document (e.g., related and/or companion documents, prerequisite
documents, relevant technical documentation, etc.).

Table 21: Referenced Documents


Document Name Document Location and/or URL Issuance Date


Appendix F: Notes to the Author / Template Instructions


This document is a template for creating a Performance Test Plan and Results
Template for a given investment or project. The final document should be delivered in
an electronically searchable format. The Performance Test Plan and Results Template
should stand on its own with all elements explained and acronyms spelled out for
reader/reviewers, including reviewers outside CMS who may not be familiar with CMS
projects and investments.
This template includes instructions, boilerplate text, and fields. The developer should
note that:
 Each section provides instructions or describes the intent, assumptions, and
context for content included in that section. Instructional text appears in blue
italicized font throughout this template.

 Instructional text in each section should be replaced with information specific to
the particular investment.

 Some text and tables are provided as boilerplate examples of wording and
formats that may be used or modified as appropriate.

When using this template, follow these steps:


1. Table captions and descriptions are to be placed centered, above the table.

2. Modify any boilerplate text, as appropriate, to your specific investment.

3. Do not delete any headings. If the heading is not applicable to the
investment, enter “Not Applicable” under the heading.

4. All documents must be compliant with Section 508 requirements.

5. Figure captions and descriptions are to be placed centered, below the figure.
All figures must have an associated tag providing appropriate alternative text
for Section 508 compliance.

6. Delete this “Notes to the Author / Template Instructions” page and all
instructions to the author before finalizing the initial draft of the document.


Appendix G: XLC Template Revision History


The following table records information regarding changes made to the XLC template
over time. This table is for use by the XLC Steering Committee only. To provide
information about the controlling and tracking of this artifact, please refer to the Record
of Changes section of this document.

Table 22: XLC Template Revision History


Version Number | Date | Author/Owner | Description of Change
1.0 | 11/7/2014 | Enterprise Test Center | Baseline document. See CR 14-006.
1.1 | 02/02/2015 | Surya Potu, CMS/OEI/DPPIG | Updated CMS logo.


Appendix H: Additional Appendices


Instructions: Utilize additional appendices to facilitate ease of use and maintenance of
the document.
