Perf Test Plan Results Template
Notes on accessibility: This template has been tested for accessibility and works best with JAWS 11.0 or higher.
For questions about using this template, please contact CMS IT Governance. To request changes to the
template, please submit an XLC Process Change Request (CR).
Table of Contents
1. Executive Summary...................................................................................................1
1.1 Overview: Project Background and Scope........................................................1
2. Performance Requirements and Planning...............................................................2
2.1 Performance Requirements..............................................................................2
2.1.1 Load Model..........................................................................................2
2.2 Performance Test Approach.............................................................................3
2.2.1 Assumptions, Constraints, and Risks..................................................4
2.2.2 Milestones............................................................................................5
2.2.3 Test Organization.................................................................................6
2.2.4 Performance Test Script Steps............................................................7
2.2.5 Performance Test Data Planning.........................................................7
2.2.6 Performance Test Scenario Runs........................................................8
2.2.7 Performance Test Environment.........................................................11
2.3 Performance Test Monitoring Tools and Metrics............................................13
3. Execution and Analysis...........................................................................................16
3.1 Performance Test Results and Analysis.........................................................16
3.1.1 Test Run 1 – Baseline Two Hour – Peak Hour Load Test................16
3.1.2 Test Run 2.........................................................................................17
3.1.3 Test Run 3.........................................................................................17
3.1.4 Test Run 4.........................................................................................17
3.1.5 Test Run 5.........................................................................................18
3.1.6 Test Run 6.........................................................................................18
Appendix A: Test Sign-off Sheet............................................................................2
Appendix B: Record of Changes............................................................................3
Appendix C: Acronyms............................................................................................4
Appendix D: Glossary..............................................................................................5
Appendix E: Referenced Documents.....................................................................6
Appendix F: Notes to the Author / Template Instructions...................................7
Appendix G: XLC Template Revision History.......................................................8
Appendix H: Additional Appendices......................................................................9
Performance Test Plan and Results Template Version X.X iii <Project Name / Acronym>
CMS XLC Table of Contents
List of Figures
Figure 1: Logical Test Environment.................................................................12
Figure 2: Physical Test Environment...............................................................13
List of Tables
Table 1: Load Model..........................................................................................3
Table 2: Change Requests (CRs)......................................................................3
Table 3: Assumptions........................................................................................4
Table 4: Constraints...........................................................................................4
Table 5: Risks....................................................................................................5
Table 6: Schedule of Milestones........................................................................5
Table 7: Test Organization................................................................................6
Table 8: Performance Test (Script 1 Steps)......................................................7
Table 9: Performance Test (Script 2 Steps)......................................................7
Table 10: Performance Test Scenarios.............................................................8
Table 11: Performance Test Scenario Runtime Settings..................................9
Table 12: Performance Test Environment Hardware......................................11
Table 13: Tools................................................................................................13
Table 14: Application Server Tier....................................................................14
Table 15: Database Server Tier.......................................................................14
Table 16: Business Process/Transactions Goals............................................................16
Table 17: Business Transactions Statistics Summary....................................................16
Table 18: Record of Changes............................................................................................3
Table 19: Acronyms...........................................................................................................4
Table 20: Glossary.............................................................................................................5
1. Executive Summary
The Enterprise Testing Center (ETC) supports the commercial off-the-shelf (COTS)
product HP Performance Center (PC) for conducting performance tests and analyzing
the results of executed tests. This Test Plan must be completed by vendors seeking to
test their applications in the ETC. Performance tests help determine a system's and
application's limitations, as well as the maximum number of concurrent users the
application can support across its servers.
The Performance Test Plan and Results is a combined document designed to more
closely integrate performance test planning and reporting. This document prescribes the
performance requirements, load model, performance test approach, assumptions,
issues, risks, constraints, milestone/schedule, test organization, performance test script
steps, performance test data planning, performance test scenario runs, performance
test environment, performance test monitoring tools and metrics, performance test
results, and analysis steps for Performance Testing (PT).
The objective of this document is to summarize the tasks that have been performed by
each system and/or department supporting the Performance Test (PT) phase.
Briefly describe the changes or new modernization that this Application/Project will
include. Examples of changes or new modernization include new software code,
enhanced capabilities, or hardware component changes.
The Performance Requirements should define the expected performance standard for
the Application and Infrastructure that supports this project. The Performance
Requirements should specifically define the transaction rate per Business Process and
the expected SLAs/response times per Business Process.
The Business Processes of interest within the Application should be defined based on
their level of importance to the application execution.
The Load Model is used as a repository for the Performance Requirements and the
workloads that drive Demand Management. Demand Management is the measurement
of the demand for workloads that process on infrastructure. The Load Model includes
the following performance attributes: Business Processes, SLA/Response Times, and
Transactions per Hour. The Load Model allows for mapping all of the performance
attributes together to make specific and measurable Performance Requirements. The
Load Model and Performance Requirements complement each other as the Load Model
is a repository for the Performance Requirements.
The Load Model can be adjusted, dependent on what the key business processes are
for the application. Several Load Models can be placed in this section to explain
different workloads. For example, there could be a Production Load Model located here
to indicate this project’s application along with other regression workloads that also will
be sharing the infrastructure in Production. There could be a Load Model for this
application’s workload only. There could be a Load Model for what will be executed in a
Performance Test environment which is 50% of Production.
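The attribute mapping described above can be sketched as a small data structure. The following Python sketch uses hypothetical Business Process names and figures (none come from this template), and assumes that when a Load Model is scaled to a smaller environment, transaction volumes scale while response-time SLAs stay constant:

```python
# Hypothetical sketch of a Load Model: each Business Process maps to its
# SLA/response-time target (seconds) and required Transactions per Hour.
from dataclasses import dataclass

@dataclass
class LoadModelEntry:
    business_process: str
    sla_seconds: float        # expected response time per transaction
    transactions_per_hour: int

# Example 100% Production Load Model (illustrative numbers only).
production_load_model = [
    LoadModelEntry("Business Process 1", 2.0, 2400),
    LoadModelEntry("Business Process 2", 4.0, 1900),
    LoadModelEntry("Business Process 3", 1.0, 1400),
]

# A Performance Test environment at 50% of Production scales the workload,
# not the SLA: response-time targets stay the same.
def scale_load_model(model, factor):
    return [LoadModelEntry(e.business_process, e.sla_seconds,
                           round(e.transactions_per_hour * factor))
            for e in model]

half_load = scale_load_model(production_load_model, 0.5)
```

This illustrates why several Load Models can coexist in this section: each is the same set of entries under a different scaling factor or workload mix.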
3. Determine which load test types should be executed; examples include Peak Hour
Load Tests, Stress Tests, and 4-24-hour Endurance Tests.
4. Determine which key performance metric areas will be important to monitor and
which will define the pass criteria. The Performance Requirements are strongly
recommended to drive the pass/fail criteria, but comparisons with previous
results can also be considered in the pass/fail decision process.
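A minimal sketch of the requirements-driven pass/fail check recommended above, using hypothetical Business Process figures rather than results from any real test run:

```python
# Sketch: a transaction fails if its measured response time exceeds the SLA
# or its achieved Transactions per Hour falls short of the requirement.

def evaluate(requirements, results):
    """requirements: {bp: (sla_seconds, required_tph)};
    results: {bp: (actual_seconds, actual_tph)}. Returns failing processes."""
    failures = []
    for bp, (sla, required_tph) in requirements.items():
        actual_seconds, actual_tph = results[bp]
        if actual_seconds > sla or actual_tph < required_tph:
            failures.append(bp)
    return failures

# Illustrative numbers only.
requirements = {"Business Process 2": (4.0, 1900),
                "Business Process 4": (3.0, 1350)}
results = {"Business Process 2": (2.4, 1900),
           "Business Process 4": (9.28, 1214)}

failing = evaluate(requirements, results)  # Business Process 4 misses both goals
```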
2.2.1.1 Assumptions
Assumptions should be documented concerning the available release software, test
environment, dependencies, tools, and test schedule associated with the performance
test. Examples are shown below.
Table 3: Assumptions

No. 1
Assumption: Release x.x Code will be fully functional, having passed Functional and Automated Pass I and II before Performance Testing begins.

No. 2
Assumption: All Performance Center Tool software has been installed and configured.

No. 3
Assumption: The fully deployed, installed, and configured Web tier, middleware tier, and database servers must be operational in order for performance testing shake-out to begin.

No. 4
Assumption: xxxxxxx
2.2.1.2 Constraints
Constraints should be documented concerning the available release software, test
environment, dependencies, tools, test schedule, and other items pertaining to the
performance test. Examples are shown below.
Table 4: Constraints

No. 1
Constraint: The Performance Test environment has 50% of the servers that Production has.
Impact: The scaling factor of Performance Test to Production is 50%. All Production Load Models that are executed in Performance Test should be run at 50% of the full Production Load Model to represent a 100% Load Test in the AJ Test environment.

No. 2
Constraint: The Performance Test environment does not have some of the older data that Production has, which limits some of the data scenarios that can be simulated.
Impact: The data in Production has not been purged since 2009; searches in Production intermingle with older data in a way that Performance Test cannot reproduce. This could limit the capability of reproducing some Production issues.

No. 3
Constraint: The Performance Test team does not have a commercial tool or an approved Wireshark-like tool that allows for measuring network response times.
Impact: The contribution of network response times will not be measurable when determining which areas within the Architecture are responsible for transaction response time.
2.2.1.3 Risks
Risks should be documented concerning the test schedule, release software,
dependencies, tools, test approach, test environment, and other items pertaining to the
performance test. Examples are shown below.
Table 5: Risks

No. 1
Risk: If functional errors from validation testing occur and prevent the creation of performance test scripts or performance test execution, execution of performance test project tasks will be delayed until the functional errors can be addressed.
Impact: HIGH
Action/Mitigation: The team will start Performance Test execution once environment certification, script validation, and data staging efforts are completed.
Assigned To: Project Manager

No. 2
Risk: If a performance-tuning effort is conducted in the middle of the performance test execution schedule and, as a result, configuration or code changes are made to the environment, any tests executed prior to the performance-tuning changes should be re-executed.
Impact: HIGH
Action/Mitigation: It is recommended that any tests executed before the performance-tuning changes be re-executed after those changes.
Assigned To: Project Manager, Performance Engineering

No. 3
Risk: Xx
Impact: Xx
Action/Mitigation: Xx
Assigned To: xx
2.2.2 Milestones
Key milestones are listed in the table below. Each milestone represents a group of
tasks on which the completion of Performance Testing depends. If any milestone is
listed as "At Risk", the milestones that follow it will most likely be delayed as well.
Performance Test scripts will be built using Performance Center’s Virtual User
Generator (VuGen) component. The following performance test script steps map back
to the Load Model defined earlier in Section 2.1.1. The performance test scripts are
designed to simulate the Business Processes/Transactions in the Load Model.
Develop performance test scripts that simulate all of the actions in the Business
Processes/Transactions documented in the Load Model.
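VuGen generates scripts in its own format, so the following is only a language-neutral Python sketch of the structure such a script follows: timed transactions separated by user think time, with per-transaction timings collected for later analysis. The transaction names are hypothetical:

```python
# Language-neutral sketch mirroring a load-test script's shape: each Business
# Process step is wrapped in a named, timed transaction. Names are invented.
import time

def start_transaction(name):
    return (name, time.perf_counter())

def end_transaction(txn, timings):
    name, start = txn
    timings.setdefault(name, []).append(time.perf_counter() - start)

def think(seconds):
    time.sleep(seconds)          # simulated user think time

def run_business_process(timings):
    txn = start_transaction("BP_Login")
    # ... the actual request for this step would be issued here ...
    end_transaction(txn, timings)
    think(0.01)                  # short think time for the sketch
    txn = start_transaction("BP_Search")
    # ... the actual request for this step would be issued here ...
    end_transaction(txn, timings)

timings = {}
run_business_process(timings)
```

In a real script, each timed section maps one-to-one to a Business Process/Transaction row in the Load Model, which is what lets the results be compared against the Performance Requirements.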
There could be a need to research the problematic data flows in Production if the
application under test is already deployed in Production. Determine if there is a need for
performance test data to be pre-loaded in the Performance Test Environment database.
Test data planning activities include:
- Testing the data and database to ensure that they are ready for the current test
stage and align with test requirements.
- Identifying/defining any tools needed to create, modify, and manipulate the test
data.
- Developing a strategy for obtaining and refreshing test data for every cycle or
pass conducted in performance testing. During the performance test, the volume
of the test data is large relative to other test stages.
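The refresh strategy above can be illustrated with a small generator sketch. The field names, ID scheme, and CSV format here are assumptions for illustration, not part of the template; the point is that each cycle gets its own pool of unique records so scripts never reuse stale data:

```python
# Hypothetical sketch: generate a refreshable pool of unique test records
# for each performance test cycle.
import csv
import io
import random

def generate_test_data(cycle, n_rows, seed=None):
    """Write n_rows unique records for a given test cycle as CSV text."""
    rng = random.Random(seed)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["member_id", "cycle", "search_term"])
    for i in range(n_rows):
        # Cycle-scoped IDs guarantee uniqueness across refreshes.
        writer.writerow([f"CYC{cycle}-{i:07d}", cycle,
                         rng.choice(["claim", "provider", "eligibility"])])
    return buf.getvalue()

data = generate_test_data(cycle=1, n_rows=5, seed=42)
rows = data.strip().splitlines()
```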
Instructions: In the short paragraphs below the table, briefly describe the objective of
each of the Performance Test scenarios to be executed. Examples are included.
2.2.6.2 Test Run One, Baseline 2-Hour Peak Hour Load Test (REPEAT in
Run 2)
Test Run One is a 2-hour peak hour baseline test that will exercise the top workload
transactions in the Application under Test, which are documented above in Table 2-1,
Load Model, and Table 2-8, Performance Test Script Steps. Peak hour transaction
processing will be examined to determine if the system can maintain response
times under the highest anticipated load. Test Run One is designed to collect
performance metrics on transaction throughput, response times, and system resource
utilization, in comparison to the Performance Requirements.
2.2.6.3 Test Run Three, 1-Hour Stress Test (150-300% of Peak Hour Load)
(REPEAT in Run 4)
Test Run Three is a 1-hour stress test that will exercise the top workload transactions in
the Application under Test, which are documented above in Table 2-1, Load Model, and
Table 2-8, Performance Test Script Steps. Stressing the system will show how the
Application and Infrastructure scale if this Application's workload grows in the near
future, and whether response times can be maintained. Test Run
Three is designed to collect performance metrics on transaction throughput, response
times, and system resource utilization, in comparison to the Performance Requirements.
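The 150-300% stress range can be derived directly from the Load Model's peak-hour volumes. A small sketch with illustrative figures (the Business Process names and Transactions per Hour values are hypothetical):

```python
# Sketch: scale peak-hour Transactions per Hour to the stress-test levels
# (150%, 200%, 250%, 300%). All figures are illustrative.

peak_tph = {"Business Process 2": 1900, "Business Process 3": 1400}

def stress_levels(tph_by_bp, factors=(1.5, 2.0, 2.5, 3.0)):
    """Scaled Transactions per Hour target for each stress factor."""
    return {f: {bp: round(tph * f) for bp, tph in tph_by_bp.items()}
            for f in factors}

levels = stress_levels(peak_tph)
```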
2.2.6.4 Test Run Five, 4-24 Hour Endurance Test (Average Hour or Varied
Load) (REPEAT in Run 6)
Test Run Five is a 4-24 hour endurance test that will exercise the top workload
transactions in the Application under Test, which are documented above in Table 2-1,
Load Model, and Table 2-8, Performance Test Script Steps. This endurance test will
determine whether system resources are recycled for re-use while processing transactions
over long periods. Proper recycling of memory, CPU, and other system
resources is healthy for performance. Test Run Five is designed to collect performance
metrics on transaction throughput, response times, and system resource utilization, in
comparison to the Performance Requirements.
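One way to apply the resource-recycling check described above is to fit a trend line to periodic utilization samples taken during the endurance run: a near-flat slope suggests resources are being recycled, while a steady climb suggests a leak. The sample values below are invented for illustration:

```python
# Sketch: least-squares slope over evenly spaced utilization samples.

def slope(samples):
    """Trend (units per sample interval) of a list of numeric samples."""
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

healthy = [512, 515, 511, 514, 512, 513]   # MB, flat: memory is recycled
leaking = [512, 540, 569, 601, 633, 660]   # MB, climbing: likely a leak

assert abs(slope(healthy)) < 1   # near-zero trend
assert slope(leaking) > 25       # roughly 30 MB per sample interval
```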
Performance testing will be performed in the XXXX Test environment. The table below
lists an example of the components within the Performance Test environment that the
XXXX Application will use.
As listed below, describe the scaling factor between the Production environment
that will support the Application under Test and the Performance Test environment that
will support it.
6. Is data populated in the Database to the same level as in Production? If not, what
is the ratio?
Example:
The tools listed below will be used to support Performance Testing efforts. They will be
used, as needed, to capture the performance metrics listed in the table below.
Tool: HP Deep Diagnostics
Purpose: To measure physical server level utilization statistics and performance metrics, and to measure utilization statistics at the code level (classes and methods)

Tool: SiteScope
Purpose: To measure physical server level utilization statistics and performance metrics
The two tables below describe examples of the various performance metrics that can be
captured during the Performance Test stage to view resource usage trends.
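As a minimal sketch of how captured samples might be reduced to the trend figures typically reported per server tier, the following computes an average, a peak, and a nearest-rank 95th percentile. The CPU% sample values are invented, and the choice of statistics is illustrative rather than prescribed by this template:

```python
# Sketch: summarize utilization samples into reportable trend figures.
import math

def summarize(samples):
    ordered = sorted(samples)
    # Nearest-rank 95th percentile.
    idx = math.ceil(0.95 * len(ordered)) - 1
    return {"avg": sum(samples) / len(samples),
            "max": ordered[-1],
            "p95": ordered[idx]}

# Hypothetical CPU% samples from an application server during a 2-hour run.
cpu = [35, 42, 38, 71, 55, 48, 93, 44, 39, 50]
stats = summarize(cpu)
```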
Table 17: Business Transactions Statistics Summary

Business Transactions | SLA/Response Times Requirement | SLA/Response Times Actual | Transactions per Hour Requirement | Transactions per Hour Actual
Business Process 1 | | | |
Business Process 2 | 4 seconds | 2.4 secs | 1900 | 1900
Business Process 3 | 1 second | 0.6 secs | 1400 | 1400
Business Process 4 | 3 seconds | 9.28 secs | 1350 | 1214
Business Process 5 | 5 seconds | 3.9 secs | xxxx | xxxx
Performance Requirement
Observations
Performance Requirement
Observations
Performance Requirement
Observations
Performance Requirement
Observations
Performance Requirement
Observations
Additional comments:
Concur:
Appendix C: Acronyms
Instructions: Provide a list of acronyms and associated literal translations used within
the document. List the acronyms in alphabetical order using a tabular format as
depicted below.
Appendix D: Glossary
Instructions: Provide clear and concise definitions for terms used in this document that
may be unfamiliar to readers of the document. Terms are to be listed in alphabetical
order.
Some text and tables are provided as boilerplate examples of wording and
formats that may be used or modified as appropriate.
5. Figure captions and descriptions are to be placed centered, below the figure.
All figures must have an associated tag providing appropriate alternative text
for Section 508 compliance.
6. Delete this “Notes to the Author / Template Instructions” page and all
instructions to the author before finalizing the initial draft of the document.