Integrated Test Plan Template
FOR
[PROGRAM NAME]
[DATE]
___________________________________________________________________
Program Manager Date:
___________________________________________________________________
Test Director Date:
TABLE OF CONTENTS
1. Scope
2. Test Strategy
   2.1. Capabilities Risk-Based Testing Strategy
   2.2. Integrated Testing
   2.3. Regression Testing
3. Test Control
5. Glossary
1. Scope
This paragraph establishes the scope of the Developmental Test and Evaluation (DT&E)
activities of the program/family of systems. The Integrated Test Plan (ITP) integrates
developer and government test approaches and activities into a single overarching plan
covering all DT&E events. The ITP is not required for each release of capability
(sustainment release). An Integrated Test Description (ITD) is generated for each release
and contains release-specific details for conducting each test segment.
2. Test Strategy
This paragraph outlines the general strategy selected for testing and the high-level
approach for each segment of DT&E to be performed. Test segments are subject to
tailoring based on the program’s test strategy and approval by the Integrated Test Team
(ITT). The test strategy also includes capabilities risk-based testing, integrated testing,
and regression testing.
SAMPLE: Due to system complexity, size, time constraints, and available resources, the
risk-based testing strategy focuses on identifying the most important or costly anomalies
as early as possible. The ITT assesses test risks based on factors such as requirement
priorities, code complexity, frequency of use, and user priorities. This assessment
influences test planning, execution, and reporting strategies, and serves as the basis for
determining an optimal balance between test coverage and assessed risk.
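For illustration only, the following Python sketch shows one way an ITT might combine
such factors into a ranking score. The factor names, scales, and weights are notional,
not prescribed by this template, and would be calibrated by the program:

    from dataclasses import dataclass

    @dataclass
    class TestCandidate:
        name: str
        requirement_priority: int   # notional 1 (low) to 5 (critical)
        code_complexity: int        # e.g., cyclomatic complexity, notional scale
        usage_frequency: float      # notional 0.0 to 1.0 fraction of use

    # Notional weights; a real ITT would calibrate these for the program.
    def risk_score(tc: TestCandidate) -> float:
        return (0.5 * tc.requirement_priority
                + 0.3 * tc.code_complexity / 10
                + 0.2 * tc.usage_frequency * 5)

    candidates = [
        TestCandidate("login", 5, 12, 0.95),
        TestCandidate("report_export", 2, 30, 0.10),
        TestCandidate("audit_trail", 4, 8, 0.40),
    ]

    # Run the highest-risk cases first so costly anomalies surface early.
    for tc in sorted(candidates, key=risk_score, reverse=True):
        print(f"{tc.name}: {risk_score(tc):.2f}")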
SAMPLE: Integrated testing is a concept that structures T&E to more effectively
support the requirements and acquisition processes by integrating testing stakeholders,
techniques, and procedures. In concert with the risk-based test strategy, integrated
testing helps manage the right amount of testing and determines the DT&E phase
targeted for requirement validation while managing risk.
The ROT includes regression testing, which validates that existing capabilities and
functionality are not diminished or damaged by changes or enhancements introduced to a
system. Regression testing also includes “break-fix” testing, which verifies that
implemented corrections function as intended and meet specified requirements.
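For illustration, a minimal regression and break-fix pair might look like the following
Python (pytest-style) sketch. The capability under test and the PR number are
hypothetical, not artifacts of any specific program:

    # Hypothetical example: calculate_total stands in for an existing
    # capability; PR-1234 is a notional problem report, not a real one.

    def calculate_total(items):
        """Existing capability: sum the price of each (name, price) pair."""
        return sum(price for _, price in items)

    def test_existing_capability_unchanged():
        # Regression: behavior that passed before the change still passes.
        assert calculate_total([("widget", 2.50), ("gadget", 4.00)]) == 6.50

    def test_break_fix_pr_1234():
        # Break-fix: the correction for notional PR-1234 meets the
        # specified requirement that an empty order totals zero.
        assert calculate_total([]) == 0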
The selection of appropriate assessment procedures and the rigor, intensity, and
scope of the assessment depend on three factors:
The CSE may include application of Security Technical Implementation Guides (STIGs),
Security Readiness Review (SRR) Scans and Security Control Validation. This
paragraph defines how CSE is conducted and who has responsibility.
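A notional example of an automated security control validation follows; the control,
threshold, and file path are illustrative only and are not drawn from any STIG:

    import configparser

    # Notional hardening rule: minimum password length must be at least 15.
    # The rule and the config file layout are illustrative only.
    MIN_PASSWORD_LENGTH = 15

    def check_password_policy(path: str) -> bool:
        """Return True if the system config meets the notional control."""
        cfg = configparser.ConfigParser()
        cfg.read(path)
        actual = cfg.getint("security", "min_password_length", fallback=0)
        return actual >= MIN_PASSWORD_LENGTH

    if __name__ == "__main__":
        result = "PASS" if check_password_policy("/etc/example/security.ini") else "FAIL"
        print(f"Password length control: {result}")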
The SOE also includes regression testing, which validates that existing capabilities and
functionality are not diminished or damaged by changes or enhancements introduced to a
system. Regression testing also includes “break-fix” testing, which verifies that
implemented corrections function as intended and meet specified requirements.
The selection of appropriate assessment procedures and the rigor, intensity, and
scope of the assessment depend on three factors:
The CSE may include application of STIGs, SRR Scans and Security Control Validation.
This paragraph defines how CSE is conducted and who has responsibility.
Limited Deployment begins when the Functional Sponsor and the Milestone Decision
Authority (MDA) approve fielding the capability into an operational environment for:
(1) System Acceptance Test (SAT), conducted during the QT&E Test Phase as part of
Limited Deployment, which obtains confirmation that a system meets requirements; the
end users or subject matter experts provide such confirmation after they conduct a period
of trial or acceptance testing; and (2) Initial Operational Test and Evaluation (IOT&E) of
the implementation and use of a major release at one or more selected operational sites,
which provides the opportunity to observe the initial implementation and use of the
system under actual operating conditions prior to the Full Deployment Decision (FDD).
This paragraph defines Limited Deployment, how this decision is determined, and who
has responsibility, if applicable.
3. Test Control
test procedures. Although templates and forms exist for testing artifacts, the use of
automated test tools to collect and display data semantically equivalent to the data
identified in the forms and templates is recommended.
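As an illustration of tool output that is semantically equivalent to a form, the following
notional Python sketch renders a test-run record as structured data; the field names are
hypothetical and not taken from any approved form:

    import json
    from dataclasses import asdict, dataclass
    from datetime import date

    # Hypothetical field set; a real program would mirror the fields of its
    # approved testing forms so tool output stays semantically equivalent.
    @dataclass
    class TestRunRecord:
        test_id: str
        segment: str        # e.g., "ROT" or "SOE"
        objective: str
        result: str         # "Pass" or "Fail"
        pr_ids: list        # PRs raised during the run
        executed_on: str

    record = TestRunRecord(
        test_id="TC-042",
        segment="ROT",
        objective="Validate capability meets functional requirements",
        result="Fail",
        pr_ids=["PR-0101"],
        executed_on=date.today().isoformat(),
    )

    # Render the same data a paper form would capture.
    print(json.dumps(asdict(record), indent=2))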
<Sample> The following exit criteria are satisfied before each test segment may end and
movement to the next segment may begin (see the sketch after this list):
- All test objectives (for each segment) pass, or the Program Management Office
  (PMO)/Financial Management Office (FMO) accepts the associated risk to proceed.
- The triage team or DRB adjudicates all identified PRs.
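The sample criteria above lend themselves to a mechanical check. In this illustrative
Python sketch, the field names and status strings are notional, not prescribed by this
template:

    # Notional evaluation of the sample exit criteria above.
    def segment_may_exit(objectives, prs, risk_accepted=False):
        objectives_ok = (all(o["status"] == "Pass" for o in objectives)
                         or risk_accepted)                      # PMO/FMO risk acceptance
        prs_adjudicated = all(pr["adjudicated"] for pr in prs)  # triage team/DRB
        return objectives_ok and prs_adjudicated

    objectives = [{"id": "OBJ-1", "status": "Pass"},
                  {"id": "OBJ-2", "status": "Fail"}]
    prs = [{"id": "PR-0101", "adjudicated": True}]

    print(segment_may_exit(objectives, prs))                       # False
    print(segment_may_exit(objectives, prs, risk_accepted=True))   # True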
The LDTO ensures a PR is created for any issues found during QT&E. Upon identifying
a problem during testing, the tester will:
- Document the issue on a PR
- Submit the PR and supporting documents to the Watch Item (WIT)/DRB for resolution
- Ensure the PR is added to the configuration-controlled tracking tool
The LDTO Test Manager and Program Manager/Project Lead determine whether testing
continues on other test cases while a PR is being resolved, or whether all testing is halted.
To resume QT&E testing, the PMO notifies the LDTO that:
- All required modifications identified in the PR have been addressed
- When applicable, all changes have been retested and moved to the appropriate test
  environment
- The PR has been updated in the appropriate configuration-controlled tracking tool
A WIT provides a “method to observe identified conditions which do not fully satisfy
deficiency report submission criteria.” Reference T.O. 00-35D-54, chapter 2.
NOTE: For the purpose of this plan, a WIT is also known as a PR.
NOTE: Developers who do not use HP® Quality Center will ensure the following
subparagraphs are addressed.
Deficiency Report Created: DR has been created in Remedy database or other deficiency
tracking system. Reference must be made to the DR Tracking Number. (Constraint: Only
valid with Closed Status.)

Withdrawn: PR withdrawn at the originator’s request, or as suggested by PR reviewing
agency. (Constraint: Only valid with Closed Status.)

Duplicate: PR describes a deficiency or enhancement already addressed by an earlier PR.
Reference must be made to the PR ID of that earlier PR. (Constraint: Only valid with
Closed Status.)

Change Request: PR is determined to be a System Change Request (SCR) and closed.
Information contained in the PR will be added as an SCR in the local configuration
management database. (Constraint: Only valid with Closed Status.)

Table 7 - PR State Codes, Definitions, and Constraints
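The Closed-status constraint in Table 7 can be expressed as a simple validation rule. In
the following notional Python sketch, the enum values mirror the table, while the
validation function itself is illustrative only:

    from enum import Enum

    class PRState(Enum):
        DEFICIENCY_REPORT_CREATED = "Deficiency Report Created"
        WITHDRAWN = "Withdrawn"
        DUPLICATE = "Duplicate"
        CHANGE_REQUEST = "Change Request"

    # Per Table 7, every state code above is only valid with Closed status.
    CLOSED_ONLY = set(PRState)

    def validate_pr(state, status, reference=None):
        """Raise ValueError if the combination violates the Table 7 constraints."""
        if state in CLOSED_ONLY and status != "Closed":
            raise ValueError(f"{state.value} is only valid with Closed Status")
        # Created and Duplicate entries must cite a tracking number / earlier PR.
        needs_ref = (PRState.DEFICIENCY_REPORT_CREATED, PRState.DUPLICATE)
        if state in needs_ref and not reference:
            raise ValueError(f"{state.value} requires a reference ID")

    validate_pr(PRState.DUPLICATE, "Closed", reference="PR-0042")  # passes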
5. Glossary
This paragraph identifies acronyms and their expanded definitions used throughout the
document (including attachments).
Test Objective Table

ICV: Validate each component operates as designed.

CIT: Validate that all components integrate into the application as designed.

ROT: Validate all desired capabilities function properly and meet specific functional and
technical requirements.

ROT: Validate all Severity 1 or 2 PRs have been closed or the PMO and stakeholders
have accepted the associated risk before moving forward into the QT&E test phase.

PET: Validate, through Performance Evaluation Test (PET) techniques such as
bandwidth analysis, load testing, and stress testing, this release operates effectively and
meets any documented performance-related Key Performance Parameters (KPPs).

CSE: Evaluate, through Cybersecurity (CSE), how well this release controls
information-related risks (including assessment of network security policies,
identification and authentication, access controls, auditing, and the confidentiality,
integrity, and availability of data and the delivery systems).

CSE: Validate, through Security Readiness Review (SRR) Scans and remediation
activities, all Category (CAT) I and CAT II vulnerabilities are properly addressed.

SIT: Validate, through System Integration Testing (SIT), implementation procedures are
complete, accurate, and can be used to properly execute system installation.

DM: Validate, through Data Conversion tests, evaluation, and inspection, conversion
methods or routines execute properly and data is converted correctly.

SOE: Validate the system functions properly and meets specified functional and
technical requirements.

SOE: Validate, through Regression Testing, that existing capabilities and/or
functionality are not diminished or damaged by installation.

SOE: Validate this release operates properly with the current Standard Desktop
Configuration (SDC) version.

SOE: Validate all Severity 1 or 2 PRs have been closed or the PMO and stakeholders
have accepted the associated risk before moving into the intended production
environment.

ROT/SOE: Validate system interoperability through exhaustive execution of system
interfaces.

UET: Evaluate, through UET conducted by end users, how effectively this release meets
the usability, functionality, and mission requirements of the user community.

UET: Validate the Requirements, Problem Reports (PRs), Deficiency Reports (DRs),
Engineering Change Proposals (ECPs), Change Requests (CRs), and/or capabilities have
been corrected or implemented properly.

UET: Validate all Severity 1 or 2 PRs have been closed or the PMO and stakeholders
have accepted the associated risk before moving into the intended production
environment.