
SWTM046 Integrated Test Plan Template 30 October 2015

INTEGRATED TEST PLAN

FOR

[PROGRAM NAME]

[DATE]

___________________________________________________________________
Program Manager Date:

___________________________________________________________________
Test Director Date:

IMPORTANT NOTE: Include additional signatories as required. The Chief, Test Branch (CTB) signature is required for all acquisition category (ACAT) programs or as deemed necessary.

<<INSERT APPROPRIATE DISTRIBUTION AND/OR SCIENTIFIC AND TECHNICAL INFORMATION (STINFO) MARKINGS HERE>>

Record of Reviews and Changes


Change ID | Comment | OPR | Date Reviewed | Date Approved


TABLE OF CONTENTS

1. Scope
  1.1. System Overview
  1.2. System Threat Assessment
  1.3. Document Security
  1.4. Reference Documents
  1.5. System Interfaces
2. Test Strategy
  2.1. Capabilities Risk-Based Testing Strategy
  2.2. Integrated Testing
  2.3. Regression Testing
  2.4. Developmental Test and Evaluation (DT&E)
    2.4.1. Participating Organizations
    2.4.2. Test Artifacts
    2.4.3. Component Validation and Integration (CV&I)
      2.4.3.1. Individual Component Validation (ICV)
      2.4.3.2. Component Integration Test (CIT)
      2.4.3.3. Data Management (DM)
      2.4.3.4. Requirements Operability Test (ROT)
      2.4.3.5. Performance Evaluation Test (PET)
      2.4.3.6. Cybersecurity Evaluation (CSE)
      2.4.3.7. System Integration Test (SIT)
      2.4.3.8. User Evaluation Test (UET)
    2.4.4. Test Readiness Review I (TRR I)
    2.4.5. Qualification Test and Evaluation (QT&E)
      2.4.5.1. System Integration Test (SIT)
      2.4.5.2. Data Management (DM)
      2.4.5.3. System Operability Evaluation (SOE)
      2.4.5.4. Performance Evaluation Test (PET)
      2.4.5.5. Cybersecurity Evaluation (CSE)
      2.4.5.6. User Evaluation Test (UET)
      2.4.5.7. Limited Deployment
      2.4.5.8. System Acceptance Test (SAT)
3. Test Control
  3.1. CV&I Test Control
    3.1.1. CV&I Test Environment Management
      3.1.1.1. CV&I Test Environment and Artifact Control
      3.1.1.2. CV&I Automated Test Tools
    3.1.2. CV&I Test Execution
      3.1.2.1. CV&I Test Execution Procedures
      3.1.2.2. CV&I Test Execution Log
      3.1.2.3. CV&I Validation of Test Environment
      3.1.2.4. CV&I Problem Report (PR) Management
      3.1.2.5. CV&I Exit Criteria
      3.1.2.6. CV&I Suspension and Resumption Criteria
    3.1.3. CV&I Test Constraints and Limitations
    3.1.4. CV&I Test Reporting
  3.2. QT&E Test Control
    3.2.1. QT&E Test Environment Management
      3.2.1.1. QT&E Test Environment and Artifact Control
      3.2.1.2. QT&E Automated Test Tools
    3.2.2. QT&E Test Execution
      3.2.2.1. QT&E Test Execution Procedures
      3.2.2.2. QT&E Test Execution Log
      3.2.2.3. QT&E Validation of Test Environment
      3.2.2.4. QT&E Problem Report (PR) Management
      3.2.2.5. QT&E Exit Criteria
      3.2.2.6. QT&E Suspension and Resumption Criteria
    3.2.3. QT&E Test Constraints and Limitations
    3.2.4. QT&E Test Reporting
4. Test Quality Assurance (QA)
  4.1. Test Artifact Quality Assurance
  4.2. Recycle Procedures
  4.3. Watch Item (WIT)/Deficiency Review Board (DRB) Procedures
  4.4. Hewlett-Packard (HP®) Quality Center Standard Configurations
    4.4.1. Problem Report (PR) Severity Codes and Definitions
    4.4.2. Problem Report (PR) Management Life Cycle
    4.4.3. Problem Report (PR) Status Codes and Definitions
    4.4.4. Problem Report (PR) State Codes and Definitions
5. Glossary
Appendix A - Sample Test Objectives Table

1. Scope
This paragraph establishes the scope of the Developmental Test and Evaluation (DT&E)
activities of the program/family of systems. The Integrated Test Plan (ITP) integrates
developer and government test approaches and activities into a single overarching plan
covering all DT&E events. The ITP is not required for each release of capability
(sustainment release). An Integrated Test Description (ITD) is generated for each release
and contains release-specific details for conducting each test segment.

1.1. System Overview


This paragraph contains a description of the system and the software to which this test
plan applies, including, as applicable, identification number, title and abbreviation, and
major version number. If there is a document that contains the system overview or
system concept of operations, this paragraph references that document.

1.2. System Threat Assessment


This paragraph identifies the potential system and operational threats.

SAMPLE: Potential system threats include: electronic eavesdropping, unauthorized


disclosure of information, alteration of applications programs, unauthorized access/use,
theft of assets, employee sabotage, enemy overrun/civil disorder, vandalism, malicious
attack, shared or compromised passwords, spoofing, and terrorism/war. Potential
operational threats include: natural disaster, fire, hurricane, snow/ice storms, theft of
assets, environmental contamination, malfunction/failure of software, alteration of
applications programs, operator/user error, misuse of computer resources, interface
integrity, water or liquid damage, lightning, tornado, severe wind storms, bomb threat,
telecommunications failure, unintentional programmer error, alteration/failure of
hardware, data entry error, and power instability.

1.3. Document Security


This paragraph describes the classification of the plan and security or privacy
considerations associated with the use of this document.

1.4. Reference Documents


This paragraph lists any other documents that are needed or used to develop the test plan.

1.5. System Interfaces


This paragraph lists all interfaces (inbound/outbound) for the system that exchange data.
For each interface, provide the system or interface long name, the interface short name,
inbound/outbound designation, the protocol and the test method.

Sample Interface Table:


System / Interface Long Name | Interface Short Name | Inbound/Outbound | Protocol | Test Method
1099 Tax Reporting Program | 1099-O-001 | O | FTP | Stub Test
Automated Fund Management System | AFMS-I-002 | I | SFTP | End-To-End
Accounting Pre-Validation Module | APVM-I-001 | I | MQ Message | Stub Test

Table 1 – Interface Table
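
NOTE (illustrative only): The Test Method column distinguishes stub tests, in which the interface partner is simulated, from end-to-end tests against the live interface. The following Python sketch shows one minimal form a stub test could take; the message layout and the process_inbound_message routine are assumptions, not part of this template.

```python
# Minimal sketch of a "Stub Test" for an inbound interface (illustrative only).
# The message layout and process_inbound_message() are assumptions standing in
# for the program's actual interface specification and ingest routine.
import unittest


def process_inbound_message(message: dict) -> str:
    """Placeholder ingest routine standing in for the system under test."""
    if not message.get("document_id"):
        return "REJECTED"
    return "ACCEPTED"


class InboundInterfaceStubTest(unittest.TestCase):
    """Stub test: the sending system is simulated rather than connected."""

    def test_valid_message_is_accepted(self):
        stub_message = {"document_id": "0001", "amount": "125.00"}  # simulated feed
        self.assertEqual(process_inbound_message(stub_message), "ACCEPTED")

    def test_message_missing_key_field_is_rejected(self):
        stub_message = {"amount": "125.00"}  # key field omitted by the stub
        self.assertEqual(process_inbound_message(stub_message), "REJECTED")


if __name__ == "__main__":
    unittest.main()
```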

2. Test Strategy
This paragraph outlines the general strategy selected for testing and the high-level
approach for each segment of DT&E to be performed. Test segments are subject to
tailoring based on the program’s test strategy and approval by the Integrated Test Team
(ITT). The test strategy also includes capabilities risk-based testing, integrated testing,
and regression testing.

2.1. Capabilities Risk-Based Testing Strategy


This paragraph describes, in detail, the risk-based testing strategy for the program. Refer
to the Capabilities Risk-Based Testing Assessment Guide, provided by the Lead
Developmental Test and Evaluation Organization (LDTO).

SAMPLE: Due to system complexity, size, time constraints, and available resources, the risk-based testing strategy focuses on identifying the most important or costly anomalies
as early as possible. The ITT assesses test risks based on factors such as: requirement
priorities, code complexity, frequency of use, user priorities, etc. This assessment
influences test planning, execution and reporting strategies, and serves as the basis for
determining an optimal balance between test coverage and assessed risks.
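
NOTE (illustrative only): One simple way to make such an assessment repeatable is a weighted scoring of the risk factors. The following Python sketch illustrates the idea; the factor names, weights, and ratings are assumptions to be replaced by values the ITT agrees on.

```python
# Illustrative risk-scoring sketch for capabilities risk-based testing.
# Factor names, weights, and the 1-5 rating scale are assumptions, not values
# prescribed by this template; the ITT supplies the real ones.

WEIGHTS = {"requirement_priority": 0.4, "code_complexity": 0.3, "frequency_of_use": 0.3}


def risk_score(factors: dict) -> float:
    """Weighted sum of factor ratings, each rated 1 (low) to 5 (high)."""
    return sum(WEIGHTS[name] * rating for name, rating in factors.items())


def prioritize(requirements: list) -> list:
    """Order requirements so the riskiest are tested earliest and deepest."""
    return sorted(requirements, key=lambda r: risk_score(r["factors"]), reverse=True)


if __name__ == "__main__":
    reqs = [
        {"id": "REQ-001", "factors": {"requirement_priority": 5, "code_complexity": 4, "frequency_of_use": 5}},
        {"id": "REQ-002", "factors": {"requirement_priority": 2, "code_complexity": 1, "frequency_of_use": 3}},
    ]
    for req in prioritize(reqs):
        print(req["id"], round(risk_score(req["factors"]), 2))
```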

2.2. Integrated Testing


This paragraph defines the process of employing integrated testing techniques. The
primary goals of integrated testing are:
 Minimize the transitions between contractor and government testing
 Minimize redundant testing efforts
 Ensure the right amount of testing is completed while managing schedule impacts
 Identify problems and engage developers to correct problems early on in the
process

SAMPLE: Integrated testing is a concept that helps structure T&E to more effectively
support the requirements and acquisition process by integrating testing stakeholders,
techniques, and procedures. Considering risk-based test strategy, integrated testing
assists in managing the correct amount of testing and determining the DT&E phase
targeted for requirement validation while managing risk.


2.3. Regression Testing


This paragraph describes how regression testing is conducted throughout DT&E.
Regression testing validates that existing capabilities and functionality are not diminished
or damaged by changes or enhancements introduced to a system. Regression testing also
includes “break-fix” testing which verifies that the corrections implemented meet
specified requirements.

2.4. Developmental Test and Evaluation (DT&E)


This paragraph defines DT&E phases.

SAMPLE: DT&E is conducted to evaluate design, quality, performance, functionality,


security, interoperability, supportability, suitability, usability, and maturity of a system or
capability in an operationally representative environment. DT&E includes contractor and
government testing and is conducted over the life cycle of a system. DT&E supports the
acquisition of new systems or enhanced capabilities of existing systems before fielding
decisions and supports the sustainment of systems or capabilities to keep them current or
extend their service life. [Program Name / Family of Systems Name] employs an
integrated version of DT&E that is composed of two test phases: Component Validation
and Integration (CV&I) and Qualification Test and Evaluation (QT&E). CV&I is
performed by the developing agency or system integrator, managed by the Program Test
Manager (PTM), and observed by the government Lead Developmental Test and
Evaluation Organization (LDTO), while QT&E is performed by the LDTO and is
managed by the LDTO’s Test Director (TD), Test Manager (TM), and Test Evaluators
(TEs). Paragraph 2.4.1 and table 2 specify the organizations participating in the CV&I
and QT&E phases of test.

2.4.1. Participating Organizations


This paragraph identifies the organizations that participate in the testing and their roles
and responsibilities. In particular, it identifies the organizations responsible for
conducting, observing, and preparing for the tests (creating test scripts, identifying and
training testers, etc.), and preparing test results (preparing reports, preparing briefings,
tracking problem report disposition, etc.). Refer to table 2 and the program’s ITT Charter
for a list of participating organizations.

Sample Participating Organization Table:


Activity | Site Name | Role/Responsibility | Organization
ICV Testing | CIE, Gunter AFB | Developer/Tester | Developer/SI

Table 2 – Participating Organization Table


2.4.2. Test Artifacts


This paragraph describes any project-level conventions for naming and storing test
scripts, test reports, special data or databases created just for testing, and special software
created for testing, such as test drivers or stubs or special software used for executing or
managing the tests.
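
NOTE (illustrative only): If the ITT has no existing convention, a file-naming helper similar to the following Python sketch can keep artifact names uniform; the field order and separators shown are assumptions, not a prescribed standard.

```python
# Illustrative artifact-naming helper; the fields and separators are assumptions,
# not a convention prescribed by this template.
from datetime import date


def artifact_name(program: str, release: str, segment: str, artifact: str, when: date) -> str:
    """Return a file-name stem such as 'PROGRAM_R1.0_SIT_testlog_20151030'."""
    return f"{program}_{release}_{segment}_{artifact}_{when:%Y%m%d}"


print(artifact_name("PROGRAM", "R1.0", "SIT", "testlog", date(2015, 10, 30)))
```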

2.4.3. Component Validation and Integration (CV&I)


This paragraph defines the CV&I phase.

SAMPLE: CV&I is performed by the developing agency or system integrator, managed


by the Program Test Manager (PTM), and observed by the government LDTO. The
purpose of CV&I is for the developing agency or system integrator to demonstrate that
each individual component and the assembled components are developed in accordance
with the approved design and function properly to meet specified requirements. CV&I
may be conducted in iterations as components are completed and integrated. Table 3
identifies test objectives for each applicable test segment. See appendix A for a list of
suggested test objectives.

Sample of Overall CV&I Table:


Test Segment | Test Objective
SIT | To validate that all components integrate into the affected applications without issue.
DM | To validate that all components produce intended results.

Table 3 – Overall CV&I Table

2.4.3.1. Individual Component Validation (ICV)


The Individual Component Validation (ICV) validates each individual component is
developed in accordance with approved designs. This paragraph defines how ICV is
conducted and who has responsibility.

2.4.3.2. Component Integration Test (CIT)


The Component Integration Test (CIT), conducted during the CV&I Test Phase, validates
completed components can be integrated into a complete system in accordance with
specified requirements and approved designs. This paragraph defines how CIT is
conducted and who has responsibility.

2.4.3.3. Data Management (DM)


Data Management (DM), performed during both the CV&I and QT&E Test Phases, is the process used to Extract, Transform, and Load (ETL) data from one system for use in another, usually for the purpose of application interoperability or system modernization. DM may consist of Data Migration, Data Conversion, and/or Data Validation. This
paragraph defines how DM is conducted and who has responsibility. This segment is
optional (include rationale if tailored).
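
NOTE (illustrative only): Data Validation frequently reduces to reconciliation checks between the source and target data stores. The following Python sketch compares row counts and a converted field; the record layout and field names are assumptions.

```python
# Illustrative Data Validation reconciliation sketch. The record layout and the
# 'account_id'/'amount' field names are assumptions; substitute the program's
# actual conversion mapping.

def reconcile(source_rows: list, target_rows: list) -> list:
    """Return a list of discrepancies between converted data sets (empty = pass)."""
    findings = []
    if len(source_rows) != len(target_rows):
        findings.append(f"row count mismatch: {len(source_rows)} source vs {len(target_rows)} target")
    target_by_key = {row["account_id"]: row for row in target_rows}
    for src in source_rows:
        tgt = target_by_key.get(src["account_id"])
        if tgt is None:
            findings.append(f"{src['account_id']}: missing from target")
        elif tgt["amount"] != src["amount"]:
            findings.append(f"{src['account_id']}: amount {src['amount']} converted to {tgt['amount']}")
    return findings


print(reconcile([{"account_id": "A1", "amount": "10.00"}],
                [{"account_id": "A1", "amount": "10.00"}]))  # [] means the check passes
```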


2.4.3.4. Requirements Operability Test (ROT)


The Requirements Operability Test (ROT), conducted during the CV&I Test Phase,
validates that the integrated system functions properly and meets specified requirements
and approved designs. This paragraph defines how ROT is conducted and who has
responsibility.

The ROT includes regression testing which validates that existing capabilities and
functionality are not diminished or damaged by changes or enhancements introduced to a
system. Regression testing also includes “break-fix” testing that verifies corrections
implemented function to meet specified requirements.

2.4.3.5. Performance Evaluation Test (PET)


The Performance Evaluation Test (PET) evaluates the performance of the integrated
system by employing techniques which may include bandwidth analysis, load testing, and
stress testing to ensure the system performs in accordance with specified requirements
and approved designs. This paragraph defines how PET is conducted and who has
responsibility. This segment is optional (include rationale if tailored).
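
NOTE (illustrative only): The following Python sketch shows a minimal load-test harness that drives a caller-supplied transaction concurrently and summarizes response times; the concurrency level, sample size, and pass/fail threshold are assumptions, not values prescribed by this template.

```python
# Illustrative load-test harness. The transaction() callable, the concurrency
# level, and the response-time threshold are assumptions for illustration.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor


def timed(transaction) -> float:
    """Execute one transaction and return its elapsed time in seconds."""
    start = time.perf_counter()
    transaction()
    return time.perf_counter() - start


def load_test(transaction, users: int = 20, iterations: int = 200) -> dict:
    """Run the transaction concurrently and summarize response times."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        times = sorted(pool.map(lambda _: timed(transaction), range(iterations)))
    return {
        "mean": statistics.mean(times),
        "p95": times[int(0.95 * (len(times) - 1))],
        "max": max(times),
    }


if __name__ == "__main__":
    results = load_test(lambda: time.sleep(0.01))  # stand-in transaction
    print(results, "PASS" if results["p95"] < 0.5 else "FAIL")  # assumed 0.5 s threshold
```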

2.4.3.6. Cybersecurity Evaluation (CSE)


The Cybersecurity Evaluation (CSE), conducted during both the CV&I and QT&E Test
Phases, evaluates information-related risks to a system. The CSE includes the following activities:
 Develop, review, and approve a plan to assess the security controls.
o Ensure security control assessment activities are coordinated with
the following: interoperability and supportability certification
efforts; and, T&E events.
o Ensure the coordination of activities is documented in the security
assessment plan and the program T&E documentation, to
maximize effectiveness, reuse, and efficiency.
 Assess the security controls in accordance with the security assessment
plan and DoD assessment procedures.
o Record security control compliance;
o Assign vulnerability severity values for security controls;
o Determine risk levels for security controls; and,
o Assess and characterize aggregate levels of risk to the system.
 Document issues, findings and recommendations from assessments.
 Conduct remediation actions on non-compliant security controls.
o Assist development personnel with POA&M documentation for
non-compliant controls that cannot be remediated during the
assessment.

The selection of appropriate assessment procedures and the rigor, intensity, and
scope of the assessment depend on three factors:

 The security categorization of the information system;


 The assurance requirements that the organization intends to meet in
determining the overall effectiveness of the security controls; and,
 The security controls from NIST SP 800-53 as identified in the approved
security plans.

The information produced during control assessments can be used by an


organization to:

 Identify potential problems or shortfalls in the program’s implementation


of the Risk Management Framework;
 Identify security-related weaknesses and deficiencies in the information
system and in the environment in which the system operates;
 Prioritize risk mitigation decisions and associated risk mitigation
activities;
 Confirm that identified security-related weaknesses and deficiencies in
the information system and in the environment of operation have been
addressed;
 Support monitoring activities and information security situational
awareness;
 Facilitate security authorization decisions and ongoing authorization
decisions; and
 Inform budgetary decisions and the capital investment process.

The CSE may include application of Security Technical Implementation Guides (STIGs),
Security Readiness Review (SRR) Scans and Security Control Validation. This
paragraph defines how CSE is conducted and who has responsibility.

2.4.3.7. System Integration Test (SIT)


The System Integration Test (SIT), conducted during both the CV&I and QT&E phases,
validates the integration of a system into an operationally-representative environment
(installation, removal, and backup and recovery procedures). This paragraph defines how
SIT is conducted and who has responsibility.

2.4.3.8. User Evaluation Test (UET)


The User Evaluation Test (UET) is typically ad-hoc testing conducted by end users of the
system. The UET is conducted to offer an early look at the maturity of the system and to
evaluate how well the system meets mission requirements. This paragraph defines how
UET is conducted and who has responsibility. This segment is optional (include rationale
if tailored).


2.4.4. Test Readiness Review I (TRR I)


The Test Readiness Review I (TRR I) is the formal review, conducted by the PM, that signifies the CV&I portion of DT&E is complete and recommends the system move into the QT&E portion of DT&E. The results of the TRR I demonstrate that each
individual component and the assembled components are developed or configured in
accordance with the approved design and function properly to meet specified
requirements. This paragraph defines how the TRR I is conducted and who has
responsibility.

2.4.5. Qualification Test and Evaluation (QT&E)


This paragraph defines the Qualification Test and Evaluation (QT&E) phase.

SAMPLE: QT&E is managed and performed by the appointed government LDTO. It is


performed in a government-provided and government-managed, operationally representative environment. The purpose of QT&E is to validate that the product integrates into its intended environment, meets both functional and nonfunctional requirements, meets performance standards, and that the information assurance controls employed by the system meet DoD standards and policies. QT&E can also be supported by end-users (with oversight
by the LDTO/Participating Test Organization (PTO)) to provide end-user input related to
system maturity and its ability to meet operational mission requirements. Table 4
identifies test objectives for each applicable test segment. See appendix A for a list of
suggested test objectives.

Sample of Overall QT&E Table:


Test Segment | Test Objective
QT&E Environment 1:
SIT | To validate that all components integrate into the affected applications without issue.
QT&E Environment 2:
SIT | To validate that all components integrate into the affected applications without issue.
DM | To validate that all components produce intended results.

Table 4 – Overall QT&E Table

2.4.5.1. System Integration Test (SIT)


The System Integration Test (SIT), conducted during both the CV&I and QT&E phases,
validates the integration of a system into an operationally-representative environment
(installation, removal, and backup and recovery procedures). This paragraph defines how
SIT is conducted and who has responsibility.


2.4.5.2. Data Management (DM)


Data Management (DM), performed during both the CV&I and QT&E Test Phases, is the process used to Extract, Transform, and Load (ETL) data from one system for use in another, usually for the purpose of application interoperability or system modernization. DM may consist of Data Migration, Data Conversion, and/or Data Validation. This
paragraph defines how DM is conducted and who has responsibility. This segment is
optional (include rationale if tailored).

2.4.5.3. System Operability Evaluation (SOE)


The System Operability Evaluation (SOE), conducted during the QT&E Test Phase and
managed by the LDTO, is traceable, scenario- and/or script-driven, end-to-end qualification testing of a system that validates the integrated system operates in
accordance with specified requirements and approved designs. This paragraph defines
how SOE is conducted and who has responsibility.

The SOE also includes regression testing which validates that existing capabilities and
functionality are not diminished or damaged by changes or enhancements introduced to a
system. Regression testing also includes “break-fix” testing that verifies corrections
implemented function to meet specified requirements.

2.4.5.4. Performance Evaluation Test (PET)


The Performance Evaluation Test (PET) evaluates the performance of the integrated
system by employing techniques such as bandwidth analysis, load testing, and stress
testing. During QT&E, the PET may be tailored based on the results of similar tests
conducted during CV&I. This paragraph defines how PET is conducted and who has
responsibility. This segment is optional (include rationale if tailored).

2.4.5.5. Cybersecurity Evaluation (CSE)


The Cybersecurity Evaluation (CSE), conducted during both the CV&I and QT&E Test
Phases, evaluates information-related risks to a system. It includes the following activities:
 Develop, review, and approve a plan to assess the security controls.
o Ensure security control assessment activities are coordinated with
the following: interoperability and supportability certification
efforts; and, T&E events.
o Ensure the coordination of activities is documented in the security
assessment plan and the program T&E documentation, to
maximize effectiveness, reuse, and efficiency.
 Assess the security controls in accordance with the security assessment
plan and DoD assessment procedures.
o Record security control compliance;
o Assign vulnerability severity values for security controls;
o Determine risk levels for security controls; and,
o Assess and characterize aggregate levels of risk to the system.
 Document issues, findings and recommendations from assessments.
 Conduct remediation actions on non-compliant security controls.


o Assist development personnel with POA&M documentation for
non-compliant controls that cannot be remediated during the
assessment.

The selection of appropriate assessment procedures and the rigor, intensity, and
scope of the assessment depend on three factors:

 The security categorization of the information system;


 The assurance requirements that the organization intends to meet in
determining the overall effectiveness of the security controls; and,
 The security controls from NIST SP 800-53 as identified in the approved
security plans.

The information produced during control assessments can be used by an


organization to:

 Identify potential problems or shortfalls in the program’s implementation


of the Risk Management Framework;
 Identify security-related weaknesses and deficiencies in the information
system and in the environment in which the system operates;
 Prioritize risk mitigation decisions and associated risk mitigation
activities;
 Confirm that identified security-related weaknesses and deficiencies in
the information system and in the environment of operation have been
addressed;
 Support monitoring activities and information security situational
awareness;
 Facilitate security authorization decisions and ongoing authorization
decisions; and
 Inform budgetary decisions and the capital investment process.

The CSE may include application of STIGs, SRR Scans and Security Control Validation.
This paragraph defines how CSE is conducted and who has responsibility.

2.4.5.6. User Evaluation Test (UET)


The User Evaluation Test (UET) is typically ad-hoc testing conducted by end users of the
system. The UET is conducted to offer an early look at the maturity of the system and to
evaluate how well the system meets mission requirements. This paragraph defines how
UET is conducted and who has responsibility. This segment is optional (include rationale
if tailored).

2.4.5.7. Limited Deployment


Limited Deployment begins when the Functional Sponsor and the Milestone Decision Authority (MDA) approve fielding the capability into an operational environment for: (1) the System Acceptance Test (SAT), conducted during the QT&E Test Phase as part of Limited Deployment, which obtains confirmation from end users or subject matter experts, after a trial period or acceptance test, that the system meets requirements; and (2) Initial Operational Test and Evaluation (IOT&E) of the
implementation and use of a major release at one or more selected operational sites. It
provides the opportunity to observe the initial implementation and use of the system
under actual operating conditions prior to the Full Deployment Decision (FDD). This
paragraph defines Limited Deployment, how this decision is determined and who has
responsibility, if applicable.

2.4.5.8. System Acceptance Test (SAT)


The System Acceptance Test (SAT), conducted during the QT&E Test Phase as part of
Limited Deployment, obtains confirmation that a system meets mutually agreed-upon
requirements. The end users or subject matter experts provide confirmation after they
conduct a trial period or acceptance testing. This paragraph defines how SAT is
conducted and who has responsibility. This segment is optional (include rationale if
tailored).

3. Test Control

3.1. CV&I Test Control


This paragraph generally includes a series of subparagraphs to address activities within the test phase and the associated environment. The following is a list of areas to address:

3.1.1. CV&I Test Environment Management


This paragraph specifies who maintains control of the test environments during the entire
test period. It also addresses the processes and ownership over mid-project changes to
the physical hardware, communications and connectivity, database management,
hardware configurations, and software versions (builds) across test instances.

3.1.1.1. CV&I Test Environment and Artifact Control


All test environments and artifacts used in a test such as components, current product
baseline, test plans, databases and test scripts are under configuration control before the
test starts. All test artifacts generated by a test such as the test log, test results, problem
reports and test reports are under configuration control when they are generated. If some
artifacts are contained in testing tools, this paragraph will also address the configuration
control of those artifacts. Describe any changes or specializations to the configuration
management plan that apply to the test artifacts. Reference the applicable Configuration
Management Plan.

3.1.1.2. CV&I Automated Test Tools


Each test tool utilized is separately addressed and approved. Identify who manages,
controls, and is responsible for test data content, management, and maintenance. If using
an approved test management tool, describe how the test tool will automate parts of the
test procedures. Although templates and forms exist for testing artifacts, the use of
automated test tools to collect and display data semantically equivalent to the data
identified in the forms and templates is recommended.

3.1.2. CV&I Test Execution


This paragraph generally includes a series of subparagraphs to address project-level
activities associated with CV&I test execution. The following is a list of areas to address:

3.1.2.1. CV&I Test Execution Procedures


This paragraph outlines the general test standards and processes that apply regardless of
test type. For example, it includes the processes for identifying expected results, logging
and evaluating actual results, and retesting to validate resolution of discrepancies or other
testing errors.

3.1.2.2. CV&I Test Execution Log


The test log is a chronological record of the events that take place during the test process
for a given release. This paragraph specifies how test logs are established and maintained.
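
NOTE (illustrative only): One lightweight implementation of such a log is an append-only file of timestamped entries, as in the following Python sketch; the field names are assumptions, and an approved test management tool may be used instead.

```python
# Illustrative test execution log: append-only, timestamped JSON-lines entries.
# Field names are assumptions; an approved test management tool may replace this.
import json
from datetime import datetime, timezone


def log_event(path: str, tester: str, event: str, detail: str = "") -> None:
    """Append one chronological entry to the test execution log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tester": tester,
        "event": event,
        "detail": detail,
    }
    with open(path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")


log_event("cvi_test_log.jsonl", "J. Tester", "script_started", "SIT-001 step 1")
log_event("cvi_test_log.jsonl", "J. Tester", "anomaly_observed", "see PR-0042")
```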

3.1.2.3. CV&I Validation of Test Environment


This paragraph defines the process for validating the test environment. For example, the
test environment is validated at the start of testing for the individual types of test, and is
revalidated anytime the environment is corrupted or compromised. Exceptions
encountered during the validation or revalidation processes are explained to and approved
by the ITT prior to proceeding.

3.1.2.4. CV&I Problem Report (PR) Management


This paragraph describes the processes to follow for PR management. For example, an
automated test management tool may be used to record, track, and report the problem
reports. The conditions leading to documentation of a PR are duplicated or re-created to
the best extent possible before the PR is generated. Evaluators recording PRs document
the necessary details to provide the development team, ITT, and Deficiency Review
Board (DRB) with adequate information to make qualified and accurate decisions.

3.1.2.5. CV&I Exit Criteria


This paragraph identifies the exit criteria that must be satisfied before testing may end
and movement to the next segment may begin. Include test segment pass/fail criteria.

<Sample> The following exit criteria are satisfied before each test segment may end and movement to the next segment may begin (a simple check of these criteria is sketched after this list):
 All test objectives (for each segment) pass or the Program Management Office
(PMO)/Financial Management Office (FMO) accept the associated risk to
proceed.
 The triage team or DRB adjudicates all PRs identified.
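
NOTE (illustrative only): The following Python sketch shows how these two sample criteria could be checked mechanically; the objective and PR record layouts are assumptions.

```python
# Illustrative exit-criteria check for a test segment. The objective and PR
# record layouts are assumptions; the two rules mirror the sample criteria above.

def segment_may_exit(objectives: list, problem_reports: list) -> bool:
    """True when every objective passed (or its risk was accepted) and every PR
    has been adjudicated by the triage team or DRB."""
    objectives_ok = all(o["passed"] or o["risk_accepted"] for o in objectives)
    prs_adjudicated = all(pr["adjudicated"] for pr in problem_reports)
    return objectives_ok and prs_adjudicated


print(segment_may_exit(
    [{"id": "OBJ-1", "passed": True, "risk_accepted": False}],
    [{"id": "PR-0042", "adjudicated": True}],
))
```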


3.1.2.6. CV&I Suspension and Resumption Criteria


This paragraph defines the conditions under which some or all of the tests are suspended
pending correction of problems encountered and specifies the conditions that must be met
in order to resume execution of testing. It also includes the developer’s procedures,
which are similar to the criteria described in paragraph 3.2.2.6.

3.1.3. CV&I Test Constraints and Limitations


This paragraph identifies factors that constrain or limit the ability to thoroughly test the
system, or affect the timing of testing.

3.1.4. CV&I Test Reporting


This paragraph describes how the results of CV&I test segments are documented and reported in the Integrated Test Report (ITR).

3.2. QT&E Test Control


This paragraph generally includes a series of subparagraphs to address activities within the test phase and the associated environment. The following is a list of areas to address:

3.2.1. QT&E Test Environment Management


This paragraph specifies who maintains control of the test environments during the entire
test period. It also addresses the processes and ownership over mid-project changes to
the physical hardware, communications and connectivity, database management, and
software versions across test instances.

3.2.1.1. QT&E Test Environment and Artifact Control


All test environments and artifacts used in a test such as components, current product
baseline, test plans, databases and test scripts are under configuration control before the
test starts. All test artifacts generated by a test such as the test log, test results, problem
reports and test reports are under configuration control when they are generated. If some
artifacts are contained in testing tools, this paragraph also addresses the configuration
control of those artifacts. Describe any changes or specializations to the configuration
management plan that apply to the test artifacts. Reference the applicable Configuration
Management Plan.

3.2.1.2. QT&E Automated Test Tools


Each test tool is separately addressed and approved. Identify who manages, controls, and
is responsible for test data content, management, and maintenance. If using an approved
test management tool, describe how the test tool will automate parts of the test
procedures. Although templates and forms exist for testing artifacts, the use of
automated test tools to collect and display data semantically equivalent to the data
identified in the forms and templates is recommended.

3.2.2. QT&E Test Execution


This paragraph generally includes a series of subparagraphs to address project-level
activities associated with QT&E test execution. The following is a list of areas to address:


3.2.2.1. QT&E Test Execution Procedures


This paragraph outlines the general test standards and processes that apply regardless of
test type. For example, it includes the processes for identifying expected results, logging
and evaluating actual results, and retesting to validate resolution of discrepancies or other
testing errors.

3.2.2.2. QT&E Test Execution Log


The test log is a chronological record of the events that take place during the test process
for a given release. This paragraph describes how test logs are established and
maintained.

3.2.2.3. QT&E Validation of Test Environment


This paragraph addresses the requirements in order to validate the test environment. For
example, the test environment is validated at the start of testing for the individual types of
test, and is revalidated anytime the environment is corrupted or compromised.
Exceptions encountered during the validation or revalidation processes are explained to
and approved by the LDTO prior to proceeding.

3.2.2.4. QT&E Problem Report (PR) Management


This paragraph describes the processes to follow for PR management. For example, an
automated test management tool may be used to record, track, report, and manage the
problem reports. The conditions leading to documentation of a PR are duplicated or re-
created to the best extent possible before the PR is generated. Evaluators recording PRs
document the necessary details to provide the development team, ITT, and DRB with
adequate information to make qualified and accurate decisions.

3.2.2.5. QT&E Exit Criteria


This paragraph identifies the exit criteria that must be satisfied before testing may end
and movement to the next segment may begin. Include test segment pass/fail criteria.

<Sample> The following exit criteria are satisfied before each test segment may end and
movement to the next segment may begin:
 All test objectives (for each segment) pass or the PMO/FMO accept the
associated risk to proceed.
 The triage team or DRB adjudicates all PRs identified.

3.2.2.6. QT&E Suspension and Resumption Criteria


This paragraph describes the conditions under which some or all of the tests are
suspended pending correction of problems encountered. This paragraph also defines the
conditions that must be met in order to resume execution of testing.


The LDTO ensures a PR is created for any issues found during QT&E. Upon identifying
a problem during testing, the tester will:
 Document the issue on a PR
 Submit the PR and supporting documents to the Watch Item (WIT)/DRB for
resolution
 Ensure the PR is added to the configuration-controlled tracking tool

The LDTO Test Manager and Program Manager/Project Lead determine whether testing
continues on other test cases while a PR is being resolved, or whether all testing is halted.
To resume QT&E testing, the PMO notifies the LDTO that:
 All required modifications identified in the PR have been addressed
 When applicable, all changes have been retested and moved to the appropriate test
environment
 The PR has been updated in the appropriate configuration-controlled tracking tool

3.2.3. QT&E Test Constraints and Limitations


This paragraph identifies factors that constrain or limit the ability to thoroughly test the
system or affect the timing of testing.

3.2.4. QT&E Test Reporting


This paragraph describes how the results of QT&E test segments are documented and
reported in the ITR.

4. Test Quality Assurance (QA)


This paragraph outlines any project-level quality assurance procedures used in testing
such as test artifact reviews, reviews of the test approaches for different tests, and the
standard configuration of automated test tools. This paragraph also describes any quality
assurance procedures used in resolving problem reports such as re-creating the error,
applying the fix, and retesting to ensure the fix is effective.

4.1. Test Artifact Quality Assurance


This paragraph specifies the process and responsibilities for ensuring all test articles and
documentation undergo complete peer reviews and the acceptance/approval process.

4.2. Recycle Procedures


This paragraph documents the recycle process for the system. A recycle is any condition
which causes the original release (software) to be modified after TRR I. It is policy that
all recycles, regardless of size or complexity, follow the same disciplined test
methodology. No changes are introduced into the government QT&E test environment
without going through the CV&I phase.

4.3. Watch Item (WIT)/Deficiency Review Board (DRB) Procedures


This paragraph briefly describes the process used by the Watch Item/Deficiency Review
Board (WIT/DRB) to adjudicate problem reports identified and documented throughout
the test life cycle. “Watch items are unique to test and evaluation and are used as a
method to observe identified conditions which do not fully satisfy deficiency report
submission criteria.” Reference T.O. 00-35D-54, chapter 2.

NOTE: For the purpose of this plan, a WIT is also known as a PR.

4.4. Hewlett-Packard (HP®) Quality Center Standard Configurations


HP® Quality Center is the preferred LDTO tool used to manage DT&E test phases and
segments. The following paragraphs detail the standard configurations implemented.

NOTE: Developers who do not use HP® Quality Center will ensure the following
subparagraphs are addressed.

4.4.1. Problem Report (PR) Severity Codes and Definitions


The PR Severity Codes and Definitions configured for use in HP® Quality Center are as
follows:

Problem Report Severity Codes and Definitions


1 - This severity denotes a problem that prevents accomplishment of essential capability or
jeopardizes safety, or other requirements designated as “Critical.” No further testing can be
accomplished until the problem is adequately resolved. (Critical)
2 - This severity denotes a problem that adversely affects the accomplishment of an essential
capability or adversely affects costs, technical or scheduled risks to the project or to the life cycle
support of the system and no work around solution is known. The next test phase/segment cannot
be started until the problem is adequately resolved. (Major)
3 - This severity denotes a problem that adversely affects the accomplishment of an essential
capability or adversely affects costs, technical or scheduled risks to the project or to the life cycle
support of the system and a work around solution is known. Testing may continue while the
problem is resolved. (Minor)
4 - This severity denotes a problem that results in operator inconvenience or annoyance, but does
not affect a required operational or mission-essential capability or results in inconvenience or
annoyance for development or maintenance personnel, but does not prevent the accomplishment of
the responsibilities of those personnel. (Average)
5 - This severity denotes any other condition. (Generally, enhancements will fall in this severity.)
(Other)
Table 5 - PR Severity Codes and Definitions
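
NOTE (illustrative only): Programs that track PRs programmatically may represent the Table 5 scale as an enumeration, as in the following Python sketch; this is not an HP® Quality Center interface, it merely mirrors the codes above.

```python
# Illustrative enumeration of the Table 5 severity codes. This is not an
# HP Quality Center API; it only mirrors the codes and short labels above.
from enum import IntEnum


class PrSeverity(IntEnum):
    CRITICAL = 1  # prevents essential capability or jeopardizes safety; testing stops
    MAJOR = 2     # essential capability affected, no workaround; next segment blocked
    MINOR = 3     # essential capability affected, workaround known; testing continues
    AVERAGE = 4   # operator or maintainer inconvenience only
    OTHER = 5     # any other condition, generally enhancements


def blocks_further_testing(severity: PrSeverity) -> bool:
    """Severity 1 halts testing until the problem is adequately resolved."""
    return severity is PrSeverity.CRITICAL


print(blocks_further_testing(PrSeverity.CRITICAL), blocks_further_testing(PrSeverity.MINOR))
```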

4.4.2. Problem Report (PR) Management Life Cycle


The PR Management Life Cycle process depicted in figure 1 is used for documenting and
managing problem reports noted during DT&E test phases and segments.


Figure 1 - PR Life Cycle

4.4.3. Problem Report (PR) Status Codes and Definitions


The PR Status Codes and Definitions configured for use in HP® Quality Center are as
follows:

Status | Definition | Constraints
New | The PR has been newly authored and has yet to be reviewed by the Deficiency Review Board (DRB) or other designated evaluation authority. | Default status when submitted. State must be Awaiting Evaluation.
Open | The PR is accepted by the evaluation authority and item is opened for analysis and resolution. | State must be Evaluation, In Progress, Referred, Tabled, Watch Item, Recycled, or Action Item.
Fixed | Fix actions are complete. The disposition of a PR will be changed to "QA Validation" to be re-tested/verified by the appropriate test agency. | State must be QA Validation.
Closed | The PR fix has been verified and validated by the appropriate QA test agency (that is, problem resolved and verified); the PR has been determined to be an approved system enhancement to be tracked as a System Change Request; the PR has been determined to be invalid and no other action is required; or the PR has been determined to be a standing problem that must be tracked in another DR tracking system. Though "Closed," the PR remains in the HP® Quality Center database marked with its Disposition. | State must be Rejected, Remedy DR Created, Withdrawn, Duplicate, Validated, or Change Request.
Reopen | PR has been previously in Closed Status, but has recurred in other test actions and must be reworked. | State must be Evaluation, In Progress, Referred, Tabled, Watch Item, Recycled, or Action Item.

Table 6 - PR Status Codes, Definitions, and Constraints

4.4.4. Problem Report (PR) State Codes and Definitions


The PR State Codes and Definitions configured for use in HP® Quality Center are as
follows:

PR State | Definition | Constraints
Awaiting Evaluation | Awaiting analysis by the government or contractor. | Default State when submitted. Valid with New Status.
Evaluation | Resolution is being analyzed by the government or contractor. | Valid with Open or Reopen Status.
In Progress | Resolution is being implemented by the government or contractor. | Only valid with Open or Reopen Status.
Referred | PR referred to another organization for resolution, but still tracked until closed. | Only valid with Open or Reopen Status.
Tabled | Decision delayed until some later time, usually until the PR is reviewed and determination made by the appropriate agency. | Only valid with Open or Reopen Status.
Action Item | PR is related to a problem, but the resolution is not related to the system functionality, and requires resolution by an organization outside of the test or acquisition community. | Only valid with Open or Reopen Status.
Recycled | PR has been retested and has not been satisfactorily corrected. The PR is being returned to the development team for action. | Valid with Open or Reopen Status.
Awaiting QA Validation | Awaiting test by a designated test agency. | Only valid with Fixed Status.
QA Validation | PR resolution is being tested and verified by a designated test agency. | Only valid with Fixed Status.
Validated | PR has been retested and corrective action has corrected the problem identified. | Only valid with Closed Status.
Rejected | PR is assessed to be invalid, and is not a problem. | Only valid with Closed Status.
Deficiency Report Created | DR has been created in Remedy database or other deficiency tracking system. Reference must be made to the DR Tracking Number. | Only valid with Closed Status.
Withdrawn | PR withdrawn at the originator’s request, or as suggested by PR reviewing agency. | Only valid with Closed Status.
Duplicate | PR describes a deficiency or enhancement already addressed by an earlier PR. Reference must be made to the PR ID of that earlier PR. | Only valid with Closed Status.
Change Request | PR is determined to be a System Change Request (SCR) and closed. Information contained in the PR will be added as an SCR in the local configuration management database. | Only valid with Closed Status.

Table 7 - PR State Codes, Definitions, and Constraints
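
NOTE (illustrative only): Tables 6 and 7 together constrain which PR States are legal for each PR Status. The following Python sketch encodes those constraint columns as a simple cross-check; it is an illustration under stated assumptions, not an HP® Quality Center workflow definition.

```python
# Illustrative cross-check of PR Status (Table 6) against PR State (Table 7).
# The mapping transcribes the constraint columns of the tables above; Fixed is
# shown with both states that Table 7 permits. Not an HP Quality Center API.

VALID_STATES = {
    "New": {"Awaiting Evaluation"},
    "Open": {"Evaluation", "In Progress", "Referred", "Tabled",
             "Watch Item", "Recycled", "Action Item"},
    "Fixed": {"Awaiting QA Validation", "QA Validation"},
    "Closed": {"Validated", "Rejected", "Deficiency Report Created",
               "Withdrawn", "Duplicate", "Change Request"},
    "Reopen": {"Evaluation", "In Progress", "Referred", "Tabled",
               "Watch Item", "Recycled", "Action Item"},
}


def is_valid_combination(status: str, state: str) -> bool:
    """True when the state is permitted for the given status per Tables 6 and 7."""
    return state in VALID_STATES.get(status, set())


assert is_valid_combination("New", "Awaiting Evaluation")
assert not is_valid_combination("Closed", "In Progress")
```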


5. Glossary
This paragraph identifies acronyms and their expanded definitions used throughout the
document (including attachments).


Appendix A - Sample Test Objectives Table


Table 8 is a list of sample test objectives which may be used within the CV&I and/or QT&E phases.

Test Segment | Test Objective
ICV | Validate each component operates as designed.
CIT | Validate that all components integrate into the application as designed.
ROT | Validate all desired capabilities function properly and meet specific functional and technical requirements.
ROT | Validate all Severity 1 or 2 PRs have been closed or the PMO and stakeholders have accepted the associated risk before moving forward into the QT&E test phase.
PET | Validate, through Performance Evaluation Test (PET) techniques such as bandwidth analysis, load testing, and stress testing, this release operates effectively and meets any documented performance-related Key Performance Parameters (KPPs).
CSE | Evaluate, through the Cybersecurity Evaluation (CSE), how well this release controls information-related risks (including assessment of network security policies, identification and authentication, access controls, auditing, and the confidentiality, integrity, and availability of data and the delivery systems).
CSE | Validate, through Security Readiness Review (SRR) Scans and remediation activities, all Category (CAT) I and CAT II vulnerabilities are properly addressed.
SIT | Validate, through System Integration Testing (SIT), implementation procedures are complete, accurate, and can be used to properly execute system installation.
DM | Validate, through Data Conversion tests, evaluation, and inspection, conversion methods or routines execute properly and data is converted correctly.
SOE | Validate the system functions properly and meets specified functional and technical requirements.
SOE | Validate, through Regression Testing, that existing capabilities and/or functionality are not diminished or damaged by installation.
SOE | Validate this release operates properly with the current Standard Desktop Configuration (SDC) version.
SOE | Validate all Severity 1 or 2 PRs have been closed or the PMO and stakeholders have accepted the associated risk before moving into the intended production environment.
ROT / SOE | Validate system interoperability through exhaustive execution of system interfaces.
UET | Evaluate, through UET conducted by end users, how effectively this release meets the usability, functionality, and mission requirements of the user community.
UET | Validate the Requirements, Problem Reports (PRs), Deficiency Reports (DRs), Engineering Change Proposals (ECPs), Change Requests (CRs), and/or capabilities have been corrected or implemented properly.
UET | Validate all Severity 1 or 2 PRs have been closed or the PMO and stakeholders have accepted the associated risk before moving into the intended production environment.

Table 8 – Test Objectives
