

[Project Name] Test Plan

[Document Version Number]


[Date]

Project Team:
[Name] [Role]
[Name] [Role]
[Name] [Role]
[Name] [Role]
[Name] [Role]

Document Author(s):
[Name]
Project Sponsor:
[Name]
I. Introduction
This document serves as the plan for testing all software artifacts and for reporting the test results.

II. Test Plan


Use the template below to specify the black box test cases you will run on your code. Every requirement must have
at least one test case. Given equivalence class partitioning, boundary value analysis, and diabolical test cases,
each requirement will likely need several.
Test ID | Description | Expected Results | Actual Results
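
For illustration, a filled-in row for a hypothetical requirement 3 (the requirement, steps, and values are invented for this example) might read:

3.2 | Enter an age of 18 on the registration form and submit. | The form accepts the value and registration proceeds. | Pass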

Where:
- Test ID is a unique identifier for the test case. The identifier should relate back to the particular requirement the test case verifies. For example, if your naming scheme for requirements is numbers, test cases for requirement 3 could have Test IDs 3.1, 3.2, etc. Acceptance test cases must end the Test ID with a *.
- Description should clearly document the steps needed to run the test case. Write the description specifically enough that any team member can run the test case, even if its author is not present.
- Expected Results is a statement of what should happen when the test case is run.
- Actual Results indicates whether the test case is currently passing or failing when it is run. The actual results can be recorded simply as Pass or Fail, but it is also helpful to describe what happened when a test case fails.

Ultimately, your customer should agree to each test case. Writing test cases this specifically often sharpens the team's understanding of the requirements.
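
To make the technique concrete, the sketch below shows how the equivalence classes and boundary values for the same hypothetical requirement 3 ("the system shall accept ages from 18 to 65, inclusive") might be automated. The accept_age function, the age limits, and the use of pytest are assumptions made for this illustration, not part of the template.

import pytest

def accept_age(age: int) -> bool:
    # Placeholder for the system under test; behavior taken from the
    # hypothetical requirement 3 stated above.
    return 18 <= age <= 65

# Equivalence classes: below 18 (invalid), 18-65 (valid), above 65 (invalid).
# Boundary value analysis: test at each boundary and just outside it.
@pytest.mark.parametrize("test_id, age, expected", [
    ("3.1", 17, False),  # just below the lower boundary (invalid class)
    ("3.2", 18, True),   # lower boundary (valid class)
    ("3.3", 40, True),   # representative value from the valid class
    ("3.4", 65, True),   # upper boundary (valid class)
    ("3.5", 66, False),  # just above the upper boundary (invalid class)
])
def test_requirement_3_age_validation(test_id, age, expected):
    # Compare expected and actual results, mirroring the table columns above.
    assert accept_age(age) == expected, f"Test {test_id} failed"

Each test ID maps back to requirement 3, matching the naming scheme described above.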

III. Testing Deliverables


Specify the planned testing deliverables, which may include:
- Test Design Specification
- Test Case Specification
- Test Procedure Specification
- Test Log
- Test Incident Report
- Test Summary Report
- Test Input and Output Data

IV. Environmental Requirements


Specify the environmental needs for conducting tests:
- Hardware, communications and system software, other supplies, etc.
- Level of security
- Testing tools

V. Staffing
Specify testing responsibilities, staffing and training needs.

VI. Schedule
Specify the testing schedule.

VII. Risks and Contingencies


Specify any potential risks and the plans for mitigating, addressing, and/or resolving them.

VIII. Approvals
List any approvals / signatures required to sign off on test results.

IX. Document Revision History

Version | Name(s) | Date | Change Description
File version number. | Name of individual(s) responsible for the change. | Date of change. | Description of the changes made to the file.
