3.4.1 The project shall establish and maintain:
a. Software Test Plan(s).
b. Software Test Procedure(s).
c. Software Test Report(s).
The requirements for the content of a Software Test Plan, Software Test Procedure, and Software Test Report are defined in Chapter 5 [of NPR 7150.2, NASA Software Engineering Requirements, Sections 5.1.3, 5.2.6, and 5.3.2, respectively].
1.2 Applicability Across Classes
Class D (not safety critical) and Class G are labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to satisfy this requirement.
Having plans and procedures in place increases the likelihood that all necessary and required tasks are performed and performed consistently. Development of plans and procedures provides the opportunity for stakeholders to give input and assist with the documentation and tailoring of the planned testing activities to ensure the outcome will meet the expectations and goals of the project. Test reports ensure that results of verification activities are documented and stored in the configuration management system for use in acceptance reviews or readiness reviews.
Following templates for test plans, procedures, and reports promotes consistency of documents across projects, ensures proper planning occurs, ensures the appropriate activities and results are captured, and prevents repeating problems of the past.
Projects create test plans, procedures, and reports following the content requirements in SWE-104, SWE-114, and SWE-118. This Handbook provides guidance for each of these NPR 7150.2 requirements.
Once these documents are created, they need to be maintained to reflect current project status, progress, and plans, which will change over the life of the project. When requirements change (SWE-071), test plans, procedures, and the resulting test reports may also need to be updated or revised to reflect the changes. Changes to test plans and procedures may result from:
- Inspections/peer reviews of documentation.
- Inspections/peer reviews of code.
- Design changes.
- Code maturation and changes (e.g., code changes to correct bugs or problems found during testing, interfaces revised during development).
- Availability of relevant test tools that were not originally part of the test plan (e.g., tools freed up from another project, funding becomes available to purchase new tools).
- Updated software hazards and mitigations (e.g., new hazards identified, hazards eliminated, mitigations are added or revised).
- Execution of the tests (e.g., issues found in test procedures).
- Test report/results analysis (e.g., incomplete, insufficient requirements coverage).
- Changes in test objectives or scope.
- Changes to schedule, milestones, or budget.
- Changes in test resource numbers or availability (e.g., personnel, tools, facilities).
- Changes to software classification or safety criticality (e.g., a research project not intended for flight becomes destined for use on the International Space Station (ISS)).
- Process improvements relevant to test activities.
- Changes in the project that affect the software testing effort.
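Several of the triggers above (requirements changes, test report analysis showing insufficient requirements coverage) can be caught early with a simple traceability check before a test report is finalized. The sketch below is illustrative only; the function name and the requirement/test-case IDs are hypothetical, not drawn from NPR 7150.2:

```python
# Illustrative sketch (hypothetical data, not from NPR 7150.2): flag
# requirements that no test case traces to, so test plans and procedures
# can be updated when requirements change.

def find_uncovered_requirements(requirements, traceability):
    """Return requirement IDs with no associated test case.

    requirements: iterable of requirement IDs (e.g., "SRS-001")
    traceability: dict mapping requirement ID -> list of test case IDs
    """
    return sorted(
        req for req in requirements
        if not traceability.get(req)  # missing key or empty test list
    )

# Hypothetical example: SRS-003 was added by a requirements change,
# but no test case traces to it yet.
requirements = ["SRS-001", "SRS-002", "SRS-003"]
traceability = {
    "SRS-001": ["TC-01", "TC-02"],
    "SRS-002": ["TC-03"],
}

uncovered = find_uncovered_requirements(requirements, traceability)
print(uncovered)  # -> ['SRS-003']
```

A check like this is one way a project might detect the "incomplete, insufficient requirements coverage" condition noted above before the test report reaches an acceptance or readiness review.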
Just as the initial test plans, procedures, and reports require review and approval before use, the project team ensures that updates are also reviewed and approved following project procedures.
Maintaining accurate and current test plans, procedures, and reports continues into the operation and maintenance phases of a project.
Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to test plans, procedures, and reports, including templates and examples.
Additional guidance related to test plans, procedures, and reports may be found in related requirements in this Handbook.
4. Small Projects
No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.
6. Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to insufficiencies in software test plans:
- Aquarius Reflector Over-Test Incident (Procedures should be complete.) Lesson Number 2419. Lessons Learned No. 1 states: "The Aquarius Reflector test procedure lacked complete instructions for configuring the controller software prior to the test." Lesson Learned No. 4 states: "The roles and responsibilities of the various personnel involved in the Aquarius acoustic test operations were not clearly documented. This could lead to confusion during test operations."
- Planning and Conduct of Hazardous Tests Require Extra Precautions (2000-2001) (Special measures needed for potentially hazardous tests.) Lesson Number 0991. "When planning tests that are potentially hazardous to personnel, flight hardware or facilities (e.g., high/low temperatures or pressure, stored energy, deployables), special measures should be taken to ensure that:
- "Test procedures are especially well written, well organized, and easy to understand by both engineering and quality assurance personnel.
- "Known test anomalies that history has shown to be inherent to the test equipment or conditions (including their likely causes, effects, and remedies) are documented and included in pre-test training.
- "Readouts of safety-critical test control data are provided in an easily understood form (e.g., audible, visible or graphic format).
- "Test readiness reviews are held, and test procedures require confirmation that GSE test equipment and sensors have been properly maintained.
- "Quality assurance personnel are present and involved throughout the test to ensure procedures are properly followed, including prescribed responses to pre-identified potential anomalies."
- Test plans should reflect proper configurations: "Testing of the software changes was inadequate at the Unit, Integrated and Formal test level. In reviewing test plans...neither had test steps where BAT06 and BAT04 were running concurrently in a launch configuration scenario. Thus no test runs were done with the ... program that would reflect the new fully loaded console configuration. Had the launch configuration scenarios been included in integrated and acceptance testing, this might have revealed the code timing problems."
- Ensure Test Monitoring Software Imposes Limits to Prevent Overtest (2003) (Include test monitoring software safety steps.) Lesson Number 1529: Recommendation No. 2 states: "Prior to test, under the test principle of 'First, Do No Harm' to flight equipment, assure that test monitoring and control software is programmed or a limiting hardware device is inserted to prevent over-test under all conditions..."
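The over-test recommendation above can be illustrated with a minimal software limit check. This is a hypothetical sketch (the limit value, units, and function names are assumptions for illustration, not from the cited lesson): test monitoring software clamps any commanded level that would exceed a pre-approved safe bound.

```python
# Illustrative sketch (hypothetical values): a software limit that prevents
# a commanded test level from exceeding a pre-approved safe bound,
# consistent with the "First, Do No Harm" over-test recommendation.

SAFE_LIMIT_DB = 140.0  # hypothetical acoustic sound-pressure limit (dB)

def command_level(requested_db, safe_limit_db=SAFE_LIMIT_DB):
    """Return (level_to_command, clamped_flag).

    If the requested level exceeds the safe limit, command the limit
    instead and flag that clamping occurred so operators are alerted.
    """
    if requested_db > safe_limit_db:
        return safe_limit_db, True   # clamped: over-test prevented
    return requested_db, False

level, clamped = command_level(147.2)
print(level, clamped)  # -> 140.0 True
```

In practice, as the lesson notes, such a software limit may be backed by a limiting hardware device so that over-test is prevented under all conditions, including software failure.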