1. Requirements
4.5.2 The project manager shall establish and maintain:
a. Software test plan(s).
b. Software test procedure(s).
c. Software test report(s).
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Class:       A  B  CSC  C  D  DSC  E  F  G  H
Applicable:  X  X  X    X  X  X    -  X  X  -

Key: X = applicable; - = not applicable. CSC and DSC denote safety-critical Class C and Class D software, respectively.
2. Rationale
Having plans and procedures in place ensures that all necessary and required tasks are performed and performed consistently. Development of plans and procedures provides the opportunity for stakeholders to give input and assist with the documentation and tailoring of the planned testing activities to ensure the outcome will meet the expectations and goals of the project. Test reports ensure that results of verification activities are documented and stored in the configuration management system for use in acceptance reviews or readiness reviews.
Ensuring that test plans, procedures, and reports follow templates promotes consistency of documents across projects, ensures that proper planning occurs and that the appropriate activities and results are captured, and helps prevent repeating problems of the past.
3. Guidance
Projects create test plans, procedures, and reports following the content recommendations in Topic 7.18 - Documentation Guidance.
The objective of software test procedures is to perform software testing in accordance with the following guidelines:
Software testing is performed to demonstrate to the project that the software requirements have been met, including all interface requirements.
If a software item is developed in multiple builds, its software testing will not be completed until the final build for that software item, or possibly until later builds involving items with which the software item is required to interface. Software testing in each build is interpreted to mean planning and performing tests of the current build of each software item to ensure that the software item requirements to be implemented in that build have been met. (SWEREF-478)
Independence in software item testing
For Class A, Class B, and safety-critical Class C software, the person(s) responsible for software testing of a given software item should not be the person(s) who performed the detailed design, implementation, or unit testing of that software item. This does not preclude persons who performed detailed design, implementation, or unit testing of the software item from contributing to the process, for example, by contributing test cases that rely on knowledge of the software item's internal implementation. (SWEREF-478)
Software Test Procedure Development Guidelines
The project should “establish test cases (in terms of inputs, expected results, and evaluation criteria), test procedures, and test data for testing the software.” (SWEREF-478)
The test cases and test procedures should cover the software requirements and design, including, as a minimum:
Correct execution of all interfaces (including those between software units), statements, and branches.
All error and exception handling.
All software unit interfaces, including limits and boundary conditions.
End-to-end functional capabilities.
Performance testing, including operational input and output data rates and timing and accuracy requirements.
Stress testing and worst-case scenario(s).
Fault detection, isolation, and recovery handling.
Resource utilization.
Hazard mitigations.
Start-up, termination, and restart (when applicable).
All algorithms.
Legacy and reuse software should be tested for all modified reuse software, for all reuse software units whose track record indicates potential problems, and for all critical reuse software components, even if the reuse software component has not been modified. (SWEREF-478)
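The coverage items above map naturally onto concrete test cases defined by inputs, expected results, and evaluation criteria. The following sketch (Python, pytest) is illustrative only: the clamp() routine, its limits, and the test values are hypothetical assumptions, not drawn from any NASA project, but the pattern shows boundary-condition coverage and an error-handling path.

import pytest

def clamp(value: float, low: float, high: float) -> float:
    """Hypothetical unit under test: constrain value to [low, high]."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

# Each tuple is one test case: (input value, low, high, expected result).
BOUNDARY_CASES = [
    (5.0, 0.0, 10.0, 5.0),    # nominal, inside the range
    (0.0, 0.0, 10.0, 0.0),    # exactly on the lower boundary
    (10.0, 0.0, 10.0, 10.0),  # exactly on the upper boundary
    (-0.1, 0.0, 10.0, 0.0),   # just below the lower limit
    (10.1, 0.0, 10.0, 10.0),  # just above the upper limit
]

@pytest.mark.parametrize("value,low,high,expected", BOUNDARY_CASES)
def test_clamp_boundaries(value, low, high, expected):
    # Evaluation criterion: exact equality with the expected result.
    assert clamp(value, low, high) == expected

def test_clamp_rejects_inverted_range():
    # Error/exception handling path: inverted limits must raise.
    with pytest.raises(ValueError):
        clamp(1.0, 10.0, 0.0)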
All software testing should be in accordance with the defined test cases and procedures.
“Based on the results of the software testing, the developer [should] make all necessary revisions to the software, perform all necessary retesting, update the SDFs and other software products as needed... Regression testing ... [should] be performed after any modification to previously tested software.” (SWEREF-478)
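As an illustration of retesting after a modification, the sketch below (Python, unittest) uses a hypothetical average() routine and a hypothetical defect; the regression pattern it shows is that existing tests are re-run unchanged, and a new test pins the corrected defect so it cannot silently reappear.

import unittest

def average(samples):
    """Hypothetical routine that once crashed on an empty input list."""
    if not samples:  # the fix, now covered by a regression test
        return 0.0
    return sum(samples) / len(samples)

class AverageRegressionTests(unittest.TestCase):
    def test_nominal_average(self):
        # Pre-existing behavior that must still hold after the change.
        self.assertAlmostEqual(average([1.0, 2.0, 3.0]), 2.0)

    def test_empty_input_regression(self):
        # Pins the fixed defect: empty input previously raised
        # ZeroDivisionError; the agreed behavior is to return 0.0.
        self.assertEqual(average([]), 0.0)

if __name__ == "__main__":
    unittest.main()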
Testing on the target computer system
Software testing should be performed using the target hardware. The target hardware used for software qualification testing should be as close as possible to the operational target hardware and should be in a configuration as close as possible to the operational configuration. (SWEREF-478) (See SWE-073.) Typically, a high-fidelity simulation has the same processor, processor performance, timing, memory size, and interfaces as the target system.
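One common way to manage the split between host-based testing and testing on the target computer system is to mark tests that are only meaningful on the target (or a high-fidelity simulator). The sketch below (Python, pytest) is illustrative only; the ON_TARGET_HW environment flag and the timing budget are assumptions, not a NASA or facility convention.

import os
import time

import pytest

# Hypothetical flag set by the target test station or simulator host.
ON_TARGET = os.environ.get("ON_TARGET_HW") == "1"

requires_target = pytest.mark.skipif(
    not ON_TARGET,
    reason="requires the target (or high-fidelity simulator) configuration",
)

def test_pure_logic_runs_anywhere():
    # Hardware-independent logic can be exercised on a host machine.
    assert sorted([3, 1, 2]) == [1, 2, 3]

@requires_target
def test_timing_on_target_hardware():
    # Timing checks are only meaningful where the processor, memory, and
    # interfaces match the operational system.
    start = time.perf_counter()
    _ = [x * x for x in range(10_000)]  # stand-in for the timed operation
    elapsed = time.perf_counter() - start
    assert elapsed < 0.1  # illustrative timing budget, not a real requirement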
Software Assurance Witnessing
The software test procedure developer should “dry run the software item test cases and procedures to ensure that they are complete and accurate and that the software is ready for witnessed testing. The developer should record the results of this activity in the appropriate SDFs and should update the software test cases and procedures as appropriate.” (SWEREF-478)
Formal and acceptance software testing are witnessed by software assurance personnel to verify satisfactory completion and outcome. Software assurance is required to witness or review/audit results of software testing and demonstration.
Software Test Report Guidance
The software tester is required to analyze the results of the software testing and record the test and analysis results in the appropriate test report.
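Test results and the accompanying analysis can be captured in a machine-readable form that feeds the test report. The sketch below is illustrative only; the field names and test identifiers are hypothetical, and real projects follow their Center's test report template.

import json
from datetime import datetime, timezone

def write_test_report(results, path="test_report.json"):
    """results: list of dicts with 'test_id', 'outcome', and 'notes' keys."""
    report = {
        "generated": datetime.now(timezone.utc).isoformat(),
        "total": len(results),
        "passed": sum(1 for r in results if r["outcome"] == "pass"),
        "failed": sum(1 for r in results if r["outcome"] == "fail"),
        "results": results,  # per-test outcome plus analysis notes
    }
    with open(path, "w") as fh:
        json.dump(report, fh, indent=2)
    return report

write_test_report([
    {"test_id": "TP-001", "outcome": "pass", "notes": "nominal startup"},
    {"test_id": "TP-002", "outcome": "fail", "notes": "timing margin exceeded"},
])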
Software Test Documentation Maintenance
Once these documents are created, they need to be maintained to reflect current project status, progress, and plans, which will change over the life of the project. When requirements change (SWE-071), test plans, procedures, and the resulting test reports may also need to be updated or revised to reflect the changes. Changes to test plans and procedures may result from:
Inspections/peer reviews of documentation.
Inspections/peer reviews of code.
Design changes.
Code maturation and changes (e.g., code changes to correct bugs or problems found during testing, interfaces revised during development).
Availability of relevant test tools that were not originally part of the test plan (e.g., tools freed up from another project, funding becomes available to purchase new tools).
Updated software hazards and mitigations (e.g., new hazards identified, hazards eliminated, mitigations are added or revised).
Execution of the tests (e.g., issues found in test procedures).
Test report/results analysis (e.g., incomplete, insufficient requirements coverage).
Changes in test objectives or scope.
Changes to schedule, milestones, or budget.
Changes in test resource numbers or availability (e.g., personnel, tools, facilities).
Changes to software classification or safety criticality (e.g., a research project not intended for flight becomes destined for use on the ISS (International Space Station)).
Process improvements relevant to test activities.
Changes in the project that affect the software testing effort.
Just as the initial test plans, procedures, and reports require review and approval before use, the project team ensures that updates are also reviewed and approved following project procedures.
Maintaining accurate and current test plans, procedures, and reports continues into the operation and maintenance phases of a project.
Note: NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to test plans, procedures, and reports, including templates and examples.
NASA-specific test documentation information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
Additional guidance related to test plans, procedures, and reports may be found in related requirements in this Handbook.

4. Small Projects

No additional guidance is available for small projects.
5. Resources
6. Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to insufficiencies in software test plans:
Aquarius Reflector Over-Test Incident (Procedures should be complete.) Lesson Number 2419. Lesson No. 1 states: "The Aquarius Reflector test procedure lacked complete instructions for configuring the controller software prior to the test." Lesson No. 4 states: "The roles and responsibilities of the various personnel involved in the Aquarius acoustic test operations were not clearly documented. This could lead to confusion during test operations." (SWEREF-573)
Planning and Conduct of Hazardous Tests Require Extra Precautions (2000-2001) (Special measures needed for potentially hazardous tests.) Lesson Number 0991. "When planning tests that are potentially hazardous to personnel, flight hardware or facilities (e.g., high/low temperatures or pressure, stored energy, deployables), special measures should be taken to ensure that:
"Test procedures are especially well written, well organized, and easy to understand by both engineering and quality assurance personnel.
"Known test anomalies that history has shown to be inherent to the test equipment or conditions (including their likely causes, effects, and remedies) are documented and included in pre-test training.
"Readouts of safety-critical test control data are provided in an easily understood form (e.g., audible, visible or graphic format).
"Test readiness reviews are held, and test procedures require confirmation that GSE test equipment and sensors have been properly maintained.
"Quality assurance personnel are present and involved throughout the test to ensure procedures are properly followed, including prescribed responses to pre-identified potential anomalies."
Swerefn
refnum
579
Test plans should reflect proper configurations: "Testing of the software changes was inadequate at the Unit, Integrated and Formal test level. In reviewing test plans...neither had test steps where BAT06 and BAT04 were running concurrently in a launch configuration scenario. Thus no test runs were done with the ... program that would reflect the new fully loaded console configuration. Had the launch configuration scenarios been included in integrated and acceptance testing, this might have revealed the code timing problems." (SWEREF-581)
Ensure Test Monitoring Software Imposes Limits to Prevent Overtest (2003) (Include test monitoring software safety steps.) Lesson Number 1529: Recommendation No. 2 states: "Prior to test, under the test principle of 'First, Do No Harm' to flight equipment, assure that test monitoring and control software is programmed or a limiting hardware device is inserted to prevent over-test under all conditions..."
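As a rough illustration of Recommendation No. 2, the sketch below shows test-monitoring software that enforces a not-to-exceed limit and cuts the test on an exceedance. Everything here is a simulated stand-in: the sensor read, the abort command, and the limit value are hypothetical, not a real facility interface.

import random
import time

LIMIT_DB = 140.0    # illustrative not-to-exceed level, not a real requirement
POLL_SECONDS = 0.05

def read_sound_level() -> float:
    """Stand-in for the facility's data-acquisition call (simulated here)."""
    return random.uniform(120.0, 145.0)

def stop_test() -> None:
    """Stand-in for the facility's abort/shutdown command."""
    print("ABORT: drive signal cut")

def monitor(duration_s: float) -> bool:
    """Return False (and stop the test) if the limit is ever exceeded."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        if read_sound_level() >= LIMIT_DB:
            stop_test()   # 'First, Do No Harm': cut the test first,
            return False  # then investigate the exceedance
        time.sleep(POLL_SECONDS)
    return True

if __name__ == "__main__":
    completed = monitor(duration_s=2.0)
    print("test completed" if completed else "test aborted on limit")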