- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
4.5.7 The project manager shall update the software test and verification plan(s) and procedure(s) to be consistent with software requirements.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
2. Rationale
Software test plans and test procedures are the main tools used to ensure proper implementation of the requirements and are developed based on those requirements. Therefore, if the requirements change, the test plans and procedures must also change to ensure that the test activity is accurate, complete, and consistent with the requirements.
Software test plans and test procedures are a key element of ensuring that the requirements that specify a product are completely and accurately implemented; in other words, that the delivered product is the right product.
3. Guidance
3.1 Test Plans and Procedures
The team typically develops test plans and procedures as soon as the relevant phase in the life cycle has been completed. Once the initial documents have been created (see topic 7.18 - Documentation Guidance, 5.14 - Test - Software Test Procedures), it is important to keep them up-to-date as requirements change throughout the project life cycle. Continuous updates help avoid delays in testing caused by waiting until testing begins to update the test plans and procedures to accurately reflect the requirements.
Test documentation that may require updating includes:
- System test plan and procedures.
- Acceptance test plan and procedures.
- Unit test plans and procedures.
- Regression test plans and procedures.
- End-to-end test plans and procedures.
- Integration test plans and procedures.
- Test cases.
- Test scripts.
- Test data.
- Test schedule.
- Traceability matrix.
- Test effort estimates.
See also SWE-065 - Test Plan, Procedures, Reports.
3.2 Testing Traceability Matrix
Using a traceability matrix that identifies the test plans, procedures, scripts, test cases and even test data associated with each requirement can be helpful when determining the effect of requirements change on the testing documentation and plans.
It may be helpful to include as a checklist item in the relevant life cycle reviews confirmation that test documentation has been updated to reflect any requirements changes made at that point in the project life cycle. Additionally, if checklists for the test plans and procedures are used, consider including a checklist item to confirm that the plans and procedures are consistent with the software requirements; it may be helpful to repeat those checklists when making revisions to the test plans and procedures. See also SWE-052 - Bidirectional Traceability.
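The impact lookup a traceability matrix enables can be sketched in a few lines of Python. This is a minimal illustration only; the requirement and artifact identifiers (SRS-xxx, TP-xxx, TC-xxx) are hypothetical, not drawn from any NASA project or tool:

```python
# Minimal sketch of a testing traceability matrix: each requirement maps
# to the test artifacts (plans, cases, scripts, data) that verify it.
# All identifiers below are hypothetical examples.
trace_matrix = {
    "SRS-101": ["TP-SYS-01", "TC-014", "SCRIPT-api_smoke"],
    "SRS-102": ["TP-SYS-01", "TP-ACC-02", "TC-027"],
    "SRS-103": ["TP-INT-03", "TC-031", "DATA-thermal_profile"],
}

def artifacts_affected_by(changed_reqs):
    """Return the set of test artifacts that may need updating when
    the given requirements change."""
    affected = set()
    for req in changed_reqs:
        affected.update(trace_matrix.get(req, []))
    return affected

def reqs_without_tests(all_reqs):
    """Flag requirements that have no associated test artifacts."""
    return [r for r in all_reqs if not trace_matrix.get(r)]
```

For example, `artifacts_affected_by(["SRS-101"])` returns the three artifacts traced to that requirement, giving the test team a starting checklist of documentation to review after the change is approved.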
3.3 Requirements Changes Effects on Testing
It is also helpful for the project to establish a mechanism that notifies test plan developers when changes to the requirements are approved. Consider:
- Providing copies of approvals for change requests that affect requirements to the Software Lead Engineer.
- Providing copies of Change Control Board (CCB) minutes so test plan developers can check for approved requirements changes.
- Including a test team representative as part of the CCB or other authorized groups responsible for approving changes to requirements.
3.4 Configuration Management and Testing
When updating test plans and procedures, use configuration management principles (see SWE-080 - Track and Evaluate Changes).
Consult Center Process Asset Libraries (PALs) for center-specific guidance and resources related to keeping test plans and procedures current as requirements change.
3.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.6 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki (SWEREF-197)
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
When developing the test procedures, be sure to add links or notes to other procedures that trace back to the same requirement. That way when a requirement changes, it will be easier to find all the places where the test documentation might need changes. Another alternative is to link the information through a traceability matrix that is maintained by the project.
5. Resources
5.1 References
- (SWEREF-072) Checklist for the Contents of Software Critical Design Review (CDR), 580-CK-008-02, Software Engineering Division, NASA Goddard Space Flight Center (GSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-505) Public Lessons Learned Entry: 345.
5.2 Tools
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
- Mars Observer Attitude Control Fault Protection (failure caused in part by incomplete testing), Lesson Number 0345 (SWEREF-505): "From the analyses performed after the Mars Observer [MO] mission failure, it became apparent that the MO fault protection suffered from a lack of top-down system engineering design approach. Most fault protection was in the category of low-level redundancy management. It was also determined that the MO fault protection software was never tested on the flight spacecraft before launch. Design fault protection to detect and respond to excessive attitude control errors, use RCS Thrusters to control excessive attitude control errors, and always test fault protection software on the flight spacecraft before launch."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Analyze that software test plans and software test procedures cover the software requirements and provide adequate verification of hazard controls, specifically the off-nominal scenarios.
7.2 Software Assurance Products
- The results of the analysis show software requirement and hazard control coverage by software testing, including corrective actions. Evidence may include artifacts analyzed, approach, results, and any peer reviews.
Objective Evidence
- Software test plan(s).
- Software test procedure(s).
- Traceability between the software requirements and the software test procedures.
- Test coverage metric data.
7.3 Metrics
- # of detailed software requirements tested to date vs. total # of detailed software requirements
- # of Non-Conformances identified when the approved, updated requirements are not reflected in test procedures
- # of Non-Conformances identified while confirming hazard controls are verified through test plans/procedures/cases
- # of software requirements with completed test procedures/cases over time
- # of Non-Conformances and risks open vs. # of Non-Conformances, risks identified with test procedures
- # of safety-related non-conformances identified by life cycle phase over time
- # of safety-related requirement issues (Open, Closed) over time
- # of Software Requirements without associated test cases
- # of Software Requirements being met via satisfactory testing vs. total # of Software Requirements
- # of Non-Conformances identified in plans (e.g., SMPs, SDPs, CM Plans, SA Plans, Safety Plans, Test Plans)
- # of software work product Non-Conformances identified by life cycle phase over time
- # of safety-related Non-Conformances
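Several of the metrics above are simple ratios over the requirements set and can be computed directly from the traceability data. The sketch below, using hypothetical status records and field names, shows two of them: requirements tested to date vs. total, and requirements without associated test cases:

```python
# Hypothetical per-requirement status records; the field names
# ("tested", "has_test_case") are illustrative only.
requirements = [
    {"id": "SRS-101", "tested": True,  "has_test_case": True},
    {"id": "SRS-102", "tested": False, "has_test_case": True},
    {"id": "SRS-103", "tested": False, "has_test_case": False},
]

def coverage_metrics(reqs):
    """Compute example SA metrics: requirements tested vs. total,
    and requirements lacking any test case."""
    total = len(reqs)
    tested = sum(1 for r in reqs if r["tested"])
    no_cases = sum(1 for r in reqs if not r["has_test_case"])
    return {
        "tested_vs_total": f"{tested}/{total}",
        "pct_tested": round(100.0 * tested / total, 1) if total else 0.0,
        "reqs_without_test_cases": no_cases,
    }
```

Tracking these values over time (per build or per life cycle phase) gives the trend data several of the metrics above call for.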
See also Topic 8.18 - SA Suggested Metrics.
7.4 Guidance
Software assurance will analyze the software test plans and test procedures to make sure that all the requirements are being tested, particularly all those which are safety-related. There should be good coverage of all operational scenarios, as well as the off-nominal scenarios and boundary conditions. All safety features, controls, mitigations, and other safety-related requirements should be covered in the test plans. Test planning should cover unit testing, component testing, integration testing, acceptance testing, stress testing, interface testing, and regression testing. See also Topic 8.01 - Off Nominal Testing.
For guidance on what should be in a test plan and a test procedure, go to 7.18 - Documentation Guidance and look at the minimum recommended content for test plans and test procedures. It is important to analyze test documentation and update it as the requirements change throughout the life cycle. When requirements change, documentation that may need updating includes:
- System test plan and procedures.
- Acceptance test plan and procedures.
- Unit test plans and procedures.
- Regression test plans and procedures.
- End-to-end test plans and procedures.
- Integration test plans and procedures.
- Test cases.
- Test scripts.
- Test data.
- Test schedule.
- Traceability matrix.
- Test effort estimates.
As testing progresses and changes are identified in the software, software assurance will need to check the trace matrices to see what else is affected by each change (e.g., did it cause a requirements change?). If a requirement changes, test documentation updates may be necessary.
When testing is done to verify that a defect fix or change has been implemented correctly, a carefully chosen set of regression tests should also be run to verify that previously tested requirements still pass and that safety-related code still works as intended. See the software assurance guidance in SWE-191 - Software Regression Testing for selecting regression test sets.
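One common selection strategy, sketched below under assumed data structures (test and requirement identifiers are hypothetical), is to rerun every test that traces to a changed requirement plus all safety-related tests, regardless of trace:

```python
# Sketch of regression test selection from a trace matrix.
# Each test maps to the requirements it verifies; identifiers are
# hypothetical examples, not any project's actual naming scheme.
test_to_reqs = {
    "TC-014": {"SRS-101"},
    "TC-027": {"SRS-102"},
    "TC-031": {"SRS-103"},
}

# Safety-related tests are always included in the regression set.
safety_tests = {"TC-031"}

def regression_set(changed_reqs):
    """Select tests tracing to any changed requirement, plus all
    safety-related tests."""
    changed = set(changed_reqs)
    impacted = {t for t, reqs in test_to_reqs.items() if reqs & changed}
    return impacted | safety_tests
```

This is deliberately conservative: a real selection would also weigh code-level impact analysis, but always retaining the safety-related set reflects the guidance above.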
See also SWE-194 - Delivery Requirements Verification.
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: