1. The Requirement
2. Rationale
3. Guidance
4. Small Projects
5. Resources
6. Lessons Learned
7. Software Assurance

1. Requirements


4.5.7 The project manager shall update the software test plan(s) and the software test procedure(s) to be consistent with software requirements.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

See SWE-071 History to view the history of this requirement.

1.3 Applicability Across Classes



2. Rationale

Software test plans and test procedures are the main tools used to ensure proper implementation of the requirements and are developed based on those requirements. Therefore, if the requirements change, the test plans and procedures must also change to ensure that the test activity is accurate, complete, and consistent with the requirements.


Software test plans and test procedures are a key element of ensuring that the requirements that specify a product are completely and accurately implemented; in other words, that the delivered product is the right product. 


3. Guidance

The team typically develops test plans and procedures as soon as the relevant phase of the life cycle has been completed. Once the initial documents have been created (see Topic 7.18 - Documentation Guidance), it is important to keep them up to date as requirements change throughout the project life cycle. Continuous updates help avoid delays in testing caused by waiting until testing begins to update the test plans and procedures to accurately reflect the requirements.

Test documentation that may require updating includes:

  • System test plan and procedures.
  • Acceptance test plan and procedures.
  • Unit test plans and procedures.
  • Regression test plans and procedures.
  • End-to-end test plans and procedures.
  • Integration test plans and procedures.
  • Test cases.
  • Test scripts.
  • Test data.
  • Test schedule.
  • Traceability matrix.
  • Test effort estimates.

Using a traceability matrix that identifies the test plans, procedures, scripts, test cases, and even test data associated with each requirement can be helpful when determining the effect of a requirements change on the testing documentation and plans.
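As an illustrative sketch (not an implementation prescribed by this requirement), a traceability matrix can be modeled as a simple mapping from requirement IDs to test artifacts; all requirement IDs and artifact names below are hypothetical examples:

```python
# Minimal sketch of a requirements-to-test traceability matrix.
# Requirement IDs and artifact names are hypothetical examples.

trace_matrix = {
    "SRS-101": ["SysTest-04", "UnitTest-17", "TestData-A"],
    "SRS-102": ["SysTest-04", "AcceptTest-02"],
    "SRS-103": ["RegressionSuite-1", "Script-EndToEnd"],
}

def affected_artifacts(changed_requirements):
    """Return the set of test artifacts that must be reviewed or
    updated when the given requirements change."""
    affected = set()
    for req in changed_requirements:
        affected.update(trace_matrix.get(req, []))
    return affected

# Example: requirement SRS-102 was changed by an approved CCB action.
print(sorted(affected_artifacts(["SRS-102"])))
# -> ['AcceptTest-02', 'SysTest-04']
```

A lookup like this makes the impact of an approved change visible immediately, rather than discovering stale test procedures at the start of a test campaign.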

It may be helpful to include, as a checklist item in the relevant life cycle reviews, confirmation that test documentation has been updated to reflect any requirements changes made at that point in the project life cycle. Additionally, if checklists are used for the test plans and procedures, consider including an item to confirm that the plans and procedures are consistent with the software requirements; repeating those checklists when revising the test plans and procedures may also be helpful.

It may also be helpful if the project can establish a mechanism by which the test plan developers are notified of changes to the requirements once those changes are approved. Consider:

  • Providing copies of approvals for change requests that affect requirements to the Software Lead Engineer.
  • Providing copies of Change Control Board (CCB) minutes so test plan developers can check for approved requirements changes.
  • Including a test team representative as part of the CCB or other authorized groups responsible for approving changes to requirements.

When updating test plans and procedures, use configuration management principles (see SWE-080).


Consult Center Process Asset Libraries (PALs) for center-specific guidance and resources related to keeping test plans and procedures current as requirements change.

Additional guidance related to test plans and procedures may be found in related requirements in this Handbook.


4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.


5. Resources

5.1 References


5.2 Tools

See the Tools Table in this Handbook for tools that support implementation of this requirement.


6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Mars Observer Attitude Control Fault Protection (failure caused in part by incomplete testing), Lesson Number 0345: "From the analyses performed after the Mars Observer [MO] mission failure, it became apparent that the MO fault protection suffered from a lack of top-down system engineering design approach. Most fault protection was in the category of low-level redundancy management. It was also determined that the MO fault protection software was never tested on the flight spacecraft before launch. Design fault protection to detect and respond to excessive attitude control errors, use RCS Thrusters to control excessive attitude control errors, and always test fault protection software on the flight spacecraft before launch."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.


7. Software Assurance

SWE-071 - Update Test Plans and Procedures

7.1 Tasking for Software Assurance

  1. Analyze that software test plans and software test procedures cover the software requirements and provide adequate verification of hazard controls, specifically the off-nominal scenarios.

7.2 Software Assurance Products

  • The results of the analysis showing software requirement and hazard control coverage by software testing, including corrective actions. Evidence may include artifacts analyzed, approach, results, and any peer reviews.

    Objective evidence may include:
    • Software test plan(s).
    • Software test procedure(s).
    • Traceability between the software requirements and the software test procedures.
    • Test coverage metric data.

7.3 Metrics

  • # of detailed software requirements tested to date vs. total # of detailed software requirements
  • # of Non-Conformances identified when the approved, updated requirements are not reflected in test procedures
  • # of Non-Conformances identified while confirming hazard controls are verified through test plans/procedures/cases
  • # of software requirements with completed test procedures/cases over time
  • # of Non-Conformances and risks open vs. # of Non-Conformances and risks identified with test procedures
  • # of safety-related non-conformances identified by life-cycle phase over time
  • # of safety-related requirement issues (Open, Closed) over time
  • # of Software Requirements without associated test cases
  • # of Software Requirements being met via satisfactory testing vs. total # of Software Requirements
  • # of software requirements with completed test procedures over time
  • # of Non-Conformances identified in plans (e.g., SMPs, SDPs, CM Plans, SA Plans, Safety Plans, Test Plans)
  • # of software work product Non-Conformances identified by life-cycle phase over time
  • # of safety-related Non-Conformances
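Several of these metrics are simple ratios and counts over the requirements set. As a non-authoritative sketch (the record fields are assumptions, not a mandated data format), a coverage figure could be computed like this:

```python
# Sketch: computing "# of requirements tested to date vs. total"
# from a list of requirement records. Field names are hypothetical.

requirements = [
    {"id": "SRS-101", "tested": True,  "safety_related": False},
    {"id": "SRS-102", "tested": True,  "safety_related": True},
    {"id": "SRS-103", "tested": False, "safety_related": True},
]

total = len(requirements)
tested = sum(1 for r in requirements if r["tested"])
untested_safety = [r["id"] for r in requirements
                   if r["safety_related"] and not r["tested"]]

print(f"Requirements tested: {tested}/{total}")
print(f"Safety-related requirements without completed tests: {untested_safety}")
```

Tracking such figures over time (per life cycle phase) is what turns the raw counts above into trend metrics.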

7.4 Guidance

Software assurance will analyze the software test plans and test procedures to make sure that all the requirements are being tested, particularly those that are safety-related. There should be good coverage of all operational scenarios, as well as off-nominal scenarios and boundary conditions. All safety features, controls, mitigations, and other safety-related requirements should be covered in the test plans. Test planning should cover unit testing, component testing, integration testing, acceptance testing, stress testing, interface testing, and regression testing.
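One part of this analysis, checking that every safety-related requirement has at least one off-nominal test case, can be sketched as below; the data shapes and identifiers are hypothetical, not a format defined by this Handbook:

```python
# Sketch: flagging safety-related requirements that lack an
# off-nominal test case. IDs and field names are hypothetical.

test_cases = [
    {"id": "TC-01", "requirement": "SRS-102", "scenario": "nominal"},
    {"id": "TC-02", "requirement": "SRS-102", "scenario": "off-nominal"},
    {"id": "TC-03", "requirement": "SRS-103", "scenario": "nominal"},
]
safety_requirements = ["SRS-102", "SRS-103"]

def missing_off_nominal(safety_reqs, cases):
    """Return safety-related requirements with no off-nominal test case."""
    covered = {c["requirement"] for c in cases
               if c["scenario"] == "off-nominal"}
    return [r for r in safety_reqs if r not in covered]

print(missing_off_nominal(safety_requirements, test_cases))
# -> ['SRS-103']
```

Any requirement flagged by a check like this would be raised as a finding against the test plans and procedures.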

For guidance on what should be in a test plan and a test procedure, go to 7.18 - Documentation Guidance and look at the minimum recommended content for test plans and test procedures. It is important to analyze test documentation and update it as the requirements change throughout the life cycle. When requirements change, documentation that may need updating includes:

  • System test plan and procedures.
  • Acceptance test plan and procedures.
  • Unit test plans and procedures.
  • Regression test plans and procedures.
  • End-to-end test plans and procedures.
  • Integration test plans and procedures.
  • Test cases.
  • Test scripts.
  • Test data.
  • Test schedule.
  • Traceability matrix.
  • Test effort estimates.

As testing progresses and changes are identified in the software, software assurance will need to check the trace matrices to see what else is affected by each change (e.g., did it cause a requirements change?). If a requirement changes, test documentation updates may be necessary.

When testing is done to verify that a defect fix or change has been implemented correctly, a carefully chosen set of regression tests should also be run to verify that previously tested requirements still pass and that safety-related code still works as intended. See the software assurance guidance in SWE-191 - Software Regression Testing for selecting regression test sets.