SWE-071 - Update Test Plans and Procedures

1. Requirements

4.5.7 The project manager shall update the software test plan(s) and the software test procedure(s) to be consistent with software requirements.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-071 - Last used in rev NPR 7150.2D

Rev   SWE Statement

A     3.4.7 The project shall update Software Test Plan(s) and Software Test Procedure(s) to be consistent with software requirements.

Difference between A and B: No change.

B     4.5.8 The project manager shall update software test plan(s) and software test procedure(s) to be consistent with software requirements.

Difference between B and C: No change.

C     4.5.7 The project manager shall update the software test plan(s) and the software test procedure(s) to be consistent with software requirements.

Difference between C and D: Editorial clarification (added verification plans).

D     4.5.7 The project manager shall update the software test and verification plan(s) and procedure(s) to be consistent with software requirements.

1.3 Applicability Across Classes

Class          A      B      C      D      E      F

Applicable?

Key: ✓ - Applicable | ✗ - Not Applicable

2. Rationale

Software test plans and test procedures are the main tools used to ensure proper implementation of the requirements and are developed based on those requirements. Therefore, if the requirements change, the test plans and procedures must also change to ensure that the test activity is accurate, complete, and consistent with the requirements.

Software test plans and test procedures are a key element of ensuring that the requirements that specify a product are completely and accurately implemented; in other words, that the delivered product is the right product. 

3. Guidance

The team typically develops test plans and procedures as soon as the relevant phase in the life cycle has been completed. Once the initial documents have been created (see topic 7.18 - Documentation Guidance), it is important to keep them up to date as requirements change throughout the project life cycle. Continuous updates help avoid the testing delays that result when updates to the test plans and procedures are deferred until testing begins.

Test documentation that may require updating includes:

  • System test plan and procedures.
  • Acceptance test plan and procedures.
  • Unit test plans and procedures.
  • Regression test plans and procedures.
  • End-to-end test plans and procedures.
  • Integration test plans and procedures.
  • Test cases.
  • Test scripts.
  • Test data.
  • Test schedule.
  • Traceability matrix.
  • Test effort estimates.

Using a traceability matrix that identifies the test plans, procedures, scripts, test cases, and even test data associated with each requirement can be helpful when determining the effect of a requirements change on the testing documentation and plans, as illustrated in the sketch below.
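
A minimal sketch of this kind of impact query, assuming the trace matrix has been exported as a CSV file (the file name and the column names requirement_id, test_plan, test_procedure, and test_case are illustrative assumptions, not a prescribed format):

    import csv
    from collections import defaultdict

    def load_trace_matrix(path):
        """Read a trace matrix CSV into a lookup keyed by requirement ID.
        Assumed (hypothetical) columns: requirement_id, test_plan,
        test_procedure, test_case."""
        trace = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                trace[row["requirement_id"]].append(row)
        return trace

    def affected_test_artifacts(trace, changed_requirements):
        """Return the test plans, procedures, and cases traced to any changed requirement."""
        impacted = {"test_plan": set(), "test_procedure": set(), "test_case": set()}
        for req_id in changed_requirements:
            for row in trace.get(req_id, []):
                for field in impacted:
                    if row.get(field):
                        impacted[field].add(row[field])
        return impacted

    if __name__ == "__main__":
        trace = load_trace_matrix("trace_matrix.csv")   # hypothetical export of the project trace matrix
        changed = {"SRS-042", "SRS-107"}                # example changed requirement IDs
        for artifact_type, names in affected_test_artifacts(trace, changed).items():
            print(f"{artifact_type}: {sorted(names)}")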

It may be helpful to include, as a checklist item in the relevant life cycle reviews, confirmation that test documentation has been updated to reflect any requirements changes made at that point in the project life cycle. Additionally, if checklists for the test plans and procedures are used, consider including a checklist item confirming that the plans and procedures are consistent with the software requirements; it may be helpful to repeat those checklists when revising the test plans and procedures.

It may also be helpful if the project can establish a mechanism by which the test plan developers are notified of changes to the requirements once those changes are approved. Consider:

  • Providing copies of approvals for change requests that affect requirements to the Software Lead Engineer.
  • Providing copies of Change Control Board (CCB) minutes so test plan developers can check for approved requirements changes.
  • Including a test team representative as part of the CCB or other authorized groups responsible for approving changes to requirements.

When updating test plans and procedures, use configuration management principles (see SWE-080).

Consult Center Process Asset Libraries (PALs) for center-specific guidance and resources related to keeping test plans and procedures current as requirements change.

Additional guidance related to test plans and procedures may be found in other related requirements in this Handbook.

4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

5. Resources

5.1 References

  • (SWEREF-072) Checklist for the Contents of Software Critical Design Review (CDR), 580-CK-008-02, Software Engineering Division, NASA Goddard Space Flight Center (GSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
  • (SWEREF-505) Public Lessons Learned Entry: 345.

5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Mars Observer Attitude Control Fault Protection (Failure caused in part by incomplete testing.) Lesson Number 0345 (SWEREF-505): "From the analyses performed after the Mars Observer [MO] mission failure, it became apparent that the MO fault protection suffered from a lack of top-down system engineering design approach. Most fault protection was in the category of low-level redundancy management. It was also determined that the MO fault protection software was never tested on the flight spacecraft before launch. Design fault protection to detect and respond to excessive attitude control errors, use RCS Thrusters to control excessive attitude control errors, and always test fault protection software on the flight spacecraft before launch."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-071 - Update Test Plans and Procedures
4.5.7 The project manager shall update the software test plan(s) and the software test procedure(s) to be consistent with software requirements.

7.1 Tasking for Software Assurance

  1. Analyze that software test plans and software test procedures cover the software requirements and provide adequate verification of hazard controls, specifically the off-nominal scenarios.

7.2 Software Assurance Products

  • Results of the analysis showing software requirement and hazard control coverage by software testing, including any corrective actions. Evidence may include the artifacts analyzed, the approach, the results, and any peer reviews.


    Objective Evidence

    • Software test plan(s).
    • Software test procedure(s).
    • Traceability between the software requirements and the software test procedures.
    • Test coverage metric data.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., a Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email, or memo containing statements that confirmation has been performed, with a date (a checklist of confirmations could be used to record when each confirmation has been done).
    • Signatures on SA-reviewed or witnessed products or activities, or
    • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that “IV&V Program Execution exists,” the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that “Traceability between software requirements and hazards with SW contributions exists,” the summary might be: x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of topic 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • # of detailed software requirements tested to date vs. total # of detailed software requirements
  • # of Non-Conformances identified when the approved, updated requirements are not reflected in test procedures
  • # of Non-Conformances identified while confirming hazard controls are verified through test plans/procedures/cases
  • # of software requirements with completed test procedures/cases over time
  • # of Non-Conformances and risks open vs. # of Non-Conformances and risks identified with test procedures
  • # of safety-related non-conformances identified by life-cycle phase over time
  • # of safety-related requirement issues (Open, Closed) over time
  • # of Software Requirements without associated test cases
  • # of Software Requirements being met via satisfactory testing vs. total # of Software Requirements
  • # of software requirements with completed test procedures over time
  • # of Non-Conformances identified in plans (e.g., SMPs, SDPs, CM Plans, SA Plans, Safety Plans, Test Plans)
  • # of software work product Non-Conformances identified by life-cycle phase over time
  • # of safety-related Non-Conformances
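
For example, the first metric above (# of detailed software requirements tested to date vs. the total # of detailed software requirements) can be computed directly from requirement-to-test status data. A minimal sketch, assuming a hypothetical CSV export with requirement_id and test_status columns; the file name, column names, and "passed" status value are illustrative assumptions:

    import csv

    def requirements_test_coverage(path):
        """Count requirements tested to date vs. total requirements.
        Assumed (hypothetical) columns: requirement_id, test_status."""
        tested, total = set(), set()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                total.add(row["requirement_id"])
                if row["test_status"].strip().lower() == "passed":
                    tested.add(row["requirement_id"])
        return len(tested), len(total)

    if __name__ == "__main__":
        tested, total = requirements_test_coverage("requirement_test_status.csv")  # hypothetical file
        pct = 100.0 * tested / total if total else 0.0
        print(f"{tested} of {total} detailed software requirements tested ({pct:.1f}%)")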

7.4 Guidance

Software assurance will analyze the software test plans and test procedures to make sure that all the requirements are being tested, particularly those that are safety-related. There should be good coverage of all operational scenarios, as well as the off-nominal scenarios and boundary conditions. All safety features, controls, mitigations, and other safety-related requirements should be covered in the test plans. Test planning should cover unit testing, component testing, integration testing, acceptance testing, stress testing, interface testing, and regression testing.

For guidance on what should be in a test plan and a test procedure, go to 7.18 - Documentation Guidance and look at the minimum recommended content for test plans and test procedures. It is important to analyze test documentation and update it as the requirements change throughout the life cycle. When requirements change, documentation that may need updating includes:

  • System test plan and procedures.
  • Acceptance test plan and procedures.
  • Unit test plans and procedures.
  • Regression test plans and procedures.
  • End-to-end test plans and procedures.
  • Integration test plans and procedures.
  • Test cases.
  • Test scripts.
  • Test data.
  • Test schedule.
  • Traceability matrix.
  • Test effort estimates.

As testing progresses and changes are identified in the software, software assurance will need to check the trace matrices to see what else is affected by the change (e.g., did it cause a requirements change?). If a requirement changes, test documentation updates may be necessary.

When testing is done to verify that the defect/change has been implemented correctly, a carefully chosen set of regression tests should also be run to verify that previously tested requirements still pass and that safety-related code still works as intended. See the software assurance guidance in SWE-191 - Software Regression Testing for selecting regression test sets.
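
A minimal sketch of one way such a regression subset might be assembled from the trace matrix data; the column names and the rule of always including safety-related tests are illustrative assumptions, not the SWE-191 guidance itself:

    import csv

    def select_regression_tests(trace_path, changed_requirements):
        """Select regression test cases: those traced to changed requirements,
        plus all tests flagged as safety-related (illustrative selection rule).
        Assumed (hypothetical) columns: requirement_id, test_case, safety_related."""
        selected = set()
        with open(trace_path, newline="") as f:
            for row in csv.DictReader(f):
                traced_to_change = row["requirement_id"] in changed_requirements
                safety_related = row.get("safety_related", "").strip().lower() == "yes"
                if traced_to_change or safety_related:
                    selected.add(row["test_case"])
        return sorted(selected)

    if __name__ == "__main__":
        tests = select_regression_tests("trace_matrix.csv", {"SRS-042"})  # hypothetical file and requirement ID
        print("Candidate regression set:", tests)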


