SWE-065 - Test Plan, Procedures, Reports

1. Requirements

3.4.1 The project shall establish and maintain:
        a.    Software Test Plan(s).
        b.    Software Test Procedure(s).
        c.    Software Test Report(s).

1.1 Notes

The requirements for the content of a Software Test Plan, Software Test Procedure, and Software Test Report are defined in Chapter 5.

1.2 Applicability Across Classes

Class D software that is not safety critical and Class G software are labeled "P (Center)". This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to satisfy this requirement.

Class:        A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |  G   |  H

Applicable?:       |       |      |       |      |       |      | P(C)  |      |       |     | P(C) |

Key:    A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | X - Applicable with details, read above for more | P(C) - P (Center), follow center requirements or procedures


2. Rationale

Having plans and procedures in place increases the likelihood that all necessary and required tasks are performed and performed consistently. Development of plans and procedures provides the opportunity for stakeholders to give input and assist with the documentation and tailoring of the planned testing activities to ensure the outcome will meet the expectations and goals of the project. Test reports ensure that results of verification activities are documented and stored in the configuration management system for use in acceptance reviews or readiness reviews.

Ensuring that test plans, procedures, and reports follow templates promotes consistency of documents across projects, ensures that proper planning occurs, ensures that the appropriate activities and results are captured, and prevents repeating problems of the past.
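One way to make results easy to capture consistently and to store under configuration management is a structured test report record. The Python sketch below is purely illustrative: the `TestReport` class, its field names, and identifiers such as "STP-042" are invented for this example and are not a NASA-mandated format (the required content of a Software Test Report is defined in Chapter 5).

```python
# Illustrative sketch of a structured test report record. All class,
# field, and ID names are invented for this example; they are not a
# NASA-mandated report format.
from dataclasses import dataclass, field, asdict
from datetime import date


@dataclass
class TestReport:
    procedure_id: str            # the Software Test Procedure that was executed
    requirements: list[str]      # requirements verified by this test run
    executed_on: date
    passed: bool
    anomalies: list[str] = field(default_factory=list)

    def to_record(self) -> dict:
        """Flatten to a plain dict suitable for a CM system or review package."""
        rec = asdict(self)
        rec["executed_on"] = self.executed_on.isoformat()
        return rec


report = TestReport(
    procedure_id="STP-042",
    requirements=["SRS-101", "SRS-102"],
    executed_on=date(2011, 3, 1),
    passed=False,
    anomalies=["Timeout observed in step 7"],
)
print(report.to_record())
```

A record like this, checked into the configuration management system, gives acceptance and readiness reviews a uniform artifact to cite regardless of which project produced it.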


3. Guidance

Projects create test plans, procedures and reports following the content requirements in [SWE-104], [SWE-114], and [SWE-118].  This handbook provides guidance for each of these NPR 7150.2A requirements.

Once these documents are created, they should be maintained to reflect current project status, progress, and plans, which will change over the life of the project.  When requirements change ([SWE-071]), test plans, procedures and the resulting test reports may also need to be updated or revised to reflect the changes.  Changes to test plans and procedures may result from:

  • Inspections / peer reviews of documentation
  • Inspections / peer reviews of code
  • Design changes
  • Code maturation and changes (e.g., code changes to correct bugs or problems found during testing, interfaces revised during development)
  • Availability of relevant test tools that were not originally part of the test plan (e.g., tools freed up from another project, funding becomes available to purchase new tools)
  • Updated software hazards and mitigations (e.g., new hazards identified, hazards eliminated, mitigations are added or revised)
  • Execution of the tests (e.g., issues found in test procedures)
  • Test report / results analysis (e.g., incomplete, insufficient requirements coverage)
  • Changes in test objectives or scope
  • Changes to schedule, milestones, or budget
  • Changes in test resource numbers or availability (e.g., personnel, tools, facilities)
  • Changes to software classification or safety criticality (e.g., a research project not intended for flight becomes destined for use on the ISS)
  • Process improvements relevant to test activities
  • Other changes in the project that affect the software testing effort
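One lightweight way to find which test procedures a requirement change touches is a requirements-to-procedure traceability map. The sketch below is illustrative only, assuming invented requirement and procedure IDs; it is not a mandated mechanism:

```python
# Illustrative requirements-to-procedure traceability map. When a
# requirement changes (see SWE-071), it flags the test procedures that
# verify it, so the affected plans, procedures, and reports can be
# reviewed. All IDs here are invented for the example.
from collections import defaultdict


class TraceabilityMap:
    def __init__(self) -> None:
        self._by_requirement: dict[str, set[str]] = defaultdict(set)

    def link(self, requirement_id: str, procedure_id: str) -> None:
        """Record that a procedure verifies a requirement."""
        self._by_requirement[requirement_id].add(procedure_id)

    def affected_procedures(self, changed_requirements) -> set[str]:
        """Procedures that verify any of the changed requirements."""
        affected: set[str] = set()
        for req in changed_requirements:
            affected |= self._by_requirement.get(req, set())
        return affected


tm = TraceabilityMap()
tm.link("SRS-101", "STP-042")
tm.link("SRS-102", "STP-042")
tm.link("SRS-200", "STP-077")
print(sorted(tm.affected_procedures({"SRS-101"})))  # ['STP-042']
```

In practice this mapping usually lives in a requirements management tool rather than ad hoc code, but the principle is the same: every procedure should be traceable to the requirements it verifies so that change impact can be assessed quickly.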

Just as the initial test plans, procedures, and reports require review and approval before use, the project team should ensure that updates are also reviewed and approved following project procedures.

Maintaining accurate and current test plans, procedures, and reports continues into the operation and maintenance phases of a project.

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to test plans, procedures and reports, including templates and examples. 

Additional guidance related to test plans, procedures, and reports may be found in the following related requirements in this handbook:

[SWE-071]  Update Plans & Procedures
[SWE-104]  Software Test Plan
[SWE-114]  Software Test Procedures
[SWE-118]  Software Test Report



4. Small Projects

There is currently no guidance specific to small projects for this requirement.


5. Resources

  1. NASA Technical Standard, "NASA Software Safety Guidebook", NASA-GB-8719.13, 2004.
  2. IEEE Computer Society, "IEEE Standard for Software Verification and Validation", Chapter 7, IEEE STD 1012-2004, 2004. A user account is required to access IEEE standards via this NASA Technical Standards System link.
  3. IEEE Computer Society, "IEEE Guide for Software Verification and Validation Plans", IEEE STD 1059-1993, 1993.  A user account is required to access IEEE standards via this NASA Technical Standards System link.

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.


6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to insufficiencies in software test plans:

  • Procedures should be complete. Lesson Number 2419: "The Aquarius Reflector test procedure lacked complete instructions for configuring the controller software prior to the test...The roles and responsibilities of the various personnel involved in the Aquarius acoustic test operations were not clearly documented.  This could lead to confusion during test operations." (http://www.nasa.gov/offices/oce/llis/imported_content/lesson_2419.html)
  • Special measures needed for potentially hazardous tests. Lesson Number 0991: "When planning tests that are potentially hazardous to personnel, flight hardware or facilities (e.g., high/low temperatures or pressure, stored energy, deployables), special measures should be taken to ensure that:
    1. Test procedures are especially well written, well organized, and easy to understand by both engineering and quality assurance personnel.
    2. Known test anomalies that history has shown to be inherent to the test equipment or conditions (including their likely causes, effects, and remedies) are documented and included in pre-test training.
    3. Readouts of safety-critical test control data are provided in an easily understood form (e.g., audible, visible or graphic format).
    4. Test readiness reviews are held, and test procedures require confirmation that GSE test equipment and sensors have been properly maintained.
    5. Quality assurance personnel are present and involved throughout the test to ensure procedures are properly followed, including prescribed responses to pre-identified potential anomalies."
      (https://nen.nasa.gov/llis_content/0991.html)
  • Test plans should reflect proper configurations: "Testing of the software changes was inadequate at the Unit, Integrated and Formal test level.  In reviewing test plans...neither had test steps where BAT06 and BAT04 were running concurrently in a launch configuration scenario.  Thus no test runs were done with the ... program that would reflect the new fully loaded console configuration.  Had the launch configuration scenarios been included in integrated and acceptance testing, this might have revealed the code timing problems." (https://nen.nasa.gov/llis_lib/pdf/1035716main_PR%20LCA%204168.pdf)
  • Include test monitoring software safety steps. Lesson Number 1529: "Prior to test, under the test principle of 'First, Do No Harm' to flight equipment, assure that test monitoring and control software is programmed or a limiting hardware device is inserted to prevent over-test under all conditions..." (http://www.nasa.gov/offices/oce/llis/1529.html)