
SWE-114 - Software Test Procedures

1. Requirements

5.2.6.1 The Software Test Procedures shall contain: (SWE-114)

a. Test preparations, including hardware and software.
b. Test descriptions, including:

    (1) Test identifier.
    (2) System or CSCI (Computer Software Configuration Item) requirements addressed by the test case.
    (3) Prerequisite conditions.
    (4) Test input.
    (5) Instructions for conducting procedure.
    (6) Expected test results, including assumptions and constraints.
    (7) Criteria for evaluating results.

c. Requirements traceability.
d. Identification of test configuration.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Note that all classes must perform testing.  This requirement applies to the documentation of the test methodologies used for that testing.  It is that documentation that varies by class.

Class:        A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H

Applicable?

Key:    A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | - Applicable | - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures

2. Rationale

When testing software, it is important to capture the setup, steps, data, test cases, etc., used to verify requirements, functionality, safety, and other aspects of the software.  Test procedures capture that information and more for purposes including but not limited to:

  • Verification of defined software functionality.
  • Verification that all requirements were tested.
  • Verification of test procedure validity, applicability, adequacy, completeness, and accuracy before use.
  • Stakeholder understanding and agreement of test methods.
  • Repeatability of tests and use of tests in regression testing.

3. Guidance

The Software Test Procedures describe the test preparations, test configuration, test cases, and test methods to be used to perform qualification testing of a CSCI or a software system or subsystem.  The test procedures also describe the expected test results and include bidirectional traceability to the requirements or a reference to the document containing that trace.
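The exact format of the test procedures is up to the project. As an illustration only, the sketch below shows one way the required content elements (SWE-114 a through d) could be organized as a structured record; the class and field names are hypothetical and are not prescribed by NPR 7150.2.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TestCase:
    """One test description; field names are illustrative only."""
    test_id: str                    # (1) unique test identifier
    requirements: List[str]         # (2) system/CSCI requirements addressed
    prerequisites: List[str]        # (3) prerequisite conditions
    inputs: List[str]               # (4) test input (values, files, commands)
    steps: List[str]                # (5) instructions for conducting the procedure
    expected_results: List[str]     # (6) expected results, assumptions, constraints
    evaluation_criteria: List[str]  # (7) criteria for evaluating results

@dataclass
class TestProcedure:
    """Top-level procedure content; illustrative only."""
    preparations: List[str]                                   # a. hardware and software test preparations
    test_cases: List[TestCase] = field(default_factory=list)  # b. test descriptions
    traceability: Dict[str, List[str]] = field(default_factory=dict)  # c. requirement -> test IDs
    test_configuration: str = ""                               # d. identification of test configuration
```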

The following documents are useful when developing test procedures:

  • Software Requirements Specification (SRS) (SWE-109).
  • Software Data Dictionary (SWE-110).
  • Software Design Description (SWE-111).
  • Interface Design Description (SWE-112).
  • Software Change Requests / Problem Reports (SWE-113).
  • Software Architecture (SWE-057).

When writing test procedures, remember to:

  • Include non-functional requirements, including safety, security, performance, etc.
  • Ensure all requirements are covered by the full set of test procedures (a minimal coverage-check sketch follows this list).
  • Maintain the bidirectional test-to-requirements trace when modifying test procedures. (See SWE-072.)
  • Include all elements required by NPR 7150.2. (See full SWE-114 text above.)
  • Include test preparations for both software and hardware:
    • Noting in the test procedure any dependencies in the order the test procedures must be run.
    • Noting or setting the state of the system to that required to run the test procedure.
    • Noting or setting the status of data values required to run the test procedure.
  • Include tests to:
    • Confirm the software does what it is supposed to do.
    • Confirm the software does not do what it should not do.
    • Confirm the software behaves in an expected manner under adverse or off-nominal conditions.
    • Cover the range of allowable inputs, boundary conditions, false or invalid inputs, load tests, stress tests, interrupt execution and processing, etc.
    • Verify performance (performance testing).
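As an example of the coverage and traceability items in the list above, the sketch below checks that every requirement is verified by at least one test case and that every test case cites only known requirements. The requirement and test-case identifiers are hypothetical; a project would draw this data from its own traceability records.

```python
# Hypothetical traceability data: requirement IDs and the test cases that verify them.
requirements = {"SRS-001", "SRS-002", "SRS-003", "SRS-010"}

test_to_requirements = {
    "TP-001": {"SRS-001", "SRS-002"},
    "TP-002": {"SRS-003"},
}

# Forward trace: every requirement should be verified by at least one test case.
covered = set().union(*test_to_requirements.values())
untested = requirements - covered
if untested:
    print("Requirements with no test case:", sorted(untested))

# Backward trace: every test case should cite only known requirements.
unknown = {test: reqs - requirements
           for test, reqs in test_to_requirements.items()
           if reqs - requirements}
if unknown:
    print("Test cases citing unknown requirements:", unknown)
```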

When writing test procedures, be sure to use these helpful processes and practices:

  • Include very clear, understandable, detailed, step-by-step explanations of how to run each test case.
  • Use templates and examples from your NASA Center, company, or the NASA Process Asset Library (PAL).
  • Include references to any test scripts or other automated procedures, as appropriate.
  • Include references to any documents describing the test configuration, if configuration is not captured in the test procedure.
  • Include a place to document expected results, not just actual results (see the step-record sketch after this list).
  • Include a signature block at appropriate points in the procedure so that Software Assurance can sign off on a formal test when it is completed.
  • Include provisions to add redlines to the test procedures when they are executed so that configuration management steps are not required to make a minor change to a procedure. The redlines become the official record and can be initialed by Software Assurance to show their concurrence on the changes.
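To illustrate recording expected versus actual results and Software Assurance concurrence, the sketch below shows one possible record for an executed procedure step. The field names, step wording, and initials are hypothetical, not a required format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExecutedStep:
    """One executed procedure step; field names are illustrative only."""
    number: int
    action: str
    expected_result: str           # written when the procedure is authored
    actual_result: str = ""        # recorded during execution
    passed: Optional[bool] = None  # judged against the evaluation criteria
    sa_initials: str = ""          # Software Assurance sign-off or redline concurrence

step = ExecutedStep(
    number=1,
    action="Send the OPERATE command to the flight software",
    expected_result="Mode telemetry reports OPERATE within 2 seconds",
)
step.actual_result = "Mode telemetry reported OPERATE after 1.2 seconds"
step.passed = True
step.sa_initials = "XYZ"
```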

If reusing test procedures, be sure to:

  • Check that those procedures adhere to the content and helpful practice guidance above.
  • Revise those test procedures to align with testing planned for the current project.

Pitfalls and issues:

  • Do not guess at how the software works. If the requirements are not clear enough to write the test procedures, ask questions of the appropriate project team members.
  • Do not assume the tester understands the intricacies of the software design. The test procedures must be easy to follow.

Best practices:

  • Identify in each test procedure all requirements being verified by that test procedure.
  • Establish bidirectional trace early, and maintain it throughout the test life cycle.
  • Sequentially number the steps in the test procedure.
  • Tailor the level of detail in each procedure step to allow:
    • Clear specification of expected results.
    • Meaningful comparison of expected results to actual results.
  • Include cleanup steps to leave the system in a known state at the end of the test procedure.
  • Use multiple test cases for each requirement, basing the number of test cases on the criticality of the requirement.
  • Design test cases to address several related requirements. (SWEREF-140)
  • Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together. (SWEREF-140)
  • Include a setup procedure or test case to place (or restore) the system in a known state, and call that test case repeatedly rather than repeating those setup steps in each test procedure (see the fixture sketch after this list).
  • Peer review test procedures checking for, at a minimum:
    • Completeness of test procedure content. (See full SWE-114 text above.)
    • Understandable and clear steps.
    • Requirements coverage.
    • Test procedure validity, applicability, completeness, adequacy, and accuracy.
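A minimal sketch of the reusable setup and cleanup practice from the list above, using pytest fixtures as one possible automation framework; the system-under-test interface shown here is hypothetical.

```python
import pytest

# Hypothetical system-under-test interface, used only for illustration.
class SystemUnderTest:
    def __init__(self):
        self.mode = "UNKNOWN"

    def reset(self):
        self.mode = "SAFE"          # known starting state

    def command(self, mode):
        self.mode = mode

@pytest.fixture
def sut():
    """Shared setup/cleanup: every test starts and ends in a known state."""
    system = SystemUnderTest()
    system.reset()                  # setup: place the system in a known state
    yield system
    system.reset()                  # cleanup: restore the known state after the test

def test_mode_change(sut):
    sut.command("OPERATE")
    assert sut.mode == "OPERATE"    # expected result recorded in the procedure
```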

Additional guidance related to software test procedures may be found in the following related requirements in this Handbook:

SWE-065 - Test Plan, Procedures, Reports
SWE-066 - Perform Testing
SWE-071 - Update Test Plans and Procedures
SWE-072 - Bidirectional Traceability Between Software Test Procedures and Software Requirements

4. Small Projects

Test procedures are needed regardless of project size. However, for small projects, the following reductions in rigor (while still complying with the requirement) may be appropriate:

  • Combine the Test Plan and test procedures into one document.
    • The Test Plan is developed before the test procedures, but each test procedure could be added later to the Test Plan as a chapter or appendix.
  • Use redlines during the review and approval process, rather than updates under configuration management, to save time.
  • If using a model-based development system, use the model to generate test cases automatically when possible and appropriate to do so. 
    • Model-generated test cases typically follow a standard approach (range of allowable inputs, boundary conditions, false or invalid inputs, stress tests, etc.), as sketched below.
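Whether produced by a modeling tool or by a simple script, that standard pattern can be sketched as follows. The input specification, parameter name, and values below are hypothetical and serve only to show how nominal, boundary, and invalid cases can be derived from a range definition.

```python
# Hypothetical input specification: a single parameter with an allowed range.
SPEC = {"name": "heater_setpoint", "min": 10, "max": 40}

def generate_cases(spec):
    """Derive nominal, boundary, and invalid test inputs from a range spec."""
    lo, hi = spec["min"], spec["max"]
    return {
        "nominal":        (lo + hi) // 2,
        "lower_boundary": lo,
        "upper_boundary": hi,
        "below_range":    lo - 1,   # invalid input, expected to be rejected
        "above_range":    hi + 1,   # invalid input, expected to be rejected
    }

for label, value in generate_cases(SPEC).items():
    print(f"{SPEC['name']}: {label} -> {value}")
```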

5. Resources

  • (SWEREF-074) At www.information-management-architect.com. Retrieved May 17, 2011, from http://www.information-management-architect.com/software-testing-procedures.html.
  • (SWEREF-111) Test Procedure Checklist, NASA Marshall Space Flight Center (MSFC), 2008.
  • (SWEREF-140) Borysowich, Craig, 2006. Observations from a Tech Architect: Enterprise Implementation Issues & Solutions.
  • (SWEREF-325) Software Change Request (SCR), NASA Glenn Research Center.
  • (SWEREF-339) Software Test Procedures (STPr) Template, GRC-SW-TPLT-STPr-2006, Software Engineering Process Group, NASA Glenn Research Center, 2006. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-451) SED Test Description Guideline, 580-GL-063-02, Software Engineering Division, NASA Goddard Space Flight Center (GSFC), 2012. Replaces SWEREF-080.
  • (SWEREF-529) Public Lessons Learned Entry: 938.

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following, emphasizing the importance of and potential issues related to software test procedures:

Probable Scenario for Mars Polar Lander Mission Loss (1998) (Importance of including known hardware characteristics). Lesson Number 0938: "1. Project test policy and procedures should specify actions to be taken when a failure occurs during test. When tests are aborted, or known to have had flawed procedures, they must be rerun after the test deficiencies are corrected. When test article hardware or software is changed, the test should be rerun unless there is a clear rationale for omitting the rerun. 2. All known hardware operational characteristics, including transients and spurious signals, must be reflected in the software requirements documents and verified by test." (SWEREF-529)