


1. Requirements

The Software Test Procedures shall contain: [SWE-114]

a. Test preparations, including hardware and software.
b. Test descriptions, including:


c. Requirements traceability.
d. Identification of test configuration.
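As a hedged illustration, the four required content items above could be tracked per procedure in a simple record. The field names and IDs here are assumptions for the sketch, not a format mandated by NPR 7150.2:

```python
# Illustrative sketch only: one way to structure the SWE-114 content
# items as a record per test procedure. Field names and IDs are
# assumptions, not a prescribed format.
from dataclasses import dataclass

@dataclass
class TestProcedure:
    procedure_id: str            # e.g. "TP-01" (illustrative)
    preparations: list           # a. hardware and software test preparations
    descriptions: list           # b. test descriptions / step-by-step cases
    traced_requirements: set     # c. requirements traceability
    configuration_id: str        # d. identification of test configuration

tp = TestProcedure(
    procedure_id="TP-01",
    preparations=["Power on test rack", "Load flight software build"],
    descriptions=["Step 1: command mode change", "Step 2: verify telemetry"],
    traced_requirements={"SRS-042"},
    configuration_id="CFG-7",
)
print(tp.procedure_id, sorted(tp.traced_requirements))
```

A structured record like this makes it easy to audit that every procedure carries all four required items before it is approved for use.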

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Note that all classes must perform testing.  This requirement applies to the documentation of the test methodologies used for that testing.  It is that documentation that varies by class.



2. Rationale

When testing software, it is important to capture the setup, steps, data, test cases, etc., used to verify requirements, functionality, safety, and other aspects of the software.  Test procedures capture that information and more for purposes including but not limited to:

  • Verification of defined software functionality.
  • Verification that all requirements were tested.
  • Verification of test procedure validity, applicability, adequacy, completeness, and accuracy before use.
  • Stakeholder understanding and agreement of test methods.
  • Repeatability of tests and use of tests in regression testing.



3. Guidance


The following documents are useful when developing test procedures:

  • Software Requirements Specification (SRS) (SWE-109).
  • Software Data Dictionary (SWE-110).
  • Software Design Description (SWE-111).
  • Interface Design Description (SWE-112).
  • Software Change Requests/Problem Reports (SWE-113).
  • Software Architecture (SWE-057).

When writing test procedures, remember to:

  • Include non-functional requirements, including safety, security, performance, etc.
  • Ensure all requirements are covered by the full set of test procedures.
  • Maintain the bidirectional test-to-requirements trace when modifying test procedures. (See SWE-072.)
  • Include all elements required by NPR 7150.2. (See full SWE-114 text above.)
  • Include test preparations for both software and hardware:
    • Noting in the test procedure any dependencies on the order in which the test procedures must be run.
    • Noting or setting the state of the system to that required to run the test procedure.
    • Noting or setting the status of data values required to run the test procedure.
  • Include tests to:
    • Confirm the software does what it is supposed to do.
    • Confirm the software does not do what it should not do.
    • Confirm the software behaves in an expected manner under adverse or off-nominal conditions.
    • Cover the range of allowable inputs, boundary conditions, false or invalid inputs, load tests, stress tests, interrupt execution and processing, etc.
    • Include performance testing.
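A minimal sketch of what such a test family can look like in code, using a hypothetical clamp() utility (all names and the requirement ID are illustrative): a nominal case, boundary and just-outside-boundary cases, and an invalid-input case.

```python
# Hypothetical sketch: test cases for an assumed clamp() utility,
# covering nominal, boundary, just-outside, and invalid inputs.
# clamp() stands in for any project function; names are illustrative.

def clamp(value, low, high):
    """Clamp value into [low, high]; reject an inverted range."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

# Traces to hypothetical requirement SRS-042 (range limiting).
def test_nominal():
    assert clamp(5, 0, 10) == 5       # in-range input passes through

def test_boundaries():
    assert clamp(0, 0, 10) == 0       # lower boundary
    assert clamp(10, 0, 10) == 10     # upper boundary
    assert clamp(-1, 0, 10) == 0      # just below range is clamped
    assert clamp(11, 0, 10) == 11 - 1 # just above range is clamped to 10

def test_invalid_input():
    try:
        clamp(5, 10, 0)               # inverted range must be rejected
    except ValueError:
        pass                          # expected off-nominal behavior
    else:
        raise AssertionError("expected ValueError for inverted range")

test_nominal()
test_boundaries()
test_invalid_input()
```

Note that the off-nominal case verifies what the software must *not* do (accept an inverted range), mirroring the "does not do what it should not do" item above.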

When writing test procedures, be sure to use these helpful processes and practices:

  • Include very clear, understandable, detailed, step-by-step explanations of how to run each test case.
  • Use templates and examples from your NASA Center, company, or the NASA Process Asset Library (PAL).
  • Include references to any test scripts or other automated procedures, as appropriate.
  • Include references to any documents describing the test configuration, if configuration is not captured in the test procedure.
  • Include a place to document expected results, not just actual results.
  • Include a signature block at appropriate points in the procedure so that Software Assurance can sign off on a formal test when it is completed.
  • Include provisions to add redlines to the test procedures when they are executed so that configuration management steps are not required to make a minor change to a procedure. The redlines become the official record and can be initialed by Software Assurance to show their concurrence on the changes.
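For instance, a step log that records the expected result alongside the actual one makes the pass/fail comparison explicit at each step. This is an illustrative sketch, not a prescribed format:

```python
# Illustrative sketch: each procedure step carries its expected result,
# and the actual result is recorded and compared when the step is run.
# Step actions and result strings are hypothetical.
steps = [
    {"step": 1, "action": "Power on unit", "expected": "STATUS_OK", "actual": None},
    {"step": 2, "action": "Send PING",     "expected": "PONG",      "actual": None},
]

def record(steps, step_no, actual):
    """Record the actual result for a step and return pass/fail."""
    entry = next(s for s in steps if s["step"] == step_no)
    entry["actual"] = actual
    entry["passed"] = (entry["actual"] == entry["expected"])
    return entry["passed"]

print(record(steps, 1, "STATUS_OK"))  # True  (matches expected)
print(record(steps, 2, "TIMEOUT"))    # False (deviation is captured)
```

Capturing both values per step also gives reviewers and Software Assurance an unambiguous record to sign off against.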

If reusing test procedures, be sure to:

  • Check that those procedures adhere to the content and helpful practice guidance above.
  • Revise those test procedures to align with testing planned for the current project.

Pitfalls and issues:

  • Do not guess at how the software works. If the requirements are not clear enough to write the test procedures, ask questions of the appropriate project team members.
  • Do not assume the tester understands the intricacies of the software design. The test procedures must be easy to follow.

Best practices:

  • Identify in each test procedure all requirements being verified by that test procedure.
  • Establish bidirectional trace early, and maintain it throughout the test life cycle.
  • Sequentially number the steps in the test procedure.
  • Tailor the level of detail in each procedure step to allow:
    • Clear specification of expected results.
    • Meaningful comparison of expected results to actual results.
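The bidirectional-trace best practice above can be partially automated with a simple coverage check over the trace matrix: every requirement must be verified by at least one procedure, and every procedure must trace to at least one requirement. The IDs and data shapes here are illustrative assumptions:

```python
# Illustrative sketch of a bidirectional trace check. `trace` maps
# test-procedure ID -> set of requirement IDs it verifies; all IDs
# are hypothetical.

def check_bidirectional_trace(requirements, trace):
    """Return (requirements with no test, procedures with no requirement)."""
    traced_reqs = set().union(*trace.values()) if trace else set()
    untested = sorted(set(requirements) - traced_reqs)
    untraced = sorted(t for t, reqs in trace.items() if not reqs)
    return untested, untraced

reqs = {"SRS-001", "SRS-002", "SRS-003"}
trace = {"TP-01": {"SRS-001"}, "TP-02": {"SRS-002"}, "TP-03": set()}
print(check_bidirectional_trace(reqs, trace))
# (['SRS-003'], ['TP-03'])
```

Running a check like this whenever procedures are modified helps keep the trace current throughout the test life cycle.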


Review test procedures before use to confirm:

  • Completeness of test procedure content. (See full SWE-114 text above.)
  • Understandable and clear steps.
  • Requirements coverage.
  • Test procedure validity, applicability, completeness, adequacy, and accuracy.

Additional guidance related to software test procedures may be found in the following related requirements in this Handbook:

  • Test Plan, Procedures, Reports
  • Perform Testing
  • Update Test Plans and Procedures
  • Bidirectional Traceability Between Software Test Procedures and Software Requirements

4. Small Projects

Test procedures are needed regardless of project size.  For small projects, however, the following relaxations in rigor (while still complying with the requirement) may be appropriate:

  • Combine the Test Plan and test procedures into one document.
    • The Test Plan is developed before the test procedures, but each test procedure could be added later to the Test Plan as a chapter or appendix.
  • Use redlines during the review and approval process, rather than formal updates through configuration management, to save time.
  • If using a model-based development system, use the model to generate test cases automatically when possible and appropriate to do so. 
    • Model-generated test cases typically follow a standard approach (range of allowable inputs, boundary conditions, false or invalid inputs, stress tests, etc.).
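As a hedged sketch of that standard approach, a case generator for a numeric parameter might emit nominal, boundary, and just-outside values. This is illustrative only and not tied to any particular model-based tool:

```python
# Illustrative sketch: auto-generating the standard input set for a
# numeric parameter with allowable range [low, high]. The category
# names are assumptions.

def generate_cases(low, high):
    """Return standard test inputs: nominal, boundaries, just-outside."""
    mid = (low + high) // 2
    return {
        "nominal": [mid],                 # representative in-range value
        "boundary": [low, high],          # range endpoints
        "invalid": [low - 1, high + 1],   # just outside the allowable range
    }

print(generate_cases(0, 10))
# {'nominal': [5], 'boundary': [0, 10], 'invalid': [-1, 11]}
```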



5. Resources





6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following, emphasizing the importance of and potential issues related to software test procedures: