SWE-114 - Software Test Procedures

1. Requirements

5.2.6.1 The Software Test Procedures shall contain: [SWE-114]

a. Test preparations, including hardware and software.
b. Test descriptions, including:

    (1) Test identifier.
    (2) System or CSCI requirements addressed by the test case.
    (3) Prerequisite conditions.
    (4) Test input.
    (5) Instructions for conducting procedure.
    (6) Expected test results, including assumptions and constraints.
    (7) Criteria for evaluating results.

c. Requirements traceability.
d. Identification of test configuration.
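One way to keep these required elements together is a structured template; a minimal sketch in Python (the class and field names are illustrative, not part of NPR 7150.2):

```python
from dataclasses import dataclass, field

@dataclass
class TestDescription:
    """One test case, mirroring the SWE-114 'test descriptions' elements (b.1-b.7)."""
    test_id: str                # (1) test identifier
    requirements: list          # (2) system/CSCI requirements addressed by the test case
    prerequisites: list         # (3) prerequisite conditions
    inputs: dict                # (4) test input
    instructions: list          # (5) step-by-step instructions for conducting the procedure
    expected_results: str       # (6) expected results, including assumptions and constraints
    evaluation_criteria: str    # (7) criteria for evaluating results

@dataclass
class SoftwareTestProcedures:
    """Top-level document mirroring SWE-114 items a-d."""
    preparations: list = field(default_factory=list)   # a. hardware/software preparations
    descriptions: list = field(default_factory=list)   # b. test descriptions
    traceability: dict = field(default_factory=dict)   # c. requirement -> test identifiers
    configuration: str = ""                            # d. test configuration identification
```

A template like this makes it straightforward to check, mechanically, that no required element was omitted from a test description.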

1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Implementation Notes from Appendix D

NPR 7150.2 does not include any notes for this requirement.

1.3 Applicability Across Classes

Note that all classes must perform testing.  This requirement applies to the documentation of the test methodologies used for that testing.  It is that documentation which varies by class.

Class        | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H
Applicable?  |      |       |      |       |      |       |      |       |      |       |   |   |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | - Applicable | - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures


2. Rationale

When testing software, it is important to capture the setup, steps, data, test cases, etc. used to verify requirements, functionality, safety, and other aspects of the software.  Test procedures capture that information and more for purposes including, but not limited to:

  • Verification of test procedure validity, applicability, adequacy, and accuracy prior to use
  • Stakeholder understanding and agreement of test methods
  • Repeatability of tests and use of tests in regression testing
  • Verification that all requirements were tested
  • Verification of software functionality

3. Guidance

The Software Test Procedures describe the test preparations, test cases, test methods, and result verification to be used to perform qualification testing of a CSCI or a software system or subsystem.

The following documents are useful when developing test procedures:

  • Software Requirements Specification (SRS)
  • Software Data Dictionary
  • Software Design Description
  • Interface Design Description
  • Software Change Requests/Problem Reports

When writing test procedures, don't forget to:

  • Include non-functional requirements, including safety, security, performance, etc.
  • Ensure all requirements are covered by the full set of test procedures
  • Maintain the test-to-requirements trace when modifying test procedures
  • Include all elements required by NPR 7150.2 (see full SWE-114 text above)
  • Include test preparations for both software and hardware
    • Noting in the test procedure any dependencies on the order in which the test procedures must be run
    • Noting or setting the state of the system to that required to run the test procedure
    • Noting or setting the status of data values required to run the test procedure
  • Include tests to:
    • Confirm the software does what it is supposed to do
    • Confirm the software does not do what it should not do
    • Confirm the software behaves in an expected manner under adverse or off-nominal conditions
    • Cover the range of allowable inputs, boundary conditions, false or invalid inputs, stress tests, etc.
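The last set of points can be sketched as concrete test cases; a minimal example assuming a hypothetical `parse_speed` input routine with an allowable range of 0.0-100.0 m/s (both the function and its limits are invented for illustration):

```python
def parse_speed(raw: str) -> float:
    """Hypothetical input parser: accepts speeds of 0.0-100.0 m/s."""
    value = float(raw)          # raises ValueError on non-numeric input
    if not 0.0 <= value <= 100.0:
        raise ValueError(f"speed out of range: {value}")
    return value

# Nominal: confirm the software does what it is supposed to do
assert parse_speed("50.0") == 50.0

# Boundary conditions: both edges of the allowable range
assert parse_speed("0.0") == 0.0
assert parse_speed("100.0") == 100.0

# Off-nominal: confirm the software rejects what it should not accept
for bad in ("-0.1", "100.1", "not-a-number"):
    try:
        parse_speed(bad)
        raise AssertionError(f"accepted invalid input: {bad}")
    except ValueError:
        pass  # expected: invalid input rejected
```

Each group of assertions above corresponds to one category from the list: nominal behavior, boundary conditions, and adverse or invalid inputs.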

When writing test procedures, be sure to use these helpful processes & practices:

  • Include very clear, understandable, detailed, step-by-step explanations of how to run each test case
  • Use templates and examples from your NASA Center, Company, or the NASA Process Asset Library (PAL)
  • Include references to any test scripts or other automated procedures, as appropriate
  • Include a signature block at appropriate points in the procedure so that Software Assurance can sign off on a formal test when it is completed
  • Include provisions to add redlines to the test procedures when they are executed so that configuration management steps are not required to make a minor change to a procedure.  The redlines become the official record and can be initialed by Software Assurance to show their concurrence on the changes.
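Test scripts like those referenced above can also automate the requirements-coverage check; a minimal sketch of such a trace check (the requirement and procedure identifiers are illustrative):

```python
def uncovered_requirements(requirements, trace):
    """Return requirements with no test case in the trace (SWE-114 item c)."""
    covered = {req for test_reqs in trace.values() for req in test_reqs}
    return sorted(set(requirements) - covered)

# Illustrative data: requirement IDs and a test-to-requirements trace
requirements = ["SRS-001", "SRS-002", "SRS-003"]
trace = {"TP-01": ["SRS-001"], "TP-02": ["SRS-001", "SRS-003"]}

print(uncovered_requirements(requirements, trace))  # SRS-002 has no test case
```

Running a check like this whenever test procedures are modified helps maintain the test-to-requirements trace called out earlier.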

Pitfalls & Issues:

  • Don't guess at how the software works; if the requirements aren't clear enough to write the test procedures, ask questions of the appropriate project team members.
  • Don't assume the tester understands the intricacies of the design of the software.  The test procedures must be easy to follow.

Best practices:

  • Identify in each test procedure all requirements being verified by that test procedure
  • Sequentially number the steps in the test procedure
  • Tailor the level of detail in each procedure step to allow:
    • Clear specification of expected results
    • Meaningful comparison of expected results to actual results
  • Include cleanup steps to leave the system in a known state at the end of the test procedure
  • Use multiple test cases for each requirement, basing the number of test cases on the criticality of the requirement.1
  • Minimize the number of test cases required by designing test cases to address several related requirements.5
  • Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together.5
  • Include a setup procedure or test case to place (or restore) the system in a known state and call that test case repeatedly rather than repeating those setup steps in each test procedure
  • Peer review test procedures checking for, at a minimum:
    • Completeness of test procedure content (see full SWE-114 text above)
    • Understandable and clear steps
    • Requirements coverage
    • Test procedure validity, applicability, adequacy, and accuracy
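The reusable setup/cleanup practice above can be sketched as a context manager that places the system in a known state and restores it afterward (the state fields are illustrative):

```python
from contextlib import contextmanager

@contextmanager
def known_state(system):
    """Reusable setup/cleanup: place the system in a known state, restore it after."""
    saved = dict(system)                        # record the state before the test
    system.update(mode="standby", faults=0)     # known starting state (illustrative)
    try:
        yield system                            # test procedure steps run here
    finally:
        system.clear()
        system.update(saved)                    # cleanup: leave the system as found

system = {"mode": "operational", "faults": 2}
with known_state(system) as s:
    assert s["mode"] == "standby"               # test runs from the known state
assert system["mode"] == "operational"          # system restored after the test
```

Calling one shared setup routine from every procedure, as recommended above, avoids duplicated setup steps and guarantees consistent cleanup.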

Additional guidance related to software test procedures may be found in the following related requirement in this handbook:

SWE-065 - Test plans, procedures, reports

SWE-066 - Perform testing

SWE-071 - Update plans & procedures

SWE-072 - Bidirectional Traceability (test procedures to software requirements)


4. Small Projects

Test procedures are needed regardless of project size. However, for small projects the following relaxations in rigor (while still in compliance with the requirement) may be appropriate:

  • Combine the Test Plan and test procedures into one document.
    • Each procedure could be a chapter or appendix. 
  • Use of redlines versus updating in configuration management to save time. 
  • If using a model based development system, use the model to generate test cases automatically when possible and appropriate to do so. 
    • Model-generated test cases typically follow a standard approach (range of allowable inputs, boundary conditions, false or invalid inputs, stress tests, etc.).
  • Streamline the signature process by having Software Assurance sign off on a set of steps, or a full procedure, versus each individual step in a test procedure.
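The standard input-space approach noted above (range of allowable inputs, boundary conditions, invalid inputs) can be sketched as a small boundary-value generator (the range and epsilon are illustrative):

```python
def boundary_cases(lo, hi, eps=0.1):
    """Generate boundary-value test inputs for a numeric range [lo, hi]."""
    valid = [lo, lo + eps, (lo + hi) / 2, hi - eps, hi]   # in range, including both edges
    invalid = [lo - eps, hi + eps]                        # just outside the range
    return valid, invalid

valid, invalid = boundary_cases(0.0, 100.0)
print(valid)    # in-range cases: edges, near-edges, midpoint
print(invalid)  # out-of-range cases just past each edge
```

Model-based tools apply the same pattern automatically; a generator like this shows the kind of test cases they typically produce.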

5. Resources

  1. Goddard Space Flight Center, "ISD Test Description Guideline", 580-GL-063-01, 2006.
  2. Marshall Space Flight Center, "Test Procedure Checklist", 2008.
  3. David Bowman's Information Management Checklist, Software Testing Procedures, 2009. Accessed May 17, 2011.
  4. Software Engineering Process Group, Glenn Research Center, "Software Test Procedures (STPr) Template" GRC-SW-TPLT-STPr, 2006.
  5. Borysowich, Craig, "Observations from a Tech Architect: Enterprise Implementation Issues & Solutions, Sample Software Testing Standards and Procedures", 2006.

5.1 Tools

Tools relevant to this SWE may be found in the table above. If no tools are listed, none have been identified for this SWE. You may wish to reference table XYZ in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.


6. Lessons Learned

The following documented lesson from the NASA Lessons Learned database emphasizes the importance of and potential issues related to software test procedures:

Importance of including known hardware characteristics: "1. Project test policy and procedures should specify actions to be taken when a failure occurs during test. When tests are aborted, or known to have had flawed procedures, they must be rerun after the test deficiencies are corrected. When test article hardware or software is changed, the test should be rerun unless there is a clear rationale for omitting the rerun. 2. All known hardware operational characteristics, including transients and spurious signals, must be reflected in the software requirements documents and verified by test." (http://www.nasa.gov/offices/oce/llis/0938.html)
