
Excerpt Include
2D-Topic Page Message
nopaneltrue

Tabsetup
1. Minimum Recommended Content
2. Rationale
3. Guidance
4. Small Projects
5. Resources
6. Lessons Learned

Return to 7.18 - Documentation Guidance

Div
idtabs-1

1. Minimum Recommended Content

Excerpt

Minimum recommended content for the Software Test Procedures Plan. 

The Test Procedure Document contains all the detailed information needed to run the test cases identified for the level of testing:

  • Detailed description of the planned test environment, including:
    • Identification of the hardware, flight components used for the test, and software needed to run the test (operating system, simulators, hardware test beds, etc.)
  • Planned schedule
  • Detailed description of each test case defined for the level of testing being performed. Each test description should include:
    • Test identifier
    • Test grouping (required verification, hazard control verification, etc.)
    • Purpose of the test (proof that a requirement is satisfied, software works properly at boundary limits, software handles faults, etc.)
    • Identification of the software version being tested
    • Description of any prerequisite conditions
    • Description of the steps performed to run the test
    • Input and output parameters
    • Expected results, including assumptions and constraints, and the criteria for evaluating results
    • Pass/fail criteria for the test
    • Any other information needed to rerun the test
  • Recommended: a traceability matrix listing the set of test cases for the level of testing, traced to the requirement satisfied or the aspect of the software verified (for example, identification of erroneous inputs)
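The record-like structure of the minimum content above lends itself to a simple data model. A minimal sketch in Python (field names are illustrative, not taken from any NASA template) showing one test case record and how a traceability matrix can be derived from a set of them:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One test case record, mirroring the minimum recommended content."""
    test_id: str                     # Test identifier
    grouping: str                    # e.g. "required verification", "hazard control verification"
    purpose: str                     # what the test is intended to prove
    software_version: str            # identification of software version being tested
    prerequisites: list = field(default_factory=list)   # prerequisite conditions
    steps: list = field(default_factory=list)           # steps performed to run the test
    expected_results: str = ""
    pass_fail_criteria: str = ""
    requirements_verified: list = field(default_factory=list)  # feeds the traceability matrix

def traceability_matrix(cases):
    """Map each requirement ID to the test cases that verify it."""
    matrix = {}
    for case in cases:
        for req in case.requirements_verified:
            matrix.setdefault(req, []).append(case.test_id)
    return matrix
```

A matrix built this way makes it easy to spot requirements with no associated test case before the procedures are baselined.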
Div
idtabs-2

2. Rationale

When testing software, it is important to capture the setup, steps, data, test cases, etc., used to verify requirements, functionality, safety, and other aspects of the software.  Test procedures capture that information and more for purposes including but not limited to:

  • Verification of defined software functionality.
  • Verification that all requirements were tested.
  • Verification of test procedure validity, applicability, adequacy, completeness, and accuracy before use.
  • Stakeholder understanding and agreement of test methods.
  • Repeatability of tests and use of tests in regression testing.
Div
idtabs-3

3. Guidance

The Software Test Procedures describe the test preparations, test configuration, test cases, and test methods to be used to perform qualification testing of a computer software configuration item (CSCI) or a software system or subsystem. The test procedures also describe the expected test results and include bidirectional traceability to the requirements or a reference to the document containing that trace. See also SWE-065 - Test Plan, Procedures, Reports; SWE-191 - Software Regression Testing; Topic 7.06 - Software Test Estimation and Testing Levels; and Topic 7.08 - Maturity of Life Cycle Products at Milestone Reviews.

3.1 Related Documents

The following documents are useful when developing test procedures:

3.2 Writing Test Procedures

When writing test procedures, remember to:

  • Include non-functional requirements, including safety, security, performance, etc.
  • Ensure all requirements are covered by the full set of test procedures.
  • Maintain the bidirectional test-to-requirements trace when modifying test procedures.
  • Include test preparations for both software and hardware:
    • Note in the test procedure any dependencies on the order in which the test procedures must be run.
    • Note or set the state of the system required to run the test procedure.
    • Note or set the status of the data values required to run the test procedure.
  • Include tests to:
    • Confirm the software does what it is supposed to do.
    • Confirm the software does not do what it should not do.
    • Confirm the software behaves in an expected manner under adverse or off-nominal conditions.
    • Confirm the software can handle faults and failures through mitigation or return to a known safe condition.
    • Cover the range of allowable inputs, boundary conditions, false or invalid inputs, load tests, stress tests, interrupt execution and processing, etc.
  • Include performance testing.
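The "all requirements are covered" check above can be automated once the test-to-requirements trace exists. A minimal sketch in Python (IDs and structure are illustrative) that finds both untested requirements and tests citing requirements that no longer exist:

```python
def coverage_gaps(requirements, trace):
    """Compare the requirement set against the test-to-requirements trace.

    `requirements` is the set of requirement IDs to be verified;
    `trace` maps each test-procedure ID to the set of requirement IDs it verifies.
    Returns (untested, unknown): requirements with no test procedure, and
    requirement IDs cited by tests that are not in the requirement set.
    """
    verified = set()
    for reqs in trace.values():
        verified |= set(reqs)
    untested = set(requirements) - verified
    unknown = verified - set(requirements)
    return untested, unknown
```

Running such a check whenever test procedures are modified helps keep the bidirectional trace current, e.g. `coverage_gaps({"SRS-1", "SRS-2", "SRS-3"}, {"TP-01": {"SRS-1"}, "TP-02": {"SRS-2", "SRS-9"}})` flags SRS-3 as untested and SRS-9 as an unknown citation.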

When writing test procedures, be sure to use these helpful processes and practices:

  • Include very clear, understandable, detailed, step-by-step explanations of how to run each test case.
  • Use templates and examples from your NASA Center, company, or the NASA Process Asset Library (PAL).
  • Include references to any test scripts or other automated procedures, as appropriate.
  • Include references to any documents describing the test configuration, if configuration is not captured in the test procedure.
  • Include a place to document expected results, not just actual results.
  • Include a signature block at appropriate points in the procedure so that Software Assurance can sign off on a formal test when it is completed.
  • Include provisions to add redlines to the test procedures when they are executed so that configuration management steps are not required to make a minor change to a procedure. The redlines become the official record and can be initialed by Software Assurance to show their concurrence on the changes.

3.3 Reusing Test Procedures

If reusing test procedures, be sure to:

  • Check that those procedures adhere to the content and helpful practice guidance above.
  • Revise those test procedures to align with testing planned for the current project.

See also SWE-071 - Update Test Plans and Procedures

3.4 Pitfalls and Issues

Here are some pitfalls and issues to avoid when writing test procedures:

  • Do not guess at how the software works. If the requirements are not clear enough to write the test procedures, ask questions of the appropriate project team members.
  • Do not assume the tester understands the intricacies of the software design. The test procedures must be easy to follow.

3.5 Best Practices

Some other best practices to consider:

  • Identify in each test procedure all requirements being verified by that test procedure.
  • Establish bidirectional trace early, and maintain it throughout the test life cycle.
  • Sequentially number the steps in the test procedure.
  • Tailor the level of detail in each procedure step to allow:
    • Clear specification of expected results.
    • Meaningful comparison of expected results to actual results.
  • Include cleanup steps to leave the system in a known state at the end of the test procedure.
  • Use multiple test cases for each requirement, basing the number of test cases on the criticality of the requirement.
  • Design test cases to address several related requirements. (SWEREF-140)
  • Arrange test cases in the order that minimizes the effort required for test setup and keeps related functions together. (SWEREF-140)
  • Include a setup procedure or test case to place (or restore) the system in a known state, and call that test case repeatedly rather than repeating those setup steps in each test procedure.
  • Peer review test procedures, checking for, at a minimum:
    • Completeness of test procedure content.
    • Understandable and clear steps.
    • Requirements coverage.
    • Test procedure validity, applicability, completeness, adequacy, and accuracy.
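Several of the practices above (sequentially numbered steps, a reusable known-state setup, cleanup at the end, and expected-versus-actual comparison) can be sketched together. A minimal illustration in Python, with the system modeled as a plain dictionary and all names invented for the example:

```python
def restore_known_state(system):
    """Shared setup/cleanup: place (or restore) the system in a known state."""
    system.clear()
    system.update(mode="SAFE", faults=0)

def run_procedure(system, numbered_steps):
    """Run sequentially numbered steps against the system.

    `numbered_steps` is a list of (action, expected_result) pairs; the shared
    setup is called before the steps, and again afterward as a cleanup step
    so the system is left in a known state.
    """
    restore_known_state(system)
    results = []
    for number, (action, expected) in enumerate(numbered_steps, start=1):
        actual = action(system)
        # record expected and actual side by side for meaningful comparison
        results.append((number, expected, actual, actual == expected))
    restore_known_state(system)
    return results
```

Calling the shared setup from every procedure, rather than copying its steps, means a change to the known state needs to be made in only one place.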

3.6 Additional Guidance

Links to Additional Guidance materials for this subject have been compiled in the Relevant Links table. See Additional Guidance in the Resources tab.

Div
idtabs-4

4. Small Projects

Test procedures are needed regardless of project size. However, in situations involving small projects, the following relaxations in rigor (but still in compliance with the recommended content) may be appropriate:

  • Combine the Test Plan and test procedures into one document.
    • The Test Plan is developed before the test procedures, but each test procedure could be added later to the Test Plan as a chapter or appendix.
  • Use redlines during the review and approval process, rather than updates in configuration management, to save time.
  • If using a model-based development system, use the model to generate test cases automatically when possible and appropriate to do so.
    • Model-generated test cases typically follow a standard approach (range of allowable inputs, boundary conditions, false or invalid inputs, stress tests, etc.).
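The standard boundary-value pattern that model-generated test cases typically follow can be sketched directly. A minimal illustration in Python (the function name and value selection are illustrative, not from any particular model-based tool), generating inputs just below, at, and just above each boundary of an allowed range, plus a nominal midpoint:

```python
def boundary_test_inputs(low, high, step=1):
    """Generate boundary-value test inputs for an allowed integer range [low, high].

    Values outside [low, high] are the invalid inputs the software is
    expected to reject; the rest exercise the boundaries and a nominal case.
    """
    nominal = (low + high) // 2
    cases = [low - step, low, low + step, nominal, high - step, high, high + step]
    # de-duplicate while preserving order (narrow ranges collapse some values)
    seen, ordered = set(), []
    for value in cases:
        if value not in seen:
            seen.add(value)
            ordered.append(value)
    return ordered
```

For an allowed range of 0 to 10, this yields -1, 0, 1, 5, 9, 10, and 11: two invalid inputs, the boundaries and their neighbors, and a nominal value.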
Div
idtabs-5

5. Resources

5.1 References

refstable-topic



5.2 Tools

Include Page
Tools Table Statement

5.3 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

Related Links
Include Page
5.14 - Related SWEs


Include Page
5.14 - Related SM

5.4 Center Process Asset Libraries

Excerpt Include
SITE:SPAN
nopaneltrue

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

Include Page
SITE:SPAN verification and validation




5.5 Related Activities

This Topic is related to the following Life Cycle Activities:

Related Links

Include Page
5.14 - Related Activities

Div
idtabs-6

6. Lessons Learned

6.1 NASA Lessons Learned

  • Probable Scenario for Mars Polar Lander Mission Loss (1998) (importance of including known hardware characteristics), Lesson Number 0938 (SWEREF-529): "1. Project test policy and procedures should specify actions to be taken when a failure occurs during test. When tests are aborted, or known to have had flawed procedures, they must be rerun after the test deficiencies are corrected. When test article hardware or software is changed, the test should be rerun unless there is a clear rationale for omitting the rerun. 2. All known hardware operational characteristics, including transients and spurious signals, must be reflected in the software requirements documents and verified by test."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.