1. Minimum Recommended Content
Minimum recommended content for the Software Test Procedures (a sketch of one such test description record follows this list):
Test preparations, including hardware and software.
Test descriptions, including:
(1) Test identifier.
(2) System or CSCI requirements addressed by the test case.
(3) Prerequisite conditions.
(4) Test input.
(5) Instructions for conducting the procedure.
(6) Expected test results, including assumptions and constraints.
(7) Criteria for evaluating results.
Requirements traceability.
Identification of test configuration.
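For illustration, the seven-item test description above might be captured as a structured record. The following is a minimal sketch; the field names, identifiers, and values are hypothetical, not mandated content:

```python
from dataclasses import dataclass

@dataclass
class TestDescription:
    """One test case entry in a Software Test Procedure (illustrative only)."""
    test_id: str                 # (1) Test identifier
    requirements: list[str]      # (2) System/CSCI requirements addressed
    prerequisites: list[str]     # (3) Prerequisite conditions
    inputs: dict[str, object]    # (4) Test input
    steps: list[str]             # (5) Instructions for conducting the procedure
    expected_results: str        # (6) Expected results, with assumptions/constraints
    evaluation_criteria: str     # (7) Criteria for evaluating results

# Hypothetical example entry.
example = TestDescription(
    test_id="TP-042",
    requirements=["SRS-311", "SRS-312"],
    prerequisites=["Simulator powered on", "Telemetry link established"],
    inputs={"commanded_rate_dps": 5.0},
    steps=[
        "1. Load command sequence CS-7.",
        "2. Issue slew command at 5.0 deg/s.",
        "3. Record actual slew rate from telemetry.",
    ],
    expected_results="Slew rate settles at 5.0 +/- 0.1 deg/s within 10 s.",
    evaluation_criteria="Pass if telemetry value is within tolerance; otherwise fail.",
)
```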
2. Rationale
When testing software, it is important to capture the setup, steps, data, test cases, etc., used to verify requirements, functionality, safety, and other aspects of the software. Test procedures capture that information and more for purposes including but not limited to:
Verification of defined software functionality.
Verification that all requirements were tested.
Verification of test procedure validity, applicability, adequacy, completeness, and accuracy before use.
Stakeholder understanding and agreement of test methods.
Repeatability of tests and use of tests in regression testing.
Div
id
tabs-3
3. Guidance
The Software Test Procedures describe the test preparations, test configuration, test cases, and test methods to be used to perform qualification testing of a computer software configuration item (CSCI) or a software system or subsystem. The test procedures also describe the expected test results and include bidirectional traceability to the requirements or a reference to the document containing that trace.
Keep the following guidance in mind when developing test procedures:
Include non-functional requirements, including safety, security, performance, etc.
Ensure all requirements are covered by the full set of test procedures.
Maintain the bidirectional test-to-requirements trace when modifying test procedures.
Include test preparations for both software and hardware:
Noting in the test procedure any dependencies on the order in which test procedures must be run.
Noting or setting the state of the system to that required to run the test procedure.
Noting or setting the status of data values required to run the test procedure.
Include tests to:
Confirm the software does what it is supposed to do.
Confirm the software does not do what it should not do.
Confirm the software behaves in an expected manner under adverse or off-nominal conditions.
Cover the range of allowable inputs, boundary conditions, false or invalid inputs, load tests, stress tests, interrupted execution and processing, etc. (see the sketch following this list).
Include performance testing.
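To make boundary-condition and invalid-input coverage concrete, here is a minimal pytest sketch. The unit under test, `set_heater_power`, its 0-100 range, and its error behavior are assumptions invented for illustration:

```python
import pytest

def set_heater_power(percent: float) -> float:
    """Hypothetical unit under test: accepts 0-100, rejects anything else."""
    if not 0.0 <= percent <= 100.0:
        raise ValueError(f"power out of range: {percent}")
    return percent

# Nominal and boundary values: the allowable range plus its edges.
@pytest.mark.parametrize("power", [0.0, 50.0, 100.0])
def test_accepts_nominal_and_boundary(power):
    assert set_heater_power(power) == power

# Invalid inputs: confirm the software does not do what it should not do.
@pytest.mark.parametrize("power", [-0.1, 100.1, float("nan")])
def test_rejects_invalid(power):
    with pytest.raises(ValueError):
        set_heater_power(power)
```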
When writing test procedures, be sure to use these helpful processes and practices:
Include very clear, understandable, detailed, step-by-step explanations of how to run each test case.
Use templates and examples from your NASA Center, company, or the NASA Process Asset Library (PAL).
Include references to any test scripts or other automated procedures, as appropriate.
Include references to any documents describing the test configuration, if configuration is not captured in the test procedure.
Include a place to record expected results, not just actual results (a sketch of such a record follows this list).
Include a signature block at appropriate points in the procedure so that Software Assurance can sign off on a formal test when it is completed.
Include provisions to redline the test procedures during execution so that configuration management steps are not required for minor changes. The redlines become the official record and can be initialed by Software Assurance to show concurrence with the changes.
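As a minimal sketch of a step record that holds expected results alongside actual results and a Software Assurance sign-off field, consider the following; the structure and names are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class StepRecord:
    """One executed step of a test procedure (illustrative sketch)."""
    step_no: int
    action: str
    expected: str          # recorded before the run, per the guidance above
    actual: str = ""       # filled in during execution
    sa_initials: str = ""  # Software Assurance sign-off, where required

    def passed(self) -> bool:
        # A real procedure would define pass/fail criteria per step;
        # plain string equality stands in for that comparison here.
        return self.actual == self.expected

step = StepRecord(1, "Issue slew command at 5.0 deg/s",
                  expected="Slew rate 5.0 +/- 0.1 deg/s")
step.actual = "Slew rate 5.0 +/- 0.1 deg/s"
assert step.passed()
```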
If reusing test procedures, be sure to:
Check that those procedures adhere to the content and helpful practice guidance above.
Revise those test procedures to align with testing planned for the current project.
Pitfalls and issues:
Do not guess at how the software works. If the requirements are not clear enough to write the test procedures, ask questions of the appropriate project team members.
Do not assume the tester understands the intricacies of the software design. The test procedures must be easy to follow.
Best practices:
Identify in each test procedure all requirements being verified by that test procedure.
Establish the bidirectional trace early, and maintain it throughout the test life cycle (an automated coverage check is sketched after this list).
Sequentially number the steps in the test procedure.
Tailor the level of detail in each procedure step to allow:
Clear specification of expected results.
Meaningful comparison of expected results to actual results.
Include cleanup steps to leave the system in a known state at the end of the test procedure.
Use multiple test cases for each requirement, basing the number of test cases on the criticality of the requirement.
Design test cases to address several related requirements (SWEREF-140).
Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together (SWEREF-140).
Include a setup procedure or test case to place (or restore) the system in a known state, and call that test case repeatedly rather than repeating those setup steps in each test procedure.
Peer review test procedures, checking for, at a minimum:
Completeness of test procedure content.
Understandable and clear steps.
Requirements coverage.
Test procedure validity, applicability, completeness, adequacy, and accuracy.
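One way to keep the bidirectional trace current is an automated coverage check run whenever procedures change. The sketch below uses hypothetical test and requirement identifiers:

```python
# Hypothetical trace data: which requirements each test procedure verifies.
tests_to_reqs = {
    "TP-041": ["SRS-310"],
    "TP-042": ["SRS-311", "SRS-312"],
}
all_requirements = {"SRS-310", "SRS-311", "SRS-312", "SRS-313"}

traced = {r for reqs in tests_to_reqs.values() for r in reqs}

# Forward trace: every test cites only known requirements.
unknown = traced - all_requirements
# Backward trace: every requirement is covered by at least one test.
uncovered = all_requirements - traced

if unknown:
    print("Tests cite unknown requirements:", sorted(unknown))
if uncovered:
    print("Requirements with no test coverage:", sorted(uncovered))  # SRS-313 here
```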
Additional guidance related to software test procedures may be found in related requirements in this Handbook.
4. Small Projects
Test procedures are needed regardless of project size. However, for small projects, the following relaxations in rigor (while still complying with the recommended content) may be appropriate:
Combine the Test Plan and test procedures into one document.
The Test Plan is developed before the test procedures, but each test procedure could be added later to the Test Plan as a chapter or appendix.
Use redlines during the review and approval process, rather than updates through configuration management, to save time.
If using a model-based development system, use the model to generate test cases automatically when possible and appropriate to do so.
Model-generated test cases typically follow a standard approach (range of allowable inputs, boundary conditions, false or invalid inputs, stress tests, etc.); a sketch of range-driven test-point enumeration follows.
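Whatever the modeling tool, generated cases tend to reduce to enumerating standard test points over each input's allowable range. The sketch below assumes a simple (min, max) range description rather than any particular model-based system:

```python
def boundary_points(lo: float, hi: float, eps: float = 1e-3) -> dict[str, list[float]]:
    """Enumerate standard test points for a [lo, hi] input range."""
    return {
        "nominal": [(lo + hi) / 2.0],   # a representative in-range value
        "boundary": [lo, hi],           # the edges of the allowable range
        "invalid": [lo - eps, hi + eps] # just outside the range, must be rejected
    }

print(boundary_points(0.0, 100.0))
# {'nominal': [50.0], 'boundary': [0.0, 100.0], 'invalid': [-0.001, 100.001]}
```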
5. Resources
5.1 References
References cited in this topic: SWEREF-140, SWEREF-529.
Additional germane references: SWEREF-074, SWEREF-110, SWEREF-215.
5.2 Tools
See the Tools Table in this Handbook for tools that support this topic.
6. Lessons Learned
6.1 NASA Lessons Learned
Probable Scenario for Mars Polar Lander Mission Loss (1998) (importance of including known hardware characteristics), Lesson Number 0938 (SWEREF-529): "1. Project test policy and procedures should specify actions to be taken when a failure occurs during test. When tests are aborted, or known to have had flawed procedures, they must be rerun after the test deficiencies are corrected. When test article hardware or software is changed, the test should be rerun unless there is a clear rationale for omitting the rerun. 2. All known hardware operational characteristics, including transients and spurious signals, must be reflected in the software requirements documents and verified by test."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.