
Contents
1. The Requirement
2. Rationale
3. Guidance
4. Small Projects
5. Resources
6. Lessons Learned


1. Requirements

5.1.3.1 The Software Test Plan shall include: [SWE-104]

a.    Test levels (separate test effort that has its own documentation and resources, e.g., component, integration, and system testing).
b.    Test types:
       (1)  Unit testing.
       (2)  Software integration testing.
       (3)  Systems integration testing.
       (4)  End-to-end testing.
       (5)  Acceptance testing.
       (6)  Regression testing.
c.    Test classes (designated grouping of test cases).
d.    General test conditions.
e.    Test progression.
f.     Data recording, reduction, and analysis.
g.    Test coverage (breadth and depth) or other methods for ensuring sufficiency of testing.
h.    Planned tests, including items and their identifiers.
i.      Test schedules.
j.      Requirements traceability (or verification matrix).
k.    Qualification testing environment, site, personnel, and participating organizations.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C-E, Safety Critical, are labeled with "P (Center) + SO."  "P (Center)" means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement. "SO" means that the requirement applies only for safety critical portions of the software.


Class                  A   B   C      D      E      F   G      H
Not Safety Critical    1   1   P(C)   P(C)   0      1   P(C)   0
Safety Critical        1   1   *      *      *

Key: 1 = requirement applies; 0 = requirement does not apply; P(C) = "P (Center)"; * = "P (Center) + SO" (see above). Classes F, G, and H are not assigned a separate safety-critical designation.




2. Rationale

Planning the software testing activity allows thorough deliberation of tasks, methods, environments, and related criteria before they are implemented. Planning also allows the project team to improve on previous projects by applying lessons learned, including using more appropriate or efficient techniques and ensuring that steps previously missed or omitted are performed.

As with any task, having a plan in place ensures that all necessary and required tasks are performed. Development of that plan provides the opportunity for stakeholders to give input and assist with the documentation and tailoring of the planned testing activities to ensure the outcome will meet the expectations and goals of the project.



Ensuring the Software Test Plan follows a template and includes specified information ensures consistency of test plans across projects, ensures proper planning occurs, and prevents repeating problems of the past.




3. Guidance

The Software Test Plan may be standalone or part of the Software Management Plan. 


Note: Follow Center policies and procedures when determining which approach to use for a particular project.



NPR 7150.2, section 5.1.3, states: "The Software Test Plan describes the plans for software component level testing, software integration testing, software qualification testing, and system qualification testing of software systems. The plan describes the software test environment, development, and test activities. The plan provides an overview of software testing, test schedules, and test management procedures."


The Software Test Plan may be tailored by software classification. Goddard Space Flight Center's (GSFC's) 580-STD-077-01, Requirements for Minimum Contents of Software Documents [090], provides one suggestion for tailoring a Software Test Plan based on the required contents and the classification of the software being tested.

As shown in the SWE-104 requirement text above, the Software Test Plan is required to contain specific information. Below are suggestions for the type of information to consider for the named required contents. Note that if this information is added to the Software Test Plan in phases, IEEE Std 1059-1993, IEEE Guide for Software Verification and Validation Plans [211], notes that: "The requirements phase produces the ...acceptance and system test plans. The design phase produces the ... component and integration test plans."

Test levels

Specific types of testing may have their own plans, resources, tools, schedules, and other information particular to the testing being performed. Where plans exist to document this information, they need not be duplicated in the Software Test Plan, but they need to be identified and referenced to provide an overall view of the project's testing efforts. This section of the plan also describes the levels at which testing will be performed: configuration item level, system level, etc.

Unit testing



The intent of unit testing is to confirm that a unit performs the capability assigned to it, correctly interfaces with other units and data, and represents a faithful implementation of the unit design. [452]


In accordance with IEEE Std 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology [222], a unit is defined as:

(1) A separately testable element specified in the design of a computer software component. 
(2) A logically separable part of a computer program.
(3) A software component that is not subdivided into other components.

The Software Test Plan includes an overview of the unit testing process to be used for this project. The section(s) of the Software Test Plan focused on unit testing addresses the following (a brief illustrative sketch follows this list):

  • Use of test cases.
  • Type of testing to be used: path testing, analytical testing, user interface testing.
  • Exercising functions in the unit.
  • Testing all paths in the unit. (This is a goal, although it may not always be possible.)
  • Testing boundary conditions and error situations.
  • Assessment of timing, sizing, accuracy.
  • Testing of safety features.
    • NASA-GB-8719.13, NASA Software Safety Guidebook [276], recommends formal unit test plans and procedures for safety-critical units.
    • Unit tests for non-critical units may be informally documented, e.g., in laboratory notebooks.
  • Tools to be used.
  • Testing assignment: Who will carry out the testing? (Typically, the developer.)
  • Testing iteration: testing until success criteria are achieved (define objective criteria).
  • Documenting and correcting issues.
    • Typically, during unit testing, issues are corrected and the unit retested.
    • Typically, issues related to requirements or design follow the relevant process for correcting those issues.
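
As a concrete illustration, the sketch below shows a unit test that exercises boundary conditions and an error situation. It is a minimal sketch only: the function under test (clamp_duty_cycle) is hypothetical, invented here for illustration, and the test uses the Python standard library's unittest framework.

    import unittest

    def clamp_duty_cycle(value):
        """Hypothetical unit: clamp a commanded duty cycle to [0.0, 1.0]."""
        if not isinstance(value, (int, float)):
            raise TypeError("duty cycle must be numeric")
        return max(0.0, min(1.0, float(value)))

    class TestClampDutyCycle(unittest.TestCase):
        def test_boundary_conditions(self):
            # Exactly at, just inside, and outside the boundaries.
            self.assertEqual(clamp_duty_cycle(0.0), 0.0)
            self.assertEqual(clamp_duty_cycle(1.0), 1.0)
            self.assertEqual(clamp_duty_cycle(-0.1), 0.0)
            self.assertEqual(clamp_duty_cycle(1.1), 1.0)

        def test_error_situation(self):
            # Invalid input must raise rather than pass through silently.
            with self.assertRaises(TypeError):
                clamp_duty_cycle("fast")

    if __name__ == "__main__":
        unittest.main()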

Software integration testing



According to NASA-GB-8719.13 [276], "Integration testing deals with how the software components will be incorporated into the final software system and what will be tested at each integration step."


The Software Test Plan includes an overview of the software integration testing process to be used for this project. The section(s) of the Software Test Plan focused on software integration testing addresses the following (see the sketch after this list):

  • Testing environment.
  • Integration sequence.
  • Testing of safety features, including confirmation that non-critical components cannot influence critical components.
  • Interactions among the units.
  • Assessment of timing, sizing, accuracy.
  • Performance at boundaries and under stress conditions.
  • Use of automated tools, when available, for analysis of the results.
  • Repetition of integration testing until success criteria are achieved (define objective criteria).
  • Use of independent personnel to perform testing.
  • Identification of test harnesses or drivers required and whether they exist or need to be developed.
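
A minimal sketch of one such arrangement appears below: two hypothetical units (a ThermalMonitor and the heater interface it commands) are exercised together, with a stub from the standard library's unittest.mock standing in for a driver that has not yet been integrated. All names are invented for illustration.

    import unittest
    from unittest.mock import Mock

    class ThermalMonitor:
        """Hypothetical unit: commands a heater through an injected interface."""
        def __init__(self, heater, setpoint=20.0):
            self.heater = heater
            self.setpoint = setpoint

        def update(self, temperature):
            # Interaction under test: the monitor drives the heater interface.
            if temperature < self.setpoint:
                self.heater.turn_on()
            else:
                self.heater.turn_off()

    class TestThermalIntegration(unittest.TestCase):
        def test_monitor_commands_heater(self):
            heater_stub = Mock()  # stands in for the real heater driver
            monitor = ThermalMonitor(heater_stub, setpoint=20.0)
            monitor.update(15.0)  # below setpoint
            heater_stub.turn_on.assert_called_once()
            monitor.update(25.0)  # above setpoint
            heater_stub.turn_off.assert_called_once()

    if __name__ == "__main__":
        unittest.main()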

Systems integration testing

The Software Test Plan includes an overview of the testing process used to test software integrated into the larger system. The section(s) of the software test plan focused on systems integration testing addresses the following:

  • Stress, load, disaster, stability testing.
  • Boundary testing (data, interfaces).
  • Functional testing in a complete system environment.
  • Testing of safety features.
  • Pass/fail criteria for each test.
  • Use of independent personnel to perform testing.

End-to-end testing

The Software Test Plan includes an overview of the process used for end-to-end testing.  The section(s) of the Software Test Plan focused on end-to-end testing addresses the following:

  • Stress, load, disaster, stability testing.
  • Functional testing.
  • Testing of safety features.
  • Testing all paths in the system. (This is a goal, although it may not always be possible.)
  • Exercising all branches in the system.
  • Execution of each statement at least once.
  • Testing of boundary conditions for all inputs, as well as nominal and invalid input values.
  • Objective pass/fail criteria for each test.
  • Any special procedures, constraints, dependencies for implementing and running tests.
  • Use of independent personnel to perform testing.

Acceptance testing

Acceptance testing needs to be conducted after the appropriate readiness review has been successfully completed. This type of testing is the customer acceptance test, and the Software Test Plan includes an overview of the process used. The section(s) of the Software Test Plan focused on acceptance testing includes the following:

  • Plans to assure the customer that the system is safe.
  • Plans to confirm that software correctly implements system and software requirements in an operational environment.
  • Use of independent personnel to perform testing.
  • Identification of criteria for stopping testing, e.g., fraction of requirements covered, number of errors remaining, reliability goals.
  • Provisions for witnessing of tests.

Regression testing

Regression tests are used to ensure that changes made to software have had no unintended side effects. The section(s) of the Software Test Plan focused on regression testing addresses the following (a selection sketch follows this list):

  • Exercising the maximum number of critical functions.
  • Selection of the subset of the total number of tests (from the original test suites) to be used as the regression test set, e.g., key 10 percent.
  • Use of independent personnel to perform testing.
  • Use of automated test tools.
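
As one way to make the subset selection concrete, the sketch below ranks a hypothetical test inventory by an assumed criticality score and keeps roughly the top 10 percent. The identifiers, scores, and selection rule are all illustrative assumptions, not a prescribed method.

    # Hypothetical inventory: test identifier -> criticality/coverage score.
    full_suite = {
        "UT-001": 0.95,
        "UT-002": 0.40,
        "IT-001": 0.90,
        "ST-001": 0.70,
        "E2E-001": 0.99,
    }

    fraction = 0.10  # target subset size; always keep at least one test
    count = max(1, round(len(full_suite) * fraction))
    regression_set = sorted(full_suite, key=full_suite.get, reverse=True)[:count]
    print(regression_set)  # e.g., ['E2E-001']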

Test classes (designated grouping of test cases)

Test classes describe the grouping of tests that will be performed throughout the various types of testing. When developing the Software Test Plan, consider:

  • Timing tests.
  • Erroneous input tests.
  • Maximum capacity tests.
  • Other factors.

General test conditions

General test conditions are conditions that apply to all of the planned tests or to a specific group of tests. When documenting general test conditions, consider statements and conditions, such as these taken from Langley Research Center's NPR 7150.2 Class A Required Testing Documents With Embedded Guidance [083]:

  • "Each test should include nominal, maximum, and minimum values."
  • "Each test of type X should use live data."
  • "Execution size and time should be measured for each software item."
  • Extent of testing to be performed, e.g., percent of some defined total, and the associated rationale.

Test progression

Test progression addresses the sequence or ordering of tests. The Software Test Plan describes dependencies among tests that require that tests be performed in a particular order. 
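
When such dependencies are recorded explicitly, a valid execution order can be derived mechanically. Below is a minimal sketch using graphlib from the Python standard library (3.9+); the test identifiers and dependencies are hypothetical.

    from graphlib import TopologicalSorter

    # Each key may run only after all tests listed as its predecessors.
    dependencies = {
        "UT-001": [],                    # unit tests have no prerequisites
        "UT-002": [],
        "IT-001": ["UT-001", "UT-002"],  # integration after its units
        "ST-001": ["IT-001"],            # system test after integration
        "E2E-001": ["ST-001"],           # end-to-end test last
    }

    order = list(TopologicalSorter(dependencies).static_order())
    print(order)  # e.g., ['UT-001', 'UT-002', 'IT-001', 'ST-001', 'E2E-001']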

Data recording, reduction, and analysis

This section of the Software Test Plan describes processes and procedures for capturing and evaluating/analyzing results and issues found during all types of testing. Consider "manual, automatic, and semi-automatic techniques for recording test results, manipulating the raw results into a form suitable for evaluation, and retaining the results of data reduction and analysis." [401] For specific suggestions for carrying out these activities, see SWE-068 and SWE-069 in this Handbook.

NASA-GB-8719.13 [276] recommends test reports be created for unit tests of safety-critical items.

Test coverage (breadth and depth) or other methods for ensuring sufficiency of testing

If not addressed elsewhere in the Software Test Plan, a description of the methods to be used for ensuring sufficient test coverage is provided. Methods could include the following (a measurement sketch follows the list):

  • Reviews/inspections.
  • Automated tools.
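
As one automated-tool example, the sketch below gathers statement coverage while running a discovered unittest suite. It assumes the third-party coverage.py package (pip install coverage) and a hypothetical tests/ directory; this is an illustration, not the only acceptable method.

    import unittest
    import coverage  # third-party: coverage.py

    cov = coverage.Coverage()
    cov.start()

    # Discover and run the project's unittest suite (assumed to live in tests/).
    suite = unittest.TestLoader().discover("tests")
    unittest.TextTestRunner().run(suite)

    cov.stop()
    cov.save()
    cov.report()  # prints per-file statement coverage percentages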

Planned tests, including items and their identifiers

If not already included in sections of the plan focused on specific types of testing (unit, integration, etc.), all planned tests, test cases, data sets, etc., that will be used for the project need to be identified in the Software Test Plan, along with the software items they will be used to test. Each item needs to have its own unique identifier to ensure proper execution and tracking of the planned tests. Consider capturing the following information for each test (a sketch of such a record follows this list):

  • Objective.
  • Test level.
  • Test type.
  • Test class.
  • Requirements addressed.
  • Software item(s) tested.
  • Type of data to be recorded.
  • Assumptions, constraints, limitations (timing, interfaces, personnel, etc.).
  • Safety, security, privacy considerations. [083]
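
A minimal sketch of such a record is shown below as a Python dataclass; the field set mirrors the list above, and the example values are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class PlannedTest:
        test_id: str                       # unique identifier
        objective: str
        level: str                         # e.g., unit, integration, system
        test_type: str
        test_class: str                    # e.g., timing, erroneous input
        requirements: list = field(default_factory=list)
        items_tested: list = field(default_factory=list)
        data_recorded: str = ""
        constraints: str = ""
        safety_considerations: str = ""

    example = PlannedTest(
        test_id="IT-042",
        objective="Verify command handling at maximum message rate",
        level="integration",
        test_type="software integration",
        test_class="maximum capacity",
        requirements=["SRS-123", "SRS-130"],
        items_tested=["cmd_dispatcher"],
        data_recorded="message latency, drop count",
    )
    print(example.test_id, example.requirements)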

Test schedules

NASA-GB-8719.13 [276] notes that: "Scheduling testing phases is always an art, and depends on the expected quality of the software product... Previous history, either of the development team or similar projects, can help determine how long testing will take. Some methods (such as error seeding and Halstead's defect metric) exist for estimating defect density (number of defects per unit of code) when historical information is not available."

Information to consider for test schedules includes:

  • List or chart showing time frames for testing at all test sites.
  • Schedule of test activities for each test site, including on-site test setup, on-site testing, data collection, retesting, etc.
  • Milestones relevant to the development life cycle.

Requirements traceability (or verification matrix)

One of the key purposes of testing is to confirm that the requirements for the project have been met. The Software Test Plan includes a traceability matrix (see SWE-072) that maps tests to software requirements. If the matrix already exists, the test plan includes a reference to the matrix.
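
As a minimal illustration of how such a matrix supports verification planning, the sketch below maps hypothetical requirement identifiers to the tests that cover them and reports any requirement with no planned test.

    # Hypothetical requirement and test identifiers, for illustration only.
    requirements = ["SRS-101", "SRS-102", "SRS-103"]

    trace_matrix = {
        "SRS-101": ["UT-001", "E2E-001"],
        "SRS-102": ["IT-001"],
        # SRS-103 deliberately absent to show the gap report below.
    }

    uncovered = [req for req in requirements if not trace_matrix.get(req)]
    if uncovered:
        print("Requirements with no planned test:", uncovered)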

Qualification testing environment, site, personnel, and participating organizations

This section or topic in the Software Test Plan lists items such as these taken from NPR 7150.2A Class A Required Testing Documents With Embedded Guidance [083]:

  • Sites where testing will occur, identified by name.
  • Software and version necessary to perform the planned testing activities at each site, for example:
    • Compilers, operating systems, communications software.
    • Test drivers, test data generators, test control software.
    • Input files, databases, path analyzers.
    • Other.
  • Hardware and firmware, including versions, that will be used in the software test environment at each site.
  • Manuals, media, licenses, instructions, etc., required to set up and perform the planned tests.
  • Items to be supplied by the site and those items that will be delivered to the test site.
  • Organizations participating in the tests at each site and their roles and responsibilities.
  • Number, type, skill level of personnel required to carry out testing at each site.
  • Training and/or orientation required for testing personnel at each site.
  • Tests to be performed at each site.

In addition to the information required above, test plans address the following information for all types of testing:

  • Resources (personnel, tools, equipment, facilities, etc.).
  • Risks that require contingency planning.
  • What is to be tested and what is not to be tested.
  • Test completion criteria.

If not identified elsewhere, the Software Test Plan identifies the metrics to be collected for each type of testing.  Suggested metrics include:

  • Number of units tested.
  • Hours spent.
  • Number of defects found.
  • Average defects found per line of code.
  • Measures of test coverage, software reliability and maintainability.
  • Other.

Consider the following best practices for Software Test Plans:

  • Begin development of the Software Test Plan(s) early.
    • Beginning as soon as the relevant life cycle stage has been completed:
      • Helps identify confusing or unclear requirements.
      • Helps identify un-testable design features before implementation.
    • Allows for acquisition/allocation of test resources.
  • Involve the right people in the plan development (quality engineers, software engineers, systems engineers, etc.).

Use the right Sources of Information (first column in table below) as appropriate for the project and for each type of testing, such as:



Source of Information                           | Unit* | SW Int* | Sys Int* | E2E* | Accept* | Regress*
------------------------------------------------|-------|---------|----------|------|---------|---------
Software Requirements Specification (SRS)       |   X   |         |    X     |  X   |    X    |    X
Software Design Description (SDD)               |   X   |    X    |          |      |         |
Design traceability                             |   X   |    X    |          |      |         |
Interface documents                             |   X   |    X    |    X     |  X   |    X    |    X
Draft user documentation                        |   X   |         |          |      |         |
Code coverage analyzer specifications           |   X   |         |          |      |         |
Criticality analysis                            |       |    X    |          |      |         |
Draft operating documents                       |       |    X    |    X     |  X   |         |
Draft maintenance documents                     |       |    X    |          |      |         |
Final operating documents                       |       |         |          |      |    X    |
Final user documentation                        |       |         |          |      |    X    |
Concept documents                               |       |         |    X     |  X   |         |
Requirements traceability                       |       |         |    X     |  X   |    X    |    X
Expected customer usage patterns and conditions |       |         |    X     |  X   |    X    |    X

Column key: Unit = Unit Test; SW Int = Software Integration Test; Sys Int = Systems Integration Test; E2E = End-to-End Test; Accept = Acceptance Test; Regress = Regression Test.

*May be a separate test plan referenced in the overall Software Test Plan or part of the overall Software Test Plan.

  • Have the Software Test Plan reviewed/inspected before use (SWE-087).
    • Include Software Assurance and Software Safety personnel to verify safety-critical coverage.
  • Have changes to the Software Test Plan evaluated for their effect on system safety.
  • Keep the Software Test Plan maintained (up to date) and under configuration control.
  • Identify early and focus testing on the components most likely to have issues (high risk, complex, many interfaces, demanding timing constraints, etc.).
    • May require some level of analysis to determine optimal test coverage. (See NASA-GB-8719.13 [276] for a list of automated test coverage analysis tools.)
  • Plan to use independent testing, e.g., fellow programmers, separate test group, separate test organization, NASA Independent Verification & Validation, where possible and cost-effective, since new perspectives can turn up issues that authors might not see.
  • Include coverage of user documentation, e.g., training materials, procedures.


Note: Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to Software Test Plans and their contents.


Additional guidance related to software testing may be found in the following related requirements in this Handbook:


  • SWE-015, Cost Estimation
  • SWE-065, Test Plans, Procedures, Reports
  • SWE-066, Perform Testing
  • SWE-067, Verify Implementation
  • SWE-068, Evaluate Test Results
  • SWE-069, Document Defects and Track
  • SWE-114, Software Test Procedures
  • SWE-118, Software Test Report





4. Small Projects

Software Test Plans are necessary for all software projects, but for projects with small budgets or small teams, starting with an existing test plan from a project of a similar type and size could help reduce the time and effort required to produce a test plan for a new project. Working with someone experienced in writing test plans, perhaps from another project and on a short-term basis, could help the project team prepare the document in a timely fashion without overburdening team resources. Where applicable, the test plan could reference other project documents rather than reproduce their contents, avoiding duplication of effort and reducing maintenance activities.

Since the Software Test Plan may be standalone or part of the Software Management Plan, incorporating the test plan into a larger project document may be useful for document tracking, review, etc. 


Note: Follow Center policies and procedures when determining which approach to use for a particular project.


The Software Test Plan may be tailored by software classification. 580-STD-077-01 [090] provides one suggestion for tailoring a Software Test Plan based on the required contents and the classification of the software being tested. This tailoring could reduce the size of the Software Test Plan and, therefore, the time and effort to produce and maintain it.



5. Resources





6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to software test planning:

  • MPL Uplink Loss Timer Software/Test Errors (1998) (Plan for testing of high-risk sequences). Lesson Number 0939: "Recognize that the transition to another mission phase (e.g. from Entry, Descent, and Landing (EDL) to the landed phase) is a high risk sequence. Devote extra effort to planning and performing tests of these transitions." [530]
  • MPL Uplink Loss Timer Software/Test Errors (1998) (Plan to test against full range of parameters). Lesson Number 0939: "Unit and integration testing should, at a minimum, test against the full operational range of parameters. When changes are made to database parameters that affect logic decisions, the logic should be re-tested." [530]
  • Deep Space 2 Telecom Hardware-Software Interaction (1999) (Plan to test as you fly). Lesson Number 1197: "To fully validate performance, test integrated software and hardware over the flight operational temperature range." [545]
  • International Space Station (ISS) Program/Computer Hardware-Software/Software (Plan realistic but flexible schedules). Lesson Number 1062: "NASA should realistically reevaluate the achievable ... software development and test schedule and be willing to delay ... deployment if necessary rather than potentially sacrificing safety." [536]
  • Thrusters Fired on Launch Pad (1975) (Plan for safe exercise of command sequences). Lesson Number 0403: "When command sequences are stored on the spacecraft and intended to be exercised only in the event of abnormal spacecraft activity, the consequences should be considered of their being issued during the system test or the pre-launch phases." [507]