
This version of the SWEHB is associated with NPR 7150.2B; a later version based on NPR 7150.2D is available.


STR - Software Test Report

1. Minimum Recommended Content


a. Overview of the test results:

  1. Overall evaluation of the software as shown by the test results.
  2. Remaining deficiencies, limitations, or constraints detected by testing (e.g., including description of the impact on software and system performance, the impact a correction would have on software and system design, and recommendations for correcting the deficiency, limitation, or constraint).
  3. Impact of test environment.

b. Detailed test results:

  1. Project-unique identifier of a test and test procedure(s).
  2. Summary of test results (e.g., including requirements verified).
  3. Problems encountered.
  4. Deviations from test cases/procedures.

c. Test log:

  1. Date(s), time(s), and location(s) of tests performed.
  2. Test environment, hardware, and software configurations used for each test.
  3. Date and time of each test-related activity, the identity of the individual(s) who performed the activity, and the identities of witnesses, as applicable.             

d. Rationale for decisions.
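
These content items can be captured in whatever format the project's documentation procedures call for. Purely as an illustration, the sketch below models the minimum recommended content as simple Python data structures; the class and field names are assumptions chosen to mirror items a through d, not something prescribed by NPR 7150.2B or any Center standard.

from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: names are assumptions that mirror content items
# a through d above; they are not prescribed by NPR 7150.2B or any standard.

@dataclass
class TestLogEntry:
    """One test log entry (content item c)."""
    date_time: str                 # date and time of the test-related activity
    location: str                  # where the test was performed
    environment: str               # hardware/software configuration used
    performed_by: str              # individual(s) who performed the activity
    witnesses: List[str] = field(default_factory=list)

@dataclass
class DetailedTestResult:
    """Detailed results for one test (content item b)."""
    test_identifier: str                         # project-unique identifier
    summary: str                                 # summary of test results
    requirements_verified: List[str] = field(default_factory=list)
    problems_encountered: List[str] = field(default_factory=list)
    deviations: List[str] = field(default_factory=list)

@dataclass
class SoftwareTestReport:
    """Top-level report structure (content items a through d)."""
    overall_evaluation: str                      # a.1
    test_environment_impact: str                 # a.3
    remaining_deficiencies: List[str] = field(default_factory=list)           # a.2
    detailed_results: List[DetailedTestResult] = field(default_factory=list)  # b
    test_log: List[TestLogEntry] = field(default_factory=list)                # c
    decision_rationale: List[str] = field(default_factory=list)               # d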


2. Rationale

When testing software, it is important to capture the outcome of tests used to verify requirements, functionality, safety, and other aspects of the software. It is also important to capture any decisions based on the outcome of those tests. Test reports capture that information and more for purposes including but not limited to:

  • Documenting what was done and how the results match or differ from the expected results.
  • Identification and isolation of the source of any error found in the software.
  • Verification that testing was completed as planned.
  • Verification that safety-critical elements were properly tested.
  • Verification that all identified hazards have been eliminated or controlled to an acceptable level of risk.
  • Reporting safety-critical findings that should be used to update hazard reports.
  • Generating data for evaluating the quality of tested products and the effectiveness of testing processes.

3. Guidance

As noted above, software test reports document an assessment of the software based on the test results, the detailed test results, a log of the testing activities, and the rationale for any decisions made during the testing process.

The software test reports may be tailored by software classification. Goddard Space Flight Center's (GSFC) 580-STD-077-01, Requirements for Minimum Contents of Software Documents, provides one suggestion for tailoring software test reports based on the required contents and the classification of the software being tested.

Test reports are to be written following each type of testing activity, such as a pass of unit or integration tests. Note that NASA-GB-8719.13, NASA Software Safety Guidebook, recommends that formal unit test reports be written for safety-critical unit tests, while reports for other unit tests may be as simple as notes in a logbook. Also note, however, that project planning will include a determination of the degree of formality for the various types of testing, including unit and integration testing, and that formality needs to be documented in the project's software development plan (SDP).

Depending on a project's defined procedures, test reports can be of multiple types:

Preliminary Test Reports

  • Prepared at the end of each test session.
  • Prepared by test team on basis of a "quick-look" evaluation of the executions.
  • Provide rapid assessment of how software is working.
  • Provide early indications of any major problems.

Detailed Test Reports

  • Prepared within a week of test execution.
  • Prepared by test team by analyzing results obtained during test sessions, using hand calculations and detailed comparisons with expected results.
  • Describe problems but do not identify their sources in the code.

To identify and properly configuration-manage the test environment used for testing, include the following information in the test report (an illustrative sketch of such a record follows this list):

  • Version numbers of all software elements used to support testing, such as simulators and monitoring tools.
  • Inventory numbers and calibration dates for tools such as logic analyzers, oscilloscopes, multimeters.
  • Version numbers for all FPGA (Field Programmable Gate Array) components present on the testbed.
  • Serial numbers of the testbed elements.
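
One way to keep this configuration information complete and consistent from test to test is to record it in a simple, machine-readable structure stored with the report under configuration management. The sketch below is an illustration only; the key names and sample values are hypothetical assumptions that mirror the bullets above, not a prescribed format.

# Illustrative, hypothetical record of a test environment; key names and
# values are assumptions based on the bullets above, not a required format.
test_environment = {
    "support_software": [                        # simulators, monitoring tools, etc.
        {"name": "orbit_simulator", "version": "3.2.1"},        # hypothetical tool
        {"name": "telemetry_monitor", "version": "1.8.0"},      # hypothetical tool
    ],
    "test_equipment": [                          # logic analyzers, oscilloscopes, multimeters
        {"item": "logic analyzer", "inventory_no": "INV-0421",  # hypothetical values
         "calibration_date": "2015-06-30"},
    ],
    "fpga_versions": {"cdh_board_fpga": "r7"},   # hypothetical FPGA version
    "testbed_serial_numbers": ["TB-001"],        # hypothetical serial number
}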

Test reports are to provide evidence of the thoroughness of the testing, including:

  • Differences between the test environment and the operational environment and any effects those differences had on the test results.
  • Any test anomalies and the disposition of any related corrective actions or problem reports.
  • Details of the test results (see the minimum recommended content above), including test case identifications, test version, completion status, etc., along with the associated item tested as required by the project's software test plan.
  • Location of original test results (output from tests, screen shots, error messages, etc., as captured during the actual testing activity).

Issues that may arise related to software test reports include:

  • If acquired software does not include complete test reports (i.e., reports containing the minimum recommended content), it is prudent for the acquirer to conduct additional testing upon delivery of the software from the provider.
  • Test reports may not be available for acquired off-the-shelf (OTS) products. SWE-027 requires the project to ensure OTS software is verified and validated to the same level of confidence as a developed software component. The lack of existing OTS test reports does not relieve the project of the need to perform subsequent tests and produce test reports to determine fitness for use.

Useful processes or recommended practices for software test reports include:

  • Review and signoff by Software Assurance and Software Safety for software safety verifications; may include NASA Independent Verification and Validation (IV&V) if those services are being used for the project.
  • Maintenance under configuration management.
  • If software is acquired, the provider needs to deliver software test reports containing the required information.
  • Use of templates or checklists to ensure accurate capture and recording of detailed information such as the test logs and deviations from planned test cases/test procedures.
  • Use of tables to summarize test results, including requirements tested, pass/fail results, identification of tests performed, etc. (see the sketch following this list).
  • Referencing of problem reports/change requests when documenting details of remaining deficiencies, limitations, constraints, etc., rather than duplication of the information contained in those reports/requests.
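
As a hedged example of the table-based summary suggested above, the following sketch prints a plain-text summary of test results; the column names, identifiers, and sample entries are assumptions, not a required layout.

# Illustrative sketch only: builds a plain-text summary table of test results;
# the column names and sample data below are hypothetical.
results = [
    # (test identifier, requirement verified, pass/fail, problem report)
    ("STR-TC-001", "SRS-101", "Pass", ""),
    ("STR-TC-002", "SRS-102", "Fail", "PR-0042"),   # hypothetical problem report
]

header = ("Test ID", "Requirement", "Result", "Problem Report")
rows = [header] + [tuple(r) for r in results]
widths = [max(len(row[i]) for row in rows) for i in range(len(header))]

for row in rows:
    print("  ".join(cell.ljust(widths[i]) for i, cell in enumerate(row)))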

Additional guidance related to software testing may be found in the following requirements in this Handbook:

  • SWE-027 - Use of Commercial, Government, and Legacy Software
  • SWE-065 - Test Plans, Procedures, Reports
  • SWE-066 - Perform Testing
  • SWE-068 - Evaluate Test Results
  • SWE-069 - Document Defects and Track

4. Small Projects

For projects with small budgets or small team size, the following approaches may be helpful in reducing the time and cost of preparing software test reports:

  • Automate collection of detailed report data and/or test logs for the test report, particularly if the automated tools already exist at the Center (see the sketch following this list).
  • Use existing templates rather than create new ones.
  • Use less formal reporting for unit tests, if Center procedures allow.
  • Tailor the contents of the software test reports, where allowed and appropriate.
  • Reference existing content rather than duplicate it in the test report. 
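
For the automation bullet above, a minimal sketch of automated test log collection might look like the following; the file name, fields, and CSV format are assumptions, and any Center- or project-specific tooling would take precedence.

# Illustrative sketch of automating test log collection; file name, fields,
# and format are assumptions, not a prescribed NASA format.
import csv
import getpass
import platform
from datetime import datetime, timezone

LOG_FILE = "test_log.csv"   # hypothetical location, kept under configuration management

def log_test_activity(test_id: str, activity: str, witnesses: str = "") -> None:
    """Append one timestamped entry to the test log."""
    entry = [
        datetime.now(timezone.utc).isoformat(timespec="seconds"),  # date and time
        test_id,                                                   # test identifier
        activity,                                                  # activity performed
        getpass.getuser(),                                         # who performed it
        witnesses,                                                 # witnesses, if any
        platform.node(),                                           # test machine used
    ]
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow(entry)

# Example use during a test session (hypothetical test identifier):
# log_test_activity("STR-TC-001", "executed nominal command sequence")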

5. Resources

5.1 Tools

Tools relevant to this Topic may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

No tools have currently been identified for this Topic. If you wish to suggest a tool, please contact the Page Editor listed at the bottom of the page.

6. Lessons Learned

No Lessons Learned have currently been identified.
