3. Guidance

As noted above, software test reports document an assessment of the software based on the test results, the detailed test results, a log of the testing activities, and the rationale for any decisions made during the testing process.

The software test reports may be tailored by software classification. Goddard Space Flight Center's (GSFC) 580-STD-077-01, Requirements for Minimum Contents of Software Documents, provides one suggestion for tailoring software test reports based on the required contents and the classification of the software being tested.

Test reports are to be written following each type of testing activity, such as a pass of unit or integration tests. Note that NASA-GB-8719.13, NASA Software Safety Guidebook, recommends that formal unit test reports be written for safety-critical unit tests, while reports for other unit tests may be as simple as notes in a log book. Also note, however, that project planning will include a determination of the degree of formality for the various types of testing, including unit and integration testing, and that formality needs to be documented in the project's software development plan (SDP).

Depending on a project's defined procedures, test reports can be of multiple types:

| Preliminary Test Reports | Detailed Test Reports |
| --- | --- |
| Prepared at the end of each test session. | Prepared within a week of test execution. |
| Provide rapid assessment of how the software is working. | Describe problems but do not identify their sources in the code. |
| Provide early indications of any major problems. | Prepared by the test team by analyzing results obtained during test sessions, using hand calculations and detailed comparisons with expected results. |
| Prepared by the test team on the basis of a "quick-look" evaluation of the executions. | |
To identify and maintain configuration management of the test environment used for testing, include the following information in the test report (a minimal sketch of one way to capture this information appears after this list):
- Version numbers of all software elements used to support testing, such as simulators and monitoring tools.
- Inventory numbers and calibration dates for tools such as logic analyzers, oscilloscopes, and multimeters.
- Version numbers for all FPGA (Field Programmable Gate Array) components present on the testbed.
- Serial numbers of the testbed elements.
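As a minimal sketch of one way to capture this environment information (assuming a Python-based reporting workflow; all class names, field names, and values below are hypothetical, not a prescribed format), the testbed configuration could be recorded as a structured record so it can be reproduced consistently for each test session:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class InstrumentRecord:
    """Hypothetical record for a piece of test equipment."""
    name: str               # e.g., "logic analyzer"
    inventory_number: str
    calibration_date: str   # date of most recent calibration

@dataclass
class TestEnvironment:
    """Hypothetical structure describing the configuration of the testbed."""
    support_software: Dict[str, str]                      # tool name -> version (simulators, monitors)
    instruments: List[InstrumentRecord] = field(default_factory=list)
    fpga_versions: Dict[str, str] = field(default_factory=dict)  # FPGA component -> version
    testbed_serial_numbers: List[str] = field(default_factory=list)

# Example usage with placeholder values.
env = TestEnvironment(
    support_software={"orbit_simulator": "2.3.1", "telemetry_monitor": "1.8.0"},
    instruments=[InstrumentRecord("logic analyzer", "INV-0042", "2024-01-15")],
    fpga_versions={"payload_interface_fpga": "r7"},
    testbed_serial_numbers=["TB-001"],
)
```

A structured record like this can then be rendered into the environment section of each test report, rather than re-entering the details by hand.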
Test reports are to provide evidence of the thoroughness of the testing, including:
- Differences between the test environment and the operational environment and any effects those differences had on the test results.
- Any test anomalies and the disposition of any related corrective actions or problem reports.
- Details of the test results (see the requirement text), including test case identifiers, the version tested, completion status, etc., along with the associated item tested, as required by the project's software test plan (one way to record these details is sketched after this list).
- Location of original test results (output from tests, screen shots, error messages, etc., as captured during the actual testing activity).
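One possible way to structure these per-test-case details (a sketch only, again assuming a Python-based workflow; the field names are hypothetical and would be adapted to the project's software test plan):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCaseResult:
    """Hypothetical per-test-case record for a software test report."""
    test_case_id: str                 # identifier from the software test plan
    software_version: str             # version of the item under test
    item_tested: str                  # requirement or component exercised
    status: str                       # e.g., "pass", "fail", "blocked"
    anomalies: List[str] = field(default_factory=list)   # related problem report IDs
    corrective_action: Optional[str] = None               # disposition of related corrective actions
    evidence_location: str = ""       # path/URL to raw outputs, screenshots, error messages

# Example usage with placeholder values.
results = [
    TestCaseResult(
        test_case_id="TC-101",
        software_version="1.4.2",
        item_tested="SRS-REQ-045",
        status="pass",
        evidence_location="//testlab/archive/run_2024_06_01/TC-101/",
    ),
]
```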
Issues that may arise related to software test reports include:
- If acquired software does not include complete test reports (i.e., reports that meet the required content), it is prudent for the acquirer to conduct additional testing upon delivery of the software from the provider.
- Test reports may not be available for acquired off-the-shelf (OTS) products. SWE-027 requires the project to ensure that OTS software is verified and validated to the same level of confidence as a developed software component. Lack of existing OTS test reports does not relieve the project from performing subsequent tests and producing test reports to determine fitness for use.
Useful processes or recommended practices for software test reports include:
- Review and signoff by Software Assurance and Software Safety for software safety verifications; this may also include NASA Independent Verification and Validation (IV&V) if those services are being used on the project.
- Maintenance under configuration management.
- If software is acquired, the provider needs to deliver software test reports containing the required information.
- Use of templates or checklists to ensure accurate capture and recording of detailed information such as the test logs and deviations from planned test cases/test procedures.
- Use of tables to summarize test results, including requirements tested, pass/fail results, identification of tests performed, etc. (a short sketch of this practice follows this list).
- Referencing problem reports/change requests when documenting details of remaining deficiencies, limitations, constraints, etc., rather than duplicating the information contained in those reports/requests.
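As a sketch of the summary-table practice above (assuming test results have already been collected into simple records such as those shown earlier; the field names and values here are hypothetical), a short script could roll results up into a pass/fail summary per requirement:

```python
from collections import Counter

# Hypothetical flattened results; in practice these would come from the
# project's test logs or structured test case records.
results = [
    {"test_case": "TC-101", "requirement": "SRS-REQ-045", "status": "pass"},
    {"test_case": "TC-102", "requirement": "SRS-REQ-045", "status": "fail"},
    {"test_case": "TC-103", "requirement": "SRS-REQ-046", "status": "pass"},
]

def summarize(results):
    """Count pass/fail outcomes per requirement for a report summary table."""
    summary = {}
    for r in results:
        summary.setdefault(r["requirement"], Counter())[r["status"]] += 1
    return summary

# Print a simple fixed-width summary table for inclusion in the report.
print(f"{'Requirement':<14}{'Pass':>6}{'Fail':>6}")
for requirement, counts in sorted(summarize(results).items()):
    print(f"{requirement:<14}{counts['pass']:>6}{counts['fail']:>6}")
```

Generating the summary from the recorded results, rather than maintaining it by hand, helps keep the report consistent with the captured test logs.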
Additional guidance related to software testing may be found in the following requirements in this Handbook:

| Requirement | Title |
| --- | --- |
| SWE-027 | Use of Commercial, Government, and Legacy Software |
| SWE-065 | Test Plans, Procedures, Reports |
| SWE-066 | Perform Testing |
| SWE-068 | Evaluate Test Results |