- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
4.4.5 The project manager shall unit test the software code.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
2. Rationale
Unit testing is the process of testing the range of inputs to a unit to ensure that only the intended outputs are produced. By doing this at the lowest level, fewer issues will be discovered when the components are later integrated and tested as a whole. Therefore, during unit testing, it is important to check the maximum and minimum values, invalid values, empty and corrupt data, etc. for each input and output to ensure the unit properly handles the data (processes or rejects it).
Unit testing can be described as the confirmation that the unit performs the capability assigned to it, correctly interfaces with other units and data, and represents a faithful implementation of the unit design.
Ensuring that developers perform unit testing following written test plans helps build quality into the software from the beginning and allows bugs to be corrected early in the project life cycle when such corrections cost the least to the project.
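The boundary and invalid-value checks described above can be sketched as a minimal unit test. The following Python example is illustrative only; the `scale_reading` function and its 10-bit sensor range are invented for this sketch, not taken from the handbook:

```python
import unittest

SENSOR_MIN, SENSOR_MAX = 0, 1023  # hypothetical 10-bit sensor range

def scale_reading(raw):
    """Unit under test: convert a raw sensor count to a percentage,
    rejecting out-of-range or corrupt data."""
    if not isinstance(raw, int):
        raise TypeError("raw reading must be an integer")
    if raw < SENSOR_MIN or raw > SENSOR_MAX:
        raise ValueError("raw reading out of range")
    return 100.0 * raw / SENSOR_MAX

class TestScaleReading(unittest.TestCase):
    # Exercise the minimum, maximum, out-of-range, and corrupt inputs,
    # confirming the unit either processes or rejects each one.
    def test_minimum_value(self):
        self.assertEqual(scale_reading(SENSOR_MIN), 0.0)

    def test_maximum_value(self):
        self.assertEqual(scale_reading(SENSOR_MAX), 100.0)

    def test_out_of_range_rejected(self):
        with self.assertRaises(ValueError):
            scale_reading(SENSOR_MAX + 1)

    def test_corrupt_data_rejected(self):
        with self.assertRaises(TypeError):
            scale_reading(None)
```

Running such a suite (e.g., with `python -m unittest`) produces output that can be captured as objective evidence of the test pass.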
3. Guidance
The project manager shall assure that the unit test results are repeatable (SWE-186).
Per IEEE Std 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology, a "unit" is defined as:
(1) A separately testable element specified in the design of a computer software component.
(2) A logically separable part of a computer program.
(3) A software component that is not subdivided into other components.
Given the low-level nature of a unit of code, the person most able to fully test that unit is the developer who created it.
Projects ensure that the appropriate test environment, test materials, and personnel training (SWE-017) are in place and then conduct unit tests per the approved plans (5.10 - STP - Software Test Plan), according to the schedule (SWE-016), and with proper monitoring per the software assurance plan, making sure that:
- Criteria for a successful test are established before the test.
- The test environment represents the inputs, outputs, and stimuli the unit will experience in operation.
- Weaknesses or differences between the unit test environment and the actual target environment are captured.
- Following the approved plans for unit testing:
- Unit test results are captured.
- Issues are identified and documented (some minor issues, such as typos, as defined by the project, may simply be corrected without documentation).
- Unit test issues are corrected; these may include:
- Issues found in the code.
- Issues found in test instruments (e.g., scripts, data, procedures).
- Issues found in testing tools (e.g., setup, configuration).
- Unit test corrections are captured (for root cause analysis, as well as proof that the unit test plans were followed).
- Unit test results are evaluated by someone other than the tester to confirm the results, as applicable and practical; evaluation results captured.
- Unit test data, scripts, test cases, procedures, test drivers, and test stubs are captured for reference and any required regression testing.
- Notes recorded in software engineering notebooks or other documents are captured for reference.
- Objective evidence that unit tests were completed and unit test objectives were met is captured in the Software Development Folders (SDFs) or another appropriate project location as called out in the project documentation (e.g., Software Development Plan (SDP)/Software Management Plan (SMP), Configuration Management (CM) Plan).
- Unit test metrics are captured, as appropriate and defined for the project.
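The test drivers and stubs named in the list above can be illustrated with a short Python sketch. The `average_temperature` unit and the telemetry-bus interface here are hypothetical: the point is that a stub stands in for a dependency so the unit can be exercised in isolation, and the driver checks both the result and the unit's use of the interface.

```python
from unittest import mock

def average_temperature(bus, n):
    """Unit under test: average the last n temperature samples read
    from an injected telemetry-bus interface."""
    samples = [bus.read_channel("temp") for _ in range(n)]
    return sum(samples) / n

def test_average_temperature_with_stub():
    # Test driver: build a stub for the bus dependency so the unit
    # can be tested without the real hardware or subsystem in the loop.
    stub_bus = mock.Mock()
    stub_bus.read_channel.side_effect = [10.0, 20.0, 30.0]
    assert average_temperature(stub_bus, 3) == 20.0
    # The stub also records calls, letting the driver verify the
    # unit's interface usage, not just its output.
    stub_bus.read_channel.assert_called_with("temp")
```

Capturing the stub and driver alongside the results supports the regression testing called out above.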
Per NASA-GB-8719.13, NASA Software Safety Guidebook (SWEREF-276), software assurance is to "Verify unit testing and data verification is completed before the unit is integrated." Either software assurance or Independent Verification and Validation (IV&V) personnel "Verify unit tests adequately test the software and are performed." When less formal confirmation of unit testing is needed, a software team lead or other designated project member may verify the completeness and correctness of the testing by comparing the results to the test plan to ensure that all logic paths have been tested and verifying the test results are accurate.
Unit testing tools and some integrated development environments (IDEs) can auto-generate unit tests from the code. These tools provide a quick way to generate unit tests but may not completely exercise the unit of code. Rerun the unit tests each time the unit is updated to ensure the code continues to work as expected. When continuous integration is part of the life cycle, all of the unit tests are rerun each time the code is updated so that only working code is integrated.
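Rerunning tests yields comparable evidence only if the tests are repeatable (SWE-186), which usually means pinning any source of nondeterminism. A minimal sketch, assuming a unit that consumes random inputs (the `jitter` function and injected generator are illustrative):

```python
import random

def jitter(value, rng):
    """Unit under test: apply bounded random jitter. The random
    generator is injected so tests can control the random source."""
    return value + rng.uniform(-0.5, 0.5)

def test_jitter_is_repeatable():
    # Seeding the injected generator makes the result reproducible,
    # so reruns after each code change produce comparable evidence.
    first = jitter(10.0, random.Random(42))
    second = jitter(10.0, random.Random(42))
    assert first == second
    # The jitter also stays within its specified bound.
    assert 9.5 <= first <= 10.5
```

The same seeding discipline applies to time, thread ordering, and external data: fix or inject them so each rerun exercises the same paths.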
Documented test results, results evaluations, issues, problem reports, corrections, and tester notes can all serve as evidence that unit tests were completed. Comparing those documents to the software test plans for unit testing can ensure the tests were completed following those documented procedures.
Make sure evidence of all test passes is captured.
NASA-GB-8719.13, NASA Software Safety Guidebook, further states in the section on safety-critical unit test plans that "documentation is required to prove adequate safety testing of the software." Therefore, unit test results can play an important role in supporting reviews of safety-critical software.
Consult Center PALs for Center-specific guidance and resources related to unit testing.
Additional guidance related to unit testing may be found in the following related requirements in this handbook:
4. Small Projects
Projects with limited budgets and personnel may choose to perform unit testing or capture unit test results and artifacts in a less formal manner than projects with greater resources. Regardless of the formality of the procedures used, the software test plans for unit testing need to describe the test environment/setup, the results captured, simple documentation procedures, and compliance checks against the procedures. Some Centers have tailored lean unit test procedures and support tools specifically for small projects.
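One lightweight option for such projects is Python's built-in doctest, where the expected results live in the code's docstring and a verbose run doubles as a simple test record. A sketch under that assumption (the `checksum` function is invented for this example):

```python
def checksum(data):
    """Sum the bytes of `data` modulo 256.

    The examples below are executable unit tests covering the empty,
    wrap-around, and nominal cases:

    >>> checksum(b"")
    0
    >>> checksum(bytes([255, 1]))
    0
    >>> checksum(bytes([1, 2, 3]))
    6
    """
    return sum(data) % 256

if __name__ == "__main__":
    import doctest
    doctest.testmod(verbose=True)  # verbose output can be captured as evidence
```

Because the expected values sit next to the code, the test plan, results, and documentation stay in one artifact, which keeps the overhead low while still leaving a compliance trail.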
5. Resources
5.1 References
- (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
- (SWEREF-013) "Code and Unit Test," HOU-EGP-310, Boeing, 2002. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
- (SWEREF-031) SEL-84-101, Revision 1, Software Engineering Laboratory Series, NASA Goddard Space Flight Center, 1990.
- (SWEREF-047) SEL-81-305, Revision 3, Software Engineering Laboratory Series, NASA Goddard Space Flight Center, 1992.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-220) NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-222) IEEE STD 610.12-1990, 1990. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-271) NASA-STD-8719.13 (Rev C), Document Date: 2013-05-07
- (SWEREF-276) NASA-GB-8719.13, NASA, 2004. Access NASA-GB-8719.13 directly: https://swehb-pri.msfc.nasa.gov/download/attachments/16450020/nasa-gb-871913.pdf?api=v2
- (SWEREF-452) SED Unit Test Guideline, 580-GL-062-02, Systems Engineering Division, NASA Goddard Space Flight Center (GSFC), 2012. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook. Replaces SWEREF-081
- (SWEREF-530) Public Lessons Learned Entry: 939.
- (SWEREF-533) Public Lessons Learned Entry: 1023.
5.2 Tools
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to unit testing:
- MPL Uplink Loss Timer Software/Test Errors (1998) (Plan to test against a full range of parameters.) Lesson Number 0939 (SWEREF-530): Lesson Learned No. 2 states: "Unit and integration testing should, at a minimum, test against the full operational range of parameters. When changes are made to database parameters that affect logic decisions, the logic should be re-tested."
- Computer Software/Configuration Control/Verification and Validation (V&V) (Unit level V&V needed for auto-code and auto-code generators.) Lesson Number 1023 (SWEREF-533): "The use of the Matrix X auto code generator for ISS software can lead to serious problems if the generated code and Matrix X itself are not subjected to effective configuration control or the products are not subjected to unit-level V&V. These problems can be exacerbated if the code generated by Matrix X is modified by hand."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
Confirm that the project successfully executes the required unit tests, particularly those testing safety-critical functions.
Confirm that the project addresses or otherwise tracks to closure any errors, defects, or problem reports found during unit tests.
7.2 Software Assurance Products
- None at this time.
Objective Evidence
- Unit test results.
- Software problem or defect report findings related to issues identified in unit testing.
7.3 Metrics
- # of planned unit test cases vs. # of actual unit test cases completed
- # of tests successfully completed vs. total # of tests
- # of tests executed vs. # of tests successfully completed
- # of software work product Non-Conformances identified by life-cycle phase over time
- # of Non-Conformances identified during each testing phase (Open, Closed, Severity)
- # of Requirements tested successfully vs. total # of Requirements
- Total # of Non-Conformances over time (Open, Closed, # of days Open, and Severity of Open)
- # of Non-Conformances in the current reporting period (Open, Closed, Severity)
- # of safety-related non-conformances identified by life-cycle phase over time
- # of Closed action items vs. # of Open action items
- # of Safety-Critical tests executed vs. # of Safety-Critical tests witnessed by SA
- # of detailed software requirements tested to date vs. total # of detailed software requirements
- # of safety-critical requirement verifications vs. total # of safety-critical requirement verifications completed
- # of Open issues vs. # of Closed over time
- # of Hazards containing software that has been successfully tested vs. total # of Hazards containing software
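Several of the metrics above reduce to simple ratios over the unit test records. A hedged sketch of how a project might compute two of them (the record format here is invented, not a prescribed format):

```python
def unit_test_metrics(records):
    """Compute planned-vs-completed and pass-rate metrics.

    `records` is a list of dicts, one per planned unit test case, e.g.
    {"name": "tc-01", "executed": True, "passed": True}.
    """
    planned = len(records)
    executed = sum(1 for r in records if r["executed"])
    passed = sum(1 for r in records if r["executed"] and r["passed"])
    return {
        # planned unit test cases vs. actual unit test cases completed
        "planned_vs_completed": f"{executed}/{planned}",
        # tests successfully completed vs. tests executed
        "pass_rate": f"{passed}/{executed}" if executed else "0/0",
    }
```

Tracking these ratios over reporting periods gives software assurance the trend data the metrics list calls for.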
7.4 Guidance
Software assurance will confirm that a thorough set of unit tests is included as part of the software test plan and that the developers are executing those tests as planned. It is particularly important to include any code performing a safety-critical function in the unit tests, since that is often the only place those functions can be tested well. When confirming that the tests are being run, be aware that the unit tests must also be repeatable, per SWE-186. See the SA guidance on SWE-186 for the information needed to make the tests repeatable. The software guidance provides the following list of activities that should occur during unit testing:
- Criteria for a successful test are established before the test.
- The test environment represents the inputs, outputs, and stimuli the unit will experience in operation.
- Weaknesses or differences between the unit test environment and the actual target environment are captured.
- Following the approved plans for unit testing:
- Unit test results are captured.
- Issues are identified and documented (some minor issues, such as typos, as defined by the project, may simply be corrected without documentation).
- Unit test issues are corrected; these may include:
- Issues found in the code.
- Issues found in test instruments (e.g., scripts, data, procedures).
- Issues found in testing tools (e.g., setup, configuration).
- Unit test corrections are captured (for root cause analysis, as well as proof that the unit test plans were followed).
- Unit test results are evaluated by someone other than the tester to confirm the results, as applicable and practical; evaluation results captured.
- Unit test data, scripts, test cases, procedures, test drivers, and test stubs are captured for reference and any required regression testing.
- Notes recorded in software engineering notebooks or other documents are captured for reference.
- Objective evidence that unit tests were completed and unit test objectives were met is captured in the Software Development Folders (SDFs) or another appropriate project location as called out in the project documentation (e.g., Software Development Plan (SDP)/Software Management Plan (SMP), Configuration Management (CM) Plan).
- Unit test metrics are captured, as appropriate and defined for the project.
Finally, software assurance will confirm that any errors, defects, etc., found during the unit tests are addressed or tracked to closure. Unit tests should be rerun to verify that the changes made to the software fixed the problem. Any unit tests for safety functions may need to be rerun to make sure those functions have not been affected.