This version of SWEHB is associated with NPR 7150.2B.
4.5.4 The project manager shall verify the implementation of each software requirement.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
A & B = Always Safety Critical; C & D = Not Safety Critical; CSC & DSC = Safety Critical; E - H = Never Safety Critical.
Requirements are the basis for a project. They identify the need to be addressed, the behavior of the system, and the constraints under which the problem is to be solved. They also specify the product to be delivered by a contracted provider of software.
To ensure that the resulting product addresses the need, provides the specified behavior, and performs within the stated constraints, the implementation (code) of those requirements needs to be verified against the requirements.
Verifying implementation refers to the activity of verifying that the software requirements, including interface requirements, have been implemented. Specific to a software project's implementation, verification involves confirming that the implementation (code) correctly, completely, consistently, and accurately includes each software requirement. Verification methods can include testing, analysis, demonstration, and inspection. Per NPR 7150.2, “testing verifies the code against the requirements and the design to ensure that the requirements are implemented. Testing also identifies problems and defects that are corrected and tracked to closure before product delivery. Testing also validates that the software operates appropriately in the intended environment.”
“Software item qualification testing is performed to demonstrate ... that software item requirements have been met. It covers software items requirements in the Software Requirements Specifications (SRSs) and in associated Interface Requirements Specifications (IRSs),” 478 including interface control documents (ICDs) and hardware interface requirements.
The software testing should be performed using target hardware that is as close as possible to the operational target hardware, in a configuration as close as possible to the operational configuration. The software test cases should cover:
- “Verification of all software requirements under conditions that are as close as possible to those that the software will encounter in the operational environment (e.g., operational data constraints, operational input and output data rates, operational scenarios, target hardware configurations);
- verification of all software interface requirements, using the actual interface wherever possible or high-fidelity simulations of the interface where not possible;
- verification of all software specialty engineering requirements (e.g., supportability, testability, dependability/reliability/maintainability/availability, safety, security, and human systems integration, as applicable), including in particular verification of software reliability requirements and fault detection, isolation, and recovery requirements;
- stress testing, including worst-case scenarios; and
- resource utilization measurements (e.g., CPU, memory, storage, bandwidth). All software requirements should be verified by software ... testing whether they are satisfied by COTS, reuse... or newly developed software.” 478
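As a concrete illustration of requirements-based testing, the sketch below tags each test case with the requirement it verifies and exercises the unit under both nominal and off-nominal conditions. The function under test, the decorator, and IDs such as "SRS-101" are invented for illustration; they are not drawn from any actual SRS or from NPR 7150.2.

```python
# Illustrative sketch: requirement-tagged test cases exercising a unit
# under nominal and off-nominal conditions. The requirement IDs below
# (e.g. "SRS-101") are hypothetical examples.

def limit_check(value, low, high):
    """Unit under test: clamp a sensor reading to its operational range."""
    if value < low:
        return low
    if value > high:
        return high
    return value

VERIFIES = {}  # requirement ID -> list of test functions

def verifies(req_id):
    """Decorator recording which requirement a test case verifies."""
    def wrap(fn):
        VERIFIES.setdefault(req_id, []).append(fn)
        return fn
    return wrap

@verifies("SRS-101")
def test_nominal_passthrough():
    # Nominal condition: in-range value passes through unchanged.
    assert limit_check(5.0, 0.0, 10.0) == 5.0

@verifies("SRS-102")
def test_off_nominal_clamping():
    # Off-nominal conditions: out-of-range values are clamped.
    assert limit_check(-3.0, 0.0, 10.0) == 0.0   # below range
    assert limit_check(99.0, 0.0, 10.0) == 10.0  # above range

def run_all():
    """Execute every registered test; report results per requirement."""
    results = {}
    for req_id, tests in VERIFIES.items():
        for t in tests:
            t()  # raises AssertionError on failure
        results[req_id] = "pass"
    return results

print(run_all())  # -> {'SRS-101': 'pass', 'SRS-102': 'pass'}
```

Recording the requirement ID with each test also gives the project a starting point for the traceability matrix discussed later in this guidance.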
The NASA Independent Verification and Validation Technical Framework (IVV 09-1) document 003 states that "It is important to recognize that requirements cannot be evaluated in isolation. Requirements must be evaluated as a set in order to determine that a particular goal or behavior is being met."
Confirmation of software requirements implementation needs to occur at various times in the project life cycle to ensure that any issues are found and corrected as early as possible:
- Peer reviews.
- Traceability analyses.
- Completion of units of code.
- Integration testing.
- System testing.
Verification of requirements implementation includes the following objectives:
- Ensure that the source code reliably performs capabilities stated in the requirements under nominal and off-nominal conditions, as applicable to the software classification (NASA IV&V Technical Framework 003, Revision M).
- Ensure that the source code provides the reliability and fault tolerance stated in the requirements, as applicable to the software classification (NASA IV&V Technical Framework 003, Revision M).
- Ensure that the source code satisfies functional, performance, and other requirements (Software Development Process Description Document 001, Revision R).
While the primary means of implementation verification is testing, the following analysis techniques are also useful (Software Development Process Description Document 001, Revision R). Note that these techniques are useful for detecting coding issues and may not necessarily be required for verifying that the software properly implements the requirements unless the requirements include statements related to the complexity, memory usage, coding standards, etc.
- McCabe complexity analysis (measures the number of linearly independent paths through a program's source code).
- Memory analysis.
- Static analysis.
- Code standards checking.
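To make the first of these techniques concrete, a rough cyclomatic-complexity estimate can be computed from a function's abstract syntax tree by counting decision points. This is a simplified sketch of the McCabe measure, not a substitute for a qualified analysis tool, and it deliberately handles only the most common branch constructs.

```python
import ast

# Simplified sketch of McCabe cyclomatic complexity: 1 plus the number
# of decision points in the source. Real tools count additional
# constructs (comprehension conditions, boolean operands, handlers).

DECISION_NODES = (ast.If, ast.For, ast.While, ast.Try,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(tree))
    return 1 + decisions

SAMPLE = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    else:
        return "positive"
"""

# if/elif/else contributes two decision points, so the score is 3;
# straight-line code with no branches would score 1.
print(cyclomatic_complexity(SAMPLE))  # -> 3
```

A project would normally apply such a measure through an established tool and compare results against a complexity threshold stated in its coding standard, where one exists.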
Requirements implementation verification activities need to be planned and documented (Software Test Plan), and verification techniques can be included in the bidirectional requirements traceability matrix (SWE-072) to ensure that all requirements are verified in the implementation. Results of this activity are to be documented (SWE-069) and evaluated (SWE-068), and any defects corrected.
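The bidirectional check described above can be sketched as a small script that flags requirements with no verifying activity and tests that trace to no known requirement. The requirement and test names below are invented for illustration.

```python
# Sketch of a bidirectional traceability check: every requirement
# should trace forward to at least one verification activity, and
# every test should trace back to a known requirement.
# All IDs and test names are illustrative.

requirements = {"SRS-101", "SRS-102", "SRS-103"}

# test name -> requirement IDs it claims to verify
trace = {
    "test_nominal_ops":    {"SRS-101"},
    "test_fault_recovery": {"SRS-102"},
    "test_legacy_mode":    {"SRS-999"},  # traces to an unknown requirement
}

def check_traceability(requirements, trace):
    verified = set().union(*trace.values()) if trace else set()
    # Forward gap: requirements no test claims to verify.
    unverified_reqs = sorted(requirements - verified)
    # Backward gap: tests whose claimed requirements are all unknown.
    orphan_tests = sorted(name for name, reqs in trace.items()
                          if not reqs & requirements)
    return unverified_reqs, orphan_tests

unverified, orphans = check_traceability(requirements, trace)
print("Requirements with no verification:", unverified)   # ['SRS-103']
print("Tests tracing to no known requirement:", orphans)  # ['test_legacy_mode']
```

Both kinds of gap need resolution before delivery: an unverified requirement means the product may not meet its specification, while an orphan test may indicate an undocumented requirement or a stale trace.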
NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to implementation verification.
Additional guidance related to software testing, including specifics of plan, procedure, and report contents, may be found in the related requirements in this handbook.
4. Small Projects
Small projects with few personnel could use a single document to describe the verification procedure as well as the test results rather than have the overhead of using separate documents.
Small projects with few requirements may combine verification planning, the traceability matrix, and verification results into one product, rather than separate documents. This document may look like a verification matrix that identifies the requirements, type of verification for each requirement, any required tools, personnel, and/or environment needed for the verification, and the final results. The key for this requirement is to ensure the final product meets the content of the stated software requirements.
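A combined product of this kind can be as simple as a single table. The sketch below, with invented requirement IDs, methods, and environments, holds planning, traceability, and results in one structure and prints a completion summary.

```python
# Sketch of a single combined verification matrix for a small project:
# one row per requirement, carrying the verification method, needed
# environment, and final result. All entries are illustrative.

matrix = [
    {"req": "SRS-001", "method": "test",          "env": "target hardware", "result": "pass"},
    {"req": "SRS-002", "method": "analysis",      "env": "desktop",         "result": "pass"},
    {"req": "SRS-003", "method": "demonstration", "env": "integration lab", "result": "open"},
]

def summarize(matrix):
    """Report how many requirements are verified and which remain open."""
    total = len(matrix)
    closed = sum(row["result"] == "pass" for row in matrix)
    open_reqs = [row["req"] for row in matrix if row["result"] != "pass"]
    return f"{closed}/{total} requirements verified; open: {open_reqs}"

print(summarize(matrix))  # -> 2/3 requirements verified; open: ['SRS-003']
```

Keeping everything in one artifact preserves the key point of this requirement: the project can still show, requirement by requirement, that the final product meets the stated software requirements.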
5. Tools
Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.
No tools have currently been identified for this SWE. If you wish to suggest a tool, please leave a comment below.
6. Lessons Learned
No Lessons Learned have currently been identified for this requirement.