This version of SWEHB is associated with NPR 7150.2B.

SWE-067 - Verify Implementation

1. Requirements

4.5.4 The project manager shall verify the implementation of each software requirement.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Key: ✓ = Applicable | ✗ = Not Applicable
A & B = Always Safety-Critical; C & D = Not Safety-Critical; CSC & DSC = Safety-Critical; E-H = Never Safety-Critical.

2. Rationale

Requirements are the basis for a project. They identify the need to be addressed, the behavior of the system, and the constraints under which the problem is to be solved. They also specify the product to be delivered by a contracted provider of software.

To ensure that the resulting product addresses the need, provides the specified behavior, and performs within the stated constraints, the implementation (code) of those requirements needs to be verified against the requirements.

3. Guidance

Verifying implementation means confirming that the software requirements, including interface requirements, have been implemented. For a software project, this verification involves confirming that the implementation (code) correctly, completely, consistently, and accurately includes each software requirement. Verification methods include testing, analysis, demonstration, and inspection. Per NPR 7150.2, “testing verifies the code against the requirements and the design to ensure that the requirements are implemented. Testing also identifies problems and defects that are corrected and tracked to closure before product delivery. Testing also validates that the software operates appropriately in the intended environment.”

“Software item qualification testing is performed to demonstrate ... that software item requirements have been met. It covers software item requirements in the Software Requirements Specifications (SRSs) and in associated Interface Requirements Specifications (IRSs),” 478 including interface control documents (ICDs) and hardware interface requirements.

The software testing should be performed on the target hardware, with the hardware and its configuration as close as possible to the operational target hardware and operational configuration. The software test cases should cover:

  • “verification of all software requirements under conditions that are as close as possible to those that the software will encounter in the operational environment (e.g., operational data constraints, operational input and output data rates, operational scenarios, target hardware configurations);
  • verification of all software interface requirements, using the actual interface wherever possible or high-fidelity simulations of the interface where not possible;
  • verification of all software specialty engineering requirements (e.g., supportability, testability, dependability/reliability/maintainability/availability, safety, security, and human systems integration, as applicable), including in particular verification of software reliability requirements and fault detection, isolation, and recovery requirements;
  • stress testing, including worst-case scenarios; and
  • resource utilization measurements (e.g., CPU, memory, storage, bandwidth).

All software requirements should be verified by software ... testing whether they are satisfied by COTS, reuse... or newly developed software.” 478

The NASA Independent Verification and Validation Technical Framework (IVV 09-1) document 003 states that "It is important to recognize that requirements cannot be evaluated in isolation. Requirements must be evaluated as a set in order to determine that a particular goal or behavior is being met."

Confirmation of software requirements implementation needs to occur at various times in the project life cycle to ensure that any issues are found and corrected as early as possible:

  • Peer reviews.
  • Traceability analyses.
  • Completion of units of code.
  • Integration testing.
  • System testing. 
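The traceability analyses named above can be partially automated. The sketch below is illustrative only (the requirement IDs, test IDs, and data structures are assumptions, not mandated by NPR 7150.2); it flags requirements that do not yet trace to at least one passing verification activity:

```python
# Hypothetical traceability check: flag requirements that lack a passing
# verification activity. All IDs and structures are illustrative.

trace_matrix = {
    "SRS-001": ["TC-010", "TC-011"],  # requirement -> verifying test cases
    "SRS-002": ["TC-020"],
    "SRS-003": [],                    # no verification activity yet
}
test_results = {"TC-010": "pass", "TC-011": "pass", "TC-020": "fail"}

def unverified(matrix, results):
    """Return requirement IDs with no passing verification evidence."""
    return sorted(
        req for req, tests in matrix.items()
        if not any(results.get(t) == "pass" for t in tests)
    )

print(unverified(trace_matrix, test_results))  # ['SRS-002', 'SRS-003']
```

A check like this can be rerun at each of the life-cycle points listed above so that coverage gaps surface early rather than at system test.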

Verification of requirements implementation includes the following objectives:

  • Ensure that the source code reliably performs capabilities stated in the requirements under nominal and off-nominal conditions, as applicable to the software classification (NASA IV&V Technical Framework 003, Revision M).
  • Ensure that the source code provides the reliability and fault tolerance stated in the requirements, as applicable to the software classification (NASA IV&V Technical Framework 003, Revision M).
  • Ensure that the source code satisfies functional, performance, and other requirements (Software Development Process Description Document 001, Revision R).

While the primary means of implementation verification is testing, the following analysis techniques are also useful (Software Development Process Description Document 001, Revision R). Note that these techniques detect coding issues and are not necessarily required for verifying that the software implements the requirements, unless the requirements themselves address complexity, memory usage, coding standards, or similar attributes.

  • McCabe complexity analysis (measures the number of linearly independent paths through a program's source code).
  • Memory analysis.
  • Static analysis.
  • Code standards checking.
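As an illustration of the first technique, cyclomatic complexity can be approximated as one plus the number of decision points in a parsed program. The Python sketch below is a simplified approximation (dedicated analyzers count more node types and handle more cases); the sample function is invented for demonstration:

```python
# Rough McCabe cyclomatic complexity: 1 + number of decision points.
# Only common branching constructs are counted; real tools are more thorough.
import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    """Approximate the cyclomatic complexity of a Python source string."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            print(i)
    return "done"
"""
# Two ifs, one for, one boolean operator -> 1 + 4
print(cyclomatic_complexity(sample))  # 5
```

A project that sets a complexity threshold in its coding standard can run such a measurement over each unit and flag functions that exceed the limit for rework or additional test coverage.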

Requirements implementation verification activities need to be planned and documented (e.g., in the Software Test Plan), and the verification method for each requirement can be recorded in the bidirectional requirements traceability matrix (SWE-072) to ensure that all requirements are verified in the implementation. Results of this activity are to be documented (SWE-069), evaluated (SWE-068), and any defects corrected.

NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to implementation verification.

Additional guidance related to software testing, including specifics of plan, procedure, and report contents, may be found in the related software testing requirements in this handbook.

4. Small Projects

Small projects with few personnel could use a single document to describe both the verification procedure and the test results, rather than incur the overhead of maintaining separate documents.

Small projects with few requirements may combine verification planning, the traceability matrix, and verification results into one product, rather than separate documents. This document may look like a verification matrix that identifies the requirements, type of verification for each requirement, any required tools, personnel, and/or environment needed for the verification, and the final results. The key for this requirement is to ensure the final product meets the content of the stated software requirements.
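One possible shape for such a combined product is sketched below. The field names, requirement IDs, and personnel are illustrative assumptions, not content prescribed by this requirement:

```python
# Hypothetical combined verification matrix for a small project: planning,
# traceability, and results kept in one record per requirement.
import csv
import io

rows = [
    # requirement, method,       environment,       personnel, result
    ("SRS-001",    "Test",       "target hardware", "J. Doe",  "Pass"),
    ("SRS-002",    "Inspection", "desk review",     "A. Lee",  "Pass"),
    ("SRS-003",    "Analysis",   "simulation",      "J. Doe",  "Open"),
]

# Emit the matrix as CSV so it can live alongside other project records.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Requirement", "Method", "Environment", "Personnel", "Result"])
writer.writerows(rows)

# The same records also answer the key question for this requirement:
# which requirements are not yet shown to be met by the final product?
open_items = [r[0] for r in rows if r[4] != "Pass"]
print(open_items)  # ['SRS-003']
```

Keeping the plan, trace, and result in one row per requirement makes it straightforward to confirm that the final product satisfies every stated software requirement before delivery.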

5. Resources

  • (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. See Chapters 16 and 17. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

  • (SWEREF-003) IVV 09-1, Revision P, NASA Independent Verification and Validation Program, Effective Date: February 26, 2016.

  • (SWEREF-271) NASA-STD-8719.13 (Rev C), Document Date: 2013-05-07.

  • (SWEREF-478) Aerospace Report No. TOR-2004(3909)-3537, Revision B, March 11, 2005.

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users can find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list,” nor does it represent an endorsement of any particular tool. Its purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

No Lessons Learned have currently been identified for this requirement.
