3.10.2 The project manager shall plan software verification activities, methods, environments, and criteria for the project.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Key: A & B = Always Safety Critical; C & D = Not Safety Critical; CSC & DSC = Safety Critical; E–H = Never Safety Critical.
Software verification confirms that work products, not just the final software product, properly reflect the requirements specified for them. In other words, verification includes multiple processes and techniques, including but not limited to testing, that assess products generated throughout the project life cycle to ensure that "you built it right."
Verification requires the developer to conduct ongoing evaluations of the software development processes, software products, work products, and software services to ensure:
- Adherence of the designated processes, as performed, to the applicable process descriptions, standards, and procedures in accordance with the contract and planning documentation.
- Adherence of the designated work products and services to the applicable process descriptions, standards, and procedures in accordance with the contract and planning documentation.
- That each software product required by the project plans, standards, contract provisions, and Agency or Center requirements exists and has undergone software product evaluations and peer reviews, testing (for those products where testing is applicable), and corrective action (for all identified problems), as required.
- That all requirements are properly evaluated and matched to an appropriate method of verification.
The software verification process and the software validation process (see SWE-029) are both interrelated and complementary. Each process uses the other's process results to establish completion criteria for each software life-cycle activity.
The purpose of these processes is to help the development organization build quality into the software during the software life cycle. The processes provide an objective assessment of software products and processes throughout the software life cycle. This assessment demonstrates whether the software requirements and system requirements (i.e., those allocated to software) are correct, complete, accurate, consistent, and testable. Software V&V is performed in parallel with software development, not at the conclusion of the development effort.
Software V&V is an extension of program management and software systems engineering. The execution of this rigorous methodology collects objective data and formulates conclusions to provide feedback about software quality, performance, and schedule to the development organization. This feedback suggests anomaly resolutions, performance improvements, and quality improvements over all expected operating conditions and across the full spectrum of the system and its interfaces. Early feedback allows the development organization to modify the software products in a timely fashion, thereby limiting cost and schedule impacts. Without a proactive approach, anomalies and associated software system changes remain undiscovered until later in the program schedule, resulting in proportionately greater program costs and schedule delays.
The verification process provides objective evidence regarding the ability of the software and its associated products and processes to:
- Conform to requirements (e.g., correctness, completeness, consistency, accuracy) for all life-cycle processes during each life-cycle phase (acquisition, supply, development, operation, and maintenance).
- Satisfy standards, practices, and conventions applicable to the work products.
- Successfully complete each life-cycle activity and satisfy all the entrance and exit criteria (see Topic 7.9 - Entrance and Exit Criteria) for initiating succeeding life-cycle phases (i.e., building the software correctly).
Software verification includes:
- Identification of selected software verification methods and success criteria across the life cycle, for example:
  - Software peer review/inspection procedures.
  - Re-review/inspection criteria.
  - Software verification of requirements through simulation.
  - Black-box and white-box testing techniques.
  - Software load testing procedures.
  - Software stress testing procedures.
  - Software performance testing procedures.
  - Decision-table-based testing procedures.
  - Functional-decomposition-based testing procedures.
  - Acceptance testing procedures.
  - Path coverage testing procedures.
  - Procedures for analysis of requirement implementation.
  - Software product demonstration procedures.
- Identification of selected work products to be verified.
- Description of software verification environments that are to be established for the project (e.g., software testing environment, system testing environment, regression testing environment).
- Identification of where actual software verification records and analysis of the results will be documented (e.g., test records, software peer review/inspection records) and where software verification corrective action will be documented.
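As a concrete illustration of one method from the list above, a black-box test exercises a software unit strictly against its specified interface and boundary values, without reference to the implementation. The sketch below is hypothetical: the function name, the clamped range, and the test cases are invented for illustration and are not drawn from NPR 7150.2 or any flight project.

```python
# Hypothetical example: black-box boundary-value testing of a duty-cycle
# limiter specified to clamp commands to the range [0.0, 1.0]. All names
# and values here are illustrative assumptions, not a prescribed method.

def clamp_duty_cycle(command: float) -> float:
    """Clamp a commanded duty cycle to the specified range [0.0, 1.0]."""
    return max(0.0, min(1.0, command))

def test_boundary_values() -> None:
    # Boundary-value cases: each boundary, a nominal value, and values
    # just outside the specified range on either side.
    cases = [(-0.1, 0.0), (0.0, 0.0), (0.5, 0.5), (1.0, 1.0), (1.1, 1.0)]
    for command, expected in cases:
        assert clamp_duty_cycle(command) == expected

test_boundary_values()
print("all boundary cases passed")
```

Recording cases like these against the success criteria identified in the plan makes the verification result objective: the test either matches the expected outputs or raises an anomaly to be tracked to closure.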
The development of a reasonable body of evidence requires a trade-off between the time available and the range of system conditions and assumptions used to perform the software verification tasks. Each project defines the criteria for a reasonable body of evidence (i.e., selecting a software level establishes one of the basic parameters), the time and schedule, and the scope of the analysis and test tasks (i.e., the range of system conditions and assumptions).
Planning helps identify and propose revisions to requirements that are not verifiable. Planning the software verification activities allows the software development team to evaluate and choose from the best of existing and new techniques and tools, and be trained in their use, before they are needed. Planning also allows a current project to utilize lessons learned from previous software project verification activities, including using more appropriate or more efficient techniques and ensuring the completeness of all steps in the process.
Having a plan also allows the software verification activity to be reviewed, checked for omissions, improved, and approved before it is implemented, ensuring the outcome will meet the expectations and goals of the verification activity. Planning also helps ensure the verification activity is cost-efficient and timely, allows a project to develop, schedule, or procure verification assets or environments before they are needed, and allows people to be allocated and trained in the use of these assets prior to the verification activities.
This requirement does not assign the responsibility for performing the software verification tasks to any specific organization. The analysis, evaluation, and test activities may be performed by multiple organizations; however, the methods and purpose will differ for each organization's functional objectives. Organizational assignments are captured during V&V planning and documented in an appropriate project plan.
The completion of the software V&V activity results in the following benefits to the program:
- Early detection and correction of software anomalies.
- Enhanced management insight into process and product risk.
- Support for the life cycle processes to ensure conformance to program performance, schedule, and budget.
- Early assessment of software and system performance.
- Objective evidence of software and system conformance to support a formal certification process.
- Identified improvements for the software development and maintenance processes.
- Process improvements for an integrated systems analysis model.
The choices for software verification activities are dependent upon the software requirements (see SWE-050 and SWE-051), the software architecture (see SWE-057) and design (see SWE-056 and SWE-058), the method of component and system integration (see SWE-060 and SWE-063), and the overall testing philosophy and approach (see SWE-062, SWE-066, and SWE-073).
The software verification engineer gains an understanding of these portions of the software development activities prior to developing the plan for software verification. In addition, the software verification engineer coordinates planning with the software validation planning activities (see SWE-029) to achieve the most efficient and integrated verification activities.
Software V&V needs to be executed on all of the primary software life cycle processes including:
- Software management processes.
- Software acquisition processes.
- Software supply processes.
- Software development processes.
- Software operation processes.
- Software maintenance processes.
The verification work-flow cycle in Figure 3-1 presents the basic steps for conducting a logical verification activity. It can be used iteratively and recursively during each phase of the software development life cycle (see SWE-019) to verify the software requirements, software work products, and the software units/components up to integrated systems of hardware and software. The four steps enveloped in the larger box are treated in this guidance. (The remaining two steps are discussed in the guidance for SWE-030.)
Figure 3-1. Verification Work Flow
The project team and software team review the plan and verification results at various life-cycle reviews (see Topic 7.8 - Maturity of Life Cycle Products at Milestone Reviews ), particularly whenever requirements change during the project. Any identified issues are captured in problem reports, change requests, and action items and resolved before the requirements are used as the basis for development activities.
The verification plan document contains a detailed description of the planned activities, including the verification methods, test activities, the testing environment(s), and a controlled schedule showing all the verification activities. The software verification activity plans are typically included in the Software Management or Development Plan (see Topic 7.18 - Documentation Guidance), or in a standalone Software Verification & Validation plan. Alternatively, they can be included in a project plan's V&V section. The following list suggests information items to include in a software verification section or plan:
- Referenced documents.
- V&V overview.
- Master schedule.
- Software Class or level scheme.
- Resources summary (include personnel).
- Tools, techniques, and methods (include test environments).
- Training (in the methods chosen and in the software product itself).
- V&V activities during the following processes:
  - Management.
  - Acquisition.
  - Supply.
  - Development.
  - Operation.
  - Maintenance.
- V&V reporting requirements:
  - Task reports.
  - Activity summary reports.
  - Anomaly reports.
  - V&V final report.
  - Special studies reports (optional).
  - Other reports (optional).
- V&V administrative requirements:
  - Anomaly resolution and reporting.
  - Task iteration policy.
  - Deviation policy.
  - Standards, practices, and conventions.
- V&V test documentation requirements.
NASA-specific verification planning resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
It is important to remember that verification activities occur in all phases of the project life cycle (e.g., requirements verification during Formulation and design verification during Implementation; see SWE-019). When planning the methods to use, the verification engineer selects the most appropriate method and does not simply assume that testing is the only choice.
Software Work Product Selection
Software work products include requirements and specifications, environmental and coding standards, software architectures, design descriptions, units, components, systems, and related items.
The plan covers verification of software work products that are developed in specific phases of the software development life cycle.
- During the concept phase, verification activities evaluate systems requirements against customer and stakeholder needs.
- During the requirements phase, verification activities evaluate functionally allocated requirements, bidirectional traceability matrices, and the software tools to be used to develop the software.
- During the coding and testing phase, software audits, inspections, unit testing, system testing, and integrated system testing all produce verification results.
- During product use, verification occurs when the product’s use in the intended operational environment satisfies all remaining project requirements (i.e., the product was built as intended).
Verification planning is reviewed during each phase of the life cycle and updated as needed based on results from earlier verification activities. Any identified issues are captured in problem reports, change requests, and corrective action activities. Each issue should be tracked to closure.
Verification planning requires the software development team to develop expected results for each verification activity. Satisfaction criteria may be given in numerical form (a specific value, a minimum value, a range of values). They may also be in a pass/fail or true/false format. The expected criteria for successful requirement verifications are typically entered into planning documents and data definition books.
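The criterion forms named above (a specific value, a minimum, a range, or pass/fail) can be recorded in a machine-checkable way so that verification results are compared against expected results mechanically rather than by eye. The sketch below is a minimal illustration of that idea; the criterion encoding, requirement IDs, and measurement values are all invented assumptions, not a format prescribed by the handbook.

```python
# Hypothetical sketch: expected results for verification activities
# recorded as satisfaction criteria in the forms named in the text.
# Requirement IDs and engineering values are illustrative only.

def satisfies(criterion: dict, measured) -> bool:
    """Return True if a measured result meets its satisfaction criterion."""
    kind = criterion["kind"]
    if kind == "exact":
        return measured == criterion["value"]
    if kind == "minimum":
        return measured >= criterion["value"]
    if kind == "range":
        low, high = criterion["range"]
        return low <= measured <= high
    if kind == "pass_fail":
        return measured is True
    raise ValueError(f"unknown criterion kind: {kind}")

criteria = {
    "REQ-101": {"kind": "range", "range": (4.75, 5.25)},  # bus voltage, V
    "REQ-102": {"kind": "minimum", "value": 50.0},        # update rate, Hz
    "REQ-103": {"kind": "pass_fail"},                     # self-test result
}

measurements = {"REQ-101": 5.1, "REQ-102": 48.0, "REQ-103": True}
for req_id, measured in measurements.items():
    status = "pass" if satisfies(criteria[req_id], measured) else "fail"
    print(req_id, status)
```

Entering criteria in this form into the planning documents and data definition books gives each verification activity an unambiguous completion check.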
4. Small Projects
Small projects may wish to consolidate their verification planning into the Software Development Plan (SDP) or document it as part of a verification or traceability matrix. They may also reduce the work effort to develop verification plans by using a plan template.
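For a small project documenting its planning as a verification or traceability matrix, the matrix can be as simple as one row per requirement naming the verification method, the verifying artifact, and the closure status. The sketch below shows one such minimal representation; the requirement IDs, method names, and artifact labels are hypothetical placeholders, not a prescribed format.

```python
# Hypothetical sketch of a minimal verification matrix for a small
# project. Each row: (requirement, verification method, verifying
# artifact, closed?). All identifiers are illustrative assumptions.

matrix = [
    ("SRS-001", "test",          "TP-01 unit test suite",  True),
    ("SRS-002", "analysis",      "AN-03 timing analysis",  True),
    ("SRS-003", "inspection",    "PR-07 peer review",      False),
    ("SRS-004", "demonstration", "DEMO-02 ops scenario",   False),
]

def open_items(matrix):
    """Return the requirements whose verification has not yet closed."""
    return [req for (req, _method, _artifact, closed) in matrix if not closed]

print("open verification items:", open_items(matrix))
```

Even at this scale, the matrix makes incomplete verification visible at each life-cycle review and gives problem reports and change requests a specific row to reference.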
No references have currently been identified for this SWE.
Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to verification planning:
- Software Modification Verification. Lesson Number 2816: This lesson indicates the need for software modification verification. The key lesson is to plan to run all procedural changes so that they are verified against math models of the system; this was not done because it was thought the system was well understood.
- MPL Uplink Loss Timer Software/Test Errors (1998). Lesson Number 0939: This lesson indicates the need for complete test and verification of software. "The Mars Polar Lander (MPL) flight software design contained mission-critical logic errors that were not detected during testing of the spacecraft due to omissions in the pre-launch test program and pre-launch uplink verification process." The recommendation is that "Unit and integration testing should, at a minimum, test against the full operational range of parameters. When changes are made to database parameters that affect logic decisions, the logic should be re-tested."
- MRO Articulation Keep-Out Zone Anomaly. Lesson Number 2044: This lesson indicates the need for requirements planning and verification planning. "An articulating solar array collided with the MRO spacecraft due to inadequate definition and verification/validation of system-level design requirements for implementing the appendage's keep-out zone in flight software." The project recommended that "special techniques" be applied "to increase confidence in requirements quality and verification completeness. For example, construct SysML or State Analysis models to ensure requirements discovery is complete and to allow early simulations." A key thought here is that verification planning can only be successful if requirements development is complete.