SWE-028 - Verification Planning

1. Requirements

2.4.1 The project shall plan software verification activities, methods, environments, and criteria for the project.

1.1 Notes

Software verification is a software engineering activity that shows confirmation that software products properly reflect the requirements specified for them. In other words, verification ensures that "you built it right." Examples of verification methods include but are not limited to: software peer reviews/inspections of software engineering products for discovery of defects, software verification of requirements by use of simulations, black box and white box testing techniques, software load testing, software stress testing, software performance testing, decision table-based testing, functional decomposition-based testing, acceptance testing, path coverage testing, analyses of requirement implementation, and software product demonstrations. Refer to the software plan requirements for software verification planning and incorporation, including the planned use of software IV&V activities.
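The methods listed above are planning choices rather than code, but a small sketch can make one of them concrete. The example below is a hedged illustration of decision table-based testing in Python; the power-mode function, its thresholds, and the table rows are invented for illustration and are not drawn from this handbook or any NASA project.

```python
# Illustrative sketch of decision table-based testing. Each row of the
# table pairs a combination of input conditions with the expected action;
# verification consists of exercising every row. The function and its
# thresholds are hypothetical.

def select_power_mode(battery_pct: float, in_eclipse: bool) -> str:
    """Toy flight rule: low battery forces safe mode; eclipse reduces power."""
    if battery_pct < 20.0:
        return "SAFE"
    if in_eclipse:
        return "LOW_POWER"
    return "NOMINAL"

# Decision table: (battery_pct, in_eclipse) -> expected mode
decision_table = [
    ((10.0, False), "SAFE"),       # low battery dominates
    ((10.0, True),  "SAFE"),
    ((50.0, True),  "LOW_POWER"),  # eclipse with adequate battery
    ((50.0, False), "NOMINAL"),
]

for (battery, eclipse), expected in decision_table:
    actual = select_power_mode(battery, eclipse)
    assert actual == expected, f"({battery}, {eclipse}): got {actual}, want {expected}"
print("decision table verified: all", len(decision_table), "rows pass")
```

The same tabular structure generalizes to black-box testing of any requirement whose behavior can be enumerated over input conditions.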

1.2 Applicability Across Classes

Class D and not Safety Critical is labeled "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class G is labeled "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class:       A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |  G   |  H
Applicable?:      |       |      |       |      |       |      | P(C)  |      |       |     | P(C) |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... |
X - Applicable with details, read above for more | P(C) - P (Center), follow Center requirements or procedures


2. Rationale

Verification planning ensures that all requirements will be properly evaluated and matched to the appropriate method of verification. Planning the software verification activities allows the software development team to evaluate and choose from the best of existing and new techniques and tools, and be trained in their use, before they are needed. Planning also allows a current project to utilize lessons learned from previous software project verification activities, including using more appropriate or more efficient techniques and ensuring the completeness of all steps in the process.

Having a plan also allows the software verification activity to be reviewed, checked for omissions, improved, and approved before it is implemented, to ensure the outcome will meet the expectations and goals of the verification activity. Planning also helps to ensure the verification activity is cost-efficient and timely, and it allows a project to develop, schedule, or procure verification assets or environments before they are needed, as well as allocate and train people in the use of these assets prior to the verification activities.


3. Guidance

"Verification proves that a realized product for any system model within the system structure conforms to the build-to requirements (for software elements)...". 2

Software verification confirms that work products properly reflect the requirements specified for them. In other words, verification ensures that "you built it right." The software verification process and the software validation process (see [SWE-029]) are interrelated and complementary. Each process uses the other's results to establish completion criteria for each software life cycle activity.

Software verification and validation (V&V) processes determine whether the work products of a given activity conform to the requirements for that product and whether the software satisfies its intended use and user needs. Software V&V life cycle processes should be selected and tailored as needed for different software classes. Software verification processes have applicability to software-based systems, computer software, hardware, and interfaces. Software verification processes include analysis, evaluation, review, inspection, assessment, and testing of software products during each phase of the software life cycle (see the NASA Systems Engineering Handbook for an explanation of these verification activities; IEEE Std 1012, Standard for Software Verification and Validation Plans, also provides a good overview and definitions of the software verification tasks that can be executed on particular software classes as needed).

The purpose of these processes is to help the development organization build quality into the software during the software life cycle. The processes provide an objective assessment of software products and processes throughout the software life cycle. This assessment demonstrates whether the software requirements and system requirements (i.e., those allocated to software) are correct, complete, accurate, consistent, and testable. Software V&V is performed in parallel with software development, not at the conclusion of the development effort.

Software V&V is an extension of program management and software systems engineering. The execution of this rigorous methodology collects objective data and formulates conclusions to provide feedback about software quality, performance, and schedule to the development organization. This feedback suggests anomaly resolutions, performance improvements, and quality improvements over all expected operating conditions and across the full spectrum of the system and its interfaces. Early feedback allows the development organization to modify the software products in a timely fashion, thereby limiting cost and schedule impacts. Without a proactive approach, anomalies and associated software system changes remain undiscovered until later in the program schedule, resulting in proportionately greater program costs and schedule delays.

The verification process provides objective evidence regarding the ability of the software and its associated products and processes to:

  • Conform to requirements (e.g., correctness, completeness, consistency, accuracy) for all life cycle processes during each life cycle phase (acquisition, supply, development, operation, and maintenance)
  • Satisfy standards, practices, and conventions applicable to the work products
  • Successfully complete each life cycle activity and satisfy all the exit and entrance criteria (see Topic 7.3 ) for initiating succeeding life cycle phases (e.g., building the software correctly)

Software verification includes:

  • Identification of selected software verification methods and success criteria across the life cycle (e.g., software peer review/inspections procedures, re-review/inspection criteria, testing procedures).
  • Identification of selected work products to be verified.
  • Description of software verification environments that are to be established for the project (e.g., software testing environment, system testing environment, regression testing environment).
  • Identification of where actual software verification records and analysis of the results will be documented (e.g., test records, software peer review/inspection records) and where software verification corrective action will be documented.
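As a hedged sketch (not a handbook artifact), the four planning elements above can be captured as one record per verification activity. The class name, fields, and sample entries below are invented for illustration under the assumption that a project tracks its plan in a simple structured form.

```python
# Hypothetical sketch: one record per planned verification activity,
# covering method/criteria, work product, environment, and where
# results and corrective actions will be recorded. All values invented.
from dataclasses import dataclass

@dataclass
class VerificationActivity:
    work_product: str      # what is verified
    method: str            # peer review, analysis, test, demonstration, ...
    environment: str       # where the verification is performed
    success_criteria: str  # how pass/fail is judged
    records_location: str  # where results and corrective actions go

plan = [
    VerificationActivity("Software Requirements Spec", "peer review/inspection",
                         "inspection meeting", "no major defects open",
                         "inspection records database"),
    VerificationActivity("Flight software build 1.0", "regression test",
                         "regression testing environment",
                         "100% of regression suite passes", "test records"),
]

for a in plan:
    print(f"{a.work_product}: {a.method} in {a.environment}")
```

A structure like this also makes it easy to audit the plan for work products that have no assigned method or environment.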

The development of a reasonable body of evidence requires a trade-off between the amount of time required and the size of the set of system conditions and assumptions against which to perform the software verification tasks. Each project should define criteria for a reasonable body of evidence (i.e., selecting a software level establishes one of the basic parameters), the time, schedule, and scope of the analysis and test tasks (i.e., the range of system conditions and assumptions).

This requirement does not assign the responsibility for performing the software verification tasks to any specific organization. The analysis, evaluation, and test activities may be performed by multiple organizations; however, the methods and purpose will differ for each organization's functional objectives. Organizational assignments should be captured in the software V&V plan.

The completion of the software V&V activity results in the following benefits to the program:

  • Early detection and correction of software anomalies
  • Enhanced management insight into process and product risk
  • Support for the life cycle processes to ensure conformance to program performance, schedule, and budget
  • Early assessment of software and system performance
  • Objective evidence of software and system conformance to support a formal certification process
  • Identified improvements for the software development and maintenance processes
  • Process improvements for an integrated systems analysis model

The choices for software verification activities are dependent upon the software requirements (see [SWE-050] and [SWE-051]), the software architecture (see [SWE-057]) and design (see [SWE-056] and [SWE-058]), the method of component and system integration (see [SWE-060] and [SWE-063]), and the overall testing philosophy and approach (see [SWE-062] and [SWE-066] and [SWE-073]).

The software verification engineer must understand these portions of the software development activities before developing the plan for software verification. In addition, the software verification engineer must coordinate planning with the software validation planning activities (see [SWE-029]) to achieve the most efficient and integrated verification activities.

Software V&V needs to be executed on all of the primary software life cycle processes, including:

  • Software management processes
  • Software acquisition processes
  • Software supply processes
  • Software development processes
  • Software operation processes
  • Software maintenance processes

Verification Activities

The verification work flow cycle in Figure 3-1 presents the basic steps for conducting a logical verification activity. It can be used iteratively and recursively during each phase of the software development life cycle (see [SWE-019]) to verify the software requirements, software work products, and the software units/components up to integrated systems of hardware and software. The four steps enveloped in the larger box are treated in this guidance. (The remaining two steps are discussed in the guidance for [SWE-030].)

The project team and software team need to review the plan and verification results at various life cycle reviews (see Topic 7.4 ), particularly whenever requirements change during the project. Any identified issues can be captured in problem reports, change requests, and action items and resolved before the requirements are used as the basis for development activities.

The verification plan document contains a detailed description of the planned activities, including the verification methods, test activities, the testing environment(s), and a controlled schedule showing all the verification activities. The software verification activity plans are typically included in the Software Management or Development Plan (see [SWE-102]), or in a standalone Software Verification & Validation plan. Alternatively, they can be included in a project plan's verification and validation (V&V) section. The following list suggests information items to include in a software verification section or plan:

  • Purpose
  • Referenced documents
  • Definitions
  • V&V overview
  • Organization
  • Master schedule
  • Software Class or level scheme
  • Resources summary
  • Responsibilities
  • Tools, techniques, and methods
  • V&V processes
    • Process: Management
    • Process: Acquisition
    • Process: Supply
    • Process: Operation
    • Process: Maintenance
  • V&V reporting requirements
    • Task reports
    • Activity summary reports
    • Anomaly reports
    • V&V final report
    • Special studies reports (optional)
    • Other reports (optional)
  • V&V Administrative requirements
    • Anomaly resolution and reporting
    • Task iteration policy
    • Deviation policy
  • Standards, practices, and conventions
  • V&V test documentation requirements

See the Resources section for other template examples 1, 2 of information to include in a software verification plan.

It is important to remember that verification activities occur in all phases of the project life cycle (e.g., requirements verification during Formulation and design verification during Implementation; see [SWE-019]). When planning the methods to use, the verification engineer should select the most appropriate method and not simply assume that testing is the only choice.

Software Work Product Selection

Software work products include requirements and specifications, environmental and coding standards, software architectures, design descriptions, units, components, systems, and related items.

The plan should cover verification of software work products that are developed in specific phases of the software development life cycle.

  • During the concept phase, verification activities should evaluate system requirements against customer and stakeholder needs
  • During the requirements phase, verification activities should cover the functionally allocated requirements and the bidirectional traceability matrices, and should confirm that the software tools used to develop the software are verified against the project plan
  • For the coding and testing phase, software audits, inspections, unit testing, systems testing, and integrated systems testing will all produce verification results
  • During product use, verification confirms that the product's use in the intended operational environment satisfies all remaining project requirements (i.e., the product was built as intended)

Verification planning should be reviewed during each phase of the life cycle and updated as needed based on results from earlier verification activities. Any identified issues should be captured in problem reports, change requests, and corrective action activities. Each issue should be tracked to closure.
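One planning aid named above, the bidirectional traceability matrix, can be sketched as a simple data structure that is checked in both directions. The requirement and test-case IDs below are invented for illustration; real projects typically maintain this in a requirements management tool.

```python
# Hypothetical sketch of a bidirectional traceability check: every
# requirement should trace forward to at least one verification case,
# and every verification case should trace back to a requirement.

forward = {                 # requirement -> verification cases
    "SRS-001": ["TC-01", "TC-02"],
    "SRS-002": ["TC-03"],
    "SRS-003": [],          # untraced requirement: a planning gap
}

# Derive the backward direction from the forward map
backward: dict[str, list[str]] = {}
for req, cases in forward.items():
    for case in cases:
        backward.setdefault(case, []).append(req)

untraced_reqs = [r for r, cases in forward.items() if not cases]
print("requirements without a verification case:", untraced_reqs)
print("TC-01 traces back to:", backward["TC-01"])
```

Running such a check at each life cycle review surfaces gaps like SRS-003 before they become unverified requirements at delivery.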

Verification Criteria

Verification planning requires the software development team to develop expected results for each verification activity. Satisfaction criteria may be given in numerical form (a specific value, a minimum value, a range of values). They may also be in a pass/fail or true/false format. The expected criteria for successful requirement verifications are typically entered into planning documents and data definition books.
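The criteria forms named above (a specific value, a minimum or maximum value, a range, pass/fail) can be expressed as executable checks that a verification result is evaluated against. This is a hedged sketch; the parameter names, limits, and measured values are invented for illustration.

```python
# Hypothetical sketch: satisfaction criteria as predicates, one per
# measured parameter, covering the forms named in the text above.

def within(lo: float, hi: float):
    """Range-of-values criterion."""
    return lambda x: lo <= x <= hi

criteria = {
    "boot_time_s":    lambda x: x <= 5.0,   # maximum value
    "sample_rate_hz": lambda x: x == 100,   # specific value
    "bus_voltage_v":  within(24.0, 32.0),   # range of values
    "selftest_pass":  lambda x: x is True,  # pass/fail
}

measured = {"boot_time_s": 4.2, "sample_rate_hz": 100,
            "bus_voltage_v": 28.1, "selftest_pass": True}

for name, check in criteria.items():
    verdict = "PASS" if check(measured[name]) else "FAIL"
    print(f"{name}: {verdict}")
```

Recording the criteria in this explicit form before testing begins keeps the expected results independent of the observed results, which is the point of defining them during planning.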


4. Small Projects

Small projects may wish to consolidate their verification planning into the Software Development Plan (SDP) or document it as part of a verification or traceability matrix. They may also reduce the work effort to develop verification plans by using a plan template. 9, 10


5. Resources

  1. IEEE Std 12207, Systems and software engineering — Software life cycle processes, 2008
  2. NASA Systems Engineering Handbook, NASA/SP-2007-6105 Rev1, 2007
  3. NASA Systems Engineering Processes and Requirements with Change 1, NPR 7123.1A, 2009
  4. NASA Governance and Strategic Management Handbook, NPD 1000.0A, 2008
  5. NASA Engineering and Program/Project Management Policy, NPD 7120.4D, 2010
  6. NASA Space Flight Program and Project Management Requirements, NPR 7120.5D (NM-7120.81), 2009
  7. COTS Software: Vendor Demonstration Guidelines and Scripts, Defense Acquisition University, 2009
  8. IEEE Std 1012, Standard for Software Verification and Validation Plans, 2004
  9. Software Verification and Validation Plan (SVVP) Template (based on IEEE standards), Texas State University Computer Science Department, 2001
  10. Verification and Validation Plan Template, adapted from NASA-SP-6105.

5.1 Tools

Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

Tool name: WindRiver Workbench 3.2
Type: COTS
Owner/Source: Wind River
Link: http://www.windriver.com/products/workbench/ ...
Description: "Based on the Eclipse platform, Wind River Workbench is a collection of tools that accelerates time-to-market for developers building devices with VxWorks and Wind River Linux. Through tight integration with the industry's leading RTOS and the leading device Linux distribution, Workbench offers the only end-to-end, open standards-based collection of tools for device software design, development, debugging, test, and management."
User: IV&V Centers?, ARC (no version noted)

Tool name: Simics 4.4
Type: COTS
Owner/Source: Wind River
Link: http://www.windriver.com/products/simics/ ...
Description: "Wind River Simics is a full system simulator used by software developers to simulate any target hardware from a single processor to large, complex, and connected electronic systems. This simulation enables the target software (board support package, firmware, real-time operating system, middleware, and application) to run on a virtual platform the same way it does on the physical hardware."
User: IV&V Centers?, JSC


6. Lessons Learned

  • The following lesson learned from the Hubble Space Telescope project (Lesson Number 2816) indicates the need for verification of software modifications.
    The key lesson was to plan to run all procedural changes against math models of the system so that they are verified. This was not done because it was thought that the system was well understood. (http://www.nasa.gov/offices/oce/llis/imported_content/lesson_2816.html)
  • The following lesson learned from the Mars Polar Lander project (Lesson Number 0939) indicates the need for complete test and verification of software.
    The Mars Polar Lander (MPL) flight software design contained mission-critical logic errors that were not detected during testing of the spacecraft due to omissions in the pre-launch test program and pre-launch uplink verification process. Unit and integration testing should, at a minimum, test against the full operational range of parameters. When changes are made to database parameters that affect logic decisions, the logic should be re-tested. (http://www.nasa.gov/offices/oce/llis/0939.html)
  • The following lesson learned from the Mars Reconnaissance Orbiter project (Lesson Number 2044) indicates the need for requirements planning and verification planning.
    An articulating solar array collided with the MRO spacecraft due to inadequate definition and verification/validation of system-level design requirements for implementing the appendage's keep-out zone in flight software.
    The project recommended that special techniques be applied to increase confidence in requirements quality and verification completeness. For example, construct SysML or State Analysis models to ensure requirements discovery is complete and to allow early simulations.
    A key point is that verification planning can only be successful if requirements development is complete. (http://www.nasa.gov/offices/oce/llis/2044.html)