
This version of the SWEHB is associated with NPR 7150.2B.

SWE-029 - Validation Planning

1. Requirements

3.10.3 The project shall plan the software validation activities, methods, environments, and criteria for the project.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Class:        A  |  B  |  C  |  CSC  |  D  |  DSC  |  E  |  F  |  G  |  H

Applicable?   (per-class marks appear as icons in the Handbook table)

Key: ✓ = Applicable | ✗ = Not Applicable
A & B = Always Safety Critical; C & D = Not Safety Critical; CSC & DSC = Safety Critical; E - H = Never Safety Critical.

2. Rationale

“The Validation Process provides evidence for whether the products perform the following:

  • Satisfy system and software requirements allocated to the products at the end of each life cycle activity
  • Solve the right problem (e.g., correctly model physical laws, implement business rules, and use the proper system assumptions)
  • Satisfy intended use and user needs in the operational environment (i.e., builds the correct product)” 209

“The dynamics of complex systems and the multitude of different logic paths available within the system in response to varying stimuli and conditions demand that the validation effort examines the correctness of the system for each possible variation in conditions. The ability to model complex, real-world conditions will be limited, and thus, the validation effort examines whether the limits of the modeling are realistic and reasonable for the desired solution. The unlimited combination of system conditions presents the [verification and validation] V&V effort with the challenge of using a finite set of analytical, test, simulation, and demonstration techniques to establish a reasonable body of evidence that the system is correct." 209

V&V processes “are used to determine whether the development products of a given activity conform to the requirements of that activity and whether the product satisfies its intended use and user needs.” 209 Software validation is a software engineering activity that confirms that the software product, as provided (or as it will be provided), fulfills its intended use in its intended environment. In other words, validation comprises multiple processes and techniques applied throughout the project life cycle, including, but not limited to, testing, to ensure that "you built the right thing."

3. Guidance

Software validation is defined as confirmation that the product, as provided (or as it will be provided), fulfills its intended use. In other words, validation ensures that “you built the right thing.”

“V&V processes provide an objective assessment of products and processes throughout the life cycle. This assessment demonstrates whether the requirements are correct, complete, accurate, consistent, and testable. The V&V processes determine whether the products of a given activity conform to the requirements of that activity and whether the product satisfies its intended use and user needs. The determination includes assessment, analysis, evaluation, review, inspection, and testing of products and processes. V&V tasks shall be performed in parallel with all life cycle stages, not at their conclusion.” 209

“The results of V&V provide the following benefits to the program:

  • Facilitate early detection and correction of anomalies
  • Enhance management insight into process and product risks
  • Support the life cycle processes to assure conformance to program performance, schedule, and budget
  • Provide an early assessment of performance
  • Provide objective evidence of conformance to support a formal certification process
  • Improve the products from the acquisition, supply, development, and maintenance processes
  • Support process improvement activities” 209

Validation includes establishing the roles, responsibilities, and authority to plan, perform, analyze, and report validation activities. This is especially important when some or all of the software development is performed under contract, or when the validation environment is a service provided by another organization (e.g., high-fidelity simulators or system integration laboratories).

The basic validation process is shown below with the steps addressed by this requirement highlighted:

                             Figure 3.1. Validation Process With Planning Steps Highlighted

Planning is appropriate for any activity that is to be repeated, that needs to be verified before use, or that requires thought before implementation. Planning the validation activity allows the project team to put more thought into tasks, methods, environments, and related criteria before they are implemented. Identifying validation resources, such as validation environments, early allows them to be developed before they are needed. Planning also allows a current project to improve based on lessons learned from previous projects, including using more appropriate or efficient techniques and ensuring the completeness of all steps in the process.

Having a plan also allows the validation activity to be reviewed, improved, and verified before it is implemented to ensure the outcome will meet the expectations and goals of the validation activity. Planning also helps to ensure the validation activity is cost-efficient and timely.

Validation activities are not performed in an ad hoc manner; they are planned and captured in a validation plan document. The validation plan is typically part of a verification and validation (V&V) plan or a software V&V plan (SVVP), or it is included in the Software Management / Development Plan (SMP/SDP).

The plan covers the validation activities that will occur at various times in the development life cycle, including:

  • During requirements development, validation is accomplished by involving the customer and outside reviewers in a review of the requirements, e.g., focus groups, requirements reviews, formal reviews, etc.
  • During design, validation occurs when the customers have a chance to view prototypes of the product or pieces of the product, e.g., focus groups, user groups, formal reviews, etc. 
  • During implementation, validation occurs when team members review the behavior of software components under both nominal and exception scenarios. For example, a peer review or inspection could trace the execution path through the code under representative scenarios.
  • Prior to delivery, validation occurs when customers see the completed product function in a nearly operational environment, e.g., acceptance testing, operational demonstrations, etc.
  • During product use, validation occurs when the product is used in the operational environment in the way the customer expects it to be used. 

Other examples of validation methods include, but are not limited to: functional demonstrations, acceptance testing against mathematical models, software testing, software peer reviews/inspections of software product components, behavior in a simulated environment, and analyses. Refer to the software plan requirements for software validation planning and incorporation (see 7.18 - Documentation Guidance) in this Handbook.

The project team reviews the plan and validation results at various life cycle reviews, particularly whenever changes occur throughout the duration of the project. Any identified issues are captured in problem reports/change requests/action items and resolved before the requirements are used as the basis for development activities.

Validation is often on the critical path to project completion. Validation activities, therefore, need to be planned and tracked in order to realistically assess progress toward completion. The validation plan addresses more than just validation of software requirements: it includes a schedule, stakeholder involvement, and any planned reviews required to complete the validation activities and to gain agreement that the requirements are a correct and acceptable description of the system or software to be implemented. Other elements to include in the overall plan:

The Scope and Approach sections of the plan identify the project and define the purpose and goals of the plan including responsibilities, assumptions, and a summary of the efforts described in the plan.

Resources include personnel, environments (such as simulators, facilities, tools, etc.), and include any skills and/or training necessary for those resources to carry out the validation activities.

When developing the validation plan, consider the following for inclusion:

  • Identifying the key functions and/or components that require validation (based on criticality, safety, security, etc.).
  • Identifying the validation methods, techniques, tests to carry out the validation activities for components as well as the system as a whole (see SWE-055).
  • Commercial Off the Shelf (COTS), Government Off the Shelf (GOTS), Modified Off the Shelf (MOTS) effects on the project and associated validation planning (SWE-027).
  • Identifying criteria by which success will be measured for each validation activity.
  • Establishing the target environment (which could be a high-fidelity simulation) for validating the software or system, including validation of tools used in those environments (see SWE-073).
  • Models, simulations, and/or analysis tools and associated validation planning (SWE-070, SWE-135, SWE-136).
  • Identifying how the results will be documented and reported, when and to whom they will be reported (SWE-031).
  • Issue resolution (capture and tracking to closure) for issues or findings identified during validation activities (could be as simple as using the project configuration management process) (see SWE-031).
  • Identifying validation activities, as applicable, to occur during the various life-cycle phases.
  • Re-validation plans to accommodate changes as the system is developed.
  • Method for obtaining customer approval of the validation plan, if applicable.
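A checklist like the one above lends itself to a simple completeness check on a draft plan before it goes to review. The sketch below is a hedged illustration: the element names and the draft plan are hypothetical, not an official schema from NPR 7150.2 or this Handbook.

```python
# Hypothetical checklist of plan elements, condensed from the list above.
REQUIRED_ELEMENTS = [
    "key_functions",       # functions/components that require validation
    "methods",             # validation methods/techniques per component
    "success_criteria",    # criteria by which success will be measured
    "target_environment",  # environment, e.g., high-fidelity simulation
    "reporting",           # how, when, and to whom results are reported
    "issue_resolution",    # capture and tracking of findings to closure
    "revalidation",        # plans to accommodate change
]

def missing_elements(plan: dict) -> list:
    """Return checklist elements that are absent or empty in a draft plan."""
    return [element for element in REQUIRED_ELEMENTS if not plan.get(element)]

# Illustrative draft plan, still missing two checklist elements.
draft_plan = {
    "key_functions": ["attitude control", "fault management"],
    "methods": {"attitude control": "simulation", "fault management": "test"},
    "success_criteria": "all scenarios meet their pass criteria",
    "target_environment": "high-fidelity simulator",
    "reporting": "results reported at milestone reviews",
}

print(missing_elements(draft_plan))
```

Running a check like this during plan development gives the review team (including Software Assurance) a concrete list of open items to resolve before approval.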

If Software Assurance is not part of the team developing the validation plan, it should be part of the plan’s review team to ensure the plan meets all assurance requirements.

NASA-specific validation planning resources and process information are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

See also related requirements in this Handbook:

SWE-031, Validation Results
SWE-055, Requirements Validation

4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

5. Resources

  • (SWEREF-019) FSW Testbed Validation Description, 582-2008-006, Version 1, Flight Software Systems Branch, NASA Goddard Space Flight Center (GSFC), 2008. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

  • (SWEREF-209) IEEE Computer Society, IEEE Std 1012-2012 (Revision of IEEE Std 1012-2004). This link requires an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov ). Once logged in, users can access Standards Organizations, IEEE, and then search to get to authorized copies of IEEE standards.

  • (SWEREF-211) IEEE Computer Society, IEEE Std 1059-1993, 1993. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.

  • (SWEREF-264) NPR 7120.7, NASA Office of the Chief Engineer, 2008. The following appears on this document (web): SPECIAL ATTENTION: ONLY USE NID 7120.99, NASA Information Technology and Institutional Infrastructure Program and Project Management Requirements, as it is the interim directive to NPR 7120.7 and contains the most recent requirements. DO NOT USE OTHER LINKS ON THIS PAGE.

  • (SWEREF-279) Dolores R. Wallace, Laura M. Ippolito, Barbara B. Cuthill, National Institute of Standards and Technology (NIST), NIST Special Publication 500-234, 1996.

  • (SWEREF-334) Software Quality Assurance.org. Accessed December 20, 2017.

  • (SWEREF-463) Verification and Validation Document: Plan and Report for the System X GFE, Johnson Space Center, 2012.

5.1 Tools

Tools relevant to this SWE may be found in the table below. You may wish to reference the Tools Table in this Handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

Tool name: Simics 4.4
Type: COTS
Owner/Source: Wind River
Link: http://www.windriver.com/products/simics/ ...
Description: "Wind River Simics is a full system simulator used by software developers to simulate any target hardware from a single processor to large, complex, and connected electronic systems. This simulation enables the target software (board support package, firmware, real-time operating system, middleware, and application) to run on a virtual platform the same way it does on the physical hardware."
User: IV&V Centers ?, JSC

6. Lessons Learned

There are currently no Lessons Learned identified for this requirement.
