

SWE-029 - Validation Planning

1. Requirements

2.4.2 The project shall plan the software validation activities, methods, environments, and criteria for the project.

1.1 Notes

Software validation is a software engineering activity that confirms that the software product, as provided (or as it will be provided), fulfills its intended use in its intended environment. In other words, validation ensures that "you built the right thing." Examples of validation methods include, but are not limited to: formal reviews, prototype demonstrations, functional demonstrations, software testing, software peer reviews/inspections of software product components, behavior in a simulated environment, acceptance testing against mathematical models, analyses, and operational environment demonstrations. Refer to the software plan requirements for software validation planning and incorporation (Chapter 5 [of NPR 7150.2, NASA Software Engineering Requirements, section 5.1.1]) [detailed in SWE-102 in this Handbook].

1.2 Applicability Across Classes

Class D Non-Safety Critical and Class G are labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement may be used to satisfy this requirement.

Class         A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F   |  G   |  H
Applicable?        |       |      |       |      |       |      | P(C)  |      |       |      | P(C) |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures

2. Rationale

Planning is appropriate for any activity that is to be repeated, that needs to be verified before use, and that requires thought before implementation. Planning the validation activity allows the project team to put more thought into tasks, methods, environments, and related criteria before they are implemented. The identification of validation resources, such as validation environments, allows them to be developed before they are needed. Planning also allows a current project to improve based on lessons learned from previous projects, including using more appropriate or efficient techniques and ensuring the completeness of all steps in the process.

Having a plan also allows the validation activity to be reviewed, improved, and verified before it is implemented to ensure the outcome will meet the expectations and goals of the validation activity. Planning also helps to ensure the validation activity is cost-efficient and timely.

3. Guidance

Validation includes establishing roles, responsibility, and authority to plan, perform, analyze, and report validation activities. This is often necessary when some or all of the software development is performed under contract. It is also important when the validation environment is a service performed by another organization (e.g., high-fidelity simulators or system integration labs).

The basic validation process is shown below with the steps addressed by this requirement highlighted:

Figure 3.1. Validation Process With Planning Steps Highlighted

Validation activities are not performed in an ad hoc manner, but are planned and captured in a validation plan document. The validation plan is typically part of a verification and validation (V&V) plan, a software V&V plan (SVVP), or is included in the Software Management / Development Plan (SMP/SDP).

The plan covers the validation activities that will occur at various times in the development life cycle including:

  • During requirements development, validation is accomplished by bringing in the customer and outside people for a review of the requirements, e.g., focus groups, requirements reviews, etc.
  • During design, validation occurs when the customers have a chance to view prototypes of the product or pieces of the product, e.g., focus groups, user groups, etc.
  • During implementation, validation occurs when team members review the behavior of software components under both nominal and exception scenarios. For example, a peer review or inspection could trace the execution path through the code under representative scenarios.
  • Prior to delivery, validation occurs when customers see the completed product function in a nearly operational environment, e.g., acceptance testing, operational demonstrations, etc.
  • During product use, validation occurs when the product is used in the operational environment in the way the customer expects it to be used.
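The phase-by-phase pairings above can be captured as a simple lookup when drafting the plan. The sketch below is illustrative only; the phase labels and method names are hypothetical shorthand for the bullets above, not NPR 7150.2 terminology:

```python
# Illustrative mapping of life-cycle phases to example validation
# methods, taken from the bulleted list above. Names are hypothetical
# labels for planning purposes, not NPR 7150.2 terminology.
VALIDATION_METHODS_BY_PHASE = {
    "requirements": ["focus groups", "requirements reviews"],
    "design": ["prototype demonstrations", "user groups"],
    "implementation": ["peer reviews", "inspections"],
    "pre-delivery": ["acceptance testing", "operational demonstrations"],
    "operations": ["use in the operational environment"],
}

def methods_for(phase: str) -> list[str]:
    """Return the planned validation methods for a given phase."""
    return VALIDATION_METHODS_BY_PHASE.get(phase, [])
```

A table like this in the plan makes it easy to confirm that every life cycle phase has at least one planned validation activity.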

The project team reviews the plan and validation results at various life cycle reviews, particularly whenever changes occur throughout the duration of the project. Any identified issues are captured in problem reports/change requests/action items and resolved before the requirements are used as the basis for development activities.

Validation is often on the critical path to project completion. Validation activities, therefore, need to be planned and tracked in order to realistically assess progress toward completion. The validation plan will address more than just validation of software requirements. It includes a schedule, stakeholder involvement, and planned reviews, if they are required to complete the validation activities and gain agreement that the requirements are a correct and acceptable description of the system or software to be implemented. Other elements to include in the overall plan:

  • Scope.
  • Approach.
  • Resources.
  • Specific tasks and activities.
  • Validation methods and criteria (SWE-102).
  • Identification of work products to be validated (SWE-102).
  • Identification of where validation records and corrective actions will be captured (SWE-102).
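The plan elements listed above can be sketched as a minimal record, useful for templating or tooling. This is an illustrative skeleton only; the field names are assumptions, not a NASA-defined schema:

```python
from dataclasses import dataclass, field

# Minimal sketch of a validation-plan record covering the elements
# listed above. Field names are illustrative, not a NASA schema.
@dataclass
class ValidationPlan:
    scope: str
    approach: str
    resources: list[str] = field(default_factory=list)
    tasks: list[str] = field(default_factory=list)
    methods_and_criteria: dict[str, str] = field(default_factory=dict)  # SWE-102
    work_products: list[str] = field(default_factory=list)              # SWE-102
    records_location: str = ""                                          # SWE-102

# Hypothetical example entry:
plan = ValidationPlan(
    scope="Flight software build 2 validation",
    approach="Demonstration in a high-fidelity simulator",
)
```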

The Scope and Approach sections of the plan identify the project and define the purpose and goals of the plan including responsibilities, assumptions, and a summary of the efforts described in the plan.

Resources include personnel, environments (such as simulators, facilities, tools, etc.), and include any skills and/or training necessary for those resources to carry out the validation activities.

When developing the validation plan, consider the following for inclusion:

  • Identifying the key functions and/or components that require validation (based on criticality, safety, security, etc.).
  • Identifying the validation methods, techniques, and tests to carry out the validation activities for components as well as the system as a whole (see SWE-055 - Requirements Validation).
  • COTS (Commercial Off the Shelf), GOTS (Government Off the Shelf), and MOTS (Modified Off the Shelf) effects on the project and associated validation planning (SWE-027 - Use of Commercial, Government, and Legacy Software).
  • Identifying criteria by which success will be measured for each validation activity.
  • Establishing the target environment (which could be a high-fidelity simulation) for validating the software or system, including validation of tools used in those environments (see SWE-073 - Platform or High-Fidelity Simulations).
  • Models, simulations, and/or analysis tools and associated validation planning (SWE-070 - Models, Simulations, Tools, SWE-135 - Static Analysis, SWE-136 - Validation of Software Development Tools).
  • Identifying how the results will be documented and reported, and when and to whom they will be reported (SWE-031 - Validation Results).
  • Issue resolution (capture and tracking to closure) for issues or findings identified during validation activities (could be as simple as using the project configuration management process) (see SWE-031 - Validation Results).
  • Identifying validation activities, as applicable, to occur during the various life cycle phases.
  • Re-validation plans to accommodate changes as the system is developed.
  • Method for obtaining customer approval of the validation plan, if applicable.
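The checklist above lends itself to a simple completeness check before the plan goes to review. The sketch below assumes hypothetical topic labels for each bullet; it only demonstrates the idea of flagging unaddressed considerations:

```python
# Hypothetical completeness check: flag checklist topics a draft
# validation plan does not yet address. Topic labels are illustrative
# shorthand for the considerations listed above.
REQUIRED_TOPICS = {
    "key functions", "methods", "COTS/GOTS/MOTS", "success criteria",
    "target environment", "models and tools", "reporting",
    "issue resolution", "life cycle phases", "re-validation",
    "customer approval",
}

def missing_topics(covered: set[str]) -> set[str]:
    """Return checklist topics the draft plan does not yet address."""
    return REQUIRED_TOPICS - covered

# Example: a draft that covers only three topics.
draft = {"methods", "success criteria", "reporting"}
gaps = missing_topics(draft)
```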

If not part of the team developing the validation plan, Software Assurance needs to be part of the plan's review team to ensure the plan meets all assurance requirements.

Additional guidance related to validation planning may be found in the following related requirements in this Handbook:

  • SWE-031 - Validation Results
  • SWE-055 - Requirements Validation
  • SWE-102 - Software Development/Management Plan

4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

5. Resources

  • (SWEREF-019) FSW Testbed Validation Description, 582-2008-006, Version 1, Flight Software Systems Branch, NASA Goddard Space Flight Center (GSFC), 2008. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
  • (SWEREF-209) IEEE Computer Society, IEEE Std 1012-2012 (Revision of IEEE Std 1012-2004). This link requires an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov). Once logged in, users can access Standards Organizations, IEEE, and then search to get to authorized copies of IEEE standards.
  • (SWEREF-211) IEEE Computer Society, IEEE Std 1059-1993, 1993. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
  • (SWEREF-264) NPR 7120.7, NASA Office of the Chief Engineer, 2008. The following appears on this document (web): SPECIAL ATTENTION: ONLY USE NID 7120.99, NASA Information Technology and Institutional Infrastructure Program and Project Management Requirements, as it is the interim directive to NPR 7120.7 and contains the most recent requirements. Do not use other links on that page.
  • (SWEREF-279) Dolores R. Wallace, Laura M. Ippolito, Barbara B. Cuthill, National Institute of Standards and Technology (NIST), NIST Special Publication 500-234, 1996.
  • (SWEREF-334) Software Quality Assurance.org. Accessed December 20, 2017.
  • (SWEREF-463) Verification and Validation Document: Plan and Report for the System X GFE, Johnson Space Center, 2012.

5.1 Tools

Tools relevant to this SWE may be found in the table below. You may wish to reference the Tools Table in this Handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

Tool name: Simics 4.4
Type: COTS
Owner/Source: Wind River
Link: http://www.windriver.com/products/simics/
Description: "Wind River Simics is a full system simulator used by software developers to simulate any target hardware from a single processor to large, complex, and connected electronic systems. This simulation enables the target software (board support package, firmware, real-time operating system, middleware, and application) to run on a virtual platform the same way it does on the physical hardware."
User: IV&V Centers ?, JSC

6. Lessons Learned

There are currently no Lessons Learned identified for this requirement.
