{tabsetup:1. The Requirement|2. Rationale|3. Guidance|4. Small Projects|5. Resources|6. Lessons Learned}


h1. 1. The Requirement

The project shall perform requirements validation to ensure that the software will perform as intended in the customer environment.

h2. 1.1 Notes

Requirements validation includes confirmation that the requirements meet the needs and expectations of the customer. Requirement validation is confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled.

h2. 1.2 Implementation Notes from Appendix D

NPR 7150.2 does not include any notes for this requirement.

h2. 1.3 Applicability Across Classes


h1. 2. Rationale

Requirements are the basis for a project. They identify the need to be addressed, the behavior of the system, and the constraints under which the problem is to be solved. They also specify the performance of the product to be delivered by a contracted provider of software.

{floatbox}_Per the [NASA IV&V Technical Framework|] document, "The objective of Requirements IV&V is to ensure the system's software requirements are high quality (correct, consistent, complete, accurate, readable, and testable), and will adequately meet the needs of the system and expectations of its customers and users, considering its operational environment under nominal and off-nominal conditions, and that no unintended features are introduced..."_ {floatbox}

Requirements that accurately describe the need to be solved by the project team should be defined before the main planning and building activities begin. Validation is one way to ensure the requirements define the need completely, clearly, correctly, and consistently to give the software engineers the best chance to build the correct product. 

Validation is a process of evaluating artifacts to ensure that the right behaviors have been defined in the artifacts.  The right behaviors adequately describe what the system is supposed to do, what the system is not supposed to do, and what the system is supposed to do under adverse conditions.  

Marasco (2007) describes requirements validation as "making sure everyone understands and agrees on the requirements put forth, and that they are realistic and precise." ^11^

Other reasons for validating requirements:
* To ensure customer satisfaction with the end product
* To reduce costs (i.e., get it right the first time)
* To gain confidence that the requirements can be fulfilled for the intended use
* To clarify meaning and expectations

h1. 3. Guidance

The basic validation process is shown below with the steps addressed by this requirement highlighted:




{panel} Validation activities should not be confused with verification activities as each has a specific goal.  Validation is designed to confirm the right system is being produced while verification is designed to confirm the product is being produced correctly. {panel}

Requirements validation, as used in this requirement, should address all of the following:
* Confirmation of the correctness, completeness, clarity, and consistency of the requirements with stakeholders
* Confirmation that the requirements will be fulfilled by the resulting product
* Confirmation that implied or inherent requirements (e.g., system should do X before Y) are correctly implemented

Validation activities should not be performed in an ad hoc manner, but should be planned and captured in a validation plan document.  The validation plan is typically part of a validation and verification (V&V) plan, a software V&V plan (SVVP), or is included in the Software Management / Development Plan (SMP/SDP).

All levels, or decomposition, of requirements should be validated, including, but not limited to:
* System requirements (note that systems level validation procedures are described in [NPR 7123.1A|], with guidelines in the [NASA Systems Engineering Handbook|])
* Subsystem requirements
* Safety requirements
* Component requirements
* Integration requirements
* Dependability requirements

To perform complete requirements validation, multiple techniques may be required based on the nature of the system, the environment in which the system will function, or even the phase of the development life cycle.  Sample validation techniques or methods include, but are not limited to:
* Prototype demonstrations -- 
** Creating incomplete versions of the software being created to allow stakeholders to evaluate the proposed solution(s) by trying them out ([Wikipedia|])
** Use this technique when budget and time allow, when stakeholders are hands-on, when the development model is an iterative process, etc.
* Functional demonstrations -- 
** Demonstrating specific actions or functions of the code ([Wikipedia|])
** Use this technique to validate requirements related to questions such as "can the user do this" or "does this particular feature work"
* Formal reviews -- 
** Structured reviews in which specified steps are taken and roles are assigned to individual reviewers ([NASA Software Safety Guidebook|])
** Formal reviews are useful for validating documents such as software requirements specifications (SRS) and allow for discussion and eventual agreement on the requirements among persons with varied viewpoints
** Formal reviews typically only address portions (sections or specified number of pages) of documents in a single review rather than an entire document
** Formal reviews allow for identification of defects as well as suggested corrections
* Software peer reviews / inspections of product components -- 
** Visual examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications
** Peer reviews and inspections are useful for validating documents such as software requirements specifications (SRS) and allow for peer discussion of the technical aspects of the document content
** Inspections can be informal or formal with formal inspections being more structured with specific activities, assigned roles, and defined results (metrics, defects, etc.)
** Inspections typically cover larger volumes of information than formal reviews and only identify the issues; solutions are typically not part of the peer review / inspection process
* Analysis -- 
** Analysis can be considered a "lightweight" version of running software against a simulation and involves going through the calculations without actually running the simulation in real time
** Analysis removes the time-related aspects of the validation, which may need to be validated using a different technique
** Use this technique as part of an overall validation strategy or as a precursor step to a full simulation
* Beta testing of new software applications -- 
** Use this technique when budget and time allow, when stakeholders are hands-on, when stakeholders (primarily user groups) and the project schedule are amenable to this type of testing, etc.
* Paper simulations / prototyping: ^10^
** Drawing prototypes on paper
** This is a low-cost prototyping technique that might be useful in the early stages of high-level requirements development to capture and display visually the ideas and concepts or for projects that don't have the budget for prototyping in software
** Issues with prototyping on paper include storing for future reference and difficulty transforming into executable prototypes
** Typically these prototypes simply end up in requirements documents
* Use-case based modeling: ^10^
** Modeling system behavior using use cases to identify actors and their interaction with the system
** Use this technique when it is easy to identify users (both human and other systems) and services or functionality provided by the system
** This technique is helpful when the focus of the software defined by the requirements is on user interaction with the system because use case models depict a problem and solution from the user's point of view, "who" does "what" with the system
* Viewpoint-oriented requirements validation: ^10^
** Identify conflicting requirements based on viewpoints of various stakeholders
** Use this technique when there is a need to identify conflicting requirements based on viewpoints or when it is important to consider requirements from multiple perspectives
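
Parts of these techniques can be supported by simple automation. As a purely illustrative sketch (the keyword lists, function name, and requirement IDs below are assumptions for demonstration, not part of NPR 7150.2 or any Center procedure), a small script can flag ambiguous or weak wording in requirement statements before a peer review or formal review:

```python
# Illustrative sketch only: flag ambiguous wording in requirement
# statements prior to review.  Keyword lists are example assumptions.
AMBIGUOUS_TERMS = ["as appropriate", "user-friendly", "fast", "adequate",
                   "etc.", "and/or", "minimize", "maximize"]
WEAK_VERBS = ["should", "may", "might", "could"]

def flag_requirement(req_id, text):
    """Return a list of potential clarity issues for one requirement."""
    issues = []
    lowered = text.lower()
    for term in AMBIGUOUS_TERMS:
        if term in lowered:
            issues.append(f"{req_id}: ambiguous term '{term}'")
    for verb in WEAK_VERBS:
        # Pad with spaces so whole words are matched, not substrings
        if f" {verb} " in f" {lowered} ":
            issues.append(f"{req_id}: weak verb '{verb}' (prefer 'shall')")
    return issues

# Hypothetical requirement statements for demonstration
reqs = {
    "REQ-001": "The system shall log all commands within 50 ms.",
    "REQ-002": "The interface should be user-friendly and fast.",
}
for req_id, text in reqs.items():
    for issue in flag_requirement(req_id, text):
        print(issue)
```

A check like this only screens wording; it cannot confirm that a requirement reflects stakeholder intent, so it supplements rather than replaces the review techniques above.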

When validating requirements, either new or modified, the following roles should be considered as participants because each role reviews the requirements from a different perspective:
| *Sample Roles for Validation Activities* |
| Customer |
| Developer |
| Interface Representative |
| Requirement/Software Requirement Specification (SRS) Author |
| Reviewer (topic expert, safety, software assurance, etc.); as appropriate, multiple Reviewers may be included, each providing a specialized perspective |

When available and appropriate, checklists and documented procedures should be used for the various techniques selected for requirements validation to ensure consistency of application of the technique. 
| *Sample Checklists and Procedures* |
| Peer review / inspection checklists |
| Formal review checklists |
| Analysis procedures |
| Acceptance test procedures |

Samples are included in the Resources and Tools section of this guidance, but Center procedures should take precedence when conducting requirements validation activities at a particular Center.

A requirements traceability matrix may also be useful to ensure that all requirements are validated.  The matrix could include:
* Links to higher-level requirements which identify/define user needs
* A place to record validation methods
* A place to record or reference the validation results
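
A minimal sketch of such a matrix follows; the field names, requirement IDs, and result references are hypothetical illustrations, not a mandated format:

```python
# Illustrative sketch of a requirements traceability matrix that records
# validation methods and results.  IDs and field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class TraceEntry:
    req_id: str                  # this requirement
    parent_ids: list             # higher-level requirements / user needs
    validation_method: str = ""  # e.g., "formal review", "prototype demo"
    validation_result: str = ""  # reference to the recorded result

matrix = [
    TraceEntry("SSR-101", ["SYS-001"], "formal review", "RVR-2011-04"),
    TraceEntry("SSR-102", ["SYS-001", "SYS-007"]),  # not yet validated
]

# Report requirements that still lack a recorded validation method
unvalidated = [e.req_id for e in matrix if not e.validation_method]
print(unvalidated)
```

Tracking validation status alongside the trace links makes gaps visible: any entry without a method and result has not yet been validated.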

Some common issues related to requirements validation include: ^11^
* Confusing management of requirements with validation of requirements
** Managing requirements will not ensure they are correct
* When using prototyping to validate requirements, 
** Failing to keep the focus on _what_ the software is supposed to do
** Allowing the focus to shift to the _how_ the system will look when it is done
* Failing to re-validate requirements as they change during the project life cycle
* Difficulty getting stakeholders with different views to agree on a single version of a requirement; interpretation can be troublesome
* When using visual models to bridge the communication gaps among stakeholders, only translating a limited number of requirements into visual models (often due to time or budgetary constraints)
* Failing to link the text to visual models; both are needed for understanding
* Failing to use a formal process to track all versions of the requirements as they change during the project

Additionally, it is important to confirm with stakeholders that their needs and expectations remain adequately and correctly captured by the requirements following resolution of conflicting, impractical, and/or unrealizable stakeholder requirements.

While the Software Requirements Review (SRR) addresses more than just "getting the requirements right", the SRR can include that action as part of the review.

See also related requirements in this handbook:
| *SWE-029* | Validation planning |
| *SWE-031* | Validation results |
| *SWE-073* | Platform or high-fidelity simulations |
| *SWE-102* | Software development/management plan |

h1. 4. Small Projects

Small projects need to balance the effectiveness of the available methods against available resources to validate requirements associated with software. Safety-critical requirements, human-rated requirements, and other critical requirements should be validated with appropriately rigorous methods, which are documented in the project's software development/management plan.

h1. 5. Resources

# Software Engineering Division, Goddard Space Flight Center, "[Requirements Management|]", 580-PC-024-02, 2010.
# IEEE Computer Society, "[IEEE Standard for Software Verification and Validation|]", Chapter 7, IEEE STD 1012-2004, 2004.  This link requires an account on the NASA START (AGCY NTSS) system ([|]).  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
# "[Systems and software engineering -- Software life cycle processes|]", ISO/IEC 12207, IEEE Std 12207-2008, 2008 (Key section: Stakeholder Requirements Definition Process).  This link requires an account on the NASA START (AGCY NTSS) system ([|] ).  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
# Kandt, Ronald Kirk, Jet Propulsion Lab, "Software Quality Improvement, Software Requirements Engineering: Practices and Techniques", JPL Document D-24994, 2003 (Validation Practices section).
#  "Product Requirements Development and Management Procedure", 5526_7-21-06_Req_RevA_generic-R1V0, 2006 (Section 4: Validate Requirements).
# NASA Technical Standard, "[Software Formal Inspections Standard|]", NASA-STD-2202-93, 1993.
# Information Systems Division, Goddard Space Flight Center, "[ISD Inspections, Peer Reviews, and Walkthroughs|]", 580-SP-055-01, 2006.
# Jet Propulsion Laboratory (JPL), "Software Review Handbook", JPL D-25798, Rev. 0, [|].
# IEEE Computer Society, "[IEEE Standard for Software Reviews|]", IEEE STD 1028-1997, 1997. This link requires an account on the NASA START (AGCY NTSS) system ([|]).  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
# Raja, "[Empirical Studies of Requirements Validation Techniques|]".
# Marasco, Dr. Joe, "[The importance of testing software requirements|]", 2007. Accessed June 2011.
# Software Engineering Division, Goddard Space Flight Center, "[Checklist for the Contents of Software Requirements Review (SRR)|]", 580-CK-005-02, 2009.
# Information Systems Division (ISD), Goddard Space Flight Center, "[Requirements Peer Review Checklist|]", 580-CK-057-01, 2006.
# "Peer Review Inspection Checklists (R1 -- Software Requirements Checklist)", 1990, [|] .
# NASA Procedural Requirement, "[NASA Systems Engineering Processes and Requirements w/ Change 1|]", NPR 7123.1A, 2009.
# NASA Scientific and Technical Information (STI), NASA Center for AeroSpace Information, "[NASA Systems Engineering Handbook|]", NASA/SP-2007-6105, Rev1, 2007.

h2. 5.1 Tools

{panel} Tools relative to this SWE may be found in the table above. If no tools are listed, none have been currently identified for this SWE. You may wish to reference table XYZ in this handbook for an evolving list of these and other tools in use at NASA.  Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool.  *Check with your Center to see what tools are available to facilitate compliance with this requirement.* {panel}

h1. 6. Lessons Learned

A documented lesson from the NASA Lessons Learned database describes a situation in which a mishap could have been prevented if requirements validation had caught a mismatch between interface documentation and the requirements.  Because the mismatch was not caught, the Mars Climate Orbiter (MCO) spacecraft was lost due to "the failure to use metric units in the coding of a ground software file...used in trajectory models...The data in the...file was required to be in metric units per existing software interface documentation."  Instead, the data was provided in English units. ([|])