The basic validation process is shown below with the steps addressed by this requirement highlighted:
Validation activities are not to be confused with verification activities, as each has a specific goal: validation confirms that the right system is being produced, while verification confirms that the product is being produced correctly.
Requirements validation, as used in this requirement, addresses all of the following:
- Confirmation of the correctness, completeness, clarity, and consistency of the requirements with stakeholders.
- Confirmation that the requirements will be fulfilled by the resulting product.
- Confirmation that implied or inherent requirements (e.g., system should do X before Y) are correctly implemented.
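An implied ordering requirement of this kind can often be checked mechanically against a recorded event trace. The sketch below (a Python illustration; the event names and log format are hypothetical) validates that one event never occurs before another:

```python
def occurs_before(events, first, second):
    """Return True if no occurrence of `second` appears before
    the first occurrence of `first` in the event sequence."""
    seen_first = False
    for event in events:
        if event == first:
            seen_first = True
        elif event == second and not seen_first:
            return False
    return True

# Hypothetical event trace captured from a test run.
log = ["power_on", "self_test", "arm", "deploy"]
assert occurs_before(log, "self_test", "deploy")  # "do X before Y" holds
```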
Validation activities are not performed in an ad hoc manner, but are planned and captured in a validation plan document. The validation plan is typically part of a verification and validation (V&V) plan, a software V&V plan (SVVP), or is included in the Software Management/Development Plan (SMP/SDP).
All requirements need to be validated. Categories include, but are not limited to:
- System requirements (note that systems-level validation procedures are described in NPR 7123.1, NASA Systems Engineering Processes and Requirements, with guidelines in NASA/SP-2007-6105, NASA Systems Engineering Handbook).
- Subsystem requirements.
- Safety requirements.
- Component requirements.
- Integration requirements.
To perform complete requirements validation, multiple techniques may be required based on the nature of the system, the environment in which the system will function, or even the phase of the development life cycle. Sample validation techniques or methods include, but are not limited to:
- Develop operational concepts –
- Document descriptions of how the software "will be operated during the life-cycle phases ... describes the system characteristics from an operational perspective."
- Use this technique to improve the quality of customer requirements.
- Use this technique to ensure customer requirements and expectations are correctly captured.
- Prototype demonstrations –
- Creating incomplete versions of the software under development so that stakeholders can evaluate the proposed solution(s) by trying them out (Frederick P. Brooks, No Silver Bullet: Essence and Accidents of Software Engineering).
- Use this technique when budget and time allow, when stakeholders are hands-on, when the development model is an iterative process, etc.
- Functional demonstrations –
- Demonstrating specific actions or functions of the code.
- Use this technique to validate requirements related to questions such as "can the user do this" or "does this particular feature work."
- Formal reviews –
- Structured reviews in which specified steps are taken and roles are assigned to individual reviewers (NASA-GB-8719.13, NASA Software Safety Guidebook).
- Formal reviews are useful for validating documents, such as software requirements specifications (SRS), and allow for discussion and eventual agreement on the requirements among persons with varied viewpoints.
- Formal reviews typically only address portions (sections or specified number of pages) of documents in a single review rather than an entire document.
- Formal reviews allow for identification of defects as well as suggested corrections.
- Software peer reviews/inspections of product components –
- Relevant stakeholders investigate and review a product, such as requirements, to determine whether it meets preset criteria and to identify product defects.
- Peer reviews and inspections are useful for validating documents, such as SRSs, and allow for peer discussion of the technical aspects of the document content.
- Inspections can be informal or formal with formal inspections being more structured with specific activities, assigned roles, and defined results (metrics, defects, etc.).
- Inspections typically cover larger volumes of information than formal reviews and only identify the issues; solutions are typically not part of the peer review/inspection process.
- Analysis –
- Analysis can be considered a "lightweight" version of running software against a simulation; it involves going through the calculations without actually running the simulation in real time.
- Analysis removes the time-related aspects of the validation, which may need to be validated using a different technique.
- Use this technique as part of an overall validation strategy or as a precursor step to a full simulation.
- Beta testing of new software applications.
- Use this technique when budget and time allow, when stakeholders are hands-on, when stakeholders (primarily user groups) and project schedule are amenable to this type of testing, etc.
- Paper simulations/prototyping/storyboarding –
- Drawing prototypes on paper.
- This low-cost prototyping technique can be useful in the early stages of high-level requirements development to capture and visually display ideas and concepts, or for projects that lack the budget for software prototyping.
- Issues with paper prototypes include storing them for future reference and the difficulty of transforming them into executable prototypes.
- Typically these prototypes simply end up in requirements documents.
- Use-case based modeling –
- Modeling system behavior using use-cases to identify actors and their interaction with the system.
- Use this technique when it is easy to identify users (both human and other systems) and services or functionality provided by the system.
- This technique is helpful when the focus of the software defined by the requirements is on user interaction with the system, because use-case models depict a problem and solution from the user's point of view: "who" does "what" with the system.
- Viewpoint-oriented requirements validation –
- Identify conflicting requirements based on viewpoints of various stakeholders.
- Use this technique when there is a need to identify conflicting requirements based on viewpoints or when it is important to consider requirements from multiple perspectives.
- Formal methods –
- Mathematically rigorous techniques.
- Use this technique to validate formal requirements specifications or to validate key properties of requirements.
- Review of test cases –
- Reviewing test cases individually and as a set to confirm coverage of system scenarios.
- Reviewing test cases with stakeholders to confirm functional and operational scenarios (as defined by the requirements).
- Development and review of test cases can help find problems in the requirements "since it requires completely thinking through the operation of the application."
- This technique is particularly useful for test-driven software development.
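The formal methods technique above can be illustrated with a toy model check: a safety property is validated by exhaustively exploring every reachable state of a small model. The two-component model and its transition rules below are invented purely for this sketch and do not represent any particular system or tool:

```python
def reachable_states(initial, transitions):
    """Exhaustively explore all states reachable from `initial`."""
    frontier, seen = [initial], {initial}
    while frontier:
        state = frontier.pop()
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Hypothetical two-component model: (valve, heater), each "on"/"off".
def transitions(state):
    valve, heater = state
    nxt = [("off", heater), (valve, "off")]  # either component may shut off
    if heater == "off":
        nxt.append(("on", heater))   # valve may open only when heater is off
    if valve == "off":
        nxt.append((valve, "on"))    # heater may start only when valve is closed
    return nxt

# Safety property: valve and heater are never both "on".
for s in reachable_states(("off", "off"), transitions):
    assert s != ("on", "on")
```

Real projects would use a dedicated model checker or theorem prover rather than hand-rolled exploration, but the validation idea is the same: every key property is checked against every behavior the specification allows.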
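Reviewing test cases as a set can be partially automated: if each test case is tagged with the requirement IDs it exercises, requirements with no covering test fall out directly. The requirement and test-case IDs below are hypothetical:

```python
# Hypothetical requirement IDs and test-case tags for the sketch.
requirements = {"SRS-001", "SRS-002", "SRS-003"}
test_cases = {
    "TC-01": {"SRS-001"},
    "TC-02": {"SRS-001", "SRS-002"},
}

covered = set().union(*test_cases.values())
uncovered = requirements - covered
print(sorted(uncovered))  # → ['SRS-003'] has no reviewing test case
```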
When validating requirements, either new or modified, consider including the following roles, because each role reviews the requirements from a different perspective:
Sample Roles for Validation Activities
Reviewer (topic expert, safety, software assurance, etc.)
- As appropriate, multiple Reviewers could be included, each providing a specialized perspective.
When available and appropriate, checklists and documented procedures are used for the various techniques selected for requirements validation to ensure consistency of application of the technique.
Sample Checklists and Procedures
Peer review/inspection checklists
Formal review checklists
Acceptance test procedures
Samples are included in the Resources section of this guidance, but Center procedures take precedence when conducting requirements validation activities at a particular Center.
A requirements traceability matrix may also be useful to ensure that all requirements are validated. The matrix could include:
- Links to higher-level requirements which identify/define user needs.
- A place to record validation methods.
- A place to record or reference the validation results.
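A minimal programmatic sketch of such a matrix follows; the requirement IDs, parent links, and record names are all invented for illustration:

```python
# Traceability-matrix sketch: each row links a requirement to its
# higher-level parent and records the validation method and result.
matrix = [
    {"req": "SRS-001", "parent": "SYS-010",
     "method": "peer review", "result": "passed"},
    {"req": "SRS-002", "parent": "SYS-011",
     "method": None, "result": None},  # not yet validated
]

not_validated = [row["req"] for row in matrix if row["result"] is None]
print(not_validated)  # → ['SRS-002'] still needs validation
```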
Some common issues related to requirements validation include:
- Confusing management of requirements with validation of requirements.
- Managing requirements will not ensure they are correct.
- When using prototyping to validate requirements:
- Failing to keep the focus on what the software is supposed to do.
- Allowing the focus to shift to how the system will look when it is done.
- Failing to re-validate requirements as they change during the project life cycle.
- Difficulty getting stakeholders with different views to agree on a single version of a requirement; interpretation can be troublesome.
- When using visual models to bridge the communication gaps among stakeholders, only translating a limited number of requirements into visual models (often due to time or budgetary constraints).
- Failing to link the text to visual models; both are needed for understanding.
- Failing to use a formal process to track all versions of the requirements as they change during the project.
Additionally, it is important to confirm with stakeholders that their needs and expectations remain adequately and correctly captured by the requirements following resolution of conflicting, impractical, and/or unrealizable stakeholder requirements.
While the Software Requirements Review (SRR) addresses more than just "getting the requirements right", the SRR can include that action as part of the review.
Additional guidance related to requirements validation may be found in the following related requirements in this Handbook: