SWE-055 - Requirements Validation

1. Requirements

4.1.7 The project manager shall perform requirements validation to ensure that the software will perform as intended in the customer environment. 

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-055 - Last used in rev NPR 7150.2D

Rev A: 3.1.2.3 The project shall perform requirements validation to ensure that the software will perform as intended in the customer environment.

Difference between A and B: No change

Rev B: 4.1.3.3 The project manager shall perform requirements validation to ensure that the software will perform as intended in the customer environment.

Difference between B and C: No change

Rev C: 4.1.7 The project manager shall perform requirements validation to ensure that the software will perform as intended in the customer environment.

Difference between C and D: No change

Rev D: 4.1.7 The project manager shall perform requirements validation to ensure that the software will perform as intended in the customer environment.



1.3 Applicability Across Classes

Class        |  A  |  B  |  C  |  D  |  E  |  F
Applicable?  |     |     |     |     |     |

Key: ✓ - Applicable | ✗ - Not Applicable

2. Rationale

Requirements are the basis for a project. They identify the need to be addressed, the behavior of the system, and the constraints under which the problem is to be solved. They also specify the performance of the product to be delivered by a contracted provider of software.

Per the NASA IV&V Technical Framework document 003, "The objective of Requirements IV&V is to ensure the system's software requirements are high quality (correct, consistent, complete, accurate, readable, and testable), and will adequately meet the needs of the system and expectations of its customers and users, considering its operational environment under nominal and off-nominal conditions, and that no unintended features are introduced..."

Requirements that accurately describe the need to be solved by the project team need to be defined before the main planning and building activities begin. Validation is one way to ensure the requirements define the need completely, clearly, correctly, and consistently to give the software engineers the best chance to build the correct product. 

Validation is a process of evaluating artifacts to ensure that the right behaviors have been defined in the artifacts. The right behaviors adequately describe what the system is supposed to do, what the system is not supposed to do, and what the system is supposed to do under adverse conditions.  

Requirements validation includes confirmation that the requirements meet the needs and expectations of the customer. Requirements validation is confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled. Marasco (2007) describes requirements validation as "making sure everyone understands and agrees on the requirements put forth, and that they are realistic and precise." 247

Other reasons for validating requirements:

  • To ensure customer satisfaction with the end product.
  • To reduce costs (i.e., get it right the first time).
  • To gain confidence that the requirements can be fulfilled for the intended use.
  • To clarify meaning and expectations.

3. Guidance

The basic validation process is shown below with the steps addressed by this requirement highlighted:


Validation activities are not to be confused with verification activities, as each has a distinct goal: validation confirms that the right system is being produced, while verification confirms that the product is being produced correctly.

Requirements validation, as used in this requirement, addresses all of the following:

  • Confirmation of the correctness, completeness, clarity, and consistency of the requirements with stakeholders.
  • Confirmation that the requirements will be fulfilled by the resulting product.
  • Confirmation that implied or inherent requirements (e.g., the system should do X before Y) are correctly implemented.

Validation activities are not performed in an ad hoc manner but are planned and captured in a validation plan document.  The validation plan is typically part of a verification and validation (V&V) plan, a software V&V plan (SVVP), or is included in the Software Management/Development Plan (5.08 - SDP-SMP - Software Development - Management Plan).

All requirements need to be validated. Categories include, but are not limited to:

  • System requirements (note that systems-level validation procedures are described in NPR 7123.1, NASA Systems Engineering Processes and Requirements 041, with guidelines in NASA/SP-2007-6105, NASA Systems Engineering Handbook 273).
  • Subsystem requirements.
  • Safety requirements.
  • Component requirements.
  • Integration requirements.
  • Interface requirements.
  • COTS, MOTS, GOTS, and reused software requirements.

To perform complete requirements validation, multiple techniques may be required based on the nature of the system, the environment in which the system will function, or even the phase of the development life cycle.  Sample validation techniques or methods include, but are not limited to:

  • Develop operational concepts –
    • Document descriptions of how the software "will be operated during the life-cycle phases ... describes the system characteristics from an operational perspective." 206
    • Use this technique to improve the quality of customer requirements.
    • Use this technique to ensure customer requirements and expectations are correctly captured.
  • Prototype demonstrations –
    • Creating incomplete versions of the software allows stakeholders to evaluate the proposed solution(s) by trying them out (No Silver Bullet: Essence and Accidents of Software Engineering by Frederick P. Brooks 322).
    • Use this technique when budget and time allow, when stakeholders are hands-on, when the development model is an iterative process, etc.
  • Functional demonstrations –
    • Demonstrating specific actions or functions of the code.
    • Use this technique to validate requirements related to questions such as "can the user do this" or "does this particular feature work."
  • Formal reviews –
    • Structured reviews in which specified steps are taken and roles are assigned to individual reviewers (NASA-GB-8719.13, NASA Software Safety Guidebook 276)
    • Formal reviews are useful for validating documents, such as software requirements specifications (5.09 - SRS - Software Requirements Specification), and allow for discussion and eventual agreement on the requirements among persons with varied viewpoints.
    • Formal reviews typically only address portions (sections or a specified number of pages) of documents in a single review rather than an entire document.
    • Formal reviews allow for the identification of defects as well as suggested corrections.
  • Software peer reviews/inspections of product components –
    • Relevant stakeholders investigate and review a product, such as requirements, to determine if it meets preset criteria and to identify product defects.
    • Peer reviews and inspections are useful for validating documents, such as SRSs, and allow for peer discussion of the technical aspects of the document content.
    • Inspections can be informal or formal with formal inspections being more structured with specific activities, assigned roles, and defined results (metrics, defects, etc.).
    • Inspections typically cover larger volumes of information than formal reviews and only identify the issues; solutions are typically not part of the peer review/inspection process.
  • Analysis –
    • Analysis can be considered a "lightweight" version of running software against simulation and involves going through the calculations without actually running the simulation in real-time.
    • Analysis removes the time-related aspects of the validation, which may need to be validated using a different technique.
    • Use this technique as part of an overall validation strategy or as a precursor step to full simulation.
  • Beta testing –
    • Beta testing of new software applications.
    • Use this technique when budget and time allow, and when stakeholders (primarily user groups) and project schedules are amenable to this type of testing.
  • Paper simulations/prototyping/storyboarding: 304
    • Drawing prototypes on paper.
    • This is a low-cost prototyping technique that might be useful in the early stages of high-level requirements development to capture and visually display ideas and concepts, or for projects that don't have the budget for prototyping in software.
    • Issues with prototyping on paper include difficulty storing the prototypes for future reference and difficulty transforming them into executable prototypes.
    • Typically these prototypes simply end up in requirements documents.
  • Use-case based modeling: 304
    • Modeling system behavior using use-cases to identify actors and their interaction with the system.
    • Use this technique when it is easy to identify users (both human and other systems) and services or functionality provided by the system.
    • This technique is helpful when the focus of the software defined by the requirements is on user interaction with the system because use-case models depict a problem and solution from the user's point of view, "who" does "what" with the system.
  • Viewpoint-oriented requirements validation: 304
    • Identify conflicting requirements based on the viewpoints of various stakeholders.
    • Use this technique when there is a need to identify conflicting requirements based on viewpoints or when it is important to consider requirements from multiple perspectives.
  • Formal methods
    • Mathematically rigorous techniques.
    • Use this technique to validate formal requirements specifications or to validate key properties of requirements. 181
  • Review of test cases
    • Review test cases individually and as a set to confirm coverage of system scenarios.
    • Review test cases with stakeholders to confirm functional and operational scenarios (as defined by the requirements).
    • Development and review of test cases can help find problems in the requirements "since it requires completely thinking through the operation of the application." 276
    • This technique is particularly useful for test-driven software development.
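As a small illustration of the analysis technique applied to the requirement text itself, a script can flag ambiguous terms that commonly fail clarity and verifiability checks during validation. The keyword list and sample requirements below are hypothetical, not from any NASA checklist:

```python
# Hypothetical sketch: flag requirement statements containing ambiguous
# terms that often signal validation problems (vague, untestable wording).
AMBIGUOUS_TERMS = ["as appropriate", "adequate", "user-friendly",
                   "fast", "etc.", "and/or", "support", "maximize"]

def flag_ambiguities(requirements):
    """Return {req_id: [ambiguous terms found]} for requirements needing review."""
    findings = {}
    for req_id, text in requirements.items():
        hits = [term for term in AMBIGUOUS_TERMS if term in text.lower()]
        if hits:
            findings[req_id] = hits
    return findings

reqs = {
    "SRS-001": "The software shall respond to operator commands within 2 seconds.",
    "SRS-002": "The software shall provide adequate telemetry support.",
}
print(flag_ambiguities(reqs))  # only SRS-002 is flagged ("adequate", "support")
```

A scan like this is no substitute for stakeholder review; it only helps prioritize which requirements merit a closer look.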

When validating requirements, either new or modified, consider including the following roles because each role reviews the requirements from a different perspective:

Sample Roles for Validation Activities

  • Customer
  • Developer
  • Interface Representative
  • Requirement/SRS Author
  • Reviewer (topic expert, safety, software assurance, etc.)
    • As appropriate, multiple Reviewers could be included, each providing a specialized perspective.

When available and appropriate, checklists and documented procedures are used for the various techniques selected for requirements validation to ensure consistency of application of the technique. 

Sample Checklists and Procedures

  • Peer review/inspection checklists
  • Formal review checklists
  • Analysis procedures
  • Acceptance test procedures

Samples are included in the Resources section of this guide, but Center procedures take precedence when conducting requirements validation activities at a particular Center.

A requirements traceability matrix may also be useful to ensure that all requirements are validated.  The matrix could include:

  • Links to higher-level requirements that identify/define user needs.
  • A place to record validation methods.
  • A place to record or reference the validation results.
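As an illustration, such a matrix can be represented as one record per requirement. This is a minimal sketch with hypothetical field and requirement names, not the schema of any NASA tool:

```python
from dataclasses import dataclass

@dataclass
class TraceEntry:
    """One row of a requirements traceability matrix used for validation tracking."""
    req_id: str                   # the software requirement being validated
    parent_req: str               # link to the higher-level requirement / user need
    validation_method: str = ""   # e.g., "peer review", "prototype demonstration"
    validation_result: str = ""   # e.g., "passed", or a reference to the review record

matrix = [
    TraceEntry("SRS-101", "SYS-12", "formal review", "approved at SRR"),
    TraceEntry("SRS-102", "SYS-12"),  # validation not yet performed
]

# Flag requirements that still lack a recorded validation result.
unvalidated = [entry.req_id for entry in matrix if not entry.validation_result]
print(unvalidated)  # ['SRS-102']
```

Keeping method and result fields per requirement makes "all requirements validated" an automatically checkable property rather than a manual audit.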

Some common issues related to requirements validation include: 012

  • Confusing management of requirements with validation of requirements.
    • Managing requirements will not ensure they are correct.
  • When using prototyping to validate requirements:
    • Failing to keep the focus on what the software is supposed to do.
    • Allowing the focus to shift to how the system will look when it is done.
  • Failing to re-validate requirements as they change during the project life cycle.
  • Difficulty getting stakeholders with different views to agree on a single version of a requirement; interpretation can be troublesome.
  • When using visual models to bridge the communication gaps among stakeholders, only translating a limited number of requirements into visual models (often due to time or budgetary constraints).
  • Failing to link the text to visual models; both are needed for understanding.
  • Failing to use a formal process to track all versions of the requirements as they change during the project.

Additionally, it is important to confirm with stakeholders that their needs and expectations remain adequately and correctly captured by the requirements following the resolution of conflicting, impractical, and/or unrealizable stakeholder requirements.

While the Software Requirements Review (SRR) addresses more than just "getting the requirements right", the SRR can include that action as part of the review.

Additional guidance related to requirements validation may be found in related requirements in this Handbook.

4. Small Projects

Small projects need to balance the effectiveness of the available methods against available resources to validate requirements associated with the software. Safety-critical requirements, human-rated requirements, and other critical requirements need to be validated with appropriately rigorous methods that are documented in the project's software development/management plan.

5. Resources

5.1 References

  • (SWEREF-003) IVV 09-1, Revision P, NASA Independent Verification and Validation Program, Effective Date: February 26, 2016
  • (SWEREF-012) Checklist for the Contents of Software Requirements Review (SRR), 580-CK-005-02, Software Engineering Division, NASA Goddard Space Flight Center, 2009. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-041) NPR 7123.1D, Office of the Chief Engineer, Effective Date: July 05, 2023, Expiration Date: July 05, 2028
  • (SWEREF-061) JPL Document D-24994, NASA Jet Propulsion Laboratory, 2003. See Page 20. Approved for U.S. and foreign release.
  • (SWEREF-079) SED Inspections, Peer Reviews, and Walkthroughs, 580-SP-055-02, Software Engineering Division, NASA Goddard Space Flight Center (GSFC), 2006. Updated title from "ISD Inspections, Peer Reviews, and Walkthroughs, 580-SP-055-01" to "SED Inspections, Peer Reviews, and Walkthroughs, 580-SP-055-02" to reflect updated version of document, and to reflect the new name (Software Engineering Division) of the GSFC organization that produced the document.) This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-086) 5526_7-21-06_Req_RevA_generic-R1V0, 2006. See Section 4: Validate Requirements. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
  • (SWEREF-091) Requirements Management, 580-PC-024-04, Software Engineering Division, NASA Goddard Space Flight Center (GSFC), 2016. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-181) Easterbrook, Steve, 1998. NASA-IVV-97-015, October 16, 1997. Accessed November 2011 from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19980016986_1998065191.pdf.
  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
  • (SWEREF-206) Hooks, Ivy F., Farry, Kirstin A., American Management Association, New York, 2001.
  • (SWEREF-209) IEEE Computer Society, IEEE Std 1012-2012 (Revision of IEEE Std 1012-2004), See Chapter 7. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
  • (SWEREF-219) IEEE Std 1028, 2008. IEEE Computer Society, NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
  • (SWEREF-224) ISO/IEC 12207, IEEE Std 12207-2008, 2008. See Key section: Stakeholder Requirements Definition Process. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
  • (SWEREF-247) Marasco, Dr. Joe, 2007. In http://www.techtarget.com. Requires Free Membership to view content.
  • (SWEREF-273) NASA SP-2016-6105 Rev2, NASA Systems Engineering Handbook.
  • (SWEREF-276) NASA-GB-8719.13, NASA, 2004. Access NASA-GB-8719.13 directly: https://swehb.nasa.gov/download/attachments/16450020/nasa-gb-871913.pdf?api=v2
  • (SWEREF-277) NASA-STD-8739.9, NASA Office of Safety and Mission Assurance, 2013. Change Date: 2016-10-07, Change Number: 1
  • (SWEREF-304) Raja, U.A. (February, 2009). IEEE Computer, control and Communication, 2009. 2nd International Conference. This link may require you to be logged in on your NASA network or to have an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov ). Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards. Retrieved on December 12, 2017.
  • (SWEREF-322) Brooks, Frederick P., Computer, Vol. 20, No. 4 (April 1987) pp. 10-19.
  • (SWEREF-449) Requirements Peer Review Checklist, 580-CK-057-02, Software Engineering Division (SED), NASA Goddard Space Flight Center (GSFC), 2012. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-513) Public Lessons Learned Entry: 641.


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Mars Climate Orbiter Mishap Investigation Board - Phase I Report. Lesson Number 0641 513: A mishap could have been prevented if requirements validation had caught a mismatch between interface documentation and the requirements. Because the mismatch was not caught, the Mars Climate Orbiter (MCO) spacecraft was lost due to "the failure to use metric units in the coding of a ground software file...used in trajectory models...The data in the ...file was required to be in metric units per existing software interface documentation." The data was provided in English units per the requirements.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-055 - Requirements Validation
4.1.7 The project manager shall perform requirements validation to ensure that the software will perform as intended in the customer environment. 

7.1 Tasking for Software Assurance

  1. Confirm that the project software testing has shown that software will function as expected in the customer environment.

7.2 Software Assurance Products

  • None


    Objective Evidence

    • Software Test Procedures
    • Software Test Reports

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence as well as the examples listed above.

7.3 Metrics

  • # of tests executed vs. # of tests completed
  • # of tests completed vs. total # of tests
  • The number of requirements successfully tested in the customer environment versus the total number of requirements.
  • # of detailed software requirements tested to date vs. total # of detailed software requirements
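Each of these metrics is a simple ratio of counts. A minimal sketch (the counts below are illustrative, not from any project):

```python
def metric_ratio(numerator, denominator):
    """Return a percentage for a progress metric, guarding against division by zero."""
    return round(100.0 * numerator / denominator, 1) if denominator else 0.0

# Illustrative counts for a hypothetical project status report.
tests_executed, tests_completed, tests_total = 42, 40, 50
reqs_tested_in_customer_env, reqs_total = 30, 48

print(metric_ratio(tests_completed, tests_total))           # 80.0 (% of tests completed)
print(metric_ratio(reqs_tested_in_customer_env, reqs_total))  # 62.5 (% of requirements tested in the customer environment)
```

Trending these ratios over successive reporting periods, rather than reading a single snapshot, is what makes them useful for spotting stalled validation work.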

7.4 Guidance

Software validation is a software engineering activity that confirms that the software product, as provided (or as it will be provided), fulfills its intended use in its intended environment. In other words, validation ensures that "you built the right thing." Examples of validation methods include, but are not limited to: formal reviews, prototype demonstrations, functional demonstrations, software testing, software peer reviews/inspections of software product components, behavior in a simulated environment, acceptance testing against mathematical models, analyses, and operational environment demonstrations.

Requirements validation is the process of checking that the requirements, as defined for development, describe the system that the customer wants. Requirements validation is performed to catch errors in the requirements at the initial phase of development, because an error detected later in the development process can cause excessive rework.

In the requirements validation process, several types of checks are performed against the requirements captured in the Software Requirements Specification (SRS). These checks include:

  • Completeness checks
  • Consistency checks
  • Validity checks
  • Realism checks
  • Ambiguity checks
  • Verifiability checks

The output of requirements validation is a list of detected problems and a list of agreed-on actions. The problem list records the issues detected during requirements validation, and the agreed-action list states the corrective actions that should be taken to fix each detected problem.
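A hedged sketch of how such a problem/action list might be recorded; the IDs, check names, and fields are hypothetical, not from any NASA tracking system:

```python
# Hypothetical record of requirements-validation findings: each detected
# problem is paired with the corrective action agreed on with stakeholders.
problems = [
    {"id": "P-1", "req": "SRS-007", "check": "ambiguity",
     "description": "'as required' is undefined",
     "agreed_action": "Replace with an explicit threshold; re-review at next inspection."},
    {"id": "P-2", "req": "SRS-012", "check": "consistency",
     "description": "Conflicts with the SRS-003 timing budget",
     "agreed_action": "Reconcile with systems engineering; update both requirements."},
]

# Every detected problem must carry an agreed corrective action before closure.
open_items = [p["id"] for p in problems if not p["agreed_action"]]
print(f"{len(problems)} problems recorded, {len(open_items)} without agreed actions")
```

Pairing each problem with its agreed action in one record keeps the two outputs of validation from drifting apart as the project progresses.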

All requirements need to be validated. The categories of requirements and the sample validation techniques listed in Section 3 (Guidance) above apply equally to these software assurance confirmation activities. The review of test cases deserves particular attention:

  • Review test cases individually and as a set to confirm coverage of system scenarios. 209
  • Review test cases with stakeholders to confirm functional and operational scenarios (as defined by the requirements).
  • Development and review of test cases can help find problems in the requirements "since it requires completely thinking through the operation of the application." 276 This technique is particularly useful for test-driven software development.
  • During the review of the test cases and preparation for the actual acceptance testing of the software system, it is important to consider the customer's operational environment. These tests should be executed in an environment that is as close to the customer's operational environment as possible. Ideally, the tests should be run under the same conditions and configurations that the customer will be using. When this is not possible, it may be necessary to use one or more simulators/emulators to mirror the customer's operational environment. Software assurance should confirm that test cases specify the environment and configuration to be used for testing, that the specified environment is as close to the customer's operational environment as possible, and that the specified environment and configuration match the actual set-up being used for the testing.

