
SWE-031 - Validation Results

1. Requirements

2.4.4 The project shall record, address, and track to closure the results of software validation activities.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |  G   |  H
Applicable? |      |       |      |       |      |       |      |       |      |       |     | P(C) |

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | ✓ - Applicable | ✗ - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures

2. Rationale

Simply performing validation is not sufficient to ensure the software will perform as intended in the customer environment. The project team must capture the results of validation activities to prove the validation was conducted and to document the findings from those activities. Identified issues need to be analyzed to determine the cause. The team then identifies associated resolutions, and tracks to closure the work to implement those resolutions. This completes the cycle and provides the best assurance that the software will be of high quality and perform as intended.

Where resolution involves revision to the validation process, environment, etc., tracking such issues helps document process improvements and improves process quality, resulting in more accurate validation results in the future.

3. Guidance

The basic validation process consists of performing validation, recording the results, analyzing those results, and tracking identified discrepancies to closure. This requirement addresses the recording, analysis, and tracking steps, each covered in turn below.

Recording validation results and activities

Validation results, along with the validation activities and validated products that generated them, can be recorded in the following ways, as appropriate for the validation method used (a brief sketch of one possible record format follows this list):

  • Validation reports.
  • Review meeting minutes/reports.
  • Review Item Dispositions (RIDs).
  • As-run test logs.
  • Demonstration results.
  • Analysis reports.
  • Beta test reports.
  • User group reports.
  • Issue tracking system.
  • Change requests/change request system.
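As a minimal sketch of what a recorded result might look like in an issue tracking system, the following Python fragment defines a hypothetical record type. The ValidationRecord class and its field names are illustrative assumptions, not a format defined by NPR 7150.2 or any Center procedure.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ValidationRecord:
        """One recorded validation result; fields are illustrative only."""
        item_id: str        # product or requirement that was validated
        activity: str       # e.g., "as-run test", "demonstration", "analysis"
        result: str         # summary of the outcome
        validator: str      # person or team who performed the activity
        performed_on: date  # date of the validation activity
        artifacts: list[str] = field(default_factory=list)  # reports, logs, RIDs

    # A project might append each record to a log or issue tracking system:
    log: list[ValidationRecord] = []
    log.append(ValidationRecord(
        item_id="SRS-042", activity="as-run test", result="pass",
        validator="flight software test team", performed_on=date(2010, 6, 1),
        artifacts=["as-run-log-042.txt"],
    ))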

Analyzing validation results

When analyzing validation results to determine whether they support a conclusion that the software will perform as intended in the operational environment, consider the following steps (a sketch of the comparison step appears after this list):

  • Compare actual to expected results.
  • Identify discrepancies or mismatches in behavior.
    • Document discrepancies individually for ease of tracking through the resolution process.
    • Determine cause of the issue, including problems with the validation methods, criteria, or environment.
  • Identify the changes required to address discrepancies.
  • Evaluate and record the impact of changes needed to correct issues/discrepancies.
    • Rework and test.
    • Re-validation.
  • Obtain and record approval for changes to be made versus those to be addressed at a different time.
  • Measure and predict quality of the software based on the validation results (typically, a software assurance activity).
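To illustrate the comparison and documentation steps, here is a small Python sketch that compares actual to expected results and records each mismatch as its own discrepancy. The Discrepancy type and analyze helper are hypothetical; real analysis also weighs causes such as faulty validation methods, criteria, or environments.

    from dataclasses import dataclass

    @dataclass
    class Discrepancy:
        item_id: str
        expected: str
        actual: str
        cause: str = "unknown"  # software defect, or a problem with the
                                # validation method, criteria, or environment

    def analyze(item_id: str, expected: str, actual: str) -> Discrepancy | None:
        """Return a Discrepancy when actual and expected results differ.

        Each mismatch is documented individually so it can be tracked
        through the resolution process on its own.
        """
        if actual == expected:
            return None
        return Discrepancy(item_id, expected, actual)

    results = [("SRS-042", "pass", "pass"),
               ("SRS-043", "altitude <= 100 m", "altitude = 112 m")]
    issues = [d for item, exp, act in results
              if (d := analyze(item, exp, act)) is not None]
    # issues now holds one record per discrepancy, per the guidance above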

Tracking discrepancies to closure

When tracking discrepancies to closure, consider the following activities (a sketch of a simple closure workflow appears after this list):

  • Capture the changes to be made (change requests, corrective actions, RIDs, etc.) with, or with reference to, the original validation results.
  • Carry out the approved changes.
  • Follow the chosen solution to completion, including any re-validation necessary to confirm the issue has been resolved.
  • Obtain and record approval for issue close-out.
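A minimal Python sketch of such a workflow, assuming a simple linear set of states; real problem tracking tools and Center procedures define their own states and approvals.

    # Hypothetical closure workflow; state names are illustrative assumptions.
    ALLOWED = {
        "open":            {"change approved", "deferred"},
        "change approved": {"implemented"},
        "implemented":     {"re-validated"},
        "re-validated":    {"closed"},
        "deferred":        {"open"},  # deferred issues may be reopened later
    }

    def advance(state: str, new_state: str, approver: str | None = None) -> str:
        """Move an issue to a new state, enforcing the allowed transitions."""
        if new_state not in ALLOWED.get(state, set()):
            raise ValueError(f"cannot move {state!r} -> {new_state!r}")
        if new_state == "closed" and approver is None:
            raise ValueError("issue close-out requires recorded approval")
        return new_state

    state = "open"
    for step in ("change approved", "implemented", "re-validated"):
        state = advance(state, step)
    state = advance(state, "closed", approver="project manager")  # approval recorded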

Typical content and documentation

The following is a list of information typically captured in the validation results, analysis results, or post-analysis documentation of those results (a sketch of a completeness check over these fields appears after this list). This is not a complete list. Center procedures may call for additional information, or additional information may be needed to fully document the result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.

  • Specific reference (name, identification number, version, etc.) to the item or product (requirement, document, function, system, etc.) being validated.
  • Where appropriate, the specific validation activity applied (test identifier, demonstration step(s), etc.).
    • Include sequence of events leading to problem, if appropriate.
  • Description of the validation result(s).
  • Description of identified issue or problem.
  • Root cause of issue or problem, including problems with the validation environment, process, etc.
  • Criticality of issue or problem, including impact on the item under validation or software system.
  • Recommended correction or change to address the issue.
  • Impact of recommended correction or change.
  • Justification for the chosen resolution, including choices to postpone a change.
  • Approval to implement the chosen resolution.
  • Tracking information such as validator (person, team, etc.) identification, date of activity (validation date, analysis or review date, correction date, follow-up validation activity date, etc.).
  • Approval of issue close-out.

Center or project procedures may require additional documentation not described above.
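As a sketch of how a project might check a record against the fields above before close-out, the following Python fragment uses the typical fields as a completeness checklist. The field names paraphrase the list above and are purely illustrative.

    # Field names paraphrase the typical-content list above; illustrative only.
    TYPICAL_FIELDS = [
        "item_reference", "validation_activity", "result_description",
        "issue_description", "root_cause", "criticality",
        "recommended_change", "change_impact", "resolution_justification",
        "resolution_approval", "tracking_information", "closeout_approval",
    ]

    def missing_fields(record: dict) -> list[str]:
        """Return the typical fields this record has not yet filled in."""
        return [f for f in TYPICAL_FIELDS if not record.get(f)]

    record = {"item_reference": "SRS-043, v2.1",
              "validation_activity": "Test T-17, steps 4-6",
              "result_description": "altitude limit exceeded"}
    print(missing_fields(record))  # fields still to capture before close-out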

Additional documents which may be produced

  • Validation summary report.
  • Status reports.
  • Metric data/reports (see the sketch following this list).
  • Lessons learned.
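As an illustration of the kind of metric data a project might derive from tracked discrepancies, this Python fragment counts issues by state and computes a closure rate. The states follow the hypothetical workflow sketch earlier in this section; actual metric definitions come from project and Center procedures.

    from collections import Counter

    # Hypothetical tracked-issue states; see the closure workflow sketch above.
    states = ["closed", "closed", "open", "implemented", "closed", "deferred"]

    by_state = Counter(states)
    closure_rate = by_state["closed"] / len(states)
    print(by_state)                             # Counter({'closed': 3, ...})
    print(f"closure rate: {closure_rate:.0%}")  # closure rate: 50%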

Reporting results

The team reports the results of validation activities to one or more of the following:

  • Project management.
  • Software assurance, if not involved in the validation activities.
  • Customer, as appropriate.
  • Other stakeholders, as appropriate.

Potential problems when recording, analyzing, and tracking validation results include, but are not limited to:

  • Assuming all issues are problems in the software (issues may be caused by the validation procedures, criteria, or environment).
  • Failing to involve knowledgeable or "expert" personnel in the analysis process.
  • Not reviewing validation results at life cycle product milestone reviews or other relevant reviews (waiting to review such results can carry issues into later life cycle phases).

NASA Centers typically have templates and tools for capturing and tracking validation results. Use the tools available at each particular Center for projects developed at those Centers.

Additional guidance related to Validation may be found in the following related requirements in this Handbook:

  • SWE-029 - Validation Planning
  • SWE-055 - Requirements Validation

4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

5. Resources

  • (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. See Chapter 16. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
  • (SWEREF-279) Wallace, Dolores R., Ippolito, Laura M., Cuthill, Barbara B., Reference Information for the Software Verification and Validation Process, NIST Special Publication 500-234, National Institute of Standards and Technology (NIST), 1996.
  • (SWEREF-334) Software Quality Assurance.org, Accessed December 20, 2017.
  • (SWEREF-551) Public Lessons Learned Entry: 1370.


5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide what tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to validation:

  • Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Produce, test and fly interim versions.) Lesson Number 1370: "Enough rigorous ground and flight testing must be planned to thoroughly exercise the firmware." 551
  • Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Test as much as you can.) Lesson Number 1370: "A lack of comprehensive, end-to-end testing has resulted in a number of spacecraft failures. Deep integration of systems makes them more vulnerable to software issues. As navigation systems become more complex and more deeply integrated, software quality and verification become more important...Testing should also involve any hardware and software that interfaces with the unit. Thorough off line testing of the unit and proposed algorithms that will interface with it should be performed before committing to specific integration architecture. Once the integration has been performed, thorough testing of navigation unit interaction with the rest of the avionics system is needed...End-to-end testing, over the complete flight profile, is required. For space applications, lab tests lasting days or weeks should be conducted. Use good engineering judgment when dispositioning issues, backed up with ground test and flight data." 551
  • Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Value of Independent Verification and Validation.) Lesson Number 1370: "The trend to use NDI avionics containing proprietary software may prevent independent validation and verification of firmware. This is an issue for applications that involve human safety and unmanned applications requiring a high degree of autonomy. The ground and flight test environments will not be able to produce conditions needed to reveal all firmware issues or verify all firmware modifications and fixes. Code audits are needed, both by the vendor and an IV&V organization. Guidelines should be created concerning audit scope and the definition of credible failure scenarios. Lack of an IV&V level firmware audit will result in lingering suspicion about a unit." 551