3. Guidance

The basic validation process is shown below, with the steps addressed by this requirement highlighted:

[Figure: basic validation process diagram]
Recording validation results and activities

Validation results, and the validation activities and validated products that generated them, can be recorded in the following ways, as appropriate for the validation method used (a record-keeping sketch follows the list):

- Validation reports.
- Review meeting minutes/reports.
- Review Item Dispositions (RIDs).
- As-run test logs.
- Demonstration results.
- Analysis reports.
- Beta test reports.
- User group reports.
- Issue tracking system.
- Change requests/change request system.
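One way to tie a result to the artifact that recorded it is a simple record structure, sketched below in Python. All names (ValidationRecord, ArtifactType, the example identifiers) are illustrative assumptions, not part of any NASA tool or Center template.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class ArtifactType(Enum):
    """Ways a validation result can be recorded (from the list above)."""
    VALIDATION_REPORT = "validation report"
    REVIEW_MINUTES = "review meeting minutes/report"
    RID = "review item disposition"
    AS_RUN_TEST_LOG = "as-run test log"
    DEMONSTRATION_RESULT = "demonstration result"
    ANALYSIS_REPORT = "analysis report"
    BETA_TEST_REPORT = "beta test report"
    USER_GROUP_REPORT = "user group report"
    ISSUE_TRACKER = "issue tracking system"
    CHANGE_REQUEST = "change request"

@dataclass
class ValidationRecord:
    """Links a result to the activity and product that generated it."""
    item_ref: str           # product being validated (name, ID, version)
    method: str             # validation method used (test, demo, analysis, ...)
    artifact: ArtifactType  # where the result is recorded
    artifact_ref: str       # identifier of the recording artifact (hypothetical)
    recorded_on: date

# Hypothetical usage: a test result recorded in an as-run test log.
record = ValidationRecord(
    item_ref="SRS-1042 v2.1",
    method="test",
    artifact=ArtifactType.AS_RUN_TEST_LOG,
    artifact_ref="TL-2024-117",
    recorded_on=date(2024, 3, 14),
)
```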
Analyzing validation results

When analyzing validation results to determine whether they support a conclusion that the software will perform as intended in the operational environment, consider the following steps (a small analysis sketch follows the list):

- Compare actual to expected results.
- Identify discrepancies or mismatches in behavior.
- Document discrepancies individually for ease of tracking through the resolution process.
- Determine cause of the issue, including problems with the validation methods, criteria, or environment.
- Identify the changes required to address discrepancies.
- Evaluate and record the impact of changes needed to correct issues/discrepancies.
- Rework and retest.
- Re-validate as needed.
- Obtain and record approval for changes to be made versus those to be addressed at a different time.
- Measure and predict quality of the software based on the validation results (typically, a software assurance activity).
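A minimal sketch of the first two analysis steps, assuming expected and actual results are keyed by test step. The Discrepancy fields and the function name are hypothetical; the point is that each mismatch is documented individually, and its cause is left undetermined until analysis rules out the validation method, criteria, or environment.

```python
from dataclasses import dataclass

@dataclass
class Discrepancy:
    """One mismatch, documented individually for ease of tracking."""
    item_ref: str   # item under validation plus the step that mismatched
    expected: str
    actual: str
    cause: str = "undetermined"  # may lie in the software, or in the
                                 # validation method, criteria, or environment
    status: str = "open"

def compare_results(item_ref: str,
                    expected: dict[str, str],
                    actual: dict[str, str]) -> list[Discrepancy]:
    """Compare actual to expected results and record each mismatch
    as its own Discrepancy so it can be tracked through resolution."""
    found = []
    for step, want in expected.items():
        got = actual.get(step, "<no result recorded>")
        if got != want:
            found.append(Discrepancy(f"{item_ref}/{step}", want, got))
    return found
```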
Tracking discrepancies to closure

When tracking discrepancies to closure, consider the following activities (a lifecycle sketch follows the list):

- Capture the changes to be made (change requests, corrective actions, RIDs, etc.) together with, or with a reference to, the original validation results.
- Carry out the approved changes to be made.
- Follow chosen solution to completion, including any re-validation necessary to confirm the issue has been resolved.
- Obtain and record approval for issue close-out.
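The closure activities above imply a small lifecycle. The sketch below encodes one plausible version of it as a transition table; the state names, and the rule that close-out requires recorded approval, are assumptions rather than a prescribed NASA workflow.

```python
# Hypothetical discrepancy lifecycle: states and allowed transitions.
ALLOWED = {
    "open":            {"change approved", "deferred"},
    "change approved": {"reworked"},
    "reworked":        {"re-validated"},
    "re-validated":    {"closed", "open"},  # reopen if re-validation fails
    "deferred":        {"open"},
}

def advance(status: str, new_status: str, closeout_approved: bool = False) -> str:
    """Move a discrepancy to its next state, enforcing the workflow
    and requiring recorded approval before close-out."""
    if new_status not in ALLOWED.get(status, set()):
        raise ValueError(f"cannot go from {status!r} to {new_status!r}")
    if new_status == "closed" and not closeout_approved:
        raise ValueError("close-out requires recorded approval")
    return new_status

# Hypothetical usage: rework, re-validate, then close with approval.
s = "open"
for step in ("change approved", "reworked", "re-validated"):
    s = advance(s, step)
s = advance(s, "closed", closeout_approved=True)
```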
Typical content and documentation

The following is a list of information typically captured in the validation results, analysis results, or in the post-analysis documentation of those results (a record-type sketch follows the list). This is not a complete list: Center procedures may call for additional information, or additional information may be needed to fully document the result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.

- Specific reference (name, identification number, version, etc.) to the item or product (requirement, document, function, system, etc.) being validated.
- Where appropriate, the specific validation activity applied (test identifier, demonstration step(s), etc.).
- Sequence of events leading to the problem, if appropriate.
- Description of the validation result(s).
- Description of identified issue or problem.
- Root cause of issue or problem, including problems with the validation environment, process, etc.
- Criticality of issue or problem, including impact on the item under validation or software system.
- Recommended correction or change to address the issue.
- Impact of recommended correction or change.
- Justification for the chosen resolution, including choices to postpone a change.
- Approval to implement the chosen resolution.
- Tracking information such as validator (person, team, etc.) identification, date of activity (validation date, analysis or review date, correction date, follow-up validation activity date, etc.).
- Approval of issue close-out.
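One way to hold these fields together is a single record type; the sketch below mirrors the list above directly. Field names and types are illustrative assumptions, not a mandated schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IssueRecord:
    """Typical content of a documented validation result or issue
    (mirrors the list above; all names are illustrative)."""
    item_ref: str                                   # item/product being validated
    activity_ref: Optional[str] = None              # test ID, demonstration step(s)
    event_sequence: Optional[str] = None            # events leading to the problem
    result_description: str = ""
    issue_description: Optional[str] = None
    root_cause: Optional[str] = None                # incl. environment/process causes
    criticality: Optional[str] = None               # impact on item or system
    recommended_change: Optional[str] = None
    change_impact: Optional[str] = None
    resolution_justification: Optional[str] = None  # incl. rationale for postponement
    resolution_approved: bool = False
    validator: Optional[str] = None                 # person or team
    dates: dict = field(default_factory=dict)       # validation, analysis,
                                                    # correction, follow-up dates
    closeout_approved: bool = False
```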
Center or project procedures may require additional documentation not described above.

Additional documents which may be produced:

- Validation summary report.
- Status reports.
- Metric data/reports.
- Lessons learned.
Reporting results

The team reports the results of validation activities to one or more of the following:

- Project management.
- Software assurance, if not involved in the validation activities.
- Customer, as appropriate.
- Other stakeholders, as appropriate.
Potential problems when recording, analyzing, and tracking validation results include, but are not limited to:

- Assuming all issues are problems in the software (issues may be caused by the validation procedures, criteria, or environment).
- Failing to involve knowledgeable or "expert" personnel in the analysis process.
- Not reviewing validation results at life-cycle product milestone reviews or other relevant reviews (waiting to review such results can carry issues into later life-cycle phases).
NASA Centers typically have templates and tools for capturing and tracking validation results. Use the tools available at the Center where the project is being developed.
Additional guidance related to Validation may be found in the following related requirements in this Handbook: