SWE-031

1. Requirements

2.4.4 The project shall record, address, and track to closure the results of software validation activities.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.


Applicability across classes:

  • Classes A through D (safety-critical and not safety-critical), Class E (safety-critical), and Class F: applicable.
  • Class G: P (Center).
  • Class E (not safety-critical) and Class H: not applicable.




2. Rationale

Simply performing validation is not sufficient to ensure the software will perform as intended in the customer environment. The project team must capture the results of validation activities to prove the validation was conducted and to document the findings from those activities. Identified issues need to be analyzed to determine the cause. The team then identifies associated resolutions, and tracks to closure the work to implement those resolutions. This completes the cycle and provides the best assurance that the software will be of high quality and perform as intended.

Where resolution involves revision to the validation process, environment, etc., tracking such issues helps to document process improvements and improve process quality resulting in more accurate validation results in the future.



3. Guidance

The basic validation process is shown below with the steps addressed by this requirement highlighted:

[Image: validate2.png]

Recording validation results and activities

Results and the validation activities and validated products that generated them can be recorded in the following ways, as appropriate for the validation method used:

  • Validation reports.
  • Review meeting minutes/reports.
  • Review Item Dispositions (RIDs).
  • As-run test logs.
  • Demonstration results.
  • Analysis reports.
  • Beta test reports.
  • User group reports.
  • Issue tracking system.
  • Change requests/change request system.

Analyzing validation results

When analyzing validation results to determine whether those results support a conclusion that the software will perform as intended in the operational environment, consider the following steps:

  • Compare actual to expected results.
  • Identify discrepancies or mismatches in behavior.
    • Document discrepancies individually for ease of tracking through the resolution process.
    • Determine cause of the issue, including problems with the validation methods, criteria, or environment.
  • Identify the changes required to address discrepancies.
  • Evaluate and record the impact of changes needed to correct issues/discrepancies.
    • Rework and test.
    • Re-validation.
  • Obtain and record approval for changes to be made versus those to be addressed at a different time.
  • Measure and predict quality of the software based on the validation results (typically, a software assurance activity).
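
The comparison and discrepancy-identification steps above can be sketched in code. This is a minimal, hypothetical sketch; the function and field names are illustrative and do not come from NPR 7150.2 or any Center tool:

```python
# Hypothetical sketch: compare actual to expected validation results and
# record each discrepancy individually so it can be tracked to closure.

def analyze_results(expected, actual):
    """Return one discrepancy record per item whose actual result
    does not match the expected result."""
    discrepancies = []
    for item, expected_value in expected.items():
        actual_value = actual.get(item)
        if actual_value != expected_value:
            discrepancies.append({
                "item": item,
                "expected": expected_value,
                "actual": actual_value,
                "cause": None,        # filled in during analysis (may be the
                                      # validation method, criteria, or environment)
                "resolution": None,   # filled in once a change is approved
            })
    return discrepancies

# Illustrative data only
expected = {"max_descent_rate": 2.5, "touchdown_signal": True}
actual = {"max_descent_rate": 3.1, "touchdown_signal": True}
issues = analyze_results(expected, actual)
```

Documenting each mismatch as its own record, as the list above recommends, is what allows each issue to carry its own cause, resolution, and approval status through the tracking process.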

Tracking discrepancies to closure

When tracking discrepancies to closure, consider the following activities:

  • Capture the changes to be made (change requests, corrective actions, RIDs, etc.) with, or with reference to, the original validation results.
  • Carry out the approved changes to be made.
  • Follow chosen solution to completion, including any re-validation necessary to confirm the issue has been resolved.
  • Obtain and record approval for issue close-out.
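
The closure activities above amount to an ordered set of status transitions, ending with close-out approval. A minimal sketch, assuming a hypothetical five-state life cycle (recorded, approved, implemented, revalidated, closed); real issue tracking systems define their own states and rules:

```python
# Hypothetical sketch of tracking a discrepancy to closure: a record may
# only be closed after the approved change has been implemented and
# re-validated, mirroring the activities listed above.

ALLOWED = {
    "recorded": "approved",        # approval to make the change
    "approved": "implemented",     # carry out the approved change
    "implemented": "revalidated",  # re-validation confirms the fix
    "revalidated": "closed",       # approval for issue close-out
}

class Discrepancy:
    def __init__(self, ident):
        self.ident = ident
        self.status = "recorded"

    def advance(self, new_status):
        # Reject any attempt to skip a step (e.g., closing without re-validation).
        if ALLOWED.get(self.status) != new_status:
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status

d = Discrepancy("RID-042")  # illustrative identifier
for step in ("approved", "implemented", "revalidated", "closed"):
    d.advance(step)
```

The design point is that the tool, not the team's memory, enforces that re-validation and close-out approval happen before an issue is considered closed.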

Typical content and documentation

The following is a list of information typically captured in the validation results, analysis results, or in the post-analysis documentation of those results. This is not a complete list. Center procedures may call for additional information, or additional information may be needed to document fully the result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.

  • Specific reference (name, identification number, version, etc.) to the item or product (requirement, document, function, system, etc.) being validated.
  • Where appropriate, the specific validation activity applied (test identifier, demonstration step(s), etc.).
    • Include sequence of events leading to problem, if appropriate.
  • Description of the validation result(s).
  • Description of identified issue or problem.
  • Root cause of issue or problem, including problems with the validation environment, process, etc.
  • Criticality of issue or problem, including impact on the item under validation or software system.
  • Recommended correction or change to address the issue.
  • Impact of recommended correction or change.
  • Justification for the chosen resolution, including choices to postpone a change.
  • Approval to implement the chosen resolution.
  • Tracking information such as validator (person, team, etc.) identification, date of activity (validation date, analysis or review date, correction date, follow-up validation activity date, etc.).
  • Approval of issue close-out.
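
The fields above can be collected into a single record per validation result. This is a hypothetical sketch; the field names are illustrative, and any Center template or tracking tool would define its own:

```python
# Hypothetical record structure covering the typical fields listed above.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ValidationResult:
    item_ref: str                  # item/product being validated (name, id, version)
    activity: str                  # validation activity applied (test id, demo step)
    result: str                    # description of the validation result(s)
    issue: Optional[str] = None    # identified issue or problem, if any
    root_cause: Optional[str] = None          # including validation environment/process problems
    criticality: Optional[str] = None         # impact on the item or software system
    recommended_change: Optional[str] = None  # correction or change to address the issue
    change_impact: Optional[str] = None
    resolution_justification: Optional[str] = None  # including choices to postpone
    resolution_approved: bool = False
    validator: str = ""                        # person or team identification
    dates: dict = field(default_factory=dict)  # activity dates keyed by type
    closed: bool = False                       # approval of issue close-out
```

A problem tracking or corrective action tool may populate several of these fields (identifiers, dates, approvals) automatically, as noted above.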

Center or project procedures may require additional documentation not described above.


Additional documents which may be produced:

  • Validation summary report
  • Status reports
  • Metric data/reports
  • Lessons learned



Reporting results

The team reports the results of validation activities to one or more of the following:

  • Project management.
  • Software assurance, if not involved in the validation activities.
  • Customer, as appropriate.
  • Other stakeholders, as appropriate.

Potential problems when recording, analyzing and tracking validation results include, but are not limited to:

  • Assuming all issues are problems in the software (issues may be caused by the validation procedures, criteria, or environment).
  • Failing to involve knowledgeable or "expert" personnel in the analysis process.
  • Not reviewing validation results at life cycle product milestone reviews, or other relevant reviews (waiting to review such results can carry issues into later life cycle phases).


Note: NASA Centers typically have templates and tools for capturing and tracking validation results. Use the tools available at each particular Center for projects developed at those Centers.


Additional guidance related to validation may be found in the following related requirements in this Handbook:

  • SWE-029, Validation Planning
  • SWE-055, Requirements Validation




4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.



5. Resources

  1. The CMMi easy button presentation of CMMi -- Validation (VAL) (SP 2.2 Analyze Validation Results), Software Quality Assurance.org: http://www.software-quality-assurance.org/cmmi-validation.html
  2. Reference Information for the Software Verification and Validation Process, NIST Special Publication 500-234, 1996: http://hissa.ncsl.nist.gov/HHRFdata/Artifacts/ITLdoc/234/val-proc.html
  3. Software Development Process Description Document, EI32-OI-001, Revision R, 2010 (hosted on NEN): https://nen.nasa.gov/web/software/nasa-software-process-asset-library-pal?p_p_id=webconnector_WAR_webconnector_INSTANCE_PA7b&p_p_lifecycle=1&p_p_state=normal&p_p_mode=view&p_p_col_id=column-2&p_p_col_count=1&_webconnector_WAR_webconnector_INSTANCE_PA7b_edu.wisc.my.webproxy.URL=https%3A%2F%2Fnx.arc.nasa.gov%2Fnx%2Fdsweb%2FGet%2FDocument-499471%2FSDPDD_Rev%2BQ_080207.pdf

5.1 Tools

There are no tools currently identified for this requirement.


6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to validation:

  • Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Produce, test and fly interim versions.) Lesson Number 1370: "Enough rigorous ground and flight testing must be planned to thoroughly exercise the firmware."
  • Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Test as much as you can.) Lesson Number 1370: "A lack of comprehensive, end-to-end testing has resulted in a number of spacecraft failures. Deep integration of systems makes them more vulnerable to software issues. As navigation systems become more complex and more deeply integrated, software quality and verification become more important...Testing should also involve any hardware and software that interfaces with the unit. Thorough off line testing of the unit and proposed algorithms that will interface with it should be performed before committing to specific integration architecture. Once the integration has been performed, thorough testing of navigation unit interaction with the rest of the avionics system is needed...End-to-end testing, over the complete flight profile, is required. For space applications, lab tests lasting days or weeks should be conducted. Use good engineering judgment when dispositioning issues, backed up with ground test and flight data."
  • Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Value of Independent Verification and Validation.) Lesson Number 1370: "The trend to use NDI avionics containing proprietary software may prevent independent validation and verification of firmware. This is an issue for applications that involve human safety and unmanned applications requiring a high degree of autonomy. The ground and flight test environments will not be able to produce conditions needed to reveal all firmware issues or verify all firmware modifications and fixes. Code audits are needed, both by the vendor and an IV&V organization. Guidelines should be created concerning audit scope and the definition of credible failure scenarios. Lack of an IV&V level firmware audit will result in lingering suspicion about a unit."