SWE-031
1. Requirements
2.4.4 The project shall record, address, and track to closure the results of software validation activities.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.
Class A (SC and NSC): Applicable
Class B (SC and NSC): Applicable
Class C (SC and NSC): Applicable
Class D (SC and NSC): Applicable
Class E (SC): Applicable
Class E (NSC): Not applicable
Class F: Applicable
Class G: P (Center)
Class H: Not applicable
(SC = safety critical; NSC = not safety critical)
2. Rationale
Simply performing validation is not sufficient to ensure the software will perform as intended in the customer environment. The project team must capture the results of validation activities to prove the validation was conducted and to document the findings from those activities. Identified issues need to be analyzed to determine the cause. The team then identifies associated resolutions, and tracks to closure the work to implement those resolutions. This completes the cycle and provides the best assurance that the software will be of high quality and perform as intended.
Where resolution involves revising the validation process, environment, etc., tracking such issues helps document process improvements and improves process quality, resulting in more accurate validation results in the future.
3. Guidance
The basic validation process includes recording validation results, analyzing those results, and tracking identified issues to closure; each of these steps, which this requirement addresses, is described below.
Recording validation results and activities
Validation results, together with the validation activities and validated products that generated them, can be recorded in the following ways, as appropriate for the validation method used (a minimal recording sketch follows the list):
Validation reports.
Review meeting minutes/reports.
Review Item Dispositions (RIDs).
As-run test logs.
Demonstration results.
Analysis reports.
Beta test reports.
User group reports.
Issue tracking system.
Change requests/change request system.
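As an illustration only, the sketch below shows one lightweight way to capture a validation result as a structured entry in an as-run log. The function name, field names, and file format are assumptions, not a prescribed NASA or Center format; most projects would record this information in their Center's issue tracking or test management tools instead.

```python
# Illustrative sketch only: append a validation result to a JSON-lines
# "as-run" log. Names, fields, and file format are assumptions.
import json
from datetime import datetime, timezone
from pathlib import Path

def record_validation_result(log_path: Path, item_id: str, activity: str,
                             method: str, outcome: str, notes: str = "") -> dict:
    """Append one validation result to a JSON-lines as-run log and return it."""
    entry = {
        "item_id": item_id,        # product or requirement being validated
        "activity": activity,      # e.g., test identifier or demonstration step
        "method": method,          # test, demonstration, analysis, beta test, ...
        "outcome": outcome,        # e.g., "pass", "fail", "anomaly"
        "notes": notes,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with log_path.open("a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return entry

if __name__ == "__main__":
    record_validation_result(Path("as_run_log.jsonl"),
                             item_id="SRS-042", activity="TC-017",
                             method="test", outcome="fail",
                             notes="Display update lagged operator input.")
```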
Analyzing validation results
When analyzing validation results to determine whether those results support a conclusion that the software will perform as intended in the operational environment, consider the following steps:
Compare actual to expected results.
Identify discrepancies or mismatches in behavior.
Document discrepancies individually for ease of tracking through the resolution process.
Determine cause of the issue, including problems with the validation methods, criteria, or environment.
Identify the changes required to address discrepancies.
Evaluate and record the impact of changes needed to correct issues/discrepancies.
Rework and retest.
Re-validate as needed.
Obtain and record approval for changes to be made versus those to be addressed at a different time.
Measure and predict quality of the software based on the validation results (typically, a software assurance activity).
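To make these analysis steps concrete, the hedged Python sketch below compares actual results to expected results, documents each discrepancy individually, and computes a simple pass rate as one possible quality measure. All names and fields are illustrative assumptions, not a mandated analysis method.

```python
# Illustrative sketch only: compare actual to expected results, document each
# discrepancy individually, and compute a simple pass rate as one possible
# quality measure. Field names and the Discrepancy structure are assumptions.
from dataclasses import dataclass

@dataclass
class Discrepancy:
    item_id: str
    expected: str
    actual: str
    suspected_cause: str = "unknown"  # software, validation method, criteria, or environment

def analyze(results: list[dict]) -> tuple[list[Discrepancy], float]:
    """Return individually documented discrepancies and an overall pass rate."""
    discrepancies = [
        Discrepancy(r["item_id"], r["expected"], r["actual"])
        for r in results
        if r["actual"] != r["expected"]  # identify mismatches in behavior
    ]
    pass_rate = 1.0 - len(discrepancies) / len(results) if results else 1.0
    return discrepancies, pass_rate

if __name__ == "__main__":
    results = [
        {"item_id": "SRS-042", "expected": "alarm within 2 s", "actual": "alarm within 5 s"},
        {"item_id": "SRS-043", "expected": "telemetry logged", "actual": "telemetry logged"},
    ]
    issues, rate = analyze(results)
    print(f"{len(issues)} discrepancy(ies); pass rate {rate:.0%}")
```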
Tracking discrepancies to closure
When tracking discrepancies to closure, consider the following activities:
Capture the changes to be made (change requests, corrective actions, RIDs, etc.) together with, or with a reference to, the original validation results.
Carry out the approved changes.
Follow the chosen solution to completion, including any re-validation necessary to confirm the issue has been resolved.
Obtain and record approval for issue close-out.
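One way to make these closure activities enforceable is a simple state model. The hedged sketch below assumes a hypothetical set of states and requires that an approved change be implemented and re-validated, and that close-out be approved, before an issue can be closed; in practice, projects would realize this in their existing issue tracking or change request system.

```python
# Illustrative sketch only: a tiny state machine enforcing that a discrepancy
# cannot be closed until the approved change is implemented, re-validated, and
# close-out is approved. States and transitions are assumptions, not a
# prescribed workflow.

ALLOWED_TRANSITIONS = {
    "open":            {"change_approved"},
    "change_approved": {"implemented"},
    "implemented":     {"revalidated"},
    "revalidated":     {"closed"},   # close-out approval is recorded at this step
    "closed":          set(),
}

class TrackedIssue:
    def __init__(self, issue_id: str):
        self.issue_id = issue_id
        self.state = "open"
        self.history = ["open"]      # audit trail of every state the issue passed through

    def advance(self, new_state: str) -> None:
        if new_state not in ALLOWED_TRANSITIONS[self.state]:
            raise ValueError(f"{self.issue_id}: cannot move from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)

if __name__ == "__main__":
    issue = TrackedIssue("RID-0123")
    for step in ("change_approved", "implemented", "revalidated", "closed"):
        issue.advance(step)
    print(issue.history)  # ['open', 'change_approved', 'implemented', 'revalidated', 'closed']
```

Keeping the transition table explicit makes the required re-validation step and the full audit trail visible in the record itself.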
Typical content and documentation
The following is a list of information typically captured in the validation results, analysis results, or in the post-analysis documentation of those results. This is not a complete list. Center procedures may call for additional information, or additional information may be needed to document fully the result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.
Specific reference (name, identification number, version, etc.) to the item or product (requirement, document, function, system, etc.) being validated.
Where appropriate, the specific validation activity applied (test identifier, demonstration step(s), etc.).
Sequence of events leading to the problem, if appropriate.
Description of the validation result(s).
Description of identified issue or problem.
Root cause of issue or problem, including problems with the validation environment, process, etc.
Criticality of issue or problem, including impact on the item under validation or software system.
Recommended correction or change to address the issue.
Impact of recommended correction or change.
Justification for the chosen resolution, including choices to postpone a change.
Approval to implement the chosen resolution.
Tracking information such as validator (person, team, etc.) identification, date of activity (validation date, analysis or review date, correction date, follow-up validation activity date, etc.).
Approval of issue close-out.
Center or project procedures may require additional documentation not described above.
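For illustration, the hedged sketch below collects the typical content listed above into a single record structure. The field names and types are assumptions; Center templates or problem-tracking tools would normally define the authoritative schema and may populate some fields automatically.

```python
# Illustrative sketch only: a record structure mirroring the typical content
# listed above. Field names and types are assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ValidationFinding:
    item_reference: str                        # name/ID/version of the item being validated
    validation_activity: str                   # test identifier, demonstration step(s), etc.
    result_description: str
    event_sequence: Optional[str] = None       # sequence of events leading to the problem
    issue_description: Optional[str] = None
    root_cause: Optional[str] = None           # may implicate the environment or process, not just the software
    criticality: Optional[str] = None          # impact on the item under validation or the software system
    recommended_change: Optional[str] = None
    change_impact: Optional[str] = None
    resolution_justification: Optional[str] = None   # including a choice to postpone a change
    approvals: list[str] = field(default_factory=list)           # implementation and close-out approvals
    validator: Optional[str] = None
    activity_dates: dict[str, str] = field(default_factory=dict)  # e.g., {"validation": "2026-02-14"}
```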
The team reports the results of validation activities to one or more of the following:
Project management.
Software assurance, if not involved in the validation activities.
Customer, as appropriate.
Other stakeholders, as appropriate.
Potential problems when recording, analyzing and tracking validation results include, but are not limited to:
Assuming all issues are problems in the software (issues may be caused by the validation procedures, criteria, or environment).
Failing to involve knowledgeable or "expert" personnel in the analysis process.
Not reviewing validation results at life-cycle product milestone reviews or other relevant reviews (waiting to review such results can carry issues into later life-cycle phases).
Note: NASA Centers typically have templates and tools for capturing and tracking validation results. Use the tools available at the Center where the project is developed.
Additional guidance related to Validation may be found in the related requirements in this Handbook.
4. Small Projects
No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.
5. Resources
5.1 References
5.2 Tools
6. Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to validation:
Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Produce, test and fly interim versions.) Lesson Number 1370: "Enough rigorous ground and flight testing must be planned to thoroughly exercise the firmware." (SWEREF-551)
Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Test as much as you can.) Lesson Number 1370: "A lack of comprehensive, end-to-end testing has resulted in a number of spacecraft failures. Deep integration of systems makes them more vulnerable to software issues. As navigation systems become more complex and more deeply integrated, software quality and verification become more important...Testing should also involve any hardware and software that interfaces with the unit. Thorough off line testing of the unit and proposed algorithms that will interface with it should be performed before committing to specific integration architecture. Once the integration has been performed, thorough testing of navigation unit interaction with the rest of the avionics system is needed...End-to-end testing, over the complete flight profile, is required. For space applications, lab tests lasting days or weeks should be conducted. Use good engineering judgment when dispositioning issues, backed up with ground test and flight data." (SWEREF-551)
Flights of "Off the Shelf" Aviation Navigation Units on the Space Shuttle, GPS (Value of Independent Verification and Validation.) Lesson Number 1370: "The trend to use NDI avionics containing proprietary software may prevent independent validation and verification of firmware. This is an issue for applications that involve human safety and unmanned applications requiring a high degree of autonomy. The ground and flight test environments will not be able to produce conditions needed to reveal all firmware issues or verify all firmware modifications and fixes. Code audits are needed, both by the vendor and an IV&V organization. Guidelines should be created concerning audit scope and the definition of credible failure scenarios. Lack of an IV&V level firmware audit will result in lingering suspicion about a unit."