


1. Requirements
2.4.3 The project shall record, address, and track to closure the results of software verification activities.
1.1 Notes">1.1 Notes
NPR 7150.2 does not include any notes for this requirement.
1.2 Applicability Across Classes
Class G is labeled with "P(Center)". This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.
Class | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H
---|---|---|---|---|---|---|---|---|---|---|---|---|---
Applicable? | | | | | | | | | | | | P(C) | |

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | - Applicable | - Not Applicable | X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures
2. Rationale
Simply performing verification is only one of the elements necessary to ensure the software work product meets the specified requirements. The software development team must capture the results of verification activities and document the findings to prove the verification was conducted correctly. All identified issues should be analyzed to determine their causes. The team should then identify associated resolutions, and track to closure the work required to implement those resolutions. Successful closure assures that all requirements are satisfied (including the development and resolution of waiver requests, if needed).
Where resolution involves revising the verification process, environment, or related elements, tracking those changes helps identify and document process improvements. Implementing these improvements yields more accurate verification results in the future.
3. Guidance
[Figure: the basic verification process, with the steps addressed by this requirement highlighted.]
The following list provides examples of information typically captured in the verification results, the analysis results, or the post-analysis documentation of those results; a sketch of such a record follows the list. Center procedures may call for additional items, or additional information may be needed to fully document a result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.
- A full identification of the system and the software, including, as applicable, identification number(s), title(s), abbreviation(s), version number(s), and release number(s)
- An assessment of the manner in which the verification environment may be different from the operational environment and the effect of this difference on the results
- The results of the verification activity, including all verified requirements. The summary should include information on the requirements evaluated, the success of any tests performed, and references to any problems or deviations
- An overall assessment of the software performance and quality as demonstrated by the results in this report
- A list of any deficiencies, limitations, or constraints that were detected; problem or change reports may be used to provide deficiency information (e.g., a description of the impact on software and system performance, the impact a correction would have on the software, system design, and associated documentation, and recommendations for correcting the deficiency, limitation, or constraint)
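The items above lend themselves to a structured record. Below is a minimal sketch in Python of what such a record might look like; the schema and field names (e.g., `environment_notes`, `deficiencies`) are illustrative assumptions, not a prescribed NASA format:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationResult:
    """Hypothetical record of one verification activity (illustrative only)."""
    system_id: str                 # identification number of the system/software
    title: str                     # title or abbreviation
    version: str                   # version and release numbers
    environment_notes: str         # differences from the operational environment
    requirements_verified: List[str] = field(default_factory=list)
    overall_assessment: str = ""   # demonstrated performance and quality
    deficiencies: List[str] = field(default_factory=list)  # problem/change report IDs

# Example: a record noting one open problem report against a testbed run
result = VerificationResult(
    system_id="SYS-042",
    title="FSW Build 3",
    version="3.1.0, release R2",
    environment_notes="Testbed uses simulated sensor inputs, not flight hardware.",
    requirements_verified=["SRS-101", "SRS-102"],
    overall_assessment="All tested requirements met except SRS-102.",
    deficiencies=["PR-0456"],
)
```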
The results, and the verification activities and verified products that generated them, can be recorded in the following ways, as appropriate for the method used:
- Verification reports
- Review meeting minutes / reports
- Review Item Dispositions (RIDs)
- As-run test logs
- Demonstration results
- Analysis reports
- User group reports
- Peer review results
- Peer review analysis reports
- Issue tracking system
- Change requests / change request system
The Tools section lists several examples of templates for this type of document. See [SWE-028] and the Note in [SWE-102] for information on what verification includes. See [SWE-118] for information on software test reporting. See [SWE-031] for related information on reporting validation results that may be helpful here.
When analyzing verification results to determine whether they support a conclusion that the software satisfies the specified requirements, consider the following steps (a sketch of the first few appears after the list):
- Compare actual to expected results
- Identify discrepancies or mismatches in specification or behavior
- Document discrepancies individually for ease of tracking through the resolution process
- Determine cause of the issue, including problems with the verification methods, criteria, or environment
- Identify the changes required to address discrepancies
- Evaluate and record the impact of changes needed to correct issues / discrepancies
- Plan for any repeat of verification effort (See [SWE-028])
- Obtain and record approval for changes to be made to the current project versus those to be addressed at a different time as part of a general process improvement activity
- Measure and predict quality of the software based on the verification results (typically, a software assurance activity)
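As a minimal sketch of the first three steps above (comparing actual to expected results and documenting each discrepancy individually), assuming results are keyed by requirement ID; all names here are illustrative:

```python
def find_discrepancies(expected, actual):
    """Compare expected vs. actual results and return one record per
    mismatch, so each discrepancy can be tracked to closure separately."""
    discrepancies = []
    for req_id, expected_value in expected.items():
        observed = actual.get(req_id)          # None if no result was recorded
        if observed != expected_value:
            discrepancies.append({
                "requirement": req_id,
                "expected": expected_value,
                "actual": observed,
                "status": "open",              # closed only after resolution
            })
    return discrepancies

# Example: SRS-102 failed, so one discrepancy record is produced
issues = find_discrepancies(
    expected={"SRS-101": "PASS", "SRS-102": "PASS"},
    actual={"SRS-101": "PASS", "SRS-102": "FAIL"},
)
```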
When tracking discrepancies to closure, consider the following activities (a simple closure check is sketched after the list):
- Capture the changes to be made (change requests, corrective actions, RIDs)
- Carry out the approved changes
- Follow chosen solution to completion, including any verification activities necessary to confirm the issue has been resolved
- Obtain and record approval for issue close-out
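A simple way to enforce the activities above is to require every tracking step to be complete before an issue can close. The sketch below assumes a four-step workflow; the step names are hypothetical, not mandated by the requirement:

```python
# Hypothetical closure workflow: capture the change, implement it,
# re-verify, then obtain documented approval for close-out.
CLOSURE_STEPS = ("change_captured", "change_implemented",
                 "reverified", "closure_approved")

def can_close(record):
    """Return True only when every closure step is complete."""
    return all(record.get(step) for step in CLOSURE_STEPS)

# Example: re-verification done, but close-out approval still pending
record = {"change_captured": True, "change_implemented": True,
          "reverified": True, "closure_approved": False}
assert not can_close(record)   # the issue must remain open
```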
Center or project procedures may require additional documentation not described above.
4. Small Projects
There is currently no guidance specific to small projects for this requirement.
5. Resources
- (SWEREF-003) NASA Independent Verification and Validation Program, Independent Verification and Validation Technical Framework, IVV 09-1, Revision P, Effective Date: February 26, 2016
- (SWEREF-041) NPR 7123.1C, NASA Systems Engineering Processes and Requirements, Office of the Chief Engineer, Effective Date: February 14, 2020, Expiration Date: February 14, 2025
- (SWEREF-209) IEEE Computer Society, "IEEE Standard for System and Software Verification and Validation," IEEE Std 1012-2012 (Revision of IEEE Std 1012-2004). This link requires an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov). Once logged in, users can access Standards Organizations, IEEE, and then search to get authorized copies of IEEE standards.
- (SWEREF-273) NASA Systems Engineering Handbook, NASA SP-2016-6105 Rev 2 (supersedes SP-2007-6105 Rev 1, December 2007)
- (SWEREF-279) Dolores R. Wallace, Laura M. Ippolito, Barbara B. Cuthill, "Reference Information for the Software Verification and Validation Process," NIST Special Publication 500-234, National Institute of Standards and Technology (NIST), 1996
5.1 Tools
Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.