{alias:SWE-030} 
{get-data:SWEREF-003}{get-data}

{tabsetup:1. The Requirement|2. Rationale|3. Guidance|4. Small Projects|5. Resources|6. Lessons Learned}

{div3:id=tabs-1}

h1. 1. Requirements

2.4.3 The project shall record, address, and track to closure the results of software verification activities.

h2. {color:#003366}{*}1.1 Notes{*}{color}

NPR 7150.2 does not include any notes for this requirement.

h2. 1.2 Applicability Across Classes

Class G is labeled with "P(Center)". This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.
\\
{applicable:asc=1|ansc=1|bsc=1|bnsc=1|csc=1|cnsc=1|dsc=1|dnsc=1|esc=1|ensc=0|f=1|g=p|h=0}
{div3}
{div3:id=tabs-2}

h1. 2. Rationale

Simply performing verification is only one of the elements necessary to ensure the software work product meets the specified requirements. The software development team must capture the results of verification activities and document the findings to prove the verification was conducted correctly. All identified issues should be analyzed to determine their causes. The team should then identify associated resolutions, and track to closure the work required to implement those resolutions. Successful closure assures that all requirements are satisfied (including the development and resolution of waiver requests, if needed).

Where resolution involves revision to the verification process, environment, etc., tracking such changes helps to identify and document process improvements. Implementing these process improvements increases quality and produces more accurate verification results in the future.


{div3}
{div3:id=tabs-3}

h1. 3. Guidance

The basic verification process is shown below with the steps addressed by this requirement highlighted:


!SWE-030 graphic.jpg|border=0!


The following list provides examples of information typically captured in the verification results, analysis results, or in the post-analysis documentation of those results. Center procedures may call for additional items, or additional information may be needed to document fully the result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.

* A full identification of the system and the software, including, as applicable, identification number(s), title(s), abbreviation(s), version number(s), and release number(s)

* An assessment of the manner in which the verification environment may be different from the operational environment and the effect of this difference on the results

* The results of the verification activity, including all verified requirements; the summary should include information on the requirements evaluated, the success of the test (if tested), and references to any problems or deviations

* An overall assessment of the software performance and quality as demonstrated by the results in this report

* A list of any deficiencies, limitations, or constraints that were detected; problem or change reports may be used to provide deficiency information (e.g., a description of the impact on software and system performance, the impact a correction would have on software, system design, and associated documentation, and recommendations for correcting the deficiency, limitation, or constraint)
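
The fields above can be sketched as a simple record structure. This is a hypothetical illustration of one way to capture a verification result, not a format mandated by NPR 7150.2; all field and identifier names are assumptions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationResult:
    """One record of a software verification activity (illustrative only)."""
    system_id: str                      # system/software identification number
    software_version: str               # version and release numbers
    requirements_evaluated: List[str]   # requirement IDs covered by this activity
    environment_differences: str        # how the verification environment differs from operations
    overall_assessment: str             # demonstrated performance and quality
    deficiencies: List[str] = field(default_factory=list)  # problem/change report IDs

# Example record (all values are made up for illustration)
result = VerificationResult(
    system_id="SYS-042",
    software_version="2.1.0",
    requirements_evaluated=["SRS-101", "SRS-102"],
    environment_differences="Flight sensors simulated; timing fidelity reduced",
    overall_assessment="All evaluated requirements passed; one timing anomaly noted",
    deficiencies=["PR-0007"],
)
```

A structure like this makes each result traceable to the requirements it covers and links deficiencies to the problem reports that track them.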

The results, and the verification activities and verified products that generated them, can be recorded in the following ways, as appropriate for the method used:

* Verification reports
* Review meeting minutes / reports
* Review Item Dispositions (RIDs)
* As-run test logs
* Demonstration results
* Analysis reports
* User group reports
* Peer review reports
* Peer review analysis results
* Issue tracking system
* Change requests / change request system

The Tools section lists several examples of templates for this type of document. See [SWE-028|SWE-028] and the Note in [SWE-102|SWE-102] for information on what verification includes. See [SWE-118|SWE-118] for information on software test reporting. See [SWE-031|SWE-031] for related information on reporting validation results that may be helpful here.

When analyzing verification results to determine whether those results support a conclusion that the software satisfies the specified requirements, consider the following steps:

* Compare actual to expected results
* Identify discrepancies or mismatches in specification or behavior
** Document discrepancies individually for ease of tracking through the resolution process
** Determine cause of the issue, including problems with the verification methods, criteria, or environment
* Identify the changes required to address discrepancies
* Evaluate and record the impact of changes needed to correct issues / discrepancies
** Plan for any repeat of verification effort (See [SWE-028|SWE-028])
* Obtain and record approval for changes to be made to the current project versus those to be addressed at a different time as part of a general process improvement activity
* Measure and predict quality of the software based on the verification results (typically, a software assurance activity)
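
The first two steps above, comparing actual to expected results and documenting each discrepancy individually, can be sketched as follows. This is a minimal illustration; the requirement and status names are assumptions:

```python
def find_discrepancies(expected, actual):
    """Compare expected to actual results and return one discrepancy
    record per mismatch, so each can be tracked to closure individually."""
    discrepancies = []
    for req_id, expected_value in expected.items():
        actual_value = actual.get(req_id)
        if actual_value != expected_value:
            discrepancies.append({
                "requirement": req_id,
                "expected": expected_value,
                "actual": actual_value,
                "status": "open",   # each record starts open and is tracked separately
            })
    return discrepancies

# Hypothetical results for two requirements; SRS-102 fails
issues = find_discrepancies(
    expected={"SRS-101": "pass", "SRS-102": "pass"},
    actual={"SRS-101": "pass", "SRS-102": "fail"},
)
```

Recording one record per mismatch, rather than one per test run, is what makes tracking each issue to closure practical.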

When tracking discrepancies to closure, consider the following activities:

* Capture the changes to be made (change requests, corrective actions, {term:RID}s, etc.)
* Carry out the approved changes to be made
* Follow chosen solution to completion, including any verification activities necessary to confirm the issue has been resolved
* Obtain and record approval for issue close-out
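
The closure activities above amount to moving each discrepancy through a defined life cycle. The sketch below enforces one such sequence; the state names are assumptions, not states mandated by NPR 7150.2:

```python
# Assumed life cycle: open -> change_approved -> fixed -> verified -> closed
ALLOWED_TRANSITIONS = {
    "open": {"change_approved"},            # capture and approve the change to be made
    "change_approved": {"fixed"},           # carry out the approved change
    "fixed": {"verified"},                  # re-run verification to confirm the resolution
    "verified": {"closed"},                 # obtain and record approval for close-out
}

def advance(record, new_state):
    """Move a discrepancy record to new_state, rejecting skipped steps."""
    if new_state not in ALLOWED_TRANSITIONS.get(record["status"], set()):
        raise ValueError(f"cannot move from {record['status']} to {new_state}")
    record["status"] = new_state
    return record

# Walk a hypothetical issue through the full life cycle
issue = {"id": "PR-0007", "status": "open"}
for state in ("change_approved", "fixed", "verified", "closed"):
    advance(issue, state)
```

Rejecting skipped steps ensures, for example, that an issue cannot be closed before verification confirms the resolution.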

Center or project procedures may require additional documentation not described above.



{div3}
{div3:id=tabs-4}

h1. 4. Small Projects

There is currently no guidance specific to small projects for this requirement.
{div3}
{div3:id=tabs-5}

h1. 5. Resources

{report-info:data:SREF0001}



# NASA, [Independent Verification and Validation Technical Framework|http://www.nasa.gov/centers/ivv/pdf/170825main_IVV_09-1_-_Rev_M.pdf], IVV 09-1, 2010
# IEEE Computer Society, ["IEEE Standard for Software Verification and Validation", Chapter 7, IEEE STD 1012-2004, 2004|http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1488512]. This link requires an account on the NASA START (AGCY NTSS) system ([http://standards.nasa.gov]). Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
# [Reference Information for the Software Verification and Validation Process|http://hissa.ncsl.nist.gov/HHRFdata/Artifacts/ITLdoc/234/val-proc.html], NIST Special Publication 500-234, 1996
# [NASA Systems Engineering Handbook|http://education.ksc.nasa.gov/esmdspacegrant/Documents/NASA SP-2007-6105 Rev 1 Final 31Dec2007.pdf], NASA/SP-2007-6105 Rev1, 2007
# [NASA Systems Engineering Processes and Requirements|http://nodis3.gsfc.nasa.gov/displayDir.cfm?t=NPR&c=7123&s=1A] with Change 1, NPR 7123.1A, 2009





{refstable}
\\
{toolstable}
{div3}
{div3:id=tabs-6}

h1. 6. Lessons Learned

No Lessons Learned have currently been identified for this requirement.
{div3}
{tabclose}