
SWE-030 - Verification Results

1. Requirements

2.4.3 The project shall record, address, and track to closure the results of software verification activities.

1.1 Notes">1.1 Notes

NPR 7150.2A does not include any notes for this requirement.

1.2 Applicability Across Classes

Class G is labeled "P(Center)". This means that an approved Center-defined process that meets a non-empty subset of the full requirement may be used to satisfy this requirement.

Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G    | H
Applicable? |      |       |      |       |      |       |      |       |      |       |   | P(C) |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... |
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures


2. Rationale

Simply performing verification is only one of the elements necessary to ensure the software work product meets the specified requirements. The software development team must capture the results of verification activities and document the findings to prove the verification was conducted correctly. All identified issues should be analyzed to determine their causes. The team should then identify associated resolutions, and track to closure the work required to implement those resolutions. Successful closure assures that all requirements are satisfied (including the development and resolution of waiver requests, if needed).

Where resolution involves revising the verification process, environment, or related elements, tracking those changes helps identify and document process improvements. Implementing these improvements yields quality gains and more accurate verification results in the future.


3. Guidance

The basic verification process is shown in the figure below, with the steps addressed by this requirement highlighted:

[Figure: the basic verification process]

The following list provides examples of information typically captured in the verification results, the analysis results, or the post-analysis documentation of those results; a minimal record sketch follows the list. Center procedures may call for additional items, and additional information may be needed to fully document a result or identified issue. Note that problem tracking or corrective action tools may capture some of this information automatically.

  • A full identification of the system and the software, including, as applicable, identification number(s), title(s), abbreviation(s), version number(s), and release number(s)
  • An assessment of the manner in which the verification environment may be different from the operational environment and the effect of this difference on the results
  • The results of the verification activity, including all verified requirements. The summary should include information on the requirements evaluated (and, if tested, the success of the test) and references to any problems or deviations
  • An overall assessment of the software performance and quality as demonstrated by the results in this report
  • A list of any deficiencies, limitations, or constraints that were detected; problem or change reports may be used to provide deficiency information (e.g., a description of the impact on software and system performance; the impact a correction would have on the software, system design, and associated documentation; and recommendations for correcting the deficiency, limitation, or constraint)
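As one way to make these items concrete, the sketch below models a verification results record as a simple data structure. It is a minimal illustration only: the field names and types are assumptions made for this example, not fields mandated by NPR 7150.2A or by any particular Center procedure.

    # Minimal sketch of a verification results record covering the items
    # listed above. All field names are hypothetical; adapt them to Center
    # procedures and to whatever tracking tool the project already uses.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VerificationResult:
        # Full identification of the system and software under verification
        system_id: str
        software_title: str
        version: str
        release: str
        # Differences between the verification and operational environments,
        # and their assessed effect on the results
        environment_differences: List[str] = field(default_factory=list)
        # Requirements evaluated, the outcome, and references to any
        # problem reports or deviations
        requirements_verified: List[str] = field(default_factory=list)
        outcome: str = "pass"  # e.g., "pass", "fail", "deviation"
        problem_report_ids: List[str] = field(default_factory=list)
        # Overall assessment plus any deficiencies, limitations, or constraints
        overall_assessment: str = ""
        deficiencies: List[str] = field(default_factory=list)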

The results, and the verification activities and verified products that generated them, can be recorded in the following ways, as appropriate for the method used:

  • Verification reports
  • Review meeting minutes / reports
  • Review Item Dispositions (RIDs)
  • As-run test logs
  • Demonstration results
  • Analysis reports
  • User group reports
  • Peer review reports
  • Peer review analysis results
  • Issue tracking system
  • Change requests / change request system

The Tools section lists several examples of templates for this type of document. See [SWE-028] and the Note in [SWE-102] for information on what verification includes. See [SWE-118] for information on software test reporting. See [SWE-031] for related information on reporting validation results that may be helpful here.

When analyzing verification results to determine whether they support a conclusion that the software satisfies the specified requirements, consider the following steps (a brief sketch of the first two steps follows the list):

  • Compare actual to expected results
  • Identify discrepancies or mismatches in specification or behavior
    • Document discrepancies individually for ease of tracking through the resolution process
    • Determine cause of the issue, including problems with the verification methods, criteria, or environment
  • Identify the changes required to address discrepancies
  • Evaluate and record the impact of changes needed to correct issues / discrepancies
    • Plan for any repeat of verification effort (See [SWE-028])
  • Obtain and record approval for changes to be made to the current project versus those to be addressed at a different time as part of a general process improvement activity
  • Measure and predict quality of the software based on the verification results (typically, a software assurance activity)
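The comparison and discrepancy-documentation steps above can be illustrated with a short sketch. The function and record below are assumptions made for this example, not part of any NASA tool or standard; real projects would typically rely on their existing test and issue tracking systems.

    # Hypothetical sketch of comparing actual to expected results and
    # documenting each discrepancy individually for tracking to closure.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Discrepancy:
        requirement_id: str
        expected: str
        actual: str
        # Suspected cause: verification method, criteria, environment,
        # or the software product itself
        suspected_cause: str = "unclassified"

    def compare_results(expected_by_req: Dict[str, str],
                        actual_by_req: Dict[str, str]) -> List[Discrepancy]:
        """Return one Discrepancy per requirement whose actual result
        does not match its expected result."""
        discrepancies = []
        for req_id, expected in expected_by_req.items():
            actual = actual_by_req.get(req_id, "<no result recorded>")
            if actual != expected:
                discrepancies.append(Discrepancy(req_id, expected, actual))
        return discrepancies

    # Example: the mismatch on SRS-102 is captured as its own record.
    issues = compare_results(
        {"SRS-101": "latency <= 50 ms", "SRS-102": "checksum valid"},
        {"SRS-101": "latency <= 50 ms", "SRS-102": "checksum invalid"},
    )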

When tracking discrepancies to closure, consider the following activities (a simple lifecycle sketch follows the list):

  • Capture the changes to be made (change requests, corrective actions, RIDs, etc.)
  • Carry out the approved changes
  • Follow the chosen solution to completion, including any verification activities necessary to confirm the issue has been resolved
  • Obtain and record approval for issue close-out
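One simple way to enforce these closure activities is to treat each discrepancy as moving through a fixed lifecycle, so that an issue cannot be closed before the fix has been re-verified and close-out approval recorded. The states and transitions below are assumptions for illustration, not a NASA-mandated workflow.

    # Illustrative closure lifecycle: an issue must be re-verified and then
    # approved for close-out before it can reach "closed". States and
    # transitions are assumptions for this sketch.
    ALLOWED_TRANSITIONS = {
        "captured":    {"approved"},     # change request / RID recorded
        "approved":    {"implemented"},  # approved change carried out
        "implemented": {"reverified"},   # verification repeated to confirm the fix
        "reverified":  {"closed"},       # close-out approval obtained and recorded
    }

    def advance(state: str, next_state: str) -> str:
        """Move an issue to next_state, rejecting any out-of-order transition."""
        if next_state not in ALLOWED_TRANSITIONS.get(state, set()):
            raise ValueError(f"cannot move issue from {state!r} to {next_state!r}")
        return next_state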

Center or project procedures may require additional documentation not described above.


4. Small Projects

There is no information applicable to this section.


5. Resources

  1. NASA, Independent Verification and Validation Technical Framework, IVV 09-1, 2010
  2. IEEE Computer Society, "IEEE Standard for Software Verification and Validation", Chapter 7, IEEE STD 1012-2004, 2004. This link requires an account on the NASA START (AGCY NTSS) system (http://standards.nasa.gov). Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
  3. Reference Information for the Software Verification and Validation Process, NIST Special Publication 500-234, 1996
  4. NASA Systems Engineering Handbook, NASA/SP-2007-6105 Rev1, 2007
  5. NASA Systems Engineering Processes and Requirements with Change 1, NPR 7123.1A, 2009

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.


6. Lessons Learned

No lessons learned have currently been identified for this Guidance.
