SWE-069 - Document Defects & Track

1. Requirements

3.4.5 The project shall document defects identified during testing and track to closure.

1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Implementation Notes from Appendix D

NPR 7150.2 does not include any notes for this requirement.

1.3 Applicability Across Classes

Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |  G   |  H
Applicable? |      |       |      |       |      |       |      | P(C)  |      |       |     | P(C) |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | - Applicable | - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures


2. Rationale

Documenting software defects identified during testing provides a basis for improving the software product, preventing future occurrences of the same issues, and capturing data used to measure software quality.  This includes software reliability which is based on defect data and the level of fault and failure detection (NASA Software Assurance Standard, NASA-STD-8739.8).  The NASA Software Assurance Standard includes the following possible effects of software defects, which serve as good reasons for capturing and tracking defects to closure:

  • adversely impact attainment of mission objectives
  • cause operational problems
  • cause rework
  • affect the productivity of users
  • impact system safety
  • lead to loss of functionality

Other reasons for defect tracking include:

  • Give management visibility into past and current software problems (NASA Software Safety Standard, NASA-STD-8719.13B)
  • Record defects for future reference (NASA Software Safety Guidebook, NASA-GB-8719.13)
  • Provide input to metrics determination (e.g., defect density)
  • Identify process tendencies and mitigate weaknesses
  • Provide a basis for learning and avoiding repeated mistakes
  • Build historical data that allows:
    o   Better estimates for completing software changes
    o   Better estimates for unit testing effort/time
    o   Identification of whether more time is needed for unit testing (due to high defects/LOC)
    o   Better time estimates for development phases in future projects
    o   Estimation of time for future change requests
    o   Identification of where testing resources are best used (focus on finding defects earlier in the lifecycle)
    o   Determination of the benefit of development process changes

3. Guidance

While defects are documented during the developer's unit tests as well as during formal testing, the formality of the documentation during local unit test is considerably less. When formally documenting software defects and tracking them to closure, the following fundamental tasks should be performed, regardless of the tool used [5]:

  • Capture information
    o   Create guidelines that establish the minimum information to be captured, ensuring that adequate, vital information is captured about the defect
    o   Consider:
      §  Title: A simple, clear title that makes searching for the defect easy
      §  Summary: A clearly written paragraph or two describing the problem
      §  System configuration: Exact configuration of the system when and where the defect was found (not all defects are found on all system configurations)
      §  Steps to reproduce: Precise, detailed explanation of how to reproduce the defect
      §  Expected results: A description of what was expected, how the system should work; use diagrams or screen shots when necessary
      §  Notes: Information not captured elsewhere, but important to reach a resolution; if not captured elsewhere, contact information for the person who found the defect
    o   Consider the following to allow report generation from the defect data:
      §  Date reported
      §  User information (name, contact information, etc.)
      §  Organizational data (department that found the defect)
      §  Other clarifying data
  • Ensure reproduction
    o   To confirm that the defect is fixed, it must be reproducible
    o   Exceptions exist, but confirmation of the correction can be time-consuming if the problem is not readily reproducible
  • Prioritize defects
    o   Valuable resources must be allocated to correct defects, so prioritizing them is important to ensure the best and most beneficial use of those resources
    o   Consider using priority definitions and "rules" appropriate for your project, such as:
      §  High: major impact on customer; must be fixed immediately
      §  Medium: major impact on customer; should be fixed before release or a patch issued
      §  Low: minor impact on customer; should be fixed if there is time, but can be deferred
    o   See the Resources section of this guidance for additional information on prioritizing defects
  • Use constructive communication
    o   Keep channels of communication open between those who enter problem reports and those assigned to correct those defects
    o   Constructive communication can be essential to problem resolution
  • Use a tool
    o   A defect-tracking tool provides a common approach to defect tracking and facilitates vital information capture
    o   A defect-tracking tool with a good interface can simplify information capture; in addition, users are more likely to use the tool if a minimal feature set is implemented that focuses on the project's defect tracking goals and objectives
    o   A defect-tracking tool allows for automated metric generation
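The capture guidelines above can be sketched as a simple defect record. The following Python sketch is illustrative only; the field names are assumptions for this example, not names mandated by NPR 7150.2 or any particular tool:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DefectReport:
    """One problem report; fields mirror the capture guidelines above."""
    title: str                      # simple, clear title that makes searching easy
    summary: str                    # a paragraph or two describing the problem
    system_configuration: str       # exact configuration where the defect was found
    steps_to_reproduce: list[str]   # precise steps to reproduce the defect
    expected_results: str           # how the system should have behaved
    notes: str = ""                 # anything important not captured elsewhere
    reporter: str = ""              # contact information of the person who found it
    organization: str = ""          # department that found the defect
    date_reported: date = field(default_factory=date.today)

# Hypothetical example record:
report = DefectReport(
    title="Telemetry display freezes on packet loss",
    summary="The ground display stops updating when a telemetry packet is dropped.",
    system_configuration="Display build 2.1, simulator feed",
    steps_to_reproduce=["Start display", "Drop one packet in the simulator"],
    expected_results="Display flags the gap and keeps updating",
    reporter="jdoe@example.gov",
)
print(report.title)
```

Keeping the record this small reflects the guidance above: users are more likely to use a tool whose minimal feature set focuses on the project's defect tracking goals.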
The NASA Software Safety Standard recommends the following:

  • Capture defects in tools as well as in the software product
  • Review problem reports for safety impacts
  • Include in the problem report:
    o   Description of the problem
    o   Recommended solutions
    o   Final disposition of the problem
    o   The system-level hazard, if the problem is a safety-critical software problem
  • Use a Change Control Board (CCB) to review and disposition (reject or approve) problem reports
  • Consider capturing safety concerns in a Problem Reporting and Corrective Action (PRACA) system

The NASA Software Safety Guidebook recommends the following:

  • Use a configuration management (CM), problem reporting, or defect management tool for ordered recording and tracking to closure
  • Review problems for safety impact, including review of the corrective action or software update to ensure no additional hazard or adverse impact to safety
  • Include in the problem report:
    o   Description of the unexpected behavior
    o   Analysis performed to determine the cause
    o   What was done to correct the problem
  • When tracking to closure:
    o   Verify the problem is fixed
    o   Ensure the fix does not have a negative effect on other parts of the system


Other practices to consider include [5]:

  • Avoid duplicating defects
  • Use a defect tracking workflow (e.g., verify defect, allocate resources to fix, verify fix, release fix)
  • Integrate defect tracking tool with configuration management
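A defect tracking workflow like the one named above can be expressed as a small state machine. This is a minimal sketch under assumed state names; real projects should use the states defined in their own defect tracking procedure:

```python
# Hypothetical workflow states and allowed transitions; the names are
# illustrative, not taken from NPR 7150.2 or any specific tool.
TRANSITIONS = {
    "reported": {"verified", "rejected"},  # triage: confirm the defect is real
    "verified": {"assigned"},              # allocate resources to fix it
    "assigned": {"fixed"},                 # developer delivers a correction
    "fixed":    {"closed", "assigned"},    # verify the fix; reopen if it fails
    "closed":   set(),
    "rejected": set(),
}

def advance(state: str, new_state: str) -> str:
    """Move a defect to new_state, enforcing the workflow above."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

# Walk one defect through the happy path: verify, fix, verify fix, close.
state = "reported"
for step in ("verified", "assigned", "fixed", "closed"):
    state = advance(state, step)
print(state)  # closed
```

Encoding the workflow this way makes the "verify fix before close" rule mechanical: a defect cannot jump from "reported" to "closed" without passing through verification and fix states.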

Problem reports can provide key data for trending software quality metrics.  To ensure meaningful metrics can be reported, problem reports should contain information such as that shown below.  Specific metrics should be based on the policies and procedures at each NASA Center or the needs of a particular project.

The team should clearly define options for each category so users apply them consistently and to allow accurate metrics collection.

  • Defect types (error in logic, design, computation, data handling, functional specification, etc.)
  • Defect location (subsystem, module, database, document, etc.)
  • Defect priority/criticality (Level 1, Level 2, Level 3; high, medium, low; severe, major, minor; show stopper, enhancement, emergency)
  • Lifecycle phase (requirements, design, unit test, integration test, system test, acceptance test, etc.)
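Once each report carries consistent category values, metrics such as defect density fall out directly. A minimal sketch, assuming hypothetical report records and a known code size (the category values here are examples from the lists above):

```python
from collections import Counter

# Hypothetical problem-report records with consistently defined categories.
reports = [
    {"priority": "high",   "phase": "unit test",   "location": "module"},
    {"priority": "low",    "phase": "system test", "location": "document"},
    {"priority": "medium", "phase": "unit test",   "location": "module"},
]
ksloc = 12.0  # assumed size: thousands of source lines of code under test

# Defect density: defects found per thousand lines of code.
defect_density = len(reports) / ksloc

# Breakdowns by category support trending and resource decisions.
by_priority = Counter(r["priority"] for r in reports)
by_phase = Counter(r["phase"] for r in reports)

print(f"density={defect_density:.2f}/KSLOC, by_priority={dict(by_priority)}")
```

Counts by lifecycle phase, for example, show whether defects are being caught early (unit test) or late (system and acceptance test), which is one way to decide where testing resources are best used.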

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to documenting and tracking software defects.

Additional guidance related to software testing and related defects may be found in the following related requirements in this handbook:

SWE-062: Unit Test

SWE-068: Evaluate Test Results

SWE-104: Software Test Plan

SWE-118: Software Test Report



4. Small Projects

The Goddard Space Flight Center (GSFC) Excel-based tool which can be found in the Tools Table located in this Electronic Handbook can be used to manage problem reports and generate related metrics analyses. It is particularly targeted to small projects that may not want to use a larger, more complex, or expensive tool. Using this tool to manage problem reports will meet all requirements for problem reporting -- both the information stored for each problem and the summary metrics used to assess the overall software quality. The User's Guide for this tool is found on the first tab in the spreadsheet, and describes both the initial set-up process and how the tool is used throughout a project.

Ames Research Center (ARC) has successfully used free Open Source tools that can be customized or used "out of the box".  These tools are viable options for projects with limited budgets.


5. Resources

  1. NASA Technical Standard, "NASA Software Assurance Standard", NASA-STD-8739.8, 2004.
  2. NASA Technical Standard, "NASA Software Safety Standard", NASA-STD-8719.13B, 2004.
  3. NASA Technical Standard, "NASA Software Safety Guidebook", NASA-GB-8719.13, 2004.
  4. Goddard Space Flight Center, "Recommended Approach to Software Development", Revision 3, Software Engineering Laboratory Series, SEL-81-305, 1992.
  5. Seapine Software, "Best Practices for Effective Defect Tracking", 2008.
  6. StickyMinds.com, "Categorizing Defects by Eliminating 'Severity' and 'Priority'".  Accessed June 2011.
  7. wikiHow, "How to Categorize Defects".  Accessed June 2011.
  8. Scaffadaffa.com, "Software Defects: Severity, Priority, and Impact".  Accessed June 2011.

5.1 Tools


Tools relative to this SWE may be found in the table above. If no tools are listed, none have been currently identified for this SWE. You may wish to reference table XYZ in this handbook for an evolving list of these and other tools in use at NASA.  Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool.  Check with your Center to see what tools are available to facilitate compliance with this requirement.



6. Lessons Learned

There are currently no Lessons Learned listed for this requirement.
