4.5.6 The project manager shall record defects identified during testing and track to closure.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Recording software defects identified during testing provides a basis for improving the software product, preventing future occurrences of the same issues, and capturing data used to measure software quality.
The purpose of this requirement is for the project to capture and record all defects identified during testing and to track them to closure.
NASA-STD-8739.8, NASA Software Assurance Standard,
includes the following possible effects of software defects, which serve as good reasons for capturing and tracking defects to closure:
Adverse impact to attainment of mission objectives.
Adverse impact to user productivity.
Impact to system safety.
Loss of functionality.
Other reasons for defect tracking include:
Give management visibility into past and current software problems (NASA-STD-8719.13, NASA Software Safety Standard).
Record defects for future reference (NASA-GB-8719.13, NASA Software Safety Guidebook).
Input to metrics determination (e.g., defect density).
Identify process tendencies and mitigate weaknesses.
Basis of learning and avoidance of repeated mistakes.
Historical data allows:
Better estimates for completing software changes.
Better estimates for unit testing effort/time.
Identify whether more time is needed for unit testing (due to a high ratio of defects to lines of code [LOC]).
Better time estimates for development phases in future projects.
Estimation of time for future change requests.
Identify where testing resources are best used (focus on finding defects earlier in the life cycle).
Determine benefit of development process changes.
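As a minimal sketch of using historical data for estimation, the snippet below averages past fix effort to produce a naive estimate for a future change request. The records and the `estimate_fix_hours` helper are hypothetical, not part of any NASA tool or standard.

```python
from statistics import mean

# Hypothetical historical records from past projects: (defect_id, hours_to_fix).
history = [("D-101", 4.0), ("D-102", 6.5), ("D-103", 5.5), ("D-104", 8.0)]

def estimate_fix_hours(records):
    """Average past fix effort as a naive estimate for a new change request."""
    return mean(hours for _, hours in records)

print(estimate_fix_hours(history))  # 6.0
```

A real estimate would weight records by similarity to the new change (size, subsystem, phase), but even a simple average is only possible if defects and their fix effort were recorded.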
The word "record" is used in this requirement because defects identified during testing can be captured in a database system or a problem reporting tool. The final disposition of all identified defects should be recorded in some level of software documentation or database.
When recording software defects and tracking them to closure, the following fundamental tasks are performed, regardless of the tool used:
Create guidelines that establish the minimum information to be captured, ensuring that adequate, vital information is captured about the defect.
Consider the following:
Title: A simple clear title that makes searching for the defect easy.
Summary: A clearly written paragraph or two describing the problem.
System configuration: Exact configuration of the system when and where the defect was found (not all defects are found on all system configurations).
Steps to reproduce: Precise, detailed explanation of how to reproduce the defect.
Expected results: A description of what was expected, how the system should work; use diagrams or screen shots when necessary.
Notes: Information not captured elsewhere, but important to reach a resolution; if not captured elsewhere, contact information for the person who found the defect.
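The minimum information fields above can be sketched as a simple record type. The `DefectReport` class and its field names below are illustrative assumptions, not a prescribed schema; projects define their own guidelines.

```python
from dataclasses import dataclass, field

# Hypothetical minimal defect record covering the fields suggested above.
@dataclass
class DefectReport:
    title: str                  # simple, clear title that makes searching easy
    summary: str                # paragraph describing the problem
    system_configuration: str   # exact configuration when/where the defect was found
    steps_to_reproduce: list    # precise steps to reproduce the defect
    expected_results: str       # how the system should have behaved
    notes: str = ""             # anything else needed to reach a resolution
    reporter_contact: str = ""  # who found the defect, if not captured elsewhere

report = DefectReport(
    title="Telemetry display freezes on mode switch",
    summary="Switching from standby to active mode freezes the telemetry pane.",
    system_configuration="GroundSW v2.3, RHEL 8, dual-monitor console",
    steps_to_reproduce=["Start in standby mode", "Select Active from the mode menu"],
    expected_results="Telemetry pane continues updating after the mode switch.",
)
```

Making the last two fields optional keeps the entry cost low, which encourages testers to actually file reports.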
Consider the following to allow report generation from the defect data:
User information (name, contact information, etc.).
Organizational data (department that found the defect).
Other clarifying data.
To confirm that a defect is fixed, it must be reproducible.
Exceptions exist, but confirming a correction can be time-consuming if the problem is not readily reproducible.
Valuable resources must be allocated to correct defects, so prioritizing them is important to ensure the best and most beneficial use of those resources.
Consider using priority definitions and "rules" appropriate for your project such as:
High: Major impact on customer; must be fixed immediately.
Medium: Major impact on customer; should be fixed before release or a patch issued.
Low: Minor impact on customer; should be fixed if there is time, but can be deferred.
See the Resources section of this guidance for additional information for prioritizing defects.
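The priority "rules" above can be expressed as a small triage helper. The two criteria used here (major customer impact, blocking operations) are illustrative assumptions; real rules are project-specific.

```python
# Hypothetical encoding of the example priority definitions above.
PRIORITY_RULES = {
    "High": "Major customer impact; fix immediately.",
    "Medium": "Major customer impact; fix before release or issue a patch.",
    "Low": "Minor customer impact; fix if time allows, else defer.",
}

def triage(customer_impact_major: bool, blocks_operations: bool) -> str:
    """Assign a priority from two example criteria; tune rules per project."""
    if customer_impact_major and blocks_operations:
        return "High"
    if customer_impact_major:
        return "Medium"
    return "Low"

print(triage(customer_impact_major=True, blocks_operations=False))  # Medium
```

Encoding the rules in one place helps different reviewers assign priorities consistently, which matters when priorities drive resource allocation.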
Use constructive communication.
Keep channels of communication open between those who enter problem reports and those assigned to correct those defects.
Constructive communication can be essential to problem resolution.
Use a tool.
A defect-tracking tool provides a common approach to defect tracking and facilitates vital information capture.
A defect-tracking tool with a good interface can simplify information capture; in addition, users are more likely to use the tool if a minimal feature set is implemented that focuses on the project's defect tracking goals and objectives.
A defect tracking tool allows for automated metric generation.
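As a sketch of the kind of automated metric generation a tracking tool enables, the snippet below summarizes exported defect records by status. The record format is a hypothetical export, not any specific tool's schema.

```python
from collections import Counter

# Hypothetical records exported from a defect-tracking tool: (id, status, priority).
records = [
    ("D-1", "open", "High"),
    ("D-2", "closed", "Low"),
    ("D-3", "open", "Medium"),
    ("D-4", "closed", "High"),
]

def status_counts(recs):
    """Count defects per status; a tracking tool can emit such summaries automatically."""
    return Counter(status for _, status, _ in recs)

print(status_counts(records))  # Counter({'open': 2, 'closed': 2})
```

The same grouping over priority, defect type, or phase-found yields most of the trend metrics discussed later in this guidance.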
NASA-STD-8719.13, NASA Software Safety Standard,
recommends the following:
Capture defects found in tools as well as in the software product.
Review problem reports for safety impacts.
Include in the problem report:
Description of the problem.
Final disposition of the problem.
The system-level hazard, if the problem is a safety-critical software problem.
Use a Change Control Board (CCB) to review and disposition (reject or approve) problem reports.
Consider capturing safety concerns in a Problem Reporting and Corrective Action (PRACA) system.
NASA-GB-8719.13, NASA Software Safety Guidebook,
recommends the following:
Use a configuration management (CM), problem reporting, or defect management tool for orderly recording and tracking to closure.
Review problems for safety impact, including review of corrective action or software update to ensure no additional hazard or adverse impact to safety.
Include in the problem report:
Description of unexpected behavior.
Analysis performed to determine the cause.
What was done to correct the problem.
When tracking to closure:
Verify problem fixed.
Ensure fix does not have negative effect on other parts of the system.
Other practices to consider include:
Avoid duplicating defects.
Use a defect tracking workflow (e.g., verify defect, allocate resources to fix, verify fix, release fix).
Integrate defect tracking tool with CM.
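The example workflow above (verify defect, allocate resources to fix, verify fix, release fix) can be sketched as a small state machine that rejects illegal transitions. The state names and transition table are assumptions for illustration.

```python
# Hypothetical workflow states and allowed transitions, following the
# example: verify defect -> allocate resources -> fix -> verify fix -> release.
TRANSITIONS = {
    "reported": {"verified", "duplicate"},  # screening also catches duplicates
    "verified": {"assigned"},               # allocate resources to fix
    "assigned": {"fixed"},
    "fixed": {"fix_verified", "assigned"},  # failed verification bounces back
    "fix_verified": {"released"},           # release fix and close
}

def advance(state: str, new_state: str) -> str:
    """Move a defect to a new state only along an allowed transition."""
    if new_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state
```

Enforcing transitions in the tool (rather than by convention) is what makes "track to closure" auditable: every defect must pass through verification before release.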
Problem reports can provide key data for trending software quality metrics. To ensure meaningful metrics can be reported, problem reports should contain information such as that shown below. Specific metrics are based on the policies and procedures at each NASA Center or the needs of a particular project.
The team needs to clearly define options for each category so users apply them consistently and to allow accurate metrics collection.
Defect types (error in logic, design, computation, data handling, functional specification, etc.).
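As a minimal sketch of one metric mentioned earlier, defect density, the function below computes defects per thousand source lines of code (KSLOC). The function name and the use of KSLOC as the unit are assumptions; thresholds for "acceptable" density are project-specific.

```python
# Hypothetical defect-density computation: defects per thousand lines of code.
def defect_density(defect_count: int, lines_of_code: int) -> float:
    """Defects per KSLOC; higher values may signal a need for more unit testing."""
    return defect_count / (lines_of_code / 1000)

print(defect_density(12, 8000))  # 1.5 defects per KSLOC
```

Consistent category definitions (as noted above) are what make such metrics comparable across builds and across projects.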
The Goddard Space Flight Center (GSFC) Excel-based tool, which can be found in the Tools Table located in this Electronic Handbook, can be used to manage problem reports and generate related metrics analyses. It is particularly targeted to small projects that may not want to use a larger, more complex, or expensive tool. Using this tool to manage problem reports will meet all requirements for problem reporting: both the information stored for each problem and the summary metrics used to assess overall software quality. The User's Guide for this tool is found on the first tab in the spreadsheet and describes both the initial set-up process and how the tool is used throughout a project.
Ames Research Center (ARC) has successfully used free Open Source tools that can be customized or used "out of the box." These tools are viable options for projects with limited budgets.
6. Lessons Learned
No Lessons Learned have currently been identified for this requirement.