
SWE-080 - Track and Evaluate Changes

1. Requirements

4.1.2 The project shall track and evaluate changes to software products.

1.1 Notes

The project can use a software change request system or a software problem tracking system. The minimum content for software change requests or software problem reports is defined in Chapter 5 [of NASA-STD-7150.2, NASA Software Engineering Requirements, Section 5.2.5, and detailed in SWE-113 (Software Change Request/Problem Report) of this Handbook].

1.2 Applicability Across Classes

Class D Not Safety Critical and Class G are labeled with "P (Center)," meaning that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class        A_SC  A_NSC  B_SC  B_NSC  C_SC  C_NSC  D_SC  D_NSC  E_SC  E_NSC   F     G     H

Applicable?    ✓     ✓     ✓     ✓     ✓     ✓     ✓    P(C)    ✓     ✓     ✓    P(C)    ✓

Key:  ✓ - Applicable | ✗ - Not Applicable | X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures
A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | (similarly for Classes B through E)

2. Rationale

Tracking and evaluating changes are useful for a variety of reasons, not the least of which is to maintain documented descriptions of problems, issues, faults, etc., their impact to the software and system, and their related resolutions. Evaluating changes allows key stakeholders to determine the cost-benefit of implementing changes and to make decisions based on that information.

"Having defect information from previous projects can be a big plus when debugging the next project. ...Recording the defects allows metrics to be determined. One of the easiest ways to judge whether a program is ready for serious safety testing is to measure its defect density, the number of defects per line of code." (NASA-GB-8719.13, NASA Software Safety Guidebook 276)
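The defect-density measure quoted above can be computed directly from recorded defect data. The sketch below is illustrative only; the counts and the per-KSLOC reporting convention are example assumptions, not values from the guidebook.

```python
# Illustrative defect-density calculation, as described in the guidebook
# quote above. The example counts are made up for demonstration.

def defect_density(defect_count: int, lines_of_code: int) -> float:
    """Defects per line of code."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defect_count / lines_of_code

# Hypothetical example: 18 recorded defects in a 12,000-line component.
density = defect_density(18, 12_000)
print(f"{density * 1000:.2f} defects per KSLOC")  # 1.50 defects per KSLOC
```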

3. Guidance

Tracking and evaluating changes occurs throughout the project life cycle and applies to all software providers, internal and subcontracted.

The NASA Systems Engineering Handbook (NASA/SP-2007-6105, Rev 1) 273 provides a flowchart of a "typical" change control process. This flowchart highlights key activities and roles for capturing and tracking changes that are appropriate considerations for any project establishing a new change control process. Several of these steps are addressed in other guidance in this Handbook (see the table of related guidance at the end of this section), including configuration status accounting (CSA).

Guidance for key elements from this flowchart is included below, including preparing the change request, evaluating the change, and tracking the request through the change control process.

Considerations for capturing the change

  • Changes can be requested for baselined software products, including specifications, requirements, design, code, databases, test plans, user documentation, and training materials.
  • Requested changes can include:
      • Discrepancies.
      • Problems or failures.
      • Reconfiguration changes, including routine changes, to operational software.
      • Changes related to upgrades.
      • Enhancement requests.

Capturing the requested change usually involves completing a predefined change request form or problem report and may require access to a change tracking system. A problem reporting/corrective action (PRACA) system is also an option for capturing changes, particularly, after the software is operational (NASA-GB-8719.13, NASA Software Safety Guidebook 276).

Depending on system access and project procedures, requests may be entered by developers, testers, end users, help desk personnel, etc. See SWE-113 in this Handbook for guidance on change requests and problem reports. Consider the following suggestions for the change capture process:

  • Require a separate change request or problem report for each change.
  • Use a form/format that clearly guides the writer through the process of capturing all key information needed to capture the issue and process the request.
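A change request form like the one described above can be modeled as a simple record. This is a hypothetical minimal sketch; the actual required content is defined in SWE-113 and project procedures, and every field name here is an illustrative assumption.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical minimal change-request record. Actual required fields are
# defined in SWE-113 and project procedures, not here.
@dataclass
class ChangeRequest:
    cr_id: str                        # one request per change (see above)
    title: str
    description: str                  # what is wrong / what is requested
    originator: str                   # developer, tester, end user, ...
    affected_products: list[str] = field(default_factory=list)
    safety_critical: bool = False
    date_submitted: date = field(default_factory=date.today)
    status: str = "submitted"

# Example with invented values:
cr = ChangeRequest(
    cr_id="CR-0042",
    title="Telemetry parser drops last packet",
    description="Final packet of each downlink frame is discarded.",
    originator="test team",
    affected_products=["requirements", "code", "unit tests"],
)
print(cr.cr_id, cr.status)  # CR-0042 submitted
```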

Considerations for evaluating the change and suggested solution

  • Project impact analysis.
    • Include an appropriate set of stakeholders, such as procurement, quality assurance, risk management, relevant experts, and management (e.g., a change requested by a high-visibility customer may result in a business decision to implement it even though many end users never see the problem).
    • Evaluate impact to schedule and cost, including making and testing the change and regression testing the software.
    • Evaluate impact to other groups and resources, as applicable.
    • Evaluate impact to functions and features, interfaces, system resource requirements.
    • Evaluate impact to other baseline products, such as design, tests, documentation (traceability matrices are helpful here).
    • Evaluate risk of making the change vs. not making it.
    • Evaluate size, complexity, criticality of the change.
    • Evaluate whether change request is within scope of project.
    • Evaluate whether change request is needed to meet project requirements.
    • Evaluate impact on performance, reliability, quality, etc.
    • Evaluate alternatives to making the change.
  • Software safety impact analysis (NASA-STD-8719.13, NASA Software Safety Standard 271)
    • Include software quality assurance, safety personnel in this review.
    • Look for potential creation of new hazard contributions and impacts.
    • Look for potential modification of existing hazard controls or mitigations.
    • Look for detrimental effect on safety-critical software or hardware.
    • Determine effect on software safety.
    • Determine effect on system safety.
  • Capture evaluation/analysis results and related decisions, including action items.

Impact analysis, including impact to the safety of the system, may be performed by a change control board (CCB) or experts they designate to perform the analysis. See SWE-082 for additional guidance on impact analysis as it relates to authorizing changes.
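The evaluation considerations above can be rolled up into a disposition recommendation for the CCB. The sketch below is purely illustrative; the decision rules, the 30-day threshold, and the function name are invented for the example and do not come from NASA guidance.

```python
# Hypothetical impact-analysis roll-up for a CCB packet. The inputs mirror
# the evaluation considerations above; the rules and threshold are invented.

def impact_summary(cost_days: float, schedule_days: float,
                   safety_impact: bool, in_scope: bool) -> str:
    if not in_scope:
        # Out-of-scope requests are screened out early (see above).
        return "disapprove: out of project scope"
    if safety_impact:
        # Safety-related changes get a software safety impact analysis.
        return "escalate: software safety impact analysis required"
    if cost_days + schedule_days > 30:  # invented threshold
        return "defer: high cost/schedule impact"
    return "recommend approval"

print(impact_summary(cost_days=5, schedule_days=3,
                     safety_impact=False, in_scope=True))
# recommend approval
```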

Considerations for tracking the change

  • Use a change control system that is compatible with the project environment and capable of tracking each change until it is completed.
  • Trace safety-critical problems back to the related system-level hazard.
  • Include in the tracking records the actual change request/problem reports, impact analysis, notes from evaluation/approval boards and meetings, etc.
  • Track the software products and versions changed as part of implementing the change (requirements, code, specifications, etc.).
  • Close change requests only after verification and approval of the implemented change and all associated documentation revisions.

Tracking a change through its disposition (approve, defer, disapprove, etc.) is made easier if the tracking can be done as part of the same system used to capture the change request/problem report. Once disposition decisions are made, the relevant stakeholders are informed of the decisions.
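The disposition flow described above can be sketched as a small state machine. Only the dispositions (approve, defer, disapprove) and the close-after-verification rule come from the text; the state names and transitions below are illustrative assumptions.

```python
# Illustrative change-request lifecycle; state names are invented.
ALLOWED = {
    "submitted":   {"evaluated"},
    "evaluated":   {"approved", "deferred", "disapproved"},
    "approved":    {"implemented"},
    "implemented": {"verified"},
    "verified":    {"closed"},      # close only after verification (see above)
    "deferred":    {"evaluated"},   # deferred requests can be re-evaluated
}

def transition(state: str, new_state: str) -> str:
    """Move a change request to a new state, rejecting illegal jumps."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

# Walk one request through a full approve-and-close path:
state = "submitted"
for step in ("evaluated", "approved", "implemented", "verified", "closed"):
    state = transition(state, step)
print(state)  # closed
```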

Current status of changes is presented at appropriate reviews, including project life cycle reviews. Reviews should also consider historical trends and details of open changes.

Additional Guidance

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to methods, tools, and procedures for tracking and evaluating changes.

Additional guidance related to tracking and evaluating changes to software may be found in the following related requirements in this handbook:

SWE-079  Develop CM Plan
SWE-082  Authorizing Changes
SWE-083  Status Accounting
SWE-103  Software Configuration Management Plan
SWE-113  Software Change Request/Problem Report

4. Small Projects

Projects with limited budgets or personnel could reduce the overhead of tracking and evaluating changes, collecting metrics, etc. by using automated change request tools. Using existing tools can reduce purchase and setup costs for the project and if the tools are familiar to team personnel, training and start-up costs may also be minimized. Some automated tools have multiple capabilities that can provide the team with the means to perform multiple change tracking and evaluation activities with a single tool.

Additionally, a small team size may be conducive to less formal evaluation methods, such as incorporating impact analysis into team meetings rather than holding separate meetings or assigning separate tasks with formal reports due to an evaluation board. Even though small projects may use less formal methods of tracking and evaluating changes, it is still very important to have a record of the changes and associated decisions so the team can have confidence in the final products.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following from the Space Shuttle program directly related to tracking and evaluating software changes:

  • Problem Reporting and Corrective Action System. Lesson Number 0738: "The information provided by Problem Reporting and Corrective Action System (PRACAS) allows areas in possible need of improvement to be highlighted to engineering for development of a corrective action, if deemed necessary. With this system in place in the early phases of a program, means are provided for early elimination of the causes of failures. This contributes to reliability growth and customer satisfaction. The system also allows trending data to be collected for systems that are in place. Trend analysis may show areas in need of design or operational changes." 520

Additionally, the Software Program Managers Network 431 documents the following relevant lesson as one of its configuration management lessons learned following "visits with many different software-intensive development programs in all three Services. It describes problems uncovered ... on several Department of Defense (DoD) software-intensive programs."

  • "The change control board's structure used by software projects is often overly cumbersome. It does not adequately assess, prior to authorizing a change, the impacts of a proposed change or the risk and cost of making these changes."