5.1.3 The project manager shall track and evaluate changes to software products.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
As software teams design, develop, and deploy software, it is common for multiple versions of the same software to be in use at different sites and for the software's developers to be working simultaneously on updates. Bugs or defects are often present only in certain versions (because of the fixing of some problems and the introduction of others as the program evolves). Therefore, for the purposes of locating and fixing bugs, it is important to be able to retrieve and run different versions of the software to determine in which version(s) the problem occurs. It may also be necessary to develop two versions of the software concurrently: for instance, one version in which bugs are fixed but no new features are added (a branch), while new features are developed in the other (the trunk). Change requests address not only new or changed requirements but also failures and defects in software products. Change requests are analyzed to determine the impact that the change will have on the software product, related software products, the budget, and the schedule. Tracking and evaluating changes is useful for a variety of reasons, not the least of which is maintaining documented descriptions of problems, issues, faults, etc., their impact to the software and system, and their related resolutions. Evaluating changes allows key stakeholders to determine the cost-benefit of implementing changes and to make decisions based on that information.
"Having defect information from previous projects can be a big plus when debugging the next project. ...Recording the defects allows metrics to be determined. One of the easiest ways to judge whether a program is ready for serious safety testing is to measure its defect density---the number of defects per line of code." (NASA-GB-8719.13, NASA Software Safety Guidebook).
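The defect-density measure quoted above is a simple ratio and can be computed directly. A minimal sketch in Python; the per-1,000-lines (KSLOC) normalization is a common convention, not something mandated by the guidebook:

```python
def defect_density(defect_count: int, lines_of_code: int, per: int = 1000) -> float:
    """Return defects per `per` lines of code (per KSLOC by default)."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defect_count * per / lines_of_code

# Example: 12 recorded defects in 8,000 lines of code
print(defect_density(12, 8000))  # 1.5 defects per KSLOC
```

Comparing this value across builds gives a rough readiness signal before serious safety testing, as the guidebook suggests.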
Tracking and evaluating changes occurs throughout the project life cycle and applies to all software providers, internal and subcontracted.
The NASA Systems Engineering Handbook, NASA/SP-2007-6105, Rev1, provides a flowchart of a "typical" change control process. This flowchart, shown below, highlights key activities and roles for capturing and tracking changes that are appropriate considerations for any project establishing a new change control process. Several of these steps are addressed in other guidance in this Handbook (see the table of related guidance at the end of this section), including configuration status accounting (CSA).
Guidance for key elements from this flowchart is included below, including preparing the change request, evaluating the change, and tracking the request through the change control process.
Considerations for capturing the change
Changes can be requested for baselined software products including specifications, requirements, design, code, databases, test plans, user documentation, training materials, etc.
Problems or failures.
Reconfiguration changes, including routine changes, to operational software.
Changes related to upgrades.
Capturing the requested change usually involves completing a predefined change request form or problem report and may require access to a change tracking system. A problem reporting/corrective action (PRACA) system is also an option for capturing changes, particularly after the software is operational (NASA-GB-8719.13, NASA Software Safety Guidebook).
Depending on system access and project procedures, requests may be entered by developers, testers, end users, help desk personnel, etc. See Change Requests/Problem Reports in this Handbook for guidance for change requests and problem reports. Consider the following suggestions for the change capture process:
Require a separate change request or problem report for each change.
Use a form/format that clearly guides the writer through capturing all key information needed to describe the issue and process the request.
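A predefined form of this kind can be thought of as a structured record. The sketch below models one in Python; the field names are illustrative assumptions only, since the actual form and its required fields are defined by each project's procedures:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeRequest:
    """Illustrative change request/problem report record (fields are hypothetical)."""
    cr_id: str                  # one separate request per change
    title: str
    description: str            # problem symptoms or requested change
    originator: str             # developer, tester, end user, help desk, etc.
    product: str                # baselined product affected (code, spec, test plan, ...)
    version: str                # product version in which the issue was observed
    safety_critical: bool = False
    date_submitted: date = field(default_factory=date.today)
    status: str = "Submitted"

cr = ChangeRequest(
    cr_id="CR-042",
    title="Telemetry timestamp rollover",
    description="Timestamps wrap after extended continuous operation",
    originator="test team",
    product="flight software",
    version="3.2.1",
    safety_critical=True,
)
print(cr.status)  # Submitted
```

Recording the product and version up front supports the version-specific debugging discussed earlier, and the safety flag supports the safety impact analysis described below.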
Considerations for evaluating the change and suggested solution
Project impact analysis.
Include an appropriate set of stakeholders, such as procurement, quality assurance, risk management, relevant experts, and management (e.g., a change requested by a high-visibility customer may result in a business decision to implement it, whereas a problem that few end users ever encounter may not), etc.
Evaluate impact to schedule and cost, including making and testing the change and regression testing the software.
Evaluate impact to other groups and resources, as applicable.
Evaluate impact to functions and features, interfaces, system resource requirements.
Evaluate impact to other baseline products, such as design, tests, documentation (traceability matrices are helpful here).
Evaluate risk of making the change vs. not making it.
Evaluate size, complexity, criticality of the change.
Evaluate whether change request is within scope of project.
Evaluate whether change request is needed to meet project requirements.
Evaluate impact on performance, reliability, quality, etc.
Evaluate alternatives to making the change.
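The evaluation items above amount to a checklist that each change request should pass through. A minimal sketch, with a hypothetical set of checklist keys (the actual questions and any scoring scheme are project-defined):

```python
# Hypothetical checklist derived from the evaluation items above.
IMPACT_QUESTIONS = [
    "schedule_and_cost",              # incl. making, testing, regression testing
    "other_groups_and_resources",
    "functions_features_interfaces",
    "other_baseline_products",        # design, tests, documentation
    "risk_of_change_vs_no_change",
    "size_complexity_criticality",
    "within_project_scope",
    "needed_for_requirements",
    "performance_reliability_quality",
    "alternatives_considered",
]

def unevaluated_items(findings: dict) -> list:
    """Return checklist items not yet addressed in the recorded findings."""
    return [q for q in IMPACT_QUESTIONS if q not in findings]

findings = {"schedule_and_cost": "2 weeks effort incl. regression testing"}
remaining = unevaluated_items(findings)
print(len(remaining))  # 9 items still to evaluate
```

Capturing findings per item produces the documented evaluation results called for later in this section and makes gaps in the analysis visible before the board makes a disposition decision.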
Software safety impact analysis (NASA-STD-8719.13, NASA Software Safety Standard).
Include software quality assurance, safety personnel in this review.
Look for potential creation of new hazard contributions and impacts.
Look for potential modification of existing hazard controls or mitigations.
Look for detrimental effect on safety-critical software or hardware.
Determine effect on software safety.
Determine effect on system safety.
Capture evaluation/analysis results and related decisions, including action items.
Impact analysis, including impact to the safety of the system, may be performed by a change control board (CCB) or experts they designate to perform the analysis. See SWE-082 for additional guidance on impact analysis as it relates to authorizing changes.
Considerations for tracking the change
Use a change control system that is compatible with the project environment and capable of tracking a change until it is completed.
Trace safety-critical problems back to the related system-level hazard.
Include in the tracking records the actual change request/problem reports, impact analysis, notes from evaluation/approval boards and meetings, etc.
Track the software products and versions changed as part of implementing the change (requirements, code, specifications, etc.).
Close change requests only after verification and approval of the implemented change and all associated documentation revisions.
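The tracking and closure rules above can be modeled as a simple state machine. The states and transitions below are hypothetical examples; actual dispositions and workflow come from each project's change control procedure:

```python
# Hypothetical change request workflow; note that "Closed" is reachable only
# after the implemented change has been verified, per the closure rule above.
TRANSITIONS = {
    "Submitted":     {"In Evaluation"},
    "In Evaluation": {"Approved", "Deferred", "Disapproved"},
    "Approved":      {"Implemented"},
    "Implemented":   {"Verified"},
    "Verified":      {"Closed"},
}

def advance(current: str, new: str) -> str:
    """Move a change request to a new state, rejecting illegal transitions."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new

state = "Submitted"
for nxt in ["In Evaluation", "Approved", "Implemented", "Verified", "Closed"]:
    state = advance(state, nxt)
print(state)  # Closed
```

Encoding the workflow this way makes it impossible to close a request that skipped verification, which is the intent of the closure rule above.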
Tracking a change through its disposition (approve, defer, disapprove, etc.) is made easier if the tracking can be done as part of the same system used to capture the change request/problem report. Once disposition decisions are made, the relevant stakeholders are informed of the decisions.
When tracking and evaluating changes to software products, also consider this activity as part of data management activities. A basic description of data management is provided in SWE-079.
The current status of changes is presented at appropriate reviews, including project life-cycle reviews. Historical trends and details on open changes are also candidates for inclusion in those reviews.
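The change status reported at such reviews can be summarized with a simple tally. A sketch using illustrative status values (the actual status vocabulary is set by the project's change control system):

```python
from collections import Counter

# Hypothetical snapshot of change request statuses for a review package.
statuses = ["Open", "Open", "In Evaluation", "Approved", "Closed", "Closed", "Closed"]

summary = Counter(statuses)                      # counts per status
open_count = sum(n for s, n in summary.items() if s != "Closed")
print(f"Open changes: {open_count} of {len(statuses)}")  # Open changes: 4 of 7
```

Snapshots like this, taken over time, provide the historical trend data that reviews also consider.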
NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to methods, tools, and procedures for tracking and evaluating changes.
NASA-specific configuration management planning information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
Additional guidance related to tracking and evaluating changes to software may be found in the following related requirements in this Handbook:
Projects with limited budgets or personnel could reduce the overhead of tracking and evaluating changes, collecting metrics, etc., by using automated change request tools. Using existing tools can reduce purchase and setup costs for the project and if the tools are familiar to team personnel, training and start-up costs may also be minimized. Some automated tools have multiple capabilities that can provide the team with the means to perform multiple change tracking and evaluation activities with a single tool.
Additionally, a small team size may be conducive to less formal evaluation methods, such as incorporating impact analysis into team meetings rather than holding separate meetings or assigning separate tasks with formal reports due to an evaluation board. Even though small projects may use less formal methods of tracking and evaluating changes, it is still very important to have a record of the changes and associated decisions so the team can have confidence in the final products.
6. Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following from the Space Shuttle program directly related to tracking and evaluating software changes:
Problem Reporting and Corrective Action System. Lesson Number 0738: "The information provided by Problem Reporting and Corrective Action System (PRACAS) allows areas in possible need of improvement to be highlighted to engineering for development of a corrective action, if deemed necessary. With this system in place in the early phases of a program, means are provided for early elimination of the causes of failures. This contributes to reliability growth and customer satisfaction. The system also allows trending data to be collected for systems that are in place. Trend analysis may show areas in need of design or operational changes."
Software Requirements Management. Lesson Number 3377: "The ability to manage and trace software requirements is critical to achieve success in any software project and to produce software products in a cost effective and timely fashion. Conversely, incomplete, incorrect, or changing software requirements result in cost and schedule impacts that increase the later they occur or are discovered in the software life cycle. Current software technology, processes, and tools provide innovative, automated methods to facilitate optimum management of software requirements."
Additionally, the Software Program Managers Network documents the following relevant lesson as one of its configuration management lessons learned following "visits with many different software-intensive development programs in all three Services. It describes problems uncovered ... on several Department of Defense (DoD) software-intensive programs."
"The change control board's structure used by software projects is often overly cumbersome. It does not adequately assess, prior to authorizing a change, the impacts of a proposed change or the risk and cost of making these changes."