- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. The Requirement
5.1.3 The project manager shall track and evaluate changes to software products.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Key: - Applicable | - Not Applicable
A & B = Always Safety Critical; C & D = Sometimes Safety Critical; E & F = Never Safety Critical.
2. Rationale
As software teams design, develop, and deploy software, it is common for multiple versions of the same software to be in use at different sites and for the software's developers to be working simultaneously on updates. Bugs or defects are often present only in certain versions, because fixing some problems introduces others as the program develops. Therefore, for the purposes of locating and fixing bugs, it is important to be able to retrieve and run different versions of the software to determine in which version(s) the problem occurs. It may also be necessary to develop two versions of the software concurrently: for instance, one version in which bugs are fixed but no new features are added (a branch), while new features are worked on in the other (the trunk).

Change requests address not only new or changed requirements but also failures and defects in software products. Change requests are analyzed to determine the impact that the change will have on the software product, related software products, the budget, and the schedule. Tracking and evaluating changes is useful for a variety of reasons, not the least of which is to maintain documented descriptions of problems, issues, faults, etc., their impact on the software and system, and their related resolutions. Evaluating changes allows key stakeholders to determine the cost-benefit of implementing changes and to make decisions based on that information.
3. Guidance
The NASA Systems Engineering Handbook 273, NASA/SP-2007-6105, Rev 1, provides a flowchart of a "typical" change control process. This flowchart, shown below, highlights key activities and roles for capturing and tracking changes that are appropriate considerations for any project establishing a new change control process. Several of these steps are addressed in other guidance in this Handbook (see the table of related guidance at the end of this section), including configuration status accounting (CSA).
Guidance for key elements from this flowchart is included below, including preparing the change request, evaluating the change, and tracking the request through the change control process.
Considerations for capturing the change
- Changes can be requested for baselined software products, including specifications, requirements, design, code, databases, test plans, user documentation, and training materials. Reasons for requesting a change include:
  - Problems or failures.
  - Reconfiguration changes, including routine changes, to operational software.
  - Changes related to upgrades.
  - Enhancement requests.
Capturing the requested change usually involves completing a predefined change request form or problem report and may require access to a change tracking system. A problem reporting/corrective action (PRACA) system is also an option for capturing changes, particularly after the software is operational (NASA-GB-8719.13, NASA Software Safety Guidebook 276).
Depending on system access and project procedures, requests may be entered by developers, testers, end users, help desk personnel, etc. See Change Requests/Problem Reports in this Handbook for guidance for change requests and problem reports. Consider the following suggestions for the change capture process:
- Require a separate change request or problem report for each change.
- Use a form/format that clearly guides the writer through the process of capturing all key information needed to capture the issue and process the request.
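A predefined change request form can be modeled as a simple record. The sketch below is illustrative only; the field names are assumptions for this example, not a mandated NASA form or a required set of attributes.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeRequest:
    """One request per change (illustrative fields only), capturing
    the key information needed to evaluate and process the request."""
    cr_id: str                      # unique tracking identifier
    title: str                      # short summary of the change
    description: str                # problem, failure, or enhancement details
    affected_products: list         # e.g., ["requirements", "code", "test plan"]
    originator: str                 # developer, tester, end user, help desk, ...
    submitted: date = field(default_factory=date.today)
    safety_critical: bool = False   # flags the safety impact analysis path
    status: str = "submitted"       # current state in the tracking system

# Example: a tester submits a defect report as a change request.
cr = ChangeRequest(
    cr_id="CR-0042",
    title="Telemetry timestamp rollover",
    description="Timestamps wrap after 49.7 days of uptime.",
    affected_products=["code", "test procedures"],
    originator="tester",
)
```

One record per change keeps each request separately trackable, consistent with the one-request-per-change suggestion above.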
Considerations for evaluating the change and suggested solution
- Project impact analysis:
  - Include an appropriate set of stakeholders, such as procurement, quality assurance, risk management, relevant experts, and management (e.g., a change requested by a high-visibility customer may result in a business decision to implement the change, as opposed to volumes of end users not seeing the problem).
  - Evaluate impact to schedule and cost, including making and testing the change and regression testing the software.
  - Evaluate impact to other groups and resources, as applicable.
  - Evaluate impact to functions and features, interfaces, and system resource requirements.
  - Evaluate impact to other baselined products, such as design, tests, and documentation (traceability matrices are helpful here).
  - Evaluate the risk of making the change vs. not making it.
  - Evaluate the size, complexity, and criticality of the change.
  - Evaluate whether the change request is within the scope of the project.
  - Evaluate whether the change request is needed to meet project requirements.
  - Evaluate impact on performance, reliability, quality, etc.
  - Evaluate alternatives to making the change.
- Software safety impact analysis (NASA-STD-8719.13, NASA Software Safety Standard 271):
  - Include software quality assurance and safety personnel in this review.
  - Look for potential creation of new hazard contributions and impacts.
  - Look for potential modification of existing hazard controls or mitigations.
  - Look for detrimental effects on safety-critical software or hardware.
  - Determine the effect on software safety.
  - Determine the effect on system safety.
- Capture evaluation/analysis results and related decisions, including action items.
Impact analysis, including impact to the safety of the system, may be performed by a change control board (CCB) or experts they designate to perform the analysis. See SWE-082 for additional guidance on impact analysis as it relates to authorizing changes.
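As a rough illustration, the evaluation criteria above could be combined into a first-pass screening function. The thresholds, parameter names, and recommendation strings below are hypothetical; the actual decision always rests with the CCB.

```python
def screen_change(in_scope, needed_for_requirements, schedule_impact_days,
                  safety_impact, risk_of_change, risk_of_no_change):
    """Rough first-pass screening of a change request before CCB review.
    Returns a recommendation string; all thresholds are illustrative."""
    if not in_scope:
        # Out-of-scope requests are screened out early.
        return "reject: out of project scope"
    if safety_impact:
        # Safety impacts route through the software safety impact analysis.
        return "escalate: perform software safety impact analysis"
    if not needed_for_requirements and schedule_impact_days > 10:
        # Pure enhancements with large schedule impact may be deferred.
        return "defer: enhancement with large schedule impact"
    if risk_of_change > risk_of_no_change:
        # Compare the risk of making the change vs. not making it.
        return "evaluate alternatives: change riskier than status quo"
    return "proceed to CCB vote"
```

A function like this only organizes the criteria for discussion; it does not replace the stakeholder review described above.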
Considerations for tracking the change
- Use a change control system that is compatible with the project environment and capable of tracking change until completed.
- Trace safety-critical problems back to the related system-level hazard.
- Include in the tracking records the actual change request/problem reports, impact analysis, notes from evaluation/approval boards and meetings, etc.
- Track the software products and versions changed as part of implementing the change (requirements, code, specifications, etc.).
- Close change requests only after verification and approval of the implemented change and all associated documentation revisions.
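Tracking a change "until completed" implies a defined life cycle with legal state transitions. A minimal sketch of such a life cycle, using illustrative state names based on the investigation/acceptance/implementation/test/closure progression described in this section:

```python
# Allowed state transitions for a change request (illustrative names).
TRANSITIONS = {
    "submitted":      {"investigation"},
    "investigation":  {"accepted", "rejected"},
    "accepted":       {"implementation"},
    "implementation": {"test"},
    "test":           {"closed", "implementation"},  # failed test reopens work
    "rejected":       set(),   # terminal state
    "closed":         set(),   # terminal state
}

def advance(current, new):
    """Move a change request to a new state, enforcing the life cycle.
    A request can reach "closed" only after implementation and test,
    which mirrors closing requests only after verification."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new
```

Enforcing transitions like this in the change control system prevents, for example, a request from being closed before the implemented change has been tested.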
When tracking and evaluating changes to software products, also consider this activity as part of data management activities. A basic description of data management is provided in SWE-079.
The current status of changes is presented at appropriate reviews, including project life-cycle reviews. Reviews of historical trends and details on open changes are also considered.
NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to methods, tools, and procedures for tracking and evaluating changes.
NASA-specific configuration management planning information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
Additional guidance related to tracking and evaluating changes to software may be found in the following related requirements in this Handbook:
4. Small Projects
Projects with limited budgets or personnel could reduce the overhead of tracking and evaluating changes, collecting metrics, etc., by using automated change request tools. Using existing tools can reduce purchase and setup costs for the project and if the tools are familiar to team personnel, training and start-up costs may also be minimized. Some automated tools have multiple capabilities that can provide the team with the means to perform multiple change tracking and evaluation activities with a single tool.
Additionally, a small team size may be conducive to less formal evaluation methods, such as incorporating impact analysis into team meetings rather than holding separate meetings or assigning separate tasks with formal reports due to an evaluation board. Even though small projects may use less formal methods of tracking and evaluating changes, it is still very important to have a record of the changes and associated decisions so the team can have confidence in the final products.
5. Resources
- STEP Level 2 Software Configuration Management and Data Management course, SMA-SA-WBT-204, SATERN (a user account is needed to access SATERN courses). This NASA-specific information and resource is available at the System for Administration, Training, and Educational Resources for NASA (SATERN), accessible to NASA users at https://saterninfo.nasa.gov/.
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following from the Space Shuttle program directly related to tracking and evaluating software changes:
- Problem Reporting and Corrective Action System. Lesson Number 0738: "The information provided by Problem Reporting and Corrective Action System (PRACAS) allows areas in possible need of improvement to be highlighted to engineering for development of a corrective action, if deemed necessary. With this system in place in the early phases of a program, means are provided for early elimination of the causes of failures. This contributes to reliability growth and customer satisfaction. The system also allows trending data to be collected for systems that are in place. Trend analysis may show areas in need of design or operational changes." 520
- Software Requirements Management. Lesson Number 3377: "The ability to manage and trace software requirements is critical to achieve success in any software project and to produce software products in a cost effective and timely fashion. Conversely, incomplete, incorrect, or changing software requirements result in cost and schedule impacts that increase the later they occur or are discovered in the software life cycle. Current software technology, processes, and tools provide innovative, automated methods to facilitate optimum management of software requirements." 576
Additionally, the Software Program Managers Network 431 documents the following relevant lesson as one of its configuration management lessons learned following "visits with many different software-intensive development programs in all three Services. It describes problems uncovered ... on several Department of Defense (DoD) software-intensive programs."
- "The change control board's structure used by software projects is often overly cumbersome. It does not adequately assess, prior to authorizing a change, the impacts of a proposed change or the risk and cost of making these changes."
7. Software Assurance
7.1 Tasking for Software Assurance
1. Analyze proposed changes to software products for impacts, particularly to safety and security.
2. Confirm:
a. that the project tracks the changes,
b. that the changes are approved and documented before implementation,
c. that the implementation of changes is complete, and
d. that the project tests the changes.
3. Confirm software changes are made following the software change control process.
7.2 Software Assurance Products
- Evidence that SA has concurred or signed-off on approved software changes.
7.3 Metrics
- None at this time.
7.4 Guidance
- Software assurance should analyze all proposed changes for impacts, looking closely at any impacts the change may have on any of the software related to safety or security. The analysis should also consider whether there will be any impacts on existing interfaces or on the use of any COTS, GOTS, MOTS, or reused software in the system, and whether the change will impact any future maintenance effort. Any identified risks should be brought up in the CCB meeting to discuss approval/rejection of the change.
- That the project tracks the changes
Software assurance will check to see that any changes that are submitted are properly documented and tracked through all the states of resolution (investigation, acceptance/rejection, implementation, test, closure) in the project tracking system.
- That the changes are approved and documented before implementation
Software assurance should track the changes from their submission to their closure or rejection, confirming that all changes follow the change management process the project has established. Initially, the change is documented and submitted to the authorizing CCB for consideration. The authorizing CCB (which should include a software assurance person) will evaluate any changes for impacts. Some considerations:
- Is the change an error correction or a new requirement?
- Will the change fix the problem without major changes to other areas?
- If major changes to other areas are needed, are they specified and is this change really necessary?
- If the change is a requirements change, has the new requirement been approved?
- How much effort will be required to implement the change?
- If there is an impact to safety or reliability, are there additional changes that need to be made in those areas? Note: If there is a conflict between safety and security, safety changes have priority.
When all the impacts are considered, the CCB votes on acceptance/rejection. Software assurance is a voting member of the CCB. Software assurance verifies that the decision is recorded and is acceptable, defined as:
- When the resolution is to “accept as is”, verify that the impact of that resolution on quality, safety, reliability and security is compatible with the Project’s risk posture and is compliant with NPR 7150.2 and other Center and Agency requirements for risk.
- When the resolution is a change to the SW, the change will sufficiently address the problem and will not impact quality, safety, reliability, security, and compliance with NPR 7150.2; the change will not introduce new, or exacerbate other, discrepancies or problems.
- In either case, verify that other instances of the same kind of discrepancy/problem have been sought out and, if detected, addressed accordingly.
- Verify that appropriate software severity levels are assigned and maintained.
- Assure any risk associated with the change is added to the Project/facility risk management system and is addressed, as needed, in safety, reliability, or other risk systems.
- That the implementation of the changes is complete
Software assurance will check to see that the implementation of the approved changes has actually been coded per the change request, and that any associated documentation changes are submitted, approved, and made as needed (i.e., updates to requirements, design, test plans/procedures, etc.).
- That the project tests the changes
Software assurance will check to see that the project tests any code that has changed and runs a set of regression tests to verify that the change has not caused problems anywhere else in the software system. If the software is safety critical, a full set of regression tests should be run to ensure that there was no impact to the safety-critical functions.
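The regression scope described above can be sketched as a selection rule: the changed modules plus their dependents in the normal case, and the full suite when the software is safety critical. The module names, dependency map, and test-naming convention below are hypothetical, and this policy is an illustration rather than a mandated NASA rule.

```python
def select_regression_tests(changed_modules, dependents, all_tests, safety_critical):
    """Pick regression tests after a change (illustrative policy only).
    Safety-critical software gets the full suite; otherwise the changed
    modules plus their direct dependents. Tests are named "module::test"."""
    if safety_critical:
        # Safety-critical software: run everything.
        return set(all_tests)
    affected = set(changed_modules)
    for mod in changed_modules:
        # Pull in modules that depend on the changed ones.
        affected |= dependents.get(mod, set())
    return {t for t in all_tests if t.split("::")[0] in affected}

# Hypothetical example: a change to "nav" also affects dependent "gnc".
all_tests = {"nav::t1", "gnc::t2", "ui::t3"}
deps = {"nav": {"gnc"}}
```

In practice the dependency map comes from traceability data or build tooling; the point is that the regression set widens with safety criticality.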
3. Confirm software changes are made following the software change control process
Software assurance will check that the software change control process has been followed throughout the handling of the submitted change and that the status of the change is recorded and confirmed as closed.