- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
| Class | A | B | C | D | E | F |
| --- | --- | --- | --- | --- | --- | --- |
| Applicable? | | | | | | |

Key: ✓ - Applicable | ✗ - Not Applicable
2. Rationale
Configuration audits provide checks to ensure that the planned product is the developed product.
3. Guidance
For software configuration, audits help ensure that configuration items (CIs) have been developed and completed following the documents and requirements that define them. Audits also help ensure that CIs achieve their specified performance and functional characteristics and that all associated operational and support documents are complete and meet their requirements. Finally, audits determine whether all CIs that are supposed to be part of a baseline or release are actually in that baseline or release and are the correct version and revision.
There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). Configuration audits are performed for all releases; however, audits of interim, internal releases may be less formal and rigorous, as defined by the project.
Per NASA/SP-2007-6105, Rev 1, NASA Systems Engineering Handbook 273, the FCA "examines the functional characteristics of the configured product and verifies that the product has met, via test results, the requirements specified in its functional baseline documentation approved at the preliminary design review (PDR) and critical design review (CDR). FCAs will be conducted on both hardware or software configured products and will precede the PCA of the configured product."
The PCA "(also known as a configuration inspection) examines the physical configuration of the configured product and verifies that the product corresponds to the build-to (or code-to) product baseline documentation previously approved at the CDR. PCAs will be conducted on both hardware and software configured products." 273
Audit plans, including goals, schedules, participants, contractor participation, and procedures, are documented in the configuration management (CM) plan (see 5.06 - SCMP - Software Configuration Management Plan).
When planning audits, it is important to remember that audits are samplings, not a look at every record. It is also important to remember that auditors should not have any direct responsibility for the software products they audit.
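The sampling idea can be sketched as follows. The `fraction` and `minimum` values below are arbitrary illustrative defaults, not NASA-prescribed ones; a real project would define its sampling criteria in the CM plan.

```python
import random

def select_audit_sample(records: list[str], fraction: float = 0.1,
                        minimum: int = 5, seed=None) -> list[str]:
    """Pick a random sample of records to audit. Audits sample the
    record set rather than examining every item; the sample size is
    the larger of `minimum` and `fraction` of the population, capped
    at the population size. Values here are illustrative only."""
    rng = random.Random(seed)
    k = min(len(records), max(minimum, round(len(records) * fraction)))
    return rng.sample(records, k)
```

A fixed `seed` makes the selection reproducible, which helps when the audit record must show exactly which items were examined.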
The basic steps in an audit are planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions). The Department of Defense Configuration Management Guidance Handbook 351 includes tables of activities for each of these steps. These tables address both the Government and contractor roles in these activities and can be tailored as applicable for a project.
The SMA (Safety and Mission Assurance) Technical Excellence Program (STEP) Level 2 Software Configuration Management and Data Management course taught by the Westfall Team 343 suggests that the following be included in an FCA:
- "An audit of the formal test documentation against test data.
- "An audit of the verification and validation (V&V) reports.
- "A review of all approved changes.
- "A review of updates to previously delivered documents.
- "A sampling of design review outputs.
- "A comparison of code with documented requirements.
- "A review to ensure all testing was accomplished." 343
- Additional sample testing or rerunning of tests, as appropriate for the project.
The STEP 2 course suggests that the following be included in a PCA 343:
- "An audit of the system specification for completeness [and removal of all to-be-determined (TBD) items].
- "An audit of the FCA report for discrepancies & actions taken.
- "A comparison of the architectural design with the detailed design components for consistency.
- "A review of the module listing for compliance with the approved coding standards.
- "An audit of the manuals for format completeness & conformance to system & functional descriptions." 343
Additional audit topics to consider include:
- As coded, software products reflect their design.
- User documentation complies with standards as specified.
- Activities have been conducted according to applicable requirements, plans, and contracts.
NASA/SP-2007-6105, NASA Systems Engineering Handbook, 273 includes a table showing the data typically reviewed during each of these audits.
Consider the following options when deciding when to conduct audits:
- At the time a product is released.
- Before delivery, to ensure that all delivered products are complete, contain the proper versions and revisions, and that all discrepancies, open work, deviations, and waivers are properly documented and approved; can be an FCA or PCA.
- At the end of a life-cycle phase per Capability Maturity Model Integration (CMMI).
- Before the release of a new or revised baseline.
- As the project progresses, to avoid discovering major issues at the end, when they are more costly to fix, and to identify systemic issues, such as coding standard violations, that could affect large segments of the project.
- Incrementally for very large, complex systems focusing on specific functional areas with a summary audit held to address the status of all identified action items (FCA).
When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found. Findings are grouped as major or minor depending on the range and effect of the non-conformance.
Non-conformances result in corrective actions that address and correct the root cause of the non-conformances. Follow-up needs to be conducted to ensure the corrective actions were completed and are effective.
Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to configuration audits.
NASA-specific configuration management information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
Additional guidance related to configuration audits may be found in related requirements in this Handbook.
4. Small Projects
For projects with limited personnel, consider sharing lead auditors or audit team members among projects. Another suggestion is for members of small projects to conduct configuration audits of other small projects.
5. Resources
5.1 References
- (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. See Chapter 13. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-022) Functional Configuration Audit Form, Form 0010, NASA Marshall Space Flight Center (MSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-044) Physical Configuration Audit Form, Form 0011, NASA Marshall Space Flight Center (MSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-197) Software Processes Across NASA (SPAN) website in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-212) IEEE Computer Society, IEEE Std 1042-1987, 1987. This link requires an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov). Once logged in, users can access Standards Organizations, IEEE, and then search to get to authorized copies of IEEE standards.
- (SWEREF-216) IEEE Std 828-2012, 2012. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-224) ISO/IEC 12207, IEEE Std 12207-2008, 2008. See key section: Stakeholder Requirements Definition Process. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-273) NASA/SP-2016-6105, Rev 2, NASA Systems Engineering Handbook.
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, approved 2022-09-08, superseding NASA-STD-8739.8A.
- (SWEREF-343) This NASA-specific information and resource is available at the System for Administration, Training, and Educational Resources for NASA (SATERN), accessible to NASA users at https://saterninfo.nasa.gov/.
- (SWEREF-351) U.S. Department of Defense (1997), Configuration Management Guidance Handbook. Editor note: Revision A is the referenced version; the URL is the free download page for this document.
- (SWEREF-513) Public Lessons Learned Entry: 641.
5.2 Tools
NASA users can find applicable tools in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
- Mars Climate Orbiter Mishap Investigation Board - Phase I Report. Lesson Number 0641 513: Configuration audits are called out as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap. One of the recommendations states: "Conduct [a] software audit for specification compliance on all data transferred between JPL and [contractor]."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Confirm that the project manager performed software configuration audits to determine the correct version of the software configuration items and to verify that they conform to the records that define them.
7.2 Software Assurance Products
- None at this time
Objective Evidence
- Software Problem reporting or defect tracking data
- Software configuration management system data
- Software assurance audit results of the change management processes.
- Software version description document(s).
7.3 Metrics
- # of Configuration Management Audits conducted by the project – Planned vs. Actual.
- # of Compliance Audits planned vs. # of Compliance Audits performed.
- # of software work product Non-Conformances identified by life-cycle phase over time.
- # of Non-Conformances identified in release documentation (Open, Closed).
- # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project.
- Trends of # Open vs. # Closed over time.
- # of Non-Conformances per audit (including findings from process and compliance audits, process maturity).
- # of Open vs. Closed Audit Non-Conformances over time.
- Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.).
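As an illustration of how such trend metrics might be tallied, the hypothetical sketch below counts open vs. closed non-conformances by the month they were opened; a real project would pull these records from its problem-reporting or defect-tracking system.

```python
from datetime import date

def nonconformance_trend(findings: list[dict]) -> dict[str, dict[str, int]]:
    """Tally open vs. closed non-conformances by month from a list of
    finding records shaped like {"opened": date, "closed": date | None}.
    The record shape is an assumption for illustration only."""
    trend: dict[str, dict[str, int]] = {}
    for f in findings:
        month = f["opened"].strftime("%Y-%m")
        bucket = trend.setdefault(month, {"open": 0, "closed": 0})
        if f.get("closed"):
            bucket["closed"] += 1
        else:
            bucket["open"] += 1
    return trend
```

Plotting these monthly counts over the life cycle gives the open-vs-closed trend called for in the metrics above.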
7.4 Guidance
Software assurance should perform or participate in the project's configuration audits, typically the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). An FCA examines the functional characteristics of the configured product and verifies that the product has met the requirements specified in its Functional Baseline documentation approved at the Preliminary Design Review (PDR) and Critical Design Review (CDR). A PCA is the formal examination of the "as-built" configuration of a configuration item against its technical documentation to establish or verify the configuration item's product baseline. A PCA performed at the point of delivery establishes that all the products in the delivery baseline are actually present and are the correct versions, as documented in the Version Description Document.
Software Assurance personnel should sign off on the delivery documentation to indicate that the delivery is complete and meets its requirements. More detail on the FCA and PCA can be found in the software guidance for this requirement (SWE-084).
Examples of checklists for a functional configuration audit, a physical configuration audit, and a baseline configuration audit are below:
Functional Configuration Audit Checklist
| Asset Number: | Approved by: (signature) |
| --- | --- |
| Effective Date: | Name: |
| Expiration Date: | Title: |
| Responsible Office: | Asset Type: Checklist |
| Title: Functional Configuration Audit Checklist | PAL Number: |
The Functional Configuration Audit (FCA) Checklist is to be used by Configuration Management (CM) personnel when conducting audits for major software builds or final releases. A functional configuration audit verifies that the configuration item (CI) has been completed satisfactorily, that the item has achieved the specified performance and functional characteristics, and that its operational and support documents are complete and satisfactory. During this audit, CM will analyze the project’s Requirements Traceability Matrix, Requirements documentation, Test reports, Problem reports, Change authorizations, and any Waivers or Deviations. For each checklist item below, place a check (✓) in the box if the checklist item is fully satisfied. Place a note in Section 1 below describing any actions performed during the audit to resolve discrepancies and thereby satisfy the criteria. Any criteria that require an action item for future resolution and follow-up should be captured in Section 2, with the issues/non-compliances captured under “Issues and/or Comments”. The completed checklist should be maintained as a record at the location defined in the project’s data management list (DML).

Date of Audit: __________________________________________
Project Build/Release: ___________________________________
Name of Auditor: _________________________________
| # | Checklist Item | ✓ | Issues and/or Comments |
| --- | --- | --- | --- |
| 1 | Are all configuration items (as specified in the “Baselines Table”) available for this deliverable/release? Guidance: This should include such items as the Requirements Specifications, Software Requirements Traceability Matrix, Version Description Documents, software change requests, all test results, and any operational documents or manuals. | | |
| 2 | Have all deviations or waivers to the requirements been considered and approved? | | |
| 3 | Does the deliverable/release incorporate all approved changes? | | |
| 4 | Was system/software testing completed? | | |
| 5 | Were all requirements (functional and performance) verified? | | |
| 6 | For those requirements that have not been verified or failed verification, has the status been captured (e.g., in a VDD)? | | |
| 7 | Has analysis or simulation been conducted for any performance parameters that couldn’t be completely verified during testing? | | |
| 8 | If analysis or simulations were performed, do the results support the requirement verification? | | |
| 9 | Is the software requirements traceability matrix (SRTM) complete and bi-directional (i.e., is there evidence that all system/software requirements can be traced to the design, source code, and test procedures)? | | |
| 10 | Has all associated documentation been placed under CM control per the project’s CM plan? | | |
Section 1: Notes/Actions performed during the audit to resolve any discrepancies to satisfy the criteria

| # | Action |
| --- | --- |

Section 2: Action Items for future follow-up

| # | Action | Assignee | Due Date |
| --- | --- | --- | --- |

Notes: If using a different media to capture action items, please provide the name and location of that asset: ____________________________________________________
Change History

| Version | Approval Date | Description of Improvements |
| --- | --- | --- |
| 1.0 | | Initial approved version by CCB |
| 2.0 | | |