- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
Applicability across Classes A through F is defined in the applicability matrix for this requirement in NPR 7150.2 (key: Applicable / Not Applicable).
2. Rationale
Configuration audits provide checks to ensure that the planned product is the developed product.
3. Guidance
3.1 Configuration Audits
For software configuration, audits help ensure that configuration items (CIs) have been developed and completed following the documents and requirements that define them. Audits also help ensure that CIs achieve their performance and functional characteristics goals and that all associated operational and support documents are complete and meet their requirements. Audits also determine whether all CIs that are supposed to be part of the baseline or release are actually in the baseline or release and are the correct version and revision.
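As an illustration of the version-and-revision check described above, the following is a minimal sketch that compares a delivered release directory against a hypothetical release manifest (for example, a Version Description Document exported as JSON). The manifest format, field names, file paths, and use of SHA-256 hashes are assumptions for illustration only, not a prescribed NASA format.

```python
"""Minimal sketch of a baseline-contents check, assuming a hypothetical
release manifest (e.g., a Version Description Document exported as JSON)
that lists each configuration item with its expected version and SHA-256
hash. File names, fields, and paths are illustrative only."""

import hashlib
import json
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def audit_release(manifest_file: Path, release_dir: Path) -> list[str]:
    """Compare the delivered release directory against the manifest.

    Reports items that are missing, unexpected, or whose contents do not
    match the recorded hash (i.e., wrong version or revision).
    """
    manifest = json.loads(manifest_file.read_text())
    findings = []

    expected = {item["path"]: item for item in manifest["configuration_items"]}
    delivered = {p.relative_to(release_dir).as_posix(): p
                 for p in release_dir.rglob("*") if p.is_file()}

    for rel_path, item in expected.items():
        if rel_path not in delivered:
            findings.append(f"MISSING: {rel_path} (expected v{item['version']})")
        elif sha256_of(delivered[rel_path]) != item["sha256"]:
            findings.append(f"MISMATCH: {rel_path} does not match the recorded "
                            f"v{item['version']} hash")

    for rel_path in delivered:
        if rel_path not in expected:
            findings.append(f"UNEXPECTED: {rel_path} is not in the manifest")

    return findings


if __name__ == "__main__":
    for finding in audit_release(Path("release_manifest.json"), Path("release/")):
        print(finding)
```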
There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). Configuration audits are performed for all releases; however, audits of interim, internal releases may be less formal and rigorous, as defined by the project.
FCA
"examines the functional characteristics of the configured product and verifies that the product has met, via test results, the requirements specified in its functional baseline documentation approved at the preliminary design review (PDR) and critical design review (CDR). FCAs will be conducted on both hardware or software configured products and will precede the PCA of the configured product."
PCA
"(also known as a configuration inspection) examines the physical configuration of the configured product and verifies that the product corresponds to the build-to (or code-to) product baseline documentation previously approved at the CDR. PCAs will be conducted on both hardware and software configured
products."
3.2 Planning For Audits
Audit plans, including goals, schedules, participants, contractor participation, and procedures, are documented in the configuration management (CM) plan (see 5.06 - SCMP - Software Configuration Management Plan).
When planning audits, it is important to remember that audits are samplings, not a look at every record. It is also important to remember that auditors should not have any direct responsibility for the software products they audit.
The basic steps in an audit are planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions).
The Department of Defense Configuration Management Guidance Handbook 351 includes tables of activities for each of these steps. These tables address both the Government and contractor roles in these activities and can be tailored as applicable for a project.
3.3 Functional Configuration Audit (FCA)
The SMA (Safety and Mission Assurance) Technical Excellence Program (STEP) Level 2 Software Configuration Management and Data Management course taught by the Westfall Team 343 suggests that the following be included in an FCA:
- "An audit of the formal test documentation against test data.
- "An audit of the verification and validation (V&V) reports.
- "A review of all approved changes.
- "A review of updates to previously delivered documents.
- "A sampling of design review outputs.
- "A comparison of code with documented requirements.
- "A review to ensure all testing was accomplished." 343
- Additional sample testing or rerunning of tests, as appropriate for the project.
A checklist template for the FCA can be found in PAT-001 - FCA Checklist.
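The comparison of code with documented requirements and the review to ensure all testing was accomplished can be partially supported by simple tooling. The following is a minimal sketch, assuming hypothetical CSV exports of the project's requirements list and test results; the file names and column headings are illustrative assumptions, not a NASA or project-mandated format.

```python
"""Minimal sketch of one FCA-style check: confirming that every documented
requirement is covered by at least one passing test. The CSV layouts and
column names below are illustrative assumptions."""

import csv
from collections import defaultdict
from pathlib import Path


def load_requirements(path: Path) -> set[str]:
    """Read requirement IDs from a CSV with a 'req_id' column."""
    with path.open(newline="") as f:
        return {row["req_id"] for row in csv.DictReader(f)}


def load_test_results(path: Path) -> dict[str, list[str]]:
    """Read results from a CSV with 'req_id', 'test_id', and 'status' columns."""
    coverage = defaultdict(list)
    with path.open(newline="") as f:
        for row in csv.DictReader(f):
            if row["status"].strip().lower() == "pass":
                coverage[row["req_id"]].append(row["test_id"])
    return coverage


def fca_coverage_report(req_file: Path, results_file: Path) -> None:
    requirements = load_requirements(req_file)
    coverage = load_test_results(results_file)

    untested = sorted(requirements - coverage.keys())
    orphans = sorted(set(coverage) - requirements)

    print(f"{len(requirements) - len(untested)}/{len(requirements)} "
          f"requirements verified by passing tests")
    for req in untested:
        print(f"FINDING: {req} has no passing test on record")
    for req in orphans:
        print(f"NOTE: test results reference unknown requirement {req}")


if __name__ == "__main__":
    fca_coverage_report(Path("requirements.csv"), Path("test_results.csv"))
```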
3.4 Physical Configuration Audit (PCA)
The STEP 2 course suggests that the following be included in a PCA 343:
- "An audit of the system specification for completeness [and removal of all to-be-determined (TBD)].
- "An audit of the FCA report for discrepancies & actions taken.
- "A comparison of the architectural design with the detailed design components for consistency.
- "A review of the module listing for compliance with the approved coding standards.
- "An audit of the manuals for format completeness & conformance to system & functional descriptions." 343
Additional audit topics to consider include:
- The as-coded software products reflect their design.
- User documentation complies with standards as specified.
- Activities have been conducted according to applicable requirements, plans, and contracts.
A checklist template for the PCA can be found in PAT-002 - PCA Checklist.
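One PCA topic above, confirming that the as-built product matches its design documentation, can be spot-checked with lightweight scripting. The following is a minimal sketch that compares the delivered module listing against a hypothetical plain-text design listing; the file layout and the use of Python source files are assumptions for illustration only.

```python
"""Minimal sketch of one PCA-style check: comparing the as-built module
listing against the component list recorded in the detailed design. The
design record format (one module name per line) is an assumption."""

from pathlib import Path


def as_built_modules(src_dir: Path) -> set[str]:
    """Collect the names of delivered source modules (Python files here)."""
    return {p.stem for p in src_dir.rglob("*.py") if p.name != "__init__.py"}


def designed_modules(design_listing: Path) -> set[str]:
    """Read the module names recorded in the detailed design listing."""
    return {line.strip() for line in design_listing.read_text().splitlines()
            if line.strip() and not line.startswith("#")}


def pca_module_check(src_dir: Path, design_listing: Path) -> None:
    built = as_built_modules(src_dir)
    designed = designed_modules(design_listing)

    for name in sorted(designed - built):
        print(f"FINDING: module '{name}' is in the design but not delivered")
    for name in sorted(built - designed):
        print(f"FINDING: module '{name}' is delivered but not in the design record")
    if built == designed:
        print("As-built module listing matches the detailed design record.")


if __name__ == "__main__":
    pca_module_check(Path("src/"), Path("design_module_listing.txt"))
```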
3.5 Data Typically Reviewed
The NASA Systems Engineering Handbook (NASA SP-2016-6105 Rev2) 273 includes a table showing the data typically reviewed during each of these audits.
3.6 When Should Audits Be Done?
Consider the following options when deciding when to conduct audits:
- At the time a product is released.
- Before delivery, to assure that all delivered products are complete, contain the proper versions and revisions, and that all discrepancies, open work, deviations, and waivers are properly documented and approved; can be an FCA or PCA.
- At the end of a life cycle phase per Capability Maturity Model Integration (CMMI).
- Before the release of a new or revised baseline.
- As the project progresses, to prevent finding major issues at the end when they are more costly to fix, and to identify systemic issues, such as coding standard violations that could affect large segments of the project.
- Incrementally for very large, complex systems, focusing on specific functional areas, with a summary audit held to address the status of all identified action items (FCA). See also SWE-079 - Develop CM Plan, SWE-083 - Status Accounting, SWE-085 - Release Management.
When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found. Findings are grouped as major or minor depending on the range and effect of the non-conformance.
Non-conformances result in corrective actions that address and correct the root cause of the non-conformances. Follow-up needs to be conducted to ensure the corrective actions were completed and are effective.
3.7 Additional Guidance
Additional guidance related to this requirement may be found in related materials in this Handbook.
3.8 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
For projects with limited personnel, consider sharing lead auditors or audit team members among projects. Another suggestion is for members of small projects to conduct configuration audits of other small projects.
5. Resources
5.1 References
- (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. See Chapter 13. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-022) Functional Configuration Audit Form, Form 0010, NASA Marshall Space Flight Center (MSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-044) Physical Configuration Audit Form, Form 0011, NASA Marshall Space Flight Center (MSFC), 2010. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of Processes, Procedures, Job Aids, Examples, and other recommended best practices.
- (SWEREF-212) IEEE Computer Society, IEEE STD 1042-1987, 1987. This link requires an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov ). Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
- (SWEREF-216) IEEE STD 828-2012, 2012. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-224) ISO/IEC 12207, IEEE Std 12207-2008, 2008. See Key section: Stakeholder Requirements Definition Process. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-273) NASA SP-2016-6105 Rev2, NASA Systems Engineering Handbook.
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, Approved 2022-09-08, superseding NASA-STD-8739.8A.
- (SWEREF-343) This NASA-specific information and resource is available at the System for Administration, Training, and Educational Resources for NASA (SATERN), accessible to NASA users at https://saterninfo.nasa.gov/.
- (SWEREF-351) U.S. Department of Defense (1997), Configuration Management Guidance Handbook. Editor Note: Revision A is the reference. This URL is the download page for this document. The download is free.
- (SWEREF-513) Public Lessons Learned Entry: 641.
5.2 Tools
NASA users can find a list of tools in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
5.3 Process Asset Templates
- PAT-001 - FCA Checklist: SWE-084, tab 7. For use on all release candidate audits.
- PAT-002 - PCA Checklist: SWE-084, tab 7. For use on all release candidate audits.
6. Lessons Learned
6.1 NASA Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
- Mars Climate Orbiter Mishap Investigation Board - Phase I Report. Lesson Number 0641 513: Configuration audits are called out as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap. One of the Recommendations states to "Conduct [a] software audit for specification compliance on all data transferred between JPL and [contractor]."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Confirm that the project manager performed software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.
7.2 Software Assurance Products
- None at this time
Objective Evidence
- Software Problem reporting or defect tracking data
- Software configuration management system data
- Software assurance audit results of the change management processes.
- Software version description document(s).
7.3 Metrics
- # of Configuration Management Audits conducted by the project – Planned vs. Actual.
- # of Compliance Audits planned vs. # of Compliance Audits performed.
- # of software work product Non-Conformances identified by life cycle phase over time.
- # of Non-Conformances identified in release documentation (Open, Closed).
- # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project.
- Trends of # Open vs. # Closed over time.
- # of Non-Conformances per audit (including findings from process and compliance audits, process maturity).
- # of Open vs. Closed Audit Non-Conformances over time.
- Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.).
See also Topic 8.18 - SA Suggested Metrics.
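Several of the metrics above are simple counts trended over time. The following is a minimal sketch of the open vs. closed non-conformance trend, using hypothetical in-memory records; an actual project would pull these records from its problem-reporting or audit-tracking tool.

```python
"""Minimal sketch of the open-vs-closed non-conformance trend metric,
assuming a simple in-memory record format for illustration."""

from datetime import date

# Hypothetical non-conformance records: (id, date opened, date closed or None)
NONCONFORMANCES = [
    ("NC-01", date(2024, 1, 10), date(2024, 2, 5)),
    ("NC-02", date(2024, 1, 22), None),
    ("NC-03", date(2024, 2, 14), date(2024, 3, 1)),
]


def open_vs_closed(as_of: date) -> tuple[int, int]:
    """Count non-conformances still open and already closed as of a date."""
    opened = [nc for nc in NONCONFORMANCES if nc[1] <= as_of]
    closed = [nc for nc in opened if nc[2] is not None and nc[2] <= as_of]
    return len(opened) - len(closed), len(closed)


# Trend the counts month by month to watch for a growing open backlog.
for month in (date(2024, 1, 31), date(2024, 2, 29), date(2024, 3, 31)):
    still_open, closed = open_vs_closed(month)
    print(f"{month}: {still_open} open / {closed} closed")
```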
7.4 Guidance
Software assurance should perform or participate in the project's configuration audits, typically the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). A Functional Configuration Audit (FCA) examines the functional characteristics of the configured product and verifies that the product has met the requirements specified in its Functional Baseline documentation approved at the Preliminary Design Review (PDR) and Critical Design Review (CDR). A Physical Configuration Audit (PCA) is the formal examination of the "as-built" configuration of a configuration item against its technical documentation to establish or verify the configuration item's product baseline. A PCA done at the point of delivery establishes that all the products expected in the delivery baseline are actually in the delivery baseline and that they are the correct versions, as documented in the Version Description Document.
Software Assurance personnel should sign off on the delivery documentation to indicate that the delivery is complete and meets its requirements. More detail on the FCA and PCA is found in the software guidance for this requirement (SWE-084).
Examples of checklists for a functional configuration audit and a physical configuration audit are provided below; downloadable copies are available as PAT-001 and PAT-002 (see the Process Asset Templates in Section 5.3).
Functional Configuration Audit Checklist PAT-001
Physical Configuration Audit Checklist PAT-002
7.5 Additional Guidance
Additional guidance related to this requirement may be found in related materials in this Handbook.