SWE-084 - Configuration Audits

1. Requirements

4.1.6 The project shall ensure that software configuration audits are performed to determine the correct version of the configuration items and verify that they conform to the documents that define them.

1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E, Safety-Critical, are labeled "SO." This means that this requirement applies only to the safety-critical aspects of the software.

Class G is labeled "P(Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to satisfy this requirement.

Key:    A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X = Applicable with details, read above for more | P(C) = P(Center), follow Center requirements or procedures


2. Rationale

An audit is a planned, independent, and documented assessment performed to verify compliance with agreed-upon requirements. 

For software configuration, audits help ensure that configuration items have been developed and completed in accordance with the documents that define them.  Audits also help ensure that configuration items achieve their specified performance and functional characteristics and that all associated operational and support documents are complete and meet their requirements.  Finally, audits determine whether all configuration items that are supposed to be part of a baseline or release are actually present and are the correct version and revision.

Audits provide an independent check to ensure that the planned product is the developed product.

Configuration audits allow developers to "provide notice that contractual obligations are nearing completion, and to provide sufficient evidence for the clients or user organization to accept the product and initiate the transition into operational usage." (IEEE Guide to Software Configuration Management [3])


3. Guidance

There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). 

Per the NASA Software Assurance Standard, NASA-STD-8739.8 [1], the FCA is defined as an audit "conducted to verify that the development of a configuration item has been completed satisfactorily, that the item has achieved the performance and functional characteristics specified in the functional or allocated configuration identification, and that its operational and support documents are complete and satisfactory."

The PCA is defined as an audit "conducted to verify that one or more configuration items, as built, conform to the technical documentation that defines it. [Based on IEEE 610.12, IEEE Standard Glossary of Software Engineering Terminology]"

Audit plans, including goals, schedules, participants, contractor participation, and procedures should be documented in the configuration management plan (see SWE-103).

When planning audits, it is important to remember that audits are samplings, not a look at every record.  It is also important to remember that auditors should not have any direct responsibility for the software products they audit.

The basic steps in an audit are:

  • Audit planning
  • Audit preparation
  • Audit performance
  • Audit close-out (generating the audit report and addressing corrective actions)

The Department of Defense Configuration Management Guidance Handbook [5] includes tables of activities for audit planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions).  These tables address both the government and contractor roles in these activities and can be tailored as applicable for a project.

The STEP Level 2 Software Configuration Management and Data Management course taught by the Westfall Team [2] suggests that the following be included in an FCA:

  • "An audit of the formal test documentation against test data
  • An audit of the V&V reports
  • A review of all approved changes
  • A review of updates to previously delivered documents
  • A sampling of design review outputs
  • A comparison of code with documented requirements
  • A review to ensure all testing was accomplished" [2]
  • Additional sample testing or rerunning of tests, as appropriate for the project

The STEP 2 course suggests that the following be included in a PCA [2]:

  • "An audit of the system specification for completeness [and removal of all TBDs]
  • An audit of the FCA report for discrepancies & actions taken
  • A comparison of the architectural design with the detailed design components for consistency
  • A review of the module listing for compliance with the approved coding standards
  • An audit of the manuals for format completeness & conformance to system & functional descriptions" [2]

Additional audit topics to consider include:

  • As coded, software products reflect their design
  • User documentation complies with standards as specified
  • Activities have been conducted according to applicable requirements, plans, and contract

The NASA Systems Engineering Handbook, NASA/SP-2007-6105, Rev1 [6], includes a table showing the data typically reviewed during each of these audits.

Consider the following options when deciding when to conduct audits:

  • At the time a product is released
  • Prior to delivery to assure that all delivered products are complete, contain the proper versions and revisions, and that all discrepancies, open work, and deviations and waivers are properly documented and approved; can be FCA or PCA
  • At the end of a lifecycle phase per CMMI
  • Prior to the release of a new or revised baseline
  • As the project progresses, to avoid discovering major issues at the end, when they are more costly to fix, and to identify systemic issues (such as not meeting coding standards) that could affect large segments of the project
  • Incrementally for very large, complex systems focusing on specific functional areas with a summary audit held to address status of all identified action items (FCA)
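The delivery-time checks listed above (complete contents, proper versions and revisions) lend themselves to simple automation. The sketch below is hypothetical: the manifest format, configuration item names, and version strings are invented for illustration and are not drawn from any NASA standard.

```python
# Hypothetical sketch: comparing a baselined manifest of configuration items
# (item name -> expected version) against what was actually delivered, and
# reporting every discrepancy. Names and versions are invented examples.

def audit_baseline(expected: dict[str, str], delivered: dict[str, str]) -> list[str]:
    """Return a list of discrepancies between the baseline and the delivery."""
    findings = []
    for item, version in expected.items():
        if item not in delivered:
            findings.append(f"MISSING: {item} (expected version {version})")
        elif delivered[item] != version:
            findings.append(
                f"WRONG VERSION: {item} expected {version}, found {delivered[item]}")
    for item in delivered:
        if item not in expected:
            findings.append(f"UNEXPECTED: {item} is not in the baseline")
    return findings

baseline = {"fsw_core": "2.1", "cmd_handler": "1.4"}
delivery = {"fsw_core": "2.1", "cmd_handler": "1.3", "debug_tool": "0.9"}
for finding in audit_baseline(baseline, delivery):
    print(finding)
```

In practice the "versions" compared would come from the configuration management system (tags, checksums, or document revision letters), but the principle is the same: every discrepancy becomes an audit finding.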

When preparing for an audit, it is important to gain an understanding of what is supposed to be there, develop relevant questions, determine what to sample, determine who to interview, and plan other steps that will make the audit proceed more smoothly once it gets started. Checklists (essentially a set of prepared questions) and prepared interview questions are important to the success of an audit and help ensure all audit criteria are covered.  The NASA Configuration Management (CM) Standard [8] includes sample checklists in Appendix E.  When creating checklists and questions, keep the following in mind:

  • Checklist questions should be answerable with a Yes or No response, with Yes always indicating a positive (conforming) result and No always indicating a negative one
  • Checklists should be precise, measurable, factual
  • Checklists correspond to the audit requirements and should remain within the scope of the audit
  • Interview questions should be open-ended (e.g., "What do you use...?" as opposed to "Do you use XYZ to...?")
  • Interview questions should prompt the auditee to do most of the talking
  • Interview questions should be context free (e.g., "How do you track the status of a change request?" as opposed to "How do you use the XYZ system to track the status of a change request?")
  • Interview questions should not include the answer in the question
  • Interview questions should focus on systems, products, processes, not the person
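The Yes/No convention above can be made mechanical: if every checklist question is phrased so that "Yes" is always the conforming answer, results can be tallied automatically. A minimal sketch, with invented questions and answers:

```python
# Hypothetical sketch: a checklist where every question is phrased so that a
# "Yes" (True) answer always indicates conformance, letting the tally and the
# follow-up list be produced mechanically. Questions/answers are illustrative.

checklist = [
    ("Are the procedures for the task defined at the appropriate level of detail?", True),
    ("Were the entry criteria for the task verified before the work began?", True),
    ("Are nonconformities reported and tracked to closure?", False),
]

# Any "No" answer is, by construction, a non-conformance to follow up on.
non_conformances = [question for question, answer in checklist if not answer]

print(f"{len(checklist) - len(non_conformances)}/{len(checklist)} items conform")
for question in non_conformances:
    print("Follow up:", question)
```

The value of the convention is that no judgment is needed at tally time; the judgment was made once, when the question was phrased.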

The STEP Level 2 Software Configuration Management and Data Management course [2] provides a set of questions to consider for audit checklists.  A few of those questions are shown below:

"Are the procedures and/or work instructions for the task defined at the appropriate level of detail?"

"Were the entry criteria for the task verified before the work began?"

"Are the environment, infrastructure and tools utilized during the task adequate to achieve conformity?"

"Are the outputs from the task appropriately verified and/or approved and/or controlled?"

"Are nonconformities/defects appropriately reported and tracked to closure?"

"Are the appropriate records being kept?"

When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found.  Findings should be grouped as major or minor depending on the range and effect of the non-conformance. 

Non-conformances should result in corrective actions that address and correct the root cause of the non-conformances.  Follow-up should be conducted to ensure the corrective actions were completed and are effective. 
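One way to make the major/minor grouping and the closure follow-up concrete is a small findings record. This is a hypothetical sketch: the field names and the rule gating close-out on open major findings are illustrative assumptions, not a prescribed NASA process.

```python
# Hypothetical sketch: audit findings recorded with a severity (major/minor)
# and a closure flag that is set only after follow-up verifies the corrective
# action was completed and effective. Field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    severity: str          # "major" or "minor"
    corrective_action: str
    closed: bool = False   # True only after follow-up verifies the fix

findings = [
    Finding("Two modules missing from release baseline", "major",
            "Rebuild release from tagged baseline", closed=True),
    Finding("Header comments deviate from coding standard", "minor",
            "Update headers; re-sample at next audit"),
]

# Illustrative close-out rule: no open major findings may remain.
open_majors = [f for f in findings if f.severity == "major" and not f.closed]
print("Audit closeable:", not open_majors)
```

Whatever the exact rule, the point is that every non-conformance carries its corrective action with it, so follow-up is a query, not a memory exercise.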

Consult Center Process Asset Libraries (PALs) for center-specific guidance and resources related to configuration audits. 

Additional guidance related to configuration audits may be found in the following related requirements in this handbook:

  • Develop CM Plan
  • Status Accounting
  • Software Configuration Management Plan


4. Small Projects

For projects with limited personnel, consider sharing lead auditors or audit team members among projects.


5. Resources

  1. NASA Technical Standard, "NASA Software Assurance Standard", NASA-STD-8739.8, 2004.
  2. STEP Level 2 Software Configuration Management and Data Management course, SMA-SA-WBT-204, SATERN. A SATERN user account is needed to access this material.
  3. IEEE Computer Society, "IEEE Guide to Software Configuration Management", IEEE STD 1042-1987, 1987.  This link requires an account on the NASA START (AGCY NTSS) system.  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
  4. IEEE Computer Society, "IEEE Standard for Software Configuration Management Plans", IEEE STD 828-2005, 2005.  This link is to the NASA START accessible copy of this standard.  It requires an account on the NASA START (AGCY NTSS) system.  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
  5. Department of Defense, "Department of Defense Configuration Management Guidance Handbook", MIL-HDBK-61, 1997.  This URL is the download page for this document.  The download is free.
  6. NASA Scientific and Technical Information (STI), NASA Center for AeroSpace Information, "NASA Systems Engineering Handbook", NASA/SP-2007-6105, Rev1, 2007.
  7. Flight and Ground Software Division, MSFC, "Software Development Process Description Document", EI32-OI-001, Revision R, 2010.
  8. NASA Technical Standard, "NASA Configuration Management (CM) Standard", Appendix E, NASA-STD-0005, 2008.
  9. ISO/IEC 12207, "Systems and software engineering – Software life cycle processes", IEEE Std 12207-2008, 2008 (Key section: Stakeholder Requirements Definition Process).  This link requires an account on the NASA START (AGCY NTSS) system.  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.

5.1 Tools

Tools relevant to this SWE may be found in the table above.  If no tools are listed, none have currently been identified for this SWE.  You may wish to reference table XYZ in this handbook for an evolving list of these and other tools in use at NASA.  Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool.  Check with your Center to see what tools are available to facilitate compliance with this requirement.


6. Lessons Learned

A recommendation in the NASA Lessons Learned database specifically calls out configuration audits as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap: "conduct software audit for specification compliance on all data transferred between JPL and [contractor]."
