SWE-084 - Configuration Audits

1. Requirements

4.1.6 The project shall ensure that software configuration audits are performed to determine the correct version of the configuration items and verify that they conform to the documents that define them.

1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E, Safety Critical, are labeled "SO." This means that this requirement applies to the safety-critical aspects of the software.

Class G is labeled "P(Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class:       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |   G   |  H
Applicable?: |      |       |      |       |  X   |       |  X   |       |  X   |       |     | P(C)  |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... |
X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures

Unknown macro: {div3}

2. Rationale

For software configuration, audits help ensure that configuration items have been developed and completed in accordance with the documents that define them. Audits also help ensure that configuration items achieve their performance and functional goals and that all associated operational and support documents are complete and meet their requirements. Finally, audits determine whether all configuration items that are supposed to be part of a baseline or release are actually present, and are the correct version and revision.
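The version-and-revision check described above can be sketched as a simple manifest comparison. This is illustrative only; the item names, version strings, and record format are invented, not drawn from NPR 7150.2 or any Center procedure:

```python
# Illustrative sketch: compare a release manifest against the approved baseline.
# Item names and version numbers below are hypothetical examples.

baseline = {  # configuration items per the documents that define them
    "flight_sw.bin": "3.2.1",
    "cmd_dictionary.xml": "1.4.0",
    "users_guide.pdf": "1.4.0",
}

release = {  # what is actually in the candidate release
    "flight_sw.bin": "3.2.1",
    "cmd_dictionary.xml": "1.3.9",  # stale version -> audit finding
}

def audit_versions(baseline, release):
    """Return audit findings: wrong versions, missing items, unbaselined items."""
    findings = []
    for item, expected in baseline.items():
        actual = release.get(item)
        if actual is None:
            findings.append(f"MISSING: {item} (expected {expected})")
        elif actual != expected:
            findings.append(f"VERSION MISMATCH: {item} is {actual}, baseline says {expected}")
    for item in release.keys() - baseline.keys():
        findings.append(f"UNBASELINED ITEM: {item}")
    return findings

for finding in audit_versions(baseline, release):
    print(finding)
```

In this toy data set the audit would report one version mismatch and one missing item; a real audit would draw the baseline from the project's configuration status accounting records (see [SWE-083]).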

Configuration audits provide checks to ensure that the planned product is the developed product.

Configuration audits allow developers to "provide notice that contractual obligations are nearing completion, and to provide sufficient evidence for the clients or user organization to accept the product and initiate the transition into operational usage."3


3. Guidance

There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). Configuration audits are performed for all releases; however, audits of interim, internal releases may be less formal and rigorous, as defined by the project.

Per the NASA Systems Engineering Handbook, NASA/SP-2007-6105, Rev1, the FCA "examines the functional characteristics of the configured product and verifies that the product has met, via test results, the requirements specified in its functional baseline documentation approved at the PDR and CDR. FCAs will be conducted on both hardware or software configured products and will precede the PCA of the configured product."6

The PCA "(also known as a configuration inspection) examines the physical configuration of the configured product and verifies that the product corresponds to the build-to (or code-to) product baseline documentation previously approved at the CDR. PCAs will be conducted on both hardware and software configured products."6

Audit plans, including goals, schedules, participants, contractor participation, and procedures should be documented in the configuration management plan (see [SWE-103]).

When planning audits, it is important to remember that audits are samplings, not examinations of every record. It is also important to remember that auditors should not have any direct responsibility for the software products they audit.
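Because an audit samples records rather than examining all of them, the sample should be drawn in a way the audit report can document and reproduce. A minimal sketch, assuming hypothetical change-request IDs and an arbitrary sample size:

```python
import random

def select_audit_sample(record_ids, sample_size, seed=2024):
    """Draw a reproducible random sample of CM records to audit.

    A fixed seed lets the audit report state exactly how the sample
    was chosen; the sample size here is an arbitrary example.
    """
    rng = random.Random(seed)
    return sorted(rng.sample(record_ids, min(sample_size, len(record_ids))))

# Hypothetical change-request IDs held in the CM system
records = [f"CR-{n:04d}" for n in range(1, 201)]
sample = select_audit_sample(records, sample_size=20)
print(len(sample))  # 20 records drawn from 200
```

Seeding the generator is a deliberate design choice: rerunning the selection with the same seed yields the same sample, which supports follow-up and independent verification of the audit.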

The basic steps in an audit are planning, preparation, performance, and close-out.

The Department of Defense Configuration Management Guidance Handbook5 includes tables of activities for audit planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions).  These tables address both the government and contractor roles in these activities and can be tailored as applicable for a project.

The STEP Level 2 Software Configuration Management and Data Management course taught by the Westfall Team2 suggests that the following be included in an FCA:

  • "An audit of the formal test documentation against test data
  • An audit of the V&V reports
  • A review of all approved changes
  • A review of updates to previously delivered documents
  • A sampling of design review outputs
  • A comparison of code with documented requirements
  • A review to ensure all testing was accomplished"2
  • Additional sample testing or rerunning of tests, as appropriate for the project
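The last two checks in the list above, comparing code with documented requirements and confirming that all testing was accomplished, amount to a traceability review. A minimal sketch, with invented requirement and test-case IDs:

```python
# Hypothetical traceability data: which tests claim to verify which requirements.
requirements = {"SRS-001", "SRS-002", "SRS-003", "SRS-004"}
test_coverage = {
    "TC-10": {"SRS-001"},
    "TC-11": {"SRS-002", "SRS-003"},
}

# Requirements with no test evidence become FCA findings.
verified = set().union(*test_coverage.values())
unverified = sorted(requirements - verified)
print(unverified)  # ['SRS-004']
```

In practice the traceability data would come from the project's requirements management and test tools rather than hand-built dictionaries.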

The STEP 2 course suggests that the following be included in a PCA2:

  • "An audit of the system specification for completeness [and removal of all TBDs]
  • An audit of the FCA report for discrepancies & actions taken
  • A comparison of the architectural design with the detailed design components for consistency
  • A review of the module listing for compliance with the approved coding standards
  • An audit of the manuals for format completeness & conformance to system & functional descriptions"2
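A coding-standards review like the one in the list above is often partially automatable. A toy sketch checking a single rule, a maximum line length chosen arbitrarily for illustration:

```python
def check_line_length(source_text, max_len=80):
    """Flag lines exceeding an (example) coding-standard line-length limit."""
    return [
        (lineno, len(line))
        for lineno, line in enumerate(source_text.splitlines(), start=1)
        if len(line) > max_len
    ]

source = "short line\n" + "x" * 100 + "\nanother short line"
violations = check_line_length(source)
print(violations)  # [(2, 100)]
```

Real projects would typically run an established static-analysis or lint tool against the approved coding standard rather than hand-written checks like this one.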

Additional audit topics to consider include:

  • As coded, software products reflect their design
  • User documentation complies with standards as specified
  • Activities have been conducted according to applicable requirements, plans, and contract

The NASA Systems Engineering Handbook6 includes a table showing the data typically reviewed during each of these audits.

Consider the following options when deciding when to conduct audits:

  • At the time a product is released
  • Prior to delivery to assure that all delivered products are complete, contain the proper versions and revisions, and that all discrepancies, open work, and deviations and waivers are properly documented and approved; can be FCA or PCA
  • At the end of a life cycle phase per CMMI
  • Prior to the release of a new or revised baseline
  • As the project progresses to prevent finding major issues at the end when it's more costly to fix them and to identify systemic issues such as meeting coding standards that could affect large segments of the project
  • Incrementally for very large, complex systems focusing on specific functional areas with a summary audit held to address status of all identified action items (FCA)

When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found.  Findings should be grouped as major or minor depending on the range and effect of the non-conformance. 


Non-conformances should result in corrective actions that address and correct the root cause of the non-conformances.  Follow-up should be conducted to ensure the corrective actions were completed and are effective. 


Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to configuration audits.

Additional guidance related to configuration audits may be found in the following related requirements in this handbook:

  • [SWE-079] Develop CM Plan
  • [SWE-083] Status Accounting
  • [SWE-103] Software Configuration Management Plan


4. Small Projects

For projects with limited personnel, consider sharing lead auditors or audit team members among projects.


5. Resources

  1. NASA Technical Standard, "NASA Software Assurance Standard", NASA-STD-8739.8, 2004.
  2. STEP Level 2 Software Configuration Management and Data Management course, SMA-SA-WBT-204, SATERN. A SATERN user account is needed to access this material.
  3. IEEE Computer Society, "IEEE Guide to Software Configuration Management", IEEE STD 1042-1987, 1987.  This link requires an account on the NASA START (AGCY NTSS) system (http://standards.nasa.gov ).  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
  4. IEEE Computer Society, "IEEE Standard for Software Configuration Management Plans", IEEE STD 828-2005, 2005.  This link is to the NASA START accessible copy of this standard.  It requires an account on the NASA START (AGCY NTSS) system (http://standards.nasa.gov ).  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards.
  5. Department of Defense, "Department of Defense Configuration Management Guidance Handbook", MIL-HDBK-61, 1997.  This URL is the download page for this document.  The download is free.
  6. NASA Scientific and Technical Information (STI), NASA Center for AeroSpace Information, "NASA Systems Engineering Handbook" (6.7 Technical Assessment), NASA/SP-2007-6105, Rev1, 2007.
  7. Flight and Ground Software Division, MSFC, "Software Development Process Description Document" (Chapter 13), EI32-OI-001, Revision R, 2010.
  8. NASA Technical Standard, "NASA Configuration Management (CM) Standard", Appendix E, NASA-STD-0005, 2008.
  9. ISO/IEC 12207, "Systems and software engineering – Software life cycle processes", IEEE Std 12207-2008, 2008 (Key section: Stakeholder Requirements Definition Process).  This link requires an account on the NASA START (AGCY NTSS) system (http://standards.nasa.gov ).  Once logged in, users can access Standards Organizations, IEEE and then search to get to authorized copies of IEEE standards. 
  10. Software Engineering Division, Goddard Space Flight Center, Baseline Audit Checklist, 580-CK-072-02, 2008.
  11. Marshall Space Flight Center, Functional Configuration Audit Form, 2010.
  12. Software Engineering Division, Goddard Space Flight Center, Functional Configuration Audit Checklist, 580-CK-029-03, 2008.
  13. Marshall Space Flight Center, Physical Configuration Audit Form, 2010.
  14. Software Engineering Division, Goddard Space Flight Center, Physical Configuration Audit Checklist, 580-CK-036-03, 2008.

5.1 Tools

Tools relevant to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

No tools have currently been identified for this SWE. If you wish to suggest a tool, please leave a comment below.


6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

Mars Climate Orbiter Mishap Investigation Board - Phase I Report. Lesson Number 0641: Configuration audits are called out as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap, "conduct software audit for specification compliance on all data transferred between JPL and [contractor]" (http://www.nasa.gov/offices/oce/llis/0641.html).
