SWE-084
Div
idtabs-1

1. Requirements

4.1.6 The project shall ensure that software configuration audits are performed to determine the correct version of the configuration items and verify that they conform to the documents and requirements that define them.
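The mechanical part of this verification, determining that each configuration item is the correct version, can be sketched in code. The following Python example is illustrative only: the manifest format, function names, and use of SHA-256 digests are assumptions, not anything prescribed by NPR 7150.2. It checks a delivered release against a baseline manifest of approved file digests:

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def audit_baseline(manifest, release_dir):
    """Compare a delivered release against a baseline manifest.

    `manifest` maps a configuration item's relative path to the digest
    recorded when the baseline was approved (an assumed format, not a
    NASA-defined one). Returns a list of findings; an empty list means
    the release matches the baseline.
    """
    release_dir = Path(release_dir)
    findings = []
    for rel_path, expected in manifest.items():
        item = release_dir / rel_path
        if not item.is_file():
            findings.append(f"MISSING: {rel_path}")
        elif sha256_of(item) != expected:
            findings.append(f"WRONG VERSION: {rel_path}")
    # Files present in the release but absent from the manifest are also
    # discrepancies: they were never placed under configuration control.
    for item in release_dir.rglob("*"):
        rel = str(item.relative_to(release_dir))
        if item.is_file() and rel not in manifest:
            findings.append(f"UNCONTROLLED: {rel}")
    return findings
```

A real configuration audit also samples records and interviews staff; a script like this covers only the version-identification portion of the requirement.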

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E and Safety Critical are labeled "SO." This means that this requirement applies to the safety-critical aspects of the software.

Class G is labeled "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.



applicable
f1
gp
h0
ansc1
asc1
bnsc1
csc*
bsc1
esc*
cnsc0
dnsc0
dsc*
ensc0



Div
idtabs-2

2. Rationale

For software configuration, audits help ensure that configuration items (CIs) have been developed and completed in accordance with the documents and requirements that define them. Audits also help ensure that CIs achieve their performance and functional characteristics goals and that all associated operational and support documents are complete and meet their requirements. Audits also determine if all CIs that are supposed to be part of the baseline or release are actually in the baseline or release and are the correct version and revision.

Configuration audits provide checks to ensure that the planned product is the developed product.


Floatbox
widthfull

Configuration audits allow developers to "provide notice that contractual obligations are nearing completion, and to provide sufficient evidence for the clients or user organization to accept the product and initiate the transition into operational usage." (IEEE SA 1042-1987, IEEE Guide to Software Configuration Management

sweref
212
212
)




Div
idtabs-3

3. Guidance

There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). Configuration audits are performed for all releases; however, audits of interim, internal releases may be less formal and rigorous, as defined by the project.

Per NASA/SP-2007-6105, Rev 1, NASA Systems Engineering Handbook

sweref
273
273
, the FCA "examines the functional characteristics of the configured product and verifies that the product has met, via test results, the requirements specified in its functional baseline documentation approved at the Preliminary Design Review (PDR) and Critical Design Review (CDR). FCAs will be conducted on both hardware or software configured products and will precede the PCA of the configured product."

The PCA "(also known as a configuration inspection) examines the physical configuration of the configured product and verifies that the product corresponds to the build-to (or code-to) product baseline documentation previously approved at the CDR. PCAs will be conducted on both hardware and software configured
products."

sweref
273
273

Audit plans, including goals, schedules, participants, contractor participation, and procedures are documented in the configuration management (CM) plan (see SWE-103).

When planning audits, it is important to remember that audits are samplings, not a look at every record.  It is also important to remember that auditors should not have any direct responsibility for the software products they audit.
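The sampling idea can be made concrete. In the illustrative Python sketch below, the 10 percent rate and five-record floor are assumptions, not NASA policy; the point is that the selection is random yet reproducible, so the audit report can document exactly how records were chosen:

```python
import random

def select_audit_sample(records, rate=0.10, minimum=5, seed=None):
    """Draw a reproducible random sample of records to audit.

    The 10% rate and five-record floor are illustrative defaults, not
    policy. Passing a fixed seed makes the selection repeatable, so the
    audit report can show precisely how items were chosen.
    """
    rng = random.Random(seed)
    size = min(len(records), max(minimum, round(len(records) * rate)))
    return rng.sample(records, size)
```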

The basic steps in an audit are:

Image Added

The Department of Defense Configuration Management Guidance Handbook

sweref
351
351
includes tables of activities for audit planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions). These tables address both the Government and contractor roles in these activities and can be tailored as applicable for a project.

The SMA (Safety and Mission Assurance) Technical Excellence Program (STEP) Level 2 Software Configuration Management and Data Management course taught by the Westfall Team

sweref
343
343
suggests that the following be included in an FCA:

  • "An audit of the formal test documentation against test data.
  • "An audit of the Verification and Validation (V&V) reports.
  • "A review of all approved changes.
  • "A review of updates to previously delivered documents.
  • "A sampling of design review outputs.
  • "A comparison of code with documented requirements.
  • "A review to ensure all testing was accomplished."
    sweref
    343
    343
  • Additional sample testing or rerunning of tests, as appropriate for the project.
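The "comparison of code with documented requirements" item above can be partially automated. The sketch below assumes a hypothetical REQ-nnn requirement-tag format; a real project would use its own requirement identifiers and would trace to tests and design artifacts as well as source code:

```python
import re
from pathlib import Path

# Hypothetical requirement-ID format; real projects define their own.
REQ_TAG = re.compile(r"\bREQ-\d+\b")

def trace_requirements(spec_text, source_dir):
    """Cross-check requirement IDs in a specification against source code.

    Returns the IDs found in the spec and the subset that no source
    file references -- candidates for an FCA finding that implementation
    or test coverage is incomplete.
    """
    required = set(REQ_TAG.findall(spec_text))
    referenced = set()
    for src in Path(source_dir).rglob("*.py"):
        referenced |= set(REQ_TAG.findall(src.read_text()))
    return {"required": required, "untraced": required - referenced}
```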

The STEP 2 course suggests that the following be included in a PCA

sweref
343
343
:

  • "An audit of the system specification for completeness [and removal of all to-be-determineds (TBDs)].
  • "An audit of the FCA report for discrepancies & actions taken.
  • "A comparison of the architectural design with the detailed design components for consistency.
  • "A review of the module listing for compliance with the approved coding standards.
  • "An audit of the manuals for format completeness & conformance to system & functional descriptions."
    sweref
    343
    343

Additional audit topics to consider include:

  • As coded, software products reflect their design.
  • User documentation complies with standards as specified.
  • Activities have been conducted according to applicable requirements, plans, and contract.

NASA/SP-2007-6105, NASA Systems Engineering Handbook,

sweref
273
273
includes the following table showing the data typically reviewed during each of these audits:

Image Added

Consider the following options when deciding when to conduct audits:

  • At the time a product is released.
  • Prior to delivery to assure that all delivered products are complete, contain the proper versions and revisions, and that all discrepancies, open work, and deviations and waivers are properly documented and approved; can be FCA or PCA.
  • At the end of a life cycle phase per Capability Maturity Model Integration (CMMI).
  • Prior to the release of a new or revised baseline.
  • As the project progresses to prevent finding major issues at the end when it's more costly to fix them and to identify systemic issues, such as meeting coding standards, that could affect large segments of the project.
  • Incrementally for very large, complex systems, focusing on specific functional areas, with a summary audit held to address the status of all identified action items (FCA).

When preparing for an audit, it is important to gain an understanding of what is supposed to be there, develop relevant questions, determine what to sample, determine who to interview, and plan other steps that will make the audit proceed more smoothly once it gets started.

Checklists (essentially a set of prepared questions) and prepared interview questions are important to the success of an audit and help ensure all audit criteria are covered. NASA-STD-0005, NASA Configuration Management (CM) Standard, includes sample checklists in Appendix E. When creating checklists and questions, keep the following in mind:

  • Checklist questions should be answerable with a Yes or No response, with Yes always indicating a positive response and No always indicating a negative one.
  • Checklists should be precise, measurable, and factual.
  • Checklists should correspond to the audit requirements and remain within the scope of the audit.
  • Interview questions should be open-ended (e.g., "What do you use...?" as opposed to "Do you use XYZ to...?").
  • Interview questions should prompt the auditee to do most of the talking.
  • Interview questions should be context free (e.g., "How do you track the status of a change request?" as opposed to "How do you use the XYZ system to track the status of a change request?").
  • Interview questions should not include the answer in the question.
  • Interview questions should focus on systems, products, and processes, not the person.

The STEP Level 2 Software Configuration Management and Data Management course provides a set of questions to consider for audit checklists. A few of those questions are shown below:

  • "Are the procedures and/or work instructions for the task defined at the appropriate level of detail?"
  • "Were the entry criteria for the task verified before the work began?"
  • "Are the environment, infrastructure and tools utilized during the task adequate to achieve conformity?"
  • "Are the outputs from the task appropriately verified and/or approved and/or controlled?"
  • "Are nonconformities/defects appropriately reported and tracked to closure?"
  • "Are the appropriate records being kept?"
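Because audit checklist questions are phrased so that Yes is always the positive response, a completed checklist can be summarized mechanically. A minimal Python sketch (the function name and result format here are illustrative, not from any NASA standard):

```python
def score_checklist(answers):
    """Summarize a completed audit checklist.

    `answers` maps each checklist question to True (Yes) or False (No).
    Because questions are phrased so that Yes is always the positive
    response, every No is a potential finding to investigate.
    """
    findings = [question for question, ok in answers.items() if not ok]
    return {"asked": len(answers), "findings": findings}
```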


Panel

When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found. Findings are grouped as major or minor depending on the range and effect of the non-conformance. 



Panel

Non-conformances result in corrective actions that address and correct the root cause of the non-conformances. Follow-up needs to be conducted to ensure the corrective actions were completed and are effective. 
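Tracking non-conformances to closure can be represented with a small data structure. In this illustrative Python sketch (the field names are assumptions, not a NASA-defined schema), a finding stays open until a corrective action is recorded and follow-up has verified that it is effective:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One audit non-conformance, tracked through corrective action."""
    description: str
    severity: str                     # "major" or "minor", per the report
    corrective_action: str = ""       # root-cause fix, once assigned
    verified_effective: bool = False  # set by follow-up

def open_findings(findings):
    """Findings that still block audit close-out: either no corrective
    action has been assigned, or follow-up has not yet confirmed that
    the action was completed and is effective."""
    return [f for f in findings
            if not (f.corrective_action and f.verified_effective)]
```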



Note

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to configuration audits.


Additional guidance related to configuration audits may be found in the following related requirements in this Handbook:


SWE-079

Develop CM Plan

SWE-083

Status Accounting

SWE-103

Software Configuration Management Plan




Div
idtabs-4

4. Small Projects

For projects with limited personnel, consider sharing lead auditors or audit team members among projects. Another suggestion is for members of small projects to conduct configuration audits of other small projects.


Div
idtabs-5

5. Resources


 1. NASA Technical Standard, "NASA Software Assurance Standard", NASA-STD-8739.8, 2004. https://standards.nasa.gov/documents/detail/3315130
 2. STEP Level 2 Software Configuration Management and Data Management course, SMA-SA-WBT-204, SATERN (https://saterninfo.nasa.gov/). A SATERN user account is needed to access this material. (Cited above as sweref 343.)
 3. IEEE Computer Society, "IEEE Guide to Software Configuration Management", IEEE STD 1042-1987, 1987. Requires an account on the NASA START (AGCY NTSS) system (http://standards.nasa.gov); once logged in, users can browse Standards Organizations, IEEE, and search for authorized copies of IEEE standards. (Cited above as sweref 212.)
 4. IEEE Computer Society, "IEEE Standard for Software Configuration Management Plans", IEEE STD 828-2005, 2005. Requires a NASA START account, as above.
 5. Department of Defense, "Department of Defense Configuration Management Guidance Handbook", MIL-HDBK-61, 1997. Free download from http://www.everyspec.com/MIL-HDBK/MIL-HDBK+%280001+-+0099%29/MIL-HDBK-61_11531/. (Cited above as sweref 351.)
 6. NASA, "NASA Systems Engineering Handbook", NASA/SP-2007-6105, Rev 1, 2007. (Cited above as sweref 273.)
 7. Flight and Ground Software Division, MSFC, "Software Development Process Description Document" (Chapter 13), EI32-OI-001, Revision R, 2010. Available through the NASA Software Process Asset Library.
 8. NASA Technical Standard, "NASA Configuration Management (CM) Standard", NASA-STD-0005, Appendix E, 2008. https://standards.nasa.gov/documents/detail/3315133
 9. ISO/IEC 12207, "Systems and software engineering -- Software life cycle processes", IEEE Std 12207-2008, 2008. Requires a NASA START account, as above.
 10. Software Engineering Division, Goddard Space Flight Center, "Baseline Audit Checklist", 580-CK-072-02, 2008.
 11. Marshall Space Flight Center, "Functional Configuration Audit Form", 2010.
 12. Software Engineering Division, Goddard Space Flight Center, "Functional Configuration Audit Checklist", 580-CK-029-03, 2008.
 13. Marshall Space Flight Center, "Physical Configuration Audit Form", 2010.
 14. Software Engineering Division, Goddard Space Flight Center, "Physical Configuration Audit Checklist", 580-CK-036-03, 2008.

toolstable


Div
idtabs-6

6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

Mars Climate Orbiter Mishap Investigation Board - Phase I Report, Lesson Number 0641: configuration audits are called out as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap. One of the recommendations is to "Conduct [a] software audit for specification compliance on all data transferred between JPL and [contractor]" (http://www.nasa.gov/offices/oce/llis/0641.html).

sweref
513
513