1. Requirements


5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

See SWE-084 History for the history of this requirement.

1.3 Applicability Across Classes

This requirement is applicable to software Classes A, B, C, CSC, D, DSC, and F. It is not applicable to Classes E, G, and H.


2. Rationale

Configuration audits provide checks to ensure that the planned product is the developed product.


3. Guidance

For software configuration, audits help ensure that configuration items (CIs) have been developed and completed following the documents and requirements that define them. Audits also help ensure that CIs achieve their performance and functional goals and that all associated operational and support documents are complete and meet their requirements. Finally, audits determine whether all CIs that are supposed to be part of a baseline or release are actually in that baseline or release and are the correct version and revision.

There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). Configuration audits are performed for all releases; however, audits of interim, internal releases may be less formal and rigorous, as defined by the project.

Per NASA/SP-2007-6105, Rev 1, NASA Systems Engineering Handbook (SWEREF-273), the FCA "examines the functional characteristics of the configured product and verifies that the product has met, via test results, the requirements specified in its functional baseline documentation approved at the preliminary design review (PDR) and critical design review (CDR). FCAs will be conducted on both hardware or software configured products and will precede the PCA of the configured product."

The PCA "(also known as a configuration inspection) examines the physical configuration of the configured product and verifies that the product corresponds to the build-to (or code-to) product baseline documentation previously approved at the CDR. PCAs will be conducted on both hardware and software configured products." (SWEREF-273)

Audit plans, including goals, schedules, participants, contractor participation, and procedures, are documented in the configuration management (CM) plan (see 5.06 - SCMP - Software Configuration Management Plan).

When planning audits, it is important to remember that audits are samplings, not examinations of every record. It is also important that auditors have no direct responsibility for the software products they audit.
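To make the sampling approach concrete, the short sketch below draws a reproducible random sample of CM records for review. It is only an illustration: the record identifiers, sample size, and fixed seed are assumptions for the example, not values prescribed by this requirement.

    import random

    def select_audit_sample(records, sample_size, seed=2024):
        """Draw a reproducible random sample of CM records to review during an audit."""
        rng = random.Random(seed)                 # fixed seed so the sample can be reproduced
        size = min(sample_size, len(records))     # never ask for more records than exist
        return sorted(rng.sample(records, size))

    # Illustrative use: review 5 of the project's change requests.
    change_requests = [f"CR-{n:04d}" for n in range(1, 128)]
    print(select_audit_sample(change_requests, 5))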

The basic steps in an audit are planning, preparation, performance, and close-out (reporting the results and addressing corrective actions).

The Department of Defense Configuration Management Guidance Handbook (SWEREF-351) includes tables of activities for audit planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions). These tables address both the Government and contractor roles in these activities and can be tailored as applicable for a project.

The SMA (Safety and Mission Assurance) Technical Excellence Program (STEP) Level 2 Software Configuration Management and Data Management course taught by the Westfall Team (SWEREF-343) suggests that the following be included in an FCA:

  • "An audit of the formal test documentation against test data.
  • "An audit of the verification and validation (V&V) reports.
  • "A review of all approved changes.
  • "A review of updates to previously delivered documents.
  • "A sampling of design review outputs.
  • "A comparison of code with documented requirements.
  • "A review to ensure all testing was accomplished."
    Swerefn
    refnum343
  • Additional sample testing or rerunning of tests, as appropriate for the project.
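As a rough illustration of the last two quoted items above (comparing code against documented requirements and confirming that all testing was accomplished), the sketch below cross-checks a requirements list against recorded test results. The data layout and field names (req_id, status) are assumptions made for this example, not part of any NASA or Westfall Team tool.

    # Hypothetical FCA helper: confirm every requirement has at least one passing test on record.
    def find_untested_requirements(requirements, test_results):
        """requirements: iterable of requirement IDs (e.g., "SRS-001").
        test_results: list of dicts such as {"req_id": "SRS-001", "status": "pass"}."""
        passed = {t["req_id"] for t in test_results if t["status"] == "pass"}
        return sorted(set(requirements) - passed)

    requirements = ["SRS-001", "SRS-002", "SRS-003"]
    test_results = [
        {"req_id": "SRS-001", "status": "pass"},
        {"req_id": "SRS-003", "status": "fail"},
    ]
    print(find_untested_requirements(requirements, test_results))   # ['SRS-002', 'SRS-003']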

The STEP 2 course suggests that the following be included in a PCA (SWEREF-343):

  • "An audit of the system specification for completeness [and removal of all to-be-determined (TBD)].
  • "An audit of the FCA report for discrepancies & actions taken.
  • "A comparison of the architectural design with the detailed design components for consistency.
  • "A review of the module listing for compliance with the approved coding standards.
  • "An audit of the manuals for format completeness & conformance to system & functional descriptions."
    Swerefn
    refnum343
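The first item above, checking the specification for completeness and removal of TBD items, lends itself to simple automation. The sketch below assumes the specifications are plain-text files in a "specs" directory and that unresolved items are literally marked "TBD"; both are assumptions for illustration only.

    from pathlib import Path

    # Hypothetical PCA helper: flag lines in specification files that still contain "TBD".
    def find_tbd_items(spec_dir):
        findings = []
        for path in Path(spec_dir).glob("*.txt"):                       # assumed file layout
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if "TBD" in line:
                    findings.append((path.name, lineno, line.strip()))
        return findings

    for name, lineno, text in find_tbd_items("specs"):
        print(f"{name}:{lineno}: {text}")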

Additional audit topics to consider include:

  • As coded, software products reflect their design.
  • User documentation complies with standards as specified.
  • Activities have been conducted according to applicable requirements, plans, and contracts.

NASA/SP-2007-6105, NASA Systems Engineering Handbook (SWEREF-273), includes a table showing the data typically reviewed during each of these audits.


Consider the following options when deciding when to conduct audits:

  • At the time a product is released.
  • Before delivery, to assure that all delivered products are complete and contain the proper versions and revisions, and that all discrepancies, open work, deviations, and waivers are properly documented and approved; this can be an FCA or a PCA.
  • At the end of a life-cycle phase per Capability Maturity Model Integration (CMMI).
  • Before the release of a new or revised baseline.
  • As the project progresses, to prevent finding major issues at the end, when they are more costly to fix, and to identify systemic issues, such as coding standard violations, that could affect large segments of the project.
  • Incrementally for very large, complex systems, focusing on specific functional areas, with a summary audit held to address the status of all identified action items (FCA).



When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found. Findings are grouped as major or minor depending on the range and effect of the non-conformance. 


Non-conformances result in corrective actions that address and correct the root cause of the non-conformances. Follow-up needs to be conducted to ensure the corrective actions were completed and are effective. 


Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to configuration audits.

NASA-specific configuration management information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.  

Additional guidance related to configuration audits may be found in related requirements in this Handbook.


4. Small Projects

For projects with limited personnel, consider sharing lead auditors or audit team members among projects. Another suggestion is for members of small projects to conduct configuration audits of other small projects.


5. Resources

5.1 References

  • SWEREF-273: NASA/SP-2007-6105, Rev 1, NASA Systems Engineering Handbook.
  • SWEREF-343: SMA Technical Excellence Program (STEP) Level 2 Software Configuration Management and Data Management course, The Westfall Team.
  • SWEREF-351: Department of Defense Configuration Management Guidance Handbook.
  • SWEREF-513: Mars Climate Orbiter Mishap Investigation Board, Phase I Report (NASA Lessons Learned Entry 0641).


5.2 Tools

See the Tools Table Statement in this Handbook.


6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Mars Climate Orbiter Mishap Investigation Board - Phase I Report, Lesson Number 0641 (SWEREF-513): Configuration audits are called out as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap. One of the recommendations is to "Conduct [a] software audit for specification compliance on all data transferred between JPL and [contractor]."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.


7. Software Assurance


7.1 Tasking for Software Assurance

1. Confirm that the project manager performed software configuration audits to determine the correct version of the software configuration items and verify that the results of the audit conform to the records that define them. 

7.2 Software Assurance Products

  • None at this time


Objective Evidence:
  • Software Problem reporting or defect tracking data
  • Software configuration management system data
  • Software assurance audit results of the change management processes.
  • Software version description document(s).
See the definition of objective evidence in this Handbook.

7.3 Metrics

  • # of Configuration Management Audits conducted by the project – Planned vs. Actual.
  • # of Compliance Audits planned vs. # of Compliance Audits performed.
  • # of software work product Non-Conformances identified by life-cycle phase over time.
  • # of Non-Conformances identified in release documentation (Open, Closed).
  • # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project.
  • Trends of # Open vs. # Closed over time.
  • # of Non-Conformances per audit (including findings from process and compliance audits, process maturity).
  • # of Open vs. Closed Audit Non-Conformances over time.
  • Trends of # of Non-Conformances from audits over time (include counts from process and standards audits and work product audits).
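To show how a few of the metrics above might be tallied, here is a minimal sketch that assumes non-conformance records have been collected into a simple list with hypothetical fields (id, phase, status); it is illustrative only, not a prescribed software assurance tool.

    from collections import Counter

    # Hypothetical non-conformance records gathered from audit reports.
    non_conformances = [
        {"id": "NC-1", "phase": "design",         "status": "closed"},
        {"id": "NC-2", "phase": "implementation", "status": "open"},
        {"id": "NC-3", "phase": "implementation", "status": "closed"},
        {"id": "NC-4", "phase": "test",           "status": "open"},
    ]

    by_phase = Counter(nc["phase"] for nc in non_conformances)
    by_status = Counter(nc["status"] for nc in non_conformances)
    audits_planned, audits_performed = 4, 3       # assumed counts for the example

    print("Non-conformances by life-cycle phase:", dict(by_phase))
    print("Open vs. Closed:", by_status["open"], "open /", by_status["closed"], "closed")
    print("CM audits - Planned vs. Actual:", audits_planned, "vs.", audits_performed)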

7.4 Guidance

Software assurance should perform or participate in the project's configuration audits, typically the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). A Functional Configuration Audit (FCA) examines the functional characteristics of the configured product and verifies that the product has met the requirements specified in its Functional Baseline documentation approved at the Preliminary Design Review (PDR) and Critical Design Review (CDR). A Physical Configuration Audit (PCA) is the formal examination of the "as-built" configuration of a configuration item against its technical documentation to establish or verify the configuration item's product baseline. The physical configuration audit done at the point of delivery establishes that all the products that are supposed to be in the delivery baseline are actually in the delivery and that they are the correct versions, as documented in the Version Description Document.
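A minimal sketch of the delivery-point check described above, assuming the Version Description Document (VDD) has been transcribed into a simple manifest of expected file names and SHA-256 digests; the manifest format, file names, and placeholder digests are assumptions for illustration, not a defined VDD format.

    import hashlib
    from pathlib import Path

    # Hypothetical VDD manifest: delivered file name -> expected SHA-256 digest.
    vdd_manifest = {
        "flight_sw.bin": "<expected sha-256 digest from the VDD>",
        "user_manual.pdf": "<expected sha-256 digest from the VDD>",
    }

    def audit_delivery(delivery_dir, manifest):
        """Compare delivered files against the VDD manifest.
        Returns (missing, mismatched, unexpected) lists of file names."""
        delivered = {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
                     for p in Path(delivery_dir).iterdir() if p.is_file()}
        missing    = sorted(set(manifest) - set(delivered))
        unexpected = sorted(set(delivered) - set(manifest))
        mismatched = sorted(name for name in set(manifest) & set(delivered)
                            if delivered[name] != manifest[name])
        return missing, mismatched, unexpected

    # Example use: missing, mismatched, unexpected = audit_delivery("delivery/", vdd_manifest)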

Software Assurance personnel should sign off on the delivery documentation to indicate that the delivery is complete and meets its requirements. More detail on the FCA and PCA is found in the software guidance for this requirement (SWE-084).

Examples of checklists for a functional configuration audit, a physical configuration audit, and a baseline configuration audit are below:

Functional Configuration Audit Checklist

Asset Number:                    Approved by: (signature)
Effective Date:                  Name:
Expiration Date:                 Title:
Responsible Office:
Asset Type: Checklist
Title: Functional Configuration Audit Checklist
PAL Number:


The Functional Configuration Audit (FCA) Checklist is to be used by Configuration Management (CM) personnel when conducting audits for major software builds or final releases. A functional configuration audit verifies that the configuration item (CI) has been completed satisfactorily, that the item has achieved the specified performance and functional characteristics, and that its operational and support documents are complete and satisfactory.

During this audit, CM will analyze the project's Requirements Traceability Matrix, Requirements documentation, Test reports, Problem reports, Change authorizations, and any Waivers or Deviations.

For each checklist item below, place a check (✓) in the box if the checklist item is fully satisfied. Place a note in Section 1 below describing any actions performed during the audit to resolve discrepancies and thereby satisfy the criteria. Any criteria that require an action item for future resolution and follow-up should be captured in Section 2, with the issues/non-compliances captured under "Issues and/or Comments".

The completed checklist should be maintained as a record at the location defined in the project’s data management list (DML).

Date of Audit:    __________________________________________

Project Build/Release:    ___________________________________

Name of Auditor:  _________________________________

 

 

Checklist items (columns: ✓ | Issues and/or Comments):

1. Are all configuration items (as specified in the "Baselines Table") available for this deliverable/release?
   Guidance: This should include such items as the Requirements Specifications, Software Requirements Traceability Matrix, Version Description Documents, software change requests, all test results, and any operational documents or manuals.
2. Have all deviations or waivers to the requirements been considered and approved?
3. Does the deliverable/release incorporate all approved changes?
4. Was system/software testing completed?
5. Were all requirements (functional and performance) verified?
6. For those requirements that have not been verified or failed verification, has the status been captured (e.g., in a VDD)?
7. Has analysis or simulation been conducted for any performance parameters that couldn't be completely verified during testing?
8. If analysis or simulations were performed, do the results support the requirement verification?
9. Is the software requirements traceability matrix (SRTM) complete and bi-directional (i.e., is there evidence that all system/software requirements can be traced to the design, source code, and test procedures)?
10. Has all associated documentation been placed under CM control per the project's CM plan?
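Checklist item 9 above asks whether the SRTM is complete and bi-directional. As a rough illustration, the sketch below checks that every requirement traces forward to design, code, and test, and that every test traces back to a requirement; the row structure and field names are assumptions for the example.

    # Hypothetical SRTM bi-directionality check. Field names are illustrative only.
    def check_srtm(srtm_rows, test_ids):
        """srtm_rows: list of dicts with keys req_id, design, code, tests (lists of IDs)."""
        forward_gaps = [r["req_id"] for r in srtm_rows
                        if not (r["design"] and r["code"] and r["tests"])]
        traced_tests = {t for r in srtm_rows for t in r["tests"]}
        backward_gaps = sorted(set(test_ids) - traced_tests)    # tests with no requirement
        return forward_gaps, backward_gaps

    srtm = [
        {"req_id": "SRS-001", "design": ["D-1"], "code": ["mod_a.c"], "tests": ["T-1"]},
        {"req_id": "SRS-002", "design": ["D-2"], "code": [],          "tests": ["T-2"]},
    ]
    print(check_srtm(srtm, ["T-1", "T-2", "T-3"]))   # (['SRS-002'], ['T-3'])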



Section 1: Notes/Actions performed during the audit to resolve any discrepancies to satisfy the criteria

#   Action

Section 2: Action Items for future follow-up

#   Action   Assignee   Due Date

Notes

If using a different medium to capture action items, please provide the name and location of that asset: ____________________________________________________

Change History

Version   Approval Date   Description of Improvements
1.0                       Initial approved version by CCB
2.0