SWE-084 - Configuration Audits

1. Requirements

5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-084 - Last used in rev NPR 7150.2D

Rev    SWE Statement

A

4.1.6 The project shall ensure that software configuration audits are performed to determine the correct version of the configuration items and verify that they conform to the documents and requirements that define them.

Difference between A and B: "The project shall ensure that software configuration audits are performed" became "The project manager shall perform software configuration audits"; "documents and requirements" became "records."

B

5.1.7 The project manager shall perform software configuration audits to determine the correct version of the configuration items and verify that they conform to the records that define them.

Difference between B and C: "configuration items" became "software configuration items."

C

5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.

Difference between C and D: No change.

D

5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.



1.3 Applicability Across Classes

Class          A      B      C      D      E      F

Applicable?    [applicability marks (icons) not preserved in this text version]

Key: [icon] - Applicable | [icon] - Not Applicable

2. Rationale

Configuration audits provide checks to ensure that the planned product is the developed product.

3. Guidance

For software configuration, audits help ensure that configuration items (CIs) have been developed and completed following the documents and requirements that define them. Audits also help ensure that CIs achieve their specified performance and functional characteristics and that all associated operational and support documents are complete and meet their requirements. Audits further determine whether all CIs that are supposed to be part of the baseline or release are actually in the baseline or release and are the correct version and revision.

There are two types of configuration audits: the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). Configuration audits are performed for all releases; however, audits of interim, internal releases may be less formal and rigorous, as defined by the project.

Per NASA/SP-2007-6105, Rev 1, NASA Systems Engineering Handbook 273, the FCA "examines the functional characteristics of the configured product and verifies that the product has met, via test results, the requirements specified in its functional baseline documentation approved at the preliminary design review (PDR) and critical design review (CDR). FCAs will be conducted on both hardware or software configured products and will precede the PCA of the configured product."

The PCA "(also known as a configuration inspection) examines the physical configuration of the configured product and verifies that the product corresponds to the build-to (or code-to) product baseline documentation previously approved at the CDR. PCAs will be conducted on both hardware and software configured products." 273

Audit plans, including goals, schedules, participants, contractor participation, and procedures, are documented in the configuration management (CM) plan (see 5.06 - SCMP - Software Configuration Management Plan).

When planning audits, it is important to remember that audits are samplings, not a look at every record.  It is also important to remember that auditors should not have any direct responsibility for the software products they audit.
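
Because audits are samplings, projects sometimes script the selection of records to review so the sample is unbiased and repeatable. The sketch below is a minimal illustration in Python; the CSV export, its columns (ci_id, version), and the sample size are hypothetical stand-ins for whatever the project's CM system actually provides.

    import csv
    import random

    def select_audit_sample(records_path, sample_size, seed=None):
        """Randomly select CM records to review during an audit.

        The file layout is a hypothetical export from the project's CM
        tool; adapt the column names to the real system.
        """
        with open(records_path, newline="") as f:
            records = list(csv.DictReader(f))
        rng = random.Random(seed)  # a fixed seed makes the sample reproducible
        return rng.sample(records, min(sample_size, len(records)))

    # Example: pull 25 randomly chosen records from the baseline export.
    for record in select_audit_sample("cm_records.csv", 25, seed=42):
        print(record["ci_id"], record["version"])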

The basic steps in an audit are:

  • Plan the audit.
  • Prepare for the audit.
  • Perform the audit.
  • Close out the audit, generating the audit report and addressing corrective actions.

The Department of Defense Configuration Management Guidance Handbook 351 includes tables of activities for audit planning, preparation, performance, and close-out (generating the audit report and addressing corrective actions). These tables address both the Government and contractor roles in these activities and can be tailored as applicable for a project.

The SMA (Safety and Mission Assurance) Technical Excellence Program (STEP) Level 2 Software Configuration Management and Data Management course taught by the Westfall Team 343 suggests that the following be included in an FCA:

  • "An audit of the formal test documentation against test data.
  • "An audit of the verification and validation (V&V) reports.
  • "A review of all approved changes.
  • "A review of updates to previously delivered documents.
  • "A sampling of design review outputs.
  • "A comparison of code with documented requirements.
  • "A review to ensure all testing was accomplished." 343
  • Additional sample testing or rerunning of tests, as appropriate for the project.
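
Some of these FCA items lend themselves to partial automation. As one illustration of the "review to ensure all testing was accomplished" item above, the Python sketch below cross-checks a traceability-matrix export against test results to flag requirements with no passing test; both file layouts (trace_matrix.csv and test_results.csv, with the column names shown) are assumptions, not a prescribed format.

    import csv
    from collections import defaultdict

    def untested_requirements(trace_csv, results_csv):
        """List requirements that have no passing test in the trace matrix.

        Assumed (hypothetical) layouts:
          trace_csv:   requirement_id,test_id
          results_csv: test_id,status   (status is 'pass' or 'fail')
        """
        passed = set()
        with open(results_csv, newline="") as f:
            for row in csv.DictReader(f):
                if row["status"].lower() == "pass":
                    passed.add(row["test_id"])

        tests_for = defaultdict(set)
        with open(trace_csv, newline="") as f:
            for row in csv.DictReader(f):
                tests_for[row["requirement_id"]].add(row["test_id"])

        # A requirement fails the check if none of its traced tests passed.
        return sorted(req for req, tests in tests_for.items()
                      if not tests & passed)

    for req in untested_requirements("trace_matrix.csv", "test_results.csv"):
        print("No passing test for:", req)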

The STEP 2 course suggests that the following be included in a PCA 343:

  • "An audit of the system specification for completeness [and removal of all to-be-determined (TBD)].
  • "An audit of the FCA report for discrepancies & actions taken.
  • "A comparison of the architectural design with the detailed design components for consistency.
  • "A review of the module listing for compliance with the approved coding standards.
  • "An audit of the manuals for format completeness & conformance to system & functional descriptions." 343
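
The coding-standards item above can similarly be spot-checked with an automated style checker rather than a purely manual review of module listings. The sketch below is one possibility for a Python code base, tallying flake8 findings per file; the src/ directory is a placeholder, and projects in other languages would substitute their own standards checker.

    import subprocess
    from collections import Counter

    def standards_violations(source_dir):
        """Tally flake8 findings per file as a coding-standards sample.

        Requires flake8 on the PATH; each output line looks like
        'path:line:col: CODE message'.
        """
        proc = subprocess.run(["flake8", source_dir],
                              capture_output=True, text=True)
        counts = Counter()
        for line in proc.stdout.splitlines():
            counts[line.split(":", 1)[0]] += 1
        return counts

    # Example: worst offenders first. "src/" is a placeholder path.
    for path, n in standards_violations("src/").most_common():
        print(f"{n:4d}  {path}")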

Additional audit topics to consider include:

  • As coded, software products reflect their design.
  • User documentation complies with standards as specified.
  • Activities have been conducted according to applicable requirements, plans, and contracts.

NASA/SP-2007-6105, NASA Systems Engineering Handbook, 273 includes a table showing the data typically reviewed during each of these audits; consult that handbook for the table.


Consider the following options when deciding when to conduct audits:

  • At the time a product is released.
  • Before delivery, to assure that all delivered products are complete, contain the proper versions and revisions, and that all discrepancies, open work, deviations, and waivers are properly documented and approved; can be an FCA or PCA.
  • At the end of a life-cycle phase per Capability Maturity Model Integration (CMMI).
  • Before the release of a new or revised baseline.
  • As the project progresses, to prevent finding major issues at the end, when they are more costly to fix, and to identify systemic issues, such as coding standard violations, that could affect large segments of the project.
  • Incrementally for very large, complex systems, focusing on specific functional areas, with a summary audit held to address the status of all identified action items (FCA).


When reporting the results of an audit, it is important to remain unbiased and include positive observations as well as issues found. Findings are grouped as major or minor depending on the range and effect of the non-conformance. 

Non-conformances result in corrective actions that address and correct the root cause of the non-conformances. Follow-up needs to be conducted to ensure the corrective actions were completed and are effective. 

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to configuration audits.

NASA-specific configuration management information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.  

Additional guidance related to configuration audits may be found in related requirements in this Handbook.

4. Small Projects

For projects with limited personnel, consider sharing lead auditors or audit team members among projects. Another suggestion is for members of small projects to conduct configuration audits of other small projects.

5. Resources

5.1 References


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Mars Climate Orbiter Mishap Investigation Board - Phase I Report. Lesson Number 0641 513: Configuration audits are called out as one of several mitigations for the root cause of the Mars Climate Orbiter (MCO) mishap. One of the recommendations is to "Conduct [a] software audit for specification compliance on all data transferred between JPL and [contractor]."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-084 - Configuration Audits
5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.

7.1 Tasking for Software Assurance

1. Confirm that the project manager performed software configuration audits to determine the correct version of the software configuration items and to verify that they conform to the records that define them.

7.2 Software Assurance Products

  • None at this time


Objective Evidence

  • Software problem reporting or defect tracking data
  • Software configuration management system data
  • Software assurance audit results of the change management processes
  • Software version description document(s)

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

  • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
  • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
  • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
  • Signatures on SA-reviewed or witnessed products or activities, or
  • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
    • To confirm that “IV&V Program Execution exists,” the summary might be: The IV&V Plan is in a draft state and is expected to be complete by (some date).
    • To confirm that “Traceability between software requirements and hazards with SW contributions exists,” the summary might be: x% of the hazards with software contributions are traced to the requirements.
  • The specific products listed in the Introduction of 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • #  of Configuration Management Audits conducted by the project – Planned vs. Actual.
  • # of Compliance Audits planned vs. # of Compliance Audits performed.
  • # of software work product Non-Conformances identified by life-cycle phase over time.
  • # of Non-Conformances identified in release documentation (Open, Closed).
  • # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project.
  • Trends of # Open vs. # Closed over time.
  • # of Non-Conformances per audit (including findings from process and compliance audits, process maturity).
  • # of Open vs. Closed Audit Non-Conformances over time.
  • Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.).
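
Several of these metrics fall directly out of the audit tracking data. As a minimal sketch, assuming non-conformance records exported with the month each was opened and (if resolved) closed, the Python below tallies the open-versus-closed trend; the record layout is hypothetical.

    from collections import defaultdict

    # Hypothetical export from the non-conformance tracking system.
    records = [
        {"opened": "2024-01", "closed": "2024-02"},
        {"opened": "2024-01", "closed": None},      # still open
        {"opened": "2024-02", "closed": "2024-02"},
    ]

    def open_vs_closed_by_month(records):
        """Count non-conformances opened and closed in each month."""
        opened = defaultdict(int)
        closed = defaultdict(int)
        for r in records:
            opened[r["opened"]] += 1
            if r["closed"]:
                closed[r["closed"]] += 1
        for month in sorted(set(opened) | set(closed)):
            print(month, "opened:", opened[month], "closed:", closed[month])

    open_vs_closed_by_month(records)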

7.4 Guidance

Software assurance should perform or participate in the project’s configuration audits, typically the Functional Configuration Audit (FCA) and the Physical Configuration Audit (PCA). An FCA examines the functional characteristics of the configured product and verifies that the product has met the requirements specified in its Functional Baseline documentation approved at the Preliminary Design Review (PDR) and Critical Design Review (CDR). A PCA is the formal examination of the "as-built" configuration of a configuration item against its technical documentation to establish or verify the configuration item's product baseline. The PCA done at the point of delivery establishes that all the products in the delivery baseline are actually present and are the correct versions, as documented in the Version Description Document (VDD).
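
The delivery-time version check described above is also amenable to scripting: compare what the VDD says should be in the delivery against what is actually present. The Python sketch below assumes a hypothetical manifest CSV (filename,sha256 columns) derived from the VDD; the real document format will differ by project.

    import csv
    import hashlib
    from pathlib import Path

    def audit_delivery(manifest_csv, delivery_dir):
        """Flag files that are missing from, or altered in, a delivery.

        manifest_csv layout (hypothetical): filename,sha256
        """
        delivery = Path(delivery_dir)
        with open(manifest_csv, newline="") as f:
            for row in csv.DictReader(f):
                path = delivery / row["filename"]
                if not path.exists():
                    print("MISSING :", row["filename"])
                    continue
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                if digest != row["sha256"]:
                    print("MISMATCH:", row["filename"])

    # Example with placeholder paths:
    audit_delivery("vdd_manifest.csv", "delivery/")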

Software Assurance personnel should sign off on the delivery documentation to indicate that the delivery is complete and meets its requirements. More detail on the FCA and PCA is found in the software guidance for this requirement (SWE-084).

Examples of checklists for a functional configuration audit, a physical configuration audit, and a baseline configuration audit are below:

                      Functional Configuration Audit Checklist

Asset Number:                         Approved by: (signature)
Effective Date:                       Name:
Expiration Date:                      Title:
Responsible Office:
Asset Type: Checklist
Title: Functional Configuration Audit Checklist
PAL Number:


The Functional Configuration Audit (FCA) Checklist is to be used by Configuration Management (CM) personnel when conducting audits for major software builds or final releases. A functional configuration audit verifies that the configuration item (CI) has been completed satisfactorily, that the item has achieved the specified performance and functional characteristics, and that its operational and support documents are complete and satisfactory.

During this audit, CM will analyze the project’s Requirements Traceability Matrix, Requirements documentation, Test reports, Problem reports, Change authorizations, and any Waivers or Deviations.

For each checklist item below, place a check (✓) in the box if the checklist item is fully satisfied. Place a note in Section 1 below describing any actions performed during the audit to resolve discrepancies and thereby satisfy the criteria. Any criteria that require an action item for future resolution and follow-up should be captured in Section 2, with the issues/non-compliances captured under “Issues and/or Comments”.

The completed checklist should be maintained as a record at the location defined in the project’s data management list (DML).

Date of Audit:    __________________________________________

Project Build/Release:    ___________________________________

Name of Auditor:  _________________________________

 

 

#    Checklist Item                                                      ✓    Issues and/or Comments

1.   Are all configuration items (as specified in the “Baselines Table”) available for this deliverable/release?
     Guidance: This should include such items as the Requirements Specifications, Software Requirements Traceability Matrix, Version Description Documents, software change requests, all test results, and any operational documents or manuals.

2.   Have all deviations or waivers to the requirements been considered and approved?

3.   Does the deliverable/release incorporate all approved changes?

4.   Was system/software testing completed?

5.   Were all requirements (functional and performance) verified?

6.   For those requirements that have not been verified or failed verification, has the status been captured (e.g., in a VDD)?

7.   Has analysis or simulation been conducted for any performance parameters that couldn’t be completely verified during testing?

8.   If analysis or simulations were performed, do the results support the requirement verification?

9.   Is the software requirements traceability matrix (SRTM) complete and bi-directional (i.e., is there evidence that all system/software requirements can be traced to the design, source code, and test procedures)?

10.  Has all associated documentation been placed under CM control per the project’s CM plan?

Section 1:  Notes/Actions performed during the audit to resolve any discrepancies to satisfy the criteria

#    Action

Section 2:  Action Items for future follow-up

#    Action    Assignee    Due Date

Notes

If using a different medium to capture action items, please provide the name and location of that asset: ____________________________________________________

Change History

Version    Approval Date    Description of Improvements

1.0                         Initial approved version by CCB

2.0
