

8.59 - Audit Reports

Return to 8.16 - SA Products

1. Introduction

A main software assurance objective is to provide a level of confidence that the software produced is error-free and performs as intended. Thus, software assurance personnel are involved in projects throughout the entire software development life cycle: monitoring the development activities, ensuring that the correct standards, processes, and procedures are being followed, and evaluating product quality as the product is developed. Audits are a common tool used to assess the quality of the software products and the level of compliance with the process requirements.


The Audit Reports topic focuses on the many aspects of software auditing and how to report the results. Since audits and auditing are discussed in many areas of this Handbook, this topic provides links to other areas of the Handbook where the information is already located. For example, Topic 8.12 - Basics of Software Auditing contains an extensive description of the basic process of planning and conducting an audit, along with reporting the results.

The information in this topic is divided into several tabs as follows:

  • Tab 1 – Introduction.
  • Tab 2 – Audit requirements and recommendations in the Software Assurance and Software Safety Standard (NASA-STD-8739.8).
  • Tab 3 – Audit Metrics that are typically collected or associated with audits and their use.
  • Tab 4 – Audit Report Contents discusses the communication of the audit results.
  • Tab 5 – Resources for this topic including a table of available checklists.

1.1 Additional Guidance

Links to Additional Guidance materials for this subject have been compiled in the Relevant Links table. See the Additional Guidance in the Resources tab.

See also SWE-022 - Software Assurance for Audit Reports listed in SA Work Products. 

2. Audits

Auditing is one tool used by software assurance and software safety personnel to determine whether projects are in compliance with governing requirements and standards, as well as with project processes and procedures. Thus, the Software Assurance and Software Safety (SASS) Standard (NASA-STD-8739.8) lists requirements for many types of audits and required tasks that could be performed by auditing (e.g., compliance assessments).

There are three basic types of audit requirements – targeted, generic, and those that satisfy other requirements. Targeted audits focus on one specific process, product, or activity (e.g., software configuration audits). Generic audits are more general in nature and may call for audits of standards, processes, procedures, and practices; satisfying them may require multiple audits spanning the life cycle to ensure all aspects are covered (e.g., software development audits). The third category comprises those requirements in the SASS Standard listed as “confirm” or “assess” that can be easily satisfied by performing an audit (e.g., the SASS tasking for SWE-139, which calls for assessing compliance with NPR 7150.2). All three types of audits should be considered when planning the schedule of audits for a project. All project audits should be planned in conjunction with the activities occurring in that time period and need to be coordinated with the project schedule. For example, it does not make sense to plan an audit to evaluate the test procedures during the planning phase, when they have not yet been developed.

As mentioned previously in the Introduction, there are many good resources on planning, conducting, and closing out an audit in Topic 8.12 - Basics of Software Auditing. As part of the planning for an audit, checklists should be generated or used. To make this easier, many checklists are available in this Handbook as well as in Center asset libraries, which can be accessed from the Software Processes Across NASA (SPAN) site in the NASA Engineering Network (NEN) Software Community of Practice (CoP). A list of audit checklists found in this Handbook can be found in Tab 5.3. As part of the audit preparation, checklists should be reviewed, as they may need to be modified to fit the specific tailored, approved processes that are in place for the project.

After the audit is conducted, a key part of any audit involves the post-audit activities and actions. Any type of non-conformance found during an audit, including findings, issues, defects, or errors, is recorded in a tracking system, and these non-conformances are tracked to closure by the audit team. An analysis or assessment of the non-conformances should be performed and compared with previous or similar audit results to determine whether there are systemic or recurring problems, which might require a closer look.
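As an illustration of this kind of tracking, the sketch below shows one minimal way a non-conformance record might be represented so that open items can be tracked to closure. The field names and structure are assumptions for illustration, not a NASA-prescribed schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class NonConformance:
    """One audit non-conformance tracked to closure (illustrative schema)."""
    identifier: str            # e.g., "AUD-2024-001-NC3" (hypothetical ID format)
    audit_id: str              # the audit that surfaced this item
    description: str
    severity: str              # e.g., "major", "minor", "observation"
    opened: date
    closed: Optional[date] = None   # None while the item remains open

    @property
    def is_open(self) -> bool:
        return self.closed is None

def open_items(records: list[NonConformance]) -> list[NonConformance]:
    """Return the items the audit team still needs to track to closure."""
    return [nc for nc in records if nc.is_open]
```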

Table 1 below lists the required audits on a particular process, activity, or product along with the associated SWE requirement number. Table 2 lists some of the SASS Tasks where an audit might be one way to satisfy the requirement but is not the only method that could be used.

Table 1: Audit Requirements in the Software Assurance and Software Safety Standard (NASA-STD-8739.8). Each entry lists the SWE number, the NPR 7150.2 requirement, and the associated NASA-STD-8739.8 Software Assurance and Software Safety (SASS) tasks.

SWE-084
NPR 7150.2 Requirement: 5.1.7 The project manager shall perform software configuration audits to determine the correct version of the software configuration items and verify that they conform to the records that define them.
SASS Tasks:
1. Confirm that the project manager performed software configuration audits to determine the correct version of the software configuration items and verify that the results of the audit conform to the records that define them.

SWE-077
NPR 7150.2 Requirement: 4.6.3 The project manager shall complete and deliver the software product to the customer with appropriate records, including as-built records, to support the operations and maintenance phase of the software’s life cycle.
SASS Tasks:
1. Confirm that the correct version of the products is delivered, including as-built documentation and project records.
2. Perform audits for all deliveries per the configuration management processes to verify that all products are being delivered and are the correct versions.

SWE-045
NPR 7150.2 Requirement: 5.1.9 The project manager shall participate in any joint NASA/developer audits.
SASS Tasks:
1. Participate in or assess the results from any joint NASA/developer audits. Track any findings to closure.

SWE-088
NPR 7150.2 Requirement: 5.3.3 The project manager shall, for each planned software peer review or software inspection:
a. Use a checklist or formal reading technique (e.g., perspective-based reading) to evaluate the work products.
b. Use established readiness and completion criteria.
c. Track actions identified in the reviews until they are resolved.
d. Identify the required participants.
SASS Tasks:
3. Perform audits on the peer-review process.

SWE-086
NPR 7150.2 Requirement: 5.2.1 The project manager shall record, analyze, plan, track, control, and communicate all of the software risks and mitigation plans.
SASS Tasks:
2. Perform audits on the risk management process for the software activities.

SWE-039
NPR 7150.2 Requirement: 3.1.8 The project manager shall require the software developer(s) to periodically report status and provide insight into software development and test activities; at a minimum, the software developer(s) will be required to allow the project manager and software assurance personnel to:
1. Monitor product integration.
2. Review the verification activities to ensure adequacy.
3. Review trade studies and source data.
4. Audit the software development processes and practices.
5. Participate in software reviews and technical interchange meetings.
SASS Tasks:
5. Perform audits on software development processes and practices at least once every two years.
6. Develop and provide status reports.
7. Develop and maintain a list of all software assurance review discrepancies, risks, issues, findings, and concerns.

SWE-195
NPR 7150.2 Requirement: 4.6.5 The project manager shall maintain the software using standards and processes per the applicable software classification throughout the maintenance phase.
SASS Tasks:
1. Perform audits on the standards and processes used throughout maintenance based on the software classification.

SWE-085
NPR 7150.2 Requirement: 5.1.8 The project manager shall establish and implement procedures for the storage, handling, delivery, release, and maintenance of deliverable software products.
SASS Tasks:
2. Perform audits on the project to ensure that the project follows defined procedures for deliverable software products.

SWE-032
NPR 7150.2 Requirement: 3.9.2 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:
1. For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
2. For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.
SASS Tasks:
3. Perform audits on the software development and software assurance processes.


Table 2: Requirements in NASA-STD-8739.8 that might be satisfied with the use of an audit. Each entry lists the SWE number, the NPR 7150.2 requirement, and the associated Software Assurance and Software Safety tasks.

SWE-139
NPR 7150.2 Requirement: 3.1.11 The project manager shall comply with the requirements in this NPR that are marked with an “X” in Appendix C consistent with their software classification.
SASS Tasks:
1. Assess that the project's software requirements, products, procedures, and processes are compliant with the NPR 7150.2 requirements per the software classification and safety criticality for software.

SWE-079
NPR 7150.2 Requirement: 5.1.2 The project manager shall develop a software configuration management plan that describes the functions, responsibilities, and authority for the implementation of software configuration management for the project.
SASS Tasks:
1. Assess that a software configuration management plan has been developed and complies with the requirements in NPR 7150.2 and Center/project guidance.

SWE-016
NPR 7150.2 Requirement: 3.3.1 The project manager shall document and maintain a software schedule that satisfies the following conditions:
1. Coordinates with the overall project schedule.
2. Documents the interactions of milestones and deliverables between software, hardware, operations, and the rest of the system.
3. Reflects the critical dependencies for software development activities.
4. Identifies and accounts for dependencies with other projects and cross-program dependencies.
SASS Tasks:
1. Assess that the software schedule satisfies the conditions in the requirement.

SWE-024
NPR 7150.2 Requirement: 3.1.4 The project manager shall track the actual results and performance of software activities against the software plans.
1. Corrective actions are taken, recorded, and managed to closure.
2. Changes to commitments (e.g., software plans) that have been agreed to by the affected groups and individuals are taken, recorded, and managed.
SASS Tasks:
1. Assess plans for compliance with NPR 7150.2 requirements, NASA-STD-8739.8, including changes to commitments.

SWE-013
NPR 7150.2 Requirement: 3.1.3 The project manager shall develop, maintain, and execute software plans, including security plans, that cover the entire software life cycle and, as a minimum, address the requirements of this directive with approved tailoring.
SASS Tasks:
1. Confirm that all plans, including security plans, are in place and have expected content for the life cycle events, with proper tailoring for the classification of the software.
2. Develop and maintain a Software Assurance Plan following the content defined in NASA-HDBK-2203 for a software assurance plan, including software safety.

Note: This is not a complete list but is meant to provide some examples.

2.1 Additional Guidance

Links to Additional Guidance materials for this subject have been compiled in the Relevant Links table. See the Additional Guidance in the Resources tab.

3. Audit Metrics

Audit metrics may be collected and reported at the organization or project level. Those collected at the organization level may be the result of organization-level audits or the roll-up of audit data from individual projects to see how the organization is performing as a whole. The audit metrics collected at the project level are specific to the project and should provide insight into the health and status of the project.

Note: Projects include any projects where NASA may be performing audits (NASA in-house, contracted, or external commercial projects where NASA is providing some evaluation). Ideally, software assurance for external commercial projects should follow these practices also.

The audit metrics that are tracked and reported will depend on the specific metrics chosen by the audit team. The audit team may come from the project or the organization, or may be independent third-party auditors for commercial ventures. The audit team must select or develop the audit metrics that best reflect the goals of the audit to be performed. Organizations or projects may have a pre-defined set of metrics, making the selection process straightforward. Typically, only a few audit metrics are selected, as the most important things to come out of an audit are the findings and observations themselves.

Topic 8.18 - SA Suggested Metrics provides a list of potential metrics including audit metrics that could be selected for specific requirements and are sorted by the type of metrics they support (e.g., Peer Review Process Audits, Compliance Audits). This is not an exhaustive list and may not cover all of the possible audit metrics that need to be collected and reported.

When selecting metrics, the audit team may want to look at the data from different aspects. Selecting a few high-level metrics will provide overall status information. Some of these should be chosen to give the management team a quick view of the “state of the project” or the “state of the Software Assurance work” being done. A couple of examples of high-level audit metrics are: # of audits performed vs. # of audits planned, and # of Open vs. Closed Audit Non-Conformances over time.

Selecting a few low-level metrics will attest to the quality of the work that was performed. The lower-level metrics should be chosen to give the management team insight into how well the projects and organizations are following the NASA requirements and standards as well as their organizational and project processes and procedures. A couple of examples of low-level audit metrics are: # of Non-Conformances identified in the CM Plan, and # of audit Non-Conformances per peer review audit.
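As an illustration, here is a minimal sketch of how the example metrics above might be computed from simple audit records; the record layout and the values are assumptions for illustration only:

```python
from datetime import date

# Illustrative audit log: (audit_id, planned, performed).
audits = [
    ("AUD-01", True, True),
    ("AUD-02", True, True),
    ("AUD-03", True, False),  # planned but not yet performed
]

# Illustrative non-conformance records: (opened, closed or None if still open).
non_conformances = [
    (date(2024, 3, 1), date(2024, 4, 2)),
    (date(2024, 3, 15), None),
    (date(2024, 5, 9), None),
]

planned = sum(1 for _, is_planned, _ in audits if is_planned)
performed = sum(1 for _, _, is_done in audits if is_done)
print(f"Audits performed vs. planned: {performed}/{planned}")

open_count = sum(1 for _, closed in non_conformances if closed is None)
closed_count = len(non_conformances) - open_count
print(f"Audit non-conformances: {open_count} open vs. {closed_count} closed")
```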

Some metrics examples that could be chosen:

3.1 Audits of Software Assurance:

  • # of audits performed versus # of audits planned
  • # of Peer Review Audits planned vs. # of Peer Review Audits performed
  • # of Compliance Audits planned vs. # of Compliance Audits performed
  • Trends on non-conformances from audits (Open, Closed, life cycle Phase)
  • Time required to close audit Non-Conformances
  • Time each audit participant spent preparing for the audit

3.2 Peer Review Process Audit Metrics:

  • # of audit Non-Conformances per Peer Review Audit
  • # of audit Observations per Peer Review Audit
  • # of Peer Review Audits planned vs. # of Peer Review Audits performed
  • Trends on non-conformances from audits (Open, Closed, Life cycle Phase)
  • Time required to close Peer Review Audit Non-Conformances
  • Time each audit participant spent preparing for the audit

3.3 Compliance Audit Metrics:

  • # of Compliance Audits planned vs. # of Compliance Audits performed
  • # of Open vs. Closed Audit Non-Conformances over time
  • Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)
  • # of Non-Conformances identified in plans (e.g., SMPs, SDPs, CM Plans, SA Plans, Safety Plans, Test Plans)
  • # of Non-Conformances identified in the software Configuration Management Plan
    • Trends of # Open vs. # Closed over time
  • # of Configuration Management Audits conducted by the project – Planned vs. Actual
  • # of Non-Conformances per audit (including findings from process and compliance audits, process maturity)
  • # of process Non-Conformances (e.g., activities not performed) identified vs. # accepted by the project
    • Trends of # Open vs. # Closed over time
  • # of process Non-Conformances identified in the Risk Management process

3.4 Project health and status based on Audit metrics:

  • # of process Non-Conformances (e.g., activities not performed) identified vs. # accepted by the project
  • # of Compliance Audits planned vs. # of Compliance Audits performed
  • # of Open vs. Closed Audit Non-Conformances over time
  • Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)

3.5 SA value in Peer Reviews:

  • # of non-conformances found in each peer review
  • # of non-conformances found by SA
  • # of non-conformances accepted by the project
  • Trend of open non-conformances vs. closed non-conformances over time (SA analysis of trends could show project problems if the # of closed non-conformances continues to lag further and further behind the # of open non-conformances; see the sketch below)
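A minimal sketch of the trend analysis described in the last bullet, flagging a backlog of open non-conformances that grows month over month; the monthly counts and the growth test are illustrative assumptions:

```python
# Cumulative counts of opened and closed non-conformances by month (illustrative).
opened_by_month = [4, 9, 15, 22, 30]
closed_by_month = [2, 5, 7, 8, 9]

# Open backlog each month: cumulative opened minus cumulative closed.
backlog = [o - c for o, c in zip(opened_by_month, closed_by_month)]
print("Backlog trend:", backlog)  # [2, 4, 8, 14, 21]

# If the backlog grows every month, closures are falling further and further
# behind -- the pattern SA analysis would flag as a potential project problem.
if all(later > earlier for earlier, later in zip(backlog, backlog[1:])):
    print("Warning: closed non-conformances are lagging further behind open ones.")
```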

3.6 Observations of Project status based on Audit metrics:

  • # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project
  • # of Compliance Audits planned vs. # of Compliance Audits performed
  • # of Open vs. Closed Audit Non-Conformances over time
  • Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)

3.7 Safety issues and non-conformances:

  • # of safety-related requirements issues (over time, open, closed)
  • # of safety-related non-conformances identified by life cycle phase (over time, open, closed)

3.8 Display Data With Charts

Simple charts are a good way to show the data for discussion when reporting status, for example, trend lines of open vs. closed non-conformances or bar charts of planned vs. performed audits.
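As an illustration, a trend chart of open vs. closed non-conformances can be produced with a few lines of matplotlib; the data values below are illustrative:

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May"]
opened = [4, 9, 15, 22, 30]  # cumulative non-conformances opened (illustrative)
closed = [2, 5, 7, 8, 9]     # cumulative non-conformances closed (illustrative)

plt.plot(months, opened, marker="o", label="Opened (cumulative)")
plt.plot(months, closed, marker="s", label="Closed (cumulative)")
plt.title("Audit Non-Conformances: Open vs. Closed Over Time")
plt.ylabel("# of non-conformances")
plt.legend()
plt.tight_layout()
plt.show()
```

A widening gap between the two lines is the visual counterpart of the backlog trend discussed in Section 3.5.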


3.9 Additional Guidance

Links to Additional Guidance materials for this subject have been compiled in the Relevant Links table. See the Additional Guidance in the Resources tab.

4. Audit or Assessment Report Content


Audit results should be reported in a high-level summary and conveyed as part of weekly or monthly SA Status Reports. The high-level summary should provide an overall evaluation of the audit, any associated risks, and thoughts on the health and status of the project or organization audited.

When an audit is conducted, it may be necessary to provide a detailed report of the results to the management team outside of the normal status reporting cycle. This will allow the management team to prioritize and delegate the necessary corrections. If a time-critical issue is uncovered, it should be reported to management immediately so that the affected organization may begin addressing it at once.

When a project has safety-critical software, audit results should be shared with the Software Safety personnel. The results of audits conducted by Software Assurance personnel and those done by Software Safety personnel may be combined into one report, if desired.

Per SWE-201 – SASS Task 1 and SWE-039 – SASS Task 8, all audit findings and observations are documented in a problem/issue tracking system and tracked to closure. These items are communicated to the affected organization’s personnel, and possible solutions are discussed.

4.1 Minimum Recommended Content for Audit Summary

When reporting the results of an audit for an SA Status Report, the following defines the minimum recommended content (a sketch for capturing this content follows the list):

  1. Identification of what was audited: Mission/Project/Application
  2. Audit Type/Subject (e.g., Peer Review, Process, CM baseline)
  3. Group or department (Branch, Division, Project or subset, etc.) being audited
  4. Overall evaluation of audit subject, based on audit observations/results. Include thoughts on the health and status of the project or organization audited.
  5. Major findings and associated risk – The detailed reporting should include where the finding was discovered and an estimate of the amount of risk involved with the finding.
  6. Observations – This should include important observations such as any systemic issues
  7. Current status of findings: open/closed; projection for closure timeframe
  8. Optional: include metrics charts showing other details of audit findings
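
A minimal sketch of how this summary content might be captured and rendered for inclusion in an SA status report; the field names and layout are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class AuditSummary:
    """Minimum recommended audit summary content (illustrative structure)."""
    audited_item: str          # mission/project/application
    audit_type: str            # e.g., "Peer Review", "Process", "CM baseline"
    audited_group: str         # branch, division, project or subset, etc.
    overall_evaluation: str    # includes health/status assessment
    major_findings: list[str]  # each with location and risk estimate
    observations: list[str]    # including any systemic issues
    findings_open: int
    findings_closed: int
    closure_projection: str    # projected closure timeframe

    def render(self) -> str:
        """Format the summary as plain text for a weekly/monthly SA status report."""
        return "\n".join([
            f"Audit of {self.audited_item} ({self.audit_type}) - {self.audited_group}",
            f"Overall evaluation: {self.overall_evaluation}",
            "Major findings: " + "; ".join(self.major_findings),
            "Observations: " + "; ".join(self.observations),
            f"Findings status: {self.findings_open} open / {self.findings_closed} closed"
            f" (projected closure: {self.closure_projection})",
        ])
```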

4.2 Minimum Recommended Content for Detailed Audit Report

When reporting the detailed results of an audit, the following defines the minimum recommended content:

  1. Identification of what was audited: Mission/Project/Application
  2. Audit Type/Subject (e.g., Peer Review, Process, CM baseline)
  3. Person or group performing the audit(s)
  4. Period/timeframe/phase during which the audit was performed
  5. Documents used in the audit (e.g., requirements version)
  6. Group or department (Branch, Division, Project or subset, etc.) being audited
  7. Description of techniques used (checklists, interviews, comparisons, etc.)
  8. Overall evaluation of audit subject, based on audit observations/results
  9. Major findings and associated risk – The detailed reporting should include where the finding was discovered and an estimate of the amount of risk involved with the finding.
  10. Minor findings
  11. Observations – This should include any systemic issues
  12. Current status of findings: open/closed; projection for closure timeframe
  13. Optional: include metrics charts showing other details of audit findings

5. Resources

5.1 References


5.2 Tools 

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

5.3 Audit Checklists Process Asset Templates 

Click on a link to download a usable copy of the template. (AudCK)



5.4 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

5.5 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 


