SWE-045 - Project Participation in Audits

1. Requirements

5.1.9 The project manager shall participate in any joint NASA/developer audits. 

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-045 - Last used in rev NPR 7150.2D

Rev | SWE Statement

A | 2.6.2.3 The project shall participate in any joint NASA/contractor audits of the software development process and software configuration management process.

Difference between A and B | No change

B | 3.13.2 The project manager shall participate in any joint NASA/supplier audits of the software development process and software configuration management process.

Difference between B and C | Changed "supplier" to "developer"; expanded scope by removing "of the software development process and software configuration management process."

C | 5.1.9 The project manager shall participate in any joint NASA/developer audits.

Difference between C and D | No change

D | 5.1.9 The project manager shall participate in any joint NASA/developer audits.



1.3 Applicability Across Classes

Class | A | B | C | D | E | F
Applicable? | | | | | |

Key: ✓ - Applicable | ✗ - Not Applicable


1.4 Related Activities

This requirement is related to the following Activities:

2. Rationale

This requirement ensures that the project manager is actively involved in the oversight, coordination, and resolution of issues identified during joint NASA/developer audits. Effective audits are critical for assessing software development progress, ensuring compliance with standards and requirements (e.g., NPR 7150.2), evaluating risks, and identifying improvement opportunities. The active participation of the project manager provides leadership, accountability, and an effective means of addressing concerns early in the project.


1. Ensuring Alignment Between NASA and the Developer

The project manager plays a pivotal role in ensuring that the project’s goals, objectives, and requirements are aligned between NASA and the developer. Joint audits offer opportunities for both parties to:

  • Validate shared understanding of project deliverables, milestones, and quality expectations.
  • Assess whether the developer’s processes adhere to the project's technical, contractual, and safety-critical requirements.
  • Jointly resolve misalignments or differing interpretations of requirements, ensuring that these issues do not create larger risks later in the project lifecycle.

2. Promoting Ownership and Leadership

The project manager is responsible for the overall success of the project. Their participation underscores:

  • Accountability: Demonstrating a commitment to ensuring the quality and compliance of project deliverables.
  • Leadership: Facilitating discussions or decisions that emerge from audit findings. The project manager is often in the best position to escalate high-priority issues or balance competing constraints such as cost, schedule, and quality.
  • Decision-Making: Joint audits may require immediate decisions on findings and corrective actions (e.g., resolving non-conformances or adapting project plans). The project manager’s involvement ensures informed decisions are made efficiently.

3. Risk Mitigation

Audits can uncover risks in the developer’s software development processes, tools, or environments. Active participation by the project manager ensures:

  • Early Identification: Risks or process weaknesses are brought to the attention of leadership immediately for mitigation.
  • Corrective Actions: Follow-through on remediation plans or process improvements identified during the audit.
  • Risk Assessment Continuity: Both NASA and the developer maintain a shared risk perspective, ensuring thorough documentation and ownership of risks throughout the project lifecycle.

4. Facilitating Communication and Transparency

Joint audits are critical points of communication between NASA and the developer. Participation by the project manager ensures:

  • Transparency: Open, collaborative discussions between NASA and the developer about the project's challenges, progress, or changes.
  • Stakeholder Engagement: Ensuring all stakeholders are engaged in the review and resolution of findings, building trust and fostering collaboration.
  • Clear Feedback Loop: Aligning feedback from the audit with project priorities through direct, real-time communication with senior leadership.

5. Adherence to Contractual Obligations and NASA Policies

For software projects with contracted developers, joint NASA/developer audits are often explicitly stated in contracts or agreements as a way to:

  • Verify adherence to contractual deliverables, milestones, and performance metrics.
  • Confirm compliance with NASA standards, such as NPR 7150.2, NASA-STD-8739.8, and other relevant Center or project-specific directives.
  • Fulfill NASA’s policy on mission assurance and accountability in software projects.

6. Avoiding Audit Disconnects

Developer audits without project manager involvement can lead to:

  • Missed Issues: Key risks or non-compliances may not surface to project leadership, increasing the likelihood of costly errors later in the lifecycle.
  • Gaps in Responsibility: Lack of clarity on how findings impact project deliverables or whether they are being addressed effectively.
  • Unresolved Findings: Without project leadership present during audits, non-conformance items may not be tracked and resolved promptly due to a lack of prioritization or coordination.

7. Supporting Continuous Improvement

Participating in joint audits allows the project manager to promote iterative process improvement for both the developer and NASA teams:

  • Identify recurring issues in development or management and improve procedural controls to prevent audit findings.
  • Evaluate whether lessons learned from past projects are being applied effectively.
  • Facilitate discussions on process maturity and adherence to Quality Management Systems (QMS) through audits, enabling a culture of continuous improvement.

Why the Project Manager Specifically?

The project manager is uniquely positioned within the organizational hierarchy to bridge technical, managerial, and contractual considerations. Their involvement ensures that:

  1. Issues impacting the project's success—on cost, schedule, or scope—are addressed with the necessary authority.
  2. Resource allocation decisions (e.g., implementing corrective actions) are supported by project-level oversight.
  3. Stakeholder concerns, including those of NASA, contractors, and developers, are addressed collaboratively throughout the audit process.

Key Outcomes of Project Manager Participation in Joint Audits

  1. Audit Issues Are Taken Seriously: The project manager’s presence signals the criticality of findings and ensures they are resolved appropriately.
  2. Stakeholders Are Aligned: Ensures NASA-specific expectations are clearly communicated to and met by the developer.
  3. Project Success Is Secured: Findings during audits contribute to improving software quality, mission safety, and alignment with project goals.

Lessons Learned Supporting This Rationale

  1. Case Study: ISS Software Configuration Management (LL No. 1130)
    A lack of visibility into software developer processes and inadequate joint oversight contributed to inefficiencies and delays when resolving on-orbit anomalies. Active project management participation in such audits can help mitigate these problems.

  2. Case Study: Redundant Verification of Timing Errors (LL No. 559)
    An overlooked defective software patch led to mission failure, emphasizing the importance of thorough, joint verifications. Project managers present during audits ensure critical findings are tracked and resolved.


Conclusion

The requirement for the project manager’s participation in joint NASA/developer audits ensures alignment, accountability, and proactive management of risks and issues affecting software development. It improves communication, fosters transparency, and demonstrates NASA’s commitment to the success of the project and the quality of deliverable software.


IEEE Std 1028-2008, Software Reviews and Audits

8.1 Introduction to Audits
"The purpose of a software audit is to provide an independent evaluation of conformance of software products and processes to applicable regulations, standards, guidelines, plans, specifications, and procedures."
Audits are part of the supplier/provider monitoring activities performed by the acquirer, but they may also be external audits, internal audits, or other types of audits.

To avoid surprises resulting from audits, project personnel need to know ahead of time that an audit will be occurring.

Audits are conducted by audit teams and require the participation and cooperation of the personnel involved with the software being audited, both acquirer and provider personnel, including contractors, as appropriate for the particular audit being performed.

3. Guidance

3.1 Project Participation in Audits

The intent of this requirement is to ensure projects actively support and participate in audits involving any portion of the software lifecycle, including development, testing, delivery, and maintenance phases. This participation promotes transparency, strengthens communication between stakeholders (NASA, contractors, independent auditors), and ensures software products meet mission goals, project objectives, and compliance with applicable standards, such as NPR 7150.2.

Participation in audits can provide substantial benefits to both the project and auditors by leveraging project-specific insights, domain knowledge, and technical expertise. This requirement is not prescriptive about the degree of involvement but emphasizes the necessity for project teams—including project management, software engineering, and assurance teams—to engage meaningfully at an appropriate level.


Key Objectives of Project Participation in Audits:

  1. Ensure Compliance: Verify that project processes, contract deliverables, and standards (e.g., software safety, development plans) conform to NASA's policies, project specifications, and mission objectives.
  2. Promote Collaboration: Facilitate communication and cooperation between the project, software suppliers, and auditors to ensure smooth and productive audit processes.
  3. Address Risks Early: Leverage participation in audits to identify risks, non-conformances, and process gaps early, addressing them before they escalate into larger project challenges.
  4. Enhance Knowledge Sharing: Enable audit teams to benefit from project-specific expertise, ensuring that audit findings are specific, practical, and aligned with project context.
  5. Support Continuous Improvement: Use audit findings to improve software processes, mitigate recurring inefficiencies, and strengthen future deliverables.

Guidance for Implementing This Requirement

1. Types of Audits Requiring Participation

Projects are expected to participate in a broad range of software-related audits. Types of audits include, but are not limited to:

  • Internal Audits: Audits initiated by NASA to ensure compliance with internal standards, NPR 7150.2, and project-specific processes.
  • Supplier/Contractor Audits: Supplier-driven audits that validate adherence to contractual agreements, including software deliverables and acceptance criteria.
  • Independent/External Audits: Formal assessments conducted by third-party entities or independent verification and validation (IV&V) teams to ensure compliance with quality, reliability, safety, and security standards.
  • Process Audits: Audits of software engineering processes, practices, and lifecycle activities such as configuration management, testing, and maintenance.
  • Product Audits: Audits verifying that specified software products and documentation meet technical and functional requirements.
  • Contractual Audits: Audits formalized in supplier contracts, such as acceptance reviews, acceptance testing, and joint reviews.

2. Levels of Project Involvement in Audits

Project participation in audits may take several forms, and the degree of involvement is flexible based on the project's context and the type of audit being conducted. Levels of involvement include:

  • Observer Role: The project is briefed during the audit process, receiving findings or updates as the audit progresses.
  • Active Participation: Project personnel provide inputs, observations, and responses throughout the audit process, helping address issues in real time.
  • Support Role: Projects support the audit team by making personnel available for interviews, preparing process documentation, or providing technical data.
  • Leadership Role: Key project personnel, such as the project manager or assurance lead, guide the audit team by organizing scope, coordinating activities, and resolving high-priority issues.

The appropriate level of involvement depends on the criticality of the audit, its compliance risks, and its potential impact on project deliverables.


3. Early Integration of Audit Requirements in Contracts

For audits involving software suppliers, it is critical to incorporate the right of NASA project personnel to participate directly in supplier-conducted audits into contracts and project plans (e.g., RFP, SOW). This ensures:

  • Suppliers are contractually obligated to allow access to their audit processes.
  • Joint audits are planned well in advance during project initiation stages.
  • NASA's expectations for project oversight are unambiguously defined.

Best Practices for Early Integration:

  • Define participation expectations and responsibilities in the Statement of Work (SOW) early in the acquisition phase.
  • Specify audit types and milestones in the contract, such as formal configuration audits, acceptance audits, and compliance reviews.
  • Ensure that contractors are aligned with NASA’s software policies, including NPR 7150.2 and NASA-STD-8739.8.

For more details, refer to Topic 7.03 - Acquisition Guidance.


4. Roles and Responsibilities for Project Personnel

Projects must ensure that qualified personnel are prepared and assigned to audit participation as needed. Involvement may include individuals with technical, management, or assurance responsibilities. Typical roles include:

  • Project Manager: Oversees the project’s participation in audits, ensuring proper resources are allocated and findings are addressed.
  • Software Engineers: Provide domain knowledge on processes, tools, and technical requirements to the auditing team.
  • Systems Engineers: Facilitate the alignment of software-related audit findings with system-level goals.
  • Software Assurance Personnel: Validate software safety, quality, and compliance during the audit as specified in the Software Assurance Plan (NASA-STD-8739.8).
  • Configuration Managers: Support audits focused on software configuration management to ensure baselines, changes, and procedures are properly tracked.

Preparation Checklist for Audit Participants:

  1. Understand the purpose and scope of the audit.
  2. Review relevant software artifacts (e.g., requirements, plans, test reports, code repositories).
  3. Prepare any requested materials, such as documentation or software traceability matrices.
  4. Be prepared to answer audit questions related to specific project processes, tools, or quality assurance measures.

5. Best Practices for Effective Participation

To maximize the benefits of project participation in audits, projects should adopt the following practices:

  • Proactive Involvement: Rather than reactively supporting audits, aim to engage early and collaboratively with audit teams.
  • Timely Preparation: Identify personnel and prepare documentation or data well ahead of scheduled audits to avoid delays.
  • Collaboration Across Teams: Foster cooperation between NASA and contractor personnel by maintaining transparency around audit goals and findings.
  • Audit Issue Resolution: Participate in post-audit reviews to resolve non-conformances (e.g., open findings, risks) and incorporate corrective actions into project plans.
  • Continuous Improvement: Use audit outcomes to improve internal processes, standards adherence, and team alignment.

For additional process insights, see Topic 8.12 - Basics of Software Auditing.


6. Lessons Learned Supporting Audit Participation

  1. Case Lesson: Lack of Early Audit Participation Causes Delays:
    • A NASA project that delayed participation in contractor acceptance testing audits allowed risks to go undetected, resulting in costly rework.
    • Early participation can help identify and mitigate risks proactively.
  2. Lesson Learned: Strong Collaboration Reduces Non-Conformance Risks:
    • Projects where NASA engineers regularly assisted contractors in audits saw fewer discrepancies during milestone reviews.
    • These projects benefited from improved communication and aligned quality controls.

7. Key Benefits of Project Participation in Audits

  • Improves Quality: Project knowledge enhances audit results and ensures non-conformances are applicable and actionable.
  • Builds Trust: Transparency during audits fosters collaboration between NASA, contractors, and external organizations.
  • Mitigates Risks Early: Early detection of compliance gaps prevents escalated issues during later project phases.
  • Ensures Deliverable Compliance: Audits aligned with contract terms ensure software meets technical requirements and acceptance criteria.
  • Promotes Process Improvement: Feedback from audits encourages better lifecycle practices and lessons learned for future missions.

Conclusion

The project’s active participation in internal and external audits is vital to mission success, ensuring software compliance, quality, and alignment with NASA's technical and safety standards. By preparing personnel, incorporating audit requirements into contracts, and fostering collaboration during audits, project teams can leverage these opportunities to enhance software integrity and significantly reduce development risks.

See Also Topic 8.12 - Basics of Software Auditing.

ISO/IEC 12207, IEEE Std 12207-2008, 2008 - Systems and software engineering - Software life cycle processes

Para 6.1.2.3.4.13
"The supplier shall conduct or support ... informal meetings, acceptance review, acceptance testing, joint reviews, and audits with the acquirer as specified in the contract and project plans."

If these audits involve a software supplier, requirements to allow acquirer project personnel to participate, as described above, need to be incorporated into the contract because the contract is the binding document for contractor performance and deliverables. Therefore, this NPR 7150.2 requirement needs to be considered during the earliest phases of a project when the Request for Proposals (RFP), the Statement of Work (SOW), and the contract are being developed.

See also Topic 7.03 - Acquisition Guidance

3.2 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.3 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

For smaller projects, where resources, budgets, and team sizes are limited, it can be challenging to engage in audits while maintaining project progress. However, participation in audits is vital to ensuring software quality, compliance with NASA standards, and achieving project objectives. This guidance is tailored for small projects to help streamline audit participation effectively and within resource constraints.


Small Project Audit Participation Strategy

1. Streamline Roles and Responsibilities

  • In small projects, team members often play multiple roles. Assign a primary point of contact (POC) for audit participation, and delegate responsibilities to others when appropriate. Suggested roles include:
    • Project Manager: Focus on coordinating audit participation, tracking findings, and resolving issues.
    • Software Lead: Provide input on software-related technical aspects, including processes, code baselines, and testing artifacts.
    • Software Assurance Lead (if not separate from the Software Lead): Validate compliance with quality, safety, and assurance standards (e.g., NASA-STD-8739.8).
    • Configuration Manager (if applicable): Support audits examining software baselines, versioning, and change management.

For extremely small projects, a single person may play multiple roles, such as the project manager and software lead being the same individual. In such cases:

  • Focus on priority activities (e.g., preparing key documentation, resolving findings).
  • Work with external stakeholders (e.g., contractors or audit teams) to tailor audit expectations to the project’s size.

Actionable Tip:

Use a RACI Chart (Responsible, Accountable, Consulted, Informed) to clarify audit responsibilities among team members, as sketched below.
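
The fragment below is a minimal sketch of one way a small team might record RACI assignments for audit activities. The activity names, role titles, and Python representation are illustrative assumptions, not a prescribed format.

    # Illustrative RACI chart for audit participation in a small project.
    # Activity and role names are examples only.
    raci = {
        "Prepare audit documentation": {
            "Responsible": "Software Lead",
            "Accountable": "Project Manager",
            "Consulted": "Software Assurance Lead",
            "Informed": "Configuration Manager",
        },
        "Attend audit sessions": {
            "Responsible": "Project Manager",
            "Accountable": "Project Manager",
            "Consulted": "Software Lead",
            "Informed": "Team",
        },
        "Track findings to closure": {
            "Responsible": "Software Assurance Lead",
            "Accountable": "Project Manager",
            "Consulted": "Software Lead",
            "Informed": "Audit Team",
        },
    }

    # Print a simple view of who holds each RACI role for each activity.
    for activity, roles in raci.items():
        assignments = ", ".join(f"{role[0]}: {name}" for role, name in roles.items())
        print(f"{activity} -> {assignments}")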


2. Leverage Minimal Documentation for Audit Preparation

Small projects often have reduced documentation compared to larger efforts. This can pose challenges during an audit. To streamline audits effectively:

  1. Focus on essential artifacts only:
    Ensure the following basic documents are ready for audits:

    • Software Requirements Specification (SRS).
    • Configuration Management Plan or Checklist.
    • Software test plans and results (even in minimal formats, such as spreadsheets).
    • Traceability matrix (use tools like Excel to keep this manageable).
    • Evidence of risk management activities (if applicable).
  2. Utilize existing templates or checklists:
    NASA’s smaller projects can borrow document templates from prior similar-sized efforts or use simple tools to manage requirements and test results (e.g., Excel, Google Sheets).

  3. Maintain clear audit trails:
    Even in a small project, track who made decisions and changes. A simple change log in your configuration management tool (or a manual spreadsheet for low complexity) is sufficient; a minimal sketch follows this list.
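
As a hedged illustration of the change log above, the following Python sketch writes a minimal spreadsheet-compatible log. The column names and example entries are assumptions; adapt them to the project's configuration management conventions.

    import csv

    # Minimal change log kept as a CSV file that opens directly in Excel
    # or Google Sheets. Columns are illustrative, not a mandated format.
    FIELDS = ["date", "item", "change_description", "author", "approved_by"]

    entries = [
        {"date": "2024-03-01", "item": "SRS v1.2",
         "change_description": "Clarified timing constraint in REQ-014",
         "author": "J. Smith", "approved_by": "Project Manager"},
        {"date": "2024-03-08", "item": "Build 0.9.3",
         "change_description": "Patched defect DR-102",
         "author": "A. Lee", "approved_by": "Software Lead"},
    ]

    # Write the header row followed by one row per logged change.
    with open("change_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(entries)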


3. Prioritize Audit Participation Levels

Small projects may lack the resources to engage extensively in all audits. Focus your limited resources on participating in areas with the greatest relevance to project success. The level of participation can vary:

  • Critical Audits (High Priority Participation):
    Participate actively in audits that impact major deliverables (e.g., acceptance testing, safety reviews, system integration audits). These audits are closely tied to project milestones or customer deliverables and should involve the project manager or a technical lead.

  • Non-Critical Audits (Limited Participation):
    For contractor-driven process audits or routine compliance reviews, stay informed by appointing a project observer or reviewer. Request summary reports and recommendations instead of attending in real-time.

Tailored Tip for Small Projects:
Be clear about your resource constraints when working with external auditors or NASA teams. Establish boundaries for your involvement and focus on outcomes instead of attending all activities.


4. Use Scalable Audit Tools

Small projects don’t always need expensive or complex tools to support audit preparation and participation. Opt for lightweight solutions:

  • Use free or low-cost tools for configuration and requirements management:

    • Version Control: Git, GitHub, or Bitbucket to track software code and changes.
    • Requirements Tracking: Excel, Google Sheets, or open-source tools like OpenProject or ReqView.
    • Audit Checklists: Maintain software assurance or configuration audit checklists as Word/Excel documents (a minimal sketch follows this list).
  • If formal tools are available (e.g., DOORS or QVScribe), make use of the minimal features needed to conduct your audit. Don’t over-complicate processes for small projects.
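
As a lightweight illustration of the checklist idea above, the sketch below keeps a configuration audit checklist as simple records and reports open items. The checklist items are invented examples, not a NASA-defined checklist.

    # Illustrative configuration audit checklist kept as simple records.
    checklist = [
        {"item": "Baseline identified and under version control", "done": True},
        {"item": "All approved changes traced to change requests", "done": False},
        {"item": "Build records match released binaries", "done": False},
    ]

    # Report which checklist items still need attention before the audit.
    open_items = [c["item"] for c in checklist if not c["done"]]
    print(f"{len(open_items)} open checklist item(s):")
    for item in open_items:
        print(" -", item)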


5. Early Integration of Audits in Project Plans

For small projects, planning is vital to minimize disruptions during audits:

  • Identify anticipated audits early:
    Projects should anticipate audits in their Software Development Plan (SDP) or within project milestone schedules.

  • Plan your resources:
    For example:

    • If external audits are part of supplier contracts, ensure that participation expectations are aligned and resource needs accounted for in project timelines and budgets.
    • Incorporate time blocks for audits and task buffers in small team schedules to account for audit time.
  • Incorporate contractual considerations early:
    If suppliers are involved, ensure the RFP or SOW explicitly requires contractor coordination for joint audits.


6. Make Effective Use of Post-Audit Reviews

Small projects can maximize audit benefits by efficiently addressing findings:

  1. Prioritize and adapt resolutions: Focus on resolving high-priority findings quickly, even if certain lower-priority audit issues must be deferred due to resource constraints. Use a risk-based approach to determine which issues need action.
  2. Conduct time-limited follow-ups: For small projects, follow-ups to address audit findings don’t have to be time-intensive. A post-audit meeting that summarizes the audit findings and assigns specific action items at a high level is usually sufficient.
  3. Leverage lessons learned: Document lessons and improvements from audits to prevent recurring issues. Small projects benefit significantly from tracking improvements across short-term endeavors.

7. Key Recommendations for Small Projects with Limited Resources

  • Engage Correct Personnel: Involve team members with project-specific knowledge but define clear resource limits for their involvement.
  • Simplify Documentation: For audits, provide concise, essential documentation (e.g., simplified test logs, traceability records, or requirement baselines).
  • Request Tailored Audits: Collaborate with auditors (internal or external) to focus audits on areas critical to your project’s mission or software performance.
  • Communicate Constraints: Be transparent about resource limitations. State what your team can support and provide alternatives (e.g., post-audit debriefs where full participation isn’t feasible).
  • Learn from Audits: Even for very small projects, audits provide valuable insights into improving future processes, products, or team productivity.

Summary of Small Project Practices for Audit Participation

Key Area | Scaled Approach for Small Projects
Audit Documentation | Focus on providing only minimal essential documents (e.g., SRS, CM log).
Participation Level | Prioritize high-impact audits; limited engagement for low-priority audits.
Roles and Responsibilities | Assign multipurpose roles (e.g., PM doubles as audit POC).
Resource Management | Use lightweight tools (Excel, GitHub) to prepare for audits.
Post-Audit Actions | Quickly resolve high-priority findings, defer low-risk issues as needed.

Why Audit Participation Matters for Small Projects

Even for small projects, participating in software audits ensures:

  • Compliance with NASA standards and mission objectives.
  • Early identification of risks or process issues that could scale into larger problems.
  • Improved communication and collaboration with contractors or audit teams.

By focusing on scalable processes, preparing for audits efficiently, and targeting critical areas, small projects can meet the intent of Section 3.1, Project Participation in Audits, without overburdening their limited resources.

5. Resources

5.1 References

5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.


6. Lessons Learned

6.1 NASA Lessons Learned

Lessons learned from past NASA projects highlight the importance of engaging in joint audits to ensure effective oversight, proper implementation, and mission success. Historical failures underline the risks of insufficient auditing and contractor oversight, particularly in managing critical software systems. By studying these lessons, NASA projects can implement joint audits to strengthen processes, avoid common pitfalls, and improve outcomes.


Relevant Lessons Learned from NASA's Database

  1. Acquisition and Oversight of Contracted Software Development
    • Lesson Number: 0921
    • Event: The loss of the Mars Climate Orbiter (MCO) mission (1999) exposed significant deficiencies in NASA's process for acquiring and managing contractor-developed software. The mission failed due to a lack of controlled processes for verifying and overseeing the development of mission-critical software. One of the key issues was the failure to validate communication of measurements between teams using different unit systems (metric vs. English units).
    • Implication: The lack of consistent oversight during software life cycle phases meant that critical design and implementation issues went unnoticed.
    • Recommended Practice: NASA Centers must assure the adequacy of contractor design, development practices, and implementation processes across the entire software life cycle. Joint audits, in which NASA and the contractor regularly validate progress against requirements and standards at various stages, provide essential checkpoints to mitigate risks of miscommunication and process breakdowns.

Expanded Lessons Learned

  1. Mars Polar Lander and Deep Space 2 Failure

    • Lesson Number: 0934
    • Event: The failures of the Mars Polar Lander (MPL) and Deep Space 2 (DS2) were partially attributed to insufficient contractor oversight, which resulted in undetected design and implementation issues. A specific example was that critical software interactions and failure modes were not thoroughly tested or reviewed.
    • Implication: Independent audits and early NASA involvement in contractor-driven processes could have identified potential design omissions or software flaws.
    • Recommended Practice: NASA projects must participate actively in contractor audits, placing special emphasis on integration testing processes, software interface validation, and anomaly management procedures. Software Assurance personnel should be involved to detect and address risks before integration phases.
  2. Lessons from the Space Shuttle Program

    • Lesson Number: 1065
    • Event: During the Space Shuttle program, independent software verification and validation (IV&V) audits identified critical defects and implementation gaps across multiple subsystems. These audits significantly contributed to improving the safety and reliability of software platforms.
    • Implication: Regular joint audits including IV&V stakeholders are essential to maintaining confidence in software systems operating in safety-critical environments.
    • Recommended Practice: Incorporate structured life cycle audits requiring NASA project team participation in key milestones such as requirements review, design audits, and operational readiness assessments. Establish clear checklists for detecting high-risk gaps.
  3. Software Reuse without Sufficient Review

    • Lesson Number: 0588
    • Event: Improper analysis and testing of reused software components contributed to mission anomalies in prior NASA programs. Reused software was inadequately audited for compliance with new project requirements, leading to cascading issues in later life cycle stages.
    • Implication: Reused software, including Commercial Off-the-Shelf (COTS), Government Off-the-Shelf (GOTS), or Open-Source Software (OSS), can introduce hidden risks if not subjected to proper audits during project integration. Without active participation in audits, these risks may go unnoticed.
    • Recommended Practice: Ensure audit participation includes a focus on verifying how reused software components are validated, tested, and integrated into the overall mission architecture. Adjust audit checklists to focus on compliance with new workflows, interfaces, and configurations.

Additional Insights and Lessons for Audit Participation

  1. Integration of Audits as Risk Mitigation

    • Audits act as critical milestones for risk detection and mitigation. Projects that conduct joint NASA/contractor assessments, including mid-life and late-stage audits, are more likely to address emerging risks early and ensure compliance.
    • Missed or poorly scoped audits have historically led to requirements creep, schedule delays, and undetected safety-critical defects.
  2. Communication Gaps Among Stakeholders

    • Several NASA accidents and software failures highlight communication breakdowns among contractors, subcontractors, and NASA personnel. Examples include incompatible development processes, inconsistent requirements translations, and unverified cross-team assumptions. Joint audits that involve NASA participants (e.g., project managers, software assurance personnel, software engineers) provide opportunities to align expectations, clarify requirements, and validate system-level consistency.
  3. Audits as a Mechanism for Culture Change

    • Following the Columbia disaster (2003), NASA implemented significant cultural changes to encourage transparency, accountability, and proactive risk management. A component of this cultural shift was enhancing joint oversight mechanisms, including audits. Lessons from this period emphasize that active participation in audits encourages early detection of technical and process deficiencies while reinforcing a culture of shared responsibility.

Best Practices from Lessons Learned

  1. Integrate Audits Throughout the Software Development Life Cycle (SDLC)

    • Plan and incorporate joint audit milestones early in the project, covering all key software engineering phases:
      • System Requirements Review (SRR) and Preliminary Design Audit.
      • Integration Audits (e.g., testing compliance, interface validation).
      • Final Configuration Audits (e.g., delivery readiness).
    • This proactive approach reduces the risk of late-stage findings and software instability.
  2. Tailor Audits to the Project Scope and Risks

    • Small and medium-sized projects may not have the resources for continuous in-depth audits. Tailoring the scope of joint audits to focus on high-priority risks—such as safety-critical software, reused components, or contractor-sourced portions—ensures effective use of resources.
  3. Leverage Independent Verification and Validation (IV&V)

    • Lessons from failures demonstrate the value of third-party assessments. Incorporating IV&V teams in joint audits adds an extra layer of assurance, particularly for complex or safety-critical projects. Their independent perspective can uncover hidden design assumptions, inadequate test coverage, or missed edge cases.
  4. Emphasize Early Risk Detection

    • Use audits not only to assess progress but also to proactively identify risks and assess the effectiveness of risk mitigation processes. Early identification of design or implementation weaknesses allows for manageable course corrections that avoid costly rework in later stages.

Summary of NASA Lessons Learned

NASA’s history demonstrates that participating in and leveraging joint audits strengthens oversight, decision-making, and mission outcomes. Past failures—like the Mars Climate Orbiter or Mars Polar Lander—underline the importance of:

  • Early identification of risks through joint assessments.
  • Oversight of contractor design, development, testing, and integration processes.
  • Ensuring that reused software components adhere to project-specific requirements.

Joint NASA/contractor audits—not only required by standards but also validated by experience—are a proven mechanism to enhance process transparency, collaboration, and a shared commitment to success. By applying these lessons, NASA projects can achieve better alignment, reduce risks, and ultimately ensure the success of software-intensive missions.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-045 - Project Participation in Audits
5.1.9 The project manager shall participate in any joint NASA/developer audits. 

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Participate in or assess the results from any joint NASA/developer audits. Track any findings to closure.

7.2 Software Assurance Products

Software assurance (SA) plays a crucial role in ensuring the quality, safety, and reliability of software products by providing insight, oversight, and independent validation. When joint NASA/Developer audits occur, SA personnel must participate to assess compliance, identify risks, and verify the effectiveness of processes and products against applicable standards, such as NPR 7150.2 and NASA-STD-8739.8. By doing so, SA ensures that the project is developing high-quality software in alignment with mission objectives and contractual requirements.

Software assurance delivers critical assessments and inputs for joint audits. Key products include:

  1. Results Analysis of Joint Audit Findings:

    • SA personnel review the results of joint NASA/Developer audits to validate that the findings accurately depict weaknesses, gaps, or risks within the software lifecycle and project processes.
    • Findings are categorized based on their impact on safety, mission success, and compliance to prioritize corrective actions (e.g., safety-critical systems versus non-critical functionality).
  2. Defect and Problem Report Analysis:

    • Review and verify the accuracy of problem reporting systems and defect tracking data to ensure that software issues identified during the audit process are logged, managed, and addressed.
    • Assess trends in defects to identify potential systemic issues (e.g., recurring failures in a specific process or integration step).
  3. Assessment of Change Management Processes:

    • Conduct software assurance audits of the change management system to ensure updates are controlled, documented, and implemented systematically.
  4. Corrective Action (CA) Monitoring and Closure:

    • Ensure that all audit findings, corrective action plans, and required changes are tracked to closure (see "Track Findings to Closure" section below).

7.3 Metrics

Maturity Note: At this time, formal metrics specific to audit participation or results monitoring have not been established. However, projects may incorporate tailored metrics based on audit-specific objectives.

Potential examples include (a small computation sketch follows this list):

  • Number of audit findings categorized by severity (e.g., safety-critical, major, minor).
  • Average time to respond to and close audit findings.
  • Number of recurring findings or trends identified across audits.
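
As a hedged sketch of how such metrics might be computed, the following Python fragment derives the three example metrics above from a simple findings list. The field names and severity labels are assumptions for illustration only.

    from collections import Counter
    from datetime import date
    from statistics import mean

    # Example findings records; fields and severity labels are illustrative.
    findings = [
        {"id": "F-1", "severity": "major", "opened": date(2024, 1, 10),
         "closed": date(2024, 2, 2), "recurring": False},
        {"id": "F-2", "severity": "minor", "opened": date(2024, 1, 10),
         "closed": date(2024, 1, 20), "recurring": True},
        {"id": "F-3", "severity": "safety-critical", "opened": date(2024, 3, 5),
         "closed": None, "recurring": False},
    ]

    # Number of findings categorized by severity.
    print("By severity:", dict(Counter(f["severity"] for f in findings)))

    # Average time (days) to close resolved findings.
    closure_days = [(f["closed"] - f["opened"]).days for f in findings if f["closed"]]
    if closure_days:
        print("Average days to close:", mean(closure_days))

    # Recurring findings identified across audits.
    print("Recurring findings:", sum(1 for f in findings if f["recurring"]))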

7.4 Guidance

The Benefits of Audits

Audits provide management and software assurance personnel with critical insights into the effectiveness, compliance, and risk status of project plans and processes. Key benefits include:

  1. Adequacy Assessment:

    • Verify that processes, tools, and systems are adequate for the project’s goals.
    • Ensure plans and activities are aligned with contractual requirements, NPR 7150.2, and project-specific software assurance plans.
  2. Compliance Tracking:

    • Determine whether the software development processes comply with approved project baselines, policies, and standards.
  3. Effectiveness and Control Evaluation:

    • Assess whether the implemented processes are effective at meeting quality, safety, and performance goals.
    • Evaluate the adequacy of change control procedures and internal monitoring systems.
  4. Product Fitness Verification:

    • Ensure the software products meet technical specifications and are fit for their intended use in the mission or project context.
  5. Opportunities for Process Improvement:

    • Identify systemic issues, gaps, or inefficiencies in the software life cycle.
    • Propose recommendations for risk reduction, process optimization, or cost savings.

Trending and Long-Term Monitoring

Over time, trending audit results can uncover recurring systemic issues, gauge the impact of software improvements, and highlight areas that may require added focus or additional resources. Software assurance should:

  • Continuously track audit findings across multiple audits to identify recurring issues or ongoing risks.
  • Monitor the effectiveness of corrective actions by trending findings by category, recurrence rates, and closure timeframes.

Software Assurance Participation in Joint Audits

Software assurance personnel play an essential role in all phases of joint NASA/Developer audits. Their responsibilities include:

  1. Preparation for Audits:

    • Review relevant project plans, procedures, standards, and prior audit findings.
    • Identify high-risk areas, such as safety-critical software, reused components, and areas with recurring issues.
    • Prepare a checklist of focus areas for the audit to ensure a systematic evaluation.
  2. Active Participation During Audits:

    • Participate in process interviews, documentation reviews, and spot checks for compliance.
    • Evaluate adequacy and completeness of defect tracking systems, version control, and testing documentation.
    • Ensure findings are well-documented, traceable, and actionable, ensuring appropriate alignment with NPR 7150.2 requirements.
  3. Post-Audit Assessment:

    • Analyze the audit report, validating that findings are complete, accurate, and cover all areas of concern.
    • Discuss corrective action plans with the project team to address identified risks and issues.
  4. Oversight of Contractor Audits:

    • For projects involving contractors, SA personnel must evaluate the contractor’s internal auditing processes.
    • Ensure that contractors are performing audits consistently with standards and implementing improvements based on findings.

Track Findings to Closure

Software assurance personnel must ensure that findings from joint audits are tracked, monitored, and closed effectively. A detailed process includes the following steps:

1. Corrective Action (CA) Plan Review and Approval:

After the audit report is delivered:

  • Ensure the project provides corrective action plans within a reasonable timeframe (e.g., 2 - 4 weeks depending on project scope and schedule).
  • Review the CA plan to ensure it includes the following (a minimal record sketch follows this list):
    1. Definition of the Problem or Finding: A clear statement of the issue, risk, or non-compliance.
    2. Root Cause Investigation: Analysis of the underlying cause, including whether similar risks exist in other parts of the software or processes.
    3. Short-Term Correction Plan: Immediate mitigation or fixes, with specific deadlines.
    4. Long-Term Corrective Action: Plans to prevent recurrence of the issue, such as process changes, additional testing, or updated procedures.
    5. Rationale: Justification for why a No Action resolution (if proposed) is acceptable.
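
The fragment below is a minimal sketch, assuming a project wants to capture the CA plan elements above as a structured record for tracking. The field names and status values are illustrative assumptions, not a NASA-defined format.

    from dataclasses import dataclass
    from typing import Optional

    # One corrective action plan record; fields mirror the elements above.
    @dataclass
    class CorrectiveActionPlan:
        finding_id: str
        problem_definition: str
        root_cause: str
        short_term_fix: str
        short_term_due: str              # e.g., an ISO date string
        long_term_action: str
        rationale: Optional[str] = None  # required only for a proposed "No Action"
        status: str = "open"             # e.g., open -> in work -> verified -> closed

    plan = CorrectiveActionPlan(
        finding_id="F-3",
        problem_definition="Defect dispositions missing from the tracking tool",
        root_cause="Disposition field not mandatory at defect closure",
        short_term_fix="Backfill dispositions for all open defects",
        short_term_due="2024-04-15",
        long_term_action="Make the disposition field mandatory and add a review step",
    )
    print(plan)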

2. Timeliness and Quality of Corrective Actions:

  • Verify that corrective actions are submitted and implemented in a timely manner with clear due dates for each phase.
  • Assess that corrective actions adequately address audit findings to prevent future recurrence.

3. Verification of Closure:

  • Confirm that corrections have been implemented effectively by reviewing updated documentation, analyzing revised processes, or performing follow-up audits as needed.
  • Ensure that the project team provides updates and evidence for completed actions, such as corrections to testing procedures, defect logs, or design documents.

4. Final Notification of Closure:

  • Once all findings are satisfactorily addressed, the Lead Auditor or Software Assurance Lead should notify the project manager and audit team in writing.
  • Include formal documentation of closure, describing the resolution and rationale for closing each finding.

Enhanced Software Assurance Best Practices

  1. Early Involvement:

    • Engage software assurance personnel in the planning and scoping phases of joint audits to ensure proper focus on high-risk areas.
  2. Collaboration with Developers:

    • Work collaboratively with developer teams during audits to clarify findings and promote buy-in on corrective actions.
  3. Leverage Metrics:

    • While formal metrics may not yet be standardized, use informal tracking to monitor the efficiency and effectiveness of audit findings and corrective actions over time.
  4. Continuous Feedback Loop:

    • Incorporate lessons learned and audit results into future project plans, improving processes and compliance upfront.

Conclusion

Active software assurance participation in and oversight of joint audits is critical for ensuring high-quality, reliable software products that are compliant with NASA standards and mission requirements. Audits are an opportunity to enhance communication between NASA, developers, and contractors, identify risks early, and build an environment of continuous improvement. By systematically tracking findings to closure and incorporating lessons learned, software assurance personnel contribute to safer, more efficient, and effective software engineering practices.

For more information, see the software assurance topic on software auditing. See also 8.16 - SA Products and 8.12 - Basics of Software Auditing.

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

8. Objective Evidence

Requirement Context:
The project shall participate in audits, whether they are internal, contractor-driven, joint NASA-developer audits, or audits conducted by external organizations, to assess software development and ensure compliance with plans, processes, and standards (e.g., NPR 7150.2, NASA-STD-8739.8).

Objective Evidence refers to the verifiable documentation, artifacts, and records that demonstrate compliance with this requirement. Below are examples of objective evidence that can be used to prove project participation and engagement in audits.


1. Audit Planning and Scheduling

Artifacts that demonstrate preparation for participation in audits (a minimal metadata sketch follows this list):

  • Audit Plans:
    A documented plan specifying the type, scope, objectives, and schedule of the audit. This may include the identification of audit participants, roles, and the audit agenda.
    Example: "Joint NASA/Contractor Audit Plan for Software Configuration Management, Version 1.0."
  • Audit Scope Document:
    A document outlining the specific software lifecycle processes, phases, or deliverables to be audited. For example, the audit might cover testing, defect tracking, or project compliance with safety-critical software standards.
  • Meeting Invitations or Agendas:
    Evidence capturing the inclusion of the project team in audit planning or coordination meetings. These could be calendar invites, meeting requests, or email exchanges with stakeholders.
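
As a hedged illustration, the snippet below shows one way a project might keep machine-readable audit plan metadata alongside the formal plan document. All keys and values are assumptions for illustration only.

    # Illustrative audit plan metadata; not a mandated NASA format.
    audit_plan = {
        "title": "Joint NASA/Contractor Audit Plan for Software Configuration Management",
        "version": "1.0",
        "type": "process audit",
        "scope": ["baseline control", "change management", "build records"],
        "objectives": ["verify CM plan compliance", "sample recent change records"],
        "schedule": {"kickoff": "2024-05-06", "fieldwork": "2024-05-07 to 2024-05-09",
                     "outbrief": "2024-05-10"},
        "participants": {
            "lead_auditor": "TBD",
            "project": ["Project Manager", "Configuration Manager"],
            "developer": ["Software Lead"],
        },
    }

    # Print a simple summary of the planned audit.
    for key, value in audit_plan.items():
        print(f"{key}: {value}")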

2. Audit Attendance and Participation

Evidence demonstrating that project personnel attended and participated in the audit:

  • Attendance Records or Participant Logs:
    Documentation showing who attended the audit (NASA personnel, developer representatives, external auditors).
    Example: Audit Attendance Sheet, with signatures, names, and roles of participants.
  • Meeting Minutes or Notes:
    Evidence that NASA project personnel contributed to discussions during pre-audit, audit sessions, or post-audit activities. Meeting notes may document concerns raised, recommendations made, or corrective actions discussed by the project.
  • Audit Presentations or Briefing Materials:
    Materials presented by the project team during the audit, such as updates on project status, metrics, deliverables, or compliance status.

3. Audit Reports

Clear, documented results from the audit process:

  • Audit Report:
    An official document summarizing the results of the audit, including findings, observations, non-conformances, and improvement recommendations. The report should demonstrate that the project scope was part of the audit and that NASA personnel contributed to its completion.
    Example: Joint NASA-Contractor Audit Report, listing software-related findings.
  • Compliance and Conformance Assessments:
    Evidence that the audit addressed compliance with project plans, NPR 7150.2, and other relevant NASA or contractual standards.
  • Audit Discrepancy/Non-Conformance Table:
    A table or list summarizing detected non-conformances, categorized by severity (e.g., major, minor, or recommendations).

4. Corrective Action Tracking

Evidence demonstrating the reporting, resolution, and closure of audit findings:

  • Corrective Action Plans (CA Plans):
    Documentation provided in response to audit findings, detailing the root cause analysis, corrective steps, and follow-up actions. A CA Plan includes these elements:
    • Problem description.
    • Short-term fix.
    • Long-term corrective action (to prevent recurrence).
    • Timelines and due dates.
      Example: Documented Corrective Action Plan for "Major Finding 3 – Defect Tracking Process Lapse."
  • Issue Tracking Logs:
    A record of all audit findings with status updates. Logs should include:
    • A description of findings.
    • Assigned ownership.
    • Actions taken and resolution dates.
      Example: Jira or Excel tracking logs for audit non-conformance closure.
  • Closure Evidence:
    Documentation showing that corrective actions were completed and reviewed, including sign-off by the Lead Auditor or Assurance Manager.

5. Software Development Artifacts Reviewed During the Audit

Artifacts demonstrating that the software project shared key documents or work products during the audit:

  • Software Development Plans (SDP):
    Documentation that outlines project goals, processes, and software engineering practices.
  • Requirements Traceability Matrix (RTM):
    Evidence mapping software requirements to implementation, verification, and testing (a minimal sketch follows this list).
  • Configuration Management Records:
    Evidence of configuration item baselines, code changes, and version-controlled repositories (e.g., Git logs).
  • Test Plans and Results:
    Documentation of the testing process, including unit tests, system tests, and any anomalies or defects logged during testing.
  • Defect Reports:
    A list of software defects raised, analyzed, and resolved (or deferred). The reports should align with findings from the audit.
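
The following minimal sketch, offered under stated assumptions, shows the kind of mapping an RTM captures: each requirement traced to implementing modules and verifying tests. The requirement, module, and test identifiers are invented for illustration.

    # Illustrative requirements traceability matrix as a simple mapping.
    # Requirement, module, and test identifiers are examples only.
    rtm = {
        "REQ-001": {"implemented_in": ["nav/filter.c"], "verified_by": ["TC-101", "TC-102"]},
        "REQ-002": {"implemented_in": ["nav/guidance.c"], "verified_by": ["TC-110"]},
        "REQ-003": {"implemented_in": [], "verified_by": []},  # gap: flag for follow-up
    }

    # Report requirements lacking implementation or verification coverage.
    for req, trace in rtm.items():
        if not trace["implemented_in"] or not trace["verified_by"]:
            print(f"{req}: incomplete traceability -> {trace}")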

6. Evidence of NASA Oversight and Contract Compliance

For contractor-driven audits, the following evidence demonstrates NASA’s engagement and oversight during the audit process:

  • Contract Language Supporting Audit Participation:
    Contracts with developers or suppliers that explicitly require joint audits or allow project personnel to participate, review findings, or oversee corrective actions.
  • Audit Standards Checklist:
    Use of NASA standards (e.g., NPR 7150.2, NASA-STD-8739.8) by contractor teams, verifying that joint audits assess compliance with these required processes.
  • Insight/Oversight Reports or Observations:
    Reports that confirm that NASA software assurance reviewed contractor audit documentation or participated in ensuring compliance with contractual standards.

7. Evidence of Problem Reporting and Risk Management

  • Defect and Anomaly Logs:
    Logs showing software problems and issues identified during the audit and actions taken to track and fix them.
  • Risk Management Records:
    Evidence showing that risks identified in audits, including both technical and process-related risks, were evaluated and added to the project’s risk management system (e.g., Risk Register).

8. Lessons Learned Documentation

Artifacts that demonstrate that audit results were used to inform future improvements:

  • Post-Audit Retrospective Reports:
    Documents summarizing lessons learned from audits, identifying systemic improvements, and discussing how findings from one audit were applied to future projects.
  • Continuous Improvement Plans:
    Evidence of project-wide process improvements as a result of audit findings. For example:
    • A revised defect management process.
    • Changes to vendor oversight procedures.
    • Updated processes for configuration management or testing.

9. Evidence from Follow-Up Audits or Audit Closure

For ensuring that findings are fully resolved:

  • Follow-Up Audit Reports:
    Documentation showing that a subsequent audit validated corrective actions and verified that prior deficiencies were resolved.
  • Audit Closure Statement:
    A formal notification from the audit team (Lead Auditor or Software Assurance Lead) stating that all findings have been closed and that the project is compliant.

Example Checklist for Objective Evidence

Category | Example Evidence
Audit Planning | Audit plans, agendas, participant lists.
Attendance & Participation | Meeting minutes, audit notes, attendance logs.
Audit Findings | Audit reports, non-conformance reports.
Corrective Actions | CA Plans, issue tracking logs, root cause analyses.
Software Artifacts | SDPs, test plans, requirements matrices, defect logs.
Oversight & Contracts | Contractual audit language, compliance checklists.
Lessons Learned | Retrospective reports, improvement plans.
Audit Closure | Closure reports, follow-up audit reports.

Summary

Objective evidence is critical to demonstrating compliance with this requirement (SWE-045, Project Participation in Audits). Audits are verifiable through detailed documentation, artifacts, and records created during all phases of joint audits: planning, participation, reporting, follow-up, and closure. By ensuring this evidence is available and organized, projects can confidently demonstrate that they have met their oversight and compliance obligations while leveraging audits to improve software quality and mission assurance.

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:
  • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry in a tracking system (e.g., Risk Log).
  • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
  • A status report, email, or memo containing statements, with dates, that confirmation has been performed (a checklist of confirmations could be used to record when each confirmation has been done).
  • Signatures on SA-reviewed or witnessed products or activities, or
  • A status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
    • To confirm that “IV&V Program Execution exists,” the summary might be: “The IV&V Plan is in a draft state and is expected to be complete by (some date).”
    • To confirm that “Traceability between software requirements and hazards with SW contributions exists,” the summary might be: “x% of the hazards with software contributions are traced to the requirements.”
  • The specific products listed in the Introduction of 8.16, as well as the examples listed above, are also objective evidence.