- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
- 8. Objective Evidence
1. The Requirement
3.9.2 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:
- For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
- For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.
1.1 Notes
Organizations need to complete an official CMMI® Institute defined appraisal against the CMMI®-DEV model V2.0. Organizations are to maintain their rating and have their results posted on the CMMI® Institute Website, or provide an Appraisal Disclosure Statement so that NASA can assess the current maturity/capability rating. Software development organizations need to maintain their appraisal rating during the period they are responsible for the development and maintenance of the software. CMMI® ratings can cover a team, a group, a project, a division, or an entire organization.
For Class B software, an exception can be exercised for those cases in which NASA wishes to purchase a product from the "best in class provider," but the best in class provider does not have the required CMMI® rating. For Class B software, instead of a CMMI® rating by a development organization, the project will conduct an evaluation, performed by a qualified evaluator selected by the Center ETA, against the CMMI®-DEV Maturity Level 2 practices, and mitigate any risk, if deficiencies are identified in the evaluation. If this approach is used, the development organization and project are responsible for correcting the deficiencies identified in the evaluation. When this exception is exercised, the OCE and Center ETA are notified of the proposition and provided the results of the evaluation. The project manager should seek guidance from the Office of Procurement (OP) for help in exercising the exception.
1.2 History
1.3 Applicability Across Classes
| Class | A | B | C | D | E | F |
|---|---|---|---|---|---|---|
| Applicable? | ✓ | ✓ | ✗ | ✗ | ✗ | ✗ |

Key:
- ✓ - Applicable
- ✗ - Not Applicable

(For Class B software on NASA Class D payloads, see the exception in the requirement text.)
1.4 Related Activities
This requirement is related to the following Activities:
2. Rationale
The CMMI® requirement is a qualifying requirement for NASA. The requirement is included to ensure that NASA projects are supported by software development organization(s) having the necessary skills and processes in place to produce reliable products within cost and schedule estimates.
NASA projects require software that is high-quality, reliable, and delivered on time to meet mission-critical objectives. To achieve these goals, CMMI-DEV (Capability Maturity Model Integration - Development) ratings provide assurance that the software development organization adheres to rigorous process standards for planning, development, and quality management.
This requirement ensures that organizations involved in the development of Class A and Class B software have mature, structured, and well-documented processes for building and maintaining software. These structured processes minimize risks, ensure consistency in outcomes, and enable successful delivery of systems for NASA missions.
Key Rationale
1. Ensuring Process Discipline for Mission-Critical Software
NASA software projects often involve mission-critical systems with strict reliability, safety, and performance requirements. Organizations with CMMI ratings demonstrate a defined and repeatable approach to software development. This process discipline is especially important for:
- Class A Software: Used in safety-critical systems where software failures could result in loss of life, mission failure, or catastrophic consequences. A Maturity Level 3 rating or higher provides confidence in the organization's ability to meet these high stakes through robust processes.
- Class B Software: Less critical than Class A but still significant for mission success. A Maturity Level 2 rating or higher indicates foundational process discipline to deliver reliable software effectively.
Why This Matters:
An organization's ability to handle mission-critical requirements, risk management, and system integration depends heavily on the maturity of its software development processes. CMMI ratings provide objective proof of those capabilities.
2. Reduction of Development Risks
Software development involves inherent risks such as delivery delays, defects, and incompatibilities. Organizations with higher CMMI ratings are better equipped to address common risks through structured processes for:
- Scope and requirements management: Ensuring that requirements are understood, documented, and accurately implemented.
- Defect management and quality assurance: Instituting quality control processes that minimize defects and ensure compliance with NASA's safety and functionality standards.
- Risk management: Identifying and mitigating risks early in the lifecycle.
- Timely delivery: Using mature project management practices to meet deadlines.
Why This Matters:
Unmanaged risks can escalate into cost overruns, missed schedules, and catastrophic failures, especially for NASA projects. Acquiring software from organizations with certified process maturity drastically reduces these risks.
3. Assurance of Organizational Capability
CMMI-DEV appraisals are conducted by certified CMMI Institute Lead Appraisers, ensuring third-party vetting of the organization’s development processes. A non-expired CMMI rating provides assurance that the software provider has maintained a specific maturity level in its practices and processes.
- Maturity Level 3 (Class A): A Level 3 organization has well-defined and consistently followed processes for project management, engineering, quality assurance, and delivery across all projects.
- Maturity Level 2 (Class B): A Level 2 organization ensures basic project control, including planning, monitoring, and working to prevent excessive variation in performance or outcomes.
Why This Matters:
NASA relies on advanced, complex systems, and the maturity of the development organization's processes directly impacts the viability and reliability of delivered software. CMMI ratings establish objective evidence of an organization's ability to deliver consistent performance.
4. Facilitating Compliance with NASA Standards
Software acquired, developed, or maintained by organizations with CMMI ratings is more likely to align with NASA's strict software assurance, quality, and safety standards. Mature organizations are equipped to:
- Meet NASA's requirements for traceability, validation, and testing.
- Produce documentation aligned with NASA's software lifecycle management processes.
- Respond to post-delivery needs for ongoing maintenance, updates, and repairs.
Why This Matters:
Class A and Class B software often requires integration with other mission systems and lifelong maintenance. This integration and maintenance depend on properly documented and high-quality software. Organizations with higher CMMI maturity levels typically produce artifacts that facilitate compliance, minimize rework, and streamline integration tasks.
5. Supporting Class Definitions Based on Mission Criticality
This requirement tailors the CMMI rating thresholds to the unique risks and importance of Class A and Class B software:
- Class A Software: Vital for primary mission objectives, with potentially catastrophic consequences if it fails. Requires Maturity Level 3 or higher, emphasizing robust engineering processes capable of managing the complexity of these systems.
- Class B Software: Essential for secondary mission objectives or tasks, with fewer catastrophic risks. Requires Maturity Level 2 or higher, reflecting foundational software process maturity suited to lower criticality levels.
Why This Matters:
This tiered approach ensures NASA receives software developed by organizations with appropriate levels of process maturity for the software’s role in the mission.
6. Long-Term Cost Savings
Mature development processes reduce design errors, test failures, rework, and maintenance overhead. Eliminating these inefficiencies leads to lower costs during development and operational phases:
- Organizations with Maturity Level 3 ratings or higher improve schedules, quality, and delivery predictability for high-risk systems like Class A software.
- Organizations with Maturity Level 2 ratings ensure foundational process discipline for less critical systems like Class B software.
Why This Matters:
Better alignment between NASA’s needs and the vendor's development capabilities minimizes costly setbacks, errors, and delays—a key benefit given NASA's budget constraints.
7. Promoting Accountability and Confidence
Acquiring software from organizations with proven capabilities incentivizes accountability and creates confidence in the software’s reliability. Organizations with CMMI-DEV maturity ratings have proven experience in:
- Delivering software aligned with customer expectations.
- Scaling processes across diverse project types.
- Managing complexity in technical and operational requirements.
Why This Matters:
Accountability and confidence are critical for NASA’s ability to deliver on objectives involving complex, multi-stakeholder systems.
Advantages of This Requirement
For NASA Projects
- Ensures access to software developed with mature engineering practices, reducing risks tied to software flaws, missed deadlines, and integration failures.
- Guarantees compliance with mission-critical safety, reliability, and functional requirements.
- Promotes scalability and reusability of software across projects by enforcing process consistency.
For Mission Success
- Positions NASA to deliver on-time, on-budget missions with reliable software that meets exacting requirements for complexity, safety, and performance.
- Reduces the likelihood of catastrophic outcomes caused by defects in mission-critical software.
For Software Development Teams
- Provides standardized benchmarks for software development performance.
- Encourages partnerships with qualified organizations that are equipped to meet high process standards.
Conclusion
The requirement to acquire, develop, and maintain software from organizations with certified CMMI-DEV maturity levels ensures that NASA projects benefit from structured and verified software development processes. For Class A and B software, this translates to improved reliability, reduced risk, better alignment with NASA’s standards, and long-term cost savings. By prioritizing process maturity, NASA maximizes its chances of success on safety-critical and mission-critical endeavors.
3. Guidance
CMMI
We use CMMI version 2.0, adjusted for the results of the internal comparison of CMMI and NPR 7150.2.
3.1 Capability Maturity Model Integration
Purpose of the Requirement
The Capability Maturity Model Integration for Development (CMMI-DEV) is an internationally recognized framework for process improvement that provides a structured approach to managing software development and maintenance activities. Its purpose is to ensure organizations develop software using best practices across areas such as requirements management, risk management, planning, quality assurance, and process measurement. By integrating these practices for software acquisition, development, and maintenance, NASA increases the likelihood of mission success and maximizes software quality, reliability, and safety.
This requirement establishes a standard for software engineering process maturity as part of ensuring that NASA's critical software systems in Class A and Class B meet stringent performance, safety, and reliability standards. Organizations performing these functions are required to demonstrate their process maturity through non-expired CMMI-DEV ratings.
Key Objectives of the Requirement
- Independent Process Evaluation: Measure an organization’s process maturity and capability against an industry-standard framework (CMMI-DEV), providing an objective means of identifying strengths and weaknesses in their software development activities.
- Risk Mitigation: Identify and address process risks during software acquisition, development, and maintenance to minimize potential disruptions in cost, schedule, and product quality.
- Process Consistency: Promote stable, predictable, and repeatable processes that reduce variability in the development outcomes, critical for meeting mission-critical goals.
- Alignment with Standards: Ensure compliance with NPR 7150.2 process-related requirements using well-established practices aligned with CMMI-DEV.
- Continuous Improvement: Encourage software development organizations, both internal and external, to aim for higher levels of process discipline, efficiency, and product reliability.
Benefits of Enforcing CMMI Ratings in Software Engineering
Improves Mission Success: By adopting best practices in software engineering, NASA ensures greater predictability in software quality, reliability, and schedule adherence, directly contributing to mission safety and success.
Prevents Software Failures: CMMI-compliant organizations apply structured processes to identify and resolve potential issues earlier in the lifecycle, resulting in fewer software defects that threaten safety or operational excellence.
Supports Smarter Software Sourcing: The use of non-expired CMMI ratings for organizations enables NASA to evaluate and select suppliers that have met stringent industry process improvement frameworks. This mitigates risks in contracted-out software activities.
Increases Cost-Efficiencies: Predictable, high-quality software development practices reduce costly rework, improve resource allocation, and streamline future reuse of software components and processes.
Promotes Reuse and Standardization: CMMI-based processes often produce well-documented software and work products, increasing the ability to reuse tools, methods, and components across multiple projects.
Encourages Technological Adaptability: Organizations adhering to CMMI-DEV practices are better positioned to respond to evolving software technologies and integration requirements, allowing NASA to remain agile in adapting to technological advancements.
Supports Employee Morale and Customer Satisfaction: Well-defined processes minimize confusion and maximize clarity for development teams, creating a positive work environment. High-quality deliveries increase confidence and satisfaction among stakeholders.
Establishes Continuous Process Improvement: Encouraging CMMI adoption fosters process improvement and innovation within NASA’s internal development teams as well as in the contractor community.
NASA-Specific Recommendations for CMMI Implementation
Tailoring CMMI to Project Classes
- For Class A Software: The criticality of the software (safety-related and mission-critical) necessitates a non-expired CMMI-DEV Maturity Level 3 rating or higher. This level ensures well-documented and repeatable processes across a project or organization.
- For Class B Software: A CMMI-DEV Maturity Level 2 rating or higher, or a Capability Level 2 rating, is sufficient. This reflects lower criticality but still emphasizes foundational practices (e.g., requirements management, configuration management, and quality assurance).
Periodic CMMI Verification
- It is recommended that projects confirm the validity of CMMI ratings during major lifecycle reviews to ensure continued compliance with NPR 7150.2 and to account for changes in the supplier’s or organization’s rating. This check can be performed via the CMMI Institute's Published Appraisals website; a minimal sketch of such a check follows this list.
- For Class B software on Class D payloads, formal CMMI compliance is not required, but informal evaluations or risk assessments should still be implemented as a best practice.
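Where a project tracks supplier appraisal dates locally between formal checks, a simple date comparison can flag ratings nearing expiration. The sketch below is illustrative only and assumes a locally recorded appraisal date; the three-year validity period reflects the note in Section 3.3.3, and the CMMI Institute's Published Appraisals website remains the authoritative source.

```python
from datetime import date, timedelta

# Ratings are valid for three years (see Section 3.3.3); the dates used
# here are hypothetical examples.
APPRAISAL_VALIDITY = timedelta(days=3 * 365)

def rating_is_current(appraisal_date: date, on: date) -> bool:
    """Return True if a CMMI-DEV rating is still within its validity window."""
    return on <= appraisal_date + APPRAISAL_VALIDITY

# Example: an appraisal completed in September 2021, checked at a March 2025 review.
print(rating_is_current(date(2021, 9, 1), on=date(2025, 3, 1)))  # False -- expired
```

A check like this is only a reminder mechanism; the authoritative confirmation is still the published appraisal record or, where needed, direct inquiry with the CMMI Institute.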
Supplier Evaluation and Engagement
- Supplier agreements for both Class A and Class B software must specify compliance with CMMI requirements as part of the contractual obligations.
- These agreements should also identify all applicable software deliverables, criteria for acceptance, and mechanisms for monitoring compliance throughout the software development lifecycle.
Role of Center Engineering Technical Authorities (ETAs)
- The ETAs are responsible for ensuring the inclusion of NASA software engineering requirements and CMMI qualifications in acquisition agreements, subcontract agreements, and internal software planning.
- ETAs also determine the level of support and oversight required from CMMI-rated organizations to ensure seamless software engineering and project execution.
Adherence in In-House Software Development
- NASA in-house software teams must ensure their own processes meet CMMI-DEV Level 3 requirements for Class A software and CMMI-DEV Level 2 for Class B software. This requirement includes internal and contractor-supported tasks.
3.2 Applicability
Specific Class-A and Class-B Software CMMI Guidance
Class A Software
Organizations involved in the development or maintenance of Class A software must maintain an active CMMI-DEV Maturity Level 3 rating or higher. For acquisition, representatives from software engineering and assurance organizations rated at CMMI-DEV Maturity Level 3 or Capability Level 3 (or higher) must support supplier agreements, ensuring quality and compliance at all stages.
Class B Software
Organizations developing or maintaining Class B software require an active CMMI-DEV Maturity or Capability Level 2 rating in key process areas. For these lower-risk projects, ETAs along with project management can take a tailored approach (e.g., evaluations, reviews, or informal appraisals), especially when contractors do not have CMMI-DEV ratings. Waivers may be issued via compliance matrices when warranted.
Waiver Requests
In cases where projects seek exceptions or alternate methods (such as when a contractor lacks a desired CMMI rating), waivers can be submitted for approval. The waiver process must justify the risks, planned mitigations, and alternate assurances provided.
Conclusion
This enhanced guidance emphasizes the strategic importance of integrating CMMI standards into software engineering processes across NASA’s projects. CMMI ratings establish a quantifiable benchmark for software process maturity, making it easier to evaluate, acquire, and develop software that meets NASA’s stringent safety, reliability, and mission-critical requirements. By tailoring adoption to project classes, requiring periodic compliance checks, and fostering collaboration between internal and external development teams, NASA ensures process consistency, reduced risks, and greater mission success.
See also SWE-129 - OCE NPR Appraisals and SWE-221 - OSMA NPR Appraisals.
See also SWE-003 - Center Improvement Plans for information about CMMI in the Center Improvement Plans.
3.3 General Software Acquisition Guidance
The content of the supplier agreement is critical to the acquisition of any software, including software embedded in a delivered system. In addition to the CMMI® Maturity Level requirements placed on the supplier by SWE-032, the supplier agreement must also specify compliance with the software contract requirements identified in NPR 7150.2. The creation and negotiation of any supplier agreement involving software need to include representatives from the Center's software engineering and software assurance organizations to ensure that the software requirements are represented in the acquisition agreement(s). The agreements identify the following aspects of the acquisition:
- Technical requirements on the software.
- Definition and documentation of all software deliverables.
- Required access to intermediate and final software work products throughout the development life cycle.
- Compliance and permissible exceptions to NPR 7150.2 and any applicable Center software engineering requirements.
- Software development status reporting, including implementation progress, technical issues, and risks.
- Definition of acceptance criteria for software and software work products.
- Non-technical software requirements, including licensing, ownership, use of third-party or Open Source Software, and maintenance agreements.
See also Topic 7.03 - Acquisition Guidance.
Representatives from the Center's software engineering and assurance organizations must evaluate all software-related contract deliverables before acceptance by the Project. The deliverables must be evaluated for:
- Compliance with acceptance criteria.
- Completeness.
- Accuracy.
3.3.1 Class A software
If you acquire, develop, or maintain Class A software, the organization performing these functions is required to have a non-expired CMMI®-DEV Level 3 or higher rating.
3.3.2 Class A software acquisition guidance
To ensure that the solicitation, contract, and delivered products meet the requirements of this NPR, the Project's acquisition team must be supported by representatives from a software engineering and software assurance organization that is either rated at CMMI®-DEV Maturity Level 3 or higher or rated at CMMI®-DEV Capability Level 3 in at least the process areas of Supplier Agreement Management and Process and Product Quality Assurance. This support may be in the form of direct involvement in the development of supplier agreements or review and approval of these agreements. The support must also include the review and approval of any software-related contract deliverables. The extent of the CMMI®-DEV Level 3 rated organization's support required for a Class A acquisition can be determined by the Center's Engineering Technical Authority responsible for the project.
Identification of the appropriate personnel from an organization that has been rated at a CMMI®-DEV Level 3 or higher to support the Project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center Management. The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA Software Engineering requirements are included in an acquisition.
For those cases in which a Center or project desires a general exclusion from the NASA Software Engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester can submit a waiver for those exclusions or alternate requirements in the form of a streamlined compliance matrix for approval by the designated Engineering and SMA Technical Authorities with appropriate justification.
3.3.3 Class A software development or maintenance guidance
The software organizations that directly develop or maintain Class A software are required to have a valid CMMI®-DEV Level 3 or higher rating for the organization performing the activities. Support contracts supporting NASA in-house software development organizations can be included in the NASA organizational assessments. Project contractors and subcontractors performing Class A software development are required to have their own CMMI®-DEV Level 3 rating. NASA and prime contractors need to flow this requirement down in contracts to ensure all subcontractors have the necessary CMMI®-DEV rating.
The CMMI®-DEV Level 3 rating is to be maintained throughout the project’s development or maintenance period. NASA requests that organizations’ CMMI® ratings be posted on the CMMI Institute website 327. The CMMI® Institute vets the validity of the CMMI® appraisals on this list and assures that ratings have not expired (as of this writing, CMMI® ratings are valid for three years). In rare instances (e.g., a rating earned in a classified environment), an organization may have a current CMMI®-DEV rating that does not appear on the CMMI® Institute website. In these cases, the supplier’s claim can be verified directly with the CMMI® Institute.
3.3.4 Class B software
(except Class B software on NASA Class D payloads) - CMMI®-DEV Maturity Level 2 Rating or higher for software, or CMMI®-DEV Capability Level 2 Rating or higher for software in the following process areas:
a. Requirements Management.
b. Configuration Management.
c. Process and Product Quality Assurance.
d. Measurement and Analysis.
e. Project Planning.
f. Project Monitoring and Control.
g. Supplier Agreement Management (if applicable).
3.3.5 Class B software acquisition guidance
To ensure that the solicitation, contract, and delivered products meet the requirements of this NPR, the Project's acquisition team must be supported by representatives from a software engineering and software assurance organization that is either rated at CMMI®-DEV Maturity Level 2 or higher or rated at CMMI®-DEV Capability Level 2 in at least the process areas of Supplier Agreement Management and Process and Product Quality Assurance. This support may be in the form of direct involvement in the development of supplier agreements or review and approval of these agreements. The support must also include the review and approval of any software-related contract deliverables.
The Center Engineering Technical Authority responsible for the project determines the extent of the CMMI®-DEV Level 2 rated organization's support required (see description in the previous paragraph) for a Class B acquisition. Identification of the appropriate personnel from an organization that has been rated at a CMMI®-DEV Level 2 or higher to support the Project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center Management. The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA Software Engineering requirements are included in an acquisition.
For those cases in which a Center or project desires a general exclusion from the NASA Software Engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester can submit a waiver in the form of a streamlined compliance matrix for those exclusions or alternate requirements for approval by the designated Engineering and SMA Technical Authorities with appropriate justification.
3.3.6 Class B software development or maintenance guidance
The software organizations that directly develop or maintain Class B software are required to have a valid CMMI®-DEV Level 2 or higher rating (via a Continuous or Staged representation) for the organization performing the activities. Support contracts supporting NASA in-house software development organizations can be included in the NASA organizational assessments. Project contractors and subcontractors performing Class B software development are required to have their own CMMI®-DEV Level 2 or higher rating, maintained as active throughout the development or maintenance period. The rating is to be posted on the CMMI® Institute website 327.
3.3.7 Guidance on the exception for Class B software development and maintenance
If this option is used, the project is responsible for funding the evaluation and for addressing all risks identified during the evaluation. A CMMI appraisal across the process areas listed in this requirement is one method for conducting this evaluation. The Center Engineering Technical Authority is responsible for maintaining all records associated with the evaluation for the life of the project and determines who participates in the evaluation process. Recommended guidance is that the “qualified evaluator” have demonstrated appraisal experience or training.
3.3.8 Guidance on Class B software on NASA Class D payloads (as defined in NPR 8705.4) and Class C software
While not required, it is highly recommended that providers have a Certified CMMI® Lead Appraiser conduct periodic informal evaluations against process areas chosen by the project and project engineering based on the risk associated with the project. The project determines if an assessment is needed, identifies the required areas for the assessment, and communicates this information to the provider. A sample assessment process, “Process for Evaluation in Lieu of CMMI® Appraisal,” can be found in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
3.4 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.5 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
The National Defense Industrial Association (NDIA) CMMI® Working Group conducted a study on the use of CMMI®-DEV within small businesses in 2010 158. One of the counter-intuitive findings was that the "Perceptions that CMMI® is too burdensome for small businesses is not supported by data on CMMI®-DEV adoption". Significant numbers of organizations in the 1-20 employee range adopted CMMI® and achieved CMMI® Level ratings.
Small projects are expected to take advantage of the Agency, Center, and/or organizational assets.
Small projects often face unique challenges when implementing process-heavy frameworks like CMMI-DEV due to their limited resources, smaller teams, and tighter constraints. The following tailored guidance ensures that small projects can effectively meet CMMI requirements without being overwhelmed by excessive overhead, while maintaining compliance and delivering high-quality software products.
Key Considerations for Small Projects
Adopt Lightweight Processes Tailored to Project Needs
- Small projects should simplify and tailor CMMI practices while maintaining their intent, focusing on creating efficient and minimalistic processes to reduce effort without compromising quality.
- Use small-scale templates, tools, and checklists designed for the project’s scope to streamline process implementation. Avoid overly detailed documentation requirements that do not provide added value.
Prioritize Key CMMI Process Areas
- Focus on the core CMMI process areas that deliver maximum value to the project:
- Requirements Management (RM): Establish clear and simple methods for gathering, documenting, and managing requirements.
- Project Planning (PP): Develop lightweight plans tailored to the project’s size, focusing on key milestones, deliverables, and resource allocation.
- Project Monitoring and Control (PMC): Use simple tracking tools to monitor progress, identify issues early, and adjust plans as necessary.
- Configuration Management (CM): Implement a lightweight version of configuration management, such as using version control systems (e.g., Git) for managing work products.
- Quality Assurance (PPQA): Perform small-scale quality assurance activities (e.g., peer reviews) and document findings to maintain standards.
Leverage Organizational Tools and Resources
- Utilize NASA-provided tools, templates, and processes already designed for small projects to accelerate adoption and compliance with CMMI requirements.
- Take advantage of pre-existing organizational assets, such as reusable PIIDs (Process Implementation Indicator Documents), CMMI appraisal artifacts, and checklists from other projects.
Mentorship and Knowledge Sharing
- Small projects should engage experienced mentors or software engineers who have participated in prior CMMI processes. Mentors can guide teams in tailoring CMMI practices while avoiding unnecessary complexity and offering practical advice for resolving challenges.
Establish Incremental Compliance Goals
- Divide the work into manageable increments. For example, implement critical CMMI processes first (e.g., handling risks and managing requirements), and work toward refining less critical processes later. This approach reduces initial burden while gradually achieving full compliance.
Encourage Cross-Functional Collaboration
- Small teams can use collaborative approaches for roles like software assurance and development. For example, the same core team members might incorporate quality checks and status reporting into their routine activities rather than designating separate teams for these tasks.
Specific Small Project Adaptations by Process Area
Requirements Management (RM)
- Keep requirements concise and track them using simple tools like spreadsheets or lightweight requirements management software.
- Perform informal reviews with stakeholders to validate that requirements are complete and feasible.
Configuration Management (CM)
- Use scalable version control systems (e.g., Git or Subversion) to track software changes, documentation, and configuration artifacts.
- Establish a lightweight change management process to log and monitor changes, ensuring traceability; a minimal change-log sketch follows.
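As referenced above, even a tiny structured log can provide the traceability a small project needs. The following is a minimal sketch, not a mandated format; the field names, requirement ID (SRS-42), and revision value are illustrative assumptions.

```python
from datetime import date

# Minimal change-log sketch for lightweight configuration management.
# Each change records traceability to the requirement it affects and the
# version-control revision (e.g., a Git commit hash) that implements it.
change_log = [
    {
        "change_id": "CR-007",
        "date": date(2024, 5, 14),
        "requirement": "SRS-42",   # hypothetical requirement ID
        "revision": "a1b2c3d",     # hypothetical commit hash
        "summary": "Clamp sensor input range per updated SRS-42",
    },
]

def changes_for(requirement_id: str):
    """Traceability query: all logged changes touching a given requirement."""
    return [c for c in change_log if c["requirement"] == requirement_id]

print(changes_for("SRS-42"))
```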
Risk Management
- Identify high-priority risks early and record them in a simple risk tracking tool (e.g., a spreadsheet or Jira).
- Implement basic risk mitigation plans for high-severity risks while accepting low-severity risks; a minimal risk-register sketch follows.
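As referenced above, a small project's risk register can be as simple as a list of records filtered by severity. This is a minimal sketch under assumed field names and severity labels (red/yellow/green), not a prescribed NASA format.

```python
# Minimal risk-register sketch for a small project; entries are illustrative.
risks = [
    {"id": "R-1", "description": "Key developer availability",  "severity": "red",    "mitigated": False},
    {"id": "R-2", "description": "Late test hardware delivery", "severity": "yellow", "mitigated": True},
    {"id": "R-3", "description": "Minor tooling upgrade",       "severity": "green",  "mitigated": False},
]

# High-severity risks still needing a mitigation plan (mitigate high-severity
# risks, accept low-severity ones, per the guidance above).
open_high = [r for r in risks if r["severity"] == "red" and not r["mitigated"]]
for r in open_high:
    print(f'{r["id"]}: {r["description"]} -- needs a mitigation plan')
```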
Quality Assurance (QA)
- Perform informal peer reviews for all key software artifacts (e.g., requirements, test plans, code) to ensure quality standards.
- If an independent reviewer is not available, schedule team-based walkthroughs for work products.
Project Planning (PP) and Monitoring (PMC)
- Develop a high-level project plan with the primary milestones, resource allocations, and risks.
- Use simple tracking methods like task boards, timelines, or Kanban systems to monitor progress.
Process and Product Quality Assurance (PPQA)
- Integrate quality assurance activities directly into development workflows. For example, document QA results as inline comments or tickets in issue-tracking software.
CMMI Evaluations for Small Projects
Use Qualified Evaluators for Small Projects
- Instead of a full-scale CMMI appraisal, small projects can undergo targeted evaluations by qualified evaluators who focus on critical CMMI process areas relevant to the project’s size and scope. This reduces unnecessary evaluation overhead.
Focus Evaluation on Risk Areas
- The evaluation should prioritize process areas that pose the highest risk to project success, such as planning, testing, and risk mitigation.
Mitigate Deficiencies with Scalable Solutions
- If deficiencies are identified, implement mitigation plans that align with the small project’s constraints. For example, lightweight corrective actions such as improved task logging or semi-formal reviews could address process gaps.
Simplified Software Assurance Approach for Small Projects
Audit and Compliance Activities
- Use self-assessments or lightweight audits for compliance with critical process areas (e.g., requirements management, configuration management).
- Minimize audit documentation requirements while ensuring findings are actionable.
Risk-Focused Assurance
- Prioritize high-severity risks and focus assurance activities on early detection and mitigation of issues. Small projects often benefit from a tighter feedback loop between development and assurance teams.
Metrics for Small Projects
- Track a limited set of metrics that are meaningful for small project contexts (a minimal tracking sketch follows this list):
- Number of risks identified versus mitigated.
- Number of non-conformances closed versus open.
- Trends in project milestones achieved versus planned.
- Process improvements identified during reviews or audits.
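A few lines of scripting are enough to compute the open-versus-closed and by-type counts listed above. The sketch below assumes an illustrative non-conformance (NC) log exported from whatever tracking tool the project uses; the field names are not mandated.

```python
from collections import Counter

# Illustrative non-conformance (NC) log; statuses come from audit findings.
ncs = [
    {"id": "NC-01", "type": "activity not performed",  "status": "closed"},
    {"id": "NC-02", "type": "deviation from standard", "status": "open"},
    {"id": "NC-03", "type": "deviation from standard", "status": "closed"},
]

status_counts = Counter(nc["status"] for nc in ncs)
type_counts = Counter(nc["type"] for nc in ncs)

print(f"open vs. closed NCs: {status_counts['open']} / {status_counts['closed']}")
print("NCs by type:", dict(type_counts))
```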
Best Practices for Success in Small Projects
Engage Early with the Center Engineering Technical Authority (ETA)
- Work closely with the ETA to clarify which CMMI practices and deliverables are most relevant for the project, ensuring alignment with requirements while avoiding overburdening the team.
Embed Assurance into Daily Activities
- Avoid treating assurance as a separate task. Instead, integrate it into day-to-day project workflows to minimize overhead and increase efficiency.
Leverage Reusable Templates and Tools
- Save time and resources by using proven templates or tools provided by the organization, reducing the need to develop custom solutions from scratch.
Promote Communication and Transparency
- Foster open communication with all stakeholders, including project managers, software assurance personnel, and contractors. Regular updates and risk reviews ensure alignment and build confidence in the project's ability to deliver.
Focus on Incremental Improvements
- Use each CMMI-related activity as an opportunity to incrementally improve processes without overloading the team. Consistent small adjustments over time can yield significant benefits.
Conclusion
Small projects can successfully comply with CMMI requirements by tailoring processes to their scale, focusing on key process areas, leveraging lightweight tools, and integrating assurance activities into routine workflows. Through careful planning, mentorship, and the use of organizational resources, small projects can achieve the benefits of CMMI—such as improved quality, reduced risks, and better project outcomes—without becoming overwhelmed by unnecessary process complexity. This streamlined approach ensures that even small teams can deliver high-quality software products aligned with NASA’s goals and requirements.
5. Resources
5.1 References
- (SWEREF-153) Chrissis, M.B., Konrad, M., & Shrum, S. (2010). CMMI for Development: Guidelines for Process Integration and Product Improvement, 3rd Edition. Addison-Wesley Professional. ISBN: 0-321-71150-5. The definitive reference for CMMI-DEV Version 1.3; describes best practices for the development and maintenance of products and services across their lifecycle.
- (SWEREF-157) CMMI Development Team (2010). CMMI for Development, Version 1.3 (CMU/SEI-2010-TR-033). Software Engineering Institute.
- (SWEREF-158) NDIA CMMI Working Group, NDIA Systems Engineering Division. CMMI Technology Conference, November 2010.
- (SWEREF-196) CMMI - Capability, Maturity Model, Integration, Version 2.0, See your SEPG for a NASA licensed copy.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-327) Software Engineering Institute (SEI), architecture web site.
- (SWEREF-422) Code Q-1, NASA Headquarters, February, 2001. Contains reports from 2004 through 2018
- (SWEREF-457) The link will generate a current list of Organizational Units which have completed and reported SCAMPI Class A appraisals against the CMMI or People CMM Model. Documented authorization has been received from the sponsor of each posted appraisal for this release of information.
- (SWEREF-553) Public Lessons Learned Entry: 1414.
5.2 Tools
6. Lessons Learned
6.1 NASA Lessons Learned
The NASA Lessons Learned database has documented valuable insights regarding acquisition philosophy and mechanisms and the implementation of processes such as Capability Maturity Model Integration (CMMI) to improve software acquisition and development efforts. These lessons emphasize efficiency, measurable process improvement, and structured practices to ensure successful project outcomes.
Key Lessons on Acquisition Philosophy
Lesson Number 1414: "Since the procurement's goal is to minimize the time from start to finish, efficiency must be instilled into the Contractor-Government roles and relationships. During the selection process, it is critical to ensure that the Contractor's processes, procedures, and tools meet established criteria (e.g., ISO 9001 and/or CMMI). This enables the Government to take a ‘hands-off’ approach during implementation, relying on the Contractor's maturity and expertise. Additionally, criteria used for verification, validation, or assessment of the Contractor's work after contract award should align with the Contractor's performance criteria."
This lesson underscores:
- The importance of selecting contractors with mature processes (e.g., ISO 9001, CMMI) to reduce oversight burden and improve efficiency during project implementation.
- Consistency and compatibility of criteria used for verification and validation to ensure seamless collaboration and alignment between contractors and Government objectives.
Additional Lessons Learned on Implementing CMMI at NASA
NASA has gained significant insights from implementing and maintaining CMMI practices, contributing to measurable process improvements, better project outcomes, and enhanced efficiency across centers. Key lessons include:
Preparing for Appraisals: Appraisal activities provide a structured approach to improving processes, facilitating measurable performance enhancements that strengthen overall organizational capabilities.
Establishing Baselines for Process Maturity: The implementation of CMMI helps Centers establish a clear baseline of where their maturity level stands, enabling targeted process improvement and benchmarking.
Development of Organizational Toolsets:
- Organizations often create tools like templates, spreadsheets, and checklists to assist projects in implementing and complying with CMMI practices.
- The use of such toolsets accelerates projects' compliance, especially when tailored for smaller development efforts.
Support for Small Projects: Toolsets and lightweight processes are particularly helpful for small projects, ensuring they can implement CMMI practices without becoming overwhelmed. This is critical for one-person tasks or small teams where excessive overhead could impede progress.
Mentorship and Knowledge Sharing:
- Software engineers who have participated in the CMMI process serve as mentors, guiding new projects in utilizing tools and tailoring development processes effectively.
- Mentorship improves cultural adoption of CMMI practices within teams.
Cross-Department Sponsorship and Collaboration: The CMMI process encourages the establishment of sponsorship across departments, including engineering management, fostering an organization-wide commitment to implementing process improvements.
Collaborations with Lead Appraisers: Early engagement with certified CMMI Lead Appraisers ensures alignment during appraisals and helps organizations navigate the complexities of implementing process improvements effectively.
Artifact Collection and Management:
- Developing Process Implementation Indicator Documents (PIIDs) benefits from a reliable artifact collection and data management process.
- Proper artifact tracking and organization are essential for successful appraisals and process reviews.
Workshops for Process Review and Reinforcement: CMMI workshops provide in-depth opportunities to review process areas and reinforce the use of organizational tools and practices.
Tracking and Progress Monitoring: Implementing CMMI practices establishes methods for tracking progress in software development activities, improving transparency and accountability.
Improvements in Key Process Areas:
- CMMI significantly enhances project management practices and strengthens software configuration management.
- Systematic feedback from CMMI assessments identifies areas for both process and project improvement.
Analytical and Systematic Feedback: Projects value analytical feedback that identifies gaps and improvement opportunities, helping them refine their development practices.
Challenges in Measurement and Analysis: The measurement and analysis process in CMMI can be particularly challenging, requiring focused effort and sufficient resourcing to yield actionable insights.
Early Management Plan Improvements:
- CMMI improves the quality and review of management plans early in the project lifecycle.
- These plans become reusable for new projects, providing consistency and reducing time spent on rework in future efforts.
Limited Value of Resource Planning at Low Levels: Resource planning and tracking at the individual process level often provide minimal additional benefit to projects, especially for smaller-scale efforts.
Lightweight Processes for Small Projects: Smaller projects require streamlined, lightweight processes to avoid excessive overhead, ensuring processes remain scalable and manageable.
Summary of Benefits from CMMI Implementation
NASA has realized a wide range of benefits from implementing and maintaining CMMI practices, including:
- Improved process efficiency and reduced likelihood of software failure, enhancing mission safety.
- Faster compliance and adoption of best practices with the use of tools and templates.
- Better coordination and sponsorship across departments due to the structured approach brought by CMMI.
- Enhanced opportunities for mentorship and guidance, enabling the effective tailoring of practices.
- Systematic feedback enabled by CMMI assessments, resulting in actionable insights for improving process maturity.
- Improved management plans and reusability, reducing development effort for subsequent projects.
- Better support for smaller projects, ensuring scalability and efficiency without overwhelming teams with unnecessary overhead.
NASA's experience with CMMI highlights its value as a framework for achieving measurable process improvements, supporting efficient collaboration with contractors, and promoting maturity in software development efforts. These lessons provide clear guidance on improving processes at Centers, as well as strategies for leveraging CMMI to strengthen NASA's ability to deliver high-quality software in complex mission environments.
Additional CMM/CMMI Lessons Learned by NASA associated with implementing and maintaining this requirement are:
- Preparing for an appraisal helps you get measurable process improvement.
- CMMI process helped Centers establish a baseline of where they are.
- Organizations develop an extensive set of "tools" (i.e., templates, spreadsheets) to help projects with CMMI practices and artifacts.
- The use of a toolset helped projects reach compliance much faster.
- The use of organizational tools helps support small project development efforts.
- Software Engineers that have participated in the CMMI process can be mentors that can help implement project tools and help projects utilize and tailor the software development processes.
- The CMMI process helps establish sponsorship across departments and with Engineering management.
- Establish a relationship early with the CMMI Lead Appraiser.
- PIID development depends on a good artifact collection process and a data management approach.
- The CMMI workshops can be used to review the processes in-depth and reinforce the toolsets.
- The CMMI process helped establish a method of tracking progress on software development activities.
- The CMMI process improves project management and software configuration management areas.
- CMMI assessments help identify areas for process and project improvement.
- Projects appreciate systematic and analytical feedback on what they are doing.
- Measurement and analysis are a big challenge in the CMMI process.
- Improved quality and review of management plan early in the life cycle and reuse of the plans for new projects.
- Resource planning and tracking at the individual process level provided little additional benefit to the projects.
- Smaller projects need to have lightweight processes to avoid being smothered (especially for a one-person task).
6.2 Other Lessons Learned
- As part of its annual review, the Aerospace Safety Advisory Panel included this finding in the Computer Hardware/Software section of its Annual Report for 2000 422: "NASA has initiated plans to have its critical systems processes evaluated according to the Capability Maturity Model (CMM) of the Software Engineering Institute and to work toward increasing the CMM level of its critical systems processes."
- Evaluate all software problem reports or software change tickets identified as ‘no impact’ to design or testing. Ensure the rationale is adequate and, if it is found inadequate, perform a delta design/code review to ensure the code and data are compliant with the requirements.
- Enforce policy to use controlled design artifacts (ICDs, SRSs, SDDs) for implementation and verification purposes, rather than relying on informal design information:
  - Controlled content must be sufficient for implementation and verification purposes.
  - Software problem reports or software change tickets must be closed only based on formally controlled content.
7. Software Assurance
- For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
- For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.
7.1 Tasking for Software Assurance
1. Confirm that Class A and B software is acquired, developed, and maintained by an organization with a non-expired CMMI-DEV rating, as per the NPR 7150.2 requirement.
2. Assess potential process-related issues, findings, or risks identified from the CMMI assessment findings.
7.2 Software Assurance Products
Objective
The Software Assurance (SA) activities for this requirement aim to verify compliance with the Capability Maturity Model Integration for Development (CMMI-DEV) framework while supporting process audits, risk identification, and mitigation for improving software development and maintenance practices. NASA leverages these assurance efforts to ensure software is developed by mature, capable organizations and to uphold NPR 7150.2 requirements, fostering mission-critical success.
Deliverables and Activities
Assessment of CMMI Assessment Findings
- Review findings from formal CMMI appraisal reports conducted by the organization.
- Confirm that the organization's software development practices align with required CMMI rating levels (e.g., Level 2 or Level 3 depending on software class).
- Identify discrepancies between process maturity practices and requirements, document these discrepancies, and recommend corrective action plans.
Software Assurance Process Audit Report
- Conduct independent audits to evaluate whether the organization's processes adhere to required CMMI practices and NPR 7150.2 process areas.
- Document areas of non-conformance and gaps against defined CMMI process areas.
- Include findings from both compliance audits and standards audits as related to software process maturity.
Software Development Processes and Practices Audit Report
- Verify that the organization’s development practices (e.g., requirements management, risk handling, configuration management) conform to CMMI standards and project requirements.
- Evaluate and report any deviations observed during audits of the software lifecycle activities, including process reviews and work product evaluations.
Identification of Process-Related Risks from CMMI Appraisal Findings
- Analyze risk exposure based on CMMI appraisal findings or internal process audits.
- Identify risks related to deficiencies in process maturity or organizational practices.
- Collaborate with the project team to document mitigation strategies for high-severity risks (e.g., "red" risks).
Confirmation of CMMI-DEV Rating Validity
- Provide evidence that the organization performing development or maintenance has achieved the required non-expired CMMI-DEV rating.
- Validate this evidence via the CMMI Published Appraisal Results website or inquire directly with the CMMI Institute for appraisal verification.
7.3 Metrics
Tracking and analyzing metrics are critical for measuring software assurance effectiveness and identifying trends over time. Metrics offer actionable insights into process deficiencies, risks, and opportunities for improvement.
Process Compliance Metrics
- Number of software process non-conformances (NCs) identified by lifecycle phase and their corrective actions over time.
- Trends in NC counts for process audits and standards audits, helping identify recurring issues or areas needing improvement.
Audit Coverage Metrics
- Planned vs. performed compliance audits (e.g., audits against NPR 7150.2, CMMI-DEV process areas, or organizational procedures).
- Count of open vs. closed NCs based on audit findings to monitor resolution progress.
Non-Conformance Trends
- Longitudinal tracking of non-conformances per audit (including findings from process maturity assessments and compliance audits).
- Distribution of process NCs by type (e.g., failure to perform activities, deviations from standards), enabling targeted corrective measures.
Risk Metrics
- Counts of risks by severity classification (e.g., red, yellow, green) and their mitigation status.
- Trends showing risks mitigated vs. risks escalating over time.
- Number of risks with formal mitigation plans compared to the total number of risks identified.
Effectiveness of Corrective Actions
- Track trends in open vs. closed risks and NCs, ensuring timely closure and resolution.
- Evaluate how quickly process deficiencies are addressed and whether corrective measures are preventing recurrence; a minimal closure-tracking sketch follows.
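As referenced above, closure timeliness can be computed directly from corrective-action records. The following minimal sketch assumes illustrative records with opened/closed dates; a real project would export these from its issue tracker.

```python
from datetime import date

# Illustrative corrective-action records; dates and IDs are hypothetical.
actions = [
    {"id": "CA-1", "opened": date(2024, 1, 10), "closed": date(2024, 2, 2)},
    {"id": "CA-2", "opened": date(2024, 3, 5),  "closed": None},  # still open
    {"id": "CA-3", "opened": date(2024, 4, 1),  "closed": date(2024, 4, 20)},
]

closed = [a for a in actions if a["closed"] is not None]
open_count = len(actions) - len(closed)
avg_days = sum((a["closed"] - a["opened"]).days for a in closed) / len(closed)

print(f"open: {open_count}, closed: {len(closed)}, avg days to close: {avg_days:.1f}")
```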
7.4 Guidance
This section provides specific actions to ensure compliance and effectiveness in implementing Software Assurance for CMMI requirements.
Project Funding Responsibilities
The project must allocate resources to fund required CMMI evaluations and to address risks identified during these evaluations. This funding ensures that evaluations are conducted thoroughly and any process deficiencies are remediated.
Understanding CMMI-DEV
CMMI-DEV is a globally recognized framework for process improvement that provides structured best practices across critical software development areas, including requirements management, decision-making, risk visibility, and more. Adhering to these practices directly enhances NASA software project outcomes, reducing risks and improving efficiency and product quality.
Verification of Organizational CMMI Compliance
The project team and Software Assurance personnel must verify the CMMI certifications of organizations performing software development or maintenance tasks.
- Class A Software: Requires a CMMI-DEV Level 3 or higher rating to ensure that the organization has mature, repeatable, and scalable processes.
- Class B Software: Requires a CMMI-DEV Level 2 or higher rating for foundational process maturity.
For Class B software exceptions, NASA may choose to work with a "best-in-class provider" that does not meet the required maturity level. In these cases:
- Conduct an evaluation against CMMI Level 2 practices using a qualified evaluator.
- Address identified deficiencies as part of mitigation planning.
- Notify the Office of the Chief Engineer (OCE) and Center Engineering Technical Authority with the results of the evaluation and any mitigations implemented.
Risk Assessment and Mitigation
The project must cooperate with engineering and safety assurance stakeholders to track and mitigate risks identified during CMMI evaluations. Risks should be prioritized based on severity, and mitigation plans should be approved by appropriate technical authorities.
Audit Findings and Guidance
Audit findings must be communicated promptly to the project team, engineering, and assurance personnel. Ensure findings are documented with sufficient detail to guide resolution activities.
- Work collaboratively with the project to address gaps identified in compliance and standards audits.
- Ensure audit findings are tracked and closed within established timelines.
Best Practice Recommendations
Early Engagement with Evaluators
Establish relationships with qualified evaluators or Lead Appraisers early in the process. Early collaboration improves alignment, minimizes delays, and facilitates a smoother evaluation experience.
Artifact Management
Develop and maintain robust artifact collection and data management processes to support CMMI evaluations and audits efficiently. Organize artifacts according to Practice Implementation Indicator Description (PIID) requirements to facilitate appraisal readiness.
Support for Small Projects
Tailor lightweight processes for smaller projects to avoid excessive overhead. Tools, templates, and streamlined practices can enable small teams to meet CMMI requirements efficiently without compromising compliance.
Systematic Risk Reviews
Regularly review risks and outcomes from audits and evaluations to ensure proactive identification and management of emerging process risks. Use trends to refine risk mitigation strategies.
Sponsorship Across Departments
Build cross-department sponsorship for CMMI activities to ensure consistent buy-in from engineering, assurance, and procurement stakeholders. This collaboration fosters a shared commitment to process improvement.
Efficient Use of Metrics
Analyze metrics from audits, non-conformance (NC) resolutions, and risk mitigations periodically to refine project processes and improve performance. Trends offer a powerful mechanism for identifying recurring gaps and planning targeted interventions, as sketched below.
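To make the trend idea concrete, here is a minimal Python sketch that groups findings by category and flags categories recurring across audits. The record layout, audit IDs, and category names are invented for illustration; substitute the project's actual metrics repository.

```python
# Hypothetical sketch: identifying recurring process gaps across audits.
# Audit IDs and finding categories are invented for illustration.
from collections import defaultdict

# Each record: (audit_id, finding_category)
findings = [
    ("AUD-01", "requirements traceability"),
    ("AUD-01", "configuration management"),
    ("AUD-02", "requirements traceability"),
    ("AUD-03", "requirements traceability"),
    ("AUD-03", "risk tracking"),
]

audits_by_category: dict[str, set[str]] = defaultdict(set)
for audit_id, category in findings:
    audits_by_category[category].add(audit_id)

# A category seen in more than one audit suggests a systemic gap that
# warrants a targeted intervention, not just one-off corrective action.
for category, audits in sorted(audits_by_category.items(),
                               key=lambda kv: -len(kv[1])):
    label = "recurring" if len(audits) > 1 else "isolated"
    print(f"{category}: {len(audits)} audit(s) [{label}]")
```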
Conclusion
The detailed Software Assurance framework outlined above supports the implementation of CMMI assessments, process audits, risk mitigation activities, and compliance tracking. By adopting a structured and metrics-driven approach, NASA can ensure its software teams operate under mature, repeatable processes, enhancing mission success, software quality, and overall safety. Proactive effort in audits, evaluations, and action on process-related risks will ensure continued adherence to NPR 7150.2 and sustained organizational excellence.
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
8. Objective Evidence
This requirement ensures that the organization responsible for developing or maintaining NASA software demonstrates an appropriate level of software process maturity, as established by the Capability Maturity Model Integration for Development (CMMI-DEV) framework. Below, specific types of verifiable artifacts and documentation are identified as objective evidence to demonstrate compliance with this requirement.
Objective Evidence by Category
1. CMMI-DEV Certification Evidence
The primary evidence required to demonstrate compliance with this requirement is the CMMI-DEV certification documentation for the organization performing the software development or maintenance.
Must Include:
- CMMI-DEV Maturity Level Rating Certificate:
- A valid certificate issued by a CMMI Institute Certified Lead Appraiser.
- For Class A software: The certificate must show Maturity Level 3 or higher for software development processes.
- For Class B software: The certificate must show Maturity Level 2 or higher for software development processes.
- Certificates must not be expired (expiration dates are typically 3 years from the date of appraisal).
- Scope of Certification:
- The certificate, or supporting documentation, must clearly state:
- The organizational unit and locations covered by the appraisal.
- Whether the certification specifically applies to software development processes.
- Appraisal Reference:
- Details of the appraisal method used (e.g., Standard CMMI Appraisal Method for Process Improvement [SCAMPI] A, which is the formal appraisal method).
Examples of Evidence:
- Example 1: A valid certificate reading:
“This certifies that [Organization X] has been appraised at Maturity Level 3 on the CMMI-DEV model for its Software Development processes under SCAMPI-A by Certified Lead Appraiser [Name], issued on [Date], valid until [Date].”
- Example 2: A summary report from the Lead Appraiser confirming that the appraisal covers software development processes for the organizational unit delivering Class A or Class B software.
2. Appraisal Report and Findings
Beyond the CMMI-DEV certificate, the final appraisal report (or an excerpt) can provide supporting evidence to ensure the certification meets the specific criteria required for NASA software classification.
Must Include:
- Appraisal Results:
- Evidence that the appraisal covered development processes relevant to NASA software.
- Objectives of the appraisal, including specific activities/process areas assessed that align with the required Maturity Level.
- Organizational Scope:
- Detailed scope of the appraisal confirming that the certified organizational unit is the same one performing the software work for the project.
- Timeline Evidence:
- Confirmation that the certification was current and valid at the time the software contract was initiated or during the lifecycle phases (development/maintenance).
Examples of Evidence:
- Excerpt from the Final Appraisal Findings Report that includes:
- Appraisal results and corresponding process areas evaluated (e.g., Requirements Management, Configuration Management, and Testing for software development).
- Dates of the appraisal with confirmation that it overlaps with the contractual obligations for software delivery.
- Correspondence from the certified Lead Appraiser confirming the appraisal’s validity and scope.
3. Organizational Review Records
Documentation from NASA’s review processes ensuring that the software provider meets the required CMMI-DEV Maturity Level before an acquisition, contract, or project approval.
Must Include:
- Records of Supplier Evaluation:
- Reports or checklists from evaluations conducted by the project team, specifically verifying that the software provider’s CMMI certification meets the requirement.
- Supplier Selection Documentation:
- Rationale for selecting the supplier, including confirmation of CMMI maturity level compliance documented during the procurement process.
- Independent Verification by NASA:
- Results of NASA-conducted audits or reviews ensuring the supplier’s certification compliance.
Examples of Evidence:
- Records from contract bidding evaluations with clear evidence that the supplier was assessed specifically for:
- CMMI Maturity Level (matching the class of the software—Maturity Level 2 or 3).
- Recency and validity of CMMI certification.
- A Compliance Checklist from a Test Readiness Review (TRR), System Requirements Review (SRR), or other milestone review showing certification approval.
- Supplier Demonstration Evidence:
- Slide presentations, correspondences, or proof submitted by the supplier to NASA showing CMMI compliance.
4. Software Classification Evidence
The classification of software (Class A or Class B) determines the required CMMI-DEV Maturity Level. Documentation formally identifying the software class is essential evidence.
Must Include:
- Software Classification Worksheets:
- Worksheets or reports indicating the NASA-relevant software classification (as defined in NPR 7150.2 or NPR 8705.4).
- Evidence that the class was correctly assessed and matched to the required CMMI Maturity Level.
Examples of Evidence:
- A signed worksheet for the initial classification of the software, including rationale (e.g., mission criticality, complexity, safety).
- Correspondence showing concurrence on the software classification by project stakeholders (e.g., the project manager, SA lead, etc.).
5. Periodic Review and Maintenance of Certification
For organizations providing long-term software development or maintenance, additional objective evidence is required to ensure that the certification is valid for the project duration.
Must Include:
- Evidence that the organization monitors certification and renews it as needed within the appropriate timeline (a simple validity-window check is sketched after the examples below).
- Correspondence confirming validity post-initial delivery (e.g., re-certification efforts during system maintenance).
Examples of Evidence:
- Re-certification Documentation:
- New or updated certificates for multiyear projects.
- Periodic Compliance Audit Reports:
- Reports showing that compliance with the requirement is assessed and maintained.
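Because ratings typically run three years from the appraisal date, a simple date check shows whether a rating covers the remaining period of performance. The Python sketch below is a planning aid only: the three-year window reflects the typical validity noted above, while the 180-day re-appraisal lead time and the `renewal_status` helper are assumed values invented for this example, not policy.

```python
# Hypothetical sketch: does the current rating cover the period of
# performance, and if not, when should re-appraisal planning start?
# Three-year validity is typical; the 180-day lead time is assumed.
from datetime import date, timedelta

VALIDITY_YEARS = 3
REAPPRAISAL_LEAD = timedelta(days=180)

def renewal_status(appraisal_date: date, project_end: date) -> str:
    # Anniversary arithmetic; ignores the Feb 29 edge case.
    expiration = appraisal_date.replace(
        year=appraisal_date.year + VALIDITY_YEARS)
    if expiration >= project_end:
        return f"Rating (expires {expiration}) covers the period of performance."
    start_by = expiration - REAPPRAISAL_LEAD
    return (f"Rating expires {expiration}, before project end {project_end}: "
            f"begin re-appraisal planning by {start_by}.")

print(renewal_status(date(2023, 6, 1), date(2028, 12, 31)))
```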
6. Exceptions Documents
In rare cases, certain exceptions or nuances (e.g., Class B software for NASA Class D payloads) may apply.
Must Include:
- Documentation showing an exception was approved in alignment with applicable policies (e.g., NPR 8705.4).
- Approval documentation from the appropriate Engineering Technical Authority (ETA) or Software Engineering Process Group (SEPG) showing an allowance for deviation, as applicable.
Examples of Evidence:
- Signed approval of exception request by the governing authority.
- Documentation showing alternative means of ensuring quality and process maturity for non-standard cases or exceptions.
Summary of Objective Evidence for Requirement 3.9.2
| Category | Examples of Evidence |
|---|---|
| CMMI-DEV Certification | Valid CMMI-DEV certificate (Maturity Level 3 for Class A, Level 2 for Class B) issued by a Certified Lead Appraiser. |
| Appraisal Report and Findings | Final appraisal report, scope, results, and timeline confirming relevance to software processes. |
| Organizational Review Records | Supplier evaluation checklists, milestone reviews (e.g., SRR, TRR) confirming CMMI compliance. |
| Software Classification | Worksheets and rationale confirming Class A or Class B classification tied to CMMI requirements. |
| Periodic Compliance | Re-certification documentation or periodic audit reports demonstrating compliance maintenance. |
| Exceptions Documents | Policy-based deviation approvals or alternative compliance strategies (e.g., formal approval of exceptions). |
This comprehensive set of objective evidence ensures that the development or maintenance of software meets CMMI-DEV requirements as specified in SWE-032.