SWE-032 - CMMI Levels for Class A and B Software

1. Requirements

3.9.3 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:

    1. For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
    2. For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.

1.1 Notes

Organizations need to complete an official CMMI Institute-defined appraisal against either the CMMI-DEV model V1.3 or V2.0. Organizations are to maintain their rating and have their results posted on the CMMI Institute website, or provide an Appraisal Disclosure Statement so that NASA can assess the current maturity/capability rating. Software development organizations need to maintain their appraisal rating during the period they are responsible for the development and maintenance of the software.

For Class B software, an exception can be exercised for those cases in which NASA wishes to purchase a product from the "best in class provider," but the best in class provider does not have the required CMMI® rating. For Class B software, instead of a CMMI® rating by a development organization, the project will conduct an evaluation, performed by a qualified evaluator selected by the Center Engineering Technical Authority, against the CMMI-DEV Maturity Level 2 practices, and mitigate any risk, if deficiencies are identified in the evaluation. If this approach is used, the development organization and project are responsible for correcting the deficiencies identified in the evaluation. When this exception is exercised, the OCE and Center Engineering Technical Authority are notified of the proposition and provided the results of the evaluation. The project manager should seek guidance from the Center Office of Procurement for help in making these determinations.

1.2 History

SWE-032 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A

2.5.1. The project shall ensure that software is acquired, developed, and maintained by an organization with a non-expired Capability Maturity Model Integration for Development (CMMI-DEV) rating  as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser as follows:

For Class A software:

CMMI-DEV Maturity Level 3 Rating or higher for software, or CMMI-DEV Capability Level 3 Rating or higher in all CMMI-DEV Maturity Level 2 and 3 process areas for software.

For Class B software:

CMMI-DEV Maturity Level 2 Rating or higher for software, or CMMI-DEV Capability Level 2 Rating or higher for software in the following process areas:

         a. Requirements Management.
         b. Configuration Management.
         c. Process and Product Quality Assurance.
         d. Measurement and Analysis.
         e. Project Planning.
         f. Project Monitoring and Control.
         g. Supplier Agreement Management (if applicable).

For Class C software:

The required CMMI-DEV Maturity Level for Class C software will be defined per Center or project requirements.

Difference between A and B

Updated "SEI" with "CMMI Institute" for lead appraiser authorizing/certifying body; Added exception for Class B sofware on Class D Payloads; Removed CMMI-Dev requirement for Class C software.

B

3.11.3 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute authorized or certified lead appraiser as follows:

a. For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software, or CMMI-DEV Capability Level 3 Rating or higher in all CMMI-DEV Maturity Level 2 and 3 process areas for software.

b. For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software, or CMMI-DEV Capability Level 2 Rating or higher for software in the following process areas:

(1)  Requirements Management.
(2)  Configuration Management.
(3)  Process and Product Quality Assurance.
(4)  Measurement and Analysis.
(5)  Project Planning.
(6)  Project Monitoring and Control.
(7)  Supplier Agreement Management (if applicable).

Difference between B and C

"Changed to CMMI ""Certified Lead Appraiser"".

Removed allowance for Capability Level 3 Rating or higher in all Maturity Level 2 and 3 process areas for software.

Removed Capability Level 2 option for specific process areas for Class B software. Now requires full Maturity Level 2 rating.

C

3.9.3 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:

    1. For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
    2. For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.

Difference between C and D

Added trademark symbol for CMMI to the requirements.

The notes under item b. have been updated for clarity.

D

3.9.2 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:

    1. For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
    2. For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.



1.3 Applicability Across Classes

Class           A       B       C       D       E       F

Applicable?    Yes     Yes     No      No      No      No

Key:  Yes - Applicable | No - Not Applicable

2. Rationale

The CMMI® requirement is a qualifying requirement for NASA.  The requirement is included to ensure that NASA projects are supported by software development organization(s) having the necessary skills and processes in place to produce reliable products within cost and schedule estimates. 

3. Guidance

The Capability Maturity Model (CMM®) and the CMMI®-DEV are internationally used frameworks for process improvement in development organizations. CMMI®-DEV is an organized collection of best practices and proven process areas. Practices cover topics that include eliciting and managing requirements, decision making, measuring performance, planning work, handling risks, and more. Using these practices, NASA can improve its software projects’ chances of mission success.

This requirement provides NASA with a methodology to:

  • Measure software development organizations against an industry-wide set of best practices that address software development and maintenance activities applied to products and services.
  • Measure and compare the maturity of an organization’s product development and acquisition processes with the industry state of the practice.
  • Measure and ensure compliance with the intent of the NPR 7150.2 process-related requirements using an industry-standard approach. 
  • Assess internal and external software development organizations' processes.
  • Identify potential risk areas within a given organization’s software development processes.

Benefits of using CMMI® include:

  • Reducing the risk of software failure - Increasing mission safety.
  • Improving the accuracy of schedule and cost estimates by requiring the use of historical data and repeatable methods.
  • Helping NASA become a smarter buyer of contracted out software.
  • Increasing quality by finding and removing more defects earlier.
  • Improving the potential for reuse of tools and products across multiple projects.
  • Increasing the ability to meet the challenges of evolving software technology.
  • Improving software development planning across the Agency.
  • Improving software engineering practice across the NASA contractor community.
  • Lowering the software development cost.
  • Improving employee morale.
  • Improving customer satisfaction.
  • Improving NASA and contractor community knowledge and skills.
  • Providing NASA a solid foundation and structure for developing software in a disciplined manner.

CMMI® ratings can cover a team, a workgroup, a project, a division, or an entire organization. When evaluating software suppliers, it’s important to make sure that the specific organization doing the software work on the project has the cited rating (as some parts of a company may be rated while others are not).

It’s important to note that SWE-032 makes a CMMI®-DEV rating an organizational qualifier to acquire, develop, or maintain software for or by NASA for Classes A and B.

Many of the requirements in NPR 7150.2 are consistent with the established process areas in the CMMI®-DEV framework. The CMMI®-DEV rating, as well as consistent NPR 7150.2 requirements, are both needed to ensure that organizations have demonstrated the capability to perform key software engineering processes and have a binding agreement to continue to execute key software engineering processes during the development of NASA’s most critical software systems. 

This requirement applies to software in Classes A and B.  It is recommended that projects check the status of the software development or maintenance organization's CMMI® rating at each major project life cycle review to ensure continued compliance and to identify potential risk areas in the software processes. A "check" can easily be done via the CMMI® Institute's Published Appraisals website 327.

General Software Acquisition Guidance:

The content of the supplier agreement is critical to the acquisition of any software, including software embedded in a delivered system. In addition to the CMMI® Maturity Level requirements placed on the supplier by SWE-032, the supplier agreement must also specify compliance with the software contract requirements identified in NPR 7150.2. The creation and negotiation of any supplier agreement involving software need to include representatives from the Center's software engineering and software assurance organizations to ensure that the software requirements are represented in the acquisition agreement(s). The agreements identify the following aspects of the acquisition:

  • Technical requirements on the software.
  • Definition and documentation of all software deliverables.
  • Required access to intermediate and final software work products throughout the development life cycle.
  • Compliance and permissible exceptions to NPR 7150.2 and any applicable Center software engineering requirements.
  • Software development status reporting including implementation progress, technical issues, and risks.
  • Definition of acceptance criteria for software and software work products.
  • Non-technical software requirements, including licensing, ownership, use of third-party or Open Source Software, and maintenance agreements.

Representatives from the Center's software engineering and assurance organizations must evaluate all software-related contract deliverables before acceptance by the Project. The deliverables must be evaluated for:

  • Compliance with acceptance criteria.
  • Completeness.
  • Accuracy.

Class A software – if you acquire, develop, or maintain Class A software, the organization performing the functions is required to have a non-expired CMMI®-DEV Level 3 or higher rating.

Class A software acquisition guidance

To ensure that the solicitation, contract, and delivered products meet the requirements of this NPR, the Project's acquisition team must be supported by representatives from a software engineering and software assurance organization that is either rated at CMMI®-DEV Maturity Level 3 or higher or rated at CMMI®-DEV Capability Level 3 in at least the process areas of Supplier Agreement Management and Process and Product Quality Assurance. This support may be in the form of direct involvement in the development of supplier agreements or review and approval of these agreements. The support must also include the review and approval of any software-related contract deliverables. The extent of the CMMI®-DEV Level 3 rated organization's support required for a Class A acquisition can be determined by the Center's Engineering Technical Authority responsible for the project.

Identification of the appropriate personnel from an organization that has been rated at a CMMI®-DEV Level 3 or higher to support the Project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center Management. The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA Software Engineering requirements are included in an acquisition.

For those cases in which a Center or project desires a general exclusion from the NASA Software Engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester can submit a waiver for those exclusions or alternate requirements in the form of a streamlined compliance matrix for approval by the designated Engineering and SMA Technical Authorities with appropriate justification.

Class A software development or maintenance guidance - The software organizations that directly develop or maintain Class A software are required to have a valid CMMI®-DEV Level 3 or higher rating for the organization performing the activities.  Support contracts supporting NASA in-house software development organizations can be included in the NASA organizational assessments.  Project contractors and subcontractors performing Class A software development are required to have their own CMMI®-DEV Level 3 or higher rating.  NASA and primes need to pass this requirement down in contracts to ensure all subcontractors have the necessary CMMI®-DEV rating.

The CMMI®-DEV Level 3 rating is to be maintained throughout the project’s development or maintenance period.  NASA requests that organizations’ CMMI® ratings be posted on the CMMI Institute website 327. The CMMI® Institute vets the validity of the CMMI® appraisals on this list and assures the rating hasn’t expired (as of this writing, CMMI® ratings are valid for 3 years). In rare instances (e.g., a rating earned in a classified environment), an organization may have a current CMMI®-DEV rating that doesn’t appear on the CMMI® Institute website. In these cases, the supplier’s claim can be checked directly with the CMMI® Institute.
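
As an illustration only, a project could keep a simple record of supplier appraisal dates and flag ratings that have expired or are approaching the end of the 3-year validity window noted above at each major life cycle review. The record layout and organization names below are hypothetical; the authoritative status always comes from the CMMI® Institute's Published Appraisal Results site.

# Illustrative sketch only: flag supplier CMMI-DEV ratings that are expired or
# close to the end of the 3-year validity window described above. The record
# fields and organization names are hypothetical; authoritative status comes
# from the CMMI Institute's Published Appraisal Results (PARS) site.
from datetime import date, timedelta

VALIDITY = timedelta(days=3 * 365)      # CMMI ratings are currently valid for 3 years
WARNING_WINDOW = timedelta(days=180)    # flag ratings expiring within ~6 months

suppliers = [
    {"org": "Example Prime",         "level": 3, "appraised": date(2023, 5, 10)},
    {"org": "Example Subcontractor", "level": 2, "appraised": date(2021, 1, 15)},
]

def rating_status(record, review_date=None):
    """Return a one-line status for a supplier's appraisal record."""
    review_date = review_date or date.today()
    expires = record["appraised"] + VALIDITY
    if expires < review_date:
        return f"{record['org']}: Maturity Level {record['level']} rating EXPIRED on {expires}"
    if expires - review_date <= WARNING_WINDOW:
        return f"{record['org']}: rating expires {expires}; confirm re-appraisal plans"
    return f"{record['org']}: Maturity Level {record['level']} rating current through {expires}"

for record in suppliers:
    print(rating_status(record))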

Class B software (except Class B software on NASA Class D payloads) - CMMI®-DEV Maturity Level 2 Rating or higher for software, or CMMI®-DEV Capability Level 2 Rating or higher for software in the following process areas:

a. Requirements Management.
b. Configuration Management.
c. Process and Product Quality Assurance.
d. Measurement and Analysis.
e. Project Planning.
f. Project Monitoring and Control.
g. Supplier Agreement Management (if applicable).


Class B software acquisition guidance -

To ensure that the solicitation, contract, and delivered products meet the requirements of this NPR, the Project's acquisition team must be supported by representatives from a software engineering and software assurance organization that is either rated at CMMI®-DEV Maturity Level 2 or higher or rated at CMMI®-DEV Capability Level 2 in at least the process areas of Supplier Agreement Management and Process and Product Quality Assurance. This support may be in the form of direct involvement in the development of supplier agreements or review and approval of these agreements. The support must also include the review and approval of any software-related contract deliverables.

The Center Engineering Technical Authority responsible for the project determines the extent of the CMMI®-DEV Level 2 rated organization's support required (see description in the previous paragraph) for a Class B acquisition. Identification of the appropriate personnel from an organization that has been rated at a CMMI®-DEV Level 2 or higher to support the Project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center Management. The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA Software Engineering requirements are included in an acquisition.

For those cases in which a Center or project desires a general exclusion from the NASA Software Engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester can submit a waiver in the form of a streamlined compliance matrix for those exclusions or alternate requirements for approval by the designated Engineering and SMA Technical Authorities with appropriate justification.

Class B software development or maintenance guidance - The software organizations that directly develop or maintain Class B software are required to have a valid CMMI®-DEV Level 2 or higher rating (via a Continuous or Staged representation) for the organization performing the activities.  Support contracts supporting NASA in-house software development organizations can be included in the NASA organizational assessments.  Project contractors and subcontractors performing Class B software development are required to have their own CMMI®-DEV Level 2 or higher rating.  The CMMI®-DEV Level 2 rating is to be maintained (kept active) during the development or maintenance period, and the rating is to be posted on the CMMI® Institute website 327.

Guidance on the exception for Class B software development and maintenance - If this option is used, the project is responsible for funding the evaluation and for addressing all risks that are identified during the evaluation.  A CMMI appraisal across the process areas listed in this requirement is one method for conducting this evaluation. The Center Engineering Technical Authority is responsible for maintaining all records associated with the evaluation for the life of the project.  Who participates in the evaluation process is determined by the responsible Center Engineering Technical Authority on the project.  Recommended guidance is that the “qualified evaluator” have demonstrated appraisal experience or training.

Guidance on Class B software on NASA Class D payloads (as defined in NPR 8705.4) and Class C software - While not required, it is highly recommended that providers have a Certified CMMI® Lead Appraiser conduct periodic informal evaluations against process areas chosen by the project and project engineering based on the risk associated with the project.  The project determines if an assessment is needed, identifies the required areas for the assessment, and communicates this information to the provider. A sample assessment process, “Process for Evaluation in Lieu of CMMI® Appraisal,” can be found in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.  

4. Small Projects

The National Defense Industrial Association (NDIA) CMMI® Working Group conducted a study on the use of CMMI®-DEV within small businesses in 2010 158. One of the counter-intuitive findings was that "Perceptions that CMMI® is too burdensome for small businesses is not supported by data on CMMI®-DEV adoption". Significant numbers of organizations in the 1-20 employee range adopted CMMI® and achieved Level ratings.

Small projects are expected to take advantage of the Agency, Center, and/or organizational assets. 

5. Resources

5.1 References


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Acquisition Philosophy and Mechanism. Lesson Number 1414 553: "Since the procurement's goal is to minimize the time from start to finish, part of its philosophy is to instill efficiency into the Contractor-Government roles and relationships. Thus, it becomes paramount during the selection process to ensure that the Contractor's processes, procedures, and tools are adequate (as based on some established criteria such as ISO 9001 and/or CMMI) to allow the Government to take a 'hands-off' approach during implementation. Also, any criteria to be used to verify/validate and/or assess the Contractor's work after the contract award must be consistent and compatible with the performance criteria levied on the Contractor."

Additional CMM/CMMI Lessons Learned by NASA associated with implementing and maintaining this requirement are:


      • Preparing for an appraisal helps you get measurable process improvement.
      • CMMI process helped Centers establish a baseline of where they are.
      • Organizations develop an extensive set of "tools" (i.e., templates, spreadsheets) to help projects with CMMI practices and artifacts.
      • The use of a toolset helped projects reach compliance much faster.
      • The use of organizational tools helps support small project development efforts.
      • Software engineers who have participated in the CMMI process can serve as mentors who help implement project tools and help projects utilize and tailor the software development processes.
      • The CMMI process helps establish sponsorship across departments and with Engineering management.
      • Establish a relationship early with the CMMI Lead Appraiser.
      • PIID development depends on a good artifact collection process and a data management approach.
      • The CMMI workshops can be used to review the processes in-depth and reinforce the toolsets.
      • The CMMI process helped establish a method of tracking progress on software development activities.
      • The CMMI process improves project management and software configuration management areas.
      • CMMI assessments help identify areas for process and project improvement.
      • Projects appreciate systematic and analytical feedback on what they are doing.
      • Measurement and analysis are a big challenge in the CMMI process.
      • Improved quality and review of management plan early in the life cycle and reuse of the plans for new projects.
      • Resource planning and tracking at the individual process level provided little additional benefit to the projects.
      • Smaller projects need to have lightweight processes to avoid being smothered (especially for a one-person task).

6.2 Other Lessons Learned

  • As part of its annual review, the Aerospace Safety Advisory Panel included this finding in the Computer Hardware/Software section of its Annual Report for 2000 422: "NASA has initiated plans to have its critical systems processes evaluated according to the Capability Maturity Model (CMM) of the CMMI Institute and to work toward increasing the CMM level of its critical systems processes."
  • Evaluate all software problem reports or software change tickets identified as ‘no impact’ to design or testing. Ensure the rationale is adequate, and if found inadequate perform a delta design/code review to ensure the code and data are compliant with the requirements.
  • Enforce policy to use controlled design artifacts (ICDs, SRSs, SDDs) for implementation and verification purposes, rather than relying on informal design information.
    • Controlled content must be sufficient for implementation and verification purposes.
    • Software problem reports or software change tickets must be closed only based on formally controlled content.

7. Software Assurance

SWE-032 - CMMI Levels for Class A and B Software
3.9.3 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:
    1. For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
    2. For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.

7.1 Tasking for Software Assurance

  1. Confirm that Class A and B software that is acquired, developed, and maintained by NASA is performed by an organization with a non-expired CMMI-DEV rating, as per the NPR 7150.2 requirement.

  2. Assess potential process-related issues, findings, or risks identified from the CMMI assessment findings.

  3. Perform audits on the software development and software assurance processes. 

7.2 Software Assurance Products

  • Assessment of CMMI Assessment Findings
  • Software Assurance Process Audit Report
  • SW Development Processes and Practices Audit Report
  • Identification of any process-related risks and findings from the CMMI appraisal.


    Objective Evidence

    • Evidence of confirmation that the organization has the required CMMI-DEV rating.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • # of software process Non-Conformances by life-cycle phase over time.
  • # of Compliance Audits planned vs. # of Compliance Audits performed
  • # of Open vs. Closed Audit Non-Conformances over time
  • Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)
  • # of Non-Conformances per audit (including findings from process and compliance audits, process maturity)
  • # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project
  • Trends of # Open vs. # Closed over time
  • # of Risks by Severity (e.g., red, yellow, green) over time
  • # of Risks with mitigation plans vs. total # of Risks
  • # of Risks trending up over time
  • # of Risks trending down over time
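
A minimal sketch of how a few of the metrics above might be tallied from audit records is shown below. The record layout, field names, and values are assumptions for illustration, not a defined software assurance data format.

# Illustrative sketch only: tally a few of the metrics listed above from a
# hypothetical set of audit non-conformance records.
from collections import Counter

nonconformances = [
    {"audit": "SA-Audit-01", "phase": "Design",         "status": "Closed"},
    {"audit": "SA-Audit-01", "phase": "Design",         "status": "Open"},
    {"audit": "SA-Audit-02", "phase": "Implementation", "status": "Open"},
]

# # of Open vs. Closed audit non-conformances
print("Open vs. Closed:", dict(Counter(nc["status"] for nc in nonconformances)))

# # of non-conformances per audit
print("Per audit:", dict(Counter(nc["audit"] for nc in nonconformances)))

# # of software process non-conformances by life-cycle phase
print("By phase:", dict(Counter(nc["phase"] for nc in nonconformances)))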

7.4 Guidance

The project is responsible for funding the required CMMI evaluation and for addressing all risks that are identified during the CMMI evaluation.

The Capability Maturity Model Integration for Development (CMMI®-DEV) is an internationally used framework for process improvement in development organizations. It is an organized collection of best practices and proven process areas. Practices cover topics that include eliciting and managing requirements, decision making, measuring performance, planning work, handling risks, and more. Using these practices, NASA can improve its software projects’ chances of mission success.

This requirement provides NASA with a methodology to:

  • Measure software development organizations against an industry-wide set of best practices that address software development and maintenance activities applied to products and services.
  • Measure and compare the maturity of an organization’s product development and acquisition processes with the industry state of the practice.
  • Measure and ensure compliance with the intent of the NPR 7150.2 process-related requirements using an industry-standard approach. 
  • Assess internal and external software development organizations' processes.
  • Identify potential risk areas within a given organization’s software development processes.

Look at the CMMI Published Appraisal Results site 327, https://cmmiinstitute.com/pars/, and verify that the software organization's appraisal results are listed.

For Class B software, an exception can be exercised for those cases in which NASA wishes to purchase a product from the "best in class provider," but the best in class provider does not have the required CMMI® rating. For Class B software, instead of a CMMI® rating by a development organization, the project will conduct an evaluation, performed by a qualified evaluator selected by the Center Engineering Technical Authority, against the CMMI-DEV Maturity Level 2 practices, and mitigate any risk, if deficiencies are identified in the evaluation. If this approach is used, the development organization and project are responsible for correcting the deficiencies identified in the evaluation. When this exception is exercised, the OCE and Center Engineering Technical Authority are notified of the proposition and provided the results of the evaluation. The project manager should seek guidance from the Center Office of Procurement for help in making these determinations.

For Class B software, the project can conduct an evaluation, performed by a qualified evaluator selected by the Center Engineering Technical Authority, against the CMMI-DEV Maturity Level 2 practices and mitigate any risks if deficiencies are identified in the evaluation.  Look for the results in the assessment report produced by that evaluator. Verify that the assessment was approved by a qualified evaluator selected by the Center Engineering Technical Authority, and verify that the identified risks have been addressed by the project, engineering, and SMA as needed.

Every task that involves performing an audit should also ensure that all audit findings are promptly shared with the project so that they can be addressed.


