SWE-004 - OCE Benchmarking

1. Requirements

2.1.1.2 The NASA OCE shall periodically benchmark each Center's software engineering capability against requirements in this directive. 

1.1 Notes

Capability Maturity Model® Integration (CMMI®) for Development (CMMI-DEV) appraisals are the preferred benchmarks for objectively measuring progress toward software engineering process improvement at NASA Centers.

1.2 History

SWE-004 - Last used in rev NPR 7150.2D

Rev A

1.2.3 The NASA Headquarters' Chief Engineer shall periodically benchmark each Center's software engineering capability against its Center Software Engineering Improvement Plan.

Difference between A and B: No change.

Rev B

2.1.1.2 The NASA CE shall periodically benchmark each Center's software engineering capability against its Center Software Engineering Improvement Plan.

Difference between B and C: Removed reference to the Center Software Engineering Improvement Plan (SEIP) and added "per the requirements in this directive."

Rev C

2.1.1.2 The NASA OCE shall periodically benchmark each Center's software engineering capability against requirements in this directive.

Difference between C and D: No change.

Rev D

2.1.1.2 The NASA OCE shall periodically benchmark each Center's software engineering capability against requirements in this directive.



2. Rationale

The Headquarters Office of the Chief Engineer (OCE) is responsible for ensuring that the Agency-level software engineering requirements and policies are being followed throughout the Agency.

3. Guidance

The Headquarters OCE satisfies this requirement through a number of methods:

  • OCE Center Surveys. OCE personnel conduct periodic assessments of compliance at the Centers and within programs/projects to verify that they are meeting this responsibility.
  • Review of the Capability Maturity Model Integration (CMMI®) appraisal results.
  • Review and participate in program and project reviews.
  • Review of Center and project planning documents, schedule, and progress.
  • Review of Center and project waivers.
  • Feedback and status presentations provided by the Centers during the NASA Software Working Group activities.
  • Yearly task agreements between the Centers and the NASA Headquarters OCE.
  • Feedback and discussion from NASA Software Working Group members and Mission Software Steering Committee members.
  • Project status and feedback provided to the NASA Headquarters OCE.
  • Software inventory data.
  • External Agency inquiries.

The Headquarters OCE performs Center and organizational surveys. The OCE uses these surveys to provide oversight, maintain internal control, review operations, and assess compliance with Agency policy. The OCE appraisal process addresses several objectives:

  • Review Center and specified NASA Headquarters organizations' processes and infrastructure for compliance with OCE requirements and policies.
  • Review specific program/project "files" for compliance with requirements and policies.
  • Identify systemic problems or deficiencies.
  • Recognize areas of excellence/best practices.
  • Receive Center feedback regarding areas where Agency policy and requirements may need to be modified.

Currently, the OCE software surveys focus on the following core elements (a simple tracking sketch follows the list):

  • Compliance to Agency-level software engineering requirements and policies.
  • Software Engineering technical authority implementation at a Center.
  • Software engineering dissenting opinions and waiver process at a Center.
  • Software engineering management.
  • Incorporation of software engineering lessons learned.
  • Software Documentation and Records.
  • Software Classification process and levels.
  • Software Engineering Training approach.
  • Software Risk Management approach.
  • Software Cost Estimates and Resources Allocation.
  • Software insight responsibilities.
  • Use of software engineering discipline for software associated with Programmable Logic Devices.
  • Software Architecture and Detail Design.
  • Software Safety and assurance practices.
  • Software Inventory data.
  • Inter-Center Software Work.
  • Use and recording of software metric data.
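
Taken together, these core elements function as a compliance checklist that a survey team works through for each Center. As a purely illustrative sketch (the status categories, field names, and reporting shape are assumptions, not an actual OCE survey artifact), findings per element could be tracked in a simple structure like this:

```python
from dataclasses import dataclass, field
from enum import Enum


class Status(Enum):
    """Illustrative finding categories; actual OCE survey ratings may differ."""
    COMPLIANT = "compliant"
    DEFICIENCY = "deficiency"
    BEST_PRACTICE = "best practice"
    NOT_REVIEWED = "not reviewed"


@dataclass
class SurveyElement:
    name: str                       # e.g., "Software Risk Management approach"
    status: Status = Status.NOT_REVIEWED
    notes: str = ""                 # evidence or observations


@dataclass
class CenterSurvey:
    center: str
    elements: list[SurveyElement] = field(default_factory=list)

    def deficiencies(self) -> list[SurveyElement]:
        """Systemic problems or deficiencies identified during the survey."""
        return [e for e in self.elements if e.status is Status.DEFICIENCY]


# Hypothetical usage with two of the core elements listed above.
survey = CenterSurvey(center="Example Center")
survey.elements.append(
    SurveyElement("Software Classification process and levels", Status.COMPLIANT))
survey.elements.append(
    SurveyElement("Use and recording of software metric data", Status.DEFICIENCY,
                  notes="Metric data not consistently recorded."))
print([e.name for e in survey.deficiencies()])
```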

While SWE-004 is written from the OCE point of view, the requirement also contains an inherent Center role, i.e., participation in the OCE-sponsored benchmarking activities. A Center's support of this requirement can be assessed by considering the extent of its preparations for and involvement in these benchmarking efforts. While the OCE Compliance Survey assesses overall Center response to the requirements of NPR 7150.2, the Capability Maturity Model Integration (CMMI®) for Development (CMMI-DEV) appraisals objectively benchmark the actual progress the Center makes toward software engineering process improvements. These CMMI appraisals are the preferred benchmarks for objectively measuring progress.

The CMMI-DEV benchmarking activities evaluate the Center's current and, with follow-on evaluations, improved capabilities in the specific and general practices of software engineering. The CMMI requirement is a qualifying requirement, included to ensure that NASA projects are supported by software development organizations having the necessary skills and processes in place to produce reliable products within cost and schedule estimates. This requirement provides NASA with a methodology to measure software development organizations against an industry-wide set of best practices that address software development and maintenance activities applied to products and services. The CMMI is a yardstick against which the maturity of an organization's product development and acquisition processes can be measured and compared with the industry state of the practice. The CMMI requirement provides NASA with an industry-standard approach to help measure and ensure compliance with the intent of the NPR 7150.2 process-related requirements. It also provides NASA with a common methodology to assess internal and external software development organizations' processes and helps NASA identify potential risk areas within a given organization's software development processes. See SWE-032 for rating requirements and the CMMI material on the Software Engineering Institute's website, which describes the current CMMI model used in the evaluation of a Center's software development capabilities.
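
To make the benchmarking mechanics concrete, the minimal sketch below compares an organization's appraised CMMI-DEV maturity level against a target level and tracks improvement across follow-on appraisals. The organization name, dates, and the target level of 3 are illustrative assumptions; SWE-032 defines the actual rating requirements.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Appraisal:
    """A CMMI-DEV appraisal result (fields are illustrative)."""
    organization: str
    maturity_level: int   # CMMI-DEV maturity levels range from 1 to 5
    appraised_on: date


def meets_benchmark(appraisal: Appraisal, required_level: int) -> bool:
    """True if the appraised maturity level meets or exceeds the target.

    required_level is an assumed input; consult SWE-032 for the levels
    NPR 7150.2 actually requires for a given software classification.
    """
    return appraisal.maturity_level >= required_level


# Hypothetical follow-on evaluations showing improvement over time.
history = [
    Appraisal("Example Center SW Org", 2, date(2019, 5, 1)),
    Appraisal("Example Center SW Org", 3, date(2022, 5, 1)),
]
for a in history:
    print(a.appraised_on, "meets ML3 benchmark:", meets_benchmark(a, 3))
```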

NASA's Software Engineering Initiative Improvement Plan (NSEIIP), as required by NPR 7150.2, called upon each Center to develop an approved Center Software Engineering Improvement Plan (hereafter referred to as the Center Plan). The approval of the Center Plan commits Center management and staff to the development of its software engineering capabilities and related software process improvements. The collection of approved Center Plans and periodic surveys serves as a basis for the NASA Headquarters OCE to assess and benchmark the compliance and progress being made at each Center and in aggregate across the Agency.

Additional guidance related to OCE benchmarking of Center Plan development and progress may be found in related requirements in this Handbook.


4. Small Projects

Typically, the OCE includes a small project in the survey activities at a Center. The OCE survey leader works with the Center software point of contact (POC) to develop the appropriate level of survey involvement for small projects.

5. Resources

5.1 References


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.
