This version of SWEHB is associated with NPR 7150.2B.
SWE-129: The NASA OCE shall authorize appraisals against selected requirements in this NPR to check compliance.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
The Headquarters' Office of the Chief Engineer (OCE) is responsible for promoting and monitoring software engineering practices throughout the agency. It achieves this in part by administering software requirements, policy, procedures, processes, statutes, and regulations. The Headquarters' OCE uses continuing periodic oversight of compliance at the Centers and programs/projects to verify that this responsibility is being met.
NPR 7150.2 serves as the basis for compliance appraisals for software engineering. The appraisal typically occurs during an OCE survey of a Center's processes and directives and through examinations of a project's official records. These surveys are one of the tools used by the OCE to provide oversight, to maintain internal control, and to review its operations.
While SWE-129 is written from the OCE point of view, the requirement also contains an inherent Center role, i.e., participation in the OCE survey activities. A Center's support of this SWE can be assessed by considering the extent of its preparations for and involvement in these OCE surveys.
The Headquarters' Office of the Chief Engineer (OCE) controls and maintains an appraisal process for use in periodic Center and project OCE compliance surveys. These surveys have several objectives:
- Review Center and specified NASA Headquarters organizations’ processes and infrastructure for compliance with OCE requirements, policy, procedures, processes, statutes, and regulations.
- Review specific program/project “files” for compliance with requirements, policy, procedures, processes, statutes, and regulations.
- Identify systemic problems or deficiencies.
- Recognize areas of excellence/best practices.
- Receive Center feedback regarding modifications to Agency policy and requirements.
Currently, the OCE compliance surveys focus on the following core elements:
- Common framework for unified program and project life cycle.
- Program and project review structure.
- Technical authority implementation.
- Dissenting opinions and deviation/waiver process.
- Software engineering management.
- Systems engineering.
- Lessons learned.
- Technical standards.
In addition to NPR 7150.2, the Headquarters’ OCE compliance survey may also include a review and appraisal of the products resulting from the use of the following documents, to the extent they involve software engineering:
- NPD 7120.4D, NASA Engineering and Program/Project Management Policy.
- NASA-STD-8739.8, Software Assurance Standard.
- NASA-STD-8719.13, Software Safety Standard.
The two NASA standards in this list are traditionally covered in detail by the Office of Safety and Mission Assurance (OSMA) audits conducted by the NASA Safety Center.
The process of determining the scope for a survey addresses the following items at a minimum:
- Requirements implementation and compliance.
- Results from audits, reviews, and assessments conducted by other organizations.
- Trends identified across the Agency or within a single organization.
Preparations for the survey typically include reviews of the flow down of NASA OCE requirements into Center/project procedural documents, reviews of organization- and program/project-specific documentation, and reviews of other surveys, audits, and assessments. What follows in this guidance is a brief summary of the software engineering survey team's appraisal process. The main thrust of the software sub-team's appraisal is built into a set of questions from the OCE. This baseline set of questions serves as guidance to the Center or project, communicating what the OCE wants to review. The survey leader communicates these questions to the Center's survey manager three to four weeks before the event, who in turn conveys them to the software point of contact (SW POC), usually the Center's primary representative to the NASA Software Working Group (SWG).
As in many reviews, surveys, and appraisal activities, compliance is measured against objective evidence. The expected content and types of this objective quality evidence are usually defined during preparation for the software survey, and this defined material provides the basis for confirming compliance with requirements and identifying strengths and weaknesses. The Carnegie Mellon University Software Engineering Institute's Capability Maturity Model Integration (CMMI®) appraisal method for process improvement provides an excellent discussion of the philosophy and types of objective evidence. The degree to which Centers follow the CMMI® method can be assessed in a Standard CMMI Appraisal Method for Process Improvement (SCAMPISM) activity. While the SCAMPISM discussion centers on evaluating CMMI® process implementation, the explanations in the CMMI® text provide good background information for people who are relatively inexperienced in appraisals and surveys.
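The idea of appraising requirements against recorded objective evidence can be sketched as a simple mapping from requirement IDs to evidence items. This is only an illustrative sketch, not any actual OCE or SCAMPISM tooling; the requirement IDs and evidence descriptions below are hypothetical placeholders.

```python
# Hypothetical sketch: flag requirements with no recorded objective
# evidence as compliance gaps. Not an actual OCE survey tool.

def appraise(requirements, evidence):
    """Return a status for each requirement ID based on whether
    any objective evidence items have been recorded for it."""
    findings = {}
    for req in requirements:
        items = evidence.get(req, [])
        findings[req] = "compliant" if items else "gap"
    return findings

# Illustrative data: two requirements with evidence, one without.
requirements = ["SWE-129", "SWE-013", "SWE-024"]
evidence = {
    "SWE-129": ["Center survey preparation records"],
    "SWE-013": ["Software development plan"],
}

print(appraise(requirements, evidence))
# SWE-024 has no recorded evidence, so it is reported as a gap.
```

In practice the "evidence" side of such a matrix would hold pointers to artifacts (plans, records, review minutes) rather than free-text strings, but the same requirement-to-evidence structure underlies the confirmation of compliance described above.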
Additional guidance related to OCE appraisal activities may be found in related requirements and sections of this Handbook.
4. Small Projects
Typically, the OCE includes a small project in the survey activities at a Center. The OCE survey leader will work with the Center SW POC to develop the appropriate level of survey involvement for small projects.
Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned
No Lessons Learned have currently been identified for this requirement.