
SWE-129 - OCE NPR Appraisals

1. Requirements

6.3.8 The NASA Headquarters' Office of the Chief Engineer shall authorize appraisals against selected requirements in this NPR (including NASA Headquarters' Office of the Chief Engineer approved subsets and alternative sets of requirements) to check compliance.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Class:        A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H

Applicable?:  (per-class applicability marks appear as icons in the original table)

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | (icon) - Applicable | (icon) - Not Applicable | X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures

2. Rationale

The Headquarters' Office of the Chief Engineer (OCE) is responsible for promoting and monitoring software engineering practices throughout the Agency. It achieves this in part by administering software requirements, policy, procedures, processes, statutes, and regulations. The Headquarters' OCE uses continuing periodic oversight of compliance at the Centers and in programs/projects to verify that this responsibility is being met.

NPR 7150.2 serves as the basis for compliance appraisals for software engineering. The appraisal typically occurs during an OCE survey of a Center's processes and directives and through examinations of a project's official records. These surveys are one of the tools used by the OCE to provide oversight, to maintain internal control, and to review its operations. 

While SWE-129 is written from the OCE point of view, the requirement also contains an inherent Center role, i.e., participation in the OCE survey activities. A Center's support of this SWE can be assessed by considering the extent of its preparations for and involvement in these OCE surveys.

3. Guidance

The Headquarters' OCE controls and maintains an appraisal process for use in periodic Center and project OCE compliance surveys. 374 The OCE compliance survey achieves several objectives:

  • Review Center and specified NASA Headquarters organizations' processes and infrastructure for compliance with OCE requirements, policy, procedures, processes, statutes, and regulations.
  • Review specific program/project "files" for compliance with requirements, policy, procedures, processes, statutes, and regulations.
  • Identify systemic problems or deficiencies.
  • Recognize areas of excellence/best practices.
  • Receive Center feedback regarding modifications in Agency policy and requirements.

Currently, the OCE compliance surveys focus on the following core elements:

  • Common framework for unified program and project life cycle.
  • Program and project review structure.
  • Technical Authority implementation.
  • Dissenting opinions and deviation/waiver process.
  • Software engineering management.
  • Systems engineering.
  • Lessons learned.
  • Technical standards.
  • Other.

In addition to NPR 7150.2, the Headquarters' OCE compliance survey may also include a review and appraisal of the products resulting from use of the following documents, to the extent they involve software engineering:

  • NPD 7120.4D, NASA Engineering and Program/Project Management Policy. 257
  • NPR 7123.1A, NASA Systems Engineering Processes and Requirements. 041
  • NASA-STD-8739.8, Software Assurance Standard. 278
  • NASA-STD-8719.13, Software Safety Standard. 271

The two NASA Standards in this list are traditionally covered in detail by Office of Safety and Mission Assurance (OSMA) audits conducted by the NASA Safety Center.

"The baseline set of questions are reviewed and may be revised as needed to support the survey at each specific organization. Input for updates to the questions is obtained from survey team members including the software engineering sub-team lead, the systems engineering sub-team lead, and the Office of the Chief Information Officer (OCIO)representative for records management" (OCE Requirements Compliance Survey Process, 2010). 374

The OCE and appraisal teams chartered by the OCE plan the scope and content of the survey. A Survey Leader is typically named by the OCE to provide overall event planning and coordination and to serve as a liaison between the OCE and the Center's point of contact or survey manager.

The process of determining the scope for a survey addresses the following items at a minimum:

  • Requirements implementation and compliance
  • Results from audits, reviews, and assessments conducted by other organizations
  • Trends identified across the Agency or within a single organization

Preparations for the survey typically include reviews of the flow down of NASA OCE requirements to Center procedural documents, reviews of organization and program/project-specific documentation, and reviews of other surveys, audits, and assessments. The complete description of this nominally week-long event can be found in the OCE Requirements Compliance Survey Process 374 file located on the NASA Engineering Network (NEN) website. 258 See the OCE Requirements Compliance Survey Process document 374 for information about team formation, a generic timeline, and other helpful guidance.

What follows in this guidance is a brief summary of the software engineering survey team's appraisal process. The main thrust of the software sub-team's appraisal is built into a set of 25 questions from the OCE. This baseline set of questions serves as guidance to the Center or project to communicate what the OCE wants to review. The survey leader communicates these questions to the Center's survey manager 3 to 4 weeks before the event; the survey manager in turn conveys them to the software point of contact (SW POC), usually the Center's primary representative to the NASA Software Working Group (SWG).

The 2010 OCE SW Survey Generic Worksheet 352, located on the NEN website, is a useful template for preparing responses to the specific questions.
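For illustration only, the sketch below shows one hypothetical way to organize draft responses to the baseline questions while preparing for the survey. The class and field names are assumptions; this is not the NEN worksheet itself, merely a minimal stand-in for the same bookkeeping.

# Hypothetical worksheet-style record for drafting survey responses.
# Field names are illustrative; the authoritative template is the
# 2010 OCE SW Survey Generic Worksheet on NEN.
from dataclasses import dataclass, field


@dataclass
class QuestionResponse:
    question_id: str                 # e.g., "Q01" through "Q25"
    question_text: str
    response: str = ""               # Center/project narrative answer
    evidence: list[str] = field(default_factory=list)  # pointers to records


def unanswered(worksheet: list[QuestionResponse]) -> list[str]:
    """List baseline questions that still lack a drafted response."""
    return [q.question_id for q in worksheet if not q.response.strip()]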

The set of provided questions is typically the same for all Center surveys, although the actual questions included in the survey may be tailored based on results obtained from other appraisal activities. Reviews of partial Center (P(Center)) determinations (see SWE-140) will be included in the survey activities, as will reviews of general exclusions or alternate requirements approved against requirements in NPR 7150.2 (see SWE-120).
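As a hedged illustration of how a team might flag the requirements needing this Center-specific review, the sketch below assumes simple sets of SWE identifiers for approved P(Center) determinations and alternate requirements; the function and sample IDs are hypothetical.

# Hypothetical helper for flagging NPR 7150.2 requirements that warrant
# extra survey attention because the Center holds a P(Center) determination
# or an approved alternate against them.


def requirements_needing_review(baseline: set[str],
                                p_center: set[str],
                                alternates: set[str]) -> set[str]:
    """Return the SWEs whose Center-specific treatment the survey reviews."""
    return (p_center | alternates) & baseline


# Sample subset of requirement IDs, for illustration only.
sample = {"SWE-120", "SWE-129", "SWE-140"}
print(requirements_needing_review(sample,
                                  p_center={"SWE-140"},
                                  alternates={"SWE-120"}))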

The software appraisal activity begins during the survey planning process with the OCE Pre-Brief Presentation. A major component of this pre-brief is the discussion of the selection parameters used to determine which projects and software activities will be a part of the survey. The pre-brief meeting is nominally held 6-8 weeks prior to the start of the survey. The actual survey event includes entrance presentations, document reviews by the OCE survey team, interviews of Center and project personnel by the survey team, development and review of initial findings, a review for surfacing general or systemic findings, and a summary presentation to members of management. This last activity, usually an exit briefing, is the survey team's first formal opportunity to present its findings to the senior management of the surveyed organization. A series of follow-on actions are then defined and assigned.
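As a small planning aid, the sketch below computes the nominal pre-survey windows from a survey start date, using only the lead times stated in this guidance (pre-brief 6-8 weeks prior, questions distributed 3 to 4 weeks prior); the function name and output format are illustrative.

# Nominal pre-survey milestone windows derived from the lead times in the
# guidance above; not an official OCE schedule.
from datetime import date, timedelta


def survey_milestones(survey_start: date) -> dict[str, tuple[date, date]]:
    """Return (earliest, latest) dates for each nominal milestone window."""
    week = timedelta(weeks=1)
    return {
        "OCE pre-brief": (survey_start - 8 * week, survey_start - 6 * week),
        "Questions to Center survey manager": (survey_start - 4 * week,
                                               survey_start - 3 * week),
    }


for milestone, (earliest, latest) in survey_milestones(date(2011, 9, 5)).items():
    print(f"{milestone}: {earliest} to {latest}")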

As in many reviews, surveys, and appraisal activities, compliance is often measured against objective evidence. Expected content and types of this objective quality evidence are usually defined in the preparation for the software survey. This defined material provides the basis for confirming compliance with requirements and identifying strengths and weaknesses. The Carnegie Mellon University Software Engineering Institute provides the Capability Maturity Model Integration (CMMI) appraisal method 157 for process improvement. The method provides an excellent discussion of the philosophy and types of objective evidence. The degree to which Centers follow the CMMI method can be assessed in a Standard CMMI Appraisal Method for Process Improvement (SCAMPI) activity. While the information presented in the SCAMPI discussion is centered on evaluating CMMI process implementation, the discussions and explanations in the CMMI text provide good background information for people involved in OCE appraisals who are relatively inexperienced in appraisals and surveys.
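To make the notion of objective evidence concrete, here is a minimal sketch that pairs each appraised requirement with SCAMPI-style artifacts and affirmations; the record layout is an assumption, not an OCE or SEI data format.

# Hypothetical evidence ledger in the spirit of the SCAMPI discussion of
# objective evidence; names and categories are illustrative.
from dataclasses import dataclass, field


@dataclass
class EvidenceRecord:
    requirement: str                 # e.g., an NPR 7150.2 SWE number
    artifacts: list[str] = field(default_factory=list)     # documents, records
    affirmations: list[str] = field(default_factory=list)  # interview notes


def lacking_artifacts(ledger: list[EvidenceRecord]) -> list[str]:
    """Flag requirements supported only by affirmations, or by nothing."""
    return [rec.requirement for rec in ledger if not rec.artifacts]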

Findings resulting from the survey are generally classified as strengths, weaknesses, observations, opportunities, and non-compliances. See the Requirements Compliance Survey Process document 374 for definitions of these terms. Regardless of classification, the survey team has a clear and overriding obligation to identify all items of non-compliance and all items that adversely affect safety or quality. These items will be included in the final report, and significant issues are brought to the immediate attention of the surveyed organization's management via the survey manager.
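Below is a minimal sketch of how the finding categories above might be recorded, with an escalation check that mirrors the team's obligation to surface non-compliances and safety or quality impacts immediately; the class and field names are assumptions.

# Hypothetical findings record; the categories come from the guidance above,
# everything else is illustrative.
from dataclasses import dataclass
from enum import Enum


class FindingType(Enum):
    STRENGTH = "strength"
    WEAKNESS = "weakness"
    OBSERVATION = "observation"
    OPPORTUNITY = "opportunity"
    NON_COMPLIANCE = "non-compliance"


@dataclass
class Finding:
    classification: FindingType
    description: str
    affects_safety_or_quality: bool = False

    def needs_immediate_escalation(self) -> bool:
        """Significant items go to management via the survey manager."""
        return (self.classification is FindingType.NON_COMPLIANCE
                or self.affects_safety_or_quality)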

Additional guidance related to OCE Appraisal activities may be found in the following related requirements in this Handbook:

  • SWE-004 - OCE Benchmarking
  • SWE-036 - Software Process Determination
  • SWE-108 - Center SW Improvement Plan

4. Small Projects

Typically, the OCE includes a small project in the survey activities at a Center. The OCE survey leader will work with the Center SW POC to develop the appropriate level of survey involvement for small projects.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

No Lessons Learned have currently been identified for this requirement.
