1. Requirements
2.1.1.4 The NASA OCE shall authorize appraisals against selected requirements in this NPR to check compliance.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
2. Rationale
The Headquarters' Office of the Chief Engineer (OCE) is responsible for promoting and monitoring software engineering practices throughout the Agency. It achieves this in part by administering software requirements, policies, procedures, processes, statutes, and regulations. The Headquarters' OCE uses continuing, periodic oversight of compliance at the Centers and programs/projects to verify that this responsibility is being met.
NPR 7150.2 serves as the basis for software engineering compliance appraisals. The appraisal typically occurs during an OCE survey of processes and directives and a thorough examination of a project's official records. These surveys are one of the tools the OCE uses to provide oversight, maintain internal control, and review its operations.
While SWE-129 is written from the OCE point of view, the requirement also contains an inherent Center role, i.e., participation in the OCE survey activities. A Center's support of this SWE can be assessed by considering the extent of its preparations for and involvement in these OCE surveys.
3. Guidance
The Headquarters Office of the Chief Engineer (OCE) controls and maintains an appraisal process for use in periodic Center and project OCE compliance surveys. The OCE compliance survey has several objectives:
- Review Center, project, and specified NASA Headquarters organizations’ processes and infrastructure for compliance with OCE requirements, policies, procedures, processes, statutes, and regulations.
- Review specific program/project “files” for compliance with requirements, policies, procedures, processes, statutes, and regulations.
- Identify systemic problems or deficiencies.
- Recognize areas of excellence/best practices.
- Receive Center feedback regarding modifications in Agency policy and requirements.
Currently, the OCE compliance surveys focus on the following core elements:
- The common framework for a unified program and project life cycle.
- Program and project review structure.
- Technical authority implementation.
- Dissenting opinions and deviation/waiver process.
- Software engineering management.
- Systems engineering.
- Lessons learned.
- Technical standards.
- Other.
See also SWE-004 - OCE Benchmarking
In addition to NPR 7150.2, the Headquarters’ OCE compliance survey may also include a review and appraisal of the products resulting from the use of the following documents, to the extent they involve software engineering:
- NPD 7120.4E, NASA Engineering and Program/Project Management Policy. 257
- NASA-STD-8739.8, Software Assurance and Software Safety Standard. 278
The NASA standard in this list is traditionally covered in detail by the Office of Safety and Mission Assurance (OSMA) audits conducted by the NASA Safety Center. See also SWE-221 - OSMA NPR Appraisals.
3.1 Scope of a Survey
The process of determining the scope for a survey addresses the following items at a minimum:
- Requirements implementation and compliance.
- Results from audits, reviews, and assessments conducted by other organizations.
- Trends identified across the Agency or within a single organization.
Preparations for the survey typically include reviews of the flow down of NASA OCE requirements to Center/project procedural documents, reviews of organization and program/project-specific documentation, and reviews of other surveys, audits, and assessments. What follows in this guidance is a brief summary of the software engineering survey team's appraisal process. The main thrust of the software subteam's appraisal is built around a set of questions from the OCE. This baseline set of questions serves as guidance to the Center or project, communicating what the OCE wants to review. The survey leader communicates these questions to the Center's survey manager 3 to 4 weeks before the event, and the survey manager in turn conveys them to the software point of contact (SW POC).
3.2 Use of Objective Evidence
As in many reviews, surveys, and appraisal activities, compliance is often measured against objective evidence. Expected content and types of objective quality evidence are usually defined during preparation for the software survey. This defined material provides the basis for confirming compliance with requirements and identifying strengths and weaknesses. The CMMI Institute's Capability Maturity Model Integration (CMMI®) appraisal method for process improvement offers an excellent discussion of the philosophy and types of objective evidence. The degree to which Centers follow the CMMI® method can be assessed in an appraisal activity.
Findings resulting from the survey are generally classified as strengths, weaknesses, observations, opportunities, and non-compliances. However, the survey team has a clear and overriding obligation to identify all items of non-compliance and items that adversely affect safety or quality. These items will be included in the final report. Significant issues are brought to the immediate attention of the surveyed organization's management via the survey manager.
See also SWE-036 - Software Process Determination, SWE-126 - Tailoring Considerations, SWE-139 - Shall Statements.
3.3 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.4 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
The OCE survey leader will work with the Center SW POC to develop the appropriate level of survey involvement for small projects.
5. Resources
5.1 References
- (SWEREF-082) NPR 7120.5F, Office of the Chief Engineer, Effective Date: August 03, 2021, Expiration Date: August 03, 2026
- (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, CMMI for Development, Version 1.3, Software Engineering Institute.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-256) NPR 1400.1H, NASA Office of Internal Controls and Management Systems, Effective Date: March 29, 2019, Expiration Date: March 29, 2024
- (SWEREF-257) NPD 7120.4E, NASA Office of the Chief Engineer, Effective Date: June 26, 2017, Expiration Date: June 26, 2022
- (SWEREF-261) NPD 1000.0C, NASA Governance and Strategic Management Handbook, Effective Date: January 29, 2020, Expiration Date: January 29, 2025
- (SWEREF-262) NASA Headquarters NASA Office of the Chief Engineer engineering deviations and waivers website.
- (SWEREF-271) NASA-STD-8719.13 (Rev C), Document Date: 2013-05-07
- (SWEREF-273) NASA SP-2016-6105 Rev 2, NASA Systems Engineering Handbook.
- (SWEREF-277) NASA-STD-8739.9, NASA Office of Safety and Mission Assurance, 2013. Change Date: 2016-10-07, Change Number: 1
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, Approved 2022-09-08, superseding NASA-STD-8739.8A.
- (SWEREF-374) OCE Requirements Compliance Survey Process, Office of the Chief Engineer (OCE), NASA, 2010.
5.2 Tools
NASA users can find tools relevant to this requirement in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.