
SWE-137 - Software Peer Reviews and Inspections - Peer Reviews and Inspections of Software Plans

1. Requirements

4.3.2 The project shall perform and report on software peer reviews/inspections for:

      a. Software Development or Management Plan.
      b. Software Configuration Management Plan.
      c. Software Maintenance Plan.
      d. Software Assurance Plan.
      e. Software Safety Plan.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E and Safety Critical are labeled with "P (Center) + SO." "P (Center)" means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement while "SO" means that the requirement applies only for safety critical portions of the software.

Class C and Not Safety Critical as well as Class G are labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf software for these classes.

Key:    A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | Applicable | Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures


2. Rationale

Peer reviews/inspections are among the most effective practices for software (SWEREF-319). They can be applied to many kinds of technical artifacts, and serve to bring together human judgment and analysis from diverse stakeholders in a constructive way.

Because well-developed, appropriate plans with buy-in from key stakeholders are important elements of critical software success, peer reviews/inspections are applied to improve the quality of such plans.


3. Guidance

NASA-STD-2202-93, NASA Software Formal Inspection Standard, is currently being updated and revised to include lessons learned by practitioners over the last decade. This Standard includes several best practices for performing inspections on the different software plans, including the recommended minimum content of checklists, which perspectives to include on the inspection team, and the inspection rate.

The presence and participation of project management in peer review/inspection meetings is usually not recommended because of the potential negative impact on the effectiveness of the inspections. However, since the project management for both the software and the system are often stakeholders of the work products examined (in the context of this requirement), they may be included as inspection participants when necessary. In such situations, both management and the other inspectors must understand that defects found during inspections are never to be used to evaluate the authors.

NPR 7150.2 contains specific software documentation requirements for these plans. These content requirements form a basis for inspection checklists, along with additional quality criteria relevant to inspections:


  • Software Development/Management Plan
  • Software CM Plan
  • Software Maintenance Plan
  • Software Assurance Plan
  • Software Safety Plan Contents

Additional guidance related to peer reviews and inspections may be found in the following related requirements in this Handbook:


  • Software Peer Reviews and Inspections - Checklist Criteria and Tracking
  • Software Peer Reviews and Inspections - Basic Measurements

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.


4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.


5. Resources

  • (SWEREF-277) NASA-STD-8739.9, NASA Software Formal Inspections Standard, NASA Office of Safety and Mission Assurance, 2013. Change Date: 2016-10-07, Change Number: 1.
  • (SWEREF-319) Shull, F., Basili, V. R., Boehm, B., Brown, A. W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., and Zelkowitz, M. V., "What We Have Learned About Fighting Defects," Proc. IEEE International Symposium on Software Metrics (METRICS 2002), pp. 249-258, Ottawa, Canada, June 2002.

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.


6. Lessons Learned

The effectiveness of inspections for defect detection and removal in any artifact has been amply demonstrated. Data from numerous organizations show that, as a reasonable rule of thumb, a well-performed inspection typically removes between 60 and 90 percent of the existing defects, regardless of the artifact type (SWEREF-319).
