SWE-137 - Software Peer Reviews and Inspections - Peer Reviews and Inspections of Software Plans

1. Requirements

4.3.2 The project shall perform and report on software peer reviews/inspections for:

      a. Software Development or Management Plan.
      b. Software Configuration Management Plan.
      c. Software Maintenance Plan.
      d. Software Assurance Plan.
      e. Software Safety Plan.

1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E, Safety Critical, are labeled "P (Center) + SO". "P (Center)" means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to satisfy this requirement; "SO" means that the requirement applies only to the safety-critical portions of the software.

Class C, Not Safety Critical, and Class G are labeled "P (Center)". This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to satisfy this requirement.

Class F is labeled "X (not OTS)". This means that the requirement applies to Class F software but not to its off-the-shelf components.

[Applicability matrix not rendered in this version of the page.]

Key:    A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ...
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures


2. Rationale

Peer reviews/inspections are among the most effective verification and validation (V&V) practices for software [1]. They can be applied to many kinds of technical artifacts, and they bring human judgment and analysis from diverse stakeholders to bear in a constructive way.

Since well-developed, appropriate plans with buy-in from key stakeholders are important elements of critical software success, peer reviews/inspections are applied to improve the quality of such plans.


3. Guidance

The NASA Software Formal Inspection Standard [2] is currently being updated and revised to include lessons learned by practitioners over the last decade. The Standard includes several best practices for performing inspections of the different software plans, including the recommended minimum content of checklists, the perspectives to include on the inspection team, and the inspection rate.

The presence and participation of project management in peer review/inspection meetings are usually not recommended because of the potential negative impact on inspection effectiveness. However, since the project management for both the software and the system are often stakeholders in the work products examined under this requirement, they may be included as inspection participants when necessary. In such situations, both management and the other inspectors must understand that defects found during inspections are never to be used to evaluate the authors.

NPR 7150.2 contains specific software documentation requirements for these plans. These content requirements provide a basis for inspection checklists, along with additional quality criteria relevant to inspections:

      a. Software Development/Management Plan.
      b. Software CM Plan.
      c. Software Maintenance Plan.
      d. Software Assurance Plan.
      e. Software Safety Plan.

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.
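As an illustration of how the content requirements above might seed an inspection checklist, the sketch below records findings per plan. This is a hypothetical example, not part of NPR 7150.2 or any Center process; all names and fields are assumptions.

```python
# Hypothetical sketch: tracking peer review/inspection findings for the
# five plans named in requirement 4.3.2. Illustrative only.
from dataclasses import dataclass

PLANS = [
    "Software Development/Management Plan",
    "Software CM Plan",
    "Software Maintenance Plan",
    "Software Assurance Plan",
    "Software Safety Plan",
]

@dataclass
class InspectionRecord:
    plan: str
    defects_found: int = 0
    defects_resolved: int = 0

    def open_defects(self) -> int:
        # Defects logged during inspection but not yet dispositioned.
        return self.defects_found - self.defects_resolved

# One record per required plan.
records = {name: InspectionRecord(name) for name in PLANS}
records["Software Safety Plan"].defects_found = 7
records["Software Safety Plan"].defects_resolved = 5
print(records["Software Safety Plan"].open_defects())  # 2
```

A project would normally keep such data in its inspection reporting tool; the point here is only that each required plan gets its own tracked inspection record.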


4. Small Projects

There is currently no guidance for small projects relevant to this requirement.


5. Resources

  1. Shull, F., Basili, V. R., Boehm, B., Brown, A. W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., and Zelkowitz, M. V., "What We Have Learned About Fighting Defects," Proc. IEEE International Symposium on Software Metrics (METRICS02), pp. 249-258. Ottawa, Canada, June 2002.
  2. NASA Technical Standard, "Software Formal Inspections Standard", NASA-STD-2202-93, 1993.

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users can find this in the Tools Library on the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only; it is neither an "approved tool list" nor an endorsement of any particular tool. Its purpose is to provide examples of tools in use across the Agency and to help projects and Centers decide which tools to consider.


6. Lessons Learned

The effectiveness of inspections for defect detection and removal in any artifact has been amply demonstrated. Data from numerous organizations support a reasonable rule of thumb: a well-performed inspection typically removes between 60% and 90% of the existing defects, regardless of artifact type [1].
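The rule of thumb above can be turned into a quick back-of-the-envelope estimate. The function below is a simple sketch, assuming only the 60-90% removal range stated in the text; the function name and defaults are illustrative.

```python
# Back-of-the-envelope arithmetic for the 60-90% defect-removal rule of
# thumb cited in the Lessons Learned section. Illustrative only.
def remaining_after_inspection(defects: int, low: float = 0.60, high: float = 0.90):
    """Return (best_case, worst_case) defects remaining after one
    well-performed inspection, given an estimated defect count."""
    best_case = round(defects * (1 - high))   # 90% removed
    worst_case = round(defects * (1 - low))   # 60% removed
    return best_case, worst_case

best, worst = remaining_after_inspection(100)
print(best, worst)  # 10 40
```

For an artifact with an estimated 100 defects, a single well-performed inspection would be expected to leave roughly 10 to 40 defects, which is why multiple V&V practices are applied in combination.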
