SWE-137 - Software Peer Reviews and Inspections - Peer Reviews and Inspections of Software Plans

1. Requirements

4.3.2 The project shall perform and report on software peer reviews/inspections for:

      a. Software Development or Management Plan.
      b. Software Configuration Management Plan.
      c. Software Maintenance Plan.
      d. Software Assurance Plan.
      e. Software Safety Plan.

1.1 Notes">1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E software that is safety critical is labeled with "P (Center) + SO". "P (Center)" means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement, while "SO" means that the requirement applies only to the safety-critical portions of the software.

Class C software that is not safety critical, as well as Class G software, is labeled with "P (Center)". This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class F is labeled with "X (not OTS)". This means that this requirement does not apply to off-the-shelf software for this class.

Class:        A_SC   A_NSC   B_SC   B_NSC   C_SC   C_NSC   D_SC   D_NSC   E_SC   E_NSC    F      G      H
Applicable?    ✓      ✓       ✓      ✓       X     P(C)     X      ✗       X      ✗       X     P(C)    ✗

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | ✓ - Applicable | ✗ - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures


2. Rationale

Peer reviews/inspections are among the most effective V&V practices for software [1]. They can be applied to many kinds of technical artifacts and bring human judgment and analysis from diverse stakeholders to bear in a constructive way.

Because well-developed, appropriate plans with buy-in from key stakeholders are important elements of critical software success, peer reviews/inspections are applied to improve the quality of such plans.


3. Guidance

The NASA Software Formal Inspections Standard [2] is currently being updated and revised to include lessons learned by practitioners over the last decade. The Standard includes several best practices for performing inspections of the different software plans, including the recommended minimum content of checklists, which perspectives to include on the inspection team, and the inspection rate.
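
Those checklist, perspective, and rate recommendations lend themselves to being captured in a lightweight, reusable form. Below is a minimal sketch of one way a team might encode a plan-inspection checklist; the items, perspectives, and rate ceiling shown are hypothetical placeholders, not content taken from the Inspections Standard.

    # Illustrative sketch only: the items, perspectives, and rate ceiling
    # below are hypothetical examples, not content from NASA-STD-2202-93.
    from dataclasses import dataclass, field

    @dataclass
    class InspectionChecklist:
        """A minimal encoding of a plan-inspection checklist."""
        artifact: str                 # the plan under inspection
        perspectives: list[str]       # roles represented on the team
        max_rate_pages_per_hour: int  # inspection rate ceiling
        items: list[str] = field(default_factory=list)

    # Hypothetical checklist for a Software Development/Management Plan.
    sdp_checklist = InspectionChecklist(
        artifact="Software Development/Management Plan",
        perspectives=["author", "software assurance",
                      "systems engineering", "software lead"],
        max_rate_pages_per_hour=10,
        items=[
            "Are all NPR 7150.2 required content elements present?",
            "Are roles, responsibilities, and schedules consistent?",
            "Have key stakeholders concurred with the plan?",
        ],
    )

    for item in sdp_checklist.items:
        print(f"[ ] {item}")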

The presence and participation of project management in peer review/inspection meetings are usually not recommended because of the potential negative impact on the effectiveness of the inspections. However, since the software and system project management are often stakeholders of the work products examined under this requirement, they may be included as inspection participants when necessary. In such situations, both management and the other inspectors must understand that defects found during inspections are never to be used to evaluate the authors.

NPR 7150.2 contains specific documentation requirements for each of these plans. These content requirements provide a basis for inspection checklists, along with any additional quality criteria relevant to the plan under review (a sketch of this mapping follows the table):

[SWE-102]   Software Development/Management Plan
[SWE-103]   Software Configuration Management Plan
[SWE-105]   Software Maintenance Plan
[SWE-106]   Software Assurance Plan
[SWE-138]   Software Safety Plan
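
As a small illustration of how these content requirements can seed checklists, the sketch below ties each plan to its NPR 7150.2 content requirement and generates a starter checklist question. The plan-to-SWE mapping comes from the table above; the question text itself is an illustrative placeholder.

    # The mapping is taken from the table above; the generated question
    # text is an illustrative placeholder, not Standard-mandated wording.
    PLAN_CONTENT_REQUIREMENTS = {
        "Software Development/Management Plan": "SWE-102",
        "Software Configuration Management Plan": "SWE-103",
        "Software Maintenance Plan": "SWE-105",
        "Software Assurance Plan": "SWE-106",
        "Software Safety Plan": "SWE-138",
    }

    def checklist_seed(plan: str) -> str:
        """Return a starter checklist question tying a plan to its content SWE."""
        swe = PLAN_CONTENT_REQUIREMENTS[plan]
        return f"Does the {plan} contain every content element required by {swe}?"

    for plan in PLAN_CONTENT_REQUIREMENTS:
        print(checklist_seed(plan))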

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.


4. Small Projects

There is currently no guidance for small projects relevant to this requirement.


5. Resources

  1. Shull, F., Basili, V. R., Boehm, B., Brown, A. W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., and Zelkowitz, M. V., "What We Have Learned About Fighting Defects," Proc. IEEE International Symposium on Software Metrics (METRICS02), pp. 249-258. Ottawa, Canada, June 2002.
  2. NASA Technical Standard, "Software Formal Inspections Standard", NASA-STD-2202-93, 1993.

5.1 Tools

Tools relevant to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table is not all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

No tools have been identified for this SWE at this time. If you wish to suggest a tool, please leave a comment below.


6. Lessons Learned

The effectiveness of inspections for defect detection and removal in any artifact has been amply demonstrated. Data from numerous organizations show that, as a reasonable rule of thumb, a well-performed inspection typically removes between 60% and 90% of the existing defects, regardless of the artifact type [1].
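
As a quick worked example of that rule of thumb, the sketch below bounds the expected defect removal for a plan with an estimated number of latent defects; the defect count is a hypothetical input, not data from the cited study.

    # Worked example of the 60-90% rule of thumb cited above. The latent
    # defect count is a hypothetical input, not data from the cited study.
    def inspection_bounds(latent: int, low: float = 0.60, high: float = 0.90):
        """Bound the defects removed by a well-performed inspection."""
        return latent * low, latent * high

    # Hypothetical: a plan estimated to contain 50 latent defects.
    lo, hi = inspection_bounds(50)
    print(f"Expected removal: {lo:.0f}-{hi:.0f} of 50 defects; "
          f"{50 - hi:.0f}-{50 - lo:.0f} would remain.")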
