
This version of the SWEHB is associated with NPR 7150.2B.

SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures

1. Requirements

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:

a. Software requirements.

b. Software plans.

c. Any design items that the project identified for software peer review or software inspections according to the software development plans.

d. Software code as defined in the software and/or project plans.

e. Software test procedures.

1.1 Notes

Software peer reviews or software inspections are a recommended best practice for all safety and mission-success related software components. Recommended best practices and guidelines for software formal inspections are contained in NASA-STD-8739.9 277.

1.2 Applicability Across Classes

If Class D software is safety critical, this requirement applies to the safety-critical aspects of the software.

Classes F and G are labeled with an X and “not OTS,” which indicates that the project is required to meet this requirement except for off-the-shelf software.

Class       | A | B | C | CSC | D | DSC | E | F           | G           | H
Applicable? | X | X | X | X   |   | X   |   | X (not OTS) | X (not OTS) |

Key: X - Applicable | (blank) - Not Applicable

2. Rationale

Software peer reviews or software inspections are performed to ensure product and process quality and to add value and reduce risk through expert knowledge infusion, confirmation of approach, identification of defects, and specific suggestions for product improvements. “Peer reviews are focused, in-depth technical reviews that support the evolving design and development of a product, including critical documentation or data packages. The participants in a peer review are the technical experts and key stakeholders for the scope of the review.” 486

3. Guidance

Peer reviews are one of the most efficient and effective ways to find and remove defects.

NASA-STD-8709.22 provides two definitions for peer reviews: 274

[1] A review of a software work product, following defined procedures, by peers of the producers of the product for the purpose of identifying defects and improvements. [SEI-CMM Software Engineering Institute Capability Maturity Model®].

[2] Independent evaluation by internal or external subject matter experts who do not have a vested interest in the work product under review. Peer reviews can be planned, focused reviews conducted on selected work products by the producer’s peers to identify defects and issues prior to that work product moving into a milestone review or approval cycle.

Peer reviews can also be described as “planned, focused reviews by technical team peers on a single work product with the intent of identifying issues prior to that work product moving on to the next step. A peer review includes planning, preparing, conducting, analyzing outcomes, and identifying and implementing corrective actions.” 041

A key rationale for using software peer reviews or software inspections is that they are one of the few verification and validation (V&V) approaches that can be applied in the early stages of software development, long before there is any code that can be run and tested. Moreover, finding and fixing defects in these early stages, rather than allowing them to slip into later phases, can substantially reduce project cost. For this reason, software peer reviews and software inspections of requirements documents are explicitly required.

As documented in NASA-GB-8719.13, NASA Software Safety Guidebook 276, "Formal Inspections have the most impact when applied early in the life of a project, especially the requirements specification and definition stages of a project. Impact means that the defects are found earlier, when it's cheaper to fix them....Formal Inspection greatly improves the communication within a project and enhances understanding of the system while scrubbing out many of the major errors and defects."

Software plans are another key artifact to which software peer reviews or software inspections can be applied with a strong return on investment. Since well-developed, appropriate plans with buy-in from key stakeholders are important elements of the success of critical software, peer reviews or inspections are applied to improve the quality of such plans.

Software testing represents a substantial part of the effort of assuring software quality for most projects, so it is important to ensure that test cases are focused on the appropriate functional areas, cover all important usage scenarios, and specify the expected system behavior adequately and accurately. The best way to ensure these factors is via peer review/inspection: the application of human judgment and analysis.

Code and design artifacts also benefit from peer review or inspection. However, although important, inspections of these artifacts are less crucial than those of other life-cycle products because other effective Verification and Validation (V&V) options, such as testing and simulation, are available for code and design. It is important for a project to identify the most crucial design and code segments and deploy inspections on them to improve their quality. Projects also need to focus inspections of code and design on issues that cannot be verified using automated tools; i.e., projects need to ensure that human judgment is applied to the issues that require it while relying on automation where it is best suited.

Peer reviews/inspections provide the following additional benefits:

  • Useful for many types of products: documentation, requirements, designs, code.
  • Simple to understand.
  • Provide a way for sharing/learning of good product development techniques.
  • Serve to bring together human judgment and analysis from diverse stakeholders in a constructive way.
  • Can be a very efficient method of identifying defects early in the product’s life cycle.
  • Use a straightforward, organized approach for evaluating a work product:
    • To detect potential defects in the product.
    • To methodically evaluate each defect to identify solutions and track incorporation of these solutions into the product.

Projects are required to conduct peer reviews/inspections for the following documentation based on software classification:

Software Documentation              | A | B | C | D           | E
Software Requirements               | X | X | X | X (SC only) |
Software Plans                      | X | X | X | X (SC only) |
Software Design Identified in Plans | X | X | X | X (SC only) |
Software Code Identified in Plans   | X | X | X | X (SC only) |
Test Procedures                     | X | X | X | X (SC only) |
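For illustration, the classification table above can be encoded as a small lookup. A minimal sketch in Python; the names REVIEWED_ARTIFACTS and required_reviews are hypothetical, and Class D is treated as requiring reviews only for safety-critical (SC) software, per the table:

```python
# Minimal sketch of the class-to-artifact mapping above (illustrative only;
# names and structure are hypothetical, not from NPR 7150.2 or the SWEHB).

# Artifacts requiring peer review/inspection for classes A-C; for class D the
# requirement applies to safety-critical (SC) software only.
REVIEWED_ARTIFACTS = [
    "software requirements",
    "software plans",
    "software design identified in plans",
    "software code identified in plans",
    "test procedures",
]

def required_reviews(software_class: str, safety_critical: bool = False) -> list[str]:
    """Return the artifact types requiring peer review for a software class."""
    software_class = software_class.upper()
    if software_class in ("A", "B", "C"):
        return list(REVIEWED_ARTIFACTS)
    if software_class == "D" and safety_critical:
        return list(REVIEWED_ARTIFACTS)
    return []

print(required_reviews("B"))                        # all five artifact types
print(required_reviews("D", safety_critical=True))  # all five, SC aspects only
```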

 

When conducting a peer review/inspection, ensure the following are part of the process so that the software peer review/inspection is effective:

  • Keep the review focused on the technical integrity and quality of the product.
  • Keep the review simple and informal.
  • Concentrate on review of the documentation and minimize presentations.
  • Use a round-table format rather than a stand-up presentation.
  • Give a full technical picture of items being reviewed.
  • Plan the review/inspection, use checklists, and include readiness and completion criteria (a checklist sketch follows this list).
  • Capture action items, monitor defects, results, and effort.
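For illustration, such a checklist with readiness and completion criteria could be represented as a simple structure. A minimal sketch in Python; the class name, fields, and criteria wording are hypothetical, not taken from NASA-STD-8739.9:

```python
# Minimal sketch of a peer review checklist with readiness and completion
# criteria (structure and item wording are illustrative, not from NASA-STD-8739.9).

from dataclasses import dataclass, field

@dataclass
class ReviewChecklist:
    work_product: str
    readiness_criteria: list[str] = field(default_factory=list)
    completion_criteria: list[str] = field(default_factory=list)
    checked: dict[str, bool] = field(default_factory=dict)

    def ready(self) -> bool:
        """The inspection may start only when every readiness criterion holds."""
        return all(self.checked.get(c, False) for c in self.readiness_criteria)

    def complete(self) -> bool:
        """The inspection may close only when every completion criterion holds."""
        return all(self.checked.get(c, False) for c in self.completion_criteria)

checklist = ReviewChecklist(
    work_product="software requirements",
    readiness_criteria=["package distributed", "inspectors prepared"],
    completion_criteria=["all major defects dispositioned", "action items closed"],
)
checklist.checked["package distributed"] = True
print(checklist.ready())   # False: inspectors not yet prepared
```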

When putting a software peer review team together, use the following best practices:

  • The team consists of a minimum of three inspectors.
    • Diverse viewpoints and objectivity are required.
  • Inspection team members are selected based on an analysis of the key stakeholders in the item under inspection.
  • The author should not be the reader, recorder, or moderator.
  • The moderator should not be a manager.
  • At a minimum, the moderator should be formally trained in the process, though ideally all participants are trained.
  • Management presence/participation is discouraged.
  • Each role has a specific responsibility, as shown in the table below:

Role       | Responsibility
Moderator  | Conducts and controls the inspection
Author     | Producer of the product under inspection; answers technical questions
Reader     | Presents (reads, paraphrases) the inspection product to the inspection team
Recorder   | Documents defects identified during the inspection, as well as open issues and action items
Inspectors | Look for defects in the product under inspection
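For illustration, the staffing constraints above lend themselves to a mechanical check. A minimal sketch in Python; validate_team is a hypothetical helper, not part of any NASA process asset:

```python
# Minimal sketch of the team constraints above (illustrative; role names follow
# the table, but the validation logic is an assumption, not an official tool).

def validate_team(author, reader, recorder, moderator, inspectors, managers=()):
    """Raise ValueError if a best-practice constraint from the list above is violated."""
    if len(inspectors) < 3:
        raise ValueError("team needs a minimum of three inspectors")
    if author in (reader, recorder, moderator):
        raise ValueError("author should not be the reader, recorder, or moderator")
    if moderator in managers:
        raise ValueError("moderator should not be a manager")

# Passes all checks: three inspectors, author holds no other role,
# and the moderator is not on the managers list.
validate_team(
    author="ana", reader="raj", recorder="lee", moderator="kim",
    inspectors=["raj", "lee", "pat"], managers=["sam"],
)
```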

NPR 7123.1, NASA Systems Engineering Processes and Requirements, defines the entrance criteria and success criteria for a peer review activity.

Software peer reviews are conducted using the following steps:

Step               | Description
Planning           | Organize inspection, inspection package contents, required support, schedule
Overview           | Educational briefing at time of package distribution to explain materials at a high level
Preparation        | Inspectors individually look for and document defects and develop questions
Inspection Meeting | Inspectors examine the product as a group to classify and record defects, capture open issues and action items
Third Hour         | Optional informal meeting to resolve open issues and discuss solutions
Rework             | Author corrects major defects (others when cost and schedule allow)
Follow-up          | Moderator verifies all major and other dispositioned defects have been corrected; no new defects introduced; all action items/open issues are closed
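The steps above form a strict sequence in which only the Third Hour is optional. A minimal sketch of that ordering in Python; the names STEPS and next_steps are hypothetical:

```python
# Minimal sketch of the inspection workflow above as an ordered sequence with
# one optional step (names follow the table; the code itself is illustrative).

STEPS = ["Planning", "Overview", "Preparation", "Inspection Meeting",
         "Third Hour", "Rework", "Follow-up"]
OPTIONAL = {"Third Hour"}  # informal meeting that may be skipped

def next_steps(current: str) -> list[str]:
    """Steps that may legitimately follow the current one."""
    i = STEPS.index(current)
    allowed = []
    for step in STEPS[i + 1:]:
        allowed.append(step)
        if step not in OPTIONAL:
            break  # the next mandatory step ends the lookahead
    return allowed

print(next_steps("Inspection Meeting"))  # ['Third Hour', 'Rework']
```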

When conducting the software peer review, incorporate the following best practices to ensure an effective outcome:

  • Defects found during inspections are never used to evaluate the author; the goal is to improve the product.
  • Use checklists relevant to each inspector’s perspective.
  • Use readiness and completion criteria.
  • Limit inspection meeting to 2 hours.
  • Track action items until they are resolved.
  • Collect and use inspection data:
    • Effort, number of participants, areas of expertise.
    • Defects - list, total, type.
    • Inspection outcome (pass/fail).
    • Item being inspected and type (requirements, code, etc.).
    • Date and time.
    • Meeting length, preparation time of participants.
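For illustration, the inspection data items listed above can be captured in a simple record from which metrics such as effort per defect are derived. A minimal sketch in Python; the field names are hypothetical, not a mandated data format:

```python
# Minimal sketch of an inspection data record covering the items listed above
# (field names are illustrative, not a mandated data format).

from dataclasses import dataclass

@dataclass
class InspectionRecord:
    item: str                 # what was inspected
    item_type: str            # requirements, code, etc.
    date: str
    participants: int
    prep_hours: float         # total individual preparation time
    meeting_hours: float
    defects_major: int
    defects_minor: int
    passed: bool              # inspection outcome (pass/fail)

    @property
    def total_defects(self) -> int:
        return self.defects_major + self.defects_minor

rec = InspectionRecord(
    item="SRS section 3", item_type="requirements", date="2015-06-01",
    participants=4, prep_hours=6.0, meeting_hours=1.5,
    defects_major=3, defects_minor=11, passed=False,
)
# Effort per defect is a common efficiency metric derived from such records.
print((rec.prep_hours + rec.participants * rec.meeting_hours) / rec.total_defects)
```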

NASA-STD-8739.9, Software Formal Inspection Standard 277, includes lessons learned by practitioners over the last decade. The Standard contains best practices for performing inspections on different work products, including recommendations for checklist contents, the minimum set of reference materials required, the perspectives needed on the inspection team, and reasonable page rates that help plan adequate time for the inspection, each specifically adapted for inspecting:

  • Requirements.
  • Design documents.
  • Source code.
  • Software plans.
  • Software test procedures.
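Given a page rate, planning adequate inspection time is simple arithmetic. A minimal sketch in Python; the rates below are illustrative placeholders, not the values recommended in NASA-STD-8739.9:

```python
# Minimal sketch of using a page rate to plan inspection meeting time
# (the rates here are illustrative placeholders, not values from NASA-STD-8739.9).

import math

ILLUSTRATIVE_PAGE_RATES = {   # pages reviewable per 2-hour meeting (assumed)
    "requirements": 20,
    "design": 30,
    "source code": 15,        # treated as pages of listing for this sketch
}

def meetings_needed(pages: int, product_type: str) -> int:
    """Number of 2-hour inspection meetings implied by the page rate."""
    rate = ILLUSTRATIVE_PAGE_RATES[product_type]
    return math.ceil(pages / rate)

print(meetings_needed(55, "requirements"))  # 3 meetings of 2 hours each
```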

The design and code segments selected for peer review or inspection should be those that are the most critical or complex, that have key interfaces, or that otherwise represent areas where the concerns of multiple stakeholders overlap.

The presence and participation of project management in peer review or inspection meetings is usually not recommended due to the potential negative impact on the effectiveness of the inspections. Typically, management receives only summary-level information on peer reviews/inspections. However, since the project management for both the software and the system are often stakeholders of the work products examined (especially in the context of software plans), they may be included as inspection participants when necessary.

Both management and the inspectors must be aware that defects found during inspections are never to be used for evaluating the authors. Everyone involved in an inspection needs to have a vested interest in improving the product that is being inspected. This requires that everyone be willing to identify defects (including the author) and to help identify potential solutions.

The Fraunhofer Center 421 in Maryland maintains a public website that collects checklists from NASA and other contexts, which can be applied to the types of work products mentioned in these requirements.

NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.

NASA-specific peer review and inspection guidance, checklists, worksheets, etc. are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook. 

Additional guidance related to peer reviews and inspections may be found in related topics and requirements in this Handbook.

4. Small Projects

While small projects are required to use peer reviews and/or inspection processes to evaluate key artifact types, they could make the task more manageable by varying the size of the inspection team, as long as key stakeholders are still represented. When it isn't possible to find all of the needed expertise from within the project team itself, consider whether the peer review/inspection team can leverage personnel from:

  • Areas that interface with the product being developed.
  • Related projects.
  • Other areas within the functional organization.
  • User organization.

Small teams should also determine whether quality assurance personnel from the Center will participate, for example, by providing a trained moderator to oversee the inspection logistics.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users can find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

A substantial body of data and experience justifies the use of inspections on requirements. Finding and fixing requirements problems during requirements analysis is cheaper than doing so later in the life cycle, and substantially cheaper than finding and fixing the same defects after delivery of the software. Data from NASA as well as from numerous other organizations (such as IBM, Toshiba, and the Data & Analysis Center for Software) confirm this effect. 319

The effectiveness of inspections for defect detection and removal in any artifact has also been amply demonstrated. Data from numerous organizations show that a reasonable rule of thumb is that a well-performed inspection typically removes between 60 percent and 90 percent of the existing defects, regardless of the artifact type. 319
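As a worked example of that rule of thumb (the numbers are illustrative only, not project data):

```python
# Worked example of the 60-90 percent rule of thumb cited above (numbers are
# illustrative, not measurements from any project).

initial_defects = 100
for effectiveness in (0.60, 0.75, 0.90):
    remaining = initial_defects * (1 - effectiveness)
    print(f"{effectiveness:.0%} effective inspection leaves ~{remaining:.0f} defects")
```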

A documented lesson from the NASA Lessons Learned database notes the following:

  • Deficiencies in Mission Critical Software Development for Mars Climate Orbiter (1999). Lesson Number 0740: Experience at NASA has also shown that a lack of software reviews can result in loss of spacecraft: "1) Non-compliance with preferred software review practices may lead to mission loss. 2) To identify mission critical software, require concurrent engineering and thorough review by a team of systems engineers, developers, and end users. 3) For all mission critical software (or software interfaces between two systems or two major organizations), systems engineers, developers, and end users should participate in ... walk-throughs of requirements, design, and acceptance plans." 521

