SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking

1. Requirements

5.3.3 The project manager shall, for each planned software peer review or software inspection:

    1. Use a checklist or formal reading technique (e.g., perspective based reading) to evaluate the work products.
    2. Use established readiness and completion criteria.
    3. Track actions identified in the reviews until they are resolved.
    4. Identify the required participants.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-088 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A

4.3.3 The project shall, for each planned software peer review/inspections:

      a. Use a checklist to evaluate the work products.
      b. Use established readiness and completion criteria.
      c. Track actions identified in the reviews until they are resolved.
      d. Identify required participants.

Difference between A and B: Included "formal reading technique" as a mechanism for evaluating work products.
B

5.3.3 The project manager shall, for each planned software peer review or software inspection:

    1. Use a checklist or formal reading technique (e.g., perspective based reading) to evaluate the work products.
    2. Use established readiness and completion criteria.
    3. Track actions identified in the reviews until they are resolved.
    4. Identify required participants.
Difference between B and C: No change

C

5.3.3 The project manager shall, for each planned software peer review or software inspection:

    1. Use a checklist or formal reading technique (e.g., perspective based reading) to evaluate the work products.
    2. Use established readiness and completion criteria.
    3. Track actions identified in the reviews until they are resolved.
    4. Identify the required participants.

Difference between C and D: No change
D

5.3.3 The project manager shall, for each planned software peer review or software inspection:

a. Use a checklist or formal reading technique (e.g., perspective-based reading) to evaluate the work products.
b. Use established readiness and completion criteria.
c. Track actions identified in the reviews until they are resolved.
d. Identify the required participants.



1.3 Applicability Across Classes

[Applicability table: Classes A, B, C, D, E, F, each marked Applicable or Not Applicable]

2. Rationale

Checklists, readiness and completion criteria, tracking of actions, and identification of participants are needed to conduct an effective peer review or inspection. Peer reviews and inspections contribute to product and process quality, risk reduction, confirmation of approach, defect identification, and product improvements.

3. Guidance

This requirement calls out four important best practices that are associated with effective inspections:

a. Using a checklist supports the software peer review or software inspection team members by giving them a memory aid regarding the quality aspects they are responsible for in the document under review. Checklists also provide a concrete way for the inspection process to improve over time: defect types that continually slip through peer reviews or software inspections are added to the checklist so that future teams know to look for them, while checklist items that no longer lead to defects being found are candidates for deletion. If kept up to date in this way, checklists provide a timely and efficient list of the types of issues on which review time should be spent.

Using a formal reading technique such as perspective-based reading helps ensure that the viewpoints of the various customers and stakeholders of the product under review are represented. Peer review or inspection team members take on roles that represent the different points of view. “The goal ... is to provide operational scenarios where members of a review team read a document from a particular perspective, e.g., tester, developer, the user. The combination of different perspectives provides better coverage of the document, i.e., uncovers a wider range of defects, than the same number of readers using their usual technique.” 474
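
As an illustration of the checklist-maintenance idea described above (not something the requirement prescribes), a project could keep each checklist as structured data so that the defect yield of each item can drive adding new items and retiring stale ones. The following Python sketch is hypothetical; the item wording, field names, and retirement threshold are examples only.

# Illustrative sketch: a peer-review checklist kept as data so that item
# yield (defects found per item) can drive adding or retiring items.
# Item wording, field names, and the retirement threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    question: str              # what the reviewer should look for
    defect_type: str           # defect category the item targets
    defects_found: int = 0     # tally across recent inspections
    inspections_used: int = 0  # how many inspections included this item

design_checklist = [
    ChecklistItem("Are all interfaces in the design traced to an ICD?", "traceability"),
    ChecklistItem("Are error and off-nominal paths described for each module?", "robustness"),
]

def record_inspection(checklist, defects_by_type):
    """Update item tallies after an inspection; defects_by_type maps category to count."""
    for item in checklist:
        item.inspections_used += 1
        item.defects_found += defects_by_type.get(item.defect_type, 0)

def retire_candidates(checklist, min_inspections=5):
    """Items that have stopped finding defects are candidates for deletion."""
    return [item for item in checklist
            if item.inspections_used >= min_inspections and item.defects_found == 0]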

b. Readiness and completion criteria are used to ensure that peer review or software inspection time is spent effectively and that confidence can be placed in the outcome. Readiness criteria must be satisfied before an inspection can begin. They represent the minimal set of quality characteristics to be met before it is worthwhile to have a team of subject matter experts spend significant time understanding, assessing, and discussing the product under review or inspection. Readiness criteria also indicate the preparedness of the peer review or software inspection team to conduct the review or inspection. Readiness criteria may specify standards and guidelines to be adhered to, set project-specific criteria such as the level of detail or a particular policy to be followed, and require the use of automated tools (such as static analysis or traceability tools). Completion criteria represent a set of measurable activities that are to be completed at the end of the inspection so that statements can be made with confidence regarding the outcome. For example, completion criteria may require that all process steps have been completed and documented, that metrics have been collected, or that all major defect corrections have been completed and approved.
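
As a hypothetical illustration of the gating idea described above, readiness and completion criteria can be expressed as explicit named checks that must all be satisfied before an inspection starts or closes. The criteria strings in this Python sketch are examples drawn loosely from the paragraph above, not a mandated set.

# Illustrative sketch: readiness and completion criteria as explicit gates.
# The criteria listed are examples only, not a prescribed set.
readiness_criteria = {
    "work product delivered to the review team": False,
    "applicable standards and guidelines identified": False,
    "static analysis run and results attached": False,
    "review team identified and prepared": False,
}

completion_criteria = {
    "all process steps completed and documented": False,
    "inspection metrics collected": False,
    "all major defect corrections completed and approved": False,
}

def unmet(criteria):
    """Return the criteria that still block the inspection from starting or closing."""
    return [name for name, satisfied in criteria.items() if not satisfied]

# The inspection may begin only when no readiness criteria remain unmet,
# and may be closed out only when no completion criteria remain unmet.
can_begin = not unmet(readiness_criteria)
can_close = not unmet(completion_criteria)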

Table G-19 below, from NPR 7123.1, NASA Systems Engineering Processes and Requirements, shows the entrance criteria and success criteria for a peer review activity.

Table G-19 - Peer Review Entrance and Success Criteria

Peer Review

Entrance Criteria:
  1. The product to be reviewed (e.g., document, process, model, design details) has been identified and made available to the review team.
  2. Peer reviewers independent from the project have been selected for their technical background related to the product being reviewed.
  3. A preliminary agenda, success criteria, and instructions to the review team have been agreed to by the technical team and project manager.
  4. Rules have been established to ensure consistency among the team members involved in the peer-review process.
  5. *Spectrum (radio frequency) considerations addressed.

Success Criteria:
  1. Peer review has thoroughly evaluated the technical integrity and quality of the product.
  2. Any defects have been identified and characterized.
  3. The results of the peer review are communicated to the appropriate project personnel.
  4. Spectrum-related aspects have been concurred to by the responsible Center spectrum manager.

*Required per NPD 2570.5.


c. Action items are required to be tracked to completion to ensure that the inspection has a positive impact on software quality. Due to time pressures, teams sometimes identify significant numbers of defects in an inspection and then do not take the time to resolve them, wasting the effort invested. Tracking the action items ensures that this outcome is avoided. In addition to the impact on software quality, this best practice also helps keep the morale of inspection teams high: nothing is more demoralizing for a team than investing significant time in identifying and reporting software defects that are never fixed afterward.
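
A minimal sketch of the tracking idea, assuming a project records each action item with an owner and a closure date and keeps the inspection record open until every item is resolved; the field names and statuses below are illustrative, not a prescribed schema.

# Illustrative sketch: track inspection action items until each is resolved.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ActionItem:
    identifier: str
    description: str
    owner: str
    opened: date
    closed: Optional[date] = None  # set once the fix has been made and verified

def open_items(items):
    """Action items that have not yet been resolved."""
    return [item for item in items if item.closed is None]

def inspection_can_close(items):
    """The inspection record stays open until every action item is resolved."""
    return not open_items(items)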

d. Effective peer reviews or software inspections begin with a planning phase in which plans are made regarding the scope of the document under review, the time available, and other key parameters. One of the most important issues to address in this step is to analyze which perspectives of stakeholders are needed to ensure that all quality aspects can be adequately addressed in an inspection. Taking the time to apply a rigorous inspection process will not automatically yield an effective outcome if the actual engineering knowledge and expertise are never brought to bear on analyzing the document.

NASA-STD-8739.9, Software Formal Inspections Standard, suggests several best practices related to the use of checklists. It recommends that:

  • Each team member has a checklist or similar work aid available, with items relevant to the perspective they are representing.
  • Checklists are included as input to any inspection.
  • Inspectors use the given checklists during their preparation. 277

The Standard offers detailed suggestions as to what types of quality aspects need to be covered by checklists in a variety of different circumstances.

The Standard also suggests that the perspectives of key stakeholders be represented on the inspection team.  Recommended practices include:

  • Inspectors are chosen in consultation with the author.
  • The moderator ensures that objectivity in the selection is maintained.
  • One individual may represent multiple perspectives.
  • Inspectors representing key perspectives must be present and prepared for each relevant stage of the inspection process. 277

Best practices related to the establishment of readiness and completion criteria include:

  • Entrance and exit criteria are specified as part of the inspection procedure; the Standard provides several examples of criteria that NASA teams have found useful.
  • During the inspection planning, the work product under inspection is evaluated against the entrance criteria before the inspection can begin.
  • The project manager defines the criteria used to determine whether an inspection ends by passing the document under review or by requiring a re-inspection.
  • To ensure that close-out activities are undertaken, at the end of any inspection meeting, the moderator:
    • Determines, based on the outcome of the inspection and the criteria previously defined by the project manager, whether a re-inspection will be needed.
    • Compiles, as the outcome of an inspection meeting:
      • A list of classified anomalies or defects identified from the inspections.
      • A list of change requests or discrepancy reports for defects found in work products of the previous development phase that have been put under configuration management (CM).
      • The inspected work product, marked up with clerical defects.
    • Ensures that authors of the work product inspected receive the list of classified anomalies or defects.

Best practices related to tracking actions identified in the reviews until they are resolved include:

  • Action items and defects discussed during the inspection meeting are compiled and tracked starting at that time.
  • The author's fixes to defects discovered during an inspection are verified before the end of that inspection.
  • Each project defines a method for documenting, tracking, and measuring such action items.

Best practices related to the identification of required participants include:

  • The team consists of a minimum of three inspectors, reflecting the diverse viewpoints and objectivity that must be brought to bear during an inspection.
  • Inspection team members are selected based on an analysis of the key stakeholders in the document under inspection.

The Fraunhofer Center 421 in Maryland maintains a public website that collects checklists from NASA and other contexts, which can be applied to the types of work products mentioned in this requirement.

NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.

4. Small Projects

Checklists for various types of inspections can be found at the Fraunhofer Center website 421. Various inspection tools can be used to reduce the effort of tracking the information associated with inspections. See the "Tools" section of the Resources tab for a list of tools.

5. Resources

5.1 References


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

  • Through hundreds of inspections and analyses of their results, the Jet Propulsion Laboratory (JPL) has identified key lessons learned that lead to more effective inspections 235, including:
    • Inspections are carried out by peers representing the areas of the life cycle affected by the material being inspected. Everyone participating should have a vested interest in the work product.
    • Management is not present during inspections.
    • Checklists of questions are used to define the task and to stimulate defect findings.


7. Software Assurance

SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking
5.3.3 The project manager shall, for each planned software peer review or software inspection:
    1. Use a checklist or formal reading technique (e.g., perspective based reading) to evaluate the work products.
    2. Use established readiness and completion criteria.
    3. Track actions identified in the reviews until they are resolved.
    4. Identify the required participants.

7.1 Tasking for Software Assurance

  1. Confirm that the project meets the NPR 7150.2 criteria in "a" through "d" for each software peer review.

  2. Confirm that the project resolves the actions identified from the software peer reviews.

  3. Perform audits on the peer-review process. 

7.2 Software Assurance Products

  • Peer Review Process Audit Report (SA audit results and findings on software peer-review process).


    Objective Evidence

    • Peer review metrics, reports, data, or findings
    • List of participants in the software peer reviews
    • Defect or problem reporting tracking data
    • Software assurance audit reports on the peer-review process

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of topic 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • # of software process Non-Conformances by life-cycle phase over time
  • Preparation time each audit participant spent preparing for the audit
  • Time required to close peer review audit Non-Conformances
  • Trends on non-conformances from audits (Open, Closed, Life-cycle Phase)
  • # of Peer Review Audits planned vs. # of Peer Review Audits performed
  • # of audit Non-Conformances per peer review audit
  • # of peer review Non-Conformances per work product vs. # of peer reviewers
  • # of peer review participants vs. total # invited
  • Preparation time each review participant spent preparing for the review
  • Total # of peer review Non-Conformances (Open, Closed)
  • # of Non-Conformances identified by software assurance during each peer review
  • # of Non-Conformances from reviews (Open vs. Closed; # of days Open)
  • # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project
  • Trends of # Open vs. # Closed over time
  • # of Non-Conformances per audit (including findings from process and compliance audits, process maturity)
  • # of Open vs. Closed Audit Non-Conformances over time
  • Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)
  • # of Compliance Audits planned vs. # of Compliance Audits performed

7.4 Guidance

Task 1: 

Confirm that the project has met the four conditions specified in the software requirement:

  • Confirm that the project has prepared one or more checklists to evaluate the work product before the peer review.

The selection and preparation of the checklists to use during the review is part of the review preparation, along with other practical considerations such as determining who should attend, when and where the review will be held, reserving a room, and distributing the review announcement and the material to be reviewed.

Each review should have specific checklists, depending on the type of asset being reviewed; different checklists would be used for reviewing software development plans, test plans and procedures, configuration management plans, a design document, or a portion of the code. A checklist may also be written to capture the perspective of a particular role on the team. For example, there may be checklists written from the perspective of a person on the requirements development team, the design team, the coding team, the test team, the operations team, or software assurance.

Checklists should be used during the review for guidance on typical types of defects to be found in the type of product being inspected. In addition, the product being inspected is checked against higher-level work products, standards, and interface documents to assure compliance and correctness.

There are many inspection checklists available that can be used as a starting point for different kinds of reviews. Check with your Center process asset team or look in SPAN for examples.

  • The next item to confirm is that the entrance and exit criteria have been established. Before the peer review, software assurance should confirm that the entrance criteria have been met, and before the close-out of the review, that the exit criteria have been met. For the entrance criteria, check the following:
    • Has the leader selected the material to be peer-reviewed at this review, and sent it out to the participants, along with any necessary background material?
    • Has the leader reserved a room, prepared checklists, and selected participants with varying roles in the project?
    • Have the participants reviewed the materials in advance of the review?
  • For the exit criteria, consider the following:
    • Have all the identified issues and defects been recorded and assigned to further investigation and resolution?
    • Has a priority been assigned to the issues and defects?
    • Have the selected metrics been collected and recorded?
    • Has the need for a re-inspection been determined?
    • Have all the issues and problems been closed out? Software assurance should be tracking these issues and verifying they are closed out.
  • The third activity software assurance needs to confirm is that the peer review items have been resolved before the closure of the review. Software assurance will do this by tracking the items independently and verifying that they are closed.
  • The final activity software assurance will confirm is that the leader has selected the participants. Several factors should be considered when selecting review team members (a minimal participant-coverage sketch follows this list).
    • Often peer reviews are more effective if the team size is limited to 5 to 9 people.
    • Team members must have technical knowledge about the project and be familiar with the asset being reviewed. If this is not the case, it is recommended that the leader give the team an overview before the review.
    • Team members should be selected to provide different perspectives on the product being reviewed. So, team members might be selected from the following (depending on the product being reviewed): requirements developers, designers, coders, test team members, software assurance, or knowledgeable peers from a similar project.
    • Usually, it is best to avoid including the manager of the product being reviewed as a team member. The primary purpose of peer reviews is to find errors, and participants might hesitate to mention all the defects for fear of a poor performance rating.
    • Specific roles may be assigned to the participants to make sure the review is efficient. For example, someone may be assigned to “read” or verbally walk through the product, while someone else is assigned to record the issues and defects.
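
A minimal participant-coverage sketch, as referenced above: it assumes the review lead records each participant with the perspective they represent, and it applies the 5-to-9 team size heuristic and the perspective list from the guidance above. Names and perspective labels are hypothetical.

# Illustrative sketch: check that the planned review team covers the needed
# perspectives and stays within the suggested 5-to-9 person team size.
needed_perspectives = {"requirements", "design", "test", "software assurance"}

team = [
    ("Reviewer 1", "design"),
    ("Reviewer 2", "test"),
    ("Reviewer 3", "software assurance"),
    ("Reviewer 4", "requirements"),
    ("Reviewer 5", "design"),
]

covered = {perspective for _, perspective in team}
missing_perspectives = needed_perspectives - covered
team_size_ok = 5 <= len(team) <= 9

print("missing perspectives:", missing_perspectives or "none")
print("team size within 5 to 9:", team_size_ok)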

Task 2:

The other SA-assigned task is to confirm that all the actions from the review have been closed out. The software assurance personnel will do this by tracking the action items and verifying they have been closed. Before the final close-out of the review, the software assurance personnel should meet with the review lead to verify that no outstanding issues remain.

Task 3:

Audit the peer review process at least once a year. Any task that involves performing an audit should also ensure that all audit findings are promptly shared with the project and that identified improvements to the peer review process are addressed in the process guidance.
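
A minimal sketch of the annual-cadence check for Task 3, assuming software assurance keeps a simple record of completed peer-review process audit dates; the dates shown are hypothetical.

# Illustrative sketch: confirm the peer-review process has been audited within
# the last year. Audit dates are hypothetical examples.
from datetime import date, timedelta

completed_audits = [date(2022, 3, 15), date(2023, 2, 1)]

def audited_within_last_year(audit_dates, today):
    """True if at least one process audit occurred in the last 365 days."""
    return any(today - audit <= timedelta(days=365) for audit in audit_dates)

print(audited_within_last_year(completed_audits, today=date(2023, 6, 30)))  # True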


