1. Requirements
4.3.3 The project shall, for each planned software peer review/inspection:
a. Use a checklist to evaluate the work products.
b. Use established readiness and completion criteria.
c. Track actions identified in the reviews until they are resolved.
d. Identify required participants.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Classes C through E that are safety critical are labeled with "P (Center) + SO." This means that the requirement applies to the safety-critical aspects of the software and that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.
Class C software that is not safety critical, as well as Class G, is labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.
Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf software for this class.
Applicability by class (X = requirement applies):
Class A: X
Class B: X
Class C: P (Center) if not safety critical; P (Center) + SO if safety critical
Class D: not applicable if not safety critical; P (Center) + SO if safety critical
Class E: not applicable if not safety critical; P (Center) + SO if safety critical
Class F: X (not OTS)
Class G: P (Center)
Class H: not applicable
2. Rationale
This requirement calls out four important best practices that are associated with effective inspections:
a. Using a checklist supports the software peer review/inspection team members by giving them a memory aid for the quality aspects they are responsible for in the document under review. Checklists also provide a concrete way for the inspection process to improve over time: defect types that continually slip through peer reviews/inspections are added to the checklist so that future teams know to look for them, while checklist items that no longer lead to defects being found are candidates for deletion. Kept up to date in this way, checklists provide a timely and efficient list of the types of issues on which review time should be spent. A minimal sketch of such a living checklist appears after this list.
b. Readiness and completion criteria are used to ensure that peer review/inspection time is spent effectively and that confidence can be placed in the outcome. Readiness criteria are satisfied before an inspection can begin. They represent the minimal set of quality characteristics that must be met before it is worthwhile to have a team of subject matter experts spend significant time understanding, assessing, and discussing the product under review/inspection. Readiness criteria also indicate the preparedness of the peer review/inspection team to conduct the review/inspection. Readiness criteria may specify standards and guidelines to be adhered to; set project-specific criteria such as the level of detail or a particular policy to be followed; and may require the use of automated tools (such as static analysis or traceability tools). Completion criteria represent a set of measurable activities that are to be completed at the end of an inspection so that statements can be made with confidence regarding the outcome. For example, completion criteria may require that all process steps have been completed and documented, that metrics have been collected, or that all major defects have been corrected and the corrections approved.
c. Action items are tracked through completion to ensure that the inspection has a positive impact on software quality. Teams that identify significant numbers of defects in an inspection but then, under schedule pressure, do not take the time to resolve them are wasting effort; tracking the action items ensures that such an outcome is avoided. Beyond the impact on software quality, this best practice also keeps the morale of inspection teams high: nothing is more demoralizing for a team than investing significant time in identifying and reporting software defects that are never fixed.
d. Effective peer reviews/inspections begin with a planning phase that addresses the scope of the document under review, the time available, and other key parameters. One of the most important issues to address in this step is which perspectives or stakeholders are needed to ensure that all quality aspects can be adequately covered in the inspection. Applying a rigorous inspection process will not automatically yield an effective outcome if the relevant engineering knowledge and expertise are never brought to bear on analyzing the document.
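As an illustration of practice (a), a checklist can be maintained as a structured, living artifact rather than a static document, so that items can be added for defect types that escape reviews and removed when they stop producing findings. The sketch below is a hypothetical example, not a NASA tool or a construct from NPR 7150.2; the class and field names are assumptions chosen for illustration.

# Hypothetical sketch of a checklist kept as data so it can evolve with
# inspection results; names and fields are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    question: str      # what the inspector should look for
    defect_type: str   # defect category the item targets
    hits: int = 0      # defects found via this item in recent inspections

@dataclass
class Checklist:
    work_product: str                  # e.g., "Software Design Description"
    items: List[ChecklistItem] = field(default_factory=list)

    def record_escaped_defect(self, defect_type: str, question: str) -> None:
        """Add an item for a defect type that keeps slipping through reviews."""
        if not any(i.defect_type == defect_type for i in self.items):
            self.items.append(ChecklistItem(question, defect_type))

    def stale_items(self) -> List[ChecklistItem]:
        """Candidates for deletion: items that no longer lead to findings."""
        return [i for i in self.items if i.hits == 0]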
3. Guidance
The creators of NASA-STD-2202-93, Software Formal Inspection Standard, which is currently under revision, suggest several best practices related to the use of checklists. They recommend that:
Each team member use a checklist or similar work aid, with items relevant to the perspective that member represents.
Checklists be included as an input to any inspection.
Inspectors use the given checklists during their preparation.
The Standard offers detailed suggestions as to the types of quality aspects that checklists need to cover in a variety of circumstances.
Best practices related to the establishment of readiness and completion criteria include the following (an illustrative sketch follows this list):
Entrance and exit criteria are specified as part of the inspection procedure; the Standard provides several examples of criteria that NASA teams have found useful.
During the inspection planning, the work product under inspection is evaluated against the entrance criteria before the inspection can begin.
The project manager defines the criteria used to determine whether an inspection ends by passing the document under review or by requiring a re-inspection.
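One hypothetical way to make entrance and exit criteria concrete is to record them as named, checkable conditions, so that readiness can be verified during planning and completion verified at close-out. The criteria names and fields below are assumptions chosen for illustration, not criteria prescribed by the Standard.

# Hypothetical sketch: entrance/exit criteria as named predicates over a
# work-product status record; criteria and field names are assumptions.
from typing import Callable, Dict, List, Mapping

Criteria = Mapping[str, Callable[[Dict], bool]]

ENTRANCE: Criteria = {
    "work product under configuration management": lambda wp: wp["in_cm"],
    "conforms to the required template": lambda wp: wp["template_ok"],
    "static analysis results attached": lambda wp: wp["static_analysis_done"],
}

EXIT: Criteria = {
    "all process steps completed and documented": lambda wp: wp["steps_documented"],
    "inspection metrics collected": lambda wp: wp["metrics_collected"],
    "all major defects corrected and approved": lambda wp: wp["open_major_defects"] == 0,
}

def unmet(criteria: Criteria, work_product: Dict) -> List[str]:
    """Return the names of criteria the work product does not yet satisfy."""
    return [name for name, check in criteria.items() if not check(work_product)]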
To ensure that close-out activities are undertaken, at the end of any inspection meeting, the moderator:
Determines, based on the outcome of the inspection and using the criteria previously defined by the project manager, whether a re-inspection is needed.
Compiles, as the outcome of an inspection meeting:
A list of classified anomalies or defects identified from the inspections.
A list of change requests or discrepancy reports for defects found in work products of the previous development phase that have been put under configuration management (CM).
The inspected work product marked with clerical defects.
Ensures that authors of the work product inspected receive the list of classified anomalies or defects.
Best practices related to tracking actions identified in the reviews until they are resolved include:
Action items and defects discussed during the inspection meeting are compiled and tracked starting at that time.
The author's fixes to defects discovered during an inspection are verified before the end of that inspection.
Each project defines a method for documenting, tracking, and measuring such action items; a minimal tracking sketch follows this list.
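The sketch below shows one hypothetical shape such a tracking record could take, with just enough state to follow each item from identification through verified resolution. The status values and field names are illustrative assumptions, not a prescribed project method.

# Hypothetical action-item record tracked until resolution is verified;
# field names and status values are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum
from typing import List

class Status(Enum):
    OPEN = "open"            # defect or action identified in the inspection
    FIXED = "fixed"          # author has applied a fix
    VERIFIED = "verified"    # fix confirmed, item can be closed

@dataclass
class ActionItem:
    identifier: str
    description: str
    severity: str            # e.g., "major" or "minor"
    owner: str
    status: Status = Status.OPEN

def blocking_close_out(items: List[ActionItem]) -> List[ActionItem]:
    """Items that still prevent the inspection from being closed out."""
    return [i for i in items if i.status is not Status.VERIFIED]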
Best practices related to the identification of required participants include:
The team consists of a minimum of three inspectors, reflecting the diverse viewpoints and objectivity that must be brought to bear during an inspection.
Inspection team members are selected based on an analysis of the key stakeholders in the document under inspection.
The Fraunhofer Center in Maryland (ref. 421) maintains a public website that collects checklists from NASA and other contexts, which can be applied to the types of work products mentioned in these requirements.
Note: Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.
4. Small Projects
Checklists for various types of inspections can be found at the Fraunhofer Center website (ref. 421). Various inspection tools can be used to reduce the effort of tracking the information associated with inspections. See the "Tools" section of the Resources tab for a list of tools.
5. Resources
refstable
toolstable
6. Lessons Learned
Over the course of hundreds of inspections and analysis of their results, the Jet Propulsion Laboratory (JPL) has identified key lessons learned that lead to more effective inspections, including:
Inspections are carried out by peers representing the areas of the life cycle affected by the material being inspected. Everyone participating should have a vested interest in the work product.
Management is not present during inspections.
Checklists of questions are used to define the task and to stimulate defect finding.