{alias:SWE-089}
{tabsetup:1. The Requirement|2. Rationale|3. Guidance|4. Small Projects|5. Resources|6. Lessons Learned}
{div3:id=tabs-1}
h1. 1. Requirements
4.3.4 The project shall, for each planned software peer review/inspection, record basic measurements.
h2. {color:#003366}{*}1.1 Notes{*}{color}
The requirement describing the contents of a Software Peer Review/Inspection Report is defined in Chapter 5 \[of NPR 7150.2, NASA Software Engineering Requirements, Section 5.3.3\].
h2. {color:#003366}{*}1.2 Applicability Across Classes{*}{color}
Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf (OTS) software for these classes. Class G is labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.
{applicable:asc=1|ansc=1|bsc=1|bnsc=1|csc=0|cnsc=0|dsc=0|dnsc=0|esc=0|ensc=0|f=*|g=p|h=0}
{div3}
{div3:id=tabs-2}
h1. 2. Rationale
As with other engineering practices, it is important to monitor defects, pass/fail results, and effort. This is necessary to ensure that peer reviews/inspections are being used in an appropriate way as part of the overall software development life cycle, and to be able to improve the process itself over time. Moreover, key measurements are required to interpret inspection results correctly. For example, if very little effort is expended on an inspection or key phases (such as individual preparation) are skipped altogether, it is very unlikely that the inspection will have found a majority of the existing defects.
{div3}
{div3:id=tabs-3}
h1. 3. Guidance
NASA-STD-2202-93, Software Formal Inspection Standard, is currently being updated and revised to include lessons that practitioners have learned over the last decade.
The creators of the updated standard suggest several best practices related to the collection and use of inspection data.
This requirement, together with [SWE-119|SWE-119], calls for collecting effort, number of participants, number and types of defects found, pass/fail results, and identification of the inspected product in order to assess the effectiveness of inspections (a minimal record sketch appears after the questions below). Where peer reviews/inspections yield less than expected results, some questions to address may include:
* Are peer reviews/inspections being deployed for the appropriate artifacts? As described in the rationale for [SWE-087|SWE-087], this process is often most beneficial when applied to artifacts such as requirements and test plans.
* How are peer reviews/inspections being applied with respect to other verification and validation (V&V) activities? It may be worth considering whether this process is being applied only after other approaches to quality assurance (e.g., unit testing) that are already finding defects, perhaps less cost-effectively.
* Are peer review/inspection practices being followed appropriately? Tailoring away key parts of the inspection process (e.g., planning or preparation), or undertaking inspections with key expertise missing from the team, will not produce the best results.
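As an illustration only, the sketch below shows one way the basic measurements named above (effort, participants, defects found, pass/fail, identification) could be captured for a single peer review/inspection. The field names and the Python representation are assumptions made for this example; projects record the same data in whatever form their Center process or tooling prescribes.
{code:language=python}
# Illustrative sketch only: one possible record of the basic measurements
# for a single software peer review/inspection (field names are assumptions).
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class InspectionRecord:
    artifact_id: str             # identification of the inspected work product
    inspection_date: str         # e.g., "2024-05-17"
    participants: int            # number of inspectors taking part
    prep_effort_hours: float     # individual preparation effort (person-hours)
    meeting_effort_hours: float  # inspection meeting effort (person-hours)
    defects_by_type: Dict[str, int] = field(default_factory=dict)  # e.g., {"major": 2, "minor": 7}
    passed: bool = False         # pass/fail disposition of the inspected artifact

    @property
    def total_defects(self) -> int:
        return sum(self.defects_by_type.values())
{code}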
As with other forms of software measurement, best practices for ensuring that the collection and analysis of peer review/inspection metrics are done well include:
* There need to be clear triggers as to when the metrics are gathered and analyzed (e.g., after every inspection; once per month).
* It needs to be clear who has been assigned to do this task.
* The units of measure are recorded consistently, e.g., one inspection does not record effort in person-hours and another in calendar-days.
* Measures need to be checked for consistency once collected, and outliers need to be investigated to confirm that the data was entered correctly and the correct definitions were applied (a minimal check is sketched below).
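The consistency checks above can be kept very lightweight. The sketch below is a minimal, non-authoritative example that assumes inspection records are stored as simple dictionaries with hypothetical field names; the 40-hour outlier threshold is likewise only an assumption for illustration.
{code:language=python}
# Illustrative consistency check over inspection records kept as plain dicts.
# Field names and the outlier threshold are assumptions, not prescribed values.
from typing import Dict, List

def check_records(records: List[Dict], max_reasonable_hours: float = 40.0) -> List[str]:
    """Return human-readable warnings for records that look inconsistent."""
    warnings = []
    for rec in records:
        artifact = rec.get("artifact_id", "?")
        effort = float(rec.get("effort_person_hours", 0))
        if int(rec.get("participants", 0)) <= 0:
            warnings.append(f"{artifact}: participant count is missing or zero")
        if effort <= 0:
            warnings.append(f"{artifact}: no effort recorded; person-hours expected")
        elif effort > max_reasonable_hours:
            warnings.append(f"{artifact}: effort of {effort} person-hours is an outlier; "
                            "verify it was not entered in calendar-days")
    return warnings
{code}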
Best practices related to the collection and analysis of inspection data include:
* The moderator is to be responsible for compiling and reporting the inspection data.
* The project manager explicitly specifies the location and the format of the recorded data.
* Inspections are checked for process compliance using the collected inspection data, for example to verify that:
** Any inspection team consists of at least three persons.
** Any inspection meeting is limited to approximately 2 hours, and if the discussion looks likely to extend far longer, the remainder of the meeting is rescheduled for another time when inspectors can be fresh and re-focused.
** The rate of inspection adheres to the recommended or specified rate for different inspection types.
* A set of analyses is performed periodically on the recorded data to monitor progress (e.g., number of inspections planned versus completed) and to understand the costs and benefits of inspection (a simple example is sketched after this list).
* The outcome of the analyses is leveraged to support the continuous improvement of the inspection process.
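Several of the compliance and progress checks listed above can be expressed directly against the recorded data. The sketch below is a hedged example: the three-person and approximately two-hour limits come from the guidance above, while the dictionary field names and everything else are assumptions for illustration.
{code:language=python}
# Illustrative compliance and progress checks on recorded inspection data.
# Field names are assumptions; the limits mirror the guidance above.
from typing import Dict, List

def compliance_issues(rec: Dict) -> List[str]:
    """Flag deviations from the basic process limits for one inspection."""
    issues = []
    if int(rec.get("participants", 0)) < 3:
        issues.append("inspection team smaller than three persons")
    if float(rec.get("meeting_duration_hours", 0)) > 2.0:
        issues.append("inspection meeting ran past approximately two hours")
    return issues

def progress_summary(planned: int, completed: List[Dict]) -> str:
    """Compare planned versus completed inspections and total defects found."""
    defects = sum(int(r.get("total_defects", 0)) for r in completed)
    return f"{len(completed)} of {planned} planned inspections completed; {defects} defects recorded"
{code}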
In an acquisition context, there are several important considerations for assuring proper inspection usage by software provider(s):
* The metrics to be furnished by software provider(s) must be specified in the contract.
* It must be clear and agreed upon ahead of time whether software providers can define their own defect taxonomies. If providers may use their own taxonomy, request that they furnish its definition or data dictionary. It is also important (especially when the provider team contains subcontractors) to ensure that consistent definitions are used for defect types, defect severity levels, and effort reporting, i.e., which activities count as part of the actual inspection (a simple mapping is sketched below).
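Where providers report against their own defect taxonomies, a simple mapping into a common taxonomy keeps the combined data comparable. The sketch below is purely illustrative; all of the category names and the mapping itself are hypothetical and would in practice come from the provider's data dictionary.
{code:language=python}
# Illustrative mapping from a provider-specific defect taxonomy to a common one.
# Every category name shown here is a hypothetical example.
from typing import Dict

PROVIDER_TO_COMMON: Dict[str, str] = {
    "logic": "major",
    "interface": "major",
    "typo": "minor",
    "style": "minor",
}

def normalize_defects(provider_counts: Dict[str, int]) -> Dict[str, int]:
    """Re-bucket provider-reported defect counts into the common taxonomy."""
    common: Dict[str, int] = {}
    for provider_type, count in provider_counts.items():
        common_type = PROVIDER_TO_COMMON.get(provider_type, "unclassified")
        common[common_type] = common.get(common_type, 0) + count
    return common
{code}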
Additional guidance regarding software peer review/inspection measures can be found in the guidebook section for [SWE-119|SWE-119].
{note}Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections. {note}
{div3}
{div3:id=tabs-4}
h1. 4. Small Projects
Projects with small budgets or a limited number of personnel need not use heavy-weight data collection logistics.
Given the amount of data typically collected, lightweight tools such as Excel sheets or small databases (e.g., implemented in MS Access) are usually sufficient to store and analyze the inspections performed on a project.
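For a small project, such a lightweight store can be as simple as a single spreadsheet exported to CSV. The sketch below, which uses only the Python standard library, shows one way such a file might be summarized; the column names are assumptions that mirror the illustrative records above.
{code:language=python}
# Illustrative summary of inspection records kept in a simple CSV file
# (column names are assumptions; an Excel export would look similar).
import csv

def summarize(csv_path: str) -> None:
    inspections = 0
    defects = 0
    effort = 0.0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            inspections += 1
            defects += int(row["total_defects"])
            effort += float(row["effort_person_hours"])
    print(f"{inspections} inspections, {defects} defects, {effort:.1f} person-hours total")

# Example usage (hypothetical file name):
# summarize("inspection_log.csv")
{code}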
{div3}
{div3:id=tabs-5}
h1. 5. Resources
{refstable}
{toolstable}
{div3}
{div3:id=tabs-6}
h1. 6. Lessons Learned
Over the course of hundreds of inspections and analysis of their results, the Jet Propulsion Laboratory (JPL) has identified key lessons learned that lead to more effective inspections, including:
* Keeping statistics on the number of defects, the types of defects, and the time expended by engineers on the inspections{sweref:235}.
{div3}
{tabclose}