{alias:SWE-137}
{tabsetup:1. The Requirement|2. Rationale|3. Guidance|4. Small Projects|5. Resources|6. Lessons Learned}
{div3:id=tabs-1}
h1. 1. Requirements
4.3.2 The project shall perform and report on software peer reviews/inspections for:
a. Software Development or Management Plan.
b. Software Configuration Management Plan.
c. Software Maintenance Plan.
d. Software Assurance Plan.
e. Software Safety Plan.
h2. 1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
h2. 1.2 Applicability Across Classes
Classes C through E and Safety Critical are labeled with "P (Center) + SO." "P (Center)" means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement, while "SO" means that the requirement applies only for safety critical portions of the software.
Class C and Not Safety Critical as well as Class G are labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.
Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf software for these classes.
{applicable:asc=1|ansc=1|bsc=1|bnsc=1|csc=*|cnsc=p|dsc=*|dnsc=0|esc=*|ensc=0|f=*|g=p|h=0}
{div3}
{div3:id=tabs-2}
h1. 2. Rationale
Peer reviews/inspections are among the most effective verification and validation (V&V) practices for software. ^1^ They can be applied to many kinds of technical artifacts and serve to bring to bear human judgment and analysis from diverse stakeholders in a constructive way.
Since well-developed and appropriate plans that have buy-in from key stakeholders are important elements of critical software success, peer review/inspections are applied to improve the quality of such plans.
{div3}
{div3:id=tabs-3}
h1. 3. Guidance
NASA-STD-2202-93, the NASA Software Formal Inspections Standard, is currently being updated and revised to include lessons learned by practitioners over the last decade. The Standard includes several best practices for performing inspections on the different software plans, including the recommended minimum content of checklists, the perspectives that should be represented on the inspection team, and the inspection rate.
The presence and participation of project management in peer review/inspection meetings are usually not recommended because of the potential negative impact on the effectiveness of the inspections. However, since the project management for both the software and the system are often stakeholders in the work products examined (in the context of this requirement), they may be included as inspection participants when necessary. In such situations, both management and the other inspectors must be aware that defects found during inspections are never to be used to evaluate the authors.
NPR 7150.2 contains specific software documentation requirements for these plans. These content requirements form a basis for inspection checklists, as well as for additional quality criteria relevant to inspections:
| [SWE-102|SWE-102] | Software Development/Management Plan |
| [SWE-103|SWE-103] | Software CM Plan |
| [SWE-105|SWE-105] | Software Maintenance Plan |
| [SWE-106|SWE-106] | Software Assurance Plan |
| [SWE-138|SWE-138] | Software Safety Plan |
For measurement guidance, see [Software Peer Reviews and Inspections - Basic Measurements].
{note}Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.{note}
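Because the requirement is to both perform and report on these inspections, projects typically capture checklist results and basic measurements in a consistent form. The Python sketch below is purely illustrative: the record structure, field names, and checklist items are assumptions for the example, not a format prescribed by NPR 7150.2 or NASA-STD-2202-93.
{code:python}
# Hypothetical sketch only: field names and checklist items are illustrative
# assumptions, not content prescribed by NPR 7150.2 or NASA-STD-2202-93.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ChecklistItem:
    criterion: str           # drawn from the plan's documentation content requirements
    satisfied: bool = False
    defects_found: int = 0   # defects logged against this criterion


@dataclass
class InspectionRecord:
    work_product: str        # e.g., "Software Configuration Management Plan"
    swe_reference: str       # e.g., "SWE-103"
    inspectors: List[str] = field(default_factory=list)
    items: List[ChecklistItem] = field(default_factory=list)

    def total_defects(self) -> int:
        return sum(item.defects_found for item in self.items)

    def report(self) -> str:
        unmet = sum(1 for item in self.items if not item.satisfied)
        return (f"{self.work_product} ({self.swe_reference}): "
                f"{self.total_defects()} defect(s) found, "
                f"{unmet} checklist item(s) not yet satisfied")


# Illustrative usage: report on an inspection of the CM plan.
record = InspectionRecord(
    work_product="Software Configuration Management Plan",
    swe_reference="SWE-103",
    inspectors=["moderator", "author", "software assurance"],
    items=[
        ChecklistItem("Identifies configuration items and baselines", True, 1),
        ChecklistItem("Defines change control procedures", False, 2),
    ],
)
print(record.report())
{code}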
{div3}
{div3:id=tabs-4}
h1. 4. Small Projects
There is currently no guidance for small projects relevant to this requirement.
{div3}
{div3:id=tabs-5}
h1. 5. Resources
# Shull, F., Basili, V. R., Boehm, B., Brown, A. W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., and Zelkowitz, M. V., "What We Have Learned About Fighting Defects," Proc. IEEE International Symposium on Software Metrics (METRICS02), pp. 249-258. Ottawa, Canada, June 2002.
# NASA Technical Standard, ["Software Formal Inspections Standard"|http://satc.gsfc.nasa.gov/Documents/fi/std/fistd.pdf], NASA-STD-2202-93, 1993.
{toolstable}
{div3}
{div3:id=tabs-6}
h1. 6. Lessons Learned
The effectiveness of inspections for defect detection and removal in any artifact has been amply demonstrated. Data from numerous organizations have shown that a reasonable rule of thumb is that a well-performed inspection typically removes between 60% and 90% of the existing defects, regardless of the artifact type. ^1^
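As a rough illustration of how that rule of thumb is applied, removal effectiveness is the ratio of defects removed by the inspection to the total defects estimated to have been present (those removed plus those that escaped to later phases). The counts in the sketch below are made up for the example.
{code:python}
# Illustrative numbers only; the 60-90% range is the rule of thumb cited above.
defects_removed_by_inspection = 30   # hypothetical defects found and fixed at inspection
defects_found_later = 10             # hypothetical escapes found in test or operations

removal_effectiveness = defects_removed_by_inspection / (
    defects_removed_by_inspection + defects_found_later)
print(f"Removal effectiveness: {removal_effectiveness:.0%}")  # 75%, within 60-90%
{code}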
{div3}
{tabclose}