{alias:SWE-137}

Tabs: 1. The Requirement | 2. Rationale | 3. Guidance | 4. Small Projects | 5. Resources | 6. Lessons Learned

{div3:id=tabs-1}

h1. 1. Requirements

4.3.2 The project shall perform and report on software peer reviews/inspections for:

a. Software Development or Management Plan.
b. Software Configuration Management Plan.
c. Software Maintenance Plan.
d. Software Assurance Plan.
e. Software Safety Plan.

h2. 1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

h2. 1.2 Applicability Across Classes

Classes C through E and Safety Critical are labeled with "P (Center) + SO." "P (Center)" means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement, while "SO" means that the requirement applies only to safety-critical portions of the software.

Class C and Not Safety Critical, as well as Class G, are labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf software for this class.

{applicable:asc=1|ansc=1|bsc=1|bnsc=1|csc=*|cnsc=p|dsc=*|dnsc=0|esc=*|ensc=0|f=*|g=p|h=0}

{div3}
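The applicability codes above come from the page's {applicable:...} macro. As an illustration only, the macro string can be unpacked into a lookup table; the parameter-name pattern (e.g. "asc" for Class A Safety Critical, "cnsc" for Class C Not Safety Critical) and the helper function are assumptions inferred from the surrounding prose, not part of NPR 7150.2 or the wiki software.

```python
# Illustrative sketch: parse this page's {applicable:...} macro parameters
# into a lookup table. The parameter-name pattern (e.g. "asc" = Class A,
# Safety Critical; "cnsc" = Class C, Not Safety Critical) is an assumption
# inferred from the surrounding prose, not an official NPR 7150.2 API.

def parse_applicable(macro_params: str) -> dict:
    """Split 'asc=1|ansc=1|...' into a {parameter: code} dictionary."""
    return dict(pair.split("=", 1) for pair in macro_params.split("|"))

params = "asc=1|ansc=1|bsc=1|bnsc=1|csc=*|cnsc=p|dsc=*|dnsc=0|esc=*|ensc=0|f=*|g=p|h=0"
table = parse_applicable(params)

# Per the prose above: "p" marks "P (Center)"; for Classes C-E Safety
# Critical, "*" marks "P (Center) + SO"; for Class F, "*" marks "X (not OTS)".
print(table["cnsc"])  # -> p
print(table["f"])     # -> *
```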

{div3:id=tabs-2}

h1. 2. Rationale

Peer reviews/inspections are among the most effective {term:V&V} practices for software{sweref:319}. They can be applied to many kinds of technical artifacts, and they bring together human judgment and analysis from diverse stakeholders in a constructive way.

Because well-developed, appropriate plans with buy-in from key stakeholders are important to the success of critical software, peer reviews/inspections are applied to improve the quality of these plans.


{div3}
{div3:id=tabs-3}

h1. 3. Guidance

NASA-STD-2202-93, NASA Software Formal Inspection Standard, is currently being updated and revised to include lessons learned by practitioners over the last decade. The Standard includes several best practices for performing inspections of the different software plans, including the recommended minimum content of checklists, which perspectives should be represented on the inspection team, and the recommended inspection rate.

The presence and participation of project management in peer review/inspection meetings is usually not recommended because of its potential negative impact on the effectiveness of the inspections. However, since software and system project management are often stakeholders in the work products examined (in the context of this requirement), they may be included as inspection participants when necessary. In such situations, both management and the other inspectors must understand that defects found during inspections are never to be used to evaluate the authors.

NPR 7150.2 contains specific software documentation requirements for these plans. These content requirements serve as a basis for inspection checklists, as well as additional quality criteria relevant to inspections:

| [SWE-102|SWE-102] | Software Development/Management Plan |
| [SWE-103|SWE-103] | Software CM Plan |
| [SWE-105|SWE-105] | Software Maintenance Plan |
| [SWE-106|SWE-106] | Software Assurance Plan |
| [SWE-138|SWE-138] | Software Safety Plan Contents |

Additional guidance related to peer reviews and inspections may be found in the following related requirements in this Handbook:

| [SWE-088|SWE-088] | Software Peer Reviews and Inspections - Checklist Criteria and Tracking |
| [SWE-089|SWE-089] | Software Peer Reviews and Inspections - Basic Measurements |
\\



{note}Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections. {note}


{div3}
{div3:id=tabs-4}

h1. 4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.
{div3}
{div3:id=tabs-5}

h1. 5. Resources

{refstable}

{toolstable}


{div3}
{div3:id=tabs-6}

h1. 6. Lessons Learned

The effectiveness of inspections for defect detection and removal in any artifact has been amply demonstrated. Data from numerous organizations have shown that a reasonable rule of thumb is that a well-performed inspection typically removes between 60 and 90 percent of the existing defects, regardless of the artifact type{sweref:319}.

{div3}
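The rule of thumb above lends itself to a quick back-of-the-envelope calculation. This sketch (illustrative only; the function name and the defect counts are hypothetical, not from the Standard) shows how the expected defect count shrinks geometrically when successive inspections each remove a fixed fraction of the remaining defects:

```python
# Back-of-the-envelope sketch of the 60-90 percent rule of thumb:
# if each well-performed inspection removes a fixed fraction of the
# defects present, defects remaining after n passes decay geometrically.

def defects_remaining(initial: float, removal_rate: float, passes: int) -> float:
    """Expected defects left after `passes` inspections at `removal_rate`."""
    return initial * (1.0 - removal_rate) ** passes

# Starting from a hypothetical 100 defects, one pass at the low and
# high ends of the 60-90 percent range:
print(round(defects_remaining(100, 0.60, 1)))  # -> 40
print(round(defects_remaining(100, 0.90, 1)))  # -> 10

# Two passes at the low end already leave only about 16 defects:
print(round(defects_remaining(100, 0.60, 2)))  # -> 16
```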

