

SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures

1. Requirements

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for: 

a. Software requirements.
b. Software plans, including cybersecurity.
c. Any design items that the project identified for software peer review or software inspections according to the software development plans.
d. Software code as defined in the software and or project plans.
e. Software test procedures.

1.1 Notes

Software peer reviews or software inspections are recommended best practices for all safety and mission-success related software components. Recommended best practices and guidelines for software formal inspections are contained in NASA-STD-8739.9, Software Formal Inspection Standard. 277

1.2 History

SWE-087 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A

4.3.1 The project shall perform and report on software peer reviews/inspections for:

    1. Software requirements.
    2. Software Test Plan.
    3. Any design items that the project identified for software peer review/inspections according to the software development plans.
    4. Software code as defined in the software and or project plans.
Difference between A and B

Added item e. Software test procedures to the requirement.
Broadened the scope of the plans to be reviewed to include all Plans, not just the Test Plan.
B

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:

    1. Software requirements.
    2. Software plans.
    3. Any design items that the project identified for software peer review or software inspections according to the software development plans.
    4. Software code as defined in the software and or project plans.
    5. Software test procedures.
Difference between B and C

No change

C

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:

    1. Software requirements.
    2. Software plans.
    3. According to the software development plans, any design items that the project identified for software peer review or software inspections.
    4. Software code as defined in the software and or project plans.
    5. Software test procedures.

Difference between C and D

Added cybersecurity plans to part b.

D

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for: 

a. Software requirements.
b. Software plans, including cybersecurity.
c. Any design items that the project identified for software peer review or software inspections according to the software development plans.
d. Software code as defined in the software and or project plans.
e. Software test procedures.



1.3 Applicability Across Classes

Class          |  A  |  B  |  C  |  D  |  E  |  F
Applicable?    |     |     |     |     |     |

Key: ✓ - Applicable | ✗ - Not Applicable


2. Rationale

Software peer reviews or inspections are performed to ensure product and process quality, add value, and reduce risk through expert knowledge infusion, confirmation of approach, identification of defects, and specific suggestions for product improvements.

NPR 7123.1D - NASA Systems Engineering Processes and Requirements (w/Change 1)

G.20 Peer Reviews
“Peer reviews are focused, in-depth technical reviews that support the evolving design and development of a product, including critical documentation or data packages. The participants in a peer review are the technical experts and key stakeholders for the scope of the review.”
041

3. Guidance

Peer reviews are one of the most efficient and effective ways to find and remove defects.

3.1 Peer Review Defined

NASA-STD-8709.22 provides two definitions for peer reviews:

[1] A review of a software work product, following defined procedures, by peers of the product producers to identify defects and improvements.

[2] Independent evaluation by internal or external subject matter experts who do not have a vested interest in the work product under review. Projects can plan peer reviews and focused reviews, conducted on selected work products by the producer’s peers, to identify defects and issues before that work product moves into a milestone review or approval cycle. 274


Peer reviews can also be described as “planned, focused reviews by technical team peers on a single work product with the intent of identifying issues before that work product moves on to the next step. A peer review includes planning, preparing, conducting, analyzing outcomes, and identifying and implementing corrective actions.” 041

A key rationale for using software peer reviews is that they are one of the few verification and validation (V&V) approaches that can be applied in the early stages of software development, long before there is any code that can be run and tested. Moreover, finding and fixing defects in these early stages, rather than letting them slip into later phases, can have a major impact on the project budget. For this reason, software peer reviews and software inspections of requirements documents are explicitly required.

NASA-GB-8719.13, NASA Software Safety Guidebook

6.5.5 Peer Reviews of Software Requirements
"Peer Reviews have the most impact when applied early in the life of a project, especially the requirements specification and definition stages of a project. Impact means that the defects are found earlier when it's cheaper to fix them... Peer Reviews greatly improves the communication within a project and enhances understanding of the system while scrubbing out many of the major errors and defects."
276

3.2 Advantages of Peer Reviews

3.2.1 Stakeholder Buy-in

Software plans are another critical artifact to which software peer reviews can be applied with a strong return on investment. Because well-developed, appropriate plans with buy-in from key stakeholders are important elements of critical software success, peer reviews or inspections are applied to improve the quality of such plans.

3.2.2 Improving Quality

Software testing represents a substantial part of assuring software quality for most projects, so it is important to ensure that test cases are focused on the appropriate functional areas, cover all important usage scenarios, and specify the expected system behavior adequately and accurately. The best way to ensure these factors is through peer review or inspection, which applies human judgment and analysis.

See also Topic 7.06 - Software Test Estimation and Testing Levels

Code and design artifacts also benefit from peer review or inspection. However, although important, inspections of such artifacts are less crucial than those of other life cycle activities because other effective Verification and Validation (V&V) options are available for code and design (such as testing or simulation). A project needs to identify the most crucial design and code segments and deploy inspections to improve their quality. Projects also need to focus peer reviews of code and design on issues that cannot be verified using automated tools; i.e., projects need to be sure that human judgment is applied to the appropriate issues and that automation is relied on where it is best suited.

3.2.3 Additional Benefits from Peer Reviews

Peer reviews provide the following additional benefits:

  • Useful for many types of products: documentation, requirements, designs, code.
  • Simple to understand.
  • Provide a way for sharing/learning good product development techniques.
  • Serve to bring together human judgment and analysis from diverse stakeholders in a constructive way, which can result in a very efficient method of identifying defects early in the product’s life cycle.
  • Use a straightforward, organized approach for evaluating a work product:
    • To detect potential defects in a product.
    • To methodically evaluate each defect to identify solutions and track the incorporation of these solutions into the product.

3.3 Required Peer Reviews

Projects are required to conduct peer reviews for the following documentation based on software classification:

Software Documentation                |  A  |  B  |  C  |  D            |  E
Software Requirements                 |  X  |  X  |  X  |  X (SC only)  |
Software Plans                        |  X  |  X  |  X  |  X (SC only)  |
Software Design Identified in Plans   |  X  |  X  |  X  |  X (SC only)  |
Software Code Identified in Plans     |  X  |  X  |  X  |  X (SC only)  |
Test Procedures                       |  X  |  X  |  X  |  X (SC only)  |

Work Product by Activity | Related Materials

A.01 Software Life Cycle Planning
  • SDP-SMP - Software Development - Management Plan
  • Cost Estimates
  • Schedule
  • Training Plan
  • Compliance Matrix

A.02 Software Assurance and Software Safety
  • IV&V Plan

A.03 Software Requirements
  • Requirements Specification

A.04 Software Design
  • Architecture Description
  • Design Description Document
  • Interface Design Description

A.05 Software Implementation
  • Code Modules
  • Unit Test Procedures and test scripts

A.06 Software Testing
  • Test Plans
  • Test Procedures
  • Regression Tests
  • Code Coverage Tests
  • Acceptance Tests
  • Other Tests

A.07 Software Release, Operations, Maintenance, and Retirement
  • Maintenance Plan
  • Release Version Description Document

A.08 Software Configuration Management
  • CM Plan

A.09 Software Risk Management
  • Risk Management Plans

A.10 Software Peer Reviews and Inspections
  • Peer Reviews and Inspections

A.11 Software Measurements
  • Metrics Reports and Collection Procedures
  • Metrics Analysis Procedures

A.12 Software Non-conformance or Defect Management
  • Defect Management Procedures



3.4 Preparing for a Peer Review 

3.4.1 Peer Review Process Elements 

To have an effective software peer review, be sure the following are part of the process:

  • Keep the review focused on the technical integrity and quality of the product.
  • Keep the review simple and informal, and manage time effectively.
  • Concentrate on the review of the documentation and minimize presentations.
  • Use a round-table format rather than a stand-up presentation.
  • Give a full technical picture of the items being reviewed.
  • Plan the review/inspection, use checklists, and include readiness and completion criteria.
  • Capture action items, and monitor defects, results, and effort.
  • Don't pass your opinion off as fact, and don't ask judgmental questions.
  • Before a software code review, run the project's static analysis tools on any source code under review (a minimal automation sketch follows this list).
  • Make sure that the code changes implement the software requirement and that the software requirement is up-to-date.
  • Don't use emojis to point out issues, and don't ghost people (let the reviewers know what you did with their comments).
  • Automate when possible.
  • Take advantage of the talent.
  • Use the code reviews and inspections as a teaching opportunity.
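The static-analysis item above can be automated so that the analysis report is generated and attached to the inspection package before the review. The following is a minimal sketch, assuming a project whose chosen static analysis tool is cppcheck and whose code lives under src/; the tool choice, paths, and report name are illustrative, not mandated by this requirement.

```
"""Run the project's static analysis tool and save the report for the
inspection package (illustrative sketch; adapt tool and paths to the project)."""
import subprocess
import sys
from pathlib import Path

SOURCE_DIR = Path("src")                                   # assumed source tree
REPORT = Path("peer_review/static_analysis_report.txt")    # attached to the review package


def run_static_analysis() -> int:
    """Run cppcheck (example tool) and capture its findings to a report file."""
    REPORT.parent.mkdir(parents=True, exist_ok=True)
    result = subprocess.run(
        ["cppcheck", "--enable=all", "--quiet", str(SOURCE_DIR)],
        capture_output=True, text=True,
    )
    # cppcheck writes findings to stderr; keep stdout too in case another tool is used.
    REPORT.write_text(result.stdout + result.stderr)
    return result.returncode


if __name__ == "__main__":
    rc = run_static_analysis()
    print(f"Static analysis complete (exit code {rc}); report at {REPORT}")
    sys.exit(rc)
```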

3.4.2 Peer Review Members and Roles

When putting a software peer review team together, use the following best practices:

  • The team consists of a minimum of four inspectors.
    • Diverse viewpoints and objectivity are required.
  • Inspection team membership is based on an analysis of the key stakeholders in the item under inspection.
  • The author should not be the reader, recorder, or moderator.
  • The moderator should not be a manager.
  • At a minimum, the moderator should be formally trained in the process; ideally, all participants are trained.
  • Management presence/participation is discouraged.
  • Each role has a specific responsibility, as shown in the table below:

Role                   |  Responsibility
Moderator              |  Conducts and controls the inspection.
Author                 |  The producer of the product under inspection; answers technical questions.
Reader                 |  Presents (reads, paraphrases) the inspection product to the inspection team.
Recorder               |  Documents defects identified during the inspection as well as open issues and action items.
Software Peers         |  Look for software and software coding defects in the product under inspection.
Hardware Engineer(s)   |  Look for defects in the product under inspection, and ensure software control of hardware is correct.
System Engineer(s)     |  Look for defects in the product under inspection, and ensure software control of the system is correct, including fault detection, fault isolation, and fault recoveries.

3.4.3 Entrance and Exit Criteria

The table below is from NPR 7123.1, NASA Systems Engineering Processes and Requirements, and shows the entrance criteria and success criteria for a peer review activity.

Table G-19 - Peer Review Entrance and Success Criteria

Entrance Criteria:

  1. The product to be reviewed (e.g., document, process, model, design details) has been identified and made available to the review team.
  2. Peer reviewers independent from the project have been selected for their technical background related to the product being reviewed.
  3. A preliminary agenda, success criteria, and instructions for the review team have been agreed to by the technical team and project manager.
  4. Rules have been established to ensure consistency among the team members involved in the peer-review process.
  5. *Spectrum (radio frequency) considerations addressed.

Success Criteria:

  1. The peer review has thoroughly evaluated the technical integrity and quality of the product.
  2. Any defects have been identified and characterized.
  3. The results of the peer review are communicated to the appropriate project personnel.
  4. Spectrum-related aspects have been concurred on by the responsible Center spectrum manager.
  5. A code peer review checks to verify that the code meets the software requirements.

*Required per NPD 2570.5.
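Some teams encode the entrance criteria as a simple, machine-checkable readiness checklist so the moderator can confirm the review is ready to proceed. The sketch below is only an illustration: the criterion strings paraphrase Table G-19 above, and the data structure is an assumption, not a NASA-prescribed format.

```
"""Readiness check for a peer review, sketched from the entrance criteria above."""
from dataclasses import dataclass


@dataclass
class EntranceCriterion:
    description: str
    satisfied: bool = False


# Paraphrased from Table G-19 entrance criteria (illustrative only).
criteria = [
    EntranceCriterion("Product identified and distributed to the review team"),
    EntranceCriterion("Independent reviewers with relevant technical background selected"),
    EntranceCriterion("Agenda, success criteria, and instructions agreed by technical team and PM"),
    EntranceCriterion("Review rules established for consistency among team members"),
    EntranceCriterion("Spectrum (radio frequency) considerations addressed, if applicable"),
]


def ready_for_review(items) -> bool:
    """Return True only when every entrance criterion is marked satisfied."""
    unmet = [c.description for c in items if not c.satisfied]
    for description in unmet:
        print(f"NOT MET: {description}")
    return not unmet


if __name__ == "__main__":
    # The moderator marks criteria as they are satisfied, e.g.:
    criteria[0].satisfied = True
    print("Ready to hold the review:", ready_for_review(criteria))
```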

3.4.4 Peer Review Process Steps

Software peer reviews are conducted using the following steps:

Step                |  Description
Planning            |  Organize the inspection, inspection package contents, required support, and schedule.
Overview            |  Educational briefing at the time of package distribution to explain the materials at a high level.
Preparation         |  Inspectors individually look for and document defects and develop questions.
Inspection Meeting  |  Inspectors examine the product as a group to classify and record defects and capture open issues and action items.
Third Hour          |  Optional informal meeting to resolve open issues and discuss solutions.
Rework              |  The author corrects major defects (and others when cost and schedule allow).
Follow-up           |  The moderator verifies that all major and other dispositioned defects have been corrected, that no new defects were introduced, and that all action items/open issues are closed.

3.4.5 Peer Review Best Practices

When conducting the software peer review, incorporate the following best practices to ensure an effective outcome:

  • Defects found during inspections are never used to evaluate the author – the goal is to improve the product.
  • Use checklists relevant to each inspector’s perspective.
  • Verify that the product being reviewed meets the requirements.
  • Use readiness and completion criteria.
  • Limit the inspection meeting to two hours.
  • Track action items until they are resolved.
  • Collect and use inspection data (a minimal data-record sketch follows this list):
    • The effort, number of participants, and areas of expertise.
    • Defects - list, total, type.
    • Inspection outcome (pass/fail).
    • The item being inspected and type (requirements, code, etc.).
    • Date and time.
    • Meeting length and the preparation time of participants.
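As a concrete illustration of the "collect and use inspection data" item above, the record below captures the listed fields and derives a simple defect-density value. This is a minimal sketch; the field names and the pages-inspected denominator are assumptions, not a prescribed NASA format.

```
"""Minimal inspection data record and a derived defect-density metric (illustrative)."""
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class InspectionRecord:
    item_name: str              # item being inspected
    item_type: str              # requirements, design, code, test procedure, ...
    date: datetime
    participants: List[str]
    prep_hours: float           # total individual preparation time
    meeting_hours: float        # length of the inspection meeting
    defects: List[str]          # defect descriptions (could also carry type/severity)
    outcome: str                # "pass" or "fail"
    pages_inspected: float      # size basis for density (assumed measure)

    def defect_density(self) -> float:
        """Defects found per page inspected (guard against a zero denominator)."""
        return len(self.defects) / self.pages_inspected if self.pages_inspected else 0.0


record = InspectionRecord(
    item_name="SRS section 3.2", item_type="requirements",
    date=datetime(2024, 5, 1),
    participants=["moderator", "author", "reader", "recorder"],
    prep_hours=6.0, meeting_hours=2.0,
    defects=["ambiguous timing requirement", "missing error-handling requirement"],
    outcome="pass", pages_inspected=12,
)
print(f"Defect density: {record.defect_density():.2f} defects/page")
```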

NASA-STD-8739.9, Software Formal Inspection Standard 277 includes lessons that practitioners have learned over the last decade. The Standard contains best practices related to performing inspections on different work products, including recommendations for checklist contents, the minimum set of reference materials required, the perspectives needed on the inspection team, and reasonable page rates that help plan adequate time for the inspection, specifically adapted for inspections of:

  • Requirements. 
  • Design documents. 
  • Source code. 
  • Software plans. 
  • Software test procedures. 

The design and code segments selected for peer review or inspection should be those that are the most critical or complex, have key interfaces, or otherwise represent areas where the concerns of multiple stakeholders overlap.

The presence and participation of project management in peer review or inspection meetings are usually not recommended due to the potential negative impact on the effectiveness of the inspections. Typically, management only receives summary-level information on peer reviews/inspections. However, since the project managers for both the software and the system are often stakeholders of the work products examined (especially in the context of software plans), they may be included as participants in the inspections only when necessary.

Both management and the inspectors must be aware that defects found during inspections are never used to evaluate the authors. Everyone involved in an inspection needs to have a vested interest in improving the product that is being inspected. This requires that everyone be willing to identify defects (including the author) and help identify potential solutions. 

NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections. 

See also 5.03 - Inspect - Software Inspection, Peer Reviews, Inspections

See SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking for additional checklists. 

Some best practices related to performing peer reviews on different work products:

  1. Checklists for system requirement inspections should contain items that:
    1.  Describe the proper allocation of functions to software, firmware, hardware, and operations.
    2. Address the validation of all external user interfaces.
    3. Check that all the software system functions are identified and broken into configuration items and that the boundary between components is well-defined.
    4. Check that all configuration items within the software system are identified.
    5. Check that the identified configuration items provide all functions required of them.
    6. Check that all interfaces between configuration items within the software system are identified.
    7. Address the correctness of the software system structure.
    8. Check that all quantifiable requirements and requirement attributes have been specified.
    9. Address the verifiability of the requirements.
    10. Check for the traceability of requirements from mission needs (e.g., use cases, etc.).
    11. Check for the traceability of requirements from system safety and reliability analyses (e.g., Preliminary Hazard Analysis (PHA), Fault Tree Analysis (FTA), Failure Modes and Effects Analysis (FMEA), hazard reports, etc.).
      See also Topic 8.05 - SW Failure Modes and Effects Analysis
  2. Checklists for software requirements inspections should contain items that:
    1. Check that the specification of each of the following is complete and accurate:
      1. Software functions.
      2. Input and output parameters.
      3. States and modes.
      4. Timing and sizing requirements for performance.
      5. Interfaces.
      6. Use cases, if available.
    2. Check that specifications are included for error detection and recovery, reliability, maintainability, performance, safety, and accuracy.
    3. Check that safety-critical modes and states, and any safety-related constraints, are identified.
    4. Address the traceability of requirements from higher-level documents.
    5. Check that the requirements provide a sufficient base for the software design.
    6. Check that the requirements are measurable, consistent, complete, clear, concise, and testable.
    7. Check that the content of the software requirements specification fulfills the NPR 7150.2 recommendations, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.

  3. Checklists for architectural (preliminary) design should contain items that:
    1. Check that the design meets approved requirements.
    2. Address the validation of all interfaces among modules within each component.
    3. Address the completeness of the list of modules and the general function(s) of each module.
    4. Address the validation of fault detection, identification, and recovery requirements.
    5. Check that the component structure meets the requirements.
    6. Address the validation of the selection of reusable components.
    7. Address the traceability of the design to the approved requirements.
    8. Address the validation of the input and output interfaces.
    9. Check that each design decision is a good match to the system’s goal.
    10. Check that the content of the design description fulfills the NPR 7150.2 recommendation, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.
    11. Check that safety controls and mitigations are identified in the design document when a safety-critical system is under inspection (Review system safety analyses in supporting documentation).
    12. When inspecting object-oriented or other design models:
      1. Check that the notations used in the diagram comply with the agreed-upon model standard notation (e.g., UML notations).
      2. Check that the design is modular.
      3. Check that the cohesion and coupling of the models are appropriate.
      4. Check that architectural styles and design patterns are used where possible. If design patterns are applied, validate that the selected design pattern is suitable.
      5. Check the outputs of any internal or external static analysis tools.

  4. Checklists for detailed design should contain items that:
    1. Check that the design meets the approved requirements.
    2. Address the validation of the choice of data structures, logic algorithms (when specified), and relationships among modules.
    3. Check that the detailed design is complete for each module.
    4. Address the traceability of the design to the approved requirements.
    5. Check that the detailed design meets the requirements and is traceable to the architectural software system design.
    6. Check that the detailed design is testable.
    7. Check that the design can be successfully implemented within the constraints of the selected architecture.
    8. Check the output from any static analysis tools available.

  5. Checklists for source code should contain items that:
    1. Address the technical accuracy and completeness of the code concerning the requirements.
    2. Check that the code implements the detailed design.
    3. Check that all required standards (including coding standards) are satisfied.
    4. Check that latent errors are not present in the code, including errors such as index out-of-range errors, buffer overflow errors, or divide-by-zero errors.
    5. Address the traceability of the code to the approved requirements.
    6. Address the traceability of the code to the detailed design.
    7. When static or dynamic code analysis is available, check the results of these tools.

  6. Checklists for the test plan should contain items that:
    1. Check that the purpose and objectives of testing are identified in the test plan and that they contribute to the satisfaction of the mission objectives.
    2. Check that all new and modified software functions will be verified to operate correctly within the intended environment and according to approved requirements.
    3. Check that the resources and environments needed to correctly verify software functions and requirements are identified.
    4. Check that all new and modified interfaces will be verified.
    5. Address the identification and elimination of extraneous or obsolete test plans.
    6. Check that each requirement will be tested.
    7. Check that the tester has determined the expected results before executing the test(s).
    8. For safety-critical software systems:
      1. Check that all software safety-critical functions or hazard controls and mitigations will be tested. This testing should include ensuring that the system will enter a safe state when unexpected anomalies occur.
      2. Check that safety and reliability analyses have been used to determine which failures and failure combinations to test for.
    9. Check that the content of the test plan fulfills NPR 7150.2 recommendations, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.

  7. Checklists for test procedures should contain items that:
    1. Check that the set of test procedures meets the objective of the test plan.
    2. Check that each test procedure provides:
      1. A complete and accurate description of its purpose
      2. A description of how it executes
      3. All expected results.
    3. Check that each test procedure identifies which requirement(s) it is testing and correctly tests the listed requirement(s).
    4. Check that each test procedure identifies the required hardware and software configurations.
    5. Check that test procedures exist to verify the correctness of the safety-critical controls, as well as any software controls or mitigations of hazards (HW, SW, or CPLD), and that the system can reach a safe state from different modes, states, and conditions.
    6. Check that each test procedure will objectively verify the implementation of the requirement with the expected outcome.
    7. Check that the content of the software test procedure fulfills NPR 7150.2 recommendations, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.

See also SWE-089 - Software Peer Reviews and Inspections - Basic Measurements

3.4.6 Other Guidance 

The Fraunhofer Center in Maryland maintains a public website that collects checklists from NASA and other contexts that can be applied to the types of work products mentioned in these requirements.

3.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

Related Links

3.6 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki  197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

4. Small Projects

While small projects are required to use peer reviews and inspection processes to evaluate key artifact types, they could make the task more manageable by varying the size of the inspection team, as long as key stakeholders are still represented. When it isn't possible to find all of the needed expertise from within the project team itself, consider whether the peer review team can leverage personnel from:

  • Areas that interface with the product being developed.
  • Related projects.
  • Other areas within the functional organization.
  • User organization.

Small teams should also determine whether quality assurance personnel from the Center can participate, for example, by providing a trained moderator to oversee the inspection logistics.

5. Resources

5.1 References


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Deficiencies in Mission Critical Software Development for Mars Climate Orbiter (1999). Lesson Number 0740 521: Experience at NASA has also shown that a lack of software reviews can result in the loss of spacecraft: "1) Non-compliance with preferred software review practices may lead to mission loss. 2) Identifying mission-critical software requires concurrent engineering and thorough review by a team of systems engineers, developers, and end-users. 3) For all mission-critical software (or software interfaces between two systems or two major organizations), systems engineers, developers, and end-users should participate in ... walk-throughs of requirements, design, and acceptance plans."
  • Ariane 5 - The Inquiry Board's Recommendations: 685
    • Review all flight software (including embedded software), and in particular:

      • Identify all implicit assumptions made by the code and its justification documents on the values of quantities provided by the equipment. Check these assumptions against the restrictions on the use of the equipment.
      • Verify the range of values taken by any internal or communication variables in the software.
      • Solutions to potential problems in the onboard computer software, paying particular attention to onboard computer switchover, shall be proposed by the Project Team and reviewed by a group of external experts, who shall report to the onboard computer Qualification Board.
    • Include external (to the project) participants when reviewing specifications, code, and justification documents. Make sure that these reviews consider the substance of arguments, rather than check that verifications have been made.

6.2 Other Lessons Learned

  • A substantial body of data and experience justifies the use of inspections on requirements. Finding and fixing requirements problems during requirements analysis is cheaper than doing so later in the life cycle and is substantially cheaper than finding and fixing the same defects after delivering the software. Data from NASA and numerous other organizations (such as IBM, Toshiba, and the Defense Analysis Center for Software) all confirm this effect. 319
  • The effectiveness of inspections for defect detection and removal in any artifact has also been amply demonstrated. Data from numerous organizations have shown that a reasonable rule of thumb is that a well-performed inspection typically removes between 60 percent and 90 percent of the existing defects, regardless of the artifact type. 319
  • Ensure peer review participation by key stakeholders including Systems Engineering, and all affected Responsible Engineers.
  • Perform requirements checks as part of Implementation or code peer reviews.

7. Software Assurance

SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures
5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for: 

a. Software requirements.
b. Software plans, including cybersecurity.
c. Any design items that the project identified for software peer review or software inspections according to the software development plans.
d. Software code as defined in the software and or project plans.
e. Software test procedures.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that software peer reviews are performed and reported on for project activities. 

2. Confirm that the project addresses the accepted software peer review findings.

3. Perform peer reviews on software assurance and software safety plans.

4. Confirm that the source code satisfies the conditions in the NPR 7150.2 requirement SWE-134, "a" through "l," based upon the software functionality for the applicable safety-critical requirements at each code inspection/review.

7.2 Software Assurance Products

  • SA peer review records (including findings for software assurance and software safety plans)

Objective Evidence

  • Peer review metrics, reports, data, or findings.
  • List of participants in the software peer reviews.

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

  • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
  • Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
  • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
  • Signatures on SA reviewed or witnessed products or activities, or
  • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
    • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
    • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
  • The specific products listed in the introduction of Topic 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • # of Non-Conformances identified in each peer review
  • # of peer reviews performed vs. # of peer reviews planned
  • # of software work product Non-Conformances identified by life cycle phase over time
  • Time required to close review Non-Conformances
  • Total # of peer review Non-Conformances (Open, Closed)
  • # of Non-Conformances identified by software assurance during each peer review
  • # of Non-Conformances and risks open vs. # of Non-Conformances and risks identified with test procedures
  • # of safety-related non-conformances identified by life cycle phase over time
  • # of safety-related requirement issues (Open, Closed) over time
  • # of Non-Conformances (activities not being performed)
  • # of Non-Conformances accepted by the project
  • # of Non-Conformances (Open, Closed, Total) 
  • Trends of Open vs. Closed Non-Conformances over time
  • % of Total Source Code for each Software Classification (*organizational measure)

See also Topic 8.18 - SA Suggested Metrics
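A minimal sketch of tallying two of the metrics listed above (peer reviews performed vs. planned, and open vs. closed non-conformances) is shown below; the record layouts are assumed purely for illustration and are not a prescribed format.

```
"""Tally a few of the suggested peer review metrics (illustrative sketch)."""
from collections import Counter

# Assumed record layout: one entry per planned review, marked performed or not,
# plus a flat list of non-conformances with an open/closed status.
planned_reviews = [
    {"name": "SRS peer review", "performed": True},
    {"name": "SDP peer review", "performed": True},
    {"name": "Test procedure peer review", "performed": False},
]
non_conformances = [
    {"id": "NC-1", "status": "closed"},
    {"id": "NC-2", "status": "open"},
    {"id": "NC-3", "status": "open"},
]

performed = sum(1 for r in planned_reviews if r["performed"])
print(f"Peer reviews performed vs. planned: {performed}/{len(planned_reviews)}")

status_counts = Counter(nc["status"] for nc in non_conformances)
print(f"Non-conformances open: {status_counts['open']}, closed: {status_counts['closed']}")
```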

7.4 Guidance

Confirm that peer reviews are planned and are being executed on the items listed in SWE-087. Early in the project (e.g., before SRR or PDR), assurance confirms that peer reviews are planned for the requirements, the software plan, and the test procedures. In addition to these peer reviews, confirm that the project has considered whether other products need to be reviewed and have included a list of those in their software management/development plan. 

Generally, the additional peer-reviewed items are design and code products for any software design or code intended to address requirements for critical software or for any of the areas of the code or design that are particularly complex. Software assurance may also choose other areas to peer review independently if they feel a particular area or product needs review.

Plan to attend any scheduled reviews listed in this requirement and any identified in the software plan. Other software assurance responsibilities associated with peer reviews are addressed in SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking. This includes confirming that all issues and defects from the peer reviews are recorded and addressed before the review is closed out. Software assurance tracks all actions (issues, defects) from the peer reviews and verifies their closure before the reviews are closed. See also Topic 7.10 - Peer Review and Inspections Including Checklists

Software assurance products such as the SA Plan and the requirement assessments done by SA will also be peer-reviewed. In addition to addressing any issues, or defects resulting from the SA product reviews, the software assurance team tracks its review metrics, including those listed in section 7.3 above. Software assurance confirms that the software team is collecting metrics on their peer reviews. It is beneficial for software assurance to collect the metrics on any issues and defects found by software assurance in software peer reviews to show that their attendance was valuable.

Ensure peer review participation by key stakeholders including Systems Engineering, and all affected Responsible Engineers.

Ensure that peer reviews perform requirements checks as part of any code peer review activity and process.

Software Assurance personnel should verify that the performed peer reviews comply with the procedures by:

  1. Selectively reviewing peer review packages for required materials and personnel participation.
  2. Participating in peer reviews, including fulfilling any of the inspection roles.
  3. Providing an independent evaluation of the effectiveness of the inspection process and the product quality.

Software Assurance personnel will:

  1. Ensure compliance with requirements defined in NPR 7150.2 and NASA-STD-8739.8.
  2. Ensure that preventive and safety measures are being implemented.
  3. Verify that requirements include error detection and recovery methods.
  4. Validate fault detection, identification, mitigation, and recovery requirements.

The following is an example error taxonomy for code-related defects (a short illustrative snippet follows the list):

  1. Algorithm or method: An error in the sequence or set of steps used to solve a particular problem or computation, including mistakes in computations, incorrect implementation of algorithms, or calls to an inappropriate function for the algorithm being implemented.
  2. Assignment or initialization: A variable or data item that is assigned a value incorrectly or is not initialized properly or where the initialization scenario is mishandled (e.g., incorrect publish or subscribe, incorrect opening of the file, etc.)
  3. Checking: Software contains inadequate checking for potential error conditions or an inappropriate response is specified for error conditions.
  4. Data: Error in specifying or manipulating data items, incorrectly defined data structure, pointer or memory allocation errors, or incorrect type conversions.
  5. External interface: Errors in the user interface (including usability problems) or interfaces with other systems.
  6. Internal interface: Errors in the interfaces between system components, including mismatched calling sequences and incorrect opening, reading, writing, or closing of files and databases.
  7. Logic: Incorrect logical conditions on if, case, or loop blocks, including incorrect boundary conditions ("off by one" errors are an example) being applied, or incorrect expression (e.g., incorrect use of parentheses in a mathematical expression).
  8. Non-functional defects: Includes non-compliance with standards, failure to meet non-functional requirements such as portability and performance constraints, and lack of clarity of the design or code to the reader - both in the comments and the code itself.
  9. Timing or optimization: Errors that will cause timing (e.g., potential race conditions) or performance problems (e.g., unnecessarily slow implementation of an algorithm).
  10. Coding standard violation: Code that does not comply with the project's coding standards; when reviewing code, verify that the applicable coding standards are met.
  11. Other: Anything that does not fit any of the above categories that is logged during an inspection of a design artifact or source code.
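To make a few of the categories above concrete, the fragment below shows hypothetical defects that a reviewer could log against the taxonomy during a code inspection; the function and data are invented purely for illustration.

```
"""Hypothetical defects annotated with the error taxonomy above (illustrative only)."""


def average_of_first_n(values, n):
    total = 0                      # Assignment/initialization: correct here, but omitting
                                   # this line would be a category-2 defect.
    for i in range(n + 1):         # Logic: off-by-one boundary condition (category 7);
                                   # should be range(n).
        total += values[i]         # Data: no check that i stays within the list (category 4
                                   # index/memory error if n exceeds len(values)).
    return total / n               # Checking: no guard for n == 0, a divide-by-zero
                                   # error condition (category 3).


# A corrected version a reviewer might suggest during rework:
def average_of_first_n_fixed(values, n):
    if n <= 0 or n > len(values):  # Checking: validate inputs before use.
        raise ValueError("n must be between 1 and len(values)")
    return sum(values[:n]) / n
```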

Checklists for software plan peer reviews should contain items that:

    1. Address that the effort, schedule, and cost estimates for each activity (e.g., development, configuration management, maintenance, assurance, safety, security) are reasonable.
    2. Address the allocations of appropriate resources, including tools and personnel with the needed skills and knowledge.
    3. Check that all risks have been identified and documented along with their probability of occurrence, impact, and mitigation strategy.
    4. Assure sufficient management of the produced project data.
    5. Check that an appropriate and feasible plan exists for sustainment and retirement of the software, if applicable.
    6. Check that the plan fulfills the corresponding NPR 7150.2 recommended contents. 

Assure that all code peer reviews have verified that the code or code changes meet the software requirements.

See SWE-134 - Safety-Critical Software Design Requirements for additional guidance associated with cyclomatic complexity assessments.
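For reference, cyclomatic complexity can be computed from a routine's control-flow graph as M = E − N + 2P (edges, nodes, connected components), or equivalently for a single routine as the number of decision points plus one. The sketch below shows both forms for a small invented example; it is illustrative only and not a substitute for the project's analysis tools.

```
"""Two equivalent ways to compute cyclomatic complexity (illustrative sketch)."""


def cyclomatic_from_graph(edges: int, nodes: int, components: int = 1) -> int:
    """M = E - N + 2P for a control-flow graph."""
    return edges - nodes + 2 * components


def cyclomatic_from_decisions(decision_points: int) -> int:
    """For a single routine, M = number of decision points (if/while/for/case) + 1."""
    return decision_points + 1


# Example routine with one if/else and one while loop -> 2 decision points.
print(cyclomatic_from_decisions(2))                 # 3
# The same routine's control-flow graph: 8 edges, 7 nodes, 1 connected component.
print(cyclomatic_from_graph(edges=8, nodes=7))      # 3
```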

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

Related Links
