
This version of the SWEHB is associated with NPR 7150.2B. A later version of the SWEHB based on NPR 7150.2C is available.

SWE-018 - Software Activities Review

1. Requirements

3.3.2 The project manager shall regularly hold reviews of software activities, status, and results with the project stakeholders and track issues to resolution.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

This requirement applies to all classes and safety criticalities except:

  • Class E
  • Class H

Class         A   B   C   CSC   D   DSC   E   F   G   H

Applicable?   ✓   ✓   ✓   ✓     ✓   ✓     ✗   ✓   ✓   ✗

Key: ✓ - Applicable | ✗ - Not Applicable
A & B = Always Safety Critical; C & D = Not Safety Critical; CSC & DSC = Safety Critical; E - H = Never Safety Critical.

2. Rationale

Regular reviews of software status and activities assure that all needed tasks and deliverables are managed and achieved. The technical reviews and assessments are used to monitor the progress of the software technical effort and provide software product status information. A key aspect of the technical assessment process is the conduct of life-cycle and technical reviews throughout the software life cycle. These reviews provide a periodic assessment of the program's or project's software technical and progress status and health at key points in the life cycle.

3. Guidance

Each program and project will perform the life-cycle reviews required by their governing project management NPR, applicable Center practices, and the applicable requirements. Life-cycle reviews are event based and occur when the entrance criteria for the applicable review are satisfied. They should occur based on the maturity of the relevant technical baseline rather than on calendar milestones (e.g., a quarterly progress review or yearly summary). The software and project management technical team needs to develop and document plans for life-cycle and technical reviews for use in the software project planning process. The life-cycle and technical review schedule should be documented or recorded in a planning document or record.
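The event-based gating described above can be sketched as a simple check: hold the review only when every entrance criterion is satisfied, regardless of the calendar. The criterion names below are illustrative assumptions for the sketch, not items drawn from NPR 7150.2 or NPR 7123.1:

```python
# Hypothetical sketch: gate a life-cycle review on its entrance criteria
# rather than on a calendar date. Criterion names are illustrative only.

def review_ready(entrance_criteria: dict[str, bool]) -> bool:
    """A review is held only when every entrance criterion is satisfied."""
    return all(entrance_criteria.values())

pdr_criteria = {
    "software requirements baselined": True,
    "preliminary architecture documented": True,
    "draft software test plan available": False,  # not yet satisfied
}

if review_ready(pdr_criteria):
    print("Schedule the review.")
else:
    unmet = [name for name, met in pdr_criteria.items() if not met]
    print("Hold the review when these criteria are met:", unmet)
```
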

Regular reviews are those held on a scheduled basis throughout the software development life cycle. These may be weekly, monthly, or quarterly, or as needed, depending on the size and scope of the software development activity. They include the software reviews conducted to assess project status as well as the major software technical reviews that occur at various phases of the project. 7.8 - Maturity of Life-Cycle Products at Milestone Reviews provides a chart that summarizes the current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews. This chart serves as guidance only, and NASA Center procedures take precedence for projects at those Centers. The chart was constructed using the software engineering products from NPR 7150.2, the project life cycle reviews from NPR 7123.1, previous work from the NASA Software Working Group to map products to life cycle reviews, and additional information gathered from these NPRs, NPR 7120.5, and individual NASA Center procedures.

Regular software reviews cover details of work in software planning, requirements development, architecture, detailed design, coding and integration, testing plans, testing results, and overall readiness for flight. The individual project or development activity determines the specific content of each review, taking into account where the activities currently stand in the software development life cycle and the specific project needs. However, the major technical reviews that include software often must show evidence that entrance and exit (success) criteria have been satisfied. See 7.9 - Entrance and Exit Criteria for a listing of potential criteria to use in reviews. Review and status planning should take into consideration the results of the software classification level process (see SWE-020) and the safety criticality determination process (see SWE-133) when choosing the review frequency.
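One way to act on the last point is a simple lookup that maps the software classification and safety criticality determination to a nominal review cadence. The cadences below are assumptions chosen for the sketch, not values prescribed by NPR 7150.2; a real project would set them in its planning documents:

```python
# Illustrative only: derive a nominal review cadence from software class
# and safety criticality. Cadence values are assumptions for this sketch.

def review_frequency(software_class: str, safety_critical: bool) -> str:
    # Classes A and B are always safety critical per the applicability key.
    if safety_critical or software_class in ("A", "B"):
        return "weekly"
    if software_class in ("C", "D"):
        return "monthly"
    return "quarterly"

print(review_frequency("A", safety_critical=True))   # weekly
print(review_frequency("C", safety_critical=False))  # monthly
```
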

The evaluation of metrics developed from a software measures process (see SWE-091, SWE-092, SWE-093, and SWE-094) and the assessment of milestone status provide quantitative determinations of the work progress. Risk identification and mitigation, safety, and problem identification and resolution are all parts of the regular reviews. Risk identification (see SWE-086) and mitigation efforts are tracked in a controlled manner, whether in a database tool or in a software package written for risk management.
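A minimal sketch of such a quantitative determination, in the spirit of the measures process (SWE-091 through SWE-094), is computing milestone completion from status data. The milestone names, field names, and dates are illustrative assumptions:

```python
# Hypothetical sketch: derive a quantitative progress figure from milestone
# status records. All milestone names and dates are illustrative.

milestones = [
    {"name": "requirements baseline", "planned": "2024-01", "complete": True},
    {"name": "design review",         "planned": "2024-03", "complete": True},
    {"name": "code complete",         "planned": "2024-06", "complete": False},
]

completed = sum(1 for m in milestones if m["complete"])
progress = completed / len(milestones)
print(f"Milestone progress: {progress:.0%}")  # Milestone progress: 67%
```
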

Issues identified during a regular review that cannot be closed at the review are documented and tracked until they are officially dispositioned and/or closed. The issues can be tracked in a suitable tool, such as a risk management system, a configuration management system, or a problem reporting and corrective action (PRACA) system. The configuration management and control system selected for the project is documented in the Configuration Management Plan, which records the methods and tools used for tracking the issues to closure (see SWE-079 and SCMP).
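The track-to-closure idea can be sketched with a simple in-memory record that enforces the open, dispositioned, closed progression; a real project would use a risk management, configuration management, or PRACA system as noted above, and the identifier and states here are illustrative assumptions:

```python
# Minimal sketch of tracking a review issue to closure. The status names
# and identifier format are assumptions, not from any NASA tool.
from dataclasses import dataclass, field

@dataclass
class ReviewIssue:
    identifier: str
    description: str
    status: str = "open"                  # open -> dispositioned -> closed
    history: list[str] = field(default_factory=list)

    def disposition(self, decision: str) -> None:
        self.status = "dispositioned"
        self.history.append(f"dispositioned: {decision}")

    def close(self) -> None:
        if self.status != "dispositioned":
            raise ValueError("an issue must be dispositioned before closure")
        self.status = "closed"
        self.history.append("closed")

issue = ReviewIssue("RV-042", "test plan missing off-nominal cases")
issue.disposition("accepted; test plan to be updated before test readiness review")
issue.close()
print(issue.status)  # closed
```
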

The progress between life-cycle phases is marked by key decision points (KDPs). At each KDP, software engineering and project management examines the maturity of the technical aspects of the project. For example, management examines whether the resources (staffing and funding) are sufficient for the planned technical effort, whether the technical maturity has evolved, what the technical and nontechnical internal issues and risks are, and whether the stakeholder expectations have changed.

The interpretation of the term ‘stakeholder’ for this requirement can be taken to include representatives from the following organizations:

  • Quality assurance.
  • Systems engineering.
  • Independent testing.
  • Operations.
  • Independent Verification and Validation.
  • Project and/or Engineering management.
  • Other organizations performing project activities.

However, other external stakeholders at the program or project level (e.g., principal investigators, the science community, the technology community, the public, the education community, and/or a Mission Directorate sponsor) are not included on a regular basis at the internal reviews held to satisfy this requirement. In contrast to the relevant or internal stakeholders, external project stakeholders generally participate in just the major milestone reviews. Stakeholders typically are those materially affected by the outcome of a decision or a deliverable. In the context of software work product development, a stakeholder may be inside or outside of the organization doing the work. The key reason for holding regular reviews with project stakeholders is to keep them and their organizations informed, since they are both participants in the software work product development and advocates for the activity.

A best practice related to the involvement of stakeholders is to determine and invite the relevant stakeholders, i.e., those who should be involved in the review because they are engaged in the development and/or who have a vested interest in the work products being produced.

Additional guidance related to holding a review of software activities, status, and results may be found in the following related requirements in this Handbook:

SWE-019

Software Life Cycle

SWE-024

Plan Tracking

SWE-037

Software Milestones

SWE-087

Software Peer Reviews and Inspections for Requirements, Test Plans, Design, and Code

SWE-088

Software Peer Reviews and Inspections - Checklist Criteria and Tracking

SWE-089

Software Peer Reviews and Inspections - Basic Measurements

4. Small Projects

This requirement applies to all projects, depending on the determination of the software classification of the project (see SWE-020). A smaller project, however, may be able to hold less frequent reviews if the risk posed to the overall project or program by the software product is low. Periodic evaluations of the software classification and risk level may validate the use of less frequent reviews or suggest an increase in their frequency. 7.8 - Maturity of Life-Cycle Products at Milestone Reviews provides a chart that summarizes the current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews. This chart serves as guidance only, and NASA Center procedures take precedence for projects at those Centers.

5. Resources

5.1 Tools

Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

No tools have currently been identified for this SWE.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related or applicable to software reviews:

  1. Aero-Space Technology/X-34 In-Flight Separation from L-1011 Carrier, Lesson No. 1122: The following entry in the lessons learned database, among other things, points to a concern for holding an adequate software review of validation activities to reduce risk on a particular part of the X-34 mission. "The X-34 technology demonstrator program faces safety risks related to the vehicle's separation from the L-1011 carrier aircraft and to the validation of flight software. Moreover, safety functions seem to be distributed among the numerous contractors, subcontractors, and NASA without a clear definition of roles and responsibilities." The recommendation is that "NASA should review and assure that adequate attention is focused on the potentially dangerous flight separation maneuver, the thorough and proper validation of flight software, and the pinpointing and integration of safety responsibilities in the X-34 program."
  2. Informal Design Reviews Add Value to Formal Design Review Processes (1996), Lesson No. 0582: "A JPL study of in-flight problems on Voyager I and II, Magellan, and Galileo (up to late 1993) revealed that 40% of the problems would likely have been identified by better technical penetration in reviews of the detailed designs performed well before launch. An additional 40% (for a total of 80%) might also have been found by similarly in-depth reviews." Also, "Since formal reviews emphasize verification of the project status and attainment of milestones, their chief benefit lies in the investigative work performed in preparation for the review. In contrast, informal reviews feature detailed value-added engineering analysis, problem solving, and peer review."
  3. Project Management: Integrating and Managing Reviews, Lesson No. 1281: This lesson learned discusses some caveats to running reviews. "The Project underwent several reviews throughout its life cycle. In general, these reviews helped the Project maintain its course and manage its resources effectively and efficiently. However, at times these review groups would provide the Project with recommendations that were inconsistent with the Project's requirements. For example, these recommendations included the recommendations of other review teams, and past recommendations of their own team. The result was that the Project had to expend resources trying to resolve these conflicts or continually redraft its objectives." The recommendation is that "Projects should develop and maintain a matrix of all the reviews they undergo. This matrix should have information such as the review team roster, their scope, and a record of all their recommendations. Prior to each review this matrix should be given to the review team's chairperson so that the outcome of the review remains consistent and compatible with the Project's requirements. Senior Management should be represented at these reviews so that discrepancies between the review team and the Project can be resolved immediately."