This version of the SWEHB is associated with NPR 7150.2B. The latest version of the SWEHB is based on NPR 7150.2D.

SWE-094 - Reporting of Measurement Analysis

1. Requirements

5.4.4 The project manager shall provide access to the software measurement data, measurement analyses and software development status as requested to the sponsoring Mission Directorate, the NASA Chief Engineer, Center and Headquarters SMA, and Center repositories.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes F and G are labeled with "X (not OTS)". This means that these requirements do not apply to off-the-shelf (OTS) software for these classes.

Class: A | B | C | CSC | D | DSC | E | F | G | H

Applicable? The per-class applicability marks appear as icons in the original table and are not reproduced in this text rendering; per the note above, Classes F and G are marked "X (not OTS)".

2. Rationale

The intent of this requirement is to provide access to the software metric data during the project life cycle for those Agency- and Center-defined organizations responsible for assessing and utilizing the metric data. The software development project organizations can provide access to a number of organizations tasked to review or assess software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in a timely manner so that the information can be put to use.

NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at both the project and Center levels to satisfy organizational, project, program, and Directorate needs. Centers maintain measurement systems and record repositories that are used for records retention and for subsequent analysis and interpretation, both to identify overall levels of Center competence in software engineering and to identify future opportunities for training and software process improvement.

3. Guidance

Management metrics are measurements that help evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts of future progress, early detection of trouble, and realism checks on current plans. In addition, adjustments to software development processes can be evaluated once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center and Agency leaders for evaluating overall capabilities and for planning improvement activities and training opportunities for more advanced software process capabilities.
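
To illustrate how a metric trend supports forecasting and early trouble detection, the sketch below fits a least-squares line to weekly open-defect counts and flags the case where the projected count at an upcoming milestone exceeds the planned limit. The metric, data values, milestone, and limit are all hypothetical; a project would substitute the measures defined in its own metrics plan.

    # Minimal sketch: linear trend over a management metric (hypothetical data).
    # Flags trouble early when the projection at a milestone exceeds the plan.

    def linear_trend(samples):
        """Least-squares slope and intercept for (week, value) pairs."""
        n = len(samples)
        mean_x = sum(x for x, _ in samples) / n
        mean_y = sum(y for _, y in samples) / n
        sxx = sum((x - mean_x) ** 2 for x, _ in samples)
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in samples)
        slope = sxy / sxx
        return slope, mean_y - slope * mean_x

    # Hypothetical weekly (week, open_defect_count) samples.
    history = [(1, 12), (2, 15), (3, 19), (4, 22), (5, 27)]
    slope, intercept = linear_trend(history)

    milestone_week, plan_limit = 10, 30  # assumed plan values
    projection = slope * milestone_week + intercept
    if projection > plan_limit:
        print(f"Risk: projected {projection:.0f} open defects at week "
              f"{milestone_week} exceeds the plan limit of {plan_limit}.")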

The data gained from Center measurement programs assist in the management of projects, in assuring and improving safety and product and process quality, and in improving overall software engineering practices. In general, the project-level and Center-level measurement programs are designed to meet the following high-level goals (a sketch of a metrics record supporting these goals follows the list):

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.
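
As a concrete (and purely hypothetical) illustration, the sketch below shows one possible shape for a per-period metrics record that serves all four goals: planned-versus-actual effort for estimation and progress tracking, defect counts as a quality indicator, and a process identifier to baseline later improvement work. The field names are illustrative assumptions, not drawn from NPR 7150.2 or any Center procedure.

    # Hypothetical per-period metrics record; all field names are illustrative.
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class MetricsRecord:
        project: str
        period: str                    # e.g., an ISO week such as "2015-W32"
        planned_effort_hours: float    # supports planning and cost estimation
        actual_effort_hours: float     # supports realistic progress tracking
        defects_found: int             # quality indicator
        defects_closed: int            # quality indicator
        process_version: str           # baseline for process improvement work

    record = MetricsRecord("ExampleSat FSW", "2015-W32",
                           planned_effort_hours=400, actual_effort_hours=452,
                           defects_found=9, defects_closed=6,
                           process_version="CenterProc-SW-07 rev C")
    print(json.dumps(asdict(record), indent=2))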

Other requirements (SWE-090, SWE-091, SWE-092, and SWE-093) provide the software measurement data and the analysis methods used to produce the results accessed under this SWE-094. The information is stored in the Center repository (SWE-091) and recorded in a Software Metrics Report (see Metrics).

Software measurement data, which include software development status (see SWE-090) and measurement analyses (see SWE-093), should be accessible to the sponsoring Mission Directorate, the NASA Chief Engineer, and Center and Headquarters SMA, and should be able to be captured in Center repositories.

It is the project’s responsibility to ensure that the measurement data, analysis results, and development status are communicated properly, on time, and to the appropriate people. The project collects data and reports analysis results periodically according to established collection, reporting, and storage procedures (see SWE-090). The project also makes the information available on a regular basis (as designed) and provides access at the request of the sponsoring Mission Directorate, the NASA Chief Engineer, Center and Headquarters SMA, and personnel managing Center repositories.
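
One minimal pattern for making periodic reports both deliverable on request and capturable in a repository is to write each period's records to a delimited file at a location the Center repository ingests. The sketch below assumes a hypothetical directory layout and field set; actual formats and destinations come from the project's established procedures (see SWE-090 and SWE-091).

    # Sketch: export one reporting period's records as CSV for repository capture.
    # The directory layout and field set are assumptions, not NASA-defined.
    import csv
    from pathlib import Path

    def export_period(records, period, repo_dir="center_repo/metrics"):
        """Write one period's records where the repository can ingest them."""
        out = Path(repo_dir) / f"metrics_{period}.csv"
        out.parent.mkdir(parents=True, exist_ok=True)
        fields = ["project", "period", "planned_effort_hours",
                  "actual_effort_hours", "defects_found", "defects_closed"]
        with out.open("w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=fields, extrasaction="ignore")
            writer.writeheader()
            writer.writerows(records)
        return out

    rows = [{"project": "ExampleSat FSW", "period": "2015-W32",
             "planned_effort_hours": 400, "actual_effort_hours": 452,
             "defects_found": 9, "defects_closed": 6}]
    print("Period report written to", export_period(rows, "2015-W32"))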

NASA-specific software measurement access information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

Additional guidance related to software measurements may be found in the following related requirements in this Handbook:

  • SWE-090
  • SWE-091
  • SWE-092
  • SWE-093

4. Small Projects

No additional guidance is available for small projects.  

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list,” nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to access to measurement data:

  • Know How Your Software Measurement Data Will Be Used, Lesson No. 1772: "Prior to Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The IPAO input this data to their parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation, and provided physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate."

    The Recommendation states that "Prior to submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's." 567 (A brief sketch contrasting physical and logical SLOC counts appears after this list.)
  • Selection and use of Software Metrics for Software Development Projects, Lesson No. 3556: "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

    "Metrics or measurements are used to provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics: 577
    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software."


