

SWE-096 - Directorate Measurement Objectives

1. Requirements

4.4.7 Each NASA Mission Directorate shall identify and document the specific measurement objectives, the chosen specific measures, the collection procedures, and storage and analysis procedures.

1.1 Notes

NPR 7150.2 does not include any notes for this requirement.

1.2 Applicability Across Classes

Class:        A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H

Applicable?:       |       |      |       |      |       |      |       |      |       |   |   |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | - Applicable | - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures


2. Rationale

Each Mission Directorate develops its own software measurement system (see [SWE-095]) and tailors it to its needs and objectives. The system is based on an understanding of the development environment used for executing its programs and projects. Increased understanding of software progress leads to better oversight of projects. Mission Directorate-specified analysis procedures allow the synthesis of results across Centers, across performing organizations, and within programs composed of one or more projects.

While much of the software measurement data and the checking and validation activities may be captured and performed at the project level, the information that is transmitted to the Mission Directorate is more readily used if it is provided in specified formats that allow the Mission Directorate to satisfy its review and oversight needs. Subsequent evaluation and interpretation of these software measurements enable the Mission Directorate to assess its current software status and engineering capabilities of providers for future work.


3. Guidance

Fortunately, there are many resources to help a Mission Directorate develop the software measurement system it needs (see [SWE-095]). The NASA "Software Measurement Guidebook" 1 is aimed at helping organizations begin or improve a measurement program. The Software Engineering Institute at Carnegie Mellon University has detailed specific practices for measurement and analysis within its CMMI-Dev 2, Version 1.3 model. The Software Technology Support Center (STSC), at Hill Air Force Base, has its "Software Metrics Capability Evaluation Guide" 3. Other resources are suggested in section 5 below.

A Mission Directorate organization will establish a software measurement program for many reasons. These range from having good management information for guiding software development to carrying out research toward the development of an innovative advanced technique. However, NASA 1 has shown that the three key reasons for software measurement are to

  1. Understand and model software engineering processes and products
  2. Aid in the management of projects
  3. Guide improvements in software engineering processes

The Mission Directorate plan will settle on and document the goals and objectives specifically chosen for its programs and projects. The collection of data measures is designed and dedicated to support the formation and analysis of metrics that describe the state and quality of the activities employed to execute the Mission Directorate's projects and programs. To emphasize this point, a quote from the CMMI-Dev document is cited here: "The measurement activities should support information needs at multiple levels including the business, organizational unit, and project to minimize re-work as the organization matures." 2 The Mission Directorate can define its plan to be applicable to all activities of software development, while at the same time addressing the specific nature of each project. The Mission Directorate will typically derive its measurement objectives from management, technical, project, software product, or software process improvement needs. These objectives can be expected to evolve as the goals and objectives of the Mission Directorate change.

The following information is a brief synopsis of the planning, collection, storage and analysis activities to be performed by the Mission Directorates.

The guidance information presented in [SWE-092], [SWE-093], [SWE-094], and [SWE-095] for Center and project software measurement activities is meant to reflect the chosen Mission Directorate goals and objectives.

The general approach and primary steps for the Mission Directorate and the subsequent project level software measurement are very similar, with only the major objectives and goals expected to be unique. The program and mission support offices at the Centers contribute to many Mission Directorate needs for leading indicators with assessments of their internal capabilities against the specified program and project goals and desired outcomes.

Proper execution of data collection necessitates an agreed-to plan for the software measurement program. The plan is based on the evaluation and interpretation of specified software measures that have been captured, validated, and analyzed according to the approved procedures in the software measurement plan. The activities begin with clear information that includes:

  • A description of all data to be provided
  • A precise definition of terms
  • A description of responsibilities for data provision and analyses
  • A description of time frames and responsibilities for reception of the data and analyses being provided
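The plan elements above can be captured in a simple structured form. The sketch below is illustrative only; the field names and example values are assumptions, not a NASA-defined schema.

```python
from dataclasses import dataclass

# Hypothetical record for one entry in a measurement plan: what data is
# provided, how the term is defined, who provides and analyzes it, and when.
@dataclass
class MeasureDefinition:
    name: str        # precise term for the measure, e.g. "logical SLOC"
    definition: str  # unambiguous definition of how the measure is counted
    provider: str    # responsibility for providing the data
    analyst: str     # responsibility for performing the analysis
    due: str         # time frame for reception of the data and analyses

plan = [
    MeasureDefinition(
        name="logical SLOC",
        definition="Count of logical source statements per project counting rules",
        provider="Project software lead",
        analyst="Directorate measurement office",
        due="monthly",
    ),
]

for m in plan:
    print(f"{m.name}: provided by {m.provider}, analyzed by {m.analyst} ({m.due})")
```

A shared definition like this is what prevents the physical-versus-logical SLOC confusion described in the Lessons Learned section.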

The data collection efforts by the Centers, projects, and software development teams can produce more accurate results if the time to collect the data is minimized. If the data collectors and analysis providers see this as a non-value added task, data collection will become sporadic and analysis quality will suffer.

Activities within the data storage procedure include:

  • A description of the checking, validation, and verification of the quality of the data sets collected - this includes checks for proper formats, logical consistency, missing entries, repetitive entries, and typical value ranges (expect these to be oriented toward validation at the set level, since individual data checking and validation should already have been performed at the project level).
  • A description of what data sets, or intermediate analyses will be made and kept, or made and discarded (assuming the analyses can be reconstructed if needed) - this includes a listing of requested analyses by Mission Directorate stakeholders, lists of continuing metrics, and descriptions of changes to the analyses to reflect advances in the software development life cycle.
  • The identification of a proper storage system, and site and management steps to access, use, and control the data and its appropriate data base management system - the use of a DBMS allows multiple projects and organizations to access the data in a format that supports their specific or organizational objectives.
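The set-level checks in the first bullet above can be sketched as a single validation pass. This is a minimal illustration; the field names, records, and the "typical" value range are invented for the example.

```python
# Hypothetical monthly data set submitted to a Mission Directorate.
records = [
    {"project": "A", "month": "2024-01", "defects": 4},
    {"project": "A", "month": "2024-01", "defects": 4},     # repetitive entry
    {"project": "B", "month": "2024-01", "defects": None},  # missing entry
    {"project": "C", "month": "2024-01", "defects": 9999},  # outside typical range
]

def check_data_set(records, typical_range=(0, 500)):
    """Flag missing entries, repetitive entries, and atypical values in a set."""
    problems = []
    seen = set()
    lo, hi = typical_range
    for i, rec in enumerate(records):
        key = (rec["project"], rec["month"])
        if key in seen:
            problems.append((i, "repetitive entry"))
        seen.add(key)
        if rec["defects"] is None:
            problems.append((i, "missing entry"))
        elif not lo <= rec["defects"] <= hi:
            problems.append((i, "value outside typical range"))
    return problems

for index, issue in check_data_set(records):
    print(index, issue)
```

Checks like these run at the set level; individual-record validation is assumed to have happened at the project level, as the bullet notes.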

Metrics (or indicators) are computed from measures using the Mission Directorate's analysis procedures. They are quantifiable indices used to compare software products, processes, or projects, or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric. They also show containment or breaches of pre-established limits, such as allowable latent defects.
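The trend and limit-containment behavior described above reduces to two comparisons. The sketch below is illustrative; the metric values and the allowable-latent-defect limit are invented for the example.

```python
def trend(current, previous):
    """Direction of a metric, relative only to its previous value."""
    if current > previous:
        return "increasing"
    if current < previous:
        return "decreasing"
    return "flat"

def within_limit(value, limit):
    """Containment check against a pre-established limit."""
    return value <= limit

latent_defects = [3, 5, 4]  # successive values of one metric (assumed data)
limit = 6                   # allowable latent defects (assumed limit)

for prev, cur in zip(latent_defects, latent_defects[1:]):
    status = "contained" if within_limit(cur, limit) else "breached"
    print(trend(cur, prev), status)
```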

Note: Management metrics are measurements that help evaluate how well software development activities are performing across multiple Centers, development organizations, programs or projects. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in plan adjustments of all candidate approaches that are quantified and analyzed.

Additional guidance related to software measurement determination, collection, analysis, and reporting may be found in the following related requirements in this handbook that are written from the project development point of view.

[SWE-090] Measurement Objectives
[SWE-091] Software Measurement Areas
[SWE-092] Measurement Collection and Storage
[SWE-093] Analysis of Measurement Data
[SWE-094] Reporting of Measurement Analysis
[SWE-095] Directorate Measurement System
[SWE-117] Software Metrics Report


4. Small Projects

This SWE requirement indirectly applies to projects, in that a project's software data feeds into the Mission Directorate's measurement system. The amount of data needed by Mission Directorates to monitor software risks on small projects is likely to be considerably less than for larger projects.


5. Resources

  1. "Software Measurement Guidebook," NASA-GB-001-94, NASA, 1995.
  2. "CMMI for Development, Version 1.3," CMU/SEI-2010-TR-033, Software Engineering Institute, 2010.
  3. "Software Metrics Capability Evaluation Guide," Software Technology Support Center (STSC), Hill Air Force Base, 1995.
  4. "Boeing Houston Site Metrics Manual," HOU-EGM-308, January 17, 2002.
  5. Westfall, Linda. "12 Steps to Useful Software Metrics," The Westfall Team, 2005.
  6. "Software Metrics," SEI Curriculum Module SEI-CM-12-1.1, Software Engineering Institute, 1988.

6. Lessons Learned

Consider the following lessons learned when capturing software measures in support of Center/organizational needs:

  1. Know How Your Software Measurement Data Will Be Used, Lesson No. 1772: Prior to the Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The IPAO input this data to their parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation, along with physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate.

    Prior to submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's.

    See the following for more information: http://www.nasa.gov/offices/oce/llis/1772.html
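The arithmetic behind the roughly 50 percent overestimate can be illustrated with a toy linear cost model. The SLOC figure, the physical-to-logical counting ratio, and the cost coefficient below are all assumptions chosen to show the mechanism, not MSL data.

```python
# Assumed inputs for illustration only.
physical_sloc = 300_000          # what the project reported (physical count)
physical_per_logical = 1.5       # assumed ratio of physical to logical SLOC
cost_per_logical_sloc = 1.0      # toy cost-model coefficient, arbitrary units

# Correct use: convert to logical SLOC before feeding the model.
logical_sloc = physical_sloc / physical_per_logical
correct_cost = cost_per_logical_sloc * logical_sloc

# Misinterpretation: physical SLOC fed into the model as if it were logical.
misread_cost = cost_per_logical_sloc * physical_sloc

overestimate_pct = 100 * (misread_cost / correct_cost - 1)
print(f"overestimate: {overestimate_pct:.0f}%")  # 50% at a 1.5 ratio
```

With a 1.5 physical-to-logical ratio, the misread estimate comes out 50 percent high, matching the magnitude of the discrepancy described in the lesson.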