SWE-093 - Analysis of Measurement Data

1. Requirements

5.4.3 The project manager shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-093 - Last used in rev NPR 7150.2D

Rev | SWE Statement

A | 4.4.4 The project shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.

Difference between A and B: No change.

B | 5.4.3 The project shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.

Difference between B and C: No change.

C | 5.4.3 The project manager shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.

Difference between C and D: No change.

D | 5.4.3 The project manager shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.

1.3 Applicability Across Classes

(Applicability table not recoverable in this export.)

2. Rationale

NASA software measurement programs are designed [297] to provide the specific information necessary to manage software products, projects, and services. Center, organizational, project, and task goals (see SWE-090 - Management and Technical Measurements) are determined in advance, and then measurements and metrics are selected (see SWE-091 - Establish and Maintain Measurement Repository) based on those goals. These software measurements are used to make effective management decisions as they relate to established goals. Documented procedures are used to calculate and analyze metrics that indicate overall effectiveness in meeting the goals.

Typically, the effectiveness of the project in producing a quality product is characterized by measurement levels associated with the previously chosen metric. The use of measurement functions and analysis procedures that are chosen in advance helps assure that Center/organizational goals are being addressed.

3. Guidance

SWE-093 requires that the collected software measurements be analyzed using the documented project-specified and Center/organizational analysis procedures. Implicit in the requirement is the need to investigate, evaluate, and select the appropriate analysis procedures and software metrics. The Software Development Plan (SDP) or Software Management Plan (see 5.08 - SDP-SMP - Software Development - Management Plan) lists software metrics as part of its content, which indicates the need to define the project's software metrics early in the software development life cycle. The evolution of the software development project and its requirements may necessitate a similar evolution in the required software measures and software metrics (see SWE-092 - Using Measurement Data). See also Topic 7.14 - Implementing Measurement Requirements and Analysis for Projects and 5.05 - Metrics - Software Metrics Report.

3.1 Metric Types

Metrics can be classified as primitive metrics (or base metrics) or as derived metrics (or computed metrics). Primitive metrics are those that can be directly measured or observed, such as program size (in source lines of code (SLOC)), number of defects observed in unit testing, or total development time for the project. Computed metrics are those that cannot be directly measured but are computed in some manner from software measures or other metrics. Examples of computed metrics are those that are commonly used to measure productivity, such as SLOC produced per person-month, or the number of defects per thousand lines of code (KSLOC). Computed metrics are combinations of software measures or other metric values and are often more valuable in understanding or evaluating the software development process than simple metrics. [252]
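The distinction between primitive and computed metrics can be sketched in a few lines of code. This is a minimal illustration: the measure names, values, and function names below are invented for the example, not NASA-defined quantities.

```python
# Hypothetical illustration: deriving computed metrics from primitive measures.
# The values and names are examples, not NASA-defined quantities.

def defects_per_ksloc(defects: int, sloc: int) -> float:
    """Derived metric: defect density per thousand source lines of code."""
    return defects / (sloc / 1000)

def sloc_per_person_month(sloc: int, person_months: float) -> float:
    """Derived metric: productivity in SLOC per person-month."""
    return sloc / person_months

# Primitive (directly measured) values:
sloc = 42_000            # program size in source lines of code
unit_test_defects = 63   # defects observed in unit testing
effort = 28.0            # total development effort in person-months

print(defects_per_ksloc(unit_test_defects, sloc))   # 1.5 defects/KSLOC
print(sloc_per_person_month(sloc, effort))          # 1500.0 SLOC/person-month
```

Note that the derived values carry interpretive weight the primitives lack: 1.5 defects/KSLOC can be compared across builds of very different sizes, while raw defect counts cannot.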

3.2 Calculating The Metric

It is important to understand that the analysis procedure defines how we are going to calculate the project's software metrics. As stated above, the primitive metrics are measured directly and their analysis procedure may consist of converting them to a simple plot, bar chart, or table. Examples of software measures or primitive metrics may include the number of lines of code reviewed during an inspection, or the hours spent preparing for an inspection meeting.

The derived metrics (usually more complex, but not always) are calculated using the analysis procedures (e.g., mathematical combinations, equations, or algorithms) of the base software measures or other derived measures. An example of a derived metric would be the peer inspection's preparation rate, modeled as the number of lines of code reviewed divided by the number of preparation and review hours.
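The preparation-rate example above can be written out directly. A minimal sketch; the function name and sample values are illustrative, not taken from a NASA standard:

```python
# Sketch of the derived "preparation rate" metric described above:
# lines of code reviewed divided by preparation and review hours.
# Names and values are illustrative only.

def preparation_rate(loc_reviewed: int, prep_hours: float, review_hours: float) -> float:
    """Derived metric: inspection preparation rate in LOC per hour."""
    total_hours = prep_hours + review_hours
    if total_hours <= 0:
        raise ValueError("preparation and review hours must be positive")
    return loc_reviewed / total_hours

print(preparation_rate(400, 3.0, 2.0))  # 80.0 LOC per hour
```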

Many analysis procedures produce metrics with an element of simplification. This is both the strength and weakness of using metrics. When we create a model to use as our analysis procedure, we need to be pragmatic. If we try to include all of the software measures that might affect the metric used to characterize the software product, the model can become so complicated that it is useless. Being pragmatic means not trying to create the most comprehensive analysis procedure. It means picking the aspects that are the most important. Remember that the analysis procedures can always be modified to include additional levels of detail in the future.

Ask yourself these questions:

    • Does the analysis procedure produce more information than we have now?
    • Is the resulting information of practical benefit?
    • Does it tell us what we want to know?
    • Does it help satisfy the goals and objectives of the software measurement program?

3.2.1 Example: Lines Of Code Metric

The importance of defining software measures and their associated analysis procedures can be illustrated by considering the lines of code metric. "Lines of Code" is one of the most used and most often misused of all software metrics. (The problems, variations, and anomalies of using lines of code were well documented many years ago [236].) There is no industry-accepted standard for counting lines of code. Therefore, if you are going to use a metric based on lines of code, a specific measurement method must be defined. A description of this method should be included in all reports and analyses so that stakeholders (customers and managers) understand the definition of the metric. Without this, invalid comparisons with other data are almost inevitable. [137]
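Because no industry standard exists, the counting rules themselves become part of the metric definition. The sketch below shows one possible project definition (physical lines, excluding blanks and full-line comments); these rules are an assumption for illustration, and any report using the resulting metric should state its rules explicitly.

```python
# Illustrative SLOC counter. The counting rules (physical lines, excluding
# blanks and full-line '#' comments) are one possible project definition,
# not an industry standard.

def count_sloc(source: str) -> int:
    """Count physical, non-blank, non-comment source lines."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            count += 1
    return count

example = """\
# module header comment
import math

def area(r):
    return math.pi * r ** 2
"""
print(count_sloc(example))  # 3
```

A counter that instead tallied logical statements, or included comments, would report materially different numbers for the same source, which is exactly why the definition must accompany the metric.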

3.3 Analysis Procedure

There are two basic approaches for selecting an analysis procedure to use for producing the software metric(s): (1) use an existing model, or (2) create a new one. In many cases, there is no need to "re-invent the wheel."

Use An Existing Model

Many software analysis procedures exist that other organizations have used successfully. These are documented in the current literature and prior NASA software development projects. With a little research, you can identify many candidate analysis procedures that require little or no adaptation to match your own project needs and environments. For example, review the material in the NASA Software Measurement Guidebook [329] to see previously used software metrics. The analysis procedures to develop these metrics are typically straightforward.

Create A Model

The second method is to create your own model. The best advice here is to talk to the people who are responsible for the software product or the software development processes and procedures. They are the experts. They know what factors are important. With their assistance, the key software measures, analysis procedures, and resulting software metrics you will need to support and meet the project's specific measurement objectives will become apparent (see SWE-090 - Management and Technical Measurements). If you create a new analysis procedure for calculating the project's software metrics, you need to ensure the analysis procedure is intelligible to your customers and management chain. You must also prove it is a valid analysis procedure for what you are trying to measure. Sometimes this validation can occur only through the application of statistical techniques [137] or by application to previous software development projects in your local environment.

Good metrics facilitate the development of models that are capable of predicting process or product parameters, not just describing them. 

3.4 Metrics Characteristics

As you analyze the collected software measurement data, keep in mind that ideal metrics are:

  • Traceable to an organizational or project objective.
  • Simple, precisely definable.
  • Objective.
  • Easily obtainable (i.e., at a reasonable cost).
  • Valid (the metric effectively measures what it is intended to measure).
  • Robust (the metric is relatively insensitive to insignificant changes in the process or product).

3.5 Metric Attributes

Good metrics may have different types of attributes:

  • Control type: Used to monitor the software processes, products, and services; and identify areas where corrective or management action is required.
  • Evaluate type: Used to examine and analyze the measurement information as part of the decision-making processes.
  • Understand and predict type: Used to predict the future status of the software development activity; adds to the level of confidence in the quality of the software product.

3.6 Metric Analysis

When analyzing the collected software measures, ask the following:

  • Are the software measurement sets complete?
  • Is the data objective or subjective?
  • What is the integrity and accuracy of the data?
  • How stable is the production process being measured?
  • What is the variation in the data set?
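The stability and variation questions above can be probed with simple descriptive statistics before drawing conclusions from a data set. A minimal sketch using Python's standard library; the data series and the two-standard-deviation flag are assumptions for illustration, not thresholds prescribed by NPR 7150.2.

```python
# Illustrative variation check on a collected measurement series.
# The data values and the 2-sigma outlier rule are assumptions for
# this example, not prescribed thresholds.
import statistics

weekly_defect_counts = [4, 6, 5, 7, 5, 6, 14]  # hypothetical weekly data

mean = statistics.mean(weekly_defect_counts)
stdev = statistics.stdev(weekly_defect_counts)

# Flag points far from the mean; such points warrant investigation
# (data integrity problem vs. real process instability).
outliers = [x for x in weekly_defect_counts if abs(x - mean) > 2 * stdev]
print(f"mean={mean:.2f} stdev={stdev:.2f} outliers={outliers}")
```

A flagged point does not by itself indicate a process problem; it indicates a question to answer before the data are used for decisions.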

See also SWE-094 - Reporting of Measurement Analysis

3.7 Metrics For Decision Making

Software metrics can provide the information needed by engineers for technical decisions as well as information required by management [355]. According to the International Organization for Standardization (ISO) / International Electrotechnical Commission (IEC) 15939 standard, Software Engineering – Software Measurement Process, decision criteria are the thresholds, targets, or patterns used to determine the need for action or further investigation. [378]
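Decision criteria in this sense can be encoded as explicit targets and thresholds that the analysis procedure checks mechanically. This is a hypothetical sketch in the spirit of ISO/IEC 15939; the metric names, targets, and thresholds are invented for illustration.

```python
# Hypothetical decision-criteria check: each metric has a target and an
# action threshold. All names and values here are invented examples.

DECISION_CRITERIA = {
    # metric name: (target, action_threshold)
    "defects_per_ksloc": (1.0, 2.0),
    "schedule_variance_pct": (0.0, 10.0),
}

def evaluate(metric: str, value: float) -> str:
    """Classify a metric value against its documented decision criteria."""
    target, threshold = DECISION_CRITERIA[metric]
    if value > threshold:
        return "action required"
    if value > target:
        return "investigate"
    return "on target"

print(evaluate("defects_per_ksloc", 1.5))     # investigate
print(evaluate("schedule_variance_pct", 12))  # action required
```

Recording the criteria alongside the metric definitions keeps the "what triggers action" decision documented in advance rather than argued after the data arrive.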

Center and organization analysis procedures may include reporting and distribution functions. This includes defining the report format (tables, trend lines, bar graphs), data extraction and reporting cycle (dates, triggers for collection, exception basis), reporting mechanisms (hard copy, online Database Management System (DBMS)), distribution (email blasts, management chain), and availability. Software measurement data collection cycles may or may not be the same as the data reporting cycles. Alternatively, the data collection and storage procedures may capture the descriptions of these reporting and distribution functions (see SWE-092 - Using Measurement Data).

3.8 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.9 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. [197]

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

4. Small Projects

While SWE-091 - Establish and Maintain Measurement Repository may allow limits on the type and interval of measures to be recorded, the project still needs to collect and analyze the selected software measurement data to develop the key software metrics. Using previously defined analysis procedures can help a project reduce the time and effort needed to develop procedures. Also, certain development environments, such as JIRA (see section 5.2, Tools) and associated plug-ins, or configuration management systems, can help automate the collection and distribution of information associated with the analysis of development metrics.

5. Resources

5.1 References

5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following related to capturing software measures in support of Center/organizational needs:

  • Know How Your Software Measurement Data Will Be Used, Lesson No. 1772 [567]: Before the Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The cost office input this data into its parametric cost estimating model. The project provided qualitative parameters that were subject to misinterpretation, and it provided physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate.

Before submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-093 - Analysis of Measurement Data
5.4.3 The project manager shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm software measurement data analysis conforms to documented analysis procedures.

2. Analyze software assurance measurement data.

7.2 Software Assurance Products

  • SA metrics analysis results relating to software meeting or exceeding requirements, including any risks or issues.

    Objective Evidence

    • Software measurement or metric data
    • Trends and analysis results on the metric set being provided
    • Status presentation showing metrics and trending data
    • Software assurance audit reports on software metric processes

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., a risk log).
    • Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
    • A status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
    • Signatures on SA-reviewed or witnessed products or activities.
    • A status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a "short summary" as objective evidence of a confirmation are:
      • To confirm that "IV&V Program Execution exists," the summary might be: the IV&V Plan is in draft state and is expected to be complete by (some date).
      • To confirm that "traceability between software requirements and hazards with SW contributions exists," the summary might be: x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • Measures relating to status and performance as identified in other requirements. (Schedule deviations, closure of corrective actions, product and process audit results, peer review results, etc.)

See also Topic 8.18 - SA Suggested Metrics

7.4 Guidance

Task 1

Software assurance reviews the software development plan/software management plan or the measurement plan to confirm that the project has chosen documented analysis procedures from an Agency, Center, or project library, or has developed project-specific analysis procedures for the measures it has chosen to collect. Confirm that the software measurement analysis the project has done on its collected measures has followed the documented project analysis procedures. When the project's measures exceed a documented threshold, verify that the project has examined the potential root causes of the variation and has chosen a corrective action to prevent further problems.

Task 2

Software assurance takes the software assurance measurement data and analyzes it using the software assurance measurement analysis procedures documented in the software assurance plan. Measurement trends, potential root causes of problem areas, and potential corrective actions should receive special attention. An in-depth analysis of the data and trends should be done to understand the causes of any undesirable trends or indicators; understanding these causes is key to determining how to make corrections and improve software assurance performance. For example, if the charts of software assurance activities performed versus software assurance activities planned show that many planned activities have not been performed on schedule, there could be many reasons why (not enough staff to perform all planned activities, the project is behind schedule so planned activities could not be completed, software assurance was focused on unplanned work, etc.). Based on the analysis, corrective actions can be planned to improve assurance work. More information on analyzing the measurement data can be found in the guidance for this software requirement.
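The planned-versus-performed example above can be tracked with a short script. This is an illustrative sketch: the monthly counts and the 80 percent completion flag are assumptions invented for the example, not values from NASA-STD-8739.8B.

```python
# Illustrative planned-vs-performed tracking for software assurance
# activities. The monthly counts and the 80% completion flag are
# assumptions invented for this example.

planned_per_month = [10, 10, 12, 12]   # hypothetical planned SA activities
performed_per_month = [10, 9, 8, 7]    # hypothetical completed SA activities

flagged_months = []  # months whose completion ratio falls below 80%
for month, (planned, performed) in enumerate(
        zip(planned_per_month, performed_per_month), start=1):
    ratio = performed / planned
    if ratio < 0.8:
        flagged_months.append(month)
    print(f"month {month}: {performed}/{planned} = {ratio:.0%}")

print("investigate root causes for months:", flagged_months)
```

The flag only identifies where to look; the root-cause analysis (staffing, schedule slips, unplanned work) still has to be done by the assurance team.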

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:
