SWE-091 - Measurement Selection

1. Requirements

4.4.2 The project shall select and record the selection of specific measures in the following areas:

a. Software progress tracking.
b. Software functionality.
c. Software quality.
d. Software requirements volatility.
e. Software characteristics.

1.1 Notes

The requirement for a Software Metrics Report is defined in Chapter 5 of NPR 7150.2, NASA Software Engineering Requirements, Section 5.3.1 (also see SWE-117).

1.2 Applicability Across Classes

  • Classes C-E and Safety Critical are labeled "P (Center) +SO." This means that this requirement applies to the safety-critical aspects of the software and that an approved Center-defined process which meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.
  • Class C and Not Safety Critical and Class G are labeled with "P (Center)." This means that a Center-defined process which meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.
  • Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf (OTS) software for these classes.

Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G    | H
Applicable? |      |       |      |       | X    | P(C)  | X    |       | X    |       | X | P(C) |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | - Applicable | - Not Applicable
X - Applicable with details, read above for more | P(C) - P (Center), follow Center requirements or procedures

2. Rationale

Numerous years of experience on many NASA projects show three key reasons for software measurement activities 329:

  1. To understand and model software engineering processes and products.
  2. To aid in assessing the status of software projects.
  3. To guide improvements in software engineering processes.

Measures are established with these reasons in mind and are tailored to achieve specific project and/or Directorate goals. In general, measures are designed to achieve specific, time-based goals, which are usually derived from the following general statements:

  • To provide realistic data for progress tracking.
  • To assess the software's functionality when compared to the requirements and the user's needs.
  • To provide indicators of software quality that give confidence in the final product.
  • To assess the volatility of the requirements throughout the life cycle.
  • To provide indicators of the software's characteristics and performance when compared to the requirements and the user's needs.
  • To improve future planning and cost estimation.
  • To provide baseline information for future process improvement activities.

The measurement areas chosen to provide specific measures are closely tied to the NASA measurement objectives listed above and in NPR 7150.2. Specific example measurements are listed in the notes of SWE-117; they were chosen based on information needs and corresponding questions identified during several NASA software measurement workshops.

3. Guidance

Many resources exist to help a Center develop a software measurement program. The NASA "Software Measurement Guidebook" 329 is aimed at helping organizations begin or improve a measurement program. The Software Engineering Institute 327 at Carnegie Mellon University has detailed specific practices for measurement and analysis within its CMMI-Dev 157, Version 1.3, model. NASA resources that will help with the selection of measures for your project are the "Software Metrics Selection Presentation" 316 and the "Project-Type/Goal/Metric Matrix" 089. The Software Technology Support Center (STSC) at Hill Air Force Base has its "Software Metrics Capability Evaluation Guide" 336. Other resources are suggested in the Resources section of this SWE.

Each organization or Mission Directorate develops its own measurement program that is tailored to its needs and objectives, and is based on an understanding of its unique development environment (see SWE-095 and SWE-096). Once a manager has the ability to track actual project measures against planning estimates, any observed differences are used to evaluate the status of the project and to support decisions to take corrective actions. The SEPG (Software Engineering Process Group) can also use this data to improve the software development processes. The manager may also consider comparing actual measures to established norms or benchmarks, either from the Center measurement program or from industry, for other possible insights.

When choosing project measures, check to see if your Center has a pre-defined set of measurements that meets the project's objectives. If so, then the specification of measures for the project begins there. Review the measures specified and initially choose those required by your Center. Make sure they are all tied to project objectives or are measures that are required to meet your organization's objectives.

To determine if any additional measures are required or if your Center does not have a pre-defined set of measures, think about the questions that need to be asked to satisfy project objectives. For example, if an objective is to complete on schedule, the following might need to be asked:

  • How long is the schedule?
  • How much of the schedule has been used and how much is left?
  • How much work has been done? How much work remains to be done?
  • How long will it take to do the remaining work?

From these questions, determine what needs to be measured to get the answers to key questions. One possible set of measures for the above set of questions is "earned value" (the sum of the budgeted costs for the tasks and products that have actually been produced, completed or in progress, at a given time in the schedule). Similarly, think about the questions that need to be answered for each objective and see what measures will provide the answers. If several different measures will provide the answers, choose the measures that are already being collected or those that are easily obtained from tools.
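
For illustration, the short Python sketch below computes earned value, planned value, and a schedule performance index (SPI) from a handful of tasks. The task names, budgets, and completion percentages are hypothetical, not a NASA template; they simply show how the definition above turns into numbers.

    # Minimal earned-value sketch (hypothetical task data).
    # EV = sum of budgeted cost for work actually produced (completed or in progress);
    # PV = sum of budgeted cost for work planned to be done by the status date.
    tasks = [
        # (task, budgeted_cost, planned_fraction_by_now, actual_fraction_complete)
        ("Design CSCI 1",    40.0, 1.00, 1.00),
        ("Code CSCI 1",      80.0, 0.50, 0.25),
        ("Unit test CSCI 1", 30.0, 0.00, 0.00),
    ]

    planned_value = sum(cost * planned for _, cost, planned, _ in tasks)
    earned_value  = sum(cost * actual for _, cost, _, actual in tasks)

    schedule_variance = earned_value - planned_value            # negative => behind schedule
    spi = earned_value / planned_value if planned_value else 0  # Schedule Performance Index

    print(f"PV={planned_value:.1f}  EV={earned_value:.1f}  "
          f"SV={schedule_variance:.1f}  SPI={spi:.2f}")

An SPI below 1.0 means less work has been earned than was planned by this point in the schedule, which feeds directly into the "how long will it take to do the remaining work" question above.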

The presentation, "Software Metrics Selection Presentation" 316, gives a method for choosing project measures and provides a number of examples of measurement charts, with information showing how the charts might be useful for the project. The "Project-Type/Goal/Metric Matrix" 089 is also a matrix developed following the series of NASA software workshops at Headquarters that might be helpful in choosing the project's measures. This matrix specifies the types of measures a project might want to collect to meet a particular goal, based on project characteristics, such as size.

The measurements need to be defined so project personnel collect data items consistently. The measurement definitions are documented, along with the measurement objectives, in the project Software Management Plan (see SWE-102) or the Software Metrics Report (see SWE-117). Items to be included as part of a project's measurement collection and storage procedure (see the sketch following this list) are:

  • A clear description of all data to be provided.
  • A clear and precise definition of terms.
  • Who is responsible for providing which data.
  • When and to whom the data are to be provided.
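
One way to make these items concrete is to record each measure in a small structured form. The Python sketch below is illustrative only; the field names and the sample requirements-volatility entry are assumptions, not a mandated format.

    from dataclasses import dataclass

    # Illustrative measurement-definition record covering the items above:
    # what data are provided, how terms are defined, who provides the data,
    # and when and to whom they are provided.
    @dataclass
    class MeasureDefinition:
        name: str          # clear description of the data to be provided
        definition: str    # precise definition of terms and counting rules
        provider: str      # who is responsible for providing the data
        frequency: str     # when the data are to be provided
        recipient: str     # to whom the data are to be provided

    requirements_volatility = MeasureDefinition(
        name="Requirements volatility",
        definition="Requirements added, deleted, or modified per month, "
                   "divided by the number of baselined requirements.",
        provider="Requirements lead",
        frequency="Monthly, at the software status review",
        recipient="Software project manager",
    )
    print(requirements_volatility)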

Data collection by the software development team works better if the time the team spends collecting the data is minimized. If the software developers see this as a non-value-added task, data collection will become sporadic, affecting data quality and usefulness. Some suggestions for specifying measures:

  • Don't collect too many measures. Be sure the project is going to use the measures.
  • Think about how the project will use them. Visualize the way charts look to best communicate information.
  • Make sure measures apply to project objectives (or are being provided to meet sponsor or institutional objectives).
  • Consider whether suitable measures already exist or whether they can be collected easily. The use of tools that automatically collect needed measures helps ensure consistent, accurate collection.

Issue tracking tools (e.g., JIRA) can be used to track and report measures and to provide status reports of all open and closed issues, including their position in the issue tracking life cycle. This can be used as a measure of progress.
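
As one example of automated collection, the Python sketch below pulls open and closed issue counts from a JIRA server as a simple progress measure. The server URL, project key, credentials, and JQL filters are placeholders, and the details of the /rest/api/2/search endpoint can vary with the JIRA version deployed at your Center, so treat this as a sketch rather than a drop-in script.

    import requests

    JIRA_URL = "https://jira.example.nasa.gov"   # hypothetical server
    AUTH = ("username", "api-token")             # placeholder credentials

    def issue_count(jql: str) -> int:
        """Return the number of issues matching a JQL query (maxResults=0 requests only the count)."""
        resp = requests.get(
            f"{JIRA_URL}/rest/api/2/search",
            params={"jql": jql, "maxResults": 0},
            auth=AUTH,
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["total"]

    open_issues = issue_count('project = "FSW" AND statusCategory != Done')
    closed_issues = issue_count('project = "FSW" AND statusCategory = Done')
    total = open_issues + closed_issues
    if total:
        print(f"open={open_issues}  closed={closed_issues}  "
              f"closed so far: {100 * closed_issues / total:.1f}%")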

Static analysis tools (e.g., Coverity, CodeSonar) can provide measures of software quality and identify software characteristics at the source code level.
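
For illustration, the Python sketch below turns a static-analysis findings export into two simple quality measures: a count of findings by severity and a defect density per thousand source lines (KSLOC). The CSV columns, checker names, and KSLOC figure are hypothetical; each static analysis tool has its own export format, so the column names would need to match the real report.

    import csv
    import io
    from collections import Counter

    # Hypothetical findings export; adapt the columns to the actual tool report.
    findings_csv = io.StringIO(
        "checker,severity,file\n"
        "NULL_DEREF,high,gnc/control.c\n"
        "RESOURCE_LEAK,medium,cdh/telemetry.c\n"
        "UNINIT_VAR,high,gnc/control.c\n"
    )
    findings = list(csv.DictReader(findings_csv))
    ksloc = 42.0   # thousands of source lines analyzed (hypothetical)

    by_severity = Counter(row["severity"] for row in findings)
    print("findings by severity:", dict(by_severity))
    print(f"defect density: {len(findings) / ksloc:.2f} findings per KSLOC")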

Measures such as requirements volatility can be tracked with general purpose requirements development and management tools (e.g., DOORS), which can also provide reports on software functionality and software verification progress.
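
Requirements volatility is commonly reported as the number of requirement changes in a reporting period relative to the size of the baseline. The Python sketch below assumes a simple change log of the kind that could be exported from a requirements management tool; the log format, period labels, and baseline size are hypothetical.

    from collections import Counter

    # Hypothetical change log: (reporting period, change type).
    change_log = [
        ("Month 1", "added"), ("Month 1", "modified"),
        ("Month 2", "modified"), ("Month 2", "deleted"), ("Month 2", "added"),
    ]
    baselined_requirements = 120   # current baseline size (hypothetical)

    changes_per_period = Counter(period for period, _ in change_log)
    for period in sorted(changes_per_period):
        volatility = changes_per_period[period] / baselined_requirements
        print(f"{period}: {changes_per_period[period]} changes, volatility = {volatility:.1%}")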

Links to the aforementioned tools are found in section "5.1: Tools" on the Resources tab of this SWE.

4. Small Projects

See small project information in SWE-090. A few key measures to monitor the project's status and meet sponsor and institutional objectives may be sufficient. Data collection may be limited in frequency. The use of tools that collect measures automatically helps considerably. In some cases, an organization will provide staff support to help with measurement collection, storage, and analysis for a group of small projects.

5. Resources

5.1 Tools

Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

Tool name: Problem Report Tool
Type: Downloadable
Owner/Source: GSFC
Link: http://software.gsfc.nasa.gov/toolsDetail.cfm?selTool=2.5.2.3 ...
Description: v1.0, Excel-based problem report management and metrics tool, GSFC.
User: GSFC

Tool name: Staffing Tool
Type: Downloadable (for Excel 2007 only). SPAN - Accessible to NASA users via the SPAN tab in this Handbook. By Request - Non-NASA users, contact User for a copy of this tool.
Owner/Source: GSFC
Link: ...
Description: This tool is used to plan staffing resources and track actual and projected effort against the plan. This tool is also used to plan procurement costs and track actual expenditures against the plan. This is downloadable in Excel 2007 only. Search in SPAN for "GSFC_TL_20150126_Staffing_Tool".
User: GSFC

Tool name: Risk Management Tool
Type: SPAN - Accessible to NASA users via the SPAN tab in this Handbook. By Request - Non-NASA users, contact User for a copy of this tool.
Owner/Source: GSFC
Link: ...
Description: Provides a means for projects to specify and monitor risks. It supports up to 30 risks. Information tracked includes the statement of the risk, originator, date identified, probability, impact, timeframe, assignee, visibility, source, and mitigation steps. This tool generates detail and summary reports. Search in SPAN for "GSFC_TL_20120905_Risk_Mgmt_Tool".
User: GSFC

Tool name: RequirementsLink
Type: COTS
Owner/Source: ENSER/Parametric Technology Corporation (PTC)
Link: http://www.ptc.com/appserver/wcms/relnotes/note.jsp?icgdbkey=826imdbkey=119829 ...
Description: Windchill RequirementsLink, a requirements capture and tracking tool. Windchill RequirementsLink - an integral option for Windchill PDMLink - lets you manage product requirements, including change control and associating requirements with specific product structures and design content. With bi-directional traceability between customer needs, market requirements, and the underlying technical requirements, you can ensure that customer and market requirements are satisfied by designs and properly verified during development.
User: SSC

Tool name: Requirements Metrics Tool
Type: SPAN - Accessible to NASA users via the SPAN tab in this Handbook. By Request - Non-NASA users, contact User for a copy of this tool.
Owner/Source: GSFC
Link: ...
Description: The requirements metrics spreadsheet is used to track both functionality (via the number of requirements representing the scope of the system) and requirements volatility (by tracking changes to requirements). It has three tabs for input data and calculations, and four tabs for graphs of those data. The inputs are project information that helps set up the spreadsheets, data allocating requirements to build and CSCI, and a timeline of requirements changes that tracks the evolving number of requirements and requirements changes by CSCI. Search in SPAN for "GSFC_TL_20070501_Req_Metrics_Tool".
User: GSFC

Tool name: Measurement Planning Table Tool
Type: SPAN - Accessible to NASA users via the SPAN tab in this Handbook. By Request - Non-NASA users, contact User for a copy of this tool.
Owner/Source: GSFC
Link: ...
Description: Provides a template for both development and acquisition projects for specifying the measures that should be collected over the project life cycle. For each measurement area (e.g., Software Quality), the template provides suggestions for the measurement objectives, the measurements that should be collected, the collection frequency, and the analysis that should be performed. Search in SPAN for "GSFC_TL_20160909_Measurement_Planning_Table_Tool".
User: GSFC

Tool name: JIRA
Type: COTS
Owner/Source: Atlassian
Link: http://www.atlassian.com/software/jira ...
Description: JIRA provides issue tracking and project tracking for software development teams to improve code quality and the speed of development. It combines a clean, fast interface for capturing and organizing issues with customizable workflows, OpenSocial dashboards, and a pluggable integration framework. You can start with Atlassian software for $10. JIRA is used for issue tracking and project management by over 14,500 organizations.
User: GRC, JPL, GSFC, ARC

Tool name: DOORS®
Type: COTS
Owner/Source: IBM® Rational®
Link: http://www-01.ibm.com/software/awdtools/doors/ ...
Description: The IBM® Rational® DOORS® family is a group of requirements management tools that allow you to capture, trace, analyze, and manage changes across the development lifecycle.
User: ARC, DFRC, GRC, GSFC, IV&V, JPL, JSC, LaRC, MSFC

Tool name: Coverity® Prevent and Extend™
Type: COTS
Owner/Source: Synopsys
Link: https://www.synopsys.com/software-integrity/security-testing/static-analysis-sast.html ...
Description: Static code analysis.
User: JPL, IV&V

Tool name: CodeSonar®
Type: COTS
Owner/Source: Grammatech
Link: http://www.grammatech.com/products/codesonar/overview.html ...
Description: By analyzing both source code and binaries, CodeSonar enables teams to analyze complete applications, take control of their software supply chain, and eliminate the most costly and hard-to-find defects early in the SDLC.
User: IV&V, ARC (NanoSat), JPL

Tool name: Action Item Tracking Tool
Type: SPAN - Accessible to NASA users via the SPAN tab in this Handbook. By Request - Non-NASA users, contact User for a copy of this tool.
Owner/Source: GSFC
Link: ...
Description: Excel spreadsheet that tracks action items and produces a summary report. Attributes tracked for each action item include ID, Action Item, Assigned To, Priority, Date Opened, Date Due, Date Closed, Days Opened, and Notes. Available in SPAN on page: GSFC_TL_20080905_Action_Item_Tracking.
User: GSFC

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to Measurement Selection:

1. Selection and use of Software Metrics for Software Development Projects. Lesson Learned Number 3556:  "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

"Metrics or measurements provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states that: "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:

  • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
  • "The number of errors found during software verification/validation.
  • "The number of errors found in delivered software (a.k.a., 'process escapes').
  • "Projected versus actual labor hours expended.
  • "Projected versus actual lines of code, and the number of function points in delivered software." 577

2. Flight Software Engineering Lessons. Lesson Learned Number 2218: "The engineering of flight software (FSW) for a typical NASA/Caltech Jet Propulsion Laboratory (JPL) spacecraft is a major consideration in establishing the total project cost and schedule because every mission requires a significant amount of new software to implement new spacecraft functionality."

Recommendation No. 8 of this lesson learned provides the following step, among others, to mitigate the risk from defects in the FSW development process:

"Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures include metrics such as the percentage of code, requirements, and defined faults tested, and the percentage of tests passed in both simulation and test bed environments. These measures also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and test bed environments, and where they have successfully passed all stress tests." 572

3. Also, see Lessons Learned listed in SWE-090.