
This version of the SWEHB is associated with NPR 7150.2B. The latest version of the SWEHB is based on NPR 7150.2C.

SWE-091 - Establish and Maintain Measurement Repository

1. Requirements

2.1.3.9 For Class A, B, C, and safety critical software projects, the Center Director shall establish and maintain a software measurement repository for software project measurements containing at a minimum:
a. Software development tracking data.
b. Software functionality achieved data.
c. Software quality data.
d. Software development effort and cost data.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

2. Rationale

Software measurement programs are established to meet measurement objectives and goals at multiple levels. The data gained from these programs are used in the management of projects, in assuring safety and quality, and in improving overall software engineering practices. Software measurement repositories help manage this data and provide the insight needed on projects. Measurement repositories should be established for collecting, storing, analyzing, and reporting measurement data based on the requirements of the projects and the Center. The repository enables the Center to assess its current software status and the engineering capabilities of providers for future work. Once these measurement systems are established, the measures and analysis results they produce enable Center-wide assessments of workforce skills and capabilities, as well as identification of opportunities for improving software development.

3. Guidance

A software organization is expected to collect metrics in accordance with a common procedure to ensure uniformity in the data collected. The software organization designates a storage location so that project metric data can be viewed and used by the organization. An effective storage approach allows long-term access to metric information for use in trending assessments and analyses. The data gained from these repositories are used in the management of projects, in assuring safety and quality, and in improving overall software engineering practices. Measurement repositories can be at the Center level or at an organizational level within a Center; each Center should decide which approach works best for it.

In general, Center-level measurement systems are designed to meet the following high-level goals:

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.

The software measurement repository stores metrics history to be used to evaluate data on current and/or future projects. The availability of past metrics data can be the primary source of information for calibration, planning estimates, benchmarking, and process improvement activities.

With these high-level goals in mind, Centers are to establish and maintain a measurement repository for their particular programs and projects that enables reporting on the minimum required categories:

  • Software development tracking data:

This data is tracked throughout the life cycle, and includes, but is not limited to, the planned and actual values for software resources, schedule, implementation status, and test status.  This information may be reported in the Software Metrics Report (see Metrics).

  • Software functionality achieved data:

This is data that monitors the functionality of software and includes, but is not limited to, the planned vs. actual number of requirements and function points.  It also includes data on utilization of computer resources. This information may be reported in the Software Metrics Report (see Metrics).

  • Software quality data:

This data is used to determine the quality of the software produced during each phase of the software lifecycle.  Software quality data includes figures and measurements regarding software problem reports/change requests, reviews of item discrepancies, peer reviews/software inspections, software audits, software risks and mitigations.  This information may be reported in the Software Metrics Report (see Metrics).

  • Software development effort and cost:

Effort and cost data is used to monitor the cost and progress of software development.  This type of data can include progress toward accomplishing planned activities, number of schedule delays, resource and tool expenses, costs associated with test and verification facilities, and more. This information is typically captured in some type of status reporting tool. 
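The four minimum data categories above can be sketched as a single repository record per project per reporting period. The field names, class name, and example project below are illustrative assumptions, not a prescribed NASA schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class MeasurementRecord:
    """One hypothetical repository record covering the four minimum categories."""
    project: str
    report_date: date
    # a. Software development tracking data (planned vs. actual)
    planned_effort_hours: float
    actual_effort_hours: float
    # b. Software functionality achieved data
    planned_requirements: int
    implemented_requirements: int
    # c. Software quality data
    open_problem_reports: int
    closed_problem_reports: int
    # d. Software development effort and cost data
    cost_to_date: float

    def functionality_achieved_pct(self) -> float:
        """Percent of planned requirements implemented to date."""
        return 100.0 * self.implemented_requirements / self.planned_requirements

# Illustrative entry; values are invented for the example.
rec = MeasurementRecord("Observatory FSW", date(2020, 6, 1),
                        1200.0, 1350.0, 400, 300, 42, 35, 950_000.0)
print(f"{rec.functionality_achieved_pct():.1f}% of requirements implemented")
```

Storing planned and actual values side by side in each record is what lets the repository support the trending and plan-versus-actual comparisons described above.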

Since this is a Center repository, measurements are collected across programs and projects. Proper execution of data collection necessitates an agreed-to plan for the Center's software measurement repository. This plan is based on the evaluation and interpretation of specified software measures that have been captured, validated, and analyzed according to the approved procedures in the programs' and projects' software measurement plans. The collection plan begins with clear information that includes:

  • A description of all data to be provided.
  • A precise definition of terms.
  • A description of responsibilities for data provision and analyses.
  • A description of time frames and responsibilities for reception of the data and analyses being provided.
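As a minimal illustration, one entry of such a collection plan might be captured as a record covering the four items above; every key and value here is hypothetical, not taken from an actual Center measurement plan:

```python
# Hypothetical collection-plan entry; all names and values are illustrative.
collection_plan_entry = {
    "data": "software problem report counts per CSCI",        # description of data provided
    "definition": "an open report documenting a deviation",   # precise definition of the term
    "provider": "project software lead",                      # responsibility for data provision
    "analyst": "Center measurement working group",            # responsibility for analyses
    "due": "fifth business day of each month",                # time frame for reception
}

# A simple completeness check: every planned entry must address all items.
REQUIRED_FIELDS = {"data", "definition", "provider", "analyst", "due"}
missing = REQUIRED_FIELDS - collection_plan_entry.keys()
print("complete" if not missing else f"missing: {sorted(missing)}")
```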

Each Center develops its own measurement program that is tailored to its needs and objectives, and is based on an understanding of its unique development environment.

Activities within the data collection procedure include:

  • A clear description of all data to be provided. This includes a description of each item and its format, a description of the physical or electronic form to be used, and the location or address to which the data are to be sent (i.e., the measurement repository).
  • A clear and precise definition of terms. This includes project- or organization-specific criteria and definitions, and a description of how to perform each step in collecting the data and storing it in the measurement repository.

Activities within the data storage procedure, which covers placing data in the measurement repository, include the following 329:

  • A description of the checking, validation, and verification (V&V) of the quality of the data sets collected. This includes checks for proper formats, logical consistency, missing entries, repeated entries, and typical value ranges (expect these to be oriented toward validation at the set level, since individual data checking and validation will already have been performed at the project level).
  • A description of what data sets, or intermediate analyses will be made and kept, or made and discarded (assuming the analyses can be reconstructed if needed). This includes a listing of requested analyses by Center stakeholders, lists of continuing metrics, and descriptions of changes to the analyses to reflect advances in the software development life cycle.
  • The identification of a Center’s measurement repository, and the site and management steps to access, use, and control the data and its associated database management system (DBMS). The use of a DBMS in the repository allows multiple projects and organizations to access the data in a format that supports their specific project or organizational objectives.
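The set-level checks described above (missing entries, repeated entries, and typical value ranges) can be sketched as a single validation pass over an incoming data set; the field names, example rows, and range thresholds are assumptions for illustration:

```python
def validate_data_set(rows, required_keys, value_ranges):
    """Set-level checks: missing entries, repeated entries, typical value ranges."""
    findings = []
    seen = set()
    for i, row in enumerate(rows):
        missing = required_keys - row.keys()
        if missing:
            findings.append(f"row {i}: missing entries {sorted(missing)}")
        signature = tuple(sorted(row.items()))
        if signature in seen:
            findings.append(f"row {i}: repeated entry")
        seen.add(signature)
        for name, (low, high) in value_ranges.items():
            value = row.get(name)
            if value is not None and not (low <= value <= high):
                findings.append(
                    f"row {i}: {name}={value} outside typical range [{low}, {high}]")
    return findings

# Hypothetical monthly defect-count submissions from several projects.
rows = [
    {"project": "A", "defects": 12},
    {"project": "A", "defects": 12},   # repeated entry
    {"project": "B", "defects": -3},   # outside the typical value range
    {"project": "C"},                  # missing entry
]
findings = validate_data_set(rows, {"project", "defects"}, {"defects": (0, 10_000)})
print(findings)
```

In practice such checks would run when a project's data set arrives at the repository, before the data are merged into the Center-level store.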

Metrics (or indicators) are computed from measures using the Center’s analysis procedures. They are quantifiable indices used to compare software products, processes, or projects, or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric, and they show containment or breaches of pre-established limits, such as allowable latent defects.
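The two behaviors just described, a trend relative only to the previous value and a check against a pre-established limit, can be sketched minimally as follows; the defect history and the limit of 5 are illustrative assumptions:

```python
def trend(history):
    """Direction of the latest metric value relative only to the previous value."""
    if len(history) < 2 or history[-1] == history[-2]:
        return "flat"
    return "increasing" if history[-1] > history[-2] else "decreasing"

def breaches_limit(value, limit):
    """True when a metric exceeds its pre-established limit."""
    return value > limit

# Hypothetical latent-defect counts over three reporting periods; the
# allowable limit of 5 is an assumed, pre-established value.
latent_defects = [3, 4, 7]
print(trend(latent_defects))                         # increasing
print(breaches_limit(latent_defects[-1], limit=5))   # True
```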

Management metrics are measurements that help evaluate how well software development activities are performing across multiple Centers, development organizations, programs or projects. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in plan adjustments of all candidate approaches that are quantified and analyzed.

NASA-specific establishing and maintaining software measurement repository information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook. 

Additional guidance related to software measurements may be found in the following related requirements in this Handbook: 

  • SWE-090 - Management and Technical Measurements
  • SWE-092 - Usage of Measurement Data
  • SWE-093 - Analysis of Measurement Data
  • SWE-094 - Reporting of Measurement Analysis

4. Small Projects

No additional guidance is available for small projects. 

5. Resources

5.1 Tools

Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

Tools marked "SPAN" below are accessible to NASA users via the SPAN tab in this Handbook; non-NASA users may contact User for a copy of the tool.

| Tool name | Type | Owner/Source | Link | Description | User |
| --- | --- | --- | --- | --- | --- |
| Problem Report Tool | Downloadable | GSFC | http://software.gsfc.nasa.gov/toolsDetail.cfm?selTool=2.5.2.3 ... | v1.0, Excel-based problem report management and metrics tool. | GSFC |
| Staffing Tool | Downloadable (Excel 2007 only); SPAN | GSFC | ... | Used to plan staffing resources and track actual and projected effort against the plan; also used to plan procurement costs and track actual expenditures against the plan. Search in SPAN for "GSFC_TL_20150126_Staffing_Tool". | GSFC |
| Risk Management Tool | SPAN | GSFC | ... | Provides a means for projects to specify and monitor up to 30 risks. Tracks the risk statement, originator, date identified, probability, impact, timeframe, assignee, visibility, source, and mitigation steps, and generates detail and summary reports. Search in SPAN for "GSFC_TL_20120905_Risk_Mgmt_Tool". | GSFC |
| RequirementsLink | COTS | ENSER/Parametric Technology Corporation (PTC) | http://www.ptc.com/appserver/wcms/relnotes/note.jsp?icgdbkey=826imdbkey=119829 ... | Windchill RequirementsLink, an integral option for Windchill PDMLink, is a requirements capture and tracking tool for managing product requirements, including change control and associating requirements with specific product structures and design content. Bi-directional traceability between customer needs, market requirements, and the underlying technical requirements helps ensure that requirements are satisfied by designs and properly verified during development. | SSC |
| Requirements Metrics Tool | SPAN | GSFC | ... | Spreadsheet used to track both functionality (via the number of requirements representing the scope of the system) and requirements volatility (by tracking changes to requirements). Three tabs hold input data and calculations, and four tabs hold graphs of those data; inputs include project setup information, allocation of requirements to build and CSCI, and a timeline of requirements changes by CSCI. Search in SPAN for "GSFC_TL_20070501_Req_Metrics_Tool". | GSFC |
| Measurement Planning Table Tool | SPAN | GSFC | ... | Template for development and acquisition projects for specifying the measures to collect over the project life cycle. For each measurement area (e.g., Software Quality), suggests measurement objectives, measurements to collect, collection frequency, and analyses to perform. Search in SPAN for "GSFC_TL_20160909_Measurement_Planning_Table_Tool". | GSFC |
| JIRA | COTS | Atlassian | http://www.atlassian.com/software/jira ... | Provides issue tracking and project tracking for software development teams, combining an interface for capturing and organizing issues with customizable workflows, OpenSocial dashboards, and a pluggable integration framework. | GRC, JPL, GSFC, ARC |
| DOORS® | COTS | IBM® Rational® | http://www-01.ibm.com/software/awdtools/doors/ ... | Family of requirements management tools for capturing, tracing, analyzing, and managing changes across the development life cycle. | ARC, DFRC, GRC, GSFC, IV&V, JPL, JSC, LaRC, MSFC |
| Coverity® Prevent and Extend™ | COTS | Synopsys | https://www.synopsys.com/software-integrity/security-testing/static-analysis-sast.html ... | Static code analysis. | JPL, IV&V |
| CodeSonar® | COTS | Grammatech | http://www.grammatech.com/products/codesonar/overview.html ... | Analyzes both source code and binaries, enabling teams to analyze complete applications and eliminate costly, hard-to-find defects early in the SDLC. | IV&V, ARC (NanoSat), JPL |
| Action Item Tracking Tool | SPAN | GSFC | ... | Excel spreadsheet that tracks action items and produces a summary report. Attributes tracked include ID, Action Item, Assigned To, Priority, Date Opened, Date Due, Date Closed, Days Opened, and Notes. Search in SPAN for "GSFC_TL_20080905_Action_Item_Tracking". | GSFC |

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to Measurement Selection:

  • Selection and use of Software Metrics for Software Development Projects. Lesson Learned Number 3556:  "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

"Metrics or measurements provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states that: "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will be used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:

    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software." 577

  • Flight Software Engineering Lessons. Lesson Learned Number 2218: "The engineering of flight software (FSW) for a typical NASA/Caltech Jet Propulsion Laboratory (JPL) spacecraft is a major consideration in establishing the total project cost and schedule because every mission requires a significant amount of new software to implement new spacecraft functionality."

The lesson learned Recommendation No. 8 provides this step as well as other steps to mitigate the risk from defects in the FSW development process:

"Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures include metrics such as the percentage of code, requirements, and defined faults tested, and the percentage of tests passed in both simulation and test bed environments. These measures also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and test bed environments, and where they have successfully passed all stress tests." 572

