
1. Requirements

5.3.1 The Software Metrics Report shall contain as a minimum the following information tracked on a CSCI (Computer Software Configuration Item) basis: [SWE-117]

     a. Software progress tracking measures.
     b. Software functionality measures.
     c. Software quality measures.
     d. Software requirement volatility.
     e. Software characteristics.

1.1 Notes

An example set of software progress tracking measures that meets 5.3.1.a includes, but is not limited to:

     a. Software resources, such as budget and effort (planned vs. actual).
     b. Software development schedule tasks (e.g., milestones) (planned vs. actual).
     c. Implementation status information (e.g., number of computer software units in design phase, coded, unit tested, and integrated into computer software
        configuration item vs. planned).
     d. Test status information (e.g., number of tests developed, executed, passed).
     e. Number of replans/baselines performed.
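As an illustrative sketch only (not part of the requirement), planned-vs-actual progress measures such as implementation status (item c) and test status (item d) can be rolled up as simple percentages; the counts and names below are hypothetical:

```python
# Hypothetical progress-tracking roll-up for one CSCI (illustrative only).
def percent_complete(actual, planned):
    """Return actual as a percentage of planned, guarding against zero."""
    return 0.0 if planned == 0 else 100.0 * actual / planned

# Implementation status: units integrated into the CSCI vs. planned (item c).
units_planned, units_integrated = 120, 90
# Test status: tests passed vs. tests developed (item d).
tests_developed, tests_passed = 200, 150

print(f"Units integrated: {percent_complete(units_integrated, units_planned):.1f}%")
print(f"Tests passed:     {percent_complete(tests_passed, tests_developed):.1f}%")
```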

An example set of software functionality measures that meets 5.3.1.b includes, but is not limited to:

     a. Number of requirements included in a completed build/release (planned vs. actual).
     b. Function points (planned vs. actual).

An example set of software quality measures that meets 5.3.1.c includes, but is not limited to:

     a. Number of software Problem Reports/Change Requests (new, open, closed, severity).
     b. Review of item discrepancies (open, closed, and withdrawn).
     c. Number of software peer reviews/inspections (planned vs. actual).
     d. Software peer review/inspection information (e.g., effort, review rate, defect data).
     e. Number of software audits (planned vs. actual).
     f. Software audit findings information (e.g., number and classification of findings).
     g. Software risks and mitigations.
     h. Number of requirements verified or status of requirements validation.
     i. Results from static code analysis tools.

An example set of software requirement volatility measures that meets 5.3.1.d includes, but is not limited to:

     a. Number of software requirements.
     b. Number of software requirements changes (additions, modifications, deletions) per month.
     c. Number of "to be determined" items.

An example set of software characteristics that meets 5.3.1.e includes, but is not limited to:

     a. Project name.
     b. Language.
     c. Software domain (flight software, ground software, Web application).
     d. Number of source lines of code by categories (e.g., new, modified, reuse) (planned vs. actual).
     e. Computer resource utilization in percentage of capacity.

Other information may be provided at the supplier's discretion to assist in evaluating the cost, technical, and schedule performance; e.g., innovative processes and cost reduction initiatives.

1.2 Applicability Across Classes

Classes C-E and Safety Critical are labeled "P (Center) + SO." This means that this requirement applies to the safety-critical aspects of the software and that an approved Center-defined process that meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Class C and Not Safety Critical is labeled with "P (Center)." This means that a Center-defined process that meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Classes F and G are labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf software for these classes.

...


2. Rationale

The Software Metrics Report (SMR) provides data to the project for the assessment of software cost, technical, and schedule progress. The reports provide a project or software lead:

  • Tracking measures to indicate progress achieved to date and relate it to cost.
  • Functionality measures to indicate the capabilities achieved to date.
  • Quality measures to indicate the degree to which checks and inspections have found and removed problems and defects in the software.
  • Requirements volatility measures to indicate current project/product stability and the potential for future iterations and changes in the product.
  • Software characteristics to help to uniquely identify the project and its work products, along with its major features.

Measurement is a key process area for successful management and is applied to all engineering disciplines. Measurement helps to define and implement more realistic plans, as well as to monitor progress against those plans. Measurement data provides objective information that helps project management to perform the following:

  • More accurately plan a project or program that is similar to one that has been completed.
  • Identify and correct problems early in the life cycle (more proactive than reactive).
  • Assess impact of problems that relate to project or program objectives.
  • Make proper decisions that best meet program objectives.
  • Defend and justify decisions.

These metrics serve as the major foundation for efforts to manage, assess, correct, report, and complete the software development activities. Since a computer software configuration item (CSCI) is a group of software that is treated as a single entity by a configuration management system, it is the lowest level of a product that can be effectively tracked. The SMR typically uses this CSCI information to aggregate management metrics for current project statusing and future project planning.
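Since the CSCI is the lowest effectively trackable level, a project-level view in the SMR is typically an aggregation over CSCIs. A minimal sketch of such a roll-up, using hypothetical CSCI names and counts:

```python
# Hypothetical roll-up of a per-CSCI measure to project level (illustrative
# only): summing open problem reports across the CSCIs tracked in the SMR.
csci_open_prs = {"GNC": 4, "CDH": 7, "PAYLOAD": 2}

# Project-level total, plus the CSCI needing the most attention.
project_open_prs = sum(csci_open_prs.values())
worst_csci = max(csci_open_prs, key=csci_open_prs.get)

print(project_open_prs, worst_csci)
```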

...


3. Guidance

The SMR captures the information that results from exercising and completing the requirements of SWE-090, SWE-091, SWE-092, SWE-093, and SWE-094. The SMR serves as a single repository for collecting the information developed from these activities and for saving and presenting it to the appropriate stakeholders and project personnel. The measures result from the chosen measurement objectives (see SWE-090), provide a view into the types of actions or decisions that may be made based on the results of the analysis, and help prioritize the areas where measures need to be collected. SWE-091 calls for the project to develop and record measures of software progress tracking, software functionality, software quality, software requirements volatility, and software characteristics. SWE-092 indicates that data collection and storage procedures are specified and that the data itself then needs to be collected and stored. According to SWE-093, the collected software measures are to be analyzed with project- and Center-approved methods. Finally, SWE-094 calls for the results to be reported periodically and for access to the measurement information to be made available. All of this information may feed into Mission Directorate measurement and metrics programs (see SWE-095 and SWE-096).

Software Tracking Measures

Typically, the most common reason for implementing a measurement program is to track progress, one of the hardest things to do effectively. Consider the following four attributes for selecting effective tracking measures:

  • Objectivity: The measurement needs to be based on criteria that are observable and verifiable.
  • Near Real Time: The measurement reflects what is happening now in the project.
  • Multiple Levels: The measure needs to have both drill-down capability and be able to be rolled up to summary levels.
  • Prediction: The measure must support projections about future progress.
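The "Prediction" attribute above can be illustrated with a deliberately naive linear projection: if the completion rate observed so far holds, remaining work divided by that rate estimates the time to finish. The numbers are hypothetical, and real projects would use more sophisticated models:

```python
# Illustrative sketch of the "Prediction" attribute (assumed data): a naive
# linear projection of remaining work from the observed completion rate.
def projected_weeks_remaining(done, total, weeks_elapsed):
    """Project weeks to finish, assuming the historical rate holds."""
    if done == 0:
        return float("inf")  # no progress yet; no basis for a projection
    rate = done / weeks_elapsed          # units completed per week
    return (total - done) / rate

# 60 of 150 units done after 8 weeks -> 7.5 units/week -> 12 weeks remain.
print(projected_weeks_remaining(60, 150, 8))
```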

The requirement gives an example set of measures for tracking the project:

...

Software Functionality Measures

...

Software Quality Measures

...

Software Requirement Volatility

...

Software Characteristics

...

Formats

The SMR can be provided in an electronic format or via access to the software metrics data repository; electronic delivery is preferred. The document does not have to be a formal hard-copy Data Requirement Document if electronic access to the information is provided. It is also acceptable for an organization to provide this information as part of a monthly software review process. The final set of metrics used by a project needs to be determined by project and organizational needs and by the software development life cycle phase. Which software metrics are reported at which point in the development or maintenance life cycle needs to be addressed in the software planning documents or organizational planning documents.
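As one hedged sketch of providing SMR data "in an electronic format", the same hypothetical per-CSCI snapshot can be serialized as JSON and CSV with the Python standard library; the field names are assumptions for illustration:

```python
import csv
import io
import json

# Hypothetical per-CSCI metrics snapshot (illustrative only).
snapshot = [
    {"csci": "GNC", "open_prs": 4, "tests_passed_pct": 82.0},
    {"csci": "CDH", "open_prs": 7, "tests_passed_pct": 64.5},
]

# JSON form, e.g. for a metrics repository or web dashboard.
as_json = json.dumps(snapshot, indent=2)

# CSV form, e.g. for a spreadsheet attached to a monthly review.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["csci", "open_prs", "tests_passed_pct"])
writer.writeheader()
writer.writerows(snapshot)
as_csv = buf.getvalue()

print(as_json)
print(as_csv)
```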


4. Small Projects

This requirement specifies a minimum set of five areas of information to be provided on a CSCI (Computer Software Configuration Item) basis. Smaller projects may consider providing less information than listed in the example sets for each information item. The information included in the SMR needs to be sufficient to manage the project, manage risk, and maintain safety throughout the project's life cycle.

...


5. Resources

...


6. Lessons Learned

No lessons learned have currently been identified for this requirement.