SWE-094 - Reporting of Measurement Analysis

1. Requirements

4.4.5 The project shall report measurement analysis results periodically and allow access to measurement information by Center-defined organizational measurement programs.

1.1 Notes

NPR 7150.2A does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E, Safety Critical, are labeled "P (Center) + SO". This means that the requirement applies to the safety-critical aspects of the software and that an approved Center-defined process which meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Class C, Not Safety Critical, and Class G are labeled with "P (Center)". This means that a Center-defined process which meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Class F is labeled with "X (not OTS)". This means that this requirement does not apply to off-the-shelf software for this class.

Class        | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F   |  G   |  H
Applicable?  |      |       |      |       |  X   | P(C)  |  X   |       |  X   |       |  X   | P(C) |

Key: A_SC = Class A Software, Safety Critical | A_NSC = Class A Software, Not Safety Critical | ... | - Applicable | - Not Applicable
X - Applicable with details, read above for more | P(C) - P (Center), follow Center requirements or procedures


2. Rationale

The intent of this requirement is that the software development project organizations provide or allow access to the software metric data during the project life cycle by those Center-defined organizations responsible for assessing and utilizing the metric data. Access can be provided to a number of organizations tasked to review or assess software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in a timely manner so the information can be used.

NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at the project and the Mission Directorate and Mission Support levels to satisfy organizational, project, program and Directorate needs. Centers have measurement systems and record repositories that are used for records retention, and for subsequent analysis and interpretation to identify overall levels of Center competence in software engineering, and to identify future opportunities for training and software process improvements.

The data gained from these measurement programs assist in managing projects, in assuring and improving safety and product and process quality, and in improving overall software engineering practices. In general, the project-level and Mission Directorate/Mission Support Office-level measurement programs are designed to meet the following high-level goals:

  • To improve future planning and cost estimation
  • To provide realistic data for progress tracking
  • To provide indicators of software quality
  • To provide baseline information for future process improvement activities

Information generated to satisfy other requirements in NPR 7150.2 (see [SWE-090], [SWE-091], [SWE-092], and [SWE-093]) provides the software measurement data and the analysis methods that are used to produce the results reported under this SWE-094.


3. Guidance

Metrics (or indicators) are computed from measures using approved and documented analysis procedures. They are quantifiable indices used to compare software products, processes, or projects or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric. They also show containment or breaches of pre-established limits, such as the allowable number of latent defects. Management metrics are measurements that help evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in current plans. In addition, adjustments to software development processes can be evaluated once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center leaders for evaluating overall Center capabilities and for planning improvement activities and training opportunities for more advanced software process capabilities.
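
As an illustration only, the following Python sketch shows one way a metric can be computed from raw measures, trended against its previous value, and checked for a breach of a pre-established limit. The metric name, limit value, and data below are hypothetical examples, not prescribed by this requirement.

```python
# Minimal sketch: compute a metric from raw measures, show its trend,
# and flag a breach of a pre-established limit.
# The metric, limit, and data are hypothetical examples.

MAX_LATENT_DEFECT_DENSITY = 0.5  # assumed allowable defects per KSLOC

# Raw measures collected per reporting period (e.g., per month).
measures = [
    {"period": "2024-01", "defects": 12, "ksloc": 30.0},
    {"period": "2024-02", "defects": 9,  "ksloc": 34.5},
    {"period": "2024-03", "defects": 21, "ksloc": 36.0},
]

previous = None
for m in measures:
    density = m["defects"] / m["ksloc"]            # metric computed from measures
    trend = "n/a" if previous is None else ("rising" if density > previous else "falling or flat")
    breach = density > MAX_LATENT_DEFECT_DENSITY   # containment check against the limit
    print(f'{m["period"]}: defect density = {density:.2f}/KSLOC, '
          f'trend = {trend}, limit breached = {breach}')
    previous = density
```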

To ensure the measurement analysis results are communicated properly, on time, and to the appropriate people, the project develops reporting procedures for the specified analysis results (see [SWE-091] and [SWE-093]) and makes those results available on a regular basis (as designed) through the appropriate distribution systems.

Things to consider when developing these reporting methods and making the results available to others include the following (a configuration sketch illustrating them appears after the list):

  • Stakeholders: Who should receive which reports? What level of reporting depth is needed for particular stakeholders? Software developers and software testers (the collectors of the software measures) may need only a brief summary of the results, or perhaps just a notification that results are complete and posted online where they can see them. Task and project managers will be interested in the status and trending of the project's metrics (these may be tailored to the level and responsibilities of each manager). Center leaders may need only normalized values that assist in evaluations of overall Center competence levels (which in turn provides direction for future training and process improvement activities).
  • Management chain: Does one detailed report go to every level? Should analysis results be tailored (abbreviated, synopsized) at progressively higher levels in the chain? Are there multiple chains (engineering, projects, safety and mission assurance)?
  • Timing and periodicity: Are all results issued at the same frequency? Should there be weekly, monthly, or running averages reported? Are some results issued only upon request? Are results reported as deltas or cumulative?
  • Formats for reports: Are spreadsheet-type tools used (bar graphs, pie charts, trend lines)? Are statistical analyses performed? Are hardcopy, PowerPoint, or email/online tools used to report information?
  • Appropriate level of detail: Are only summary results presented? Are there criteria in place for deciding when to go to a greater level of reporting (trend line deterioration, major process change, mishap investigation)? Who approves data format and release levels?
  • Specialized reports: Are there capabilities to run specialized reports for individual stakeholders (safety related analysis, interface related defects)? Can reports be run outside of the normal 'Timing and periodicity' cycle?
  • Correlation with project and organizational goals: Are analysis results directly traceable or relatable to specific project and Center software measurement goals? Who performs the summaries and synopses of the traceability and relatability? Are periodic reviews scheduled and held to assess the applicability of the analysis results to the software improvement objectives and goals?
  • Interfaces with organizational and Center-level data repositories: Are analysis results provided regularly to organizational and Center-level database systems? Is access open, or by permission (password) only? Is the data project specific, or will only normalized data be made available? Where are the interfaces designed, maintained, and controlled?
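
One way to capture these considerations is in a small, reviewable reporting configuration that the project's reporting tooling can read. The Python sketch below is illustrative only; the stakeholder groups, frequencies, formats, and metric names are hypothetical placeholders, not a prescribed structure.

```python
# Hypothetical sketch of a reporting-procedure configuration that records,
# for each stakeholder group, what is reported, how often, in what format,
# and at what level of detail.  Names and values are illustrative only.

from dataclasses import dataclass

@dataclass
class ReportSpec:
    stakeholder: str      # who receives the report
    frequency: str        # timing and periodicity
    fmt: str              # format for the report
    detail: str           # appropriate level of detail
    metrics: tuple        # which analysis results are included

REPORTING_PROCEDURES = [
    ReportSpec("software developers/testers", "monthly", "email summary",
               "brief summary", ("open defects", "test progress")),
    ReportSpec("task/project managers", "monthly", "trend charts",
               "full trends", ("cost", "schedule", "defect density")),
    ReportSpec("Center measurement repository", "quarterly", "data export",
               "normalized values", ("all project metrics",)),
]

for spec in REPORTING_PROCEDURES:
    print(f"{spec.stakeholder}: {spec.frequency}, {spec.fmt}, "
          f"detail={spec.detail}, metrics={', '.join(spec.metrics)}")
```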

The project reports analysis results periodically according to established collection and storage procedures (see [SWE-092]) and the reporting procedures developed according to this SWE. These reporting procedures are contained in the Software Development or Management Plan (see [SWE-102]).


4. Small Projects

add


5. Resources

  1. add
  2. add

6. Lessons Learned

add
