SWE-094 - Reporting of Measurement Analysis

1. Requirements

5.4.4 The project manager shall provide access to the software measurement data, measurement analyses, and software development status as requested to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA. 

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

See SWE-094 History to view the history of this requirement.

1.3 Applicability Across Classes

Class          A      B      C      D      E      F

Applicable?

Key: ✓ = Applicable | ✗ = Not Applicable
A & B = Always Safety Critical; C & D = Sometimes Safety Critical; E & F = Never Safety Critical.

2. Rationale

The intent of this requirement is to provide access to the software metric data during the project life cycle for those Agency and Center-defined organizations responsible for assessing and utilizing the metric data. The software development project organizations can provide access to a number of organizations tasked to review or assess software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in a timely manner to allow usage of the information.

NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at the project and also at the Center levels to satisfy organizational, project, program and Directorate needs. Centers have measurement systems and record repositories that are used for records retention, and for subsequent analysis and interpretation to identify overall levels of Center competence in software engineering, and to identify future opportunities for training and software process improvements.

3. Guidance

Management metrics are measurements that help evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts for future progress, early trouble detection, and realism in current plans. In addition, adjustments to software development processes can be evaluated, once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center and Agency leaders for evaluating overall capabilities, and in planning improvement activities and training opportunities for more advanced software process capabilities.

The data gained from Center measurement programs assist in managing projects, in assuring and improving safety and product and process quality, and in improving overall software engineering practices. In general, the project-level and Center-level measurement programs are designed to meet the following high-level goals:

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.

Other requirements (SWE-090, SWE-091, SWE-092, and SWE-093) provide the software measurement data and the analysis methods used to produce the results accessed under SWE-094. The information is stored in the Center repository (SWE-091) and recorded in a Software Metrics Report (see Metrics).

Software measurement data, which includes software development status (see SWE-090) and measurement analyses (see SWE-093), should be accessible to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA, and able to be captured in Center repositories.

It is the project’s responsibility to ensure that the measurement data, analysis results, and development status are communicated properly, on time, and to the appropriate people. The project collects data and reports analysis results periodically according to the established collection, reporting, and storage procedures (see SWE-090). The project also makes the information available on a regular basis (as designed) and provides access at the request of the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, Headquarters SMA, and personnel managing Center repositories.

NASA-specific software measurement access information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

Additional guidance related to software measurements may be found in the related requirements SWE-090, SWE-091, SWE-092, and SWE-093 in this Handbook.

4. Small Projects

No additional guidance is available for small projects.  

5. Resources

5.1 References

5.2 Tools


6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to access to measurement data:

  • Know How Your Software Measurement Data Will Be Used, Lesson No. 1772: "Prior to Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The IPAO input this data to its parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation and provided physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate."

    The Recommendation states that "Prior to submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's."
  • Selection and use of Software Metrics for Software Development Projects, Lesson No. 3556: "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

    "Metrics or measurements are used to provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will be used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:
    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software."
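The physical-versus-logical SLOC confusion in Lesson No. 1772 above, and the "projected versus actual lines of code" measure in Lesson No. 3556, both hinge on how lines of code are counted. The toy counter below shows why the two conventions diverge: it is a rough sketch for a C-like snippet, not the counting rules of any NASA or IPAO parametric model, and the statement-counting heuristic is an assumption for illustration only.

```python
# Illustrative sketch: physical vs. logical SLOC for a C-like snippet.
# "Physical" counts non-blank, non-comment lines; "logical" approximates
# statements by counting semicolons plus control-flow keywords.
# This is a toy counter, not a validated SLOC-counting standard.

import re

def physical_sloc(source: str) -> int:
    count = 0
    in_block_comment = False
    for line in source.splitlines():
        stripped = line.strip()
        if in_block_comment:
            if "*/" in stripped:
                in_block_comment = False
            continue
        if not stripped or stripped.startswith("//"):
            continue
        if stripped.startswith("/*"):
            if "*/" not in stripped:
                in_block_comment = True
            continue
        count += 1
    return count

def logical_sloc(source: str) -> int:
    # Strip comments, then count statements and control-flow constructs.
    code = re.sub(r"/\*.*?\*/", "", source, flags=re.S)
    code = re.sub(r"//.*", "", code)
    statements = code.count(";")
    statements += len(re.findall(r"\b(if|for|while|switch|else)\b", code))
    return statements

sample = """
/* toy example */
for (i = 0; i < n; i++) {
    total += data[i];   // accumulate
}
"""
print(physical_sloc(sample), logical_sloc(sample))  # → 3 4
```

Even on this four-line loop the two conventions disagree; on a full flight-software estimate, feeding physical counts into a model expecting logical counts is exactly the kind of mismatch the lesson describes.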

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-094 - Reporting of Measurement Analysis
5.4.4 The project manager shall provide access to the software measurement data, measurement analyses, and software development status as requested to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA. 
 

7.1 Tasking for Software Assurance

  1. Confirm access to software measurement data, analysis, and status, as requested, to the following entities:
    • Sponsoring Mission Directorate
    • NASA Chief Engineer
    • Center Technical Authorities
    • Headquarters SMA

7.2 Software Assurance Products

  • None at this time.


Objective Evidence

  • Evidence of confirmation that listed individuals have access to necessary metrics products and status.
 Definition of objective evidence

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

  • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., a risk log).
  • Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
  • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
  • Signatures on SA-reviewed or witnessed products or activities, or
  • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
    • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
    • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.


7.3 Metrics

  • None identified at this time

7.4 Guidance

The software assurance task in this requirement is to confirm that the four groups/individuals listed below have access to the project’s software measurement data, measurement analysis, and software development status, as requested:

  • Sponsoring Mission Directorate
  • NASA Chief Engineer
  • Center Technical Authorities
  • Headquarters SMA

In some cases, the project may include some of this information in their status reports, such as the software development status and the measurement analysis describing the major take-aways from the measurements collected. If this information is in the status reports, check for evidence that it was distributed or communicated to the groups above. Measurements and metrics should be identified in the project-specific management or development plans.

Most of the groups above likely do not see this information regularly, particularly the software measurement data, so it will be necessary to ask whether the information was available when it was requested. (It may be easier to ask whether any requests for software measurement data, measurement analysis, or software development status returned no results or incomplete results. In other words, did they get what they asked for?)

See topic 8.3 - Organizational Goals of Software Assurance Metrics for a table of software assurance metrics with their associated goals and questions.
