Contents

  1. The Requirement
  2. Rationale
  3. Guidance
  4. Small Projects
  5. Resources
  6. Lessons Learned
  7. Software Assurance

1. Requirements


5.4.4 The project manager shall provide access to the software measurement data, measurement analyses, and software development status as requested to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA. 

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

See SWE-094 History for the history of this requirement.

1.3 Applicability Across Classes

  • Class A: Applicable
  • Class B: Applicable
  • Class C (Safety Critical): Applicable
  • Class C: Applicable
  • Class D: Not Applicable
  • Class D (Safety Critical): Applicable
  • Class E: Not Applicable
  • Class F: Not Applicable
  • Class G: Not Applicable
  • Class H: Not Applicable


2. Rationale

This requirement is intended to provide access to software metric data during the project life cycle for the Agency and Center organizations responsible for assessing and using that data. The software development project organizations can provide access to the several organizations tasked to review or assess software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in a timely manner so that the information can be used.

NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at both the project and Center levels to satisfy organizational, project, program, and Directorate needs. Centers have measurement systems and record repositories that are used for records retention and for subsequent analysis and interpretation to identify overall levels of Center competence in software engineering and to identify future opportunities for training and software process improvement.


3. Guidance

Management metrics are measurements that help evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts for future progress, early trouble detection, and realism in current plans. In addition, adjustments to software development processes can be evaluated, once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center and Agency leaders for evaluating overall capabilities, and in planning improvement activities and training opportunities for more advanced software process capabilities.

The data gained from Center measurement programs assists in the management of projects, in assuring and improving safety and product and process quality, and in improving overall software engineering practices. In general, the project-level and Center-level measurement programs are designed to meet the following high-level goals:

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.
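The first two goals, improved planning and progress tracking, both rest on comparing historical actuals against current plans. The sketch below is a minimal illustration, not a NASA estimating model; the project data and the hours-per-SLOC productivity rate are hypothetical.

```python
# Hypothetical historical data from two completed projects.
historical = [
    {"sloc": 12000, "hours": 2400},  # past project A
    {"sloc": 8000,  "hours": 1700},  # past project B
]

def estimate_hours(new_sloc: int) -> float:
    """Estimate effort for a new project from average historical productivity.

    Uses a simple hours-per-SLOC rate; real parametric cost models
    account for many more factors than size alone.
    """
    rates = [p["hours"] / p["sloc"] for p in historical]
    avg_rate = sum(rates) / len(rates)
    return new_sloc * avg_rate

print(estimate_hours(10000))  # roughly 2062.5 hours for a 10,000-SLOC estimate
```

Better baseline data (the fourth goal above) directly improves the historical table and hence every future estimate.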

Other requirements (SWE-090, SWE-091, SWE-092, and SWE-093) define the software measurement data and the analysis methods used to produce the results accessed under this requirement. The information is stored in the Center repository (SWE-091) and recorded in a Software Metrics Report (see 5.05 - Metrics - Software Metrics Report).

Software measurement data, which includes software development status (see SWE-090) and measurement analyses (see SWE-093), should be accessible to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA, and able to be captured in Center repositories.

It is the project's responsibility to ensure that the measurement data, analysis results, and development status are communicated properly, on time, and to the appropriate people. The project collects data and reports analysis results periodically according to the established collection, reporting, and storage procedures (see SWE-090). The project also makes the information available regularly (as designed) and provides access at the request of the sponsoring Mission Directorate, the NASA Chief Engineer, the Center and Headquarters SMA, and personnel managing Center repositories.
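The periodic collection-and-reporting cycle described above can be sketched as follows. This is a hypothetical illustration only; the record fields (`planned_units`, `actual_units`, `open_defects`) are assumptions, not a NASA-defined schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class MetricSample:
    """One periodic software measurement record (hypothetical fields)."""
    period_end: date
    planned_units: int   # e.g., requirements planned to be verified by this date
    actual_units: int    # requirements actually verified
    open_defects: int

def status_summary(samples: list[MetricSample]) -> dict:
    """Build the development-status figures a requesting organization might ask for."""
    latest = max(samples, key=lambda s: s.period_end)
    progress = latest.actual_units / latest.planned_units if latest.planned_units else 0.0
    return {
        "as_of": latest.period_end.isoformat(),
        "percent_of_plan": round(100 * progress, 1),
        "open_defects": latest.open_defects,
    }

samples = [
    MetricSample(date(2024, 1, 31), planned_units=40, actual_units=35, open_defects=12),
    MetricSample(date(2024, 2, 29), planned_units=60, actual_units=54, open_defects=9),
]
print(status_summary(samples))
# → {'as_of': '2024-02-29', 'percent_of_plan': 90.0, 'open_defects': 9}
```

Keeping every sample in the Center repository, rather than only the latest summary, is what preserves the trend data that requesters need.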

NASA-specific software measurement access information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.

Additional guidance related to software measurements may be found in the related requirements SWE-090, SWE-091, SWE-092, and SWE-093 in this Handbook.


4. Small Projects

No additional guidance is available for small projects.  


5. Resources

5.1 References

References cited in the text: SWEREF-567, SWEREF-577.


5.2 Tools

See the Tools Table Statement in this Handbook.


6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to access to measurement data:

  • Know How Your Software Measurement Data Will Be Used, Lesson No. 1772 (SWEREF-567): "Before Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The IPAO input this data into its parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation and provided physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate."

    The Recommendation states that "Before submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's."
  • Selection and Use of Software Metrics for Software Development Projects, Lesson No. 3556 (SWEREF-577): "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

    "Metrics or measurements are used to provide visibility into a software project's status during all phases of the software development life cycle to facilitate an efficient and successful project." The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will be used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:
    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software."
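The physical-versus-logical SLOC confusion behind Lesson No. 1772 above is easy to reproduce. The sketch below is a deliberately simplified illustration (real SLOC counters follow detailed counting standards): physical SLOC counts non-blank, non-comment lines of text, while logical SLOC counts parsed statements, and the two diverge on the same source.

```python
import ast

def physical_sloc(source: str) -> int:
    """Count non-blank, non-comment physical lines (simplified Python-only rule)."""
    return sum(
        1
        for line in source.splitlines()
        if line.strip() and not line.strip().startswith("#")
    )

def logical_sloc(source: str) -> int:
    """Count logical statements by parsing the source."""
    return sum(1 for node in ast.walk(ast.parse(source)) if isinstance(node, ast.stmt))

SAMPLE = """\
# one statement wrapped across three lines, then two statements on one line
total = (1 +
         2 +
         3)
x = 1; y = 2
"""

print(physical_sloc(SAMPLE), logical_sloc(SAMPLE))  # → 4 3
```

Feeding the physical count into a model calibrated for logical counts (or vice versa) skews the estimate, which is exactly the mismatch the lesson describes.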

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.


7. Software Assurance

SWE-094 - Reporting of Measurement Analysis

7.1 Tasking for Software Assurance

  1. Confirm access to software measurement data, analysis, and status, as requested, by the following entities:
    • Sponsoring Mission Directorate
    • NASA Chief Engineer
    • Center Technical Authorities
    • Headquarters SMA

7.2 Software Assurance Products

  • None at this time.


Objective Evidence

  • Status presentation showing metrics and trending data

See the definition of objective evidence in this Handbook.

7.3 Metrics

  • None identified at this time

7.4 Guidance

The software assurance task in this requirement is to confirm that the four groups/individuals listed below have access to the project’s software measurement data, measurement analysis, and software development status, as requested:

  • Sponsoring Mission Directorate
  • NASA Chief Engineer
  • Center Technical Authorities
  • Headquarters SMA

In some cases, the project may include some of this information in its status reports, such as the software development status and the measurement analysis describing the major takeaways from the collected measurements. If this information is in the status reports, check for evidence that it was distributed or communicated to the groups listed above. Measurements and metrics should be identified in the project-specific management or development plans.

Most of the above groups will likely not see this information regularly, particularly the software measurement data, so it will be necessary to ask whether the information was available when it was requested. (Alternatively, it may be easier to ask whether any requests for software measurement data, measurement analysis, or software development status returned no results or incomplete results. In other words, did requesters get what they asked for?)

See topic 8.03 - Organizational Goals of Software Assurance Metrics for a table of software assurance metrics with their associated goals and questions.