
This version of the SWEHB is associated with NPR 7150.2B. The latest version of the SWEHB is based on NPR 7150.2C.

SWE-092 - Using Measurement Data

1. Requirements

2.1.3.10 For Class A, B, C, and safety critical software projects, the Center Director shall utilize software measurement data for monitoring software engineering capability, improving software quality, and tracking the status of software engineering improvement activities.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

2. Rationale

What gets measured gets managed. Software measurement programs are established to meet objectives at multiple levels and are structured to satisfy particular organization, project, program, and Mission Directorate needs. The data gained from these measurement programs assists in managing projects, assuring quality, and improving overall software engineering practices.

3. Guidance


Each organization is expected to develop its own measurement program (see SWE-090) that is tailored to its needs and objectives, and is based on an understanding of its unique development environment.

Some of the important features and advantages of measurements/metrics are:

  • Motivation – Involving employees in the whole process of goal setting and increasing employee empowerment. This increases employee job satisfaction and commitment.
  • Better communication and coordination – Frequent reviews and interactions between superiors and subordinates help to maintain harmonious relationships within the organization and also to solve many problems.
  • Clarity of goals:

–      Subordinates tend to have a higher commitment to objectives they set for themselves than those imposed on them by another person.
–      Managers can ensure that objectives of subordinates are linked to the organization's objectives.
–      Everyone will have a common goal for the organization.

Software measurements are collected:

  • For the benefit of the current project:

–      Objective measurement data is used to plan, track, and correct the project.

  • For the benefit of future projects across the Center:

–      Help create a basis for planning future projects.
–      Help understand what baseline performance is for similar projects.
–      Provide organizational information to help improve software activities.


Software measurement data is collected and maintained:

  • To force advanced, detailed planning.
  • To help make development and management planning decisions consistent with the project scope and requirements.
  • To provide objectivity in assessing progress, which is often difficult in the heat of battle.
  • To provide status relative to approved scope and requirements to support management control.
  • To allow corrective action in time to prevent the “crisis” or to minimize the impact of the crisis.
  • To improve the ability to estimate completion costs and schedule variances by analysis of data and trends.
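The last item above, estimating completion costs and schedule variances, can be sketched with the standard earned-value formulas (CV = EV − AC, SV = EV − PV). The function names and sample numbers below are illustrative only, not from the handbook:

```python
# Hypothetical sketch: computing basic cost and schedule variances
# from collected measurement data, using standard earned-value formulas.

def cost_variance(earned_value, actual_cost):
    """CV = EV - AC; a negative result means over budget."""
    return earned_value - actual_cost

def schedule_variance(earned_value, planned_value):
    """SV = EV - PV; a negative result means behind schedule."""
    return earned_value - planned_value

ev, ac, pv = 120.0, 150.0, 140.0      # staff-hours, illustrative values
print(cost_variance(ev, ac))          # -30.0: over budget
print(schedule_variance(ev, pv))      # -20.0: behind schedule
```

Tracking these variances across reporting periods gives the trend data needed to adjust estimates before a small slip becomes a crisis.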


A good measurement plan describes what data is to be collected, how it is to be collected, and how that measurement data is to be used.  Components of a good measurement plan include:

  1. Measurement objectives such as:

–      Organizational objectives: Improve cost estimation; improve quality of delivered software.
–      Project objectives: Deliver software on schedule and within cost, determine test completion, meet operational performance goals, deliver quality software, identify project problems early.

  2. The measures that will meet the objectives (and don't forget measures for the process areas)

–      Your project measures should include any measures that the organization needs.
–      Choose project measures that help you manage your project.
–      Consider the tools you are using and take advantage of metrics they may provide.

  3. Descriptions of how the measures will be collected and stored

–      Who does the collection? How do they get collected?
–      Do we need some tools?
–      Do we have a repository for the measures?

  4. The analysis methods for each of the measures

–      Do we have to do some additional computations on our measures to compare them?
–      Do we need to look at several measures together to get a full understanding?
–      What sorts of charts will show the answers to our questions?
–      What are the expected values? How do we know if the measurement results indicate something bad or good?

  5. Communication of the measurement results

–      What results should the team be aware of?
–      What are the key measurement results that need to be reported to management?

  6. Commitment to the measurement plan from your team and your management
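As a purely illustrative sketch, the six plan components above could be recorded in a simple data structure; every name and value below is hypothetical, not NASA-mandated:

```python
# Hypothetical sketch of a measurement plan's components as data.
# All field names and values are assumptions for illustration.
measurement_plan = {
    "objectives": {
        "organizational": ["improve cost estimation", "improve delivered quality"],
        "project": ["deliver on schedule and within cost", "identify problems early"],
    },
    "measures": ["effort hours", "defects found", "requirements verified"],
    "collection": {
        "who": "team leads",
        "how": "weekly export from the tracking tool",
        "repository": "organizational measurement database",
    },
    "analysis": {"defects found": "trend chart vs. expected discovery profile"},
    "communication": {"team": "weekly dashboard", "management": "monthly summary"},
    "commitment": ["team sign-off", "management sign-off"],
}

# Sanity check: all six plan components from the guidance are present.
assert set(measurement_plan) == {"objectives", "measures", "collection",
                                 "analysis", "communication", "commitment"}
```

Writing the plan down in a reviewable form, whatever the format, makes the commitment in item 6 concrete for both the team and management.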


The table below provides an example of mapping organizational goals/objectives to metrics. 497


The specific measurement systems for particular programs and projects enable reporting on the minimum requirements categories listed in SWE-091, and repeated here:

  • Software development tracking data.
  • Software functionality achieved data.
  • Software quality data.
  • Software development effort and cost data.


At the organizational level, projects “typically examine high-level strategic goals like being the low cost provider, maintaining a high level of customer satisfaction, or meeting projected resource allocations. At the project level, [projects] typically look at goals that emphasize project management and control issues or project level requirements and objectives. These goals typically reflect the project success factors like on time delivery, finishing the project within budget or delivering software with the required level of quality or performance. At the specific task level, projects consider goals that emphasize task success factors." 355

Once these software measurement systems are established, the software measures and analysis results that emanate from them enable:

  • Assessments and monitoring of the abilities and skills in the workforce.
  • Identification of improvements in software quality.
  • Identification of opportunities for improving software development.
  • Status tracking for software engineering improvement activities.

Metrics (or indicators) are computed from measures using the approved analysis procedures (see SWE-093). They are quantifiable indices used to compare software products, processes, or projects or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric. They also show containment or breaches of pre-established limits, such as allowable latent defects. Management metrics are measurements that help evaluate how well software development activities are performing across multiple development organizations or projects. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in plan adjustments of all candidate approaches that are quantified and analyzed.
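The two metric behaviors described above, a trend relative to the previous value and containment or breach of a pre-established limit, can be sketched as follows; the functions and data are assumptions for illustration, not handbook-defined procedures:

```python
# Minimal sketch (assumed, not from the handbook) of trend detection
# and limit checking for a metric series.

def trend(values):
    """Direction of the latest value relative only to the previous one."""
    if len(values) < 2 or values[-1] == values[-2]:
        return "flat"
    return "increasing" if values[-1] > values[-2] else "decreasing"

def breaches(values, limit):
    """True if the latest value exceeds the pre-established limit."""
    return values[-1] > limit

latent_defects = [3, 4, 6]           # illustrative counts per build
print(trend(latent_defects))         # increasing
print(breaches(latent_defects, 5))   # True: allowable-latent-defect limit breached
```

A breach of a pre-established limit like this is exactly the kind of early trouble detection that lets management act before the "crisis."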

Measurements useful for monitoring software engineering capability include:

  • Training metrics.
  • Training opportunity metrics, including cross-training opportunities.
  • Workforce experience level data.
  • Development life-cycle model metrics (e.g., number of personnel working on Agile projects vs. traditional waterfall).

Measurements useful for improving software quality include:

  • Number of software Problem Reports/Change Requests (new, open, closed, severity).
  • Review of item discrepancies (open, closed, and withdrawn).
  • Number of software peer reviews/inspections (planned vs. actual).
  • Software peer review/inspection information (e.g., effort, review rate, defect data).
  • Number of software audits (planned vs. actual).
  • Software audit findings information (e.g., number and classification of findings).
  • Software risks and mitigations.
  • Number of requirements verified or status of requirements validation.
  • Results from static code analysis tools.

Measurements useful for tracking the status of software engineering improvement activities include:

  • CMMI assessment findings and results.
  • Productivity metric improvements.
  • Defect metric improvements.
  • Software cost data.
  • Goals or objectives of the software engineering improvement activities that have been completed or status of activities associated with the goals or objectives.
  • Workforce metrics.

Additional guidance related to measurement data may be found in the following related requirements in this Handbook:


  • SWE-090 – Management and Technical Measurements
  • SWE-091 – Establish and Maintain Measurement Repository
  • SWE-093 – Analysis of Measurement Data
  • SWE-094 – Reporting of Measurement Analysis

4. Small Projects

No additional guidance is available for small projects.

5. Resources

5.1 Tools

Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.

Tool name: JIRA
Type: COTS
Owner/Source: Atlassian
Link: http://www.atlassian.com/software/jira ...
Description: JIRA provides issue tracking and project tracking for software development teams to improve code quality and the speed of development. It combines a clean, fast interface for capturing and organizing issues with customizable workflows, OpenSocial dashboards, and a pluggable integration framework. You can start with Atlassian software for $10. JIRA is used for issue tracking and project management by over 14,500 organizations.
User: GRC, JPL, GSFC, ARC

6. Lessons Learned

Much of the software development experience gained in the NASA/GSFC Software Engineering Laboratory (SEL) is captured in the reference cited below [from Lessons Learned From 25 Years of Process Improvement: The Rise and Fall of the NASA Software Engineering Laboratory]. The document describes numerous lessons learned that are applicable to the Agency's software development activities. From their early studies, the SEL was able to build models of the environment and develop profiles for their organization. One of the key lessons in the document is "Lesson 6: The accuracy of the measurement data will always be suspect, but you have to learn to live with it and understand its limitations." 430
