2. Planning the Measurement Activities
When a software lead on a project begins to plan the project's software measurement task, he/she needs to determine the project's information needs and how to satisfy them. Two commonly used methodologies help a software lead determine these needs: the Goal/Question/Metric (GQM) methodology, originally developed by Victor Basili of the Software Engineering Laboratory, and the similar Goal/Question/Indicator/Metric (GQIM) methodology developed by the Software Engineering Institute.
The first step in both is to define the goals or objectives desired by the project for the measurement task. The goal specifies the purpose of the measurement, the object to be measured (usually a project product, process or resource), the issue to be measured and the viewpoint to be considered. An example of a typical project goal is to ensure that the software is delivered on schedule.
The next step is to refine the goal into questions that break down the issue into its major components. For the timely delivery goal, some questions might be: What is the planned delivery schedule? How much work needs to be completed before the delivery can be made? What is the planned rate of completing that work? Is the project completing the work as planned?
Using the questions, it is possible to determine what metrics can be collected to help answer the questions. Some potential measures might be the planned and the actual schedule as well as the current rate of work compared to the planned rate of work. Using this process to derive the measures is consistent with the GQM methodology.
In the case of the GQIM methodology, there is an extra step after refining the goal into questions: consideration is given to how the information will be displayed for easy analysis; in other words, to what the charts should look like (or what indicators would be helpful). In the above example, several charts might be useful to a software manager, such as a Gantt chart with the planned versus actual schedule, or an earned value chart showing the value of the work actually performed compared to the value of the work planned by this point in the schedule. (A higher level manager may only be interested in a top-level indicator, such as a red/yellow/green flag, to indicate the project's schedule health.) As in the GQM methodology, once the questions and indicators have been determined, the metrics can be determined.
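The goal-to-metric traceability described above can be sketched as a simple data structure. This is only an illustration of the GQM decomposition for the on-time delivery example; the question wording and metric names are assumptions, not taken from either methodology's official materials.

```python
# Illustrative GQM decomposition for the on-time delivery goal.
# Question wording and metric names are hypothetical examples.
gqm = {
    "goal": "Ensure the software is delivered on schedule",
    "questions": {
        "What is the planned delivery schedule?":
            ["planned milestone dates"],
        "How much work must be completed before delivery?":
            ["total planned activities"],
        "What is the planned rate of completing that work?":
            ["planned activities per reporting period"],
        "Is the project completing the work as planned?":
            ["planned vs. actual activities completed"],
    },
}

def metrics_for_goal(decomposition):
    """Collect every metric traced to the goal through its questions."""
    metrics = []
    for question_metrics in decomposition["questions"].values():
        metrics.extend(question_metrics)
    return metrics

print(len(metrics_for_goal(gqm)))  # four metrics trace back to the one goal
```

In the GQIM variant, an extra "indicators" layer (the charts) would sit between the questions and the metrics.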
Planning steps for the software lead are:
Step 1: Establish the measurement objectives for the project. (See SWE-090)
Step 2: Identify the measurement analyses that support these objectives (questions, indicators, as above)
Step 3: Specify the measures to be collected (See SWE-091)
Step 4: Specify the data collection and storage procedures (See SWE-092)
Step 5: Specify the analysis procedures (See SWE-093)
2.1 Step 1: Establish the measurement objectives for the project
In order to establish the measurement objectives or goals of the project, there are usually several things to consider. Typically, software projects are most interested in the following:
- Managing the project with objective data
- Providing data for planning future projects
- Meeting any specific requirements that may have been levied on them
There is often an overlap in the information needed for the items above. For example, Class A and B projects need to consider measurements that might be expected to help satisfy CMMI (Capability Maturity Model Integration) requirements and all classes of software need to comply with the requirements in NPR 7150.2. Measurement objectives that are established for the project are to be consistent with the objectives specified in the NPR and any specific measurement objectives specified by the project's organizational measurement groups.
In addition to the organizational objectives and the NASA requirements, other sources for objectives include management, technical, product, or process implementation needs such as:
- Achieve completion by the scheduled date
- Complete the project within the budget
- Devote adequate resources to each process area
Corresponding objectives might be:
- Measure project progress to ensure that it is adequate to achieve completion by scheduled date
- Track cost and effort to ensure project is within budget
- Measure the resources devoted to each process area
Additional sources for information needs and objectives include strategic plans, high level requirements, the project Software Management Plan and operational concepts.
Once the project has established the measurement objectives and information needs, they are documented in either the project's Software Management Plan (See SWE-102) or in a separate measurement plan.
2.2 Step 2: Identify the measurement analyses that support these objectives
The software lead identifies the measurement analyses that will be done to support the objectives chosen. In order to help with this, think about the questions that need to be answered to provide the needed information. If there are multiple analyses that can be used, prioritize the candidates and select the key analysis that will provide the best information to satisfy the objective. There are several resources that could potentially assist with selecting the measurement analyses and also with the selection of the corresponding measures to be collected. Goddard Space Flight Center has a Measurement Planning Table that shows some common objectives with measurement analyses and corresponding measures. There is also a Project/Type Goal Matrix developed following the Headquarters Software Measurement Workshop that lists goals and corresponding measures based on the project's size. The selected analyses are documented in the Software Management Plan or measurement plan, maintaining the traceability to the objectives. Note: Analysis can be as simple as comparing the planned values to the actual values over time.
2.3 Step 3: Specify the measures to be collected
Continue to use the methodology and select the measures needed to support the analyses and the objectives already chosen. It is helpful to think about what types of indicators or charts are needed in each area. Possibilities include tables, trend charts, stoplight charts, bar charts comparing points in time, etc. The resources listed in Step 2 can assist with the selection of measures. According to SWE-091, measures are required to be selected in the following five areas:
- Software progress tracking
- Software functionality
- Software quality
- Software requirements volatility
- Software characteristics
The first four areas will provide information that can be used in the management of the project, to address objectives such as completion of the project on-time and within budget, delivery of the planned functionality, and delivery of high quality software. The software characteristics measures are intended to provide enough background information on the project to allow roll-ups of measures across similar projects to provide information for organizational models. Measures that are selected for the project are defined by name, type and unit and documented in the measurement plan or Software Management Plan.
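As a sketch of that definition step, each selected measure can be recorded with its name, type, and unit, traced to one of the five SWE-091 areas. The specific measures and field layout below are illustrative assumptions, not a mandated format.

```python
# Example measure definitions (name, type, unit), one per SWE-091 area.
# The individual measures chosen here are illustrative.
measures = [
    {"name": "Activities completed", "area": "progress tracking",
     "type": "count", "unit": "activities"},
    {"name": "Units delivered per build", "area": "functionality",
     "type": "count", "unit": "units"},
    {"name": "Open defects", "area": "quality",
     "type": "count", "unit": "defects"},
    {"name": "Requirements changes", "area": "requirements volatility",
     "type": "count", "unit": "changes per month"},
    {"name": "Source size", "area": "software characteristics",
     "type": "size", "unit": "KSLOC"},
]

# Check that every required area has at least one measure defined.
covered = {m["area"] for m in measures}
print(len(covered))  # all five areas covered
```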
Some examples of potential measures and possible chart representations are shown below for each of the first four categories.
Progress tracking measure examples:
Figure 1: Scheduled Activities Planned Versus Scheduled Activities Completed shows whether the activities planned on the project are being completed on schedule.
Figure 2: Planned Progress Points versus Actual Progress Points. For this chart, a certain number of points are assigned to each activity on the project (or a portion of the project). Then points are scheduled according to the schedule dates when the activities need to be completed. As work progresses, points are earned toward the completion of the activities. Often a point represents a day or hour of estimated work. This method provides a more objective way of determining whether activities within the schedule are progressing satisfactorily. For more information on point counting, see the instructions for the Point Counting Tool listed in the Tools Table of this topic.
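The point-counting idea described for Figure 2 can be sketched as follows. The activities, point values, and week numbers are hypothetical, and the actual Point Counting Tool may differ in detail.

```python
# Sketch of point counting: each activity is worth its estimated days of
# work, points are scheduled by planned completion week, and points are
# earned as activities actually finish. All data values are hypothetical.
activities = [
    {"name": "design unit A", "points": 3, "planned_week": 1, "done_week": 1},
    {"name": "design unit B", "points": 5, "planned_week": 2, "done_week": 3},
    {"name": "code unit A",   "points": 4, "planned_week": 3, "done_week": None},
]

def planned_points(acts, through_week):
    """Points scheduled to be complete by the given week."""
    return sum(a["points"] for a in acts if a["planned_week"] <= through_week)

def earned_points(acts, through_week):
    """Points actually earned by the given week (unfinished work earns none)."""
    return sum(a["points"] for a in acts
               if a["done_week"] is not None and a["done_week"] <= through_week)

# At the end of week 3 the plan called for all 12 points,
# but only 8 have been earned: the project is behind.
print(planned_points(activities, 3), earned_points(activities, 3))
```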
Figures 3 and 4: Earned Value – Schedule and Cost Performance. Earned value is a methodology used to measure the progress of a project taking into account the work complete, the time taken and the costs incurred to complete that work. Using this methodology, it is possible to predict the final cost and schedule, based on the current performance of the project. For more information on earned value, see NASA's Earned Value Management (EVM) website.
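As a brief illustration of the standard earned value quantities (these are the generic EVM definitions, not a NASA-specific formulation): the schedule and cost performance indices compare earned value (EV) with planned value (PV) and actual cost (AC), and the estimate at completion projects final cost from the budget at completion (BAC).

```python
# Standard earned value indices. All input values below are hypothetical.
def evm_indices(pv, ev, ac, bac):
    spi = ev / pv    # schedule performance index (< 1 means behind schedule)
    cpi = ev / ac    # cost performance index (< 1 means over cost)
    eac = bac / cpi  # estimate at completion, assuming current cost
                     # performance continues for the rest of the project
    return spi, cpi, eac

spi, cpi, eac = evm_indices(pv=100.0, ev=80.0, ac=90.0, bac=1000.0)
# SPI = 0.8 (behind schedule); EAC = 1125.0 (projected overrun of the
# 1000.0 budget at completion)
print(round(spi, 2), round(cpi, 2), round(eac, 1))
```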
Figure 5: Planned Staffing versus Actual Staffing shows whether the project has the planned amount of staff working on the project.
Figure 6: Planned Cost versus Actual Cost shows whether the costs incurred by the project are the costs expected at that point in the project.
Figure 7: Units Planned for Design versus Units Completed Design shows the number of units where design has been completed compared to the total number of units that need to be designed.
Figure 8: Tests Planned versus Tests Successfully Completed shows testing progress against plan.
This sample of progress tracking charts shows that there are many ways to examine progress and many factors that the project might want to consider to determine whether satisfactory progress is being made. Although only one set of progress tracking measures is required by SWE-091, more than one set is often needed to give the project a clear picture of whether it is progressing satisfactorily. For example, if a project chose only to track actual costs against planned costs, and the charts showed that the costs were well below the plan, the assumption might be that the project was doing very well. However, a chart showing the progress of the activities completed might show that progress is well behind schedule. One possible reason might be that the project does not have all the staff necessary to complete the work as scheduled.
Often, a particular type of progress tracking chart is more applicable in a particular phase of the project such as the charts shown in Figures 7 and 8 for progress during the design and testing phases, respectively.
Figure 9: Number of Units Planned for Each Build shows the incremental count of units planned for each build as the builds progress. This is an example of a functionality chart.
For this chart, you can see that the largest number of units was planned for the first build in the initial build plan. In the second plan, the number of units to be delivered in build 1 has decreased and the numbers to be delivered in later builds have increased. In the third plan, even more of the units have shifted into the third build. This trend is generally a clear indication of difficulties in the project in terms of being able to deliver the planned functionality on time.
Figure 10: Requirements Changes per Month and Total Requirements is an example of a requirements volatility chart.
This chart shows a baseline of the number of expected requirements, the actual number of requirements over time and the numbers of requirements changes by month. Ideally, the number of requirements changes would decrease to zero as the project starts into the design and implementation phases. Large changes or large numbers of changes in requirements in the later phases are likely to cause either additional work or rework for the project and may affect staffing requirements or delivery schedules.
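One common way to quantify the volatility shown in such a chart is the monthly change count as a percentage of the current requirements total. The formula and the data values below are illustrative assumptions, not figures from the chart.

```python
# Requirements volatility: changes in a month as a percentage of the
# current requirements count. All data values are hypothetical.
monthly = [
    {"month": "Jan", "total_reqs": 200, "changes": 30},
    {"month": "Feb", "total_reqs": 210, "changes": 15},
    {"month": "Mar", "total_reqs": 212, "changes": 4},
]
volatility = {m["month"]: 100.0 * m["changes"] / m["total_reqs"]
              for m in monthly}
# Volatility falling from 15% toward zero is the ideal trend as the
# project moves into design and implementation.
print(volatility["Jan"])
```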
Figure 11: Number of Open, Closed and New Defects Over Time is an example of a quality chart.
This chart shows a fairly large number of defects still open, and the number being closed is not increasing very rapidly. There are still a number of new defects, but the numbers seem to be decreasing. The software is not ready for delivery since it still has a large number of open severity 1 defects and new ones are still being discovered. Note that a small number of defects may not indicate good quality software; it may instead indicate that inadequate testing has been done. If the organization has a baseline of typical numbers of defects/KSLOC (Thousand Software Lines of Code) for software similar to the project, it is helpful to compare the project's defect status with the organizational baseline.
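The baseline comparison mentioned above might look like the sketch below. The density values and baseline range are assumptions for illustration; real norms come from the organization's own historical data.

```python
# Comparing project defect density against an organizational baseline range.
def defect_density(defects, ksloc):
    """Defects per thousand source lines of code."""
    return defects / ksloc

def outside_baseline(density, low, high):
    """True if density falls outside the expected range, warranting further
    investigation (too few defects can mean inadequate testing, not quality)."""
    return density < low or density > high

density = defect_density(defects=12, ksloc=20.0)      # 0.6 defects/KSLOC
print(outside_baseline(density, low=1.0, high=4.0))   # True: below the norm,
                                                      # check testing adequacy
```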
2.4 Step 4: Specify the data collection and storage procedures
During this step, the project must decide how it is going to handle the logistics of collecting the measurement data chosen. The items to be decided and documented for each measurement collected are the following:
- Who is going to be responsible for collecting the data?
- How often will it be collected?
- Where will the responsible person find the data?
- Is there any data manipulation to be done (e.g., change in units, extraction from one tool and entry into another to produce charts)?
- Where will the data be stored? (Consider both the raw data and the resultant charts with analysis.)
Once the details of collection and storage have been determined, they are documented in a collection and storage procedure that describes these specific details for the project. An example template for a storage and collection procedure can be found in the NASA PAL (Process Asset Library). Measurement collection is always easier if there are tools that provide the measures as the work is being performed.
2.5 Step 5: Specify the analysis procedures
The final step in planning for measurement is to develop the analysis and reporting procedures. Essentially, this step provides the objective criteria that will be used to evaluate the information provided by the measurement data collected. These criteria help the project manager determine whether the data shown on the measurement charts indicate a problem or a good trend.
For many of the types of measurement charts, the analysis will be based on whether the data is within certain expected or planned boundaries as specified in the analysis procedures. For example, to analyze a chart showing cost over time compared with the planned cost, the analysis might be to compare the planned and actual costs and investigate further if the actual cost deviates from the plan by more than 10%. Often to complete the analysis, other measurement charts and information need to be considered. In the case of cost, the next step might be to look at the other information affecting cost. (Is the staffing higher than planned? Have the requirements increased? Did some piece of equipment cost more than planned?)
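The 10% deviation rule from the cost example can be sketched as a simple check. The threshold value comes from the example above, while the function name and interface are assumptions for illustration.

```python
# Threshold-based analysis: flag a measure for further investigation when
# the actual value deviates from plan by more than a relative threshold.
def needs_investigation(planned, actual, threshold=0.10):
    """True when |actual - planned| exceeds the threshold fraction of plan."""
    return abs(actual - planned) / planned > threshold

print(needs_investigation(planned=100.0, actual=115.0))  # True: 15% over plan
print(needs_investigation(planned=100.0, actual=95.0))   # False: within 10%
```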
Another type of boundary that is used for analysis is an organizational "norm" for that type of data. For example, an organization might have a normal or expected set of ranges for the numbers of defects found during each test phase. If a project has considerably more or fewer defects during that phase, further investigation is warranted.
Analysis procedures typically specify the thresholds or boundaries used to trigger further investigation. They also specify other charts that might provide a more complete picture of the information being provided by the measurements.
The reporting procedures specify the types of measurement charts that will be reported to each level of management. Typically, the technical software managers need more detailed levels of measurement data and the higher level managers are more interested in seeing red/yellow/green indicators that only provide an indication of potential problems. When red/yellow/green indicators are used, more detailed data can be made available to support these indicators. An example of an analysis and reporting procedure can be found on the NASA PAL.
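A red/yellow/green rollup of this kind might be derived from a detailed measure such as the schedule performance index. The thresholds below are illustrative assumptions, not NASA-defined values.

```python
# Rolling a detailed measure up into a top-level indicator for higher
# level managers. Threshold values are hypothetical examples.
def schedule_flag(spi):
    """Map a schedule performance index to a red/yellow/green indicator."""
    if spi >= 0.95:
        return "green"   # on or near schedule
    if spi >= 0.85:
        return "yellow"  # slipping; watch closely
    return "red"         # significantly behind schedule

print(schedule_flag(0.97), schedule_flag(0.90), schedule_flag(0.70))
```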