SWE-024 - Plan Tracking

1. Requirements

2.2.13 The project shall ensure that actual results and performance of software activities are tracked against the software plans.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Class D Not Safety Critical and Class G are labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to meet this requirement.

Key:    A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | ✓ - Applicable | ✗ - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures

2. Rationale

The software lead has the responsibility for periodically evaluating the cost, schedule, risk, technical performance, and content of the software work product development activity. The evaluation focuses on determining if the activity is meeting its commitments contained in the Software Development/Management Plan (SWE-102). The satisfaction of commitments in this plan, as well as subordinate software plans, helps assure that the safety, technical integrity, performance, and mission success criteria for the project will be met.

3. Guidance

The planning and requirements documentation developed during the early phases of the project (see SWE-013, SWE-016, SWE-102, and SWE-109) guides the development of software work products. The project management team and the software development lead work together to construct a work plan that is logical and achievable within the allotted time and budget. During these early phases, key performance factors, schedules, and milestones are defined. As scheduled work is performed, the results are reviewed to confirm conformance with these plans and to assess whether the expected performance has been achieved.

The Software Engineering Institute's (SEI) Capability Maturity Model Integration (CMMI-DEV, Ver. 1.3) 157 considers the evaluation of these work activities to be part of its Project Monitoring and Control process: "A project's documented plan is the basis for monitoring activities, communicating status, and taking corrective action. Progress is primarily determined by comparing actual work product and task attributes, effort, cost, and schedule to the plan at prescribed milestones or control levels within the project schedule or work breakdown structure (WBS)." 157

Per the Lesson Learned associated with this SWE, it is important to ensure that software plans that cross contract boundaries (as well as memorandums of understanding and other agreements) are adequately tracked by the project.

Project teams can use a number of tools to develop insight into the progress of the work. The following tools and techniques can be helpful for tracking the progress of activities against plans:

  • Charts - comparisons of planned vs. achieved values.
  • Documents - statusing the document tree.
  • Schedules - baselined, updates, variances.
  • Reports - monthly technical, schedule and cost narratives; performance measures.
  • Project integration meetings and telecons - cross discipline evaluations.
  • Test observations - unit test and integration test activities.
  • Team meetings - issue (current and forecasted) and problem reporting; resolution options and tracking completion status.

Results and analysis of these tracking activities can serve as the basis for reviews by stakeholders and advocates (see SWE-018).
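As an illustration of the "planned vs. achieved" comparisons listed above, the following is a minimal sketch of a variance report that a project might generate for stakeholder reviews. The milestone names, field layout, and threshold conventions are hypothetical, not drawn from any NASA tool; the point is simply that tracking actual results against the plan reduces to comparing actual and planned values per milestone and flagging the deltas.

```python
from dataclasses import dataclass

@dataclass
class Milestone:
    """One planned work item, with planned vs. actual values (illustrative fields)."""
    name: str
    planned_effort: float  # staff-days budgeted in the software plan
    actual_effort: float   # staff-days actually expended
    planned_done: float    # fraction of the work planned to be complete (0..1)
    actual_done: float     # fraction actually complete (0..1)

def variance_report(milestones):
    """Return (name, effort_variance, progress_variance) per milestone.

    Positive effort variance means effort overran the plan;
    negative progress variance means the milestone is behind schedule.
    """
    return [
        (m.name,
         m.actual_effort - m.planned_effort,
         m.actual_done - m.planned_done)
        for m in milestones
    ]

# Hypothetical data for illustration only.
plan = [
    Milestone("Requirements baseline", planned_effort=40, actual_effort=45,
              planned_done=1.0, actual_done=1.0),
    Milestone("Preliminary design", planned_effort=60, actual_effort=50,
              planned_done=0.8, actual_done=0.6),
]

for name, effort_var, progress_var in variance_report(plan):
    status = "BEHIND plan" if progress_var < 0 else "on/ahead of plan"
    print(f"{name}: effort variance {effort_var:+.0f} staff-days, {status}")
```

In practice these deltas would feed the charts, monthly reports, and integration meetings listed above, rather than a simple printout.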

In addition to the software lead, software assurance personnel have a responsibility for this requirement. "Specifically, reviews, audits, and evaluations may be performed to ensure adherence to and effectiveness of approved plans and procedures. Assure that problem reports, discrepancies from reviews, and test anomalies are documented, addressed, analyzed, and tracked to resolution. Assure that software products (e.g., software requirements, preliminary design, detailed design, use cases, code, models, simulators, test data, inspection results, flow diagrams) are reviewed and software quality metrics (e.g., defect metrics) are collected, analyzed, trended, and documented." 278
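The problem-report tracking and defect-metric trending described in the quote above can be sketched as a simple tally by status, so that counts can be trended period over period. The report identifiers, status names, and data layout below are hypothetical, standing in for whatever the project's actual tracking system exports.

```python
from collections import Counter

# Hypothetical problem reports as (id, status) pairs, as they might be
# exported from a tracking system; names are illustrative only.
reports = [
    ("PR-001", "closed"),
    ("PR-002", "open"),
    ("PR-003", "closed"),
    ("PR-004", "in-analysis"),
]

def status_counts(reports):
    """Tally problem reports by status for trending and review."""
    return Counter(status for _, status in reports)

counts = status_counts(reports)
still_tracked = counts["open"] + counts["in-analysis"]
print(f"{counts['closed']} closed, {still_tracked} still being tracked to resolution")
```

Snapshotting these counts at each reporting period gives the defect trend data that software assurance analyzes and documents.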

The software development team uses approved engineering processes to achieve the specified results and performance. Reviews, audits, and tracking of the software development team's actual use of the specified processes are a function of software assurance (see SWE-022).

Often the evaluation of actual results versus expected performance reveals issues, discrepancies, or deviations that need to be corrected. Typically these findings require further evaluation, replanning, and additional schedule time to correct. The software development lead must track these issues to closure to meet the intent of this requirement (see SWE-025).

4. Small Projects

This requirement applies to all projects regardless of size. It's not unusual for smaller and less critical projects to utilize engineering personnel to fulfill some or all of the assurance duties (rather than personnel from the Center's Safety and Mission Assurance Organization).

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

Acquisition and Oversight of Contracted Software Development (1999), Lesson No. 0921: "The loss of Mars Climate Orbiter (MCO) was attributed to, among other causes, the lack of a controlled and effective process for acquisition of contractor-developed, mission critical software. NASA Centers should develop and implement acquisition plans for contractor-developed software and this should be described in each Project Implementation Plan. These plans must provide for Software Requirements, Software Management Planning, and Acceptance Testing and assure NASA Center verification of the adequacy of the software design approach and overall contractor implementation throughout the software life cycle." 528.