UNDER CONSTRUCTION
Each of the sub-activities is performed by the Agency or the Centers, not by the projects; projects benefit from the outputs of these sub-activities.

NPR 7150.2 is largely based on the industry-wide best practices documented in the CMMI. A CMMI appraisal is a tool for assessing how closely an organization is following the best practices of the CMMI. Some projects are required to use the CMMI appraisal process, administered by an independent CMMI Lead Appraiser, to show how closely they follow the CMMI. Other projects are appraised by the Office of the Chief Engineer (OCE) or the Office of Safety and Mission Assurance (OSMA) to check how closely they are following NPR 7150.2. The SWEs listed here describe benchmarking, appraisals, QAAR audits, and other assessment tools.

All NASA software development needs to follow some level of defined software processes; Project Managers determine the processes their projects follow. Resources available to the Project Manager include the Process Asset Libraries from the Centers, where the Center-level processes are recorded for reuse.

Software measurement programs are established to meet measurement objectives and goals at multiple levels. The data gained from these measurement programs are used in managing projects, assuring safety and quality, and improving overall software engineering practices. Software measurement repositories help manage these data and provide the insight needed on projects. Measurement repositories should be established for collecting, storing, analyzing, and reporting measurement data based on the requirements of the projects and the Center. The repository enables the Center to assess its current software status and the engineering capabilities of providers for future work. Once these measurement systems are established, the measures and analysis results that emanate from them enable Center-wide assessments of the abilities and skills in the workforce and of opportunities for improving software development.

What gets measured gets managed. Software measurement programs are structured to satisfy particular organization, project, program, and Mission Directorate needs, and the data they yield assist in managing projects, assuring quality, and improving overall software engineering practices. Establishing, maintaining, and using a cost repository enables projects and programs to develop credible software cost estimates.

NASA software measurement programs are designed to provide specific information necessary to manage software products, projects, and services. Measurement repository data include:

a. Software development tracking data.
b. Software functionality achieved data.
c. Software quality data.
d. Software development effort and cost data.

For these programs to be current and accurate, certain Center measurements (such as size and effort estimates, milestones, and characteristics) are provided to the Center repository at the end of major project milestones. These measurements include:

a. Planned and actual effort and cost.
b. Planned and actual schedule dates for major milestones.
c. Both planned and actual values for key cost parameters, which typically include software size, requirements count, defect counts for maintenance or sustaining engineering projects, and cost model inputs.
d. Project descriptors or metadata, which typically include software class, software domain/type, and requirements volatility.

Defined software measurements are used by Center management to make effective management decisions, and historical measurement data can also be used to improve aspects of future projects, such as cost estimation.
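To make the repository idea concrete, the sketch below shows one way a milestone measurement record and a simple historical cost estimate could be represented. The record layout, the field names, and the productivity-based estimate are illustrative assumptions for this handbook discussion, not a prescribed NASA schema or cost model.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class MeasurementRecord:
    """Illustrative Center repository entry for one project milestone (hypothetical layout)."""
    project: str
    milestone: str                      # e.g., "PDR", "CDR", "Delivery"
    software_class: str                 # project descriptor / metadata
    domain: str                         # e.g., "flight", "ground"
    requirements_volatility: float      # fraction of requirements changed
    planned_effort_hours: float
    actual_effort_hours: float
    planned_cost: float
    actual_cost: float
    planned_size_ksloc: float           # key cost parameter: software size
    actual_size_ksloc: float
    requirements_count: int
    defect_count: int                   # quality data
    milestone_dates: Dict[str, str] = field(default_factory=dict)  # planned/actual dates

def estimate_effort_hours(history: List[MeasurementRecord],
                          new_size_ksloc: float,
                          software_class: Optional[str] = None) -> float:
    """Very simple historical estimate: average hours per KSLOC across past records,
    optionally filtered by software class, scaled to the new project's size."""
    relevant = [r for r in history
                if software_class is None or r.software_class == software_class]
    if not relevant:
        raise ValueError("No historical records to base the estimate on")
    hours_per_ksloc = (sum(r.actual_effort_hours for r in relevant)
                       / sum(r.actual_size_ksloc for r in relevant))
    return hours_per_ksloc * new_size_ksloc

# Example: two completed Class C projects inform the estimate for a 12 KSLOC effort.
history = [
    MeasurementRecord("Proj-A", "Delivery", "C", "ground", 0.15,
                      5000, 5600, 1.0e6, 1.1e6, 10.0, 9.5, 120, 34),
    MeasurementRecord("Proj-B", "Delivery", "C", "ground", 0.10,
                      8000, 7400, 1.6e6, 1.5e6, 15.0, 14.0, 200, 51),
]
print(f"Estimated effort: {estimate_effort_hours(history, 12.0, 'C'):.0f} hours")
```

In practice a Center repository would live in a database or a measurement tool rather than in code; the point of the sketch is only that the listed data items (planned versus actual effort and cost, size, defects, and project metadata) are what make later analyses, such as this kind of estimate, possible.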
NASA software development activities in support of projects often require a balanced blend of software engineering development expertise and knowledge. If the software is contracted out, the development activities also require knowledge of NASA's acquisition practices and regulations. The OCE and the Centers have committed to support these objectives by providing sufficient funding for training. In some instances, funding for training may be provided by multiple organizations if the training is beneficial to the communities they represent.

NASA software assurance and software safety activities in support of projects often require a balanced blend of software engineering development expertise, software assurance expertise, software safety expertise, and knowledge. If the software is contracted out, these activities also require knowledge of NASA's acquisition practices and regulations. OSMA and the Centers have committed to support these objectives by providing sufficient funding for training. In some instances, funding for training may be provided by multiple organizations if the training is beneficial to the communities they represent.
13.01. Activity Overview and History
Institutional Requirements are grouped into several sub-activities.
History of Improvement Efforts
13.01.1 Related SWEs
13.01.2 Related Work Products
13.01.3 Related Topics
13.01.4 Related SPAN Links
13.02. Appraisals, Assessments, and Benchmarking
Frequency Of This Activity
13.02.1 Related SWEs
13.02.2 Related Work Products
13.02.2.1 Related Process Asset Templates
13.02.3 Related Topics
13.02.4 Related SPAN Links
13.03. Processes
Frequency Of This Activity
13.03.1 Related SWEs
13.03.2 Related Work Products
13.03.2.1 Related Process Asset Templates
13.03.3 Related Topics
13.03.4 Related SPAN Links
13.04. Measurement and Metrics
Frequency Of This Activity
13.04.1 Related SWEs
13.04.2 Related Work Products
13.04.2.1 Related Process Asset Templates
13.04.3 Related Topics
13.04.4 Related SPAN Links
13.05. Training
Frequency Of This Activity
13.05.1 Related SWEs
13.05.2 Related Work Products
13.05.2.1 Related Process Asset Templates
13.05.3 Related Topics
13.05.4 Related SPAN Links