Topic 7.1 traces the history of NASA software improvement efforts and briefly surveys recent software process improvement activities, providing background and impetus for the development of this electronic Handbook.
NASA's software development activities began with the earliest projects headed to space (Gemini, 1962). Typically, software activities were project specific, created with a particular project in mind; often, the spacecraft design dictated the size and shape of the computer. New software had to be developed for each project because software adaptation and reuse were essentially non-existent, owing to unique platforms, individual programming styles, and the scarcity of other existing software.
The early occurrence and recognition of software issues (software faults caused computer restarts during the Apollo 11 lunar landing in 1969), as well as the increasing cost of software development, encouraged NASA to address the software engineering approaches used in the Agency. In 1976, the first NASA Software Engineering Workshop was held to address these issues. In the late 1980s, NASA followed up these efforts with the kickoff of the Software Management and Assurance Program (SMAP). The relationship between these early software improvement efforts and several major NASA projects of this period is shown in figure 1, NASA Software Engineering Timeline, 1957-1997.
Figure 1. NASA Software Engineering Timeline, 1957-1997.
From these initial activities came the impetus for the NASA Software Engineering Initiative (NSEI) at the beginning of the new century. NSEI came into existence as one of three basic components of NASA's Engineering Excellence Initiative (EEI). (Systems Engineering and Project Engineering are the other two main components.) Results and related activities stemming from the NSEI are shown in figure 2, NASA Software Engineering Initiative Timeline, 2002-2010.
Figure 2. NASA Software Engineering Initiative Timeline, 2002-2010.
In 2004, the NASA Software Assurance Research Program (SARP) Software Research Infusion Initiative began. This initiative encourages the uptake of new research results within real NASA missions. A key measure of success is when a NASA Center adds the research product resulting from an initiative to its list of recommended practices. Table 1, NASA Research Infusion Initiative Completed Projects, lists early initiative results.
| Technology | Provider | Customer Sites and Target Application | Results |
| --- | --- | --- | --- |
| Software Manual Inspection Technique | Fraunhofer Maryland, SARP | GSFC (Spacecraft FSW); USA (ISS power analyzer) | Defects found in legacy code and that escaped previous inspections; Adopted |
| Software Cost Reduction (SCR) Requirements Analysis Tools | Naval Research Laboratory | ARC (ISS Payload) | Personnel trained, Reqts validated |
| Requirements Capture and Analysis | Safeware Engineering Corp. & MIT | JPL (Capture of Mission Design Rationale) | Personnel trained, MIT student hired |
| Process Improvement Methodology | | JPL (Ground SW) | SOA and project personnel trained |
| Reverse Engineering/Defect Detection | | JSC (ISS, Shuttle); IVVF (Spacecraft FSW) | Found defects that escaped previous inspections |
| C Global Surveyor (CGS), software defect detection tool | ARC – Intelligent Systems Program | ARC (ISS science payload); MSFC (ISS payload) | Found defects. Good feedback to provider |
| Software defect detection tool | | MSFC (ISS, Shuttle FSW) | Found defects that escaped testing; Will be adopted. |
Table 1. NASA Research Infusion Initiative Completed Projects
Also in 2004, the Office of the Chief Engineer (OCE) conducted the first annual software inventory and issued the initial version of NPR 7150.2. The Office of Safety and Mission Assurance (OSMA) completed updates to NASA-STD-8719.13, Software Safety Standard, and NASA-STD-8739.8, Software Assurance Standard.
Figure 2 highlights software training with callouts for the design of the software curriculum via a DACUM (Developing a Curriculum) analysis and for the first offering of the software engineering management class (SWE 301), which occurred in the fall of 2008. Center organizations began achieving Capability Maturity Model (CMM) and later Capability Maturity Model Integration (CMMI) maturity level ratings at the turn of the century. Table 2, Completed SW Engineering Appraisals (FY07-FY10), lists the status of Center CMMI ratings as of the start of FY11, as determined by SCAMPI (Standard CMMI Appraisal Method for Process Improvement) Class "A" evaluations.
| Center/Organization | Rating (SCAMPI A by Certified Appraiser) | Type of Software | Software Classes Assessed |
| --- | --- | --- | --- |
| | PP (CL3), CM (CL1) | Data Center Support | Class A, B, and C |
| | | Dev & Maintenance | Class A, B, and C |
| | ML2 + RSKM(2) | Dev & Maintenance | Class A, B, and C |
| | ML2 + CL3 | | Class B and C |
| | PP (CL3), REQM (CL3), CM (CL3), MA (CL3) | | Class B and C |
| | | | Class A, B, C, and D |
| | | | Class A, B, and C |
| ARC-ISD (Codes TI & QS) | ML2 + CL3 | | Class C and D |
| | | Development & Maintenance | Class B and C |
Table 2. Completed SW Engineering Appraisals from FY07 - FY10
Currently, the "CMMI for Development, Version 1.3" is the approved version for process improvement and rating activities.
During the development of NPR 7150.2, there was an intentional effort to minimize the size of the document by keeping it focused on the requirements. However, the 7150 team had developed a large amount of additional guidance material for the NPR, which they decided could be used more effectively in a NASA Handbook than in the NPR itself. Therefore, after the OCE released NPR 7150.2A, the effort to develop this electronic Handbook in wiki format was initiated.
NPR 7150.2 is written in "shall" statement format, with supplemental information in the form of accompanying notes and appendices. While these were included to help explain the meaning of each requirement, it was recognized that additional detail on the scope and applicability of each SWE would speed cultural change and the adoption of these software requirements. This Handbook, NASA-HDBK-2203, was established in wiki format to help achieve these goals.
Numerous other components and related tasks have been developed and executed under OCE sponsorship. Some of these include the development of a software engineering curriculum (see the discussion on the DACUM below), the development and tracking of the Top Ten Software Issues list, the inventory of the software development activities (see the Software Inventory Management System (SIMS) tool discussion below), and the implementation of the OCE surveys. Others include:
NASA Study on Flight Software Complexity.
NASA Assurance Process for Complex Electronics.
Development of the Software Community of Practice.
Development of the Software Electronic Handbook.
Center Processes for NPR 7150.2 implementation.
2. NASA Software Engineering Initiative
Software engineering is a core capability and a key enabling technology necessary for the support of NASA's Enterprises. Surveys and assessments identified and documented many software challenges within the Agency. Additionally, continuing exponential growth in the scope, complexity, and importance of software within NASA systems challenged the Agency's ability to manage it effectively. As a result, the NASA Headquarters OCE formed the NASA Software Engineering Initiative in 2002. In coordination with Center software engineering improvement activities, the OCE defined a NASA-wide comprehensive approach for improving software engineering to quantifiable maturity levels commensurate with mission criticality to meet the software challenges of NASA.
The following key principles guide NASA's software improvement activities:
Software engineering is a core capability and a key enabling technology for NASA's missions and supporting infrastructure.
NPR 7150.2 supports the implementation of NPD 7120.4, NASA Engineering and Program/Project Management Policy.
NPR 7150.2 provides a minimal set of requirements established by the Agency for software acquisition, development, maintenance, retirement, operations, and management.
NPR 7150.2 is intended to help NASA programs and projects accomplish their planned goals, e.g., mission success, safety, schedule, and budget, while satisfying their specified requirements.
2.1 NSEI Scope
This initiative covers software process improvement as well as related items: software research; software safety, reliability, and quality; attraction and retention of software engineers; and improvement of NASA's software engineering knowledge and skills. It applies to both mission-critical and non-mission-critical software.
What is NSEI?
A NASA-wide comprehensive approach for improving software engineering processes and technology.
Why are we doing NSEI?
To meet the challenges facing NASA in software engineering (schedule, cost, project commitments, ensuring the use of best practices...).
Who is deploying NSEI?
OCE, in collaboration with each Center.
NASA Software Working Group (NSWG).
Center Management Steering Groups (MSGs) and Software Engineering Process Groups (SEPGs).
What are the elements of the OCE's approach?
Component plans from each Center.
The use of the Software Engineering Institute's Capability Maturity Model Integration as a benchmark for assessments.
Increasing the availability of software engineering tools.
The integration of sound software engineering principles and standards.
The enhancement of software engineers' knowledge and skills through training based on an agreed-upon curriculum.
Development of a Software Engineering Electronic Handbook.
Benchmarking of industry, government agencies, and academic institutions to learn about and share best practices.
Consolidation of Agency processes and practices.
Benchmarking and consolidation of best practices associated with cost estimation.
Small project implementation of requirements.
2.2 NSEI Elements
The NASA Software Engineering Initiative Implementation Plan (NSEIIP) requires NASA to:
Implement a continuous software process and product improvement program across NASA and its contract community.
Improve safety, reliability, and quality of software through the establishment and integration of sound software engineering principles and standards.
Improve NASA's software engineering practices through research.
Improve software engineers' knowledge and skills and attract and retain software engineers.
The baseline NSEIIP was approved in 2002 (see SWE-002). As part of the NSEIIP, each Center developed its own improvement plan for advancing the in-house software engineering capability (see SWE-003).
Center plans: The OCE allowed the individual Centers some latitude in implementing the NSEI to take into account each Center's mission. While the implementation varies between research and flight Centers, certain basic features are common to all Centers.
To address the key NSEI components, each Center uses one or more of the following to meet its objectives for the initiative:
Center SEPG – This group is the primary organization responsible for developing software process improvements and for creating, measuring, and interpreting software metrics.
MSGs – These groups provide the basic guidance for software engineering practices and applications at the Centers.
Software Policy – The key governing document that provides NASA policy for software.
Training, CMMI appraisals, and career development.
Software technology infusion.
Engineering, assurance, and safety collaboration.
Process assessment: NASA chose to appraise its software engineering improvement activities against a common industry process framework. Initially, it selected the Software Engineering Institute's (Carnegie Mellon University) process model, the CMM. This was succeeded by the CMMI for Development, Version 1.3.
Today (see table 2 above), NASA Center software engineering process activities are appraised (see SWE-032) for Maturity Level 2 and Level 3 capabilities so that Centers are properly credentialed to develop Class A and Class B software systems (see SWE-020). The OCE surveys assess the results of the Center CMMI appraisals and evaluate related improvements to Center processes.
Tools: A key part of the NSEIIP effort to increase software developer skills and knowledge is the development of a set of tools usable in software development. Center and Agency repositories catalogue many of the tools used around the Agency. In addition, several tools have been developed to assist in achieving the above implementation elements:
The NASA Engineering Network (NEN), a repository for resources, tools, and useful documentation.
The Process Asset Library (PAL) provides for viewing of the Agency-level process assets. This library may contain information in many forms, including but not limited to processes, templates, web links, design principles, books, periodicals, presentations, tools, examples of documents, and conference descriptions.
NASA utilizes the SIMS tool to develop and maintain an inventory of software (see SWE-006) across the Agency for the purpose of facilitating strategic decisions with actual data. The analysis of the inventory results also directly supports NASA's Chief Safety and Mission Assurance Officer and the IV&V Board of Advisors (IBA) in the selection of projects to receive IV&V services. (See SWE-131.)
Each SWE entry in this Handbook may include references to tools that have specific or related applicability to the SWE statement. The Resources section within each SWE description includes information as shown in the example panel below. The Handbook format allows users to suggest additional tools, as indicated in the second paragraph in the example panel. (Editor's note: the Tools sub-section in the Resources section normally is numbered as '5.1', which is shown in the panel.)
Metrics: Some basic measures have been developed to assess the level and degree of improvement in software process activities across the Agency. Several activities and sources of measurement are used to form these metrics. The OCE conducts its assessments of Center progress by synthesizing and analyzing the information it gains through Center surveys. (See the OCE Software Survey Instructions and Templates for details on the content and frequency required for these events.) The OCE also authorizes appraisals against selected requirements of NPR 7150.2 (see SWE-129). In addition, the results of the CMMI Maturity Level 2 and Maturity Level 3 appraisals are collected and reported in a non-attributed manner to indicate, from both a Center and an Agency perspective, the process areas that are well developed and those that need further training and development assistance.
The PAL also includes support materials for projects to set up software measurement collection, as well as templates to analyze metrics to determine accurately the status of software development.
Process integration: As the NASA Software Engineering Initiative Implementation (NSEII) achieves results, it is important to collect and document these software process improvements. The integration of these achievements allows for the distribution of best-in-class processes around the Agency. This Agency-wide distribution encourages uniform process efforts, which, in turn, improves the efficiency of software development activities across NASA's programs and projects. One means of achieving this integration and dissemination is through annual workshops. NASA-sponsored events such as the NASA Project Management (PM) Challenge from the OCE and the NASA Workshop on Validation and Verification from the NASA IV&V Facility serve this purpose. The NASA PM Challenge is an annual seminar designed to examine current program/project management trends, as well as to provide a forum for knowledge sharing and the exchange of lessons learned. The IV&V Workshop offers an understanding of the challenges that IV&V organizations face in assuring that systems software operates safely and reliably.
The use of the Agency PAL (see SWE-098) is another mechanism being employed for the distribution of the best-in-class processes around the Agency. This repository allows the efficient collection and storage of useful and best-in-class documents that are easily available through the NEN. The Headquarters, Center, and Facility representatives to the NSWG currently manage population of the PAL.
Training: One additional step in the NSEII approach is the development of better-informed software engineers through improved and readily accessible training opportunities. The NSEII recognizes that both software discipline skills and other discipline abilities are necessary for the development of quality software (see SWE-017). Headquarters and Center training organizations provide the appropriate and prioritized training events (see SWE-101 and SWE-107). The OCE and the NSWG have sponsored the development of a NASA Software Engineering DACUM to ensure an Agency perspective on its software training needs. OSMA and the NASA Safety Center have developed the SMA Technical Excellence Program (STEP), which includes extensive complementary training in software assurance. The DACUM on the NEN captures the curriculum that has been developed for software engineering. The curriculum is contained in the Software Engineering Technical Excellence Training (SWEET) program, which provides courses that a software engineer can take to learn and enhance software engineering knowledge throughout a career. The NASA Academy of Program/Project & Engineering Leadership (APPEL) Training Master Schedule and Registration web site lists many of the more general training opportunities. The System for Administration, Training, and Educational Resources (SATERN) on-line training system provides employee, group, and individual learning opportunities for Center training initiatives.
Ensuring the quality, safety, and reliability of NASA software is of paramount importance in achieving mission success for the Agency's programs and projects. The NASA Software Process Improvement Initiative brings together an integrated spectrum of software engineering professionals, researchers, trained practitioners, improved processes, ratings and appraisal systems, accredited tools, and numerous engineering productivity tools to promote software improvement and overall excellence.
5. Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
Independent Verification and Validation of Embedded Software, Lesson Number 0723: Besides recognizing the need for extensive project retesting (see the discussion of computer restarts in the introduction above), this lesson learned indicates the need for independent verification of performance results in light of earlier problems: failure to perform IV&V for software projects could result in software system weaknesses, performance of unintended functions, and failure of the system and the mission. Anything less than a methodical, systematic, rigorous treatment of IV&V could cause loss of mission, life, and valuable resources.