
7.1 - History and Overview of the Software Process Improvement (SPI) Effort

1. Purpose

Topic 7.1 addresses the history of NASA software improvement efforts and provides a brief overview of recent software process improvement activities, giving background and impetus for the development of this electronic Handbook.

1.1 Introduction

NASA's software development activities began with the earliest projects headed to space (Gemini, 1962 408). Usually, the software activities were project specific and were created with the particular project in mind. Often, the spacecraft design dictated the size and shape of the computer. New software development was required for each project because software adaptation and reuse were essentially non-existent, a consequence of unique platforms, individual programming styles, and the scarcity of existing software.

The early occurrence and recognition of software issues (software faults caused computer restarts during the Apollo 11 lunar landing in 1969 409), as well as the increasing costs of software development, encouraged NASA to address the software engineering approaches used in the Agency. In 1976, the first NASA Software Engineering Workshop was held to address these issues. In the late 1980s, NASA followed up these efforts with the kickoff of the Software Management and Assurance Program (SMAP). The relationship between these early software improvement efforts and several major NASA projects of this time period is shown in Figure 1, NASA Software Engineering Timeline, 1957-1997.

Figure 1. NASA Software Engineering Timeline, 1957-1997.

From these initial activities came the impetus for the NASA Software Engineering Initiative (NSEI) at the beginning of the new century. NSEI came into existence as one of three basic components of NASA's Engineering Excellence Initiative (EEI). (Systems Engineering and Project Engineering are the other two main components.) Results and related activities stemming from the NSEI are shown in Figure 2, NASA Software Engineering Initiative Timeline, 2002-2010.

Figure 2. NASA Software Engineering Initiative Timeline, 2002-2010.

In 2004, the NASA Software Assurance Research Program (SARP) Software Research Infusion Initiative began. 204 This initiative encourages the uptake of new research results within real NASA missions. A key success is realized when a NASA Center adds the research product resulting from an initiative to its list of recommended practices. Table 1, NASA Research Infusion Initiative Completed Projects, lists early initiative results.

Technology | Technology Provider | Technology Description | Customer Sites and Target Application | Outcome/Benefits
Perspective-based Inspections | Fraunhofer Maryland, SARP | Software manual inspection technique | GSFC (Spacecraft FSW); USA (ISS power analyzer) | Defects found in legacy code that escaped previous inspections; adopted
Software Cost Reduction (SCR) | Naval Research Laboratory | Requirements analysis tools | ARC (ISS payload) | Personnel trained; requirements validated
SpecTRM | Safeware Engineering Corp. & MIT | Requirements capture and analysis | JPL (capture of mission design rationale) | Personnel trained; MIT student hired
Orthogonal Defect Classification | JPL, SARP | Process improvement methodology | JPL (ground SW) | SQA and project personnel trained
CodeSurfer/CodeSonar | GrammaTech, Inc. | Reverse engineering/defect detection | JSC (ISS, Shuttle); IV&V Facility (Spacecraft FSW) | Found defects that escaped previous inspections
C Global Surveyor (CGS) | ARC – Intelligent Systems Program | Software defect detection tool | ARC (ISS science payload); MSFC (ISS payload) | Found defects; good feedback to provider
Coverity SWAT/Prevent | Coverity, Inc. | Software defect detection tool | MSFC (ISS, Shuttle FSW) | Found defects that escaped testing; will be adopted

Table 1. NASA Research Infusion Initiative Completed Projects

Also in 2004, the Office of the Chief Engineer (OCE) conducted the first annual software inventory and issued the initial version of NPR 7150.2. The Office of Safety and Mission Assurance (OSMA) completed the updates to NASA-STD-8719.13, Software Safety Standard, 271 and NASA-STD-8739.8, Software Assurance Standard. 278

Figure 2 highlights software training with callouts for the design of the software curriculum (DACUM) and the first offering of the software engineering management class (SWE 301), 389 which occurred in the fall of 2008. Center organizations began achieving Capability Maturity Model (CMM) and later Capability Maturity Model Integration (CMMI) maturity level ratings at the turn of the century. Table 2, Completed SW Engineering Appraisals (FY07-FY10), lists the current status of Center CMMI ratings as of the start of FY11, as determined by a SCAMPI A evaluation.

Center/Organization | Rating (SCAMPI A by Certified Appraiser) | Date | # Projects | Type | Organizational Size | Software Classes Assessed
LaRC-ASDC | PP (CL3), CM (CL1) | Nov-2006 | 1 | Data Center Support | 85 | Class C
MSFC | ML3 | Apr-2007 | 3 | Development | 63 | Class A, B, and C
JPL | ML3 | Sep-2007 | 7 | Dev & Maintenance | 1000 | Class A, B, and C
GSFC | ML2 + RSKM(2) | May-2008 | 4 | Dev & Maintenance | 600 | Class A, B, and C
LaRC+FSSB | ML2 + CL3 | Oct-2008 | 3 | Services | 5 | Class B and C
LaRC+SDAB | PP (CL3), REQM (CL3), CM (CL3), MA (CL3) | Mar-2009 | 4 | Development | 21 | Class B and C
JSC | ML2 | Apr-2009 | 4 | Development | 90 | Class A, B, C, and D
KSC | ML2 | Sep-2009 | 1 | Development | 225 | Class A, B, and C
MSFC-SIL | ML2 + CL3 | May-2010 | 1 | Development | 50 | Class C
ARC-ISD (Codes TI & QS) | ML2 | May-2010 | 6 | Development | 50 | Class C
GRC-Flt SW | ML2 | Aug-2010 | 2 | Development | 22 | Class C and D
MSFC-Flt SW | ML3 | Aug-2010 | 1 | Development | 75 | Class A
JPL-Mission SW | ML3 | Sep-2010 | 9 | Development & Maintenance | 950 | Class B and C

Table 2. Completed SW Engineering Appraisals, FY07-FY10

Currently, the "CMMI for Development, Version 1.3" 157 is the approved version for process improvement and rating activities.

During the development of NPR 7150.2, there was an intentional effort to minimize the size of the document by keeping it focused on the requirements. However, the 7150 team had developed a large amount of additional guidance material for the NPR, which the team decided could be used more effectively in a NASA Handbook than in the NPR itself. Therefore, after the OCE released NPR 7150.2A, 039 the effort to develop this electronic Handbook in wiki format was initiated.

NPR 7150.2 is written in "shall" statement format with supplemental information in the form of accompanying notes and appendices. While these were included to help explain the meaning of each requirement, it was recognized that additional detail on the scope and applicability of each SWE would increase the speed of cultural change and the adoption of these software requirements. This Handbook, NASA-HDBK-2203, was established in wiki format to help achieve these goals.

Numerous other components and related tasks have been developed and executed under OCE sponsorship. Some of these include the development of a software engineering curriculum (see the discussion on the DACUM below), the development and tracking of the Top Ten Software Issues list, the inventory of the software development activities (see the Software Inventory Management System (SIMS) tool discussion below), and the implementation of the OCE surveys. Others include:

  • NASA Study on Flight Software Complexity. 040
  • NASA Assurance Process for Complex Electronics. 255
  • Development of the Software Community of Practice.
  • Development of the Software Electronic Handbook.
  • Center Processes for NPR 7150.2 implementation.

2. NASA Software Engineering Initiative

Software engineering is a core capability and a key enabling technology necessary for the support of NASA's Enterprises. Surveys and assessments identified and documented many software challenges within the Agency. Additionally, continuing exponential growth in the scope, complexity, and importance of software within NASA systems challenged the Agency's ability to manage it effectively. As a result, the NASA Headquarters OCE formed the NASA Software Engineering Initiative 205 in 2002. In coordination with Center software engineering improvement activities, the OCE defined a NASA-wide comprehensive approach for improving software engineering to quantifiable maturity levels commensurate with mission criticality to meet the software challenges of NASA.

The following key principles guide NASA's software improvement activities:

  • Software engineering is a core capability and a key enabling technology for NASA's missions and supporting infrastructure.
  • NPR 7150.2 039 supports the implementation of NPD 7120.4, NASA Engineering and Program/Project Management Policy. 057
  • NPR 7150.2 provides a minimal set of requirements established by the Agency for software acquisition, development, maintenance, retirement, operations, and management.
  • NPR 7150.2 is intended to help NASA programs and projects accomplish their planned goals, e.g., mission success, safety, schedule, and budget, while satisfying their specified requirements.

2.1 NSEI Scope

This initiative covers software process improvement as well as items related to software research; software safety, reliability, and quality; attraction and retention of software engineers; and improvement of NASA's software engineering knowledge and skills. It applies to both mission-critical and non-mission-critical software.

  • What is NSEI?
    • A NASA-wide comprehensive approach for improving software engineering processes and technology.
  • Why are we doing NSEI?
    • To meet the challenges facing NASA in software engineering (schedule, cost, project commitments, ensuring the use of best practices...).
  • Who is deploying NSEI?
    • OCE, in collaboration with each Center.
    • NASA Software Working Group (NSWG).
    • Center Management Steering Groups (MSGs) and Software Engineering Process Groups (SEPGs).
  • What are the elements of the OCE's approach?
    • Component plans from each Center.
    • The use of the Software Engineering Institute's Capability Maturity Model Integration as a benchmark for assessments.
    • Increasing the availability of software engineering tools.
    • Software metrics.
    • The integration of sound software engineering principles and standards.
    • The enhancement of software engineers' knowledge and skills through training based on an agreed-upon curriculum.
    • Development of a Software Engineering Electronic Handbook.
    • Benchmarking of industry, government agencies, and academic institutions to learn about and share best practices.
    • Consolidation of Agency processes and practices.
    • Benchmarking and consolidation of best practices associated with cost estimation.
    • Small project implementation of requirements.

2.2 NSEI Elements

The NASA Software Engineering Initiative Implementation Plan (NSEIIP) 038 requires NASA to:

  • Implement a continuous software process and product improvement program across NASA and its contract community.
  • Improve safety, reliability, and quality of software through the establishment and integration of sound software engineering principles and standards.
  • Improve NASA's software engineering practices through research.
  • Improve software engineers' knowledge and skills and attract and retain software engineers.

The baseline NSEIIP was approved in 2002 (see [SWE-002]). As part of the NSEIIP, each Center developed its own improvement plan for advancing the in-house software engineering capability (see [SWE-003]).

Center plans: The OCE allowed the individual Centers some latitude in implementing the NSEI to account for each Center's mission. While implementation varies between research and flight Centers, certain basic features are common to all Centers.

To address the key NSEI components, each Center uses one or more of the following to meet its objectives for the initiative:

  • Center SEPG – This group is the primary organization responsible for developing software process improvements and for creating, measuring, and interpreting software metrics.
  • MSGs – These groups provide the basic guidance for software engineering practices and applications at the Centers.
  • Software Policy – The key governing document 257 provides NASA policy for software.
  • Training, CMMI appraisals, and career development.
  • Software technology infusion.
  • Engineering, assurance, and safety collaboration.

Process assessment: NASA chose to appraise its software engineering improvement activities against a common industry process framework. Initially, it selected the Software Engineering Institute's (Carnegie Mellon University) process model, the CMM. This was succeeded by the CMMI for Development, Version 1.3. 157 Today (see Table 2 above), NASA Center software engineering process activities are appraised (see [SWE-032]) for Maturity Level 2 and Level 3 capabilities so that Centers are properly credentialed to develop Class A and Class B software systems (see [SWE-020]). The OCE surveys assess the results of the Center CMMI appraisals and evaluate related improvements to Center processes.
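
To make the credentialing check concrete, the sketch below (Python, illustrative only) tests an organization's appraised maturity level against a minimum level per software class. The class-to-level mapping in REQUIRED_MATURITY is an assumption for demonstration drawn from the sentence above, not a statement of the actual NPR 7150.2 or SWE-032 requirement.

    # Illustrative sketch only. The class-to-maturity mapping below is an
    # assumption for demonstration, not the NPR 7150.2 / SWE-032 rule.
    REQUIRED_MATURITY = {
        "A": 3,  # assumed: Class A development requires Maturity Level 3
        "B": 2,  # assumed: Class B development requires Maturity Level 2
    }

    def is_credentialed(software_class: str, appraised_level: int) -> bool:
        """Return True if an appraised CMMI maturity level meets the
        (assumed) minimum for the given NPR 7150.2 software class."""
        required = REQUIRED_MATURITY.get(software_class.upper())
        if required is None:
            return True  # no maturity-level floor assumed for other classes
        return appraised_level >= required

    # Example: an ML3 organization (cf. MSFC-Flt SW in Table 2), Class A software
    print(is_credentialed("A", 3))  # True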

Tools: A key part of the NSEIIP effort to increase software developer skills and knowledge is the development of a set of tools usable in software development. Center and Agency repositories catalogue many of the tools used around the Agency. In addition, several tools have been developed to assist in achieving the above implementation elements:

  • NASA Software Process Asset Library (PAL) 066 (see [SWE-098]).
  • NASA Software Inventory Management System (SIMS), 330 a web-based tool sponsored by the OCE (see [SWE-006]).
  • Classification Tool and Safety-Critical Assessment Tool (see [Topic 7.2 - Classification Tool and Safety-Critical Assessment Tool]).
  • NASA Engineering Network (NEN), 258 a repository for resources, tools, and useful documentation.

The PAL provides for viewing of the Agency-level process assets. This library may contain information in many forms, including but not limited to processes, templates, web links, design principles, books, periodicals, presentations, tools, examples of documents, and conference descriptions.

NASA utilizes the SIMS tool to develop and maintain an inventory of software across the Agency (see [SWE-006]) for the purpose of facilitating strategic decisions with actual data. The analysis of the inventory results also directly supports NASA's Chief of Safety and Mission Assurance and the IV&V Board of Advisors (IBA) in the selection of projects to receive IV&V services (see [SWE-131]).
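
As an illustration of the kind of record such an inventory might hold and how it could support IV&V project selection, consider the minimal sketch below (Python). All field names and the selection criterion are hypothetical; this is not the actual SIMS schema or the IBA's process.

    from dataclasses import dataclass

    @dataclass
    class InventoryRecord:
        # Hypothetical fields; not the actual SIMS schema.
        project: str
        center: str
        software_class: str   # NPR 7150.2 software class, e.g., "A"
        safety_critical: bool

    def ivv_candidates(records):
        """Illustrative filter: flag safety-critical Class A/B entries as
        candidates for IV&V consideration (criterion assumed, not NASA's)."""
        return [r for r in records
                if r.safety_critical and r.software_class in ("A", "B")]

    inventory = [InventoryRecord("Flight SW X", "Center-1", "A", True),
                 InventoryRecord("Ground Tool Y", "Center-2", "C", False)]
    print([r.project for r in ivv_candidates(inventory)])  # ['Flight SW X']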

Each SWE entry in this Handbook may include references to tools that have specific or related applicability to the SWE statement. The Resources section within each SWE description includes information as shown in the example panel below. The Handbook format allows users to suggest additional tools, as indicated in the second paragraph of the example panel. (Editor's note: the Tools sub-section in the Resources section is normally numbered '5.1', as shown in the panel.)

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

Metrics: Some basic measures are developed to assess the level and degree of improvement in software process activities across the Agency. Several activities and sources of measurement are used to form these metrics. The OCE conducts its assessments of Center progress by synthesizing and analyzing the information it gains through Center surveys. (See the OCE Software Survey Instructions and Templates 290 for details on the content and frequency required for these events.) The OCE also authorizes appraisals against selected requirements of NPR 7150.2 (see [SWE-129]). In addition, the results of the CMMI Maturity Level 2 and Maturity Level 3 appraisals are collected and reported in a non-attributed manner to indicate, from both a Center and an Agency perspective, the process areas that are well developed and those that need further training and development assistance.
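
A minimal sketch of non-attributed reporting follows (Python), assuming hypothetical appraisal findings of the form (center, process area, satisfied): the aggregation drops Center identities and reports only Agency-level counts per process area. Names and data are illustrative.

    from collections import Counter

    # Hypothetical findings; Center names are dropped during aggregation
    # so the resulting report is non-attributed.
    findings = [
        ("Center-1", "REQM", True),
        ("Center-1", "CM", False),
        ("Center-2", "REQM", True),
        ("Center-2", "CM", True),
    ]

    total = Counter(area for _, area, _ in findings)
    satisfied = Counter(area for _, area, ok in findings if ok)

    for area in sorted(total):
        print(f"{area}: {satisfied[area]} of {total[area]} appraisals satisfied")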

The PAL also includes support materials for projects to set up software measurement collection, as well as templates to analyze metrics to determine accurately the status of software development.

Process integration: As the NSEII achieves results, it is important to collect and document these software process improvements. The integration of these achievements allows for the distribution of best-in-class processes around the Agency. This Agency-wide distribution encourages the development of uniform process efforts, which, in turn, improve the efficiency of software development activities across NASA's programs and projects. One means of achieving this integration and dissemination is annual workshops. NASA-sponsored events such as the NASA Project Management (PM) Challenge 410 from the OCE and the NASA Workshop on Validation and Verification 411 from the NASA IV&V Facility serve this purpose. The NASA PM Challenge is an annual seminar designed to examine current program/project management trends, as well as to provide a forum for knowledge sharing and the exchange of lessons learned. The IV&V Workshop offers an understanding of the challenges that IV&V organizations face in assuring that systems software operates safely and reliably.

The use of the Agency PAL (see [SWE-098]) is another mechanism being employed for the distribution of the best-in-class processes around the Agency. This repository allows the efficient collection and storage of useful and best-in-class documents that are easily available through the NEN. The Headquarters, Center, and Facility representatives to the NSWG currently manage population of the PAL.

Training: One additional step in the NSEII approach is the development of better-informed software engineers through improved and readily accessible training opportunities. The NSEII recognizes that both software discipline and other discipline abilities are necessary for the development of quality software (see [SWE-017]). Headquarters and Center training organizations provide the appropriate and prioritized training events (see [SWE-101] and [SWE-107]). The OCE and the NSWG have sponsored the development of a NASA Software Engineering DACUM to assure an Agency perspective for its software training needs. OSMA and the NASA Safety Center have developed the SMA Technical Excellence Program (STEP), 294 which includes extensive complementary training in software assurance. The DACUM 289 on the NEN captures the curriculum that has been developed for software engineering. The curriculum is contained in the Software Engineering Technical Excellence Training (SWEET) program, which will provide courses that a software engineer can take to both learn and enhance knowledge in software engineering throughout a career. The NASA Academy of Program/Project & Engineering Leadership (APPEL) Training Master Schedule and Registration 265 web site lists many of the more general training opportunities. The System for Administration, Training, and Educational Resources (SATERN) on-line training system provides employee, group, and individual learning opportunities for Center training initiatives.


3. Conclusion

Ensuring the quality, safety, and reliability of NASA software is of paramount importance in achieving mission success for the Agency's programs and projects. The NASA Software Process Improvement Initiative brings together an integrated spectrum of software engineering professionals, researchers, trained practitioners, improved processes, ratings and appraisal systems, accredited tools, and numerous engineering productivity tools to promote software improvement and overall excellence.


4. Resources

4.1 Tools

Tools to aid in compliance with this Topic, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.


5. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following: 

Independent Verification and Validation of Embedded Software, Lesson Number 0723: Besides recognizing the need for extensive project retesting (see the Introduction section on computer restarts 409), this lesson indicates the need for independent verification of performance results because of earlier problems, e.g., failure to perform IV&V for software projects could result in software system weaknesses, performance of unintentional functions, and failure of the system and the mission. Anything less than a methodical, systematic, rigorous treatment of IV&V could cause loss of mission, life, and valuable resources. 518
