This version of the SWEHB is associated with NPR 7150.2B. The latest version of the SWEHB is based on NPR 7150.2D.

7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009

1. Purpose

This topic discusses the relationship between the requirements and associated processes of NPR 7150.2, NASA Software Engineering Requirements 309, and the content of NASA-STD-7009 460, Standard for Models and Simulations 272. Because NASA-STD-7009 is generally applicable to all types of models and simulations (which are most commonly embodied in software), it is important to understand its relevance to NPR 7150.2 and the precedence to be maintained between the two documents. Software developers can use this topic to understand the relevance and applicability of NASA-STD-7009 when developing software under NPR 7150.2.

1.1 Introduction

1.1.1 Relevant Guidance in NPR 7150.2

As discussed elsewhere, the applicability of NPR 7150.2 requirements to the software being developed is determined by the contents of its Appendix C, Requirements Mapping Matrix. Requirement SWE-139 implements the contents of Appendix C. NASA-STD-7009, in turn, describes how models and simulations are to be developed and verified within the bounds of the governing SWE requirements of NPR 7150.2.

NPR 7150.2 references the content of NASA-STD-7009 via requirement SWE-070, presented in its section 4.5.7 with the accompanying Note:

"4.5.7 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment." (See SWE-070)

Note: Information regarding specific verification and validation techniques and the analysis of models and simulations can be found in the NASA standard NASA-STD-7009 460.

1.1.2 Applicability and Scope of NASA-STD-7009

The applicability and scope are stated in section 1.2 of NASA-STD-7009, where the acronym "M&S" denotes "models and simulations":

NASA-STD-7009 “applies to M&S used by NASA and its contractors for critical decisions in design, development, manufacturing, ground operations, and flight operations. (Guidance for determining which particular M&S are in scope is provided in [NASA-STD-7009] section 4.1 and Appendix A.) [NASA-STD-7009] also applies to use of legacy as well as commercial-off-the-shelf (COTS), government-off-the-shelf (GOTS), and modified-off-the-shelf (MOTS) M&S to support critical decisions. Generally, for such M&S, particular attention may need to be paid to defining the limits of operation and to verification, validation, and uncertainty quantification. Programs and projects are encouraged to apply this standard to M&S, if the M&S results may impact future critical decisions.

[NASA-STD-7009] does not apply to M&S that are embedded in control software, emulation software, and stimulation environments. However, Center implementation plans for NPR 7150.2, NASA Software Engineering Requirements ...cover embedded M&S, and address such M&S-specific issues as numerical accuracy, uncertainty analysis, sensitivity analysis, M&S verification, and M&S validation.” 272

2. Implications for Models and Simulations

2.1 Implications for Embedded Models and Simulations

As noted in the previous section, embedded models and simulations are not in the scope of NASA-STD-7009 272. However, the note in NPR 7150.2 039 implies that Requirements 4.4.1-4.4.6 of NASA-STD-7009 should nevertheless be addressed for such models and simulations. Each of these requirements begins with the phrase "Shall document." None of them requires that any specific activity be performed other than the relevant documentation. As the Standard states in section 4.1: "Whatever was done is to be documented, and if nothing was done a clear statement to that effect is to be documented."

2.2 Implications for Other Models and Simulations

For all other models and simulations that are deemed by the M&S Risk Assessment to be in scope of NASA-STD-7009, there is the need to ensure that the requirements of both documents are satisfied. From the perspective of NASA-STD-7009, some of the requirements in NPR-7150.2 are not related to M&S, some are supplemental to requirements in NASA-STD-7009, and others are subsets of requirements in NASA-STD-7009.

Table 1 indicates the specific relationship of each requirement in NPR 7150.2 to NASA-STD-7009. The key to the remarks in the fourth column is:

  • Not Related: The NPR requirement is not germane to M&S per se. It may, for example, be a requirement on the NASA Centers (cf. SWE-003: "Center Directors, or designees, shall maintain, staff, and implement a plan to continually advance the Center's in-house software engineering capability...").
  • Supplemental: The NPR requirement is relevant to software for M&S, but it is over and above any requirement in NASA-STD-7009. For example, SWE-016 ("The project manager shall document and maintain a software schedule...") has no counterpart in NASA-STD-7009, whose Requirement 4.1.4 requires only that there be a plan (for the M&S as a whole, not just the software) and does not require that this plan include a schedule.
  • Subset of Requirement X: The NPR requirement is only a part of a requirement in NASA-STD-7009. For example, Requirement 4.1.4 requires that there be a plan for the M&S that includes such software aspects as verification and configuration management, but it also levies M&S-specific requirements that have no counterpart in the NPR.
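The relationship categories above amount to a lookup keyed by SWE requirement number. The sketch below is illustrative only: it encodes a handful of rows from Table 1 as a Python dictionary, and the `RELATIONSHIPS` name and `relationship` helper are hypothetical conveniences, not part of either document.

```python
# Illustrative sketch: a few rows of Table 1 (NPR 7150.2B vs. NASA-STD-7009),
# keyed by SWE requirement number. Values are the fourth-column remarks.
RELATIONSHIPS = {
    13: "Subset of Rqmt. 4.1.4",            # Software plans
    16: "Supplemental",                      # Software schedule
    30: "Subset of Rqmts. 4.4.1 - 4.4.3",    # Verification results
    70: "Supplemental",                      # Models, simulations, tools
    141: "Not Related",                      # Software IV&V
}

def relationship(swe_number: int) -> str:
    """Return the NASA-STD-7009 relationship remark for a SWE requirement."""
    return RELATIONSHIPS.get(swe_number, "Not listed in Table 1")

print(relationship(13))  # Subset of Rqmt. 4.1.4
```

A project could extend such a table to the full Table 1 to automate compliance-matrix cross-checks between the two documents.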

3. Table 1. NPR 7150.2B vs. NASA-STD-7009

Section of NPR 7150.2B 039 | Requirement Descriptor | SWE # | NASA-STD-7009 460 Relationship
Preface | | |
Roles and Responsibilities | SW engineering initiative | 2 | Not Related
 | Center improvement plans | 3 | Not Related
 | OCE (Office of the Chief Engineer) Benchmark | 4 | Supplemental
 | Software processes | 5 | Not Related
 | Agency software inventory | 6 | Supplemental
SW Life Cycle Planning | Software plans | 13 | Subset of Rqmt. 4.1.4
SW Cost Estimation | Cost estimation | 15 | Supplemental
SW Schedules | Software schedule | 16 | Supplemental
SW Project Specific Training | Project and software training | 17 | Supplemental to Rqmt. 4.6.2(a)
SW Schedules | Software activities reviews | 18 | Supplemental
 | Software life cycle | 19 | Supplemental
SW Classification and Planning Assessments | Software classification | 20 | Supplemental
 | Transition to a higher class | 21 | Supplemental
SW Assurance and SW IV&V | Software assurance | 22 | Supplemental
Safety Critical SW | Software safety | 23 | Supplemental
SW Life-Cycle Planning | Plan tracking | 24 | Supplemental
Use of Commercial, Government, Legacy, Heritage, and Modified OTS (Off the Shelf) SW | Use of commercial, government, legacy software | 27 | Supplemental
SW Verification & Validation | Verification planning | 28 | Subset of Rqmt. 4.1.3(e)
 | Validation planning | 29 | Subset of Rqmt. 4.1.3(e)
 | Verification results | 30 | Subset of Rqmts. 4.4.1 – 4.4.3
 | Validation results | 31 | Subset of Rqmts. 4.4.4 – 4.4.6
SW Development Processes | CMMI (Capability Maturity Model Integration) levels for class A, B and C software | 32 | Supplemental
SW Acquisition | Acquisition vs. development assessment | 33 | Supplemental
 | Acceptance criteria | 34 | Subset of Rqmt. 4.1.3(a)
 | Supplier selection | 35 | Subset of Rqmt. 4.1.4
 | Software process determination | 36 | Supplemental
 | Software milestones | 37 | Supplemental
 | Acquisition planning | 38 | Supplemental
 | Software supplier insight | 39 | Supplemental
 | Access to software products | 40 | Supplemental
Open Source | Open source software notification | 41 | Supplemental
SW Acquisition | Source code electronic access | 42 | Supplemental
SW Monitoring | Track change request | 43 | Supplemental
 | Project participation in audits | 45 | Supplemental
 | Supplier software schedule | 46 | Supplemental
 | Traceability data | 47 | Supplemental
 | Software requirements | 50 | Supplemental
 | Software requirements analysis | 51 | Supplemental
 | Bi-directional traceability between higher level requirements & SW requirements | 52 | Supplemental
 | Manage requirements change | 53 | Supplemental
 | Corrective action for inconsistencies | 54 | Supplemental
 | Requirements validation | 55 | Supplemental
Software Design | Document design | 56 | Supplemental
Software Architecture | Software architecture | 57 | Supplemental
Software Design | Detailed design | 58 | Supplemental
 | Bidirectional traceability between SW requirements and SW design | 59 | Supplemental
Software Implementation | Coding software | 60 | Supplemental
 | Coding standards | 61 | Supplemental
 | Unit test | 62 | Supplemental
 | Release version description | 63 | Subset of Rqmt. 4.2.9
 | Bidirectional traceability between SW design and SW code | 64 | Supplemental
Software Testing | Test Plan, procedures, reports | 65 | Supplemental
 | Perform testing | 66 | Supplemental
 | Verify implementation | 67 | Supplemental
 | Evaluate test results | 68 | Subset of Rqmts. 4.4.1 – 4.4.6
 | Document defects & track | 69 | Supplemental
 | Models, simulations, tools | 70 | Supplemental
 | Update plans & procedures | 71 | Supplemental
 | Bidirectional traceability between SW test procedures and SW requirements | 72 | Supplemental
 | Platform or hi-fidelity simulations | 73 | Related to M&S in Scope
Software Operations, Maintenance, and Retirement | Plan operations, maintenance & retirement | 75 | Subset of Rqmt. 4.1.4
 | Deliver software products | 77 | Supplemental
Software Configuration Management | Develop configuration management plan | 79 | Supplemental
 | Track & evaluate changes | 80 | Supplemental
 | Identify software configuration management items | 81 | Supplemental
 | Authorizing changes | 82 | Supplemental
 | Status accounting | 83 | Supplemental
 | Configuration audits | 84 | Supplemental
 | Release management | 85 | Supplemental
Risk Management | Continuous risk management | 86 | Supplemental
Peer Reviews and Inspections | SW peer reviews and inspections for requirements, test plans, design, and code | 87 | Supplemental
 | SW peer reviews and inspections - checklist criteria & tracking | 88 | Supplemental
 | SW peer reviews and inspections - basic measurements | 89 | Supplemental
SW Measurement | Management and technical measurements | 90 | Supplemental
Roles and Responsibilities | Establish and maintain measurement repository | 91 | Supplemental
 | Usage of measurement data | 92 | Supplemental
SW Measurement | Analysis of measurement data | 93 | Supplemental
 | Access to Measurement Data | 94 | Supplemental
Roles and Responsibilities | Reporting engineering discipline status | 95 | Supplemental
 | Agency process asset library | 98 | Supplemental
 | Software training funding | 100 | Supplemental
 | Center SW training plans | 101 | Not Related
Principles Related to Tailoring Requirements | Document alternate requirements | 121 | Supplemental
Roles and Responsibilities | Technical Authority appointment | 122 | Not Related
Principles Related to Tailoring of Requirements | Requirements compliance matrix | 125 | Supplemental
Roles and Responsibilities | Waiver and deviation considerations | 126 | Supplemental
 | OCE NPR appraisals | 129 | Supplemental
SW Assurance and SW IV&V (Independent Verification and Validation) | IV&V Project Execution Plan | 131 | Supplemental
SW Classification and Planning Assessments | Independent SW classification assessment | 132 | Supplemental
 | Software safety determination | 133 | Supplemental
Safety-Critical SW | Safety-critical software requirements | 134 | Supplemental
Software Implementation | Static analysis | 135 | Supplemental
 | Software tool accreditation | 136 | Supplemental
Principles Related to Tailoring Requirements | Shall statements | 139 | Supplemental
Roles and Responsibilities | Comply with requirements | 140 | Supplemental
SW Assurance and SW IV&V | Software IV&V | 141 | Not Related
Roles and Responsibilities | SW cost repositories | 142 | Supplemental
Software Architecture | Perform software architecture review | 143 | Supplemental
Roles and Responsibilities | SW engineering process assets | 144 | Supplemental
 | Indicate approval | 145 | Supplemental
Automatic Generation of SW Source Code | Auto-generated source code | 146 | Supplemental
Software Reuse | Reusability requirements | 147 | Supplemental
 | Contribute to software catalog | 148 | Supplemental
Open Source | Open source conditions | 149 | Supplemental
Principles Related to Tailoring Requirements | Review changes to tailored requirements | 150 | Supplemental
SW Cost Estimation | Cost estimate conditions | 151 | Supplemental
Roles and Responsibilities | Review compliance matrices | 152 | Supplemental
 | Define document contents | 153 | Supplemental
SW Security | Identify security risks | 154 | Supplemental
 | Implement risk mitigations | 155 | Supplemental
 | Evaluate systems for security risks | 156 | Supplemental
 | Protect against unauthorized access | 157 | Supplemental
 | Evaluate SW for security vulnerabilities | 158 | Supplemental
 | V&V risk mitigations | 159 | Supplemental
SW Classification and Planning Assessments | Safety critical classification | 160 | Supplemental

4. Rationale for STD-7009

The NASA Standard for Models and Simulations (NASA-STD-7009) 272 had its genesis in the Space Shuttle Columbia accident investigation (2003). Generally, its purpose is to improve the "development, documentation, and operation of models and simulations" (Diaz Report) 144 and, per a September 2006 memo from the Office of the Chief Engineer, to "include a standard method to assess the credibility of the models and simulations". After an approximately three-year development period, NASA-STD-7009 was approved by NASA's Engineering Review Board on July 11, 2008, for voluntary use.

NASA-STD-7009 holds a unique place in the world of modeling and simulation in that it is, by direction, generally applicable to all types of models and simulations (M&S) and to all phases of development, though it is primarily focused on the results of an M&S-based analysis. Previous standards and recommended practices for M&S have focused either on a single type of M&S (e.g., structures, fluids, electrical controls) or on a particular phase of M&S development (e.g., verification, validation). NASA management is confronted with numerous types of analyses that may feed critical decisions, and a common framework for understanding the results and assessing the credibility of such analyses may seem intuitively desirable. However, the vast differences among engineering systems complicate such a framework, and adoption of a standard like this has consequently been slow.

After formal approval in July 2008, adoption of NASA-STD-7009 was largely left to individual programs, projects, or M&S practitioners. Existing programs and projects were not required to adopt it; new programs and projects were to adopt it depending on their needs, desires, and the criticality of the M&S-based analysis at hand.

5. Guidance for NASA-STD-7009

Guidance for use and application of NASA-STD-7009 272 can be found in the NASA-HDBK-7009 Handbook 460 located at the following link: https://standards.nasa.gov. This is a comprehensive instruction set on the use and application of NASA-STD-7009 as it relates to Verification and Validation of Models and Simulations. Pay particular attention to the use and application of the Credibility Assessment Scale (CAS) 414.

6. Resources


6.1 Tools

Tools to aid in compliance with this Topic, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

7. Lessons Learned

The requirements for a NASA standard matured out of the Columbia Accident Investigation Board (CAIB) Report. The CAIB Report found problems pertaining to "ineffective and inconsistent application of M&S tools, along with cases of misuse." It called on NASA to "develop, validate, and maintain physics-based computer models to evaluate Thermal Protection System damage from debris impacts. These tools should provide realistic and timely estimates of any impact damage from possible debris from any source that may ultimately impact the Orbiter." NASA was also to establish impact damage thresholds that trigger responsive corrective action, such as on-orbit inspection and repair, when indicated.

Lessons learned and their applicability to the need to perform verification and validation are well documented in the resources identified above. These should be reviewed and retained by any program or project using modeling and simulation products throughout the software development life cycle.

A documented lesson from the NASA Lessons Learned database notes the following:

Performance Decrease due to Propulsion Thruster Plume Impingement on the Voyager Spacecraft, Lesson Number: 0377:

"A 21% shortfall in Voyager's velocity change was suspected to be due to exhaust plume impingement. Due to the complexity of spacecraft/thruster configurations, additional care must be taken in the development and utilization of spacecraft and plume models. Analysis should be conducted on early and final designs." 582
