7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009

1. Purpose

This topic discusses the relationship between the requirements and associated processes of NPR 7150.2, NASA Software Engineering Requirements [039], and the content of NASA-STD-7009, Standard for Models and Simulations [272]. Because NASA-STD-7009 is generally applicable to all types of models and simulations (which are most commonly embodied in software), it is important to understand its relevance to NPR 7150.2 and the precedence to be maintained between the two documents. Software developers can use this topic to understand the relevance and applicability of NASA-STD-7009 when developing software under NPR 7150.2.

1.1 Introduction

As discussed elsewhere, the applicability of the NPR 7150.2 requirements to the software being developed is determined by the contents of its Appendix D, Requirements Mapping Matrix. Requirements SWE-139 and SWE-140 implement the contents of Appendix D. NASA-STD-7009, in turn, describes how models and simulations are to be developed and verified within the bounds of the governing SWE requirements of NPR 7150.2.

1.1.1 Relevant Guidance in NPR 7150.2

NPR 7150.2 invokes the content of NASA-STD-7009 via requirement SWE-070, presented in its section 3.4.6 with an accompanying Note:

"3.4.6 The project shall verify, validate, and accredit software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment." (See SWE-070)

"Note: Center processes address issues such as numerical accuracy, uncertainty analysis, and sensitivity analysis, as well as verification and validation for software implementations of models and simulations. Information regarding specific verification and validation techniques and the analysis of models and simulations can be found in the NASA standard NASA-STD-7009."

1.1.2 Applicability and Scope of NASA-STD-7009

The applicability and scope are stated in section 1.2 of NASA-STD-7009, where the acronym "M&S" denotes "models and simulations":

NASA-STD-7009 applies to M&S used by NASA and its contractors for critical decisions in design, development, manufacturing, ground operations, and flight operations. (Guidance for determining which particular M&S are in scope is provided in NASA-STD-7009, section 4.1 and Appendix A.) NASA-STD-7009 also applies to the use of legacy as well as commercial-off-the-shelf (COTS), government-off-the-shelf (GOTS), and modified-off-the-shelf (MOTS) M&S to support critical decisions. Generally, for such M&S, particular attention may need to be paid to defining the limits of operation and to verification, validation, and uncertainty quantification. Programs and projects are encouraged to apply this standard to M&S if the M&S results may impact future critical decisions.

NASA-STD-7009 does not apply to M&S that are embedded in control software, emulation software, and stimulation environments. However, Center implementation plans for NPR 7150.2, NASA Software Engineering Requirements, cover embedded M&S and address such M&S-specific issues as numerical accuracy, uncertainty analysis, sensitivity analysis, M&S verification, and M&S validation.

2. Implications for Models and Simulations

2.1 Implications for Embedded Models and Simulations

As noted in the previous section, embedded models and simulations are not in the scope of NASA-STD-7009 [272]. However, the Note in NPR 7150.2A [039] implies that Requirement 4.2.6 and Requirements 4.4.1 through 4.4.9 of NASA-STD-7009 are nevertheless to be addressed for such models and simulations. Each of these requirements begins with the phrase "Shall document." None of them requires that any specific activity be performed other than the relevant documentation. As the Standard states in section 4.1: "Whatever was done is to be documented, and if nothing was done a clear statement to that effect is to be documented."
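For embedded M&S, then, the obligation reduces to documenting what was (or was not) done. Purely as an illustration of that principle, the Python sketch below records, for each of the activity areas called out in the Note quoted above (verification, validation, numerical accuracy, uncertainty analysis, sensitivity analysis), either a description of what was performed or an explicit statement that nothing was performed. The record layout and field names are assumptions for this sketch, not a format defined by NASA-STD-7009 or any Center plan.

from dataclasses import dataclass

# Illustrative "document what was done" record for an embedded model/simulation.
# Field names and layout are assumptions for this sketch; NASA-STD-7009 and
# Center implementation plans define the actual documentation expectations.

NOT_PERFORMED = "No activity was performed in this area."  # explicit "nothing was done" statement

@dataclass
class EmbeddedMsRecord:
    model_name: str
    verification: str = NOT_PERFORMED
    validation: str = NOT_PERFORMED
    numerical_accuracy: str = NOT_PERFORMED
    uncertainty_analysis: str = NOT_PERFORMED
    sensitivity_analysis: str = NOT_PERFORMED

    def report(self) -> str:
        lines = [f"M&S documentation record: {self.model_name}"]
        for area in ("verification", "validation", "numerical_accuracy",
                     "uncertainty_analysis", "sensitivity_analysis"):
            lines.append(f"  {area.replace('_', ' ')}: {getattr(self, area)}")
        return "\n".join(lines)

record = EmbeddedMsRecord(
    model_name="Plume impingement model (hypothetical example)",
    verification="Unit results compared against closed-form test cases.",
    sensitivity_analysis="One-at-a-time perturbation of key inputs; see analysis memo.",
)
print(record.report())

Any area left at its default value is thereby documented with the required "nothing was done" statement rather than being silently omitted.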
2.2 Implications for Other Models and Simulations

For all other models and simulations that are deemed by the M&S Risk Assessment to be in scope of NASA-STD-7009, there is a need to ensure that the requirements of both documents are satisfied. From the perspective of NASA-STD-7009, some of the requirements in NPR 7150.2A are not related to M&S, some are supplemental to requirements in NASA-STD-7009, and others are subsets of requirements in NASA-STD-7009.
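Among the M&S-specific topics that the Note in NPR 7150.2A section 3.4.6 directs Center processes to address is sensitivity analysis. As a generic illustration only, the sketch below shows a simple one-at-a-time sensitivity check on a placeholder simulation function; the function, its inputs, and the perturbation size are assumptions made for this sketch and are not drawn from either document.

# Illustrative one-at-a-time (OAT) sensitivity check on a simulation output.
# The simulation function and its nominal inputs are placeholders (assumptions),
# not an interface defined by NPR 7150.2 or NASA-STD-7009.

def run_simulation(inputs):
    """Stand-in for a model/simulation; returns a single scalar output."""
    return 2.0 * inputs["thrust"] - 0.5 * inputs["drag_coeff"] * inputs["velocity"] ** 2

nominal = {"thrust": 100.0, "drag_coeff": 0.3, "velocity": 20.0}
baseline = run_simulation(nominal)

sensitivities = {}
for name, value in nominal.items():
    perturbed = dict(nominal)
    perturbed[name] = value * 1.01          # perturb each input by +1%
    delta_out = run_simulation(perturbed) - baseline
    # Normalized sensitivity: % change in output per % change in input
    sensitivities[name] = (delta_out / baseline) / 0.01

for name, s in sorted(sensitivities.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: normalized sensitivity = {s:+.2f}")

Ranking inputs by normalized sensitivity in this way is one simple means of identifying which inputs most strongly drive the results that would be documented under the requirements discussed above.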
3. Table 1. NPR 7150.2A vs. NASA-STD-7009

Table 1 indicates the specific relationship of each requirement in NPR 7150.2A to NASA-STD-7009. In the fourth column, each requirement is generally characterized as not related to in-scope M&S, supplemental to the requirements of NASA-STD-7009, or a subset of the cited NASA-STD-7009 requirement(s).

Section of NPR 7150.2A [039] | Requirement Descriptor | SWE # | NASA-STD-7009 [272] Relationship
Preface | Effective date | 1 | Scope statement implicitly includes M&S implemented in software
Organizational Capability | Agency software initiative | 2 | Not Related
 | Center plan | 3 | Not Related
 | Benchmark | 4 | Not Related
 | Software processes | 5 | The Note in 3.4.6 of NPR 7150.2A means that the topics in the Note are to be addressed in Center processes
 | List of agency's programs & projects containing software | 6 | Not Related
SW Life Cycle Planning | Software plans | 13 | Subset of Rqmt. 4.1.4
 | Execute planning | 14 | Supplemental
 | Cost estimation | 15 | Supplemental
 | Schedule | 16 | Supplemental
 | Training | 17 | Supplemental to Rqmt. 4.6.2(a)
 | Reviews | 18 | Supplemental
 | Software development life cycle or model | 19 | Supplemental
 | Software classification | 20 | Supplemental
 | Software classification changes | 21 | Supplemental
 | Software assurance | 22 | Supplemental
 | Software safety | 23 | Supplemental
 | Plan tracking | 24 | Supplemental
 | Corrective action | 25 | Supplemental
 | Changes | 26 | Supplemental
Off The Shelf (OTS) SW | COTS, GOTS, MOTS, etc. | 27 | Supplemental
Verification & Validation | Verification planning | 28 | Subset of Rqmt. 4.1.3(e)
 | Validation planning | 29 | Subset of Rqmt. 4.1.3(e)
 | Verification results | 30 | Subset of Rqmts. 4.4.1 – 4.4.3
 | Validation results | 31 | Subset of Rqmts. 4.4.4 – 4.4.6
Project Formulation | CMMI levels for class A, B and C software | 32 | Supplemental
 | Acquisition assessment | 33 | Supplemental
 | Acceptance criteria | 34 | Subset of Rqmt. 4.1.3(a)
 | Supplier selection | 35 | Subset of Rqmt. 4.1.4
 | Software processes | 36 | Supplemental
 | Milestone | 37 | Supplemental
 | Acquisition planning | 38 | Supplemental
Government Insight | Insight into software activities | 39 | Supplemental
 | Access to software products | 40 | Supplemental
 | Open source notification | 41 | Supplemental
 | Electronic access to source code | 42 | Supplemental
Supplier Monitoring | Track change request | 43 | Supplemental
 | Software measurement data | 44 | Supplemental
 | Joint audits | 45 | Supplemental
 | Software schedule | 46 | Supplemental
 | Traceability data | 47 | Supplemental
 | Solicitation | 48 | Supplemental
Software Requirements Development | Document | 49 | Subset of Rqmt. 4.1.3
 | Software requirements | 50 | Supplemental
 | Flow-down & derived requirements | 51 | Supplemental
 | Bi-directional traceability | 52 | Supplemental
Software Requirements Management | Manage requirements change | 53 | Supplemental
 | Corrective action | 54 | Supplemental
 | Requirements validation | 55 | Supplemental
Software Design | Document design | 56 | Supplemental
 | Software architecture | 57 | Supplemental
 | Detailed design | 58 | Supplemental
 | Bidirectional traceability | 59 | Supplemental
Software Implementation | Design into code | 60 | Supplemental
 | Coding standards | 61 | Supplemental
 | Unit test | 62 | Supplemental
 | Version description | 63 | Subset of Rqmt. 4.2.9
 | Bidirectional traceability | 64 | Supplemental
Software Testing | Plan, procedures, reports | 65 | Supplemental
 | Perform testing | 66 | Supplemental
 | Verify implementation | 67 | Supplemental
 | Evaluate test results | 68 | Subset of Rqmts. 4.4.1 – 4.4.6
 | Document defects & track | 69 | Supplemental
 | Models, simulations, tools | 70 | Supplemental
 | Update plans & procedures | 71 | Supplemental
 | Bidirectional traceability | 72 | Supplemental
 | Platform or hi-fidelity simulations | 73 | Not Related to M&S in Scope
Software Operations, Maintenance, and Retirement | Document maintenance plans | 74 | Subset of Rqmt. 4.1.4
 | Plan operations, maintenance & retirement | 75 | Subset of Rqmt. 4.1.4
 | Implement plans | 76 | Supplemental
 | Deliver software products | 77 | Supplemental
 | As-built documentation | 78 | Supplemental
Software Configuration Management | Develop configuration management plan | 79 | Supplemental
 | Track & evaluate changes | 80 | Supplemental
Software Configuration Management | Identify software configuration items | 81 | Supplemental
 | Authorizing changes | 82 | Supplemental
 | Maintain records | 83 | Supplemental
 | Configuration audits | 84 | Supplemental
 | Implement procedures | 85 | Supplemental
Risk Management | Continuous risk management | 86 | Supplemental
Peer Reviews/Inspections | Requirements, test plans, design & code | 87 | Supplemental
 | Checklist, criteria & tracking | 88 | Supplemental
 | Basic measurements | 89 | Supplemental
Software Measurement | Objectives | 90 | Supplemental
 | Software measurement areas | 91 | Supplemental
 | Collection & storage | 92 | Supplemental
 | Analyze data | 93 | Supplemental
 | Report analysis | 94 | Supplemental
Software Measurement | Software measurement system | 95 | Not Related
 | Objectives & procedures | 96 | Not Related
Best Practices | Agency process asset library | 98 | Not Related
 | Identify applicable practices | 99 | Not Related
Training | Software engineering training | 100 | Not Related
 | Software training plan | 101 | Not Related
Software Documentation Requirements | Software development/management plan | 102 | Supplemental
 | Software configuration management plan | 103 | Supplemental
 | Software test plan | 104 | Supplemental
 | Software maintenance plan | 105 | Supplemental
 | Software assurance plan | 106 | Supplemental
 | Center software training plan | 107 | Not Related
 | Center software engineering improvement | 108 | Not Related
 | Software requirements specification | 109 | Supplemental
 | Software data dictionary | 110 | Not Related to M&S in Scope
 | Software design description | 111 | Supplemental
 | Interface design description | 112 | Not Related to M&S in Scope
 | Software change request/problem report | 113 | Supplemental
 | Software test procedures | 114 | Supplemental
 | Software users manual | 115 | Supplemental
 | Software version description | 116 | Supplemental
 | Software metrics report | 117 | Supplemental
 | Software test report | 118 | Supplemental
 | Software inspection/peer review/inspections | 119 | Supplemental
Tailoring of Requirements | Submit generic waiver request | 120 | Supplemental
 | Document approved alternate requirements | 121 | Supplemental
Designation of Engineering Technical Authority | Center level Engineering Technical Authority approval | 122 | Not Related
Compliance | Direction for Technical Authority | 124 | Not Related
 | Compliance matrix | 125 | Supplemental
 | Considerations for waivers | 126 | Not Related
 | Review of "P(Center)" | 127 | Not Related
Compliance | Compliance records | 128 | Not Related
 | Check compliance | 129 | Not Related
Software Life Cycle Planning | Software safety plan | 130 | Supplemental
 | IV&V Plan | 131 | Supplemental
 | Classification assessment | 132 | Not Related
 | Software safety determination | 133 | Supplemental
 | Safety-critical software requirements | 134 | Supplemental
Software Implementation | Static analysis | 135 | Supplemental
 | Validation of software development tools | 136 | Supplemental
Software Peer Reviews/Inspections | Peer Review/Inspections of Software plans | 137 | Supplemental
Software Documentation Requirements | Software safety plan contents | 138 | Supplemental
Compliance | "Shall" statements in this NPR | 139 | Supplemental
 | "P (Center)" | 140 | Supplemental
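For projects that must show compliance with both documents, the relationships in Table 1 can also be captured in a simple machine-readable form for cross-checking. The sketch below is illustrative only: the data structure and helper function are assumptions for this sketch, and only a handful of sample entries taken from Table 1 are shown; neither document prescribes such a representation.

# Illustrative encoding of a few Table 1 entries for compliance cross-checking.
# The structure and helper function are assumptions for this sketch, not an
# interface defined by NPR 7150.2A or NASA-STD-7009.

# Sample entries from Table 1: SWE # -> (descriptor, relationship to NASA-STD-7009)
NPR_TO_7009 = {
    28: ("Verification planning", "Subset of Rqmt. 4.1.3(e)"),
    30: ("Verification results", "Subset of Rqmts. 4.4.1-4.4.3"),
    31: ("Validation results", "Subset of Rqmts. 4.4.4-4.4.6"),
    63: ("Version description", "Subset of Rqmt. 4.2.9"),
    70: ("Models, simulations, tools", "Supplemental"),
    73: ("Platform or hi-fidelity simulations", "Not Related to M&S in Scope"),
}

def std_7009_relationship_for(swe_number):
    """Return the NASA-STD-7009 relationship recorded for a given SWE number."""
    descriptor, relationship = NPR_TO_7009.get(swe_number, ("(not listed)", "Unknown"))
    return f"SWE-{swe_number:03d} ({descriptor}): {relationship}"

if __name__ == "__main__":
    for swe in (28, 70, 73):
        print(std_7009_relationship_for(swe))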
4. Rationale for STD-7009

The NASA Standard for Models and Simulations, NASA-STD-7009 [272], had its genesis in the Space Shuttle Columbia accident investigation (2003). Generally, its purpose is to improve the "development, documentation, and operation of models and simulations" (Diaz Report [144]) and to "include a standard method to assess the credibility of the models and simulations" (Office of the Chief Engineer (OCE) Memo [376]). After an approximately three-year development period, NASA-STD-7009 was approved by NASA's Engineering Review Board on July 11, 2008 for voluntary use.

NASA-STD-7009 holds a unique place in the world of modeling and simulation in that it is, by direction, generally applicable to all types of models and simulations (M&S) and to all phases of development, though it is primarily focused on the results of an M&S-based analysis. Other standards and recommended practices for M&S to date have been focused on either a single type of M&S (e.g., structures, fluids, electrical controls) or a particular phase of M&S development (e.g., verification, validation). NASA management is confronted with numerous types of analyses that may feed critical decisions, so a common framework for understanding the results and assessing the credibility of an analysis may seem intuitive. However, the vast differences among engineering systems complicate such a framework, and the adoption of a standard like this has consequently been slow. After formal approval in July 2008, NASA-STD-7009 was largely left to the individual program, project, or M&S practitioner to adopt as they wished. Already existing programs and projects were not required to adopt it; new programs and projects were to adopt it according to their needs, desires, and the criticality of the M&S-based analysis at hand.

5. Guidance for NASA-STD-7009

Guidance for the use and application of NASA-STD-7009 [272] can be found in the NASA-STD-7009 Guidebook, available at https://standards.nasa.gov. The Guidebook is a comprehensive instruction set on the use and application of NASA-STD-7009 as it relates to verification and validation of models and simulations. Pay particular attention to the use and application of the Credibility Assessment Scale (CAS) [414].
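As a rough illustration of the kind of factor-by-factor credibility reporting the CAS supports, the sketch below records a score of 0 through 4 for each of the eight CAS factors named in NASA-STD-7009. The factor names follow the Standard, but the roll-up shown (reporting the minimum factor score as a headline value) is only one conservative convention assumed for this sketch; consult the Standard and the Guidebook for the prescribed assessment and reporting rules.

# Illustrative record of Credibility Assessment Scale (CAS) factor scores.
# Factor names follow NASA-STD-7009; the "headline" roll-up (minimum score)
# is an assumption for this sketch, not the Standard's prescribed aggregation.

CAS_FACTORS = (
    "Verification",
    "Validation",
    "Input Pedigree",
    "Results Uncertainty",
    "Results Robustness",
    "Use History",
    "M&S Management",
    "People Qualifications",
)

def summarize_cas(scores):
    """Validate and summarize a dict of factor -> score (0-4)."""
    missing = [f for f in CAS_FACTORS if f not in scores]
    if missing:
        raise ValueError(f"Missing CAS factor scores: {missing}")
    for factor, score in scores.items():
        if not 0 <= score <= 4:
            raise ValueError(f"{factor} score {score} outside 0-4 range")
    headline = min(scores[f] for f in CAS_FACTORS)  # conservative roll-up (assumed)
    lines = [f"  {f}: {scores[f]}" for f in CAS_FACTORS]
    return "\n".join(["CAS factor scores:"] + lines + [f"Headline (minimum) score: {headline}"])

# Example scores for a hypothetical analysis
example = {
    "Verification": 3, "Validation": 2, "Input Pedigree": 3,
    "Results Uncertainty": 2, "Results Robustness": 1,
    "Use History": 2, "M&S Management": 3, "People Qualifications": 3,
}
print(summarize_cas(example))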
6. Resources

6.1 Tools

Tools to aid in compliance with this topic, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users can find this in the Tools Library on the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

7. Lessons Learned

The requirements for a NASA standard matured from the Columbia Accident Investigation Board (CAIB) Report. The CAIB Report found problems pertaining to "ineffective and inconsistent application of M&S tools, along with cases of misuse." It called on NASA to "develop, validate, and maintain physics-based computer models to evaluate Thermal Protection System damage from debris impacts. These tools should provide realistic and timely estimates of any impact damage from possible debris from any source that may ultimately impact the Orbiter." NASA was also to establish impact damage thresholds that trigger responsive corrective action, such as on-orbit inspection and repair, when indicated.

Lessons Learned and their applicability to the need to perform verification and validation are well documented in the resources identified above. These lessons should be reviewed and retained by any program or project utilizing modeling and simulation products throughout the software development life cycle. A documented lesson from the NASA Lessons Learned database notes the following:

Performance Decrease due to Propulsion Thruster Plume Impingement on the Voyager Spacecraft, Lesson Number 0377 [582]: "A 21% shortfall in Voyager's velocity change was suspected to be due to exhaust plume impingement. Due to the complexity of spacecraft/thruster configurations, additional care must be taken in the development and utilization of spacecraft and plume models. Analysis should be conducted on early and final designs."