- 1. Purpose
- 2. Implications for Models and Simulations
- 3. Table 1. NPR 7150.2A vs. NASA-STD-7009
- 4. Rationale for NASA-STD-7009
- 5. Guidance for NASA-STD-7009
- 6. Resources
- 7. Lessons Learned
1. Purpose
This topic discusses the relationship between the requirements and associated processes of NPR 7150.2, NASA Software Engineering Requirements, 039 and the content of NASA-STD-7009, Standard for Models and Simulations. 272 Because NASA-STD-7009 is generally applicable to all types of models and simulations (which are most commonly embodied in software), it is important to understand its relevance to NPR 7150.2 and the precedence to be maintained between the two documents. Software developers can use this topic to understand the relevance and applicability of NASA-STD-7009 when developing software under NPR 7150.2.
1.1 Introduction
1.1.1 Relevant Guidance in NPR 7150.2
As discussed elsewhere, the applicability of the NPR 7150.2 requirements to the software being developed is determined by the contents of its Appendix D, Requirements Mapping Matrix. Requirements SWE-139 and SWE-140 implement the contents of Appendix D. NASA-STD-7009, in turn, describes how models and simulations are to be developed and verified within the bounds of the governing SWE requirements of NPR 7150.2.
NPR 7150.2 invokes the content of NASA-STD-7009 via requirement SWE-070, presented in its section 3.4.6 with an accompanying Note:
"3.4.6 The project shall verify, validate, and accredit software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment." (See SWE-070)
"Note: Center processes address issues such as numerical accuracy, uncertainty analysis, and sensitivity analysis, as well as verification and validation for software implementations of models and simulations. Information regarding specific verification and validation techniques and the analysis of models and simulations can be found in the NASA standard NASA-STD-7009."
1.1.2 Applicability and Scope of NASA-STD-7009
The applicability and scope are stated in section 1.2 of NASA-STD-7009, where the acronym "M&S" denotes "models and simulations":
NASA-STD-7009 applies to M&S used by NASA and its contractors for critical decisions in design, development, manufacturing, ground operations, and flight operations. (Guidance for determining which particular M&S are in scope is provided in NASA-STD-7009, section 4.1 and Appendix A.) NASA-STD-7009 also applies to the use of legacy as well as commercial-off-the-shelf (COTS), government-off-the-shelf (GOTS), and modified-off-the-shelf (MOTS) M&S to support critical decisions. Generally, for such M&S, particular attention may need to be paid to defining the limits of operation and to verification, validation, and uncertainty quantification. Programs and projects are encouraged to apply this standard to M&S, if the M&S results may impact future critical decisions.
NASA-STD-7009 does not apply to M&S that are embedded in control software, emulation software, and stimulation environments. However, Center implementation plans for NPR 7150.2, NASA Software Engineering Requirements, cover embedded M&S, and address such M&S-specific issues as numerical accuracy, uncertainty analysis, sensitivity analysis, M&S verification, and M&S validation.
2. Implications for Models and Simulations
2.1 Implications for Embedded Models and Simulations
As noted in the previous section, embedded models and simulations are not in the scope of NASA-STD-7009. 272 However, the Note in NPR 7150.2A 039 implies that Requirement 4.2.6 and Requirements 4.4.1-4.4.9 of NASA-STD-7009 should nevertheless be addressed for such models and simulations. Each of these requirements begins with the phrase "Shall document." None of them requires that any specific activity be performed other than the relevant documentation. As the Standard states in section 4.1: "Whatever was done is to be documented, and if nothing was done a clear statement to that effect is to be documented."
2.2 Implications for Other Models and Simulations
For all other models and simulations that are deemed by the M&S Risk Assessment to be in scope of NASA-STD-7009, there is the need to ensure that the requirements of both documents are satisfied. From the perspective of NASA-STD-7009, some of the requirements in NPR-7150.2A are not related to M&S, some are supplemental to requirements in NASA-STD-7009, and others are subsets of requirements in NASA-STD-7009.
Table 1 indicates the specific relationship of each requirement in NPR 7150.2A to NASA-STD-7009. The key to the remarks in the fourth column is:
- Not Related: The NPR requirement is not germane to M&S per se. It may, for example, be a requirement on the NASA Centers (cf. SWE-003: "Each Center shall establish, document, execute, and maintain software processes").
- Supplemental: The NPR requirement is relevant to software for M&S, but it is over and above any requirement in NASA-STD-7009. For example, SWE-014 ("The project shall implement, maintain, and execute the software plan(s)") has no counterpart in NASA-STD-7009, which requires in Requirement 4.1.4 just that there be a plan (albeit for the M&S as a whole and not just the software), but has no requirement that this plan be implemented, maintained, or executed.
- Subset of Requirement X: The NPR requirement is only part of a requirement in NASA-STD-7009. For example, Requirement 4.1.4 requires that there be a plan for the M&S that includes such software aspects as verification and configuration management, but it also imposes requirements on M&S-specific aspects that have no counterpart in the NPR.
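The three-way key above can be treated as data. The following sketch is purely illustrative: the handful of SWE entries and their relationships are copied from Table 1, but the data structure, constant names, and filter function are hypothetical conveniences, not part of either document. A project tracking its compliance might use something like this to flag the NPR requirements that cover only part of a NASA-STD-7009 requirement, since satisfying the NPR alone could leave the remainder of the 7009 requirement unaddressed.

```python
# Hypothetical compliance-mapping sketch; entries copied from Table 1,
# structure and names are illustrative only.
NOT_RELATED = "Not Related"
SUPPLEMENTAL = "Supplemental"
SUBSET = "Subset"

# (SWE #, descriptor, relationship, related NASA-STD-7009 requirement or None)
RELATIONSHIPS = [
    (3,  "Center plan",                NOT_RELATED,  None),
    (13, "Software plans",             SUBSET,       "4.1.4"),
    (14, "Execute planning",           SUPPLEMENTAL, None),
    (28, "Verification planning",      SUBSET,       "4.1.3(e)"),
    (70, "Models, simulations, tools", SUPPLEMENTAL, None),
]

def swes_needing_7009_crosscheck(relationships):
    """Return SWE numbers whose NPR requirement is only a subset of a
    NASA-STD-7009 requirement, so the 7009 requirement needs a separate
    check for its M&S-specific remainder."""
    return [swe for swe, _desc, rel, _ref in relationships if rel == SUBSET]

print(swes_needing_7009_crosscheck(RELATIONSHIPS))  # [13, 28]
```

The same filter could be run with `SUPPLEMENTAL` or `NOT_RELATED` to partition the full table in Section 3.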
3. Table 1. NPR 7150.2A vs. NASA-STD-7009
Section of NPR 7150.2A 039 | Requirement Descriptor | SWE # | NASA-STD-7009 272 Relationship |
Preface | Effective date | 1 | Scope statement implicitly includes M&S implemented in software |
Organizational Capability | Agency software initiative | 2 | Not Related |
| Center plan | 3 | Not Related |
| Benchmark | 4 | Not Related |
| Software processes | 5 | The Note in 3.4.6 of NPR 7150.2A means that the topics in the Note are to be addressed in Center processes |
| List of agency's programs & projects containing software | 6 | Not Related |
SW Life Cycle Planning | Software plans | 13 | Subset of Rqmt. 4.1.4 |
| Execute planning | 14 | Supplemental |
| Cost estimation | 15 | Supplemental |
| Schedule | 16 | Supplemental |
| Training | 17 | Supplemental to Rqmt. 4.6.2(a) |
| Reviews | 18 | Supplemental |
| Software development life cycle or model | 19 | Supplemental |
| Software classification | 20 | Supplemental |
| Software classification changes | 21 | Supplemental |
| Software assurance | 22 | Supplemental |
| Software safety | 23 | Supplemental |
| Plan tracking | 24 | Supplemental |
| Corrective action | 25 | Supplemental |
| Changes | 26 | Supplemental |
Off The Shelf (OTS) SW | COTS, GOTS, MOTS, etc. | 27 | Supplemental |
Verification & Validation | Verification planning | 28 | Subset of Rqmt. 4.1.3(e) |
| Validation planning | 29 | Subset of Rqmt. 4.1.3(e) |
| Verification results | 30 | Subset of Rqmts. 4.4.1 – 4.4.3 |
| Validation results | 31 | Subset of Rqmts. 4.4.4 – 4.4.6 |
Project Formulation | CMMI levels for class A, B and C software | 32 | Supplemental |
| Acquisition Assessment | 33 | Supplemental |
| Acceptance criteria | 34 | Subset of Rqmt. 4.1.3(a) |
| Supplier selection | 35 | Subset of Rqmt. 4.1.4 |
| Software processes | 36 | Supplemental |
| Milestone | 37 | Supplemental |
| Acquisition planning | 38 | Supplemental |
Government Insight | Insight into software activities | 39 | Supplemental |
| Access to software products | 40 | Supplemental |
| Open source notification | 41 | Supplemental |
| Electronic access to source code | 42 | Supplemental |
Supplier Monitoring | Track change request | 43 | Supplemental |
| Software measurement data | 44 | Supplemental |
| Joint audits | 45 | Supplemental |
| Software schedule | 46 | Supplemental |
| Traceability data | 47 | Supplemental |
| Solicitation | 48 | Supplemental |
Software Requirements Development | Document | 49 | Subset of Rqmt. 4.1.3 |
| Software requirements | 50 | Supplemental |
| Flow-down & derived requirements | 51 | Supplemental |
| Bi-directional traceability | 52 | Supplemental |
Software Requirements Management | Manage requirements change | 53 | Supplemental |
| Corrective action | 54 | Supplemental |
| Requirements validation | 55 | Supplemental |
Software Design | Document design | 56 | Supplemental |
| Software architecture | 57 | Supplemental |
| Detailed design | 58 | Supplemental |
| Bidirectional traceability | 59 | Supplemental |
Software Implementation | Design into code | 60 | Supplemental |
| Coding standards | 61 | Supplemental |
| Unit test | 62 | Supplemental |
| Version description | 63 | Subset of Rqmt. 4.2.9 |
| Bidirectional traceability | 64 | Supplemental |
Software Testing | Plan, procedures, reports | 65 | Supplemental |
| Perform testing | 66 | Supplemental |
| Verify implementation | 67 | Supplemental |
| Evaluate test results | 68 | Subset of Rqmts. 4.4.1 – 4.4.6 |
| Document defects & track | 69 | Supplemental |
| Models, simulations, tools | 70 | Supplemental |
| Update plans & procedures | 71 | Supplemental |
| Bidirectional traceability | 72 | Supplemental |
| Platform or hi-fidelity simulations | 73 | Not Related to M&S in Scope |
Software Operations, Maintenance, and Retirement | Document maintenance plans | 74 | Subset of Rqmt. 4.1.4 |
| Plan operations, maintenance & retirement | 75 | Subset of Rqmt. 4.1.4 |
| Implement plans | 76 | Supplemental |
| Deliver software products | 77 | Supplemental |
| As-built documentation | 78 | Supplemental |
Software Configuration Management | Develop configuration management plan | 79 | Supplemental |
| Track & evaluate changes | 80 | Supplemental |
| Identify software configuration items | 81 | Supplemental
| Authorizing changes | 82 | Supplemental |
| Maintain records | 83 | Supplemental |
| Configuration audits | 84 | Supplemental |
| Implement procedures | 85 | Supplemental |
Risk Management | Continuous risk management | 86 | Supplemental |
Peer Reviews/ Inspections | Requirements, test plans, design & code | 87 | Supplemental |
| Checklist, criteria & tracking | 88 | Supplemental |
| Basic measurements | 89 | Supplemental |
Software Measurement | Objectives | 90 | Supplemental |
| Software measurement areas | 91 | Supplemental |
| Collection & storage | 92 | Supplemental |
| Analyze data | 93 | Supplemental |
| Report analysis | 94 | Supplemental |
| Software measurement system | 95 | Not Related
| Objectives & procedures | 96 | Not Related |
Best Practices | Agency process asset library | 98 | Not Related |
| Identify applicable practices | 99 | Not Related |
Training | Software engineering training | 100 | Not Related |
| Software training plan | 101 | Not Related |
Software Documentation Requirements | Software development/management plan | 102 | Supplemental |
| Software configuration management plan | 103 | Supplemental |
| Software test plan | 104 | Supplemental |
| Software maintenance plan | 105 | Supplemental |
| Software assurance plan | 106 | Supplemental |
| Center software training plan | 107 | Not Related |
| Center software engineering improvement | 108 | Not Related |
| Software requirements specification | 109 | Supplemental |
| Software data dictionary | 110 | Not Related to M&S in Scope |
| Software design description | 111 | Supplemental |
| Interface design description | 112 | Not Related to M&S in Scope |
| Software change request/ problem report | 113 | Supplemental |
| Software test procedures | 114 | Supplemental |
| Software users manual | 115 | Supplemental |
| Software version description | 116 | Supplemental |
| Software metrics report | 117 | Supplemental |
| Software test report | 118 | Supplemental |
| Software inspection/ peer review/ inspections | 119 | Supplemental |
Tailoring of Requirements | Submit generic waiver request | 120 | Supplemental |
| Document approved alternate requirements | 121 | Supplemental |
Designation of Engineering Technical Authority | Center level Engineering Technical Authority approval | 122 | Not Related |
Compliance | Direction for Technical Authority | 124 | Not Related |
| Compliance matrix | 125 | Supplemental |
| Considerations for waivers | 126 | Not Related |
| Review of "P(Center)" | 127 | Not Related |
| Compliance records | 128 | Not Related
| Check compliance | 129 | Not Related |
Software Life Cycle Planning | Software safety plan | 130 | Supplemental |
| IV&V Plan | 131 | Supplemental |
| Classification assessment | 132 | Not Related |
| Software safety determination | 133 | Supplemental |
| Safety-critical software requirements | 134 | Supplemental |
Software Implementation | Static analysis | 135 | Supplemental |
| Validation of software development tools | 136 | Supplemental |
Software Peer Reviews/ Inspections | Peer Review/ Inspections of Software plans | 137 | Supplemental |
Software Documentation Requirements | Software safety plan contents | 138 | Supplemental |
Compliance | "Shall" statements in this NPR | 139 | Supplemental
| "P(Center)" | 140 | Supplemental
4. Rationale for NASA-STD-7009
The NASA Standard for Models and Simulations (NASA-STD-7009) 272 had its genesis in the Space Shuttle Columbia Accident Investigation (2003). Generally, its purpose is to improve the "development, documentation, and operation of models and simulations" (Diaz Report) 144 and "include a standard method to assess the credibility of the models and simulations" (Office of the Chief Engineer (OCE) Memo 376). After an approximately three-year development period, the NASA Standard for Models and Simulations, NASA-STD-7009, was approved by NASA's Engineering Review Board on July 11, 2008, for voluntary use.
NASA-STD-7009 holds a unique place in the world of modeling and simulation in that it is, by direction, generally applicable to all types of models and simulations (M&S) and to all phases of development, though it is primarily focused on the results of an M&S-based analysis. Previous standards and recommended practices for M&S have focused either on a single type of M&S (e.g., structures, fluids, electrical controls) or on a particular phase of M&S development (e.g., verification, validation). NASA management is confronted with numerous types of analyses that may be involved in making critical decisions. A common framework for understanding the results and assessing the credibility of such analyses would seem intuitive; however, the vast differences among engineering systems complicate any common framework, and thus the adoption of a standard like this one has been slow.
After formal approval in July 2008, adoption of NASA-STD-7009 was largely left to individual programs, projects, and M&S practitioners. Existing programs and projects were not required to adopt it; new programs and projects were expected to adopt it depending on their needs and on the criticality of the M&S-based analysis at hand.
5. Guidance for NASA-STD-7009
Guidance for the use and application of NASA-STD-7009 272 can be found in the NASA-STD-7009 Guidebook located at the following link: https://standards.nasa.gov. This is a comprehensive instruction set on the use and application of NASA-STD-7009 as it relates to verification and validation of models and simulations. Pay particular attention to the use and application of the Credibility Assessment Scale (CAS). 414
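To make the CAS concrete, the sketch below tallies factor scores for an M&S-based analysis. The eight factor names and the 0-4 scoring range come from NASA-STD-7009; the min-based roll-up shown here is only one conservative convention (reporting the weakest link) and is an assumption of this sketch, not the roll-up the standard prescribes. Consult NASA-STD-7009 and its Guidebook for the authoritative assessment and reporting method.

```python
# Illustrative CAS tally; min-based roll-up is an assumed convention,
# not the method prescribed by NASA-STD-7009.
CAS_FACTORS = (
    "Verification", "Validation", "Input Pedigree", "Results Uncertainty",
    "Results Robustness", "Use History", "M&S Management",
    "People Qualifications",
)

def rollup(scores):
    """Validate a dict of CAS factor scores (0-4 each) and return the
    lowest factor score, i.e., the weakest link in the assessment."""
    for factor in CAS_FACTORS:
        s = scores[factor]  # KeyError if a factor was not assessed
        if not 0 <= s <= 4:
            raise ValueError(f"{factor}: score {s} outside 0-4")
    return min(scores[f] for f in CAS_FACTORS)

# Example: a generally strong assessment undermined by weak validation.
example = {f: 3 for f in CAS_FACTORS}
example["Validation"] = 1
print(rollup(example))  # 1
```

A single number is a blunt summary; the standard's presentation of per-factor scores (typically as a bar chart) conveys far more about where credibility is weak.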
6. Resources
- (SWEREF-005) AIAA-2008-2156 Blattnig, S.R.; Green, L.L.; Luckring, J.M.; Morrison, J.H.; Tripathi, R.K.; Zang, T.A. (2008). NASA users can access AIAA documents via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of AIAA documents.
- (SWEREF-009) NPR 8000.4C, NASA Office of Safety and Mission Assurance, 2022. Effective Date: April 19, 2022 Expiration Date: April 19, 2027
- (SWEREF-082) NPR 7120.5F, Office of the Chief Engineer, Effective Date: August 03, 2021, Expiration Date: August 03, 2026,
- (SWEREF-083) NPR 7150.2D, Effective Date: March 08, 2022, Expiration Date: March 08, 2027 https://nodis3.gsfc.nasa.gov/displayDir.cfm?t=NPR&c=7150&s=2D Contains link to full text copy in PDF format. Search for "SWEREF-083" for links to old NPR7150.2 copies.
- (SWEREF-124) ASME (2006). New York, NY. Available to members.
- (SWEREF-126) Balci, O. (2004). Proceedings of the 2004 Winter Simulation Conference. R.G. Ingalls; M.D. Rossetti; J.S. Smith; B.A. Peters, eds. Dec. 5-8. Piscataway, NJ: IEEE. pp. 122-129.
- (SWEREF-128) Banks, J., ed. (1998). New York: John Wiley & Sons, Sep 14, 1998,
- (SWEREF-144) Diaz, Al, NASA Goddard Space Flight Center (Jan. 2004). CAIB Columbia Accident Investigation Board Report (August 2003). Vol. 1. PB2005-100968
- (SWEREF-155) Clemen, R.T. (1996). Second Edition. Pacific Grove, CA: Brooks/Cole. download available from link.
- (SWEREF-164) Cooke, R.M. (1991). New York: Oxford University Press.
- (SWEREF-187) Fidelity ISG Glossary Simulation Interoperability Standards Organization (SISO). (Dec. 1998). Vol. 3.0. Site still exists at https://www.sisostds.org/. Unable to locate the Fidelity ISG Glossary.
- (SWEREF-198) AIAA G-077. Reston, VA: AIAA. 1998. This document is available from AIAA; NASA has access to the AIAA website via the NASA START (AGCY NTSS) system (https://standards.nasa.gov).
- (SWEREF-200) Hale, J.P.; Hartway, B.L.; Thomas, D.A. (2007). The 5th Joint Army-Navy-NASA-Air Force (JANNAF) Modeling and Simulation Subcommittee Meeting, May, CDJSC 49. Columbia, MD: Johns Hopkins University.
- (SWEREF-202) Harmon, S.Y.; Youngblood, S.M. (2005). Vol. 2, No. 4, pp. 179-190. Conference: Spring 2003 SISO Simulation Interoperability Workshop, At Kissimmee, FL
- (SWEREF-222) IEEE STD 610.12-1990, 1990. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-240) Lin, J.; West, J.S.; Williams, R.W.; Tucker, P.K. (2005). AIAA-2005-4524 .
- (SWEREF-248) NASA-HDBK-7009 Revision: A, Document Date: 2019-05-08, Next 5-Year Review Date: 2024-05-08
- (SWEREF-251) Mehta, U.B. (2007). The 5th Joint Army-Navy-NASA-Air Force (JANNAF) Modeling and Simulation Subcommittee Meeting, CDJSC 49, May, CPIAC. Columbia, MD: Johns Hopkins University.
- (SWEREF-271) NASA STD 8719.13 (Rev C ) , Document Date: 2013-05-07
- (SWEREF-272) NASA-STD-7009A w/ CHANGE 1: ADMINISTRATIVE/ EDITORIAL CHANGES 2016-12-07
- (SWEREF-286) NPR 8715.3D, Effective Date: December 16, 2021, Expiration Date: December 21, 2026
- (SWEREF-288) Oberkampf, W.L.; Pilch, M.; Trucano, T.G. (October 2007), SAND2007-5948. Sandia National Laboratories: Albuquerque, New Mexico 87185 and Livermore, California 94550.
- (SWEREF-311) Defense Modeling and Simulation Office. http://vva.dmso.mil/.
- (SWEREF-312) Saltelli, A.; Chan, K.; Scott, E.M., eds. (2000). Chichester, England: John Wiley & Sons. Available via http://www.wiley.com/WileyCDA/WileyTitle/productCd-0471998923.html.
- (SWEREF-414) Babula, M. et al. (2008). Published Online:15 Jun 2012 https://doi.org/10.2514/6.2009-1011
- (SWEREF-460) NPR 7150 traced to NASA-STD-8739 8 and NASA-STD-8719 13B_20100924 This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
- (SWEREF-582) Public Lessons Learned Entry: 377.
6.1 Tools
Tools to aid in compliance with this Topic, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list,” nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
7. Lessons Learned
The requirements for a NASA standard on models and simulations grew out of the Columbia Accident Investigation Board (CAIB) Report. The CAIB Report found problems pertaining to "ineffective and inconsistent application of M&S tools, along with cases of misuse." It called on NASA to "develop, validate, and maintain physics-based computer models to evaluate Thermal Protection System damage from debris impacts. These tools should provide realistic and timely estimates of any impact damage from possible debris from any source that may ultimately impact the Orbiter." NASA was to establish impact damage thresholds that trigger responsive corrective action, such as on-orbit inspection and repair, when indicated.
Lessons learned and their applicability to the need to perform verification and validation are well documented in the resources identified above. Any program or project utilizing modeling and simulation products should review and retain them throughout the software development life cycle.
A documented lesson from the NASA Lessons Learned database notes the following:
Performance Decrease due to Propulsion Thruster Plume Impingement on the Voyager Spacecraft, Lesson Number: 0377:
"A 21% shortfall in Voyager's velocity change was suspected to be due to exhaust plume impingement. Due to the complexity of spacecraft/thruster configurations, additional care must be taken in the development and utilization of spacecraft and plume models. Analysis should be conducted on early and final designs." 582