1. Requirements


4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

1.1 Notes

Information regarding specific V&V techniques and the analysis of models and simulations can be found in NASA-STD-7009, Standard for Models and Simulations, NASA-HDBK-7009, Handbook for Models and Simulations, or discipline-specific recommended practice guides.

1.2 History

See the SWE-070 History page to view the history of this requirement.

1.3 Applicability Across Classes

Applicability by software class:

  • Class A: Applicable
  • Class B: Applicable
  • Class C Safety Critical (CSC): Applicable
  • Class C: Applicable
  • Class D: Not Applicable
  • Class D Safety Critical (DSC): Applicable
  • Class E: Not Applicable
  • Class F: Not Applicable
  • Class G: Not Applicable
  • Class H: Not Applicable


2. Rationale

Performing verification and validation (V&V) to accredit software models, simulations, and analysis tools is important to ensure the credibility of the results produced by those tools. Critical decisions may be made, at least in part, based upon the results produced by models, simulations, and analysis tools. Reducing the risk associated with these decisions is one reason to use accredited tools that have been properly verified and validated.


3. Guidance

The processes of V&V are key activities for accrediting all types of models and simulations. To establish whether models, simulations, and analysis tools are acceptable for a specific purpose, capture and address three pieces of information from the modeling and simulation (M&S) characterization as part of this process:

  1. The question(s) to be answered and the particular aspects of the problem that the M&S will be used to help address.
  2. The decisions that will be made based on the M&S results.
  3. The consequences resulting from erroneous M&S outputs. (SWEREF-175)

Information for the V&V of models, simulations, and tools used to develop software code can be found in NASA-STD-7009, Standard for Models and Simulations (SWEREF-272), and NPR 7150.2. Center requirements and associated processes flowed down from these two documents address numerical accuracy, uncertainty analysis, sensitivity analysis, and V&V of models and simulations.

The use of NASA-STD-7009 in fulfilling the requirements of NPR 7150.2 is described in Topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009, which also provides a list of additional resources for V&V of models and simulations used in the software development life cycle.

Per section 4.4 of NASA-STD-7009 (SWEREF-272), the following basic activities are to be performed for verification of models and simulations:

  • Document verification techniques and the conditions under which the verification was conducted.
  • Document any numerical error estimates for the results of the computational model (a sketch of one such estimate follows this list).
  • Document the verification status.
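
To make the numerical-error bullet concrete, here is a minimal sketch of one common way to estimate discretization error for a computational model: Richardson extrapolation over three systematically refined grids (a grid convergence study). The function, the refinement ratio, and the sample values are hypothetical illustrations; NASA-STD-7009 does not mandate this particular technique.

    import math

    def grid_convergence(f_coarse, f_medium, f_fine, r=2.0, safety=1.25):
        """Estimate discretization error from three grid solutions (refinement ratio r)."""
        # Observed order of convergence from the three solutions.
        p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
        # Richardson-extrapolated ("grid-independent") value.
        f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)
        # Grid convergence index: a conservative relative error band on the fine grid.
        gci = safety * abs((f_fine - f_medium) / f_fine) / (r ** p - 1)
        return p, f_exact, gci

    # Hypothetical peak-temperature results from three successively refined runs.
    p, f_exact, gci = grid_convergence(912.4, 905.1, 903.2)
    print(f"observed order p = {p:.2f}")
    print(f"extrapolated value = {f_exact:.1f}")
    print(f"estimated error band (GCI) = {100 * gci:.2f}%")

Documenting the observed order alongside the error band gives reviewers a way to judge whether the model is operating in its asymptotic range.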

Per section 4.4 of NASA-STD-7009 (SWEREF-272), the following basic activities are to be performed for validation of models and simulations:

  • Document techniques used to validate the models and simulations for their intended use.
  • Document the conditions under which the validation was conducted.
  • Document any validation metrics and any validation data sets used (a sketch of one simple metric follows this list).
  • Document any studies conducted.
  • Document validation results.
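
As one concrete example of a validation metric, the sketch below compares simulation output against referent test data using root-mean-square error and a pass/fail tolerance. The data, tolerance, and names are hypothetical; projects choose metrics and thresholds appropriate to the intended use and agree on them with the accreditation authority.

    import math

    def rmse(predicted, observed):
        """Root-mean-square error between simulation output and referent test data."""
        assert len(predicted) == len(observed), "data sets must align point-for-point"
        return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted))

    # Hypothetical referent data from a hardware test vs. matching simulation output.
    test_data  = [1.02, 1.98, 3.05, 4.01]
    sim_output = [1.00, 2.00, 3.00, 4.10]

    metric = rmse(sim_output, test_data)
    tolerance = 0.10  # acceptance threshold for the intended use
    print(f"RMSE = {metric:.3f} -> {'PASS' if metric <= tolerance else 'FAIL'}")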

To better understand the uncertainties affecting the results of the models and simulations, follow the required steps in section 4.4 of NASA-STD-7009 and the steps for assessing and reporting the credibility of model and simulation results found in sections 4.7 and 4.8 of NASA-STD-7009 (SWEREF-272).
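
One widely used way to characterize how input uncertainty propagates to results is Monte Carlo sampling: draw uncertain inputs from assumed distributions, run the model on each draw, and report the spread of the outputs. The model and the distributions below are hypothetical placeholders, not anything prescribed by NASA-STD-7009.

    import random
    import statistics

    def model(thrust_n, mass_kg):
        """Hypothetical stand-in for a simulation run: acceleration = thrust / mass."""
        return thrust_n / mass_kg

    random.seed(1)  # fixed seed so the documented analysis is reproducible
    n = 10_000
    outputs = sorted(
        model(random.gauss(2000.0, 50.0),  # thrust [N], assumed 1-sigma of 50 N
              random.gauss(150.0, 2.0))    # mass [kg], assumed 1-sigma of 2 kg
        for _ in range(n)
    )

    mean, stdev = statistics.mean(outputs), statistics.stdev(outputs)
    lo, hi = outputs[int(0.025 * n)], outputs[int(0.975 * n) - 1]
    print(f"acceleration: mean = {mean:.2f} m/s^2, stdev = {stdev:.2f}, "
          f"95% interval = [{lo:.2f}, {hi:.2f}]")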

Key points:

  1. Simulations/emulations must be kept in sync with hardware/software updates.
  2. Unidentified coding or logic errors in simulators/emulators used by programs could lead to flight software errors or incorrect flight software.
  3. Projects should verify that their simulators/emulators have been tested and validated against flight or flight-like hardware before use.
  4. Projects should independently assess the accuracy of simulator/emulator design and timing controls to verify program-to-program interfaces are correctly documented and support the hazard-cause risk ranking.

4. Small Projects

Small projects may choose to lighten their verification and validation (V&V) and accreditation effort by using software models, simulations, and analysis tools that were verified, validated, and accredited by other NASA projects that used these tools in a similar manner and for a similar purpose. To determine whether this option is relevant and useful, small projects need to understand the differences between the two projects, the prior project's V&V and accreditation activities, and the versions of the models, simulations, and analysis tools on which those activities were performed.


5. Resources

5.1 References

References cited in this requirement:

  • SWEREF-175
  • SWEREF-272: NASA-STD-7009, Standard for Models and Simulations
  • SWEREF-623: Cook, D. and Skinner, J., "How to Perform Credible Verification, Validation, and Accreditation for Modeling and Simulations"

5.2 Tools


See the Tools Table Statement page of this Handbook for tools that support this requirement.


6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

Topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009, as well as NASA-STD-7009 itself (SWEREF-272), includes lessons learned for the verification and validation of models and simulations.


7. Software Assurance

4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

7.1 Tasking for Software Assurance

  1. Confirm the software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment have been validated and accredited.

7.2 Software Assurance Products

  • None at this time.

Objective Evidence:

  • Validation and accreditation criteria for software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
  • Software test results for software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
  • Use of NASA-STD-7009 for accreditation of software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.

See the Definition of Objective Evidence page for the definition of objective evidence.

7.3 Metrics

  • # of Non-Conformances identified in models, simulations, and tools over time (Open, Closed, Severity)

7.4 Guidance

Task 1:

For this requirement, software assurance needs to confirm that any software models, simulations, and analysis tools used for the qualification of flight software or flight equipment have been validated and accredited. To be accredited means the models, simulations, and analysis tools have been officially certified as acceptable for the specific use for which they are intended.

Performing V&V to approve software models, simulations, and analysis tools is important to ensure the credibility of those tools' results and to understand how each tool directly affects the software being developed. Critical decisions may be made, at least in part, based upon the results produced by these tools. Reducing the risk associated with these decisions is one reason to assure that the approved tools have been properly verified and validated. Information regarding specific V&V techniques and the analysis of models and simulations not covered here can be found in NASA-STD-7009, Standard for Models and Simulations (SWEREF-272).

The first step in confirming the validation and accreditation of the models, simulations, and analysis tools used for the qualification of flight software or flight equipment is to obtain a list of those being used. Then check whether the project has verified, validated, and approved them for use in the development, analysis, testing, or maintenance of the flight software or flight equipment.

Examples of software tools that directly affect software code development include, but are not limited to, the compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools used by a project. Another example is a project's use of a linked code library, which would typically be validated and accredited before use in the executable code.
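
As a sketch of how such a check might be mechanized, the snippet below compares the tools actually in use against an accredited baseline of names, versions, and binary hashes. The baseline contents, tool names, and digests are hypothetical illustrations, not a prescribed format.

    import hashlib
    from pathlib import Path

    # Hypothetical accredited-tool baseline: name -> (approved version, SHA-256 digest).
    ACCREDITED_BASELINE = {
        "arm-gcc":  ("10.3.1", "2f1a...e9"),  # placeholder digest
        "code-cov": ("4.2.0",  "9bc0...11"),  # placeholder digest
    }

    def sha256_of(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def check_tool(name: str, version: str, binary: Path) -> bool:
        """Return True only if the tool matches its accredited baseline entry."""
        entry = ACCREDITED_BASELINE.get(name)
        if entry is None:
            print(f"{name}: NOT on the accredited baseline")
            return False
        matches = entry == (version, sha256_of(binary))
        print(f"{name}: {'matches baseline' if matches else 'version/hash mismatch'}")
        return matches

The same pattern extends to linked code libraries: record the accredited version and digest, and flag the build when they drift.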

There is information in the software guidance section of this requirement on validating and accrediting these tools. Another reference is "How to Perform Credible Verification, Validation, and Accreditation for Modeling and Simulations" by Dr. David Cook and Dr. James Skinner (SWEREF-623).

Task 2:

Software assurance also needs to confirm that flight software or flight equipment has been properly calibrated before use, if applicable. Generally, tools that need calibration carry a calibration tag that states the last calibration date and the expiration of that calibration (the date when recalibration is required). If a tool is found to be out of calibration before use in a test, bring this to the attention of the test director or project manager and delay the test until the tool is calibrated or replaced with a calibrated tool.
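
A minimal sketch of that calibration check, assuming the project keeps calibration records with an expiration date per tool (the record format and tool names are hypothetical):

    from datetime import date

    # Hypothetical calibration records: tool -> date its calibration expires.
    calibration_records = {
        "signal-generator-04": date(2026, 3, 1),
        "flight-sim-bench-02": date(2024, 11, 15),
    }

    def tools_out_of_calibration(test_date: date) -> list[str]:
        """Tools whose calibration will have expired by the planned test date."""
        return [tool for tool, expires in calibration_records.items()
                if expires <= test_date]

    expired = tools_out_of_calibration(date(2025, 6, 1))
    if expired:
        print("Delay the test; out-of-calibration tools:", ", ".join(expired))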

If the software tools (e.g., simulators, models, simulations, emulators, compiler libraries, built-in memory checkers, materials analysis, trajectory analysis) have an impact on the safety of the system, then determine whether they are correctly applied and used, whether they are operated within the range of their limitations, whether the results are documented, etc. Typically, once it is determined that a tool has safety implications, an assessment is performed on the severity of impact if the tool provides one or more wrong outputs, and the likelihood of that occurring determines the level of effort or hazard mitigations to employ.

Software assurance may want to check on the following other considerations for software models, simulations, and analysis tools:

  • When models are used to design, develop, analyze, test, or maintain software systems, assure that:
    • The models are kept up to date and under configuration management, incorporating changes as needed.
    • The models incorporate the requirements, constraints, and design features.
    • Inputs are correct and complete.
    • The models are testable, tested, and accredited.
  • When using test tools, simulations, models, and environments, check:
    • Are up-to-date versions and licenses being used on purchased tools?
    • Do the models have the level of fidelity and completeness required to determine the needed requirements and functionality?
    • Have any concerns or risks been recorded and resolved?
    • Are tools, models, and simulators being operated within the parameters/limitations of their capabilities (see the sketch after this list)?
  • Have any operations of the software system or supporting tools, models, or simulators outside known boundaries, parameters, or limitations been documented, and the risks entered into the risk management system?
  • Are the results as expected, or can they be explained?
  • Is there a report on the limits, functioning, and results of all simulators, models, and tools, along with an analysis of the level of certainty/trust in the outcomes, including any concerns, risks, or issues?
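
A minimal sketch of one way to flag operation outside a tool's validated envelope, as mentioned in the list above, assuming the envelope is expressed as per-parameter ranges (the parameter names and limits are hypothetical):

    # Hypothetical validated envelope for a simulator: parameter -> (min, max).
    VALIDATED_ENVELOPE = {
        "mach":       (0.0, 4.5),
        "altitude_m": (0.0, 60_000.0),
    }

    def out_of_envelope(run_inputs: dict[str, float]) -> list[str]:
        """Parameters in this run that fall outside the validated envelope."""
        violations = []
        for name, value in run_inputs.items():
            lo, hi = VALIDATED_ENVELOPE.get(name, (float("-inf"), float("inf")))
            if not lo <= value <= hi:
                violations.append(f"{name}={value} outside [{lo}, {hi}]")
        return violations

    for finding in out_of_envelope({"mach": 5.1, "altitude_m": 42_000.0}):
        print("Document and enter into the risk management system:", finding)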