

SWE-073 - Platform or Hi-Fidelity Simulations

1. Requirements

4.5.8 The project manager shall validate the software system on the targeted platform or high-fidelity simulation.

1.1 Notes

Typically, a high-fidelity simulation has the exact processor, processor performance, timing, memory size, and interfaces as the target system.

1.2 History

SWE-073 - Last used in rev NPR 7150.2D

Rev   SWE Statement

A     3.4.9 The project shall ensure that the software system is validated on the targeted platform or high-fidelity simulation.

Difference between A and B: No change

B     4.5.10 The project manager shall validate the software system on the targeted platform or high-fidelity simulation.

Difference between B and C: No change

C     4.5.8 The project manager shall validate the software system on the targeted platform or high-fidelity simulation.

Difference between C and D: No change

D     4.5.8 The project manager shall validate the software system on the targeted platform or high-fidelity simulation.



1.3 Applicability Across Classes

Class        |   A   |   B   |   C   |   D   |   E   |   F
Applicable?  |       |       |       |       |       |

Key: ✓ - Applicable | ✗ - Not Applicable


2. Rationale

Validation is a process of evaluating work products to ensure that the right behaviors have been built into the work products. The right behaviors adequately describe what the system is supposed to do under both nominal and adverse conditions. They may also describe what the system is not supposed to do.

Validation is performed to assure that the specified software systems fulfill their intended use when placed on the targeted platform in the target environment (or simulated target environment). The methods used to accomplish validation on the actual target platform or in a high-fidelity simulator may include aspects that were applied to previous software work products (requirements, designs, prototypes, etc.). The use of these methods provides continuity of results as the system is assembled. The use of the high-fidelity or targeted system allows the software developers to check systems-level interfaces, memory performance and constraints, event timing, and other characteristics that can only be evaluated properly in the real system or near-system environment (see SWE-055 - Requirements Validation). Validation activities include preparation, performance, analysis of results, and identification of corrective action. Validation at the systems level ensures that the correct product has been built. 001

See also SWE-065 - Test Plan, Procedures, Reports, SWE-068 - Evaluate Test Results.

3. Guidance

3.1 Validation Process

The basic validation process is shown below with the steps addressed by this requirement highlighted:

[Figure: basic software validation process]

Validation activities are not to be confused with verification activities, as each has a specific goal. Validation is designed to confirm that the right product is being produced, while verification is conducted to confirm that the product being produced meets the specified requirements correctly.

Validation, as used in this requirement, addresses the following:

  • Confirmation of the correctness, completeness, clarity, and consistency of the requirements with stakeholders.
  • Confirmation that implied or inherent requirements (e.g., the system must do X before Y) are correctly implemented.

See SWE-055 - Requirements Validation  for additional information on requirements validation during the concept, design, coding, and initial testing phases of the software development life cycle.

Once the software work products have been integrated into a software system, validation activities are concentrated on systems-level effects, interactions, interfaces, and the overall behavior of the system (i.e., whether the system is providing for and meeting the needs of the customer). This level of validation can be accomplished either in an actual operational environment using the targeted platform or, if that combination is not viable, on a high-fidelity simulator. A high-fidelity simulation typically has the exact processor, processor performance, timing, memory size, and interfaces as the flight unit.
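
To make the fidelity criteria concrete, the sketch below compares a simulator's characteristics against the target platform on the attributes this requirement calls out. This is a hypothetical illustration, not a NASA tool or Handbook procedure; the PlatformSpec fields, the RAD750 example, and all values are assumptions for illustration only.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class PlatformSpec:
        """Attributes this requirement calls out for a target/simulator match."""
        processor: str         # exact processor model
        clock_mhz: int         # processor performance and timing
        ram_kib: int           # memory size
        interfaces: frozenset  # external interfaces provided

    def fidelity_gaps(target: PlatformSpec, sim: PlatformSpec) -> list:
        """Return mismatches between simulator and target; empty means a full match."""
        gaps = []
        if sim.processor != target.processor:
            gaps.append(f"processor: {sim.processor} vs {target.processor}")
        if sim.clock_mhz != target.clock_mhz:
            gaps.append(f"clock: {sim.clock_mhz} MHz vs {target.clock_mhz} MHz")
        if sim.ram_kib != target.ram_kib:
            gaps.append(f"memory: {sim.ram_kib} KiB vs {target.ram_kib} KiB")
        missing = target.interfaces - sim.interfaces
        if missing:
            gaps.append(f"missing interfaces: {sorted(missing)}")
        return gaps

    # Hypothetical usage: any reported gap is a candidate validation risk.
    target = PlatformSpec("RAD750", 133, 262144, frozenset({"MIL-STD-1553", "SpaceWire"}))
    sim = PlatformSpec("RAD750", 133, 262144, frozenset({"MIL-STD-1553"}))
    for gap in fidelity_gaps(target, sim):
        print("fidelity gap:", gap)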

See also Topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009

3.2 Validation Approach

The following scenarios provide additional considerations for selection of the most appropriate validation approach at the systems level:

  • Operational environment demonstrations. 
    • Running the software in an actual operational environment.
    • Using this technique to confirm that implied, derived, and inherent requirements such as "the software will run" are properly fulfilled in the target environment.
    • Using this technique to view a system or subsystem as a collected implementation of the requirements and confirm that the software product fulfills its intended purpose, not just individual requirements, but as a collected set of requirements, addressing needs, expected behavior, and functionality.
  • Behavior in a simulated environment.
    • Running the software in a simulated operational environment.
    • Using this technique when running the system in the actual environment is not possible or is impractical (costly).
    • Using this technique to view a system or subsystem as a collected implementation of the requirements and confirm that the product fulfills its intended use, not just individual requirements, but as a collected set of requirements, addressing needs, expected behavior, and functionality.

      See Lessons Learned for other considerations related to simulated environment validation.

  • Portability requirements may require the software to be run on a variety of platforms (see the sketch following this list).
    • Validate portability by running appropriate software and system tests on all the required platforms.

Also, consider user-created operational scenarios, when appropriate. They can be a valuable tool in either simulated or operational environments.
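
As an illustration of the portability scenario above, and of running the same user-created operational scenarios in more than one environment, the following sketch parametrizes one validation test over every required platform. It is a hypothetical harness, not from the Handbook: the platform names, scenario names, and the run_scenario hook are assumptions.

    import pytest

    # Hypothetical platform and scenario names; a real project would list its
    # actual required platforms (target hardware, high-fidelity simulator, etc.).
    REQUIRED_PLATFORMS = ["target-platform", "hifi-simulator"]
    SCENARIOS = ["nominal-operations", "safe-mode-entry"]

    def run_scenario(platform: str, scenario: str) -> str:
        # Placeholder hook: a real harness would dispatch the operational
        # scenario to the named platform or simulator and collect the verdict.
        return "PASS"

    @pytest.mark.parametrize("platform", REQUIRED_PLATFORMS)
    @pytest.mark.parametrize("scenario", SCENARIOS)
    def test_scenario_on_each_platform(platform, scenario):
        # The same scenario must validate on every required platform.
        assert run_scenario(platform, scenario) == "PASS"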

3.3 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.4 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

The small project does not normally involve highly complex platforms, so it is generally easier and cheaper to validate software systems on the targeted platform. However, the environment for space systems will typically need to be simulated during validation for projects regardless of size. When using simulated platforms, small projects are advised to look for existing tools rather than creating their own.

Guidance: Platforms for simulating a space environment can vary considerably in complexity. When possible, the team should consider reusing as many resources as feasible to simulate the space environment. If the validation is to be performed on the target platform, special attention should be paid to ensuring that the system is not damaged during validation. To simplify the testing, simulations can utilize predefined data streams (see the sketch below).
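
As one way to realize the predefined data streams mentioned above, the sketch below replays a prerecorded stream into the software under test in place of a live feed. It is a hypothetical illustration, not a Handbook procedure; the file name, frame format, and the process_frame hook are assumptions.

    import json
    from pathlib import Path

    def replay(stream_file: Path):
        """Yield frames from a prerecorded stream (one JSON object per line)."""
        with stream_file.open() as f:
            for line in f:
                yield json.loads(line)

    def process_frame(frame: dict) -> None:
        # Placeholder for the software under test; a real harness would feed
        # each frame into the system's input interface.
        print("frame:", frame)

    if __name__ == "__main__":
        # Hypothetical file name; the recorded stream stands in for a live
        # sensor or environment feed during simulation.
        for frame in replay(Path("recorded_telemetry.jsonl")):
            process_frame(frame)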

5. Resources

5.1 References


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

 The NASA Lessons Learned database contains the following lessons learned related to simulations:

  • Aero-Space Technology/X-34 In-Flight Separation from L-1011 Carrier, Lesson No. 1122 539: A recent NASA technology program recognized the need to validate its flight (systems) level software in a series of simulated environments because of the concern over its inability to validate the software on the targeted platform (i.e., the X-34 separation from an L-1011 aircraft) ahead of the operational mission. The concern was heightened because of the seemingly distributed nature of the mission's safety functions among the project's participants.
  • Testbed Limitations May Impact End-to-End Flight System Testing, Lesson No. 3716 578: "After 11 years of spaceflight, it was discovered that the dual string Stardust/NExT spacecraft was incapable of switching to the redundant flight system. Flight software changes made only 3 weeks before launch had inhibited side swapping, and the testbed that had verified the changes was not capable of simulating redundancy switching. When it is infeasible to test such changes using the flight system integrated with the launch system, assure that the system testbed is fully equipped for end-to-end simulation of the flight system."

    The Recommendation states: "When it is only feasible to test 'last-minute' command changes or flight software changes via simulation, instead of using the flight system that has been integrated with the launch vehicle, assure that the simulation testbed is capable of end-to-end verification of the impact on all flight software functions, including fault protection. Should the system testbed lack high fidelity features such as dual string simulation, the project should identify potential testing shortfalls and address how it will validate the test results."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-073 - Platform or Hi-Fidelity Simulations
4.5.8 The project manager shall validate the software system on the targeted platform or high-fidelity simulation.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that the project validates the software components on the targeted platform or a high-fidelity simulation.

7.2 Software Assurance Products

  • None at this time.


    Objective Evidence

    • Software test procedures
    • Software test plan
    • Software test reports

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email, or memo containing statements that confirmation has been performed, with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA-reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • # of software components (e.g., programs, modules, routines, functions, etc.) planned vs. # released in each build (a sample computation is sketched below)

See also Topic 8.18 - SA Suggested Metrics
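
For illustration, the metric above can be computed from the planned component list and each build's release manifest. The sketch below is a minimal hypothetical example; the component names are made up.

    # Hypothetical component lists, e.g., taken from the build plan and the
    # build's release manifest.
    planned = {"gnc_filter", "comm_handler", "fault_mgr", "telemetry_fmt"}
    released = {"gnc_filter", "comm_handler", "telemetry_fmt"}

    print(f"planned: {len(planned)}, released this build: {len(released & planned)}")
    print("not yet released:", sorted(planned - released))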

7.4 Guidance

For this requirement, confirm that the validation of the software system is being done on the target platform or, if that is not possible, on a high-fidelity simulation. If the testing is done on a high-fidelity simulation, identify any risks associated with using the simulation instead of the intended platform. To identify those risks, consider the capabilities that the high-fidelity simulator is not able to replicate exactly, including any interfaces that cannot provide realistic inputs. There may be risks in any operational scenarios that cannot be tested fully because realistic simulator inputs are not available or because the simulator does not exactly replicate the capabilities of the flight systems.
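
As a purely illustrative aid for recording such risks, the sketch below maps unreplicated simulator capabilities to candidate risk statements. The capability names and affected scenarios are hypothetical assumptions, not Handbook content.

    # Hypothetical mapping from simulator limitations to candidate risk
    # statements for the risk log; capability names are made up.
    unreplicated = {
        "star-tracker optical input": "scenarios needing realistic star fields",
        "1553 bus error injection": "fault-protection responses to bus errors",
    }
    for capability, affected in unreplicated.items():
        print(f"RISK: hi-fi sim cannot replicate {capability}; "
              f"{affected} cannot be fully validated.")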

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

