Wiki Markup
{alias:SWE-073}
{tabsetup:1. The Requirement|2. Rationale|3. Guidance|4. Small Projects|5. Resources|6. Lessons Learned}


Wiki Markup
{div3:id=tabs-1}

h1. 1. Requirements

3.4.9 The project shall ensure that the software system is validated on the targeted platform or high-fidelity simulation.

h2. {color:#003366}{*}1.1 Notes{*}{color}

Typically, a high-fidelity simulation has the exact processor, processor performance, timing, memory size, and interfaces as the flight unit.

h2. 1.2 Applicability Across Classes

Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

{applicable:asc=1|ansc=1|bsc=1|bnsc=1|csc=1|cnsc=1|dsc=1|dnsc=0|esc=1|ensc=0|f=1|g=p|h=0}
{div3}

Wiki Markup
{div3:id=tabs-2}

h1. 2. Rationale

Validation is a process of evaluating work products to ensure that the right behaviors have been built into them. The right behaviors adequately describe what the system is supposed to do, including what it is supposed to do under adverse conditions. They may also describe what the system is not supposed to do.

Validation is performed to assure that the specified software systems fulfill their intended use when placed on the targeted platform in the target environment (or a simulated target environment). The methods used to accomplish validation on the actual target platform or in a high-fidelity simulator may include aspects that were applied to previous software work products (requirements, designs, prototypes, etc.). The use of these methods provides continuity of results as the system is assembled. The use of the high-fidelity or targeted system allows the software developers to check systems-level interfaces, memory performance and constraints, event timing, and other characteristics that can only be evaluated properly in the real system or a near-system environment (see [SWE-055|SWE-055]). Validation activities include preparation, performance, analysis of results, and identification of corrective action. Validation at the systems level ensures that the correct product has been built. {sweref:001}
{div3}
Wiki Markup
{div3:id=tabs-3}

h1. 3. Guidance

The basic validation process is shown below with the steps addressed by this requirement highlighted:


!SWE-073 new graphic.jpg|border=0!



{panel}Validation activities are not to be confused with verification activities, as each has a specific goal. Validation confirms that the right product is being produced, while verification confirms that the product being produced meets the specified requirements correctly. {panel}

Validation, as used in this requirement, addresses the following:
* Confirmation of the correctness, completeness, clarity, and consistency of the requirements with stakeholders.
* Confirmation that implied or inherent requirements (e.g., system must do X before Y) are correctly implemented.

See [SWE-055|SWE-055] for additional information on requirements validation during the concept, design, coding, and initial testing phases of the software development life cycle.

Once the software work products have been integrated into a software system, validation activities are concentrated on systems-level effects, interactions, interfaces, and the overall behavior of the system (i.e., whether the system is providing for and meeting the needs of the customer). This level of validation can be accomplished in either an actual operational environment with the use of the targeted platform, or if this combination is not viable, on a high-fidelity simulator. Recall from the note associated with this requirement that a high-fidelity simulation typically has the exact processor, processor performance, timing, memory size, and interfaces as the flight unit. 
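The note's definition of a high-fidelity simulation can be checked mechanically before systems-level validation begins. The sketch below is illustrative only: the spec fields and example values (processor name, clock rate, memory size, bus interfaces) are hypothetical, not taken from this Handbook. It treats a simulator as high-fidelity only when every characteristic matches the flight unit exactly:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlatformSpec:
    """Characteristics that must match the flight unit exactly for a
    simulation to count as high-fidelity (per the Notes for this
    requirement). Fields and values here are hypothetical examples."""
    processor: str
    clock_mhz: int
    memory_kb: int
    interfaces: frozenset

def is_high_fidelity(sim: PlatformSpec, flight_unit: PlatformSpec) -> bool:
    # Any mismatch in any characteristic disqualifies the simulator
    # for systems-level validation under this requirement.
    return sim == flight_unit

flight = PlatformSpec("RAD750", 132, 36864, frozenset({"MIL-STD-1553", "SpaceWire"}))
sim_ok = PlatformSpec("RAD750", 132, 36864, frozenset({"MIL-STD-1553", "SpaceWire"}))
sim_bad = PlatformSpec("RAD750", 200, 36864, frozenset({"MIL-STD-1553", "SpaceWire"}))

print(is_high_fidelity(sim_ok, flight))   # True
print(is_high_fidelity(sim_bad, flight))  # False: clock rate differs
```

A real project would draw these characteristics from the flight hardware specification and the simulator's configuration records rather than literals in code.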

The following scenarios provide additional considerations for selection of the most appropriate validation approach at the systems level:
* Operational environment demonstrations. 
** Running the software in an actual operational environment.
** Using this technique to confirm that implied, derived, and inherent requirements such as "the software will run" are properly fulfilled in the target environment.
** Using this technique to view a system or subsystem as a collected implementation of the requirements and confirm that the software product fulfills its intended purpose, not just individual requirements, but as a collected set of requirements, addressing needs, expected behavior, and functionality.
* Behavior in a simulated environment.
** Running the software in a simulated operational environment.
** Using this technique when running the system in the actual environment is not possible or is impractical (costly).
** Using this technique to view a system or subsystem as a collected implementation of the requirements and confirm that the product fulfills its intended use, not just individual requirements, but as a collected set of requirements, addressing needs, expected behavior, and functionality.
** {panel}See Lessons Learned for other considerations related to simulated environment validation. {panel}
* Portability requirements may require the software to run on a variety of platforms.
** Validate portability by running appropriate software and system tests on all the required platforms.
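The portability point above can be sketched as a simple cross-platform test runner: the same validation suite is replayed on every required platform, and portability is only considered validated when every test passes everywhere. The platform names and tests below are hypothetical placeholders for a project's real test harness:

```python
def run_validation_suite(platform_name, tests):
    """Run every test against one platform and collect pass/fail results.

    `tests` maps a test name to a callable taking the platform name;
    both are stand-ins for a project's real system-level tests.
    """
    return {name: test(platform_name) for name, test in tests.items()}

# Hypothetical tests standing in for real system-level checks.
tests = {
    "boots": lambda p: True,
    "telemetry_format": lambda p: p != "legacy-sim",
}

required_platforms = ["flight-unit", "hifi-sim", "legacy-sim"]
results = {p: run_validation_suite(p, tests) for p in required_platforms}

# Portability is validated only when every test passes on every platform.
portable = all(all(r.values()) for r in results.values())
print(portable)  # False: telemetry_format fails on legacy-sim
```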

Also, consider user-created operational scenarios, when appropriate. They can be a valuable tool in either simulated or operational environments.
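One way to make user-created operational scenarios repeatable in either a simulated or an operational environment is to script them as an ordered list of command/expected-observation steps. Everything in this sketch (the command names, the `send` stub, the canned responses) is a hypothetical stand-in for a project's real command and telemetry interface:

```python
# A user-created operational scenario as ordered (command, expected
# observation) steps. Names are illustrative, not from any real mission.
scenario = [
    ("power_on_instrument", "instrument_ready"),
    ("start_observation", "data_streaming"),
    ("safe_instrument", "instrument_safed"),
]

def send(command):
    # Stub: a real harness would route the command to the target
    # platform or simulator and return the observed telemetry.
    canned = {
        "power_on_instrument": "instrument_ready",
        "start_observation": "data_streaming",
        "safe_instrument": "instrument_safed",
    }
    return canned[command]

def run_scenario(steps):
    """Replay each step in order; return the first mismatch, or None
    if every observation matched its expectation."""
    for command, expected in steps:
        observed = send(command)
        if observed != expected:
            return (command, expected, observed)
    return None

print(run_scenario(scenario))  # None: every step matched
```

The same scripted scenario can then be replayed unchanged against the flight unit and the high-fidelity simulator, which helps compare behavior across the two environments.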
\\
Additional guidance related to platform or high-fidelity simulations may be found in the following related requirements in this Handbook:
| [SWE-029|SWE-029] | Validation Planning |
| [SWE-031|SWE-031] | Validation Results |
\\
\\
{div3}
Wiki Markup
{div3:id=tabs-4}

h1. 4. Small Projects

The small project does not normally involve highly complex platforms, so it is generally easier and cheaper to validate software systems on the targeted platform. However, the environment for space systems will typically need to be simulated during validation for projects regardless of size. When using simulated platforms, small projects are advised to look for existing tools rather than creating their own.
{div3}
Wiki Markup
{div3:id=tabs-5}

h1. 5. Resources



{refstable}

{Toolstable}


{div3}
Wiki Markup
{div3:id=tabs-6}

h1. 6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to simulations:
* *Aero-Space Technology/X-34 In-Flight Separation from L-1011 Carrier, Lesson No. 1122*: A recent NASA technology program recognized the need to validate its flight (systems) level software in a series of simulated environments because of the concern over its inability to validate the software on the targeted platform (i.e., the X-34 separation from an L-1011 aircraft) ahead of the operational mission. The concern was heightened because of the seemingly distributed nature of the mission's safety functions among the project's participants. {sweref:539}
* *Testbed Limitations May Impact End-to-End Flight System Testing, Lesson No. 3716*: "After 11 year\[s\] of spaceflight, it was discovered that the dual string Stardust/NExT spacecraft was incapable of switching to the redundant flight system. Flight software changes made only 3 weeks before launch had inhibited side swapping, and the testbed that had verified the changes was not capable of simulating redundancy switching. When it is infeasible to test such changes using the flight system integrated with the launch system, assure that the system testbed is fully equipped for end-to-end simulation of the flight system." The Recommendation states: "When it is only feasible to test 'last minute' command changes or flight software changes via simulation, instead of using the flight system that has been integrated with the launch vehicle, assure that the simulation testbed is capable of end-to-end verification of the impact on all flight software functions, including fault protection. Should the system testbed lack high fidelity features such as dual string simulation, the project should identify potential testing shortfalls and address how it will validate the test results." {sweref:578}
{div3}

