

SWE-193 - Acceptance Testing for Affected System and Software Behavior

1. Requirements

4.5.13 The project manager shall develop acceptance tests for loaded or uplinked data, rules, and code that affects software and software system behavior.  

1.1 Notes

These acceptance tests should validate and verify the data, rules, and code for nominal and off-nominal scenarios.

1.2 History

SWE-193 - Last used in rev NPR 7150.2D

Rev: A
SWE Statement: (none)

Difference between A and B: N/A

Rev: B
SWE Statement: (none)

Difference between B and C: NEW. Class A or B only.

Rev: C
SWE Statement: 4.5.13 The project manager shall develop acceptance tests for loaded or uplinked data, rules, and code that affects software and software system behavior.

Difference between C and D: No change

Rev: D
SWE Statement: 4.5.13 The project manager shall develop acceptance tests for loaded or uplinked data, rules, and code that affects software and software system behavior.



1.3 Applicability Across Classes

 

Class          A      B      C      D      E      F

Applicable?

Key: ✓ - Applicable | ✗ - Not Applicable


2. Rationale

Any uploaded or uplinked data, rules, and code can affect the behavior of the software and/or system. Special acceptance tests should be developed to validate and verify the uplinked or uploaded information for nominal and off-nominal scenarios.

3. Guidance

3.1 Acceptance Test

Acceptance Test (see SWE-034 - Acceptance Criteria) is a system-level test, usually performed during the final integration of formally tested software with the intended flight or operations hardware, before Project-level acceptance. Additional information may be found in NPR 7123.1 and NASA-SP-2007-6105.

Uploaded or uplinked data, rules, and code may affect the behavior of the software and/or system. Special acceptance tests should be developed to validate and verify the uplinked or uploaded information for nominal and off-nominal scenarios. See also Topic 8.01 - Off Nominal Testing.

See also SWE-066 - Perform Testing and SWE-068 - Evaluate Test Results.
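As a rough illustration only, the sketch below shows what acceptance tests for an uplinked parameter table might look like for one nominal and one off-nominal scenario. It uses pytest, and the `load_uplink_table` loader, parameter names, and limits are hypothetical stand-ins, not an actual flight software interface.

```python
# Illustrative only: the loader, parameter names, and limits are hypothetical.
import pytest

VALID_TABLE   = {"thruster_pulse_ms": 120, "max_attitude_rate_dps": 2.0}
CORRUPT_TABLE = {"thruster_pulse_ms": -5,  "max_attitude_rate_dps": 2.0}

def load_uplink_table(table: dict) -> dict:
    """Hypothetical loader: validates an uplinked parameter table before use."""
    if table["thruster_pulse_ms"] <= 0:
        raise ValueError("thruster_pulse_ms must be positive")
    return table

def test_nominal_uplink_accepted():
    # Nominal scenario: a well-formed table loads and preserves its values.
    assert load_uplink_table(VALID_TABLE)["thruster_pulse_ms"] == 120

def test_off_nominal_uplink_rejected():
    # Off-nominal scenario: an out-of-range value is rejected, not silently used.
    with pytest.raises(ValueError):
        load_uplink_table(CORRUPT_TABLE)
```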

3.2 Acceptance Criteria

Acceptance activities for software development begin with planning and developing acceptance criteria during the Formulation phase of the project. These activities and acceptance criteria can be documented in the Software Development/Management Plan (see 5.08 - SDP-SMP - Software Development - Management Plan) or in a separate Software V&V Plan. As understanding of the system grows and the requirements are better understood, the acceptance criteria should be reviewed to ensure they are up to date and consistent with the system being built. Acceptance activities typically conclude with the Systems Acceptance Review late in the Implementation phase of the project (see the entrance and exit criteria for the Systems Acceptance Review in topic 7.09 - Entrance and Exit Criteria). See also 8.01 - Off Nominal Testing.

3.3 Acceptance Testing

Acceptance Testing is the formal testing conducted to determine whether a software system satisfies its acceptance criteria, enabling the customer to determine whether or not to accept the system. Acceptance testing is designed to determine whether the software work product is fit for use, and it typically forms a major portion of the acceptance plan. Once the acceptance tests, commonly grouped into a test suite, are developed, the team runs them against the supplied input data and conditions. The software testing personnel are typically, but not always, independent of the project team. Software assurance personnel observe the tests. The team compares the obtained test results with the expected results. If the results match or fall within a previously agreed-to band or tolerance, the test suite is said to pass, and the work product is acceptable. If not, the work product may either be rejected or accepted on conditions previously agreed to between the customer and the software development team.
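One common way to implement the "previously agreed-to band or tolerance" comparison is a per-parameter tolerance check of obtained results against expected results. The sketch below is a generic illustration under that assumption; the parameter names and tolerance values are hypothetical, not a prescribed handbook procedure.

```python
# Generic illustration: compare obtained test results against expected results
# within agreed per-parameter tolerances. Names and values are hypothetical.
def within_tolerance(obtained: dict, expected: dict, tolerances: dict) -> bool:
    """Return True if every expected value is matched within its tolerance."""
    return all(
        abs(obtained[name] - expected[name]) <= tolerances[name]
        for name in expected
    )

expected   = {"battery_voltage": 28.0, "downlink_rate_kbps": 256.0}
tolerances = {"battery_voltage": 0.5,  "downlink_rate_kbps": 1.0}
obtained   = {"battery_voltage": 28.3, "downlink_rate_kbps": 255.6}

if within_tolerance(obtained, expected, tolerances):
    print("Test suite passes: results within the agreed tolerance band")
else:
    print("Out of tolerance: candidate for rejection or conditional acceptance")
```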

3.4 Test Results

Verification results that are used in software acceptance reviews are typically documented in software verification reports. Test results for software work products subjected to acceptance testing must be documented in a test report or an acceptance data package. See 5.11 - STR - Software Test Report for related information.

3.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.6 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

Perform acceptance testing for nominal scenarios on the final hardware against input data and conditions that affect the behavior of the system. Validation tests may be used as the acceptance tests. Test results need to be documented and evaluated against expected results to determine whether the software passes and is acceptable to the customer.

5. Resources

5.1 References


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

Configurable Data Loads (CDL)

  • Definition: CDLs contain updateable parameters that are loaded into flight software and can control safety-critical functions.
  • Safety-critical data is a shared responsibility between the Subsystem Responsible Engineers and the Flight Software Team, with oversight from Systems Engineering.
  • Maintain traceability between data loads and software verification test procedures to support timely verification of late-breaking changes.
  • Predefine the verification/validation needed for all CDLs.
  • Pre-declare CDL values that are expected/allowed to change, with the associated nominal verification activities (a minimal screening sketch follows this lesson).
    • Changes outside this list need Engineering Control Board approval and a verification plan for each change.

Bottom Line: Safety-critical data must be treated with the same rigor as safety-critical software.

  • Configuration Management
  • Verification and Validation
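As a rough illustration of the pre-declared-values practice above, the sketch below screens an uplinked configurable data load against an allow-list of parameters and ranges. The parameter names, ranges, and the `screen_cdl` helper are hypothetical, not taken from any NASA program.

```python
# Minimal sketch: screen a configurable data load (CDL) against pre-declared
# parameters and allowed ranges. All names and limits here are hypothetical.
ALLOWED_CDL_RANGES = {
    "heater_setpoint_c": (10.0, 40.0),
    "fault_timeout_s":   (1.0, 30.0),
}

def screen_cdl(cdl: dict) -> list[str]:
    """Return a list of findings; an empty list means the load passes screening."""
    findings = []
    for name, value in cdl.items():
        if name not in ALLOWED_CDL_RANGES:
            # Not on the pre-declared list: flag for control board review.
            findings.append(f"{name}: not pre-declared; requires board approval and a verification plan")
            continue
        low, high = ALLOWED_CDL_RANGES[name]
        if not (low <= value <= high):
            findings.append(f"{name}: value {value} outside allowed range [{low}, {high}]")
    return findings

# Example: one pre-declared, in-range value and one undeclared parameter.
print(screen_cdl({"heater_setpoint_c": 22.5, "main_engine_throttle": 0.9}))
```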

7. Software Assurance

SWE-193 - Acceptance Testing for Affected System and Software Behavior
4.5.13 The project manager shall develop acceptance tests for loaded or uplinked data, rules, and code that affects software and software system behavior.  

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that the project develops acceptance tests for loaded or uplinked data, rules, and code that affect software and software system behavior.

2. Confirm that the loaded or uplinked data, rules, scripts, or code that affect software and software system behavior are baselined in the software configuration system. 

3. Confirm that loaded or uplinked data, rules, and scripts are verified as correct prior to operations, particularly for safety-critical operations.

7.2 Software Assurance Products

  • None at this time.


    Objective Evidence

    • Software test reports
    • Software test procedures
    • Software configuration data

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence as well as the examples listed above.

7.3 Metrics

  • None identified at this time.

7.4 Guidance

When acceptance testing is being done, it is important to ensure that the entire software system will perform correctly in the intended operational environment. Uploaded or uplinked data, rules, and code form an important part of the system and must contain correct data and work properly for the system to function as intended. Since uploaded or uplinked data, rules, scripts, and code may affect the behavior of the software and/or system, special acceptance tests should be developed to validate and verify the uplinked or uploaded information for both nominal and off-nominal scenarios. Software assurance needs to verify that tests have been developed and successfully run for these items. See also 8.01 - Off Nominal Testing.

Since the loaded/uplinked data, rules, scripts, and code can be easily changed during operations, it is also important for software assurance to confirm that they are kept under configuration management and changed only through the configuration management process.
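One lightweight way to support this confirmation is to compare a digest of the data actually loaded against the digest recorded when that data load was baselined in the configuration management system. The sketch below assumes file-based data loads; the paths and the source of the baselined digest are illustrative only.

```python
# Illustrative check that a loaded data file matches its baselined version.
# File paths and the source of the baselined digest are assumptions.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_baseline(loaded_file: Path, baselined_digest: str) -> bool:
    """True if the loaded file is byte-identical to the baselined version."""
    return sha256_of(loaded_file) == baselined_digest

# Example usage with a hypothetical path and a digest taken from the CM records:
# print(matches_baseline(Path("uplink/cmd_table.bin"), "<digest from CM baseline>"))
```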

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:
