Tabsetup
1. The Requirement
2. Rationale
3. Guidance
4. Small Projects
5. Resources
6. Lessons Learned
7. Software Assurance
Div
idtabs-1

1. Requirements

Excerpt

4.5.13 The project manager shall develop acceptance tests for loaded or uplinked data, rules, and code that affects software and software system behavior. 

1.1 Notes

These acceptance tests should validate and verify the data, rules, and code for nominal and off-nominal scenarios.

1.2 History

Expand
titleClick here to view the history of this requirement: SWE-193 History

Include Page
SITE:SWE-193 History
SITE:SWE-193 History

1.3 Applicability Across Classes

 

Class   Applicability
A       Applicable
B       Applicable
CSC     Applicable
C       Not Applicable
D       Not Applicable
DSC     Applicable
E       Not Applicable
F       Applicable
G       Not Applicable
H       Not Applicable

Div
idtabs-2

2. Rationale

Any uploaded or uplinked data, rules, and code can affect the behavior of the software and/or system. Special acceptance tests should be developed to validate and verify the uplinked or uploaded information for nominal and off-nominal scenarios.

Div
idtabs-3

3. Guidance

Acceptance Test (see SWE-034) is a system-level test, usually performed during the final integration of formally tested software with the intended flight or operations hardware before Project-level acceptance. Additional information may be found in NPR 7123.1 [SWEREF-041] and NASA-SP-2007-6105 [SWEREF-273].

Uploaded or uplinked data, rules, and code may affect the behavior of the software and/or system. Special acceptance tests should be developed to validate and verify the uplinked or uploaded information for nominal and off-nominal scenarios.
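As an illustration, a nominal/off-nominal acceptance check on an uplinked parameter record might be sketched as follows. This is a minimal sketch, not from the handbook: the record format and the 0–100 operational limits are assumptions for illustration only.

```python
# Hypothetical sketch: acceptance check for an uplinked parameter record,
# exercised under both nominal and off-nominal scenarios.

def validate_uplink(record: dict) -> bool:
    """Accept an uplinked parameter record only if it is well-formed
    and its value lies inside the assumed operational limits."""
    try:
        value = float(record["value"])
    except (KeyError, TypeError, ValueError):
        return False               # off-nominal: missing or malformed field
    return 0.0 <= value <= 100.0   # assumed limits, illustrative only

# Nominal scenario: well-formed record inside limits.
assert validate_uplink({"value": "42.0"}) is True
# Off-nominal scenarios: out of range, malformed, missing field.
assert validate_uplink({"value": "250.0"}) is False
assert validate_uplink({"value": "not-a-number"}) is False
assert validate_uplink({}) is False
```

Real acceptance tests would of course cover the actual uplink formats and limits defined for the mission; the point here is only that both the nominal and the off-nominal paths get exercised.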

Acceptance activities for software development begin with planning and developing acceptance criteria during the Formulation phase of the project. These activities and acceptance criteria can be documented in the Software Development/Management Plan (see 5.08 - SDP-SMP - Software Development - Management Plan) or in a separate Software V&V Plan. As understanding of the system grows and the requirements become better understood, the acceptance criteria should be reviewed to ensure they remain up to date and consistent with the system being built. Acceptance activities typically conclude with the system acceptance review late in the Implementation phase of the project (see the entrance and exit criteria for the Systems Acceptance Review in Topic 7.09 - Entrance and Exit Criteria).

Acceptance Testing is the formal testing conducted to determine whether a software system satisfies its acceptance criteria, enabling the customer to decide whether to accept the system. Acceptance testing is designed to determine whether the software work product is fit for use, and it typically forms a major portion of the acceptance plan. Once the tests are developed, the team runs them, commonly organized as a test suite, against the supplied input data and conditions. The software testing personnel are typically, but not always, independent of the project team. Software assurance personnel observe the tests. The team compares the obtained test results with the expected results. If the results match or fall within a previously agreed-to band or tolerance, the test suite is said to pass, and the work product is acceptable. If not, the work product may either be rejected or accepted on conditions previously agreed to between the customer and the software development team.

Verification results that are used in software acceptance reviews are typically documented in software verification reports. Test results for software work products subjected to acceptance testing must be documented in a test report or an acceptance data package. (See 5.11 - STR - Software Test Report and 5.03 - Inspect - Software Inspection, Peer Reviews, Inspections for related information.)

Additional guidance related to acceptance testing may be found in the following related requirements in this handbook: 

Div
idtabs-4

4. Small Projects

No additional guidance is available for small projects.

Div
idtabs-5

5. Resources

5.1 References

refstable

5.2 Tools

Include Page
Tools Table Statement
Tools Table Statement

Div
idtabs-6

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

Configurable Data Loads (CDL)

  • Definition: CDLs contain updateable parameters that are loaded into flight software and can control safety-critical functions.
  • Safety-critical data is a shared responsibility between the Subsystem Responsible Engineers and the Flight Software Team, with oversight from Systems Engineering.
  • Maintain traceability between data loads and software verification test procedures to support timely verification of late-breaking changes.
  • Predefined verification/validation activities are needed for all CDLs.
  • Pre-declare CDL values that are expected/allowed to change, along with their associated nominal verification activities.
    • Changes outside this list need Engineering Control Board approval and must have a verification plan for every change.

Bottom Line: Safety-critical data must be treated with the same rigor as safety-critical software, including in:

  • Configuration Management
  • Verification and Validation
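The pre-declaration rule above can be sketched as a simple check. The parameter names and allowed ranges below are hypothetical, not actual flight values.

```python
# Hypothetical sketch of the CDL pre-declaration rule: only pre-declared
# parameters, within their declared ranges, may change with nominal
# verification; everything else escalates to the control board.

PRE_DECLARED = {
    # parameter: (min_allowed, max_allowed) -- illustrative assumptions
    "descent_rate_limit": (0.5, 2.0),
    "battery_low_threshold": (22.0, 26.0),
}

BOARD_REVIEW = "requires Engineering Control Board approval and a verification plan"
NOMINAL = "allowed with nominal verification activities"

def review_cdl_change(parameter: str, new_value: float) -> str:
    """Classify a proposed CDL change per the pre-declaration rule."""
    if parameter not in PRE_DECLARED:
        return BOARD_REVIEW          # not on the pre-declared list
    lo, hi = PRE_DECLARED[parameter]
    if lo <= new_value <= hi:
        return NOMINAL               # pre-declared and inside its range
    return BOARD_REVIEW              # pre-declared but out of range

review_cdl_change("descent_rate_limit", 1.2)   # nominal path
review_cdl_change("main_engine_cutoff", 3.0)   # escalates to the board
```

The design point this illustrates is that the allowed-change list is data, kept under configuration management itself, so that the verification burden for each change is decided by policy rather than case by case.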
Div
idtabs-7

7. Software Assurance

Excerpt Include
SWE-193 - Acceptance Testing for Affected System and Software Behavior
SWE-193 - Acceptance Testing for Affected System and Software Behavior

7.1 Tasking for Software Assurance

  1. Confirm that the project develops acceptance tests for loaded/uplinked data, rules, and code that affect software and software system behavior.

  2. Confirm that the loaded/uplinked data, rules, scripts, or code that affects software and software system behavior is baselined on the software configuration system.

7.2 Software Assurance Products

  • None at this time.


    Note
    titleObjective Evidence
    • Software test reports
    • Software test procedures
    • Software configuration data
    Expand
    titleDefinition of objective evidence

    Include Page
    SITE:Definition of Objective Evidence
    SITE:Definition of Objective Evidence

7.3 Metrics

  • None identified at this time.

7.4 Guidance

When acceptance testing is being done, it is important to ensure that the entire software system will perform correctly in its intended operational environment. Uploaded or uplinked data, rules, and code form an important part of the system: they need to contain correct data and work properly for the system to function as intended. Since uploaded or uplinked data, rules, scripts, and code may affect the behavior of the software and/or system, special acceptance tests should be developed to validate and verify the uplinked or uploaded information for both nominal and off-nominal scenarios. Software assurance needs to verify that such tests have been developed and run successfully.

Since the loaded/uplinked data, rules, scripts, and code can be easily changed during operations, it is also important for software assurance to confirm that these items are kept under configuration management and changed only through the configuration management process.
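One way to confirm that a load matches its configuration-managed baseline is a checksum comparison before uplink. This is a minimal sketch; the load contents and the baseline record are assumptions for illustration.

```python
# Illustrative sketch: confirming that a data load matches its baselined
# checksum before uplink. A mismatch indicates an unbaselined change that
# must go back through the configuration management process.

import hashlib

def sha256_of(data: bytes) -> str:
    """Checksum used to identify a baselined load."""
    return hashlib.sha256(data).hexdigest()

def verify_against_baseline(load_data: bytes, baselined_hash: str) -> bool:
    """Return True only if the load is byte-identical to the baseline."""
    return sha256_of(load_data) == baselined_hash

# Hypothetical baseline record captured when the load was approved.
baseline = sha256_of(b"GAIN_TABLE v3")

verify_against_baseline(b"GAIN_TABLE v3", baseline)   # matches the baseline
verify_against_baseline(b"GAIN_TABLE v4", baseline)   # unbaselined change
```

In practice the baselined hash would live in the configuration management system's record for the approved load, so the pre-uplink check is against the controlled record rather than a local value.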