
A.06 Software Testing

06.00. Software Testing Activity Overview

Software testing is required to ensure that the software meets the agreed requirements and design, works as expected, does not contain serious defects, and fulfills its intended use per user expectations. IV&V may be used to expand test coverage and depth.

This testing is performed as individual modules of code are combined into larger units of code for integration into a system. 

06.00.1 Sub-Activities in Testing

There are two sub-activities in testing:

  • A.06.01 - Software Testing 
  • A.06.02 - Independent Verification and Validation - IV&V 

06.01. Software Testing Activity Overview

Developing test plans and procedures gives stakeholders the opportunity to provide input and to assist with documenting and tailoring the planned testing activities, ensuring that the outcome meets the expectations and goals of the project. Test reports ensure that the results of verification activities are documented and stored in the configuration management system for use in acceptance reviews or readiness reviews.

Before software testing begins, all software items are placed under configuration control. This configuration control captures and identifies the versions of what is being tested.

Test results are the basis for confirming that the team has fulfilled the software requirements in the resulting software product. To make such decisions, test results must be reviewed and evaluated using a documented, repeatable process. The team can derive quality conclusions by capturing the actual test results, comparing them to expected results, analyzing those results against pre-established criteria, and documenting that analysis/evaluation process.
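The capture-compare-evaluate flow described above can be sketched as follows; the test-case structure, field names, and pass/fail criterion are illustrative assumptions, not taken from any NASA standard.

```python
# Minimal sketch of a documented, repeatable result-evaluation step:
# capture actual results, compare them to expected results, and record
# a verdict per test case for later review. (Illustrative names only.)

def evaluate_results(test_cases):
    """Return a review record with a pass/fail verdict for each case."""
    report = []
    for case in test_cases:
        verdict = "pass" if case["actual"] == case["expected"] else "fail"
        report.append({"id": case["id"],
                       "expected": case["expected"],
                       "actual": case["actual"],
                       "verdict": verdict})
    return report

# Example: one matching and one mismatching result.
cases = [
    {"id": "TC-001", "expected": 42, "actual": 42},
    {"id": "TC-002", "expected": "OK", "actual": "ERROR"},
]
report = evaluate_results(cases)
for entry in report:
    print(entry["id"], entry["verdict"])   # TC-001 pass / TC-002 fail
```

In practice the recorded report, not just the verdicts, would be stored under configuration management alongside the tested software version.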

Validation is a process of evaluating work products to ensure that the right behaviors have been built into them. The right behaviors adequately describe what the system is supposed to do under both nominal and adverse conditions; they may also describe what the system is not supposed to do. Validation is performed to assure that the specified software systems fulfill their intended use when placed on the targeted platform in the target environment (or a simulated target environment).

Code coverage analysis identifies which lines of source code have been tested and which have not, providing data that shows the completeness of the executed software tests.
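As a sketch of the idea, Python's standard-library trace module can report which lines a run actually executed; dedicated tools such as coverage.py are normally used for real projects. The classify function is a made-up example.

```python
# Line-coverage sketch using the stdlib "trace" module: run a function
# under a counting tracer, then list the line numbers that executed.
import trace

def classify(n):
    if n < 0:
        return "negative"      # this line is NOT exercised by the run below
    return "non-negative"

tracer = trace.Trace(count=True, trace=False)
result = tracer.runfunc(classify, 5)       # exercises only one branch

counts = tracer.results().counts           # {(filename, lineno): hit count}
executed = sorted(lineno for (_, lineno) in counts)
print("result:", result)
print("executed lines:", executed)         # untested lines are absent here
```

Lines missing from the executed list are exactly the untested lines that a completeness report would flag.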

The purpose of regression testing is to ensure that changes made to the software have not introduced new defects. One of the main reasons for regression testing is to determine whether a change in one part of the software affects other parts. To ensure no new defects are injected into previously integrated or tested software, the project manager should both plan and conduct regression testing to demonstrate that the newly integrated software still behaves as intended.
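The regression check above reduces to re-running an existing suite against recorded baseline results after each change; the checksum function and baseline values below are hypothetical.

```python
# Regression-test sketch: rerun previously passing cases after a change
# and flag any whose output no longer matches the recorded baseline.

def checksum(data):
    """The (recently modified) function under regression test."""
    return sum(data) % 256

# Expected outputs recorded before the change was made.
BASELINE = {
    "empty": ((), 0),
    "small": ((1, 2, 3), 6),
    "wrap":  ((200, 100), 44),
}

def run_regression():
    """Return the names of baseline cases that no longer pass."""
    return [name for name, (args, expected) in BASELINE.items()
            if checksum(args) != expected]

failures = run_regression()
print("regressions:", failures)    # an empty list means no new defects found
```

Any non-empty failure list indicates that the change affected behavior outside the part of the software that was modified.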

Verify by test that any safety features related to system hazards, fault trees, or FMEA events are reliable and work as planned.

Any uploaded or uplinked data, rules, and code can affect the behavior of the software and/or system. Special acceptance tests should be developed to validate and verify the uplinked or uploaded information for nominal and off-nominal scenarios.

Testing of embedded COTS, GOTS, MOTS, Open Source Software (OSS), or reused software components should be at the same level required to accept a custom-developed software component for its intended use.

Frequency Of This Activity

Testing is an activity that is performed continually as software products are developed and become available in release packages. Testing cycles are driven by the following:

  • if the requirements change, the test plans and procedures must also change to ensure that the test activity is accurate, complete, and consistent with the requirements
  • as new release candidates are built, testing is used to check that requirements are properly implemented
  • regression testing confirms that old problems have not crept back into code and that good configuration management practices are in force

06.01.1 Related SWEs

Testing

Cybersecurity

  • SWE-210 - Detection of Adversarial Actions - 3.11.8 The project manager shall identify software requirements for the collection, reporting, and storage of data relating to the detection of adversarial actions.

06.01.2 Related Work Products

06.01.2.1 Related Process Asset Templates

06.01.3 Related Topics


See also IV&V in A.02 Software Assurance and Software Safety

5.10 - STP - Software Test Plan
5.14 - Test - Software Test Procedures
5.11 - STR - Software Test Report
7.08 - Maturity of Life Cycle Products at Milestone Reviews
7.09 - Entrance and Exit Criteria

7.06 - Software Test Estimation and Testing Levels
7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009
7.19 - Software Risk Management Checklists
8.01 - Off Nominal Testing
8.02 - Software Reliability
8.04 - Additional Requirements Considerations for Use with Safety-Critical Software
8.08 - COTS Software Safety Considerations
8.13 - Test Witnessing
8.57 - Testing Analysis

A.06.02 - IV&V




8.53 - IV&V Project Execution Plan


7.08 - Maturity of Life Cycle Products at Milestone Reviews
7.09 - Entrance and Exit Criteria

8.06 - IV&V Surveillance

PAT-018 - Test Plan Checklist

PAT-019 - Test Procedure Checklist

PAT-026 - Test Review Checklist For Test Leads

PAT-027 - Test Review Checklist For Review Teams

PAT-033 - TASKS NEEDING OBJECTIVE EVIDENCE

06.02. IV&V

Independent verification and validation (IV&V) is a part of software assurance that plays a role in NASA's software risk mitigation strategy. OSMA, in conjunction with the responsible Mission Directorate, is responsible for determining which NASA projects have IV&V. The rationale for IV&V on a project is to reduce the risk of failures due to software and to provide assurance that the software will operate as intended, will not operate unexpectedly, and will respond appropriately to adverse conditions. Performing IV&V on projects yields greater confidence that the delivered software products are error-free and meet the customer's needs. IV&V across the project life cycle increases the likelihood of uncovering high-risk errors early in the life cycle.

IV&V is a technical discipline of software assurance that employs rigorous analysis and testing methodologies to identify objective evidence and conclusions to provide an independent assessment of critical products and processes throughout the software development life cycle. The evaluation of products and processes throughout the life cycle demonstrates whether the software is fit for nominal operations (required functionality, safety, dependability, etc.) and off-nominal conditions (response to faults, responses to hazardous conditions, etc.). The goal of the IV&V effort is to contribute assurance conclusions provided to the project and stakeholders based on evidence found in software development artifacts and risks associated with the intended behaviors of the software.


The IV&V Project Execution Plan (IPEP) documents the activities, methods, level of rigor, environments, tailoring (if any) of the IV&V requirements, and criteria to be used in performing verification and validation of the in-scope system/software behaviors (including responsible software components) determined by the planning and scoping effort.

IV&V artifacts and products required to perform the IV&V analysis on NASA projects are to be made available electronically in their original format. Electronic availability of the IV&V products and artifacts facilitates any post-delivery analysis that might be necessary as the software is updated. Electronic access to IV&V artifacts and products reduces NASA's IV&V project costs and accommodates longer-term needs when performing software maintenance.

If the project manager does not address the issues and risks found by IV&V and track them to closure, these unaddressed risks and issues could cause the project to fail to meet its objectives (e.g., schedule, planned quality, functionality). Since IV&V personnel have generally worked across many projects, they are often likely to recognize risks and issues that the project manager may not.

Performing verification and validation (V&V) to accredit software models, simulations, and analysis tools is important to ensure the credibility of the results produced by those tools. Critical decisions may be made, at least in part, based on the results produced by models, simulations, and analysis tools. Reducing the risk associated with these decisions is one reason to use accredited tools that have been properly verified and validated.

See also A.02 Software Assurance and Software Safety


06.02.1 Related SWEs

  • SWE-131 - Independent Verification and Validation Project Execution Plan - 3.6.3 If software IV&V is required for a project, the project manager, in consultation with NASA IV&V, shall ensure an IPEP is developed, approved, maintained, and executed in accordance with IV&V requirements in NASA-STD-8739.8.
  • SWE-141 - Software Independent Verification and Validation - 3.6.2 For projects reaching Key Decision Point A, the program manager shall ensure that software IV&V is performed on the following categories of projects: 

    a. Category 1 projects as defined in NPR 7120.5.
    b. Category 2 projects as defined in NPR 7120.5, that have Class A or Class B payload risk classification per NPR 8705.4, Risk Classification for NASA Payloads.
    c. Projects selected explicitly by the Mission Directorate Associate Administrator (MDAA) to have software IV&V.

  • SWE-178 - IV&V Artifacts - 3.6.4 If software IV&V is performed on a project, the project manager shall ensure that IV&V is provided access to development artifacts, products, source code, and data required to perform the IV&V analysis efficiently and effectively. 
  • SWE-179 - IV&V Submitted Issues and Risks - 3.6.5 If software IV&V is performed on a project, the project manager shall provide responses to IV&V submitted issues and risks and track these issues and risks to closure.
  • SWE-223 - Tailoring IV&V project selections - 2.1.2.7 The NASA Chief, SMA shall make the final decision on all proposed tailoring of SWE-141, the Independent Verification and Validation (IV&V) requirement. 

06.02.2 Related Work Products

06.02.2.1 Related Process Asset Templates

06.02.3 Related Topics

  • 8.06 - IV&V Surveillance - This guidance establishes the rationale behind the creation of IV&V requirements and surveillance activities.
