06.00. Software Testing Activity Overview
Software testing is required to ensure that the software meets the agreed requirements and design, works as expected, does not contain serious defects, and meets its intended use as per user expectations. IV&V may be used to expand test coverage and depth. This testing is performed as individual modules of code are combined into larger units of code for integration into a system.
06.00.1 Sub-Activities in Testing
There are two sub-activities in testing:
- A.06.01 - Software Testing
- A.06.02 - Independent Verification and Validation - IV&V
06.01. Software Testing Activity Overview
The development of plans and procedures provides the opportunity for stakeholders to give input and assist with the documentation and tailoring of the planned testing activities to ensure the outcome will meet the expectations and goals of the project. Test reports ensure that the results of verification activities are documented and stored in the configuration management system for use in acceptance reviews or readiness reviews.
Before software testing begins, all software items are placed under configuration control. This configuration control captures and identifies the versions of what is being tested.
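The sketch below illustrates one way a test team might capture that identification automatically before a test run, assuming a Git-managed codebase; the helper name and report fields are illustrative only, not prescribed by NPR 7150.2.

```python
# Illustrative sketch: record the exact version of the software under test
# before a test run, assuming a Git-managed codebase. The function name and
# report fields are hypothetical, not part of any NASA standard.
import json
import subprocess
from datetime import datetime, timezone

def record_configuration(repo_path: str, report_path: str) -> dict:
    """Capture the commit hash and working-tree status of the item under test."""
    commit = subprocess.run(
        ["git", "-C", repo_path, "rev-parse", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    status = subprocess.run(
        ["git", "-C", repo_path, "status", "--porcelain"],
        capture_output=True, text=True, check=True,
    ).stdout
    record = {
        "commit": commit,
        # Uncommitted changes mean the tested item is not a controlled version.
        "clean_working_tree": status == "",
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(report_path, "w") as f:
        json.dump(record, f, indent=2)
    return record
```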
Test results are the basis for confirming that the team has fulfilled the software requirements in the resulting software product. To make such decisions, test results must be reviewed and evaluated using a documented, repeatable process. The team can derive quality conclusions by capturing the actual test results, comparing them to expected results, analyzing those results against pre-established criteria, and documenting that analysis/evaluation process.
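As a minimal sketch of that evaluation step, the example below compares actual results to expected results against a pre-established tolerance and records a verdict; the field names and criterion are hypothetical.

```python
# Illustrative sketch: evaluate recorded test results against expected values
# using pre-established pass/fail criteria. The fields and tolerance are
# hypothetical examples, not values prescribed by any standard.
from dataclasses import dataclass

@dataclass
class TestResult:
    test_id: str
    expected: float
    actual: float
    tolerance: float  # pre-established acceptance criterion

def evaluate(results: list[TestResult]) -> list[dict]:
    """Compare actual to expected results and document the evaluation."""
    evaluation = []
    for r in results:
        deviation = abs(r.actual - r.expected)
        evaluation.append({
            "test_id": r.test_id,
            "deviation": deviation,
            "verdict": "PASS" if deviation <= r.tolerance else "FAIL",
        })
    return evaluation

# Example: a timing requirement checked against a 5 ms criterion.
print(evaluate([TestResult("TC-042", expected=100.0, actual=103.2, tolerance=5.0)]))
```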
Validation is the process of evaluating work products to ensure that the right behaviors have been built into them. The right behaviors adequately describe what the system is supposed to do under nominal conditions and under adverse conditions; they may also describe what the system is not supposed to do. Validation is performed to assure that the specified software systems fulfill their intended use when placed on the targeted platform in the target environment (or a simulated target environment).
Code coverage measurements identify which lines of source code have been tested and which lines have not, and provide data showing the completeness of the executed software tests.
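One common way to gather such data is with a coverage tool; the sketch below uses the open-source coverage.py package, with a hypothetical module and test entry point standing in for the real software under test.

```python
# Illustrative sketch: measure statement coverage with the coverage.py
# package (pip install coverage). The module and entry point are hypothetical.
import coverage

cov = coverage.Coverage()
cov.start()

import flight_logic           # hypothetical module under test
flight_logic.run_self_test()  # hypothetical test entry point

cov.stop()
cov.save()
# Report which lines executed and which did not, as evidence of completeness.
cov.report(show_missing=True)
```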
The purpose of regression testing is to ensure that changes made to the software have not introduced new defects, and in particular to determine whether a change in one part of the software affects other parts. The project manager should both plan and conduct regression testing to demonstrate that no new defects have been injected into the previously integrated or tested software.
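A regression suite often pins previously verified behavior to a stored baseline, as in the hedged pytest-style sketch below; the module under test, its function, and the baseline file are all hypothetical.

```python
# Illustrative sketch: a pytest-style regression test that pins previously
# verified behavior so later changes cannot silently alter it. The module,
# function, and baseline file are hypothetical.
import json
import pathlib

from attitude_control import compute_gains  # hypothetical module under test

BASELINE = pathlib.Path("baselines/compute_gains.json")

def test_compute_gains_regression():
    """Fail if output drifts from the baseline captured at the last accepted release."""
    expected = json.loads(BASELINE.read_text())
    actual = compute_gains(mass_kg=1200.0, inertia=expected["inertia_input"])
    assert actual == expected["gains"], "regression: output differs from accepted baseline"
```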
Verify through test that any safety features related to system hazards, fault trees, or Failure Modes and Effects Analysis (FMEA) events are reliable and work as planned.
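Fault injection is one way to exercise such safety features under test; the sketch below injects a failed sensor and checks the hazard mitigation, with the controller class, sensor stub, and safe-mode flag all hypothetical.

```python
# Illustrative sketch: a fault-injection test exercising a safety feature
# traced to a hazard analysis (e.g., an FMEA overheat event). The controller
# class, sensor stub, and safe-mode flag are hypothetical.
class StuckSensor:
    """Simulated failed temperature sensor reporting an out-of-range value."""
    def read_celsius(self) -> float:
        return 999.0  # injected fault: physically impossible reading

def test_overheat_fault_forces_safe_mode():
    from thermal_control import ThermalController  # hypothetical unit under test
    ctrl = ThermalController(sensor=StuckSensor())
    ctrl.step()  # one control cycle with the faulted sensor
    # The hazard mitigation requires entering safe mode on sensor failure.
    assert ctrl.safe_mode is True
```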
Any uploaded or uplinked data, rules, and code can affect the behavior of the software and/or system. Special acceptance tests should be developed to validate and verify the uplinked or uploaded information for nominal and off-nominal scenarios.
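The sketch below shows what such an acceptance check might look like for an uplinked parameter table, covering one nominal and one off-nominal load; the table format and limits are hypothetical.

```python
# Illustrative sketch: acceptance checks for an uplinked parameter table,
# covering a nominal load and an off-nominal (out-of-range) load. The table
# format and limits are hypothetical.
import pytest

LIMITS = {"burn_duration_s": (0.0, 120.0), "throttle_pct": (0.0, 100.0)}

def validate_uplink(table: dict) -> None:
    """Reject uplinked parameters outside pre-approved limits."""
    for key, (lo, hi) in LIMITS.items():
        value = table[key]
        if not lo <= value <= hi:
            raise ValueError(f"{key}={value} outside [{lo}, {hi}]")

def test_nominal_uplink_accepted():
    validate_uplink({"burn_duration_s": 30.0, "throttle_pct": 85.0})

def test_off_nominal_uplink_rejected():
    with pytest.raises(ValueError):
        validate_uplink({"burn_duration_s": 30.0, "throttle_pct": 250.0})
```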
Testing of embedded COTS, GOTS, MOTS, Open Source Software (OSS), or reused software components should be at the same level required to accept a custom-developed software component for its intended use.
Frequency Of This Activity
Testing is performed continually as software products are developed and become available in release packages. Testing cycles are driven by the following:
- if the requirements change, the test plans and procedures must also change to ensure that the test activity is accurate, complete, and consistent with the requirements
- as new release candidates are built, testing is used to check that requirements are properly implemented
- regression testing confirms that old problems have not crept back into code and that good configuration management practices are in force
06.01.1 Related SWEs
Testing
- SWE-065 - Test Plan, Procedures, Reports - 4.5.2 The project manager shall establish and maintain:
a. Software test plan(s).
b. Software test procedure(s).
c. Software test(s), including any code specifically written to perform test procedures.
d. Software test report(s).
- SWE-066 - Perform Testing - 4.5.3 The project manager shall test the software against its requirements.
- SWE-187 - Control of Software Items - 4.5.4 The project manager shall place software items under configuration management prior to testing.
- SWE-068 - Evaluate Test Results - 4.5.5 The project manager shall evaluate test results and record the evaluation.
- SWE-070 - Models, Simulations, Tools - 4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.
- SWE-071 - Update Test Plans and Procedures - 4.5.7 The project manager shall update the software test and verification plan(s) and procedure(s) to be consistent with software requirements.
- SWE-073 - Platform or Hi-Fidelity Simulations - 4.5.8 The project manager shall validate the software system on the targeted platform or high-fidelity simulation.
- SWE-189 - Code Coverage Measurements - 4.5.9 The project manager shall ensure that the code coverage measurements for the software are selected, implemented, tracked, recorded, and reported.
- SWE-190 - Verify Code Coverage - 4.5.10 The project manager shall verify code coverage is measured by analysis of the results of the execution of tests.
- SWE-191 - Software Regression Testing - 4.5.11 The project manager shall plan and conduct software regression testing to demonstrate that defects have not been introduced into previously integrated or tested software and have not produced a security vulnerability.
- SWE-192 - Software Hazardous Requirements - 4.5.12 The project manager shall verify through test the software requirements that trace to a hazardous event, cause, or mitigation technique.
- SWE-193 - Acceptance Testing for Affected System and Software Behavior - 4.5.13 The project manager shall develop acceptance tests for loaded or uplinked data, rules, and code that affects software and software system behavior.
- SWE-211 - Test Levels of Non-Custom Developed Software - 4.5.14 The project manager shall test embedded COTS, GOTS, MOTS, OSS, or reused software components to the same level required to accept a custom developed software component for its intended use.
Cybersecurity
- SWE-210 - Detection of Adversarial Actions - 3.11.8 The project manager shall identify software requirements for the collection, reporting, and storage of data relating to the detection of adversarial actions.
06.01.2 Related Work Products
- 5.01 - CR-PR - Software Change Request - Problem Report - Minimum recommended content for the Software Change Request - Problem Report.
- 5.10 - STP - Software Test Plan - Minimum recommended content for the Software Test Plan at a high level.
- 5.11 - STR - Software Test Report - Minimum recommended content for the Software Test Report.
- 5.14 - Test - Software Test Procedures - Minimum recommended content for the Software Test Procedures.
- 7.08 - Maturity of Life Cycle Products at Milestone Reviews - This chart summarizes current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews.
- 7.09 - Entrance and Exit Criteria - This guidance provides the recommended life cycle review entrance and exit criteria for software projects and should be tailored for the project class.
- A.10 Software Peer Reviews and Inspections - Test Plans and Procedures are good candidates for a Peer Review.
- 7.19 - Software Risk Management Checklists - Tab 6, Software Test Phase checklist.
06.01.2.1 Related Process Asset Templates
- PAT-018 - Test Plan Checklist
- PAT-019 - Test Procedure Checklist
- PAT-026 - Test Review Checklist For Test Leads
- PAT-027 - Test Review Checklist For Review Teams
- PAT-033 - Tasks Needing Objective Evidence
06.01.3 Related Topics
- 7.06 - Software Test Estimation and Testing Levels - Provides guiding principles and best practices pertaining to software test estimation and a description of the typical "levels" of testing performed for a software project.
- 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009 - Discusses the relationship of NPR 7150.2 to NASA-STD-7009 (Models and Simulations).
- 8.01 - Off Nominal Testing - Guidance focusing on out-of-bounds parameters, failure scenarios, unexpected conditions, and capabilities that are typically considered "must not work" functions.
- 8.02 - Software Reliability - The goal of software reliability and maintainability is to assure that the software performs consistently as desired when operating within specified conditions. This topic covers additional basic information on software reliability.
- 8.04 - Additional Requirements Considerations for Use with Safety-Critical Software - Requirements to be considered when you have safety-critical software on a program/project/facility.
- 8.08 - COTS Software Safety Considerations - A discussion on the use of COTS in safety-critical systems.
- 8.13 - Test Witnessing - Guidance for software assurance personnel performing test witnessing.
- 8.57 - Testing Analysis - Testing Analysis product content.
06.01.4 Related SPAN Links
06.02. IV&V
Independent verification and validation (IV&V) is a part of software assurance and plays a role in the NASA software risk mitigation strategy. The Office of Safety and Mission Assurance (OSMA), in conjunction with the responsible Mission Directorate, determines which NASA projects require IV&V. The rationale for IV&V on a project is to reduce the risk of failures due to software and to provide assurance that the software will operate as intended, will not operate unexpectedly, and will respond appropriately to adverse conditions. Performing IV&V on projects yields greater confidence that the delivered software products are error-free and meet the customer's needs. Applying IV&V across the project life cycle increases the likelihood of uncovering high-risk errors early in the life cycle.
IV&V is a technical discipline of software assurance that employs rigorous analysis and testing methodologies to identify objective evidence and conclusions, providing an independent assessment of critical products and processes throughout the software development life cycle. The evaluation of products and processes throughout the life cycle demonstrates whether the software is fit for nominal operations (required functionality, safety, dependability, etc.) and off-nominal conditions (response to faults, response to hazardous conditions, etc.). The goal of the IV&V effort is to provide assurance conclusions to the project and stakeholders based on evidence found in software development artifacts and on the risks associated with the intended behaviors of the software.
The IV&V Project Execution Plan (IPEP) documents the activities, methods, level of rigor, environments, tailoring (if any) of the IV&V requirements, and criteria to be used in performing verification and validation of in-scope system/software behaviors (including responsible software components) as determined by the planning and scoping effort.
Artifacts and products required to perform IV&V analysis on NASA projects are to be made available electronically in their original format. Electronic availability of IV&V products and artifacts facilitates the re-deliveries that may be necessary as software is updated, reduces NASA's IV&V project costs, and accommodates longer-term needs when performing software maintenance.
If the project manager does not address the issues and risks found by IV&V and track them to closure, these unaddressed risks and issues could cause the project to fail to meet its objectives (e.g., schedule, planned quality, functionality). Because IV&V personnel have generally worked across many projects, they are often likely to recognize risks and issues that the project manager may not.
Performing verification and validation (V&V) to accredit software models, simulations, and analysis tools is important to ensure the credibility of the results produced by those tools. Critical decisions may be made, at least in part, based on the results produced by models, simulations, and analysis tools. Reducing the risk associated with these decisions is one reason to use accredited tools that have been properly verified and validated.
See also A.02 Software Assurance and Software Safety.
Frequency Of This Activity
06.02.1 Related SWEs
- SWE-131 - Independent Verification and Validation Project Execution Plan - 3.6.3 If software IV&V is required for a project, the project manager, in consultation with NASA IV&V, shall ensure an IPEP is developed, approved, maintained, and executed in accordance with IV&V requirements in NASA-STD-8739.8.
- SWE-141 - Software Independent Verification and Validation - 3.6.2 For projects reaching Key Decision Point A, the program manager shall ensure that software IV&V is performed on the following categories of projects:
a. Category 1 projects as defined in NPR 7120.5.
b. Category 2 projects as defined in NPR 7120.5, that have Class A or Class B payload risk classification per NPR 8705.4, Risk Classification for NASA Payloads.
c. Projects selected explicitly by the Mission Directorate Associate Administrator (MDAA) to have software IV&V.
- SWE-178 - IV&V Artifacts - 3.6.4 If software IV&V is performed on a project, the project manager shall ensure that IV&V is provided access to development artifacts, products, source code, and data required to perform the IV&V analysis efficiently and effectively.
- SWE-179 - IV&V Submitted Issues and Risks - 3.6.5 If software IV&V is performed on a project, the project manager shall provide responses to IV&V submitted issues and risks and track these issues and risks to closure.
- SWE-223 - Tailoring IV&V project selections - 2.1.2.7 The NASA Chief, SMA shall make the final decision on all proposed tailoring of SWE-141, the Independent Verification and Validation (IV&V) requirement.
06.02.2 Related Work Products
- 8.53 - IV&V Project Execution Plan - IV&V Project Execution Plan product introduction
- 7.08 - Maturity of Life Cycle Products at Milestone Reviews - This chart summarizes current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews.
- 7.09 - Entrance and Exit Criteria - This guidance provides the recommended life cycle review entrance and exit criteria for software projects and should be tailored for the project class.
- A.10 Software Peer Reviews and Inspections - Plans are good candidates for a Peer Review
06.02.2.1 Related Process Asset Templates
06.02.3 Related Topics
- 8.06 - IV&V Surveillance - This guidance establishes the rationale behind the creation of IV&V requirements and surveillance activities.
06.02.4 Related SPAN Links