NASA Technical Standard 8739.8B - Approved 2004-08-27 - SOFTWARE ASSURANCE AND SOFTWARE SAFETY STANDARD. Full text of this standard as taken from NODIS. Assembled from component pieces beneath this page.
Contents
1. NASA-STD-8739.8B
2. Applicable and Reference Documents
3. Acronyms and Definitions
4. Software Assurance and Software Safety Requirements
5. Appendix A Guidelines For The Hazard Development Involving Software
1. NASA-STD-8739.8B Title Material
1. SCOPE
1.1 Document Purpose
1.1.1 The purpose of the Software Assurance and Software Safety Standard is to define the requirements to implement a systematic approach to software assurance, software safety, and Independent Verification and Validation (IV&V) for software created, acquired, provided, used, or maintained by or for NASA. Various personnel in the program, project, engineering, facility, or Safety and Mission Assurance (SMA) organizations can perform the activities required to satisfy these requirements. The Software Assurance and Software Safety Standard provides a basis for personnel to perform software assurance, software safety, and IV&V activities consistently throughout the life of the software.
1.1.2 The Software Assurance and Software Safety Standard, in accordance with NPR 7150.2, NASA Software Engineering Requirements, supports the implementation of the software assurance, software safety, and IV&V sub-disciplines. The application and approach to meeting the Software Assurance and Software Safety Standard vary based on the system and software products and processes to which they are applied. The Software Assurance and Software Safety Standard stresses coordination between the software assurance sub-disciplines and system safety, system reliability, hardware quality, system security, and software engineering to maintain the system perspective and minimize duplication of effort.
1.1.3 The objectives of the Software Assurance and Software Safety Standard include the following:
a. Ensuring that the processes, procedures, and products used to produce and sustain the software conform to all specified requirements and standards that govern those processes, procedures, and products.
(1) A set of activities that assess adherence to, and the adequacy of, the software processes used to develop and modify software products.
(2) A set of activities that define and assess the adequacy of software processes to provide evidence that establishes confidence that the software processes are appropriate for and produce software products of suitable quality for their intended purposes.
b. Determining the degree of software quality obtained by the software products.
c. Ensuring that the software systems are safe and that the software safety-critical requirements are followed.
d. Ensuring that the software systems are secure.
e. Employing rigorous analysis and testing methodologies to identify objective evidence and conclusions to provide an independent assessment of critical products and processes throughout the life cycle.
1.1.4 The Software Assurance and Software Safety Standard is compatible with all software life cycle models. The Software Assurance and Software Safety Standard does not impose a particular life cycle model on a software project.
1.1.5 In this standard, all mandatory actions (i.e., requirements) are denoted by statements containing the term “shall.” The term “may” denotes a discretionary privilege or permission; “can” denotes a statement of possibility or capability; “should” denotes a good practice that is recommended but not required; “will” denotes an expected outcome; and “are/is” denotes descriptive material.
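The verb conventions above lend themselves to simple tooling. The sketch below is a hypothetical helper, not part of the standard: it classifies a statement by the strongest convention verb it contains, with category labels paraphrasing paragraph 1.1.5.

```python
import re

# Verb conventions per paragraph 1.1.5, ordered strongest to weakest so a
# statement containing "shall" is always classified as mandatory.
VERB_KINDS = {
    "shall": "mandatory requirement",
    "should": "recommended practice",
    "may": "discretionary privilege or permission",
    "can": "possibility or capability",
    "will": "expected outcome",
}

def classify_statement(text: str) -> str:
    """Return the category of the strongest convention verb found."""
    for verb, kind in VERB_KINDS.items():
        if re.search(rf"\b{verb}\b", text, re.IGNORECASE):
            return kind
    return "descriptive material"

print(classify_statement("The project manager shall ensure IV&V is performed."))
# → mandatory requirement
```

Such a scan is only a first pass: it cannot distinguish, for example, a “shall” inside quoted material from a genuine requirement statement.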
1.2 Applicability
1.2.1 This standard is approved for use by NASA Headquarters and NASA Centers, including Component Facilities and Technical and Service Support Centers. This NASA Technical Standard applies to the assurance of software created by or for NASA projects, programs, facilities, and activities and defines the requirements for those activities. This directive is applicable to the Jet Propulsion Laboratory, a Federally Funded Research and Development Center, only to the extent specified in the NASA/Caltech Prime Contract. This standard may also apply to other contractors, grant recipients, or parties to agreements to the extent specified or referenced in their contracts, grants, or agreements.
1.3 Documentation and Deliverables
1.3.1 The Software Assurance and Software Safety Standard is not intended to designate the format of program/project/facility documentation and deliverables. The software assurance and software safety data, information, and plans may be considered to be quality records with a retention period as specified in NRRS 1441.1. The format of the documentation is a program/project/facility decision. The software assurance and software safety organizations should keep records, reports, metrics, analyses, and trending results and should keep copies of their project plans for future reference and improvements. The software assurance and software safety plans (e.g., the Software Assurance Plan) can be standalone documents or incorporated within other documents (e.g., part of a Software Management Plan, a Software Development Plan or part of a Program or Project Safety and Mission Assurance (SMA) plan).
1.4 Request for Relief
1.4.1 Tailoring of this standard for application to a specific program or project is documented as part of program or project requirements and approved by the responsible Center Technical Authority (TA) in accordance with NPR 8715.3, NASA General Safety Program Requirements. Section 4.5 of this standard contains the principles related to tailoring this standard’s requirements.
2. APPLICABLE AND REFERENCE DOCUMENTS
2.1 Applicable Documents
The applicable documents are accessible via the NASA Technical Standards System at https://standards.nasa.gov or the NASA Online Directives Information System at https://nodis3.gsfc.nasa.gov/main_lib.cfm, or may be obtained directly from the Standards Developing Organizations.
NPR 1400.1 NASA Directives and Charters Procedural Requirements
NPR 7120.5 NASA Space Flight Program and Project Management Requirements
NPR 7120.10 Technical Standards for NASA Programs and Projects
2.2 Reference Documents
The reference documents listed in this section are not incorporated by reference within this standard but may provide further clarification and guidance.
2.2.1 Government Documents
NPD 2810.1 NASA Information Security Policy
NPD 8720.1 NASA Reliability and Maintainability Program Policy
NPR 1441.1 NASA Records Management Program Requirements
NPR 2210.1 Release of NASA Software
NPR 2810.1 Security of Information and Information Systems
NPR 2830.1 NASA Enterprise Architecture Procedures
NPR 2841.1 Identity, Credential, and Access Management
NPR 7120.7 NASA Information Technology Program and Project Management Requirements
NPR 7120.8 NASA Research and Technology Program and Project Management Requirements
NPR 7120.10 Technical Standards for NASA Programs and Projects
NPR 7120.11 Health and Medical Technical Authority Implementation
NPR 7123.1 NASA Systems Engineering Processes and Requirements
NASA-STD-1006 Space System Protection Standard
NASA-STD-2601 Minimum Cybersecurity Requirements for Computing Systems
NASA-STD-7009 Standard for Models and Simulations
NASA-STD-8729.1 NASA Reliability and Maintainability Standard for Spaceflight and Support Systems
NASA-HDBK-7009 NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009
NASA-HDBK-8709.22 Safety and Mission Assurance Acronyms, Abbreviations, and Definitions
NASA-HDBK-8739.23 NASA Complex Electronics Handbook for Assurance Professionals
NIST SP 800-37 Risk Management Framework
NIST SP 800-40 Guide to Enterprise Patch Management Planning: Preventive Maintenance for Technology
NIST SP 800-53 Security and Privacy Controls for Information Systems and Organizations
NIST SP 800-70 National Checklist Program for Information Technology Products: Guidelines for Checklist Users and Developers
NIST SP 800-115 Technical Guide to Information Security Testing and Assessment
NFS 1813.301-79 Supporting Federal Policies, Regulations, and NASA Procedural Requirements
NFS 1852.237-72 Access to Sensitive Information
NFS 1852.237-73 Release of Sensitive Information
2.2.2 Non-Government Documents
CMMI-DEV, V2.0 CMMI® for Development, Version 2.0
IEEE 730 Institute of Electrical and Electronics Engineers (IEEE) Standard for Software Quality Assurance Processes
IEEE 828 IEEE Standard for Configuration Management in Systems and Software Engineering.
IEEE 982.1 IEEE Standard Measures of the Software Aspects of Dependability
IEEE 1012 IEEE Standard for System, Software, and Hardware Verification and Validation
IEEE 1028 IEEE Standard for Software Reviews and Audits
IEEE 1633 IEEE Recommended Practice on Software Reliability
IEEE 15026-1 Systems and software engineering--Systems and software assurance--Part 1: Concepts and vocabulary
IEEE 29119-4 Software and systems engineering -- Software testing -- Part 4: Test techniques
ISO 26514 Systems and software engineering -- Requirements for designers and developers of user documentation
ISO 24765 Systems and software engineering -- Vocabulary
2.3 Order of Precedence
2.3.1 This standard establishes requirements to implement a systematic approach to Software Assurance, Software Safety, and IV&V for software created, acquired, provided, or maintained by or for NASA but does not supersede nor waive established Agency requirements found in other documentation.
2.3.2 Conflicts between the Software Assurance and Software Safety Standard and other requirements documents are resolved by the responsible SMA and engineering TA(s), per NPR 1400.1, NASA Directives and Charters Procedural Requirements, and NPR 7120.10, Technical Standards for NASA Programs and Projects.
3. ACRONYMS AND DEFINITIONS
3.1 Acronyms and Abbreviations
CMMI® Capability Maturity Model Integration
COTS Commercial-Off-The-Shelf
GOTS Government-Off-The-Shelf
HDBK Handbook
IEEE Institute of Electrical and Electronics Engineers
IPEP IV&V Project Execution Plan
IV&V Independent Verification and Validation
MC/DC Modified Condition/Decision Coverage
MOTS Modified-Off-The-Shelf
NASA National Aeronautics and Space Administration
NIST National Institute of Standards and Technology
NPD NASA Policy Directive
NPR NASA Procedural Requirements
NRRS NASA Records Retention Schedule
OSMA NASA Headquarters Office of Safety and Mission Assurance
OSS Open Source Software
PLC Programmable Logic Controller
PROM Programmable Read-Only Memory
RTOS Real-Time Operating System
SMA Safety and Mission Assurance
SP Special Publication
SWE Software Engineering
TA Technical Authority
3.2 Definitions
Accredit. The official acceptance of a software development tool, model, or simulation, including associated data, to use for a specific purpose.
Acquirer. The entity or individual who specifies the requirements and accepts the resulting software products. The Acquirer is usually NASA or an organization within the Agency but can also refer to the prime contractor-subcontractor relationship.
Analyze. Review results in-depth, look at relationships of activities, examine methodologies in detail, and follow methodologies such as Failure Mode and Effects Analysis, Fault Tree Analysis, trending, and metrics analysis. Examine processes, plans, products, and task lists for completeness, consistency, accuracy, reasonableness, and compliance with requirements. The analysis may include identifying missing, incomplete, or inaccurate products, relationships, deliverables, activities, required actions, etc.
Approve. When the responsible originating official, or designated decision authority, of a document, report, condition, etc., has agreed, via their signature, to the content and indicates the document is ready for release, baselining, distribution, etc. Usually, one “approver” and several stakeholders need to “concur” for official acceptance of a document, report, etc. For example, the project manager would approve the Software Development Plan, but SMA would concur on it.
Assess. Judge results against plans or work product requirements. Assess includes judging for practicality, timeliness, correctness, completeness, compliance, evaluation of rationale, etc., reviewing activities performed, and independently tracking corrective actions to closure.
Assure. When software assurance personnel make certain that others have performed the specified software assurance, management, and engineering activities.
Audit. Formal review to assess compliance with hardware or software requirements, specifications, baselines, safety standards, procedures, instructions, codes, and contractual and licensing requirements. (Source NPR 8715.3)
Bi-directional Traceability. Association among two or more logical entities that are discernible in either direction (to and from an entity). (Source IEEE Definition)
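Bidirectional traceability can be pictured as two mirrored lookup tables, one forward and one backward. The sketch below is illustrative only; the requirement and artifact identifiers are invented.

```python
from collections import defaultdict

class TraceMatrix:
    """Minimal bidirectional trace links between requirements and artifacts."""

    def __init__(self):
        self.forward = defaultdict(set)   # requirement -> downstream artifacts
        self.backward = defaultdict(set)  # artifact -> originating requirements

    def link(self, req: str, artifact: str) -> None:
        # Record the association once; both directions stay consistent.
        self.forward[req].add(artifact)
        self.backward[artifact].add(req)

    def untraced(self, reqs):
        """Requirements with no downstream artifact (a gap assurance would flag)."""
        return [r for r in reqs if not self.forward[r]]

m = TraceMatrix()
m.link("SRS-001", "test_login.py")
print(m.backward["test_login.py"])         # → {'SRS-001'}
print(m.untraced(["SRS-001", "SRS-002"]))  # → ['SRS-002']
```

Keeping both directions in one structure makes the "discernible in either direction" property of the definition mechanical to check.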
Concur. A documented agreement that a proposed course of action is acceptable.
Condition. (1) measurable qualitative or quantitative attribute that is stipulated for a requirement and that indicates a circumstance or event under which a requirement applies (2) description of a contingency to be considered in the representation of a problem, or a reference to other procedures to be considered as part of the condition (3) true or false logical predicate (4) logical predicate involving one or more behavior model elements (5) Boolean expression containing no Boolean operators.
Configuration Item. (1) item or aggregation of hardware, software, or both that is designated for configuration management and treated as a single entity in the configuration management process; (2) component of an infrastructure or an item which is or will be under control of configuration management; (3) aggregation of work products that is designated for configuration management and treated as a single entity in the configuration management process; (4) any system element or aggregation of system elements that satisfies an end use function and is designated by the acquirer for separate configuration control; (5) item or aggregation of software that is designed to be managed as a single entity and its underlying components, such as documentation, data structures, and scripts. (Source IEEE Definition)
Note: Configuration items can vary widely in complexity, size, and type, ranging from an entire system including all hardware, software, and documentation, to a single module or a minor hardware component. CIs have four common characteristics: defined functionality; replaceable as an entity; unique specification; formal control of form, fit, and function. See Also: hardware configuration item, computer software configuration item, configuration identification, and critical item.
Confirm. Check to see that activities specified in the software engineering requirements are adequately done and evidence of the activities exists as proof. Confirm includes ensuring activities are done completely and correctly and have expected content according to approved tailoring.
Critical. A condition that may cause severe injury or occupational illness, or major property damage to facilities, systems, or flight hardware.
Deliverable. Product or item that has to be completed and delivered under the terms of an agreement or contract. Products may also be deliverables, e.g., software requirements specifications and detailed design documents.
Develop. To produce or create a product or document and mature or advance the product or document content.
Ensure. When software assurance or software safety personnel perform the specified software assurance and software safety activities themselves.
Event. (1) occurrence of a particular set of circumstances; (2) external or internal stimulus used for synchronization purposes; (3) change detectable by the subject software; (4) fact that an action has taken place; (5) singular moment in time at which some perceptible phenomenological change (energy, matter, or information) occurs at the port of a unit.
Failure. Inability of a system, subsystem, component, or part to perform its required function within specified limits. (Source NPR 8715.3)
Hazard. A state or a set of conditions, internal or external to a system, that has the potential to cause harm. (Source NPR 8715.3)
Hazard Analysis. Identifying and evaluating existing and potential hazards and the recommended mitigation for the hazard sources found.
Hazard Control. Means of reducing the risk of exposure to a hazard. (Source NPR 8715.3)
Hazardous Operation/Work Activity. Any operation or other work activity that, without the implementation of proper mitigations, has a high potential to result in loss of life, serious injury to personnel or public, or damage to property due to the material or equipment involved or the nature of the operation/activity itself.
Independent Verification and Validation. Verification and validation performed by an organization that is technically, managerially, and financially independent of the development organization. (Source IEEE Definition)
Inhibit. Design feature that prevents the operation of a function.
Insight. An element of Government surveillance that monitors contractor compliance using Government-identified metrics and contracted milestones. Insight is a continuum that can range from low intensity such as reviewing quarterly reports to high intensity such as performing surveys and reviews. (Source NPR 7123.1)
Maintain. To continue to have; to keep in existence, to stay up-to-date and correct.
Mission Critical. [1] Item or function that must retain its operational capability to assure no mission failure (i.e., for mission success). [2] An item or function, the failure of which may result in the inability to retain operational capability for mission continuation if corrective action is not successfully performed. (Source NASA-STD-8729.1)
Mission Success. Meeting all mission objectives and requirements for performance and safety. (Source NPR 8715.3)
Monitor. (1) software tool or hardware device that operates concurrently with a system or component and supervises, records, analyzes, or verifies the operation of the system or component; (2) collect project performance data with respect to a plan, process, produce performance measures, and report and disseminate performance information.
Participate. To be a part of the activity, audit, review, meeting, or assessment.
Perform. Software assurance does the action specified. Perform may include making comparisons of independent results with similar activities performed by engineering; performing audits; and reporting results to engineering.
Product. A result of a physical, analytical, or another process. The item delivered to the customer (e.g., hardware, software, test reports, data) and the processes (e.g., system engineering, design, test, logistics) that make the product possible. (Source NASA-HDBK-8709.22)
Program. A strategic investment by a Mission Directorate or Mission Support Office that has a defined architecture and technical approach, requirements, funding level, and management structure that initiates and directs one or more projects. A program implements a strategic direction that the Agency has identified as needed to accomplish Agency goals and objectives. (Source NPR 7120.5)
Program Manager. A generic term for the person who is formally assigned to be in charge of the program. A program manager could be designated as a program lead, program director, or some other term, as defined in the program's governing document. A program manager is responsible for the formulation and implementation of the program, per the governing document with the sponsoring MDAA.
Project. A specific investment having defined goals, objectives, requirements, life cycle cost, a beginning, and an end. A project yields new or revised products or services that directly address NASA’s strategic needs. They may be performed wholly in-house; by Government, industry, academia partnerships; or through contracts with private industry. (Source NPR 7150.2)
Project Manager. The entity or individual who accepts the resulting software products. Project managers are responsible and accountable for the safe conduct and successful outcome of their program or project in conformance with governing programmatic requirements. The project manager is usually NASA but can also refer to the prime contractor-subcontractor relationship.
Provider. A NASA or contractor organization that is tasked by an accountable organization (i.e., the Acquirer) to produce a product or service. (Source NASA-HDBK-8709.22)
Regression testing. (1) selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements (2) testing following modifications to a test item or its operational environment, to identify whether regression failures occur. (Source IEEE Definition)
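Sense (1) of this definition can be illustrated by comparing current outputs against previously accepted ("golden") results; the function and baseline values below are invented for illustration.

```python
def scale(x: int) -> int:
    """Unit under test: behavior fixed at the accepted baseline."""
    return 2 * x

# Results recorded when the baseline was accepted.
GOLDEN = {0: 0, 3: 6, -2: -4}

def regression_failures() -> list:
    """Inputs whose current output differs from the recorded baseline."""
    return [x for x, expected in GOLDEN.items() if scale(x) != expected]

print(regression_failures())  # → [] (empty list: no unintended effects detected)
```

A modification that changed `scale` would surface here as a non-empty list, prompting investigation before the modified item is accepted.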
Risk. The combination of (1) the probability (qualitative or quantitative) of experiencing an undesired event, (2) the consequences, impact, or severity that would occur if the undesired event were to occur, and (3) the uncertainties associated with the probability and consequences. (Source NPR 8715.3)
Note: A risk is an uncertain future event, or combination of events, that could threaten the achievement of performance objectives or requirements. A "problem," on the other hand, describes an issue that is certain or near certain to exist now, or an event that has been determined with certainty or near certainty to have occurred and is threatening the achievement of an objective or requirement. It is generally at the discretion of the decision authority to define at what level of certainty (i.e., likelihood) an event may be classified and addressed as a “problem” rather than as a “risk.” A risk may be conditional upon a problem, i.e., whether an existing issue will develop into consequences for performance objectives, or the extent to which it will, may at present be uncertain.
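The three-part definition of risk is often operationalized, outside of anything this standard mandates, as a likelihood-consequence scoring matrix. The scale and thresholds below are purely illustrative; actual programs define their own matrices in their risk management plans.

```python
def risk_rating(likelihood: int, consequence: int) -> str:
    """Classify a risk on an illustrative 5x5 matrix (1 = lowest, 5 = highest).

    Thresholds are examples only, not taken from NPR 8715.3 or this standard.
    """
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("likelihood and consequence must each be 1-5")
    score = likelihood * consequence
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

print(risk_rating(4, 5))  # → high
print(risk_rating(1, 3))  # → low
```

Note that the definition's third element, uncertainty in the estimates themselves, is not captured by a point score; that is one reason such matrices support rather than replace risk-informed judgment.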
Risk Posture. A characterization of risk based on conditions (e.g., criticality, complexity, environments, performance, cost, schedule) and a set of identified risks, taken as a whole which allows an understanding of the overall risk or provides a target risk range or level, which can then be used to support decisions being made.
Safe State. A system state in which hazards are inhibited, and all hazardous actuators are in a non-hazardous state. The system can have more than one Safe State.
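The definition implies that safing is a mapping from each hazardous actuator to a predefined non-hazardous position. The sketch below is illustrative; the actuator names and safe positions are invented.

```python
# Predefined non-hazardous position for each hazardous actuator (invented names).
SAFE_POSITIONS = {"main_valve": "closed", "heater": "off", "thruster": "inhibited"}

def enter_safe_state(actuators: dict) -> dict:
    """Return actuator states after safing; actuators not listed are untouched."""
    safed = dict(actuators)
    for name, safe_pos in SAFE_POSITIONS.items():
        if name in safed:
            safed[name] = safe_pos
    return safed

print(enter_safe_state({"main_valve": "open", "heater": "on"}))
# → {'main_valve': 'closed', 'heater': 'off'}
```

Because a system can have more than one Safe State, a real implementation would select among safing tables based on mission phase or the hazard being inhibited.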
Safety. Freedom from those conditions that can cause death, injury, occupational illness, damage to or loss of equipment or property, or damage to the environment. In a risk-informed context, safety is an overall mission and program condition that provides sufficient assurance that accidents will not result from the mission execution or program implementation, or, if they occur, their consequences will be mitigated. This assurance is established by means of the satisfaction of a combination of deterministic criteria and risk criteria. (Source NPR 8715.3)
Safety Analysis. Generic term for a family of analyses, which includes but is not limited to, preliminary hazard analysis, system (subsystem) hazard analysis, operating hazard analysis, software hazard analysis, sneak circuit, and others. Software safety analysis consists of a number of tools and techniques to identify safety risks and formulate effective controls. These techniques are used to help identify the hazards during the Hazard Analysis process, which in turn identifies the safety-critical software. The Safety Analysis techniques often used to support the Hazard Analysis are the Software Fault Tree Analysis and the Software Failure Modes and Effects Analysis. The Software Fault Tree Analysis and the Software Failure Modes and Effects Analysis are used to help identify hazards, hazard causes, and potential failure modes.
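For the quantitative side of fault tree analysis, gate probabilities combine as products (AND) and complements of products (OR) when basic events are assumed independent. The event probabilities below are invented; real Software Fault Tree Analysis is as much qualitative as quantitative.

```python
import math

def gate_and(probs):
    """AND gate: all inputs must occur (independence assumed)."""
    return math.prod(probs)

def gate_or(probs):
    """OR gate: at least one input occurs (independence assumed)."""
    p_none = math.prod(1.0 - p for p in probs)
    return 1.0 - p_none

# Illustrative tree: the hazard occurs if the software check fails AND
# either of two redundant sensors fails.
p_sensor_fail = gate_or([0.01, 0.01])        # ≈ 0.0199
p_hazard = gate_and([0.001, p_sensor_fail])  # ≈ 1.99e-05
print(round(p_hazard, 8))  # → 1.99e-05
```

The same tree structure, read qualitatively, yields the minimal cut sets that identify which software failures are hazard causes.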
Safety-Critical. A term describing any condition, event, operation, process, equipment, or system that could cause or lead to severe injury, major damage, or mission failure if performed or built improperly or allowed to remain uncorrected. (Source NPR 8715.3)
Safety-Critical Software. (Definition included from NASA-STD-8739.8B, para. 3.2.)
Software. (1) computer programs, procedures, and associated documentation and data pertaining to the operation of a computer system; (2) all or a part of the programs, procedures, rules, and associated documentation of an information processing system; (3) program or set of programs used to run a computer; (4) all or part of the programs which process or support the processing of digital information; (5) part of a product that is the computer program or the set of computer programs. This definition applies to software developed by NASA, software developed for NASA, software maintained by or for NASA, Commercial-Off-The-Shelf (COTS), Government-Off-The-Shelf (GOTS), Modified-Off-The-Shelf (MOTS), Open Source Software (OSS), reused software components, auto-generated code, embedded software, the software executed on processors embedded in programmable logic devices (see NASA-HDBK-4008), legacy, heritage, applications, freeware, shareware, trial or demonstration software, and OSS components. (Source NPR 7150.2)
Software Assurance. (1) a set of activities that assess adherence to, and the adequacy of the software processes used to develop and modify software products. Software assurance also determines the degree to which the desired results from software quality control are being obtained. (2) set of activities that define and assess the adequacy of software processes to provide evidence that establishes confidence that the software processes are appropriate for and produce software products of suitable quality for their intended purposes. (Source IEEE Definition)
Note: A key attribute of software assurance is the objectivity of the software assurance function with respect to the project.
Software Developer. A person, organization, or system that develops software based on program/project requirements.
Software Life Cycle. The period that begins when a software product is conceived and ends when the software is no longer available for use. The software life cycle typically includes a concept phase, requirements phase, design phase, implementation phase, test phase, installation and checkout phase, operation and maintenance phase, and, sometimes, a retirement phase.
Software Peer Review. An examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications. (Source IEEE Definition)
Software Safety. The aspects of software engineering, system safety, and software assurance, that provide a systematic approach to identifying, analyzing, tracking, mitigating, and controlling hazards and hazardous functions of a system where software may contribute either to the hazard(s) or to its detection, mitigation or control, to ensure safe operation of the system.
Software Validation. (1) confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled; (2) process of providing evidence that the system, software, or hardware and its associated products satisfy requirements allocated to it at the end of each life cycle activity, solve the right problem (e.g., correctly model physical laws, implement business rules, and use the proper system assumptions), and satisfy intended use and user needs; (3) the assurance that a product, service, or system meets the needs of the customer and other identified stakeholders; (4) process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements; (5) confirmation in a timely manner, through automated techniques where possible, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled. (Source IEEE Definition)
Note: Validation in a system life cycle context is the set of activities ensuring and gaining confidence that a system is able to accomplish its intended use, goals, and objectives (meet stakeholder requirements) in the intended operational environment, i.e., that the right system has been built or is operating to meet business objectives. Validation demonstrates that the system can be used by the users for their specific tasks. "Validated" is used to designate the corresponding status. Multiple validations can be carried out if there are different intended uses.
Software Verification. Confirmation that products properly reflect the requirements specified for them. In other words, verification ensures that “you built it right.” (Source IEEE Definition)
Supplier. Any organization which provides a product or service to a customer. By this definition, suppliers may include vendors, subcontractors, contractors, flight programs/projects, and the NASA organization supplying science data to a principal investigator. The classical definition of a supplier is a subcontractor, at any tier, performing contract services or producing the contract articles for a contractor. (Source NASA-HDBK-8709.22)
System Safety. Application of engineering and management principles, criteria, and techniques to optimize safety and reduce risks within the constraints of operational effectiveness, time, and cost.
Tailoring. The process used to adjust a prescribed requirement to accommodate the needs of a specific task or activity (e.g., program or project). Tailoring may result in changes, subtractions, or additions to a typical implementation of the requirement. (Source NPR 7150.2)
Track. To follow and note the course or progress of the product.
4. SOFTWARE ASSURANCE AND SOFTWARE SAFETY REQUIREMENTS
4.1 (Included from NASA-STD-8739.8B, para. 4.1.)
4.2 (Included from NASA-STD-8739.8B, para. 4.2.)
Note: See Appendix A for guidelines associated with addressing software in hazard definitions. See Table 1, 3.7.1, SWE-205 for more details. Consideration of other independent means of protection (software, hardware, barriers, or administrative) should be a part of the system hazard definition process.
4.3 Software Assurance and Software Safety Requirements
4.3.1 The responsible project manager shall ensure the performance of the software assurance, software safety, and IV&V activities; the applicable requirements are defined in Table 1. In this document, the phrase “Software Assurance and Software Safety Tasks” means that the roles and responsibilities for completing these requirements may be delegated within the project consistent with the scope and scale of the project. The Center SMA Director designates SMA TA(s) for programs, facilities, and projects, providing direction, functional oversight, and assessment for all Agency software assurance, software safety, and IV&V activities.
Table 1. Software Assurance and Software Safety Requirements Mapping Matrix
4.4.1.1 IV&V is a technical discipline of software assurance that employs rigorous analysis and testing methodologies to identify objective evidence and conclusions to provide an independent assessment of critical products and processes throughout the software development life cycle. The evaluation of products and processes throughout the life cycle demonstrates whether the software is fit for nominal operations (required functionality, safety, dependability, etc.) and off-nominal conditions (response to faults, responses to hazardous conditions, etc.). The goal of the IV&V effort is to provide assurance conclusions to the project and stakeholders based on evidence found in software development artifacts and risks associated with the intended behaviors of the software.
4.4.1.2 Three parameters define the independence of IV&V: technical independence, managerial independence, and financial independence.
a. Technical independence requires that the personnel performing the IV&V analysis are not involved in the development of the system or its elements. The IV&V team establishes an understanding of the problem and how the system addresses the problem. Through technical independence, the IV&V team’s different perspective allows it to detect subtle errors overlooked by personnel focused on developing the system.
b. Managerial independence requires that the personnel performing the IV&V analysis are not in the same organization as the development and program management team. Managerial independence also means that the IV&V team makes its own decisions about which segments of the system and its software to analyze and test, chooses the IV&V analysis methods to apply, and defines the IV&V schedule of activities. While independent from the development and program management organization, the IV&V team provides its findings in a timely manner to both of those organizations. The submission of findings to the program management organization should not include any restrictions (e.g., requiring the approval of the development organization) or any other adverse pressures from the development group.
c. Financial independence requires that the control of the IV&V budget be vested in a group independent of the software development organization. Financial independence does not necessarily mean that the IV&V team controls the budget but that the finances should be structured so that funding is available for the IV&V team to complete its analysis or test work. No adverse financial pressure or influence is applied.
4.4.1.3 The IV&V process starts early in the software development life cycle, providing timely feedback to the project and allowing the development organization to modify products at optimal points in the schedule, thereby reducing overall project risk. The feedback also answers project stakeholders’ questions about system properties (correctness, robustness, safety, security, etc.) so they can make informed decisions with respect to the development and acceptance of the system and its software.
4.4.1.4 The IV&V provider performs two primary activities, often concurrently: verification and validation. Each of the activities provides a different perspective on the system/software.
a. Verification is the process of evaluating a system and its software to provide objective evidence as to whether or not a product conforms to its build-to requirements and design specifications. Verification applies from the requirements through design and code and into testing. Verification demonstrates that the products of a given development phase satisfy the conditions imposed at the start of or during that phase.
b. Validation develops objective evidence that shows that the content of the engineering artifact is the right content for the developed system/software.
The content is accurate and correct if the objective evidence demonstrates that it satisfies the system requirements (e.g., user needs, stakeholder needs), fully describes the required capability/functionality, and solves the right problem.
4.4.1.5 The main goal of the IV&V effort is to identify and generate objective evidence that either supports or refutes the correct operation of the system. To develop this objective evidence, the IV&V provider typically works with the development team, which provides artifacts such as concept studies, operations concepts, and requirements that define the overall project. The IV&V provider uses these materials to develop an independent understanding of the project’s commitment to NASA, which forms the basis for validating lower-level technical artifacts.
4.4.1.6 Two principles help guide the development and use of objective evidence.
a. The first principle is performing IV&V throughout the entire development life cycle; potential problems should be detected as early as possible in development. Performing IV&V throughout the life cycle provides the IV&V team with sufficient information to establish a basis for its analysis results and provides early objective evidence to the development and program management groups, helping keep the development effort on track early in the life cycle.
b. The second principle is “appropriate assurance.” Because it is not possible to provide IV&V on all aspects of a project’s software, the IV&V provider and project should balance risks against available resources to define an IV&V program for each project that provides assurance that the software will operate correctly, safely, reliably, and securely throughout its operational lifetime. The IV&V Project Execution Plan (IPEP) documents this tailored approach and summarizes the cost/benefit trade-offs made in the scoping process.
4.4.1.7 The IV&V requirements are analyzed and partitioned according to the type of artifact. The requirements do not imply or require the use of any specific life cycle model; IV&V applies to any life cycle development process. The IV&V requirements document the potential scope of analysis performed by the IV&V provider and the software project’s key responsibility to provide the information needed to perform that analysis. Additionally, a risk assessment is used to scope the IV&V analysis, helping determine the prioritization of activities and the level of rigor applied to them. The scoping exercise results are captured in the IV&V Project Execution Plan (IPEP), as documented below.
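The risk-based scoping described above can be illustrated with a minimal sketch. This is not part of NASA-STD-8739.8B and is not an actual IV&V scoping method; all names (`Capability`, `score`, `rigor`) and the thresholds are invented here purely to show the idea of ranking software capabilities by risk and mapping that ranking to a level of analysis rigor.

```python
# Illustrative sketch only: hypothetical risk-based IV&V scoping helper.
# The scales, thresholds, and names are assumptions, not Standard content.
from dataclasses import dataclass


@dataclass
class Capability:
    name: str
    consequence: int  # 1 (negligible) .. 5 (catastrophic)
    likelihood: int   # 1 (improbable) .. 5 (probable)


def score(cap: Capability) -> int:
    """Simple illustrative risk score: consequence x likelihood."""
    return cap.consequence * cap.likelihood


def rigor(score_value: int) -> str:
    """Map a risk score to an illustrative level of analysis rigor."""
    if score_value >= 15:
        return "full analysis"
    if score_value >= 8:
        return "targeted analysis"
    return "limited review"


caps = [
    Capability("thruster control", consequence=5, likelihood=3),
    Capability("telemetry formatting", consequence=2, likelihood=2),
]

# Prioritize the highest-risk capabilities first, as a scoping exercise might.
for cap in sorted(caps, key=score, reverse=True):
    print(f"{cap.name}: score={score(cap)}, rigor={rigor(score(cap))}")
```

In practice the scoping factors, weighting, and resulting level of rigor would be negotiated between the IV&V provider and the project and recorded in the IPEP, not computed mechanically.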
Include Page
8739.8B 4.4.2 IV&V Requirements
8739.8B 4.4.2 IV&V Requirements
Include Page
8739.8B 4.5 Principles Related to Tailoring the Standard Requirements
8739.8B 4.5 Principles Related to Tailoring the Standard Requirements