1. Introduction
This checklist provides a comprehensive list of the data and evidence required to certify software for human-rated missions. It ensures compliance with applicable safety standards and regulatory requirements (NASA NPR 7150.2D (SWEREF-083), SSP 50038 (SWEREF-014), FAA requirements, and NASA-STD-8739.8B (SWEREF-278)), mission-critical functionality, and stakeholder acceptance of residual risks, demonstrating that the software is safe, reliable, and mission-ready for crewed spaceflight operations.
2. Key Compliance Data Needs
2.1 Summary Table of Key Compliance Data Needs
| Category | Key Data/Documentation |
|---|---|
| Requirements | High-level and detailed software requirements, safety constraints, bi-directional traceability data, interface specifications, security requirements |
| Design | Software design description, hardware design data for safety-critical subsystems, data dictionary |
| Development | Analysis results (time-to-effect, fault tree, FMEA), process audit results, developer training records |
| Verification & Validation | Test data and results, code coverage data, static analysis reports, IV&V assessment results, complexity and structural quality evidence |
| Hazard Analysis | Hazards and mitigation controls involving software, list of unresolved hazards |
| Configuration Management | Version control, change tracking, and baseline management processes; flight-ready software configurations |
| Operational Procedures | Operational environment description, software update procedures, open defect lists, residual risk sign-offs, regulatory deliverables |
2.2 Key Compliance Data Needs
- Software Requirements
- High-level system/software requirements
- Detailed software requirements (or equivalent developer documentation)
- All known software safety constraints
- Software bi-directional traceability data
- Specifications for the definition and testing of internal and external software interfaces
- Encryption protocols, authentication mechanisms, secure coding practices, and access control procedures.
- Software Design
- Description of the software design
- Hardware design data on safety-critical subsystems
- Data Dictionary: input/output data formats, telemetry parameters, and command sequences.
- Software Development
- All software analysis results
- Completed Time-to-effect (TTE) analysis
- Completed Fault Tree Analyses
- Completed Failure Mode and Effects Analysis
- Software process audit results
- Developer software process training records
- Software Verification and Validation (software testing)
- Software test data
- Safety-critical requirements test results
- Fault injection test results
- End-to-end integration test results
- Penetration test results (resilience testing and telemetry plans against unauthorized system access and cyberattacks)
- Test results and data showing command execution timing within acceptable limits
- Test results and data confirming adequate system resource margins
- Detailed description of the software test environments
- software interfaces (internal and external) test results
- Code test coverage data
- Software static analysis reports
- Number and types of static analysis tools used.
- Results of a Security Vulnerability Analysis: detected and resolved vulnerabilities in the software's security framework.
- All Independent Verification and Validation (IV&V) assessment results
- Data showing that the safety-critical software components meet complexity thresholds
- Evidence that the code's structural quality poses low risk
- Hazards
- Hazards and mitigation controls that include software
- List of any unresolved hazards
- Configuration Management (CM)
- Processes used for version control, change tracking, and baseline management
- Identification of flight-ready software configurations
- Flight readiness and Operations
- Clear understanding of the operational environment for the mission.
- Operational procedures for updating the software and data
- Any software-related threats in the operational environment that could affect software operation
- List of and access to all open software defects
- List of and access to all open and closed high-risk software defects.
- Stakeholder-approved sign-off on any unavoidable operational software-related risks.
- Evidence of adherence to validated development processes, coding guidelines, and testing protocols.
- Deliverables required for regulatory certification
- Software Version Description Document (VDD)
- FRR Exit Criteria Sign-Off for software
- Crew software user guides, operational procedures, and troubleshooting documentation.
- Documentation showing mechanisms to handle errors, recover failures, and preserve system operation under degraded conditions.
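The bi-directional traceability data listed above can be screened mechanically before certification review. The sketch below (with illustrative requirement IDs and artifact names, not a prescribed format) flags any requirement that lacks a design, code, or test link:

```python
# Sketch of a bi-directional traceability completeness check.
# Requirement IDs and artifact links are illustrative only.

def untraced_items(rtm: dict[str, dict[str, list[str]]]) -> list[str]:
    """Return requirement IDs missing a design, code, or test trace."""
    gaps = []
    for req_id, links in rtm.items():
        for phase in ("design", "code", "test"):
            if not links.get(phase):
                gaps.append(f"{req_id}: no {phase} trace")
    return gaps

rtm = {
    "SRS-001": {"design": ["SDD-3.1"], "code": ["gnc/abort.c"], "test": ["TC-17"]},
    "SRS-002": {"design": ["SDD-3.2"], "code": [], "test": ["TC-18"]},
}
print(untraced_items(rtm))  # -> ['SRS-002: no code trace']
```

A real RTM audit would also check the reverse direction (every code and test artifact traces back to a requirement); the same structure extends to that case.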
3. Safety Case for Human-Rated Software Certification
This safety case demonstrates that the software used in this human-rated mission adheres to rigorous safety, quality, and regulatory standards. Based on the evidence provided, the software is flight-ready and capable of supporting critical mission operations while ensuring the safety of the crew and spacecraft under both nominal and adverse conditions.
1. Requirements and Traceability
- Argument: The software requirements are clearly defined, traceable, and aligned with safety-critical mission needs.
- Evidence:
- Comprehensive Software Requirements Specification (SRS) covering high-level mission-critical systems (e.g., navigation, propulsion, anomaly detection, life support, and abort operations).
- Verified safety requirements (fault tolerance, redundancy, and safe initialization/termination).
- Acceptable quality of detailed low-level safety-critical requirements, including specifics like algorithm designs and timing constraints.
- A completed and validated Requirements Traceability Matrix (RTM) showing bi-directional traceability from requirements through design, code, and test results.
- Reviewed system-level safety analyses to document "Must Work" (MWF) and "Must Not Work" (MNWF) requirements, prerequisite checks for hazardous commands, and mitigation strategies.
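The prerequisite checks for hazardous commands mentioned above can be illustrated with a short sketch; the command name, precondition flags, and vehicle states are hypothetical:

```python
# Illustrative prerequisite check for a hazardous command: the command is
# "must not work" unless every precondition holds. Names are hypothetical.

REQUIRED_PRECONDITIONS = {
    "FIRE_DEORBIT_BURN": ["crew_ack_received", "attitude_nominal", "tanks_pressurized"],
}

def command_permitted(command: str, state: dict[str, bool]) -> bool:
    """A hazardous command executes only if all its prerequisites are true.
    Commands absent from the table are treated as non-hazardous here."""
    return all(state.get(p, False) for p in REQUIRED_PRECONDITIONS.get(command, []))

state = {"crew_ack_received": True, "attitude_nominal": True, "tanks_pressurized": False}
print(command_permitted("FIRE_DEORBIT_BURN", state))  # -> False
```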
2. Software Design and Architecture
- Argument: The software architecture is resilient, modular, and designed for fault tolerance and safety-critical operations.
- Evidence:
- Architecture documentation detailing modular fault isolation, redundancy, and resiliency mechanisms.
- Block diagrams illustrating fault containment, fail-safe control paths, and separation of critical functions.
- Documentation and analysis of safety-critical subsystems (e.g., propulsion, crew displays, navigation) with clearly defined responsibilities.
- Verified Interface Control Documents (ICDs), ensuring compatibility between internal software, hardware systems, and external interactions.
- Safety validation evidence for safeguards like fault containment, error detection, operator validation, integrity checks, and anomaly recovery processes.
- Independent redundant system designs ensuring physical and logical separation to mitigate single points of failure.
- Validation of fault-tolerant mechanisms, including cosmic radiation protection in CPU designs.
3. Hazard Analysis and Safety Evidence
- Argument: All hazards associated with software functionality are identified, analyzed, and mitigated to acceptable levels of risk.
- Evidence:
- A complete Hazard Analysis Report (HAR) identifying software-driven hazards and the mitigation strategies in place.
- Fault Tree Analysis (FTA) and Failure Mode and Effects Analysis (FMEA) showing robust fault prevention and recovery mechanisms.
- Time-to-effect (TTE) analyses ensuring hazardous conditions can be addressed by safing systems within operational thresholds.
- Residual risk documentation showing resolution or acceptance of remaining risks by stakeholders.
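The time-to-effect analysis above reduces to a margin comparison: worst-case safing response must complete well inside each hazard's time-to-effect. A sketch with invented hazard names and timings:

```python
# Sketch of a time-to-effect (TTE) margin check. Hazard names and
# timings are illustrative; min_margin is an assumed safety factor.

def tte_margins(hazards: dict[str, tuple[float, float]], min_margin: float = 2.0):
    """hazards maps name -> (time_to_effect_s, worst_case_safing_s).
    Returns hazards whose safing time, scaled by the margin factor,
    exceeds the time-to-effect."""
    return [name for name, (tte, safing) in hazards.items()
            if safing * min_margin > tte]

hazards = {"cabin_depress": (30.0, 5.0), "thruster_stuck_on": (4.0, 2.5)}
print(tte_margins(hazards))  # -> ['thruster_stuck_on']
```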
4. Verification and Validation (V&V) Evidence
- Argument: Rigorous testing, validation, and coverage analyses demonstrate software compliance with safety-critical requirements.
- Evidence:
- Coverage analysis demonstrating:
  - 100% Statement Coverage
  - 100% Decision Coverage
  - 100% Modified Condition/Decision Coverage (MC/DC) for safety-critical components
- Unit testing, system integration testing, end-to-end validation, and operational flight simulations confirming that expected functional performance aligns with safety goals.
- Validation of reused components (COTS, GOTS, OSS, MOTS) to ensure compatibility and reliable integration into human-rated environments.
- Static analysis reports showing compliance with coding standards and identification/remediation of software defects.
- Fault injection testing results validating responses to corrupted data, anomalies during power disruptions, and memory errors.
- Worst-case response timing analysis confirming safing systems meet TTE requirements under degraded conditions.
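A certification gate on the structural coverage evidence above can be automated. The sketch below (module names and coverage figures are made up) flags any safety-critical module reporting less than 100% statement, decision, or MC/DC coverage:

```python
# Illustrative coverage gate: safety-critical modules must report 100%
# statement, decision, and MC/DC coverage. Data is invented.

def coverage_gaps(report: dict[str, dict[str, float]]) -> list[str]:
    """Return 'module:kind' entries for any coverage metric below 100%."""
    required = ("statement", "decision", "mcdc")
    return [f"{mod}:{kind}" for mod, cov in report.items()
            for kind in required if cov.get(kind, 0.0) < 100.0]

report = {
    "abort_logic": {"statement": 100.0, "decision": 100.0, "mcdc": 100.0},
    "nav_filter":  {"statement": 100.0, "decision": 98.5, "mcdc": 97.0},
}
print(coverage_gaps(report))  # -> ['nav_filter:decision', 'nav_filter:mcdc']
```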
5. Configuration Management and Change Tracking
- Argument: Configuration management processes ensure version control and traceability for all software changes.
- Evidence:
- Documentation showing version-controlled baselines for flight-ready software, including configuration hashes and release notes.
- Audit records verifying modifications, regression testing, impact analyses, and stakeholder approvals.
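The configuration hashes mentioned above support a simple mechanical check: the deployed artifact's digest must match the hash recorded in the release baseline. A sketch using SHA-256, with a stand-in artifact file (the file name and contents are placeholders):

```python
# Sketch of baseline verification via configuration hashes. The artifact
# name and contents are placeholders for a real flight software image.

import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_baseline(path: str, baseline_hash: str) -> bool:
    return sha256_of(path) == baseline_hash

# Write a stand-in artifact, then verify it against its recorded hash.
with open("fsw_build.bin", "wb") as f:
    f.write(b"flight software image v4.2.1")
print(matches_baseline("fsw_build.bin", sha256_of("fsw_build.bin")))  # -> True
```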
6. Cybersecurity and Security Validation
- Argument: The software architecture incorporates robust cybersecurity measures to mitigate threats in operational environments.
- Evidence:
- Security validation reports demonstrating encryption protocols, authentication mechanisms, access control, and secure coding practices.
- Penetration testing results validating resilience against cyberattacks and unauthorized system access during pre-launch and flight.
- Vulnerability analysis reports confirming detection, resolution, and closure of security-related risks.
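One representative secure coding practice behind evidence like the above is constant-time comparison of authentication secrets, which avoids timing side channels. A minimal sketch (the token values are placeholders):

```python
# Illustrative secure-coding practice: compare authentication tokens in
# constant time rather than with ==. Token values are placeholders.

import hmac

def token_valid(received: bytes, expected: bytes) -> bool:
    # hmac.compare_digest avoids the early-exit timing leak of == on secrets.
    return hmac.compare_digest(received, expected)

print(token_valid(b"s3cret-token", b"s3cret-token"))  # -> True
print(token_valid(b"s3cret-tokem", b"s3cret-token"))  # -> False
```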
7. Defect Management and Residual Risks
- Argument: All software defects have been resolved or mitigated to acceptable levels of residual risk.
- Evidence:
- Defect reports showing all open and closed defects categorized by severity and justifications for acceptance of residual risks.
- Logs documenting defect resolutions and testing data validating the outcomes of mitigation measures.
- Residual risk acceptance documentation signed off by stakeholders, with sufficient evidence showing safe system behavior despite unresolved minor risks.
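A defect-disposition gate of the kind described above can be expressed as a filter: no high-severity defect may remain open without a recorded risk acceptance. The defect records and severity scale below are illustrative:

```python
# Sketch of a defect-disposition gate. Records, severity codes, and the
# 'risk_accepted_by' field are illustrative, not a mandated schema.

def blocking_defects(defects: list[dict]) -> list[str]:
    """Return IDs of open high-severity defects lacking risk acceptance."""
    return [d["id"] for d in defects
            if d["status"] == "open" and d["severity"] in ("1", "2")
            and not d.get("risk_accepted_by")]

defects = [
    {"id": "DR-101", "severity": "1", "status": "closed"},
    {"id": "DR-102", "severity": "2", "status": "open"},
    {"id": "DR-103", "severity": "4", "status": "open"},
]
print(blocking_defects(defects))  # -> ['DR-102']
```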
8. Resource Utilization and Performance Metrics
- Argument: The software demonstrates sufficient resource margins and acceptable performance under normal and worst-case conditions.
- Evidence:
- Validation test results confirming acceptable command execution timing (e.g., abort triggers).
- Operational analysis showing CPU utilization remains below 80% even under maximum load conditions.
- Methods for anomaly detection and recovery to safe states outlined and validated.
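The 80% CPU ceiling above amounts to a margin check over worst-case load samples; the sketch below uses invented utilization figures:

```python
# Sketch of the CPU-margin check: utilization sampled under maximum load
# must stay below the 80% ceiling. Sample values are invented.

CPU_CEILING = 80.0  # percent, per the resource-margin criterion above

def margin_violations(samples: list[float]) -> list[float]:
    """Return every sample at or above the utilization ceiling."""
    return [s for s in samples if s >= CPU_CEILING]

worst_case_samples = [61.2, 74.9, 79.5, 83.1, 77.0]
print(margin_violations(worst_case_samples))  # -> [83.1]
```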
9. Team Training and Software Process Compliance
- Argument: Development teams adhere to validated processes and are properly trained in safety-critical mission standards.
- Evidence:
- Records of team training addressing human-rated software workflows, defect management, and compliance with coding guidelines.
- Process compliance reports documenting adherence to validated development processes.
- Operator manuals ensuring that deliberate, independent actions are required to execute critical safety commands.
10. Certification and Regulatory Compliance
- Argument: The software complies with all applicable standards and safety regulations for human-rated missions.
- Evidence:
- Certification artifacts for compliance with standards such as NASA NPR 7150.2D (SWEREF-083), NASA SSP 50038 (SWEREF-014), FAA requirements, and NASA-STD-8739.8B (SWEREF-278).
- IV&V certification reports confirming operational maturity and compliance with safety standards by independent entities.
- Regulatory compliance statements from authorities certifying readiness for human-rated missions.
- Validation of software updates (patched or upgraded) ensuring continued compliance with safety requirements.
11. Flight Readiness Review (FRR) Certification
- Argument: The software is flight-ready and capable of safely supporting mission operations.
- Evidence:
- Software Version Description Document (VDD) completion demonstrating proper documentation of the deployed software.
- Final test results confirming readiness during flight operations in all mission environments.
- FRR exit criteria signed off by stakeholders, certifying acceptance or resolution of all known risks, hazards, defects, and anomalies.
12. Flight Software Structural Quality
- Argument: The software architecture and implementation are structurally sound and meet all quality standards for safety-critical applications.
- Evidence:
- Cyclomatic complexity analysis showing all safety-critical components meet thresholds (≤ 15).
- Documentation verifying fault-tolerant mechanisms for error handling, failure recovery, and system operation under degraded conditions.
- Maintainability analysis supporting modular coding practices for long-term sustainability and easy updates.
- Code quality reports validating compliance with architecture, standards, security, and testability requirements.
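The ≤ 15 cyclomatic complexity threshold above can be screened with a rough estimate (complexity = 1 + decision points). A real audit would use a qualified analysis tool; the sketch below uses Python's `ast` module on an invented function:

```python
# Rough cyclomatic-complexity estimate for Python sources, screened
# against the <= 15 threshold. The sample function is invented; a real
# audit would use a qualified static analysis tool.

import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def complexity(func_src: str) -> int:
    """Estimate cyclomatic complexity as 1 + count of decision nodes."""
    tree = ast.parse(func_src)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

src = """
def safing(mode, pressure, attitude_ok):
    if mode == "ABORT":
        return "fire_abort_motor"
    if pressure < 8.0 and not attitude_ok:
        return "isolate_cabin"
    return "nominal"
"""
# Two if-statements plus one boolean operator -> complexity 4, under threshold.
print(complexity(src), complexity(src) <= 15)  # -> 4 True
```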
4. Resources
4.1 References
- (SWEREF-014) SSP 50038, Revision C, NASA International Space Station Program, 1995.
- (SWEREF-083) NPR 7150.2D, NASA Software Engineering Requirements, Effective Date: March 08, 2022, Expiration Date: March 08, 2027. https://nodis3.gsfc.nasa.gov/displayDir.cfm?t=NPR&c=7150&s=2D
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, Approved 2022-09-08, superseding NASA-STD-8739.8A.
4.2 Tools
4.3 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
4.4 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
| SPAN Links |
|---|