This topic contains checklists for use by Software Assurance and Software Safety personnel when auditing projects with safety-critical software. The first checklist, the Software Safety Process Audit Checklist, is intended primarily for use with contractor organizations developing the safety-critical software and focuses on the processes in place as well as on the activities performed. The second checklist, the Software Safety Activities Checklist for Internal Audits, is intended for use when the software safety personnel are in-house and focuses more on compliance with the specific activities required for safety-critical software.
2. Software Safety Process Audit Checklist
The Software Safety Process Audit Checklist is intended primarily for use with contractor organizations developing the safety-critical software and focuses on the processes in place as well as on the activities performed.
Click to download a usable copy of this checklist: Software Safety Process Audit Checklist.
This checklist is designed to check the processes in place for performing software safety activities, whether in the contractor organization or in the in-house organization.
Is there a software safety process in place for the development of software?
Is this process documented?
Is safety-critical software identified?
Has a software safety requirements analysis been done?
Are safety-critical requirements identified?
Is there CM in place for tracking all software safety-critical requirements?
Is the software analyzed throughout the design, development, test, and V&V processes to verify and validate that the safety design requirements have been correctly and completely implemented?
Does software design analysis include Fault Tree Analysis (FTA), Failure Modes and Effects Analysis (FMEA), Code/Logic Analysis, and Traceability to assess the adequacy of hazard controls?
Is there a Safety Assessment Report?
Is there end-to-end traceability of software safety activities, from initial system assessment through implementation and verification of Safety Critical Software Functions (SCSFs), software release, deployment, operations, and maintenance?
Have the verification methods been identified and applied for each software safety-critical requirement?
Does the safety organization have work instructions for each task that is performed?
Does the safety organization participate in milestone and software reviews, including
a) Conceptual review?
b) Requirement review?
c) Design reviews?
d) Code reviews?
e) Test readiness reviews?
f) System acceptance reviews?
g) Peer reviews?
Has safety/software safety reviewed the Software Requirements Specification (SRS)?
a) Were safety-critical requirements identified?
b) Are all software controls and mitigations included in the SRS?
Does safety track safety-critical requirements throughout the system lifecycle to ensure they are correctly coded, tested, and verified?
Is there a current process for formally documenting software safety-critical issues?
Does the software safety organization review and approve all changes to the software safety-critical requirements during the change review process?
Is there objective data or evidence that shows the software development organizations followed good software processes?
Is there objective evidence that the system engineers and hardware engineers have reviewed the software requirements, software design, software testing and participated in the software peer reviews associated with software safety-critical functions?
Does the safety process include a hazard identification and analysis process?
Are system-level hazards identified and tracked?
Have all of the software contributions to system hazards been identified?
Have software/hardware controls been identified to mitigate software contributions to system hazards?
Have adequate verification methods been identified for each hazard mitigation?
Are all safety-critical software requirements traced through the software products (i.e., software requirements, software design, software code, software test documents)? (See the traceability-check sketch following this checklist.)
Are the interfaces in an ICD or equivalent clearly defined?
What was the method for documenting discrepancies in the requirements? Were all discrepancies in the requirements documented in a tracking system?
Did the safety organization provide objective evidence that all safety-related discrepancies in the requirements review were fixed and closed?
Did safety ensure that all safety-related requirements have been satisfied by the design?
Is there evidence of closure of all safety-related action items from the software design reviews?
Did safety participate in design evaluations before release for coding?
Does the software architecture consistently define and support the implementation of all software safety-critical requirements and all applicable computer-based control system requirements?
Does the software architecture consistently define and support the implementation of Fault Detection, Isolation, and Recovery functions for all safety-critical activities? (See the FDIR sketch following this checklist.)
Does the software design address software fault management functions?
Does the software design address cybersecurity requirements and functions?
Is all of the software design functionally traceable to the software requirements?
Did the software development engineers follow a secure coding standard when creating code?
Did the safety organization perform traceability from the requirements down to the design and code?
Was safety involved in the code peer reviews and/or code walkthroughs?
What was the method for documenting discrepancies in the code?
Is there objective evidence that all safety-related discrepancies identified in code reviews were fixed and closed?
Are all of the safety-critical code components below the cyclomatic complexity requirement of 15? (See the complexity-check sketch following this checklist.)
Is there evidence of repeatable software unit tests for all software safety-critical components?
Has an independent software code quality risk assessment been done, and is the software code risk acceptable?
Was the safety organization involved in test peer reviews for safety-critical test cases?
What were the methods for documenting discrepancies in the test?
Was traceability performed from the requirements and code to the test cases?
Was software assurance involved in the test readiness review?
Is there objective evidence that all safety-related discrepancies in the test reviews are fixed and closed?
Did software assurance witness all safety-critical formal qualification tests?
Does the test verify all software safety-critical components?
Do all software safety-critical components have 100% test coverage? (See the coverage-check sketch following this checklist.)
Is there objective evidence that the software being used is the same software version tested, including the software configuration items (e.g., software records, code, configuration data, tools, models, scripts)?
System Acceptance Review
Have safety issues been identified, documented, and resolved throughout the software lifecycle?
Is there evidence that all software changes are tracked and evaluated?
Is there a plan in place for maintenance, changes, and operations of the software?
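The sketches below illustrate, for a few of the checklist items above, how objective evidence might be gathered. They are informational examples only; every file name, identifier convention, tool choice, and threshold in them is an assumption, not a requirement of this checklist.

For the requirements-traceability item: a minimal Python script that scans design, code, and test artifacts for requirement identifiers (assuming a hypothetical SWSR-nnn numbering convention) and reports any safety-critical requirement with no downstream trace.

```python
import re
from pathlib import Path

# Hypothetical requirement ID convention, e.g. "SWSR-042" (an assumption;
# substitute the project's actual numbering scheme).
REQ_ID = re.compile(r"SWSR-\d{3}")

def ids_in(paths):
    """Collect every requirement ID mentioned in the given text files."""
    found = set()
    for path in paths:
        found |= set(REQ_ID.findall(Path(path).read_text(errors="ignore")))
    return found

def trace_report(requirement_docs, design_docs, source_files, test_docs):
    """Report requirements with no trace in a downstream artifact."""
    required = ids_in(requirement_docs)
    for artifact_name, artifact_files in (("design", design_docs),
                                          ("code", source_files),
                                          ("test", test_docs)):
        for req in sorted(required - ids_in(artifact_files)):
            print(f"{req}: no trace found in {artifact_name} artifacts")

if __name__ == "__main__":
    # Placeholder file names; pass the project's real artifacts instead.
    trace_report(["srs.txt"], ["sdd.txt"], Path("src").rglob("*.py"), ["stp.txt"])
```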
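For the Fault Detection, Isolation, and Recovery item: a minimal sketch of one detect/isolate/recover cycle around a redundant sensor pair. The Sensor class, the engineering limits, and the safe-mode behavior are hypothetical stand-ins; real FDIR logic is driven by the project's hazard analysis and architecture.

```python
class Sensor:
    """Stand-in sensor model used only for this sketch (hypothetical)."""
    def __init__(self, value):
        self.value = value
        self.powered = True

    def read(self):
        return self.value

    def power_off(self):
        self.powered = False


VALID_RANGE = (0.0, 150.0)  # hypothetical engineering limits for the measurement

def fault_detected(reading):
    """Detection: flag readings outside the valid engineering range."""
    return not (VALID_RANGE[0] <= reading <= VALID_RANGE[1])

def fdir_step(primary, backup):
    """One detect / isolate / recover cycle for a redundant sensor pair."""
    reading = primary.read()
    if fault_detected(reading):
        primary.power_off()        # isolation: take the faulty unit offline
        reading = backup.read()    # recovery: switch to the redundant unit
        if fault_detected(reading):
            return None            # both units faulted: caller commands safe mode
    return reading

# Example: the primary returns an out-of-range value, so the backup takes over.
print(fdir_step(Sensor(999.0), Sensor(42.0)))   # -> 42.0
```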
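For the cyclomatic complexity item: a rough check built on Python's standard ast module that flags functions above the limit of 15 cited in the checklist. Counting rules differ slightly between tools, so the project's designated static-analysis tool remains the authoritative source of evidence; this is only an approximate sketch.

```python
import ast
import sys

THRESHOLD = 15  # complexity limit cited in the checklist item

# Constructs counted as decision points (approximate McCabe counting).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func_node):
    """Approximate McCabe complexity: 1 + number of decision points."""
    return 1 + sum(isinstance(node, DECISION_NODES)
                   for node in ast.walk(func_node))

def check_file(path):
    """Print any function in `path` whose complexity exceeds THRESHOLD."""
    with open(path) as handle:
        tree = ast.parse(handle.read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            score = cyclomatic_complexity(node)
            if score > THRESHOLD:
                print(f"{path}:{node.lineno} {node.name} complexity={score}")

if __name__ == "__main__":
    for source_file in sys.argv[1:]:
        check_file(source_file)
```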
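For the 100% test coverage item: a sketch of how a Python project might gate on full statement and branch coverage using the widely used coverage.py and pytest packages. The toolchain and the test module name are assumptions; equivalent evidence can come from whatever coverage tool the project has designated.

```python
# Minimal sketch, assuming `pip install coverage pytest` and a hypothetical
# test module named test_safety_critical.py.
import coverage
import pytest

cov = coverage.Coverage(branch=True)       # measure branch coverage as well
cov.start()
exit_code = pytest.main(["test_safety_critical.py"])
cov.stop()
cov.save()

percent = cov.report(show_missing=True)    # prints per-file results, returns total %
if exit_code != 0 or percent < 100.0:
    raise SystemExit("tests failed or coverage is below 100%")
```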
3. Software Safety Activities Checklist for Internal Audits
The Software Safety Activities Checklist for Internal Audits is intended for use when the software safety personnel are in-house and focuses more on compliance with the specific activities required for safety-critical software.
Click to download a usable copy of this checklist: Software Safety Activities Checklist for Internal Audits.
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.