This version of the SWEHB is associated with NPR 7150.2B.
3.5.2 The project’s software assurance manager shall perform an independent classification assessment.
Engineering and software assurance must reach agreement on the software classification determination. Disagreements are elevated via both the Engineering Technical Authority and the Safety and Mission Assurance Technical Authority chains.
1.2 Applicability Across Classes
(Applicability table for classes A, B, C, CSC, D, DSC, E, F, G, and H: the applicable/not-applicable markers are not reproduced here.)
A & B = Always Safety Critical; C & D = Not Safety Critical; CSC & DSC = Safety Critical; E - H = Never Safety Critical.
The reason for an independent (separate from the project) assessment of the software class is two-fold:
(1) It provides an additional perspective, based on analyses, of a project's software to ensure that the proper software processes and requirements are applied to a project's software development.
(2) It assures that the software assurance personnel are aware of, and plan for, the level of assurance needed to cover the software for the project.
Having separate assessments allows engineering and assurance to come to the table with their own analysis-backed perspectives on the processes and requirements they expect the project to need, and then work together to resolve any differences and reach consensus on what the project needs. The independent assessment also gives software assurance personnel a path to raise their issues and concerns, and to elevate any that remain unresolved. Quality, reliability, and safety are required to have a separate, independent reporting path, extending as high as NASA Headquarters if needed, to assure that Safety and Mission Assurance (S&MA) concerns are addressed. However, it is expected that most conflicts will be resolved at the project or Center level.
This requirement lets the project know that it needs to allow for, and expect, an independent assessment of the class of software being used on the project.
The independent assessment is performed by the Center's S&MA organization's software assurance personnel. This assessment is independent of the classification performed by the project.
The project and software assurance each document their own initial analyses, rationale, and assessments within their own systems and records. The final consensus software classification agreed between the project and the independent Center S&MA personnel is recorded in the project documentation.
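The two-track record keeping described above, with each organization documenting its own assessment and disagreements elevated rather than silently overridden, can be sketched as a small data model. This is an illustrative sketch only; the class names, fields, and consensus rule below are hypothetical and are not drawn from NPR 7150.2 or the SWEHB:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical model of the NPR 7150.2 software classes and the
# safety-critical designations (CSC/DSC) used in the key above.
class SoftwareClass(Enum):
    A = "A"; B = "B"; C = "C"; CSC = "CSC"; D = "D"
    DSC = "DSC"; E = "E"; F = "F"; G = "G"; H = "H"

@dataclass
class Assessment:
    organization: str            # e.g., "Project Engineering" or "Center S&MA"
    software_class: SoftwareClass
    rationale: str               # each organization documents its own analysis

def consensus(engineering: Assessment, assurance: Assessment):
    """Return the agreed class to record in project documentation,
    or None to signal elevation via both Technical Authority chains."""
    if engineering.software_class == assurance.software_class:
        return engineering.software_class
    return None  # disagreement: elevate, do not silently pick one side

# Two independently documented assessments that happen to agree:
eng = Assessment("Project Engineering", SoftwareClass.C,
                 "payload data processing only")
sma = Assessment("Center S&MA", SoftwareClass.C,
                 "no hazard contributions identified")
agreed = consensus(eng, sma)  # agreed class -> record in project documentation
```

Note that the sketch keeps both `Assessment` records even after consensus is reached, mirroring the requirement that each organization's own analysis remains in its own systems and records.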
For the preliminary assessment, which is based on high-level project goals and system functions, the role of software is judged from its expected roles and responsibilities in performing those system functions. At this stage, the preliminary software classifications can only be based on the overall functions, and the consensus software classifications are recorded in a document such as a preliminary System Requirements document.
As system requirements evolve into software requirements and the Software Development/Management Plan takes shape, both the Software Development/Management Plan and the Software Assurance Plan (if separate) contain the consensus software classification.
Once there is more specificity about a sub-system and its possible computer software configuration items (CSCIs), typically by the Preliminary Design Review (PDR), both the project software engineering and the Center S&MA assurance personnel need to assess and reach consensus on the classification of each CSCI. Again, each organization performs and documents its own assessment, and the consensus classifications are documented and signed off in the project documents; at this point, the design documents may be used to capture the consensus classifications for the CSCIs, or the Software Development/Management Plan(s) can be updated. The intent is to document the agreed-upon software classification wherever it is most useful to the project for properly planning and executing the processes and requirements that apply to the CSCIs. If there is little volatility, the classification and safety criticality could even be documented in both places, though this raises the obvious issue of keeping them consistent.
The level of CSCIs a project decides on determines the level and number of software classification assessments that need to be performed. At a minimum, there is an assessment for each system and sub-system containing software.
S&MA software assurance personnel and the software engineers might need to work together to understand and work out the software classifications for new and novel software or approaches to software development, including, but not limited to:
- Development approaches new to NASA.
- Approaches new to the software industry, such as new software design approaches.
- Innovative software, such as software that employs new ways of dividing up functionality, perhaps to support an experiment.
NASA-STD-8739.8, NASA Software Assurance Standard, contains more details on the software assurance approach to software classification and how it is used to determine the level of effort for software assurance on a project. A software assessment sheet used to capture the software classification can be found in Appendix A of the Standard.
Additional guidance related to classifying software may be found in the following related requirement in this Handbook:
4. Small Projects
Even small projects need to have an independent assessment, and it is especially important to assure consensus between the project and assurance on the criticality as well as the class or classes of software. Smaller projects can consider performing only one overall assessment, adjusting the levels at which the assessments are done, or simplifying when and where the agreed-upon classification is documented.
5. Tools
Tools relative to this SWE may be found in the table below. You may wish to reference the Tools Table in this handbook for an evolving list of these and other tools in use at NASA. Note that this table should not be considered all-inclusive, nor is it an endorsement of any particular tool. Check with your Center to see what tools are available to facilitate compliance with this requirement.
No tools have currently been identified for this SWE.
6. Lessons Learned
No Lessons Learned have currently been identified for this requirement.