This version of SWEHB is associated with NPR 7150.2B.

1. Requirements
4.2.4 The project manager shall perform a software architecture review on the following categories of projects:
a. Category 1 Projects as defined in NPR 7120.5.
b. Category 2 Projects as defined in NPR 7120.5 that have Class A or Class B payload risk classification per NPR 8705.4.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
This requirement applies to projects meeting the selection criteria stated in the requirement itself; see the expanded criteria definitions in the Guidance section below.
2. Rationale
Software architecture deals with the fundamental organization of a system, as embodied in its components and their relationships to each other and to the environment. 487 Software architecture reviews are conducted to inspect the quality attributes, design principles, verifiability, and operability of a software system. The software architecture is reviewed to evaluate its effectiveness, efficiency, and robustness relative to the software requirements. A software architecture review helps manage and reduce flight software complexity, yielding an improved software architecture, improved mission software reliability, and cost savings.
3. Guidance
Per the FAQ section of the NASA Software Architecture Review Board (SARB) sub-community of practice, accessible to NASA users on the NASA Engineering Network (NEN):
“Software architecture should be evaluated with respect to the problem it is trying to solve. That's why a project's problem statement is an essential part of review preparation; it describes the problem to be solved and lists success criteria held by various stakeholders. Software architecture should also be evaluated with respect to common concerns within its problem domain [e.g.,] flight software...
“During a review it's important to keep in mind the distinction between software architecture and software architecture description. A system can have a great architecture and a poor description, or vice versa, or any of the other two combinations. In a review it's important to note where each weakness lies: in architecture or description. ...
“The system architecture and the requirements for the items shall be evaluated considering the criteria listed below. The results of the evaluations shall be documented.
a) Traceability to the system requirements.
b) Consistency with the system requirements.
c) Appropriateness of design standards and methods used.
d) Feasibility of the software items fulfilling their allocated requirements.
e) Feasibility of operation and maintenance.
NOTE System architecture traceability to the system requirements should also provide for traceability to the stakeholder requirements baseline.” 224
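Because the quoted standard requires that the results of the evaluations be documented, some teams capture each criterion's outcome in a structured record rather than in free-form notes. The Python sketch below is illustrative only; the names `Criterion`, `EvaluationResult`, and `weakness_in` are assumptions made for this example, not part of any NASA standard or tool.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Criterion(Enum):
    """The five evaluation criteria, a) through e), quoted above."""
    TRACEABILITY = "Traceability to the system requirements"
    CONSISTENCY = "Consistency with the system requirements"
    DESIGN_STANDARDS = "Appropriateness of design standards and methods used"
    REQUIREMENTS_FEASIBILITY = ("Feasibility of the software items "
                                "fulfilling their allocated requirements")
    OPERATIONS_FEASIBILITY = "Feasibility of operation and maintenance"


@dataclass
class EvaluationResult:
    """One documented evaluation result for a single criterion."""
    criterion: Criterion
    satisfied: bool
    # Per the SARB guidance above, note whether a weakness lies in the
    # architecture itself or only in its description.
    weakness_in: Optional[str] = None  # "architecture" or "description"
    notes: str = ""


# Example: document a description-only weakness against criterion a).
results = [
    EvaluationResult(
        criterion=Criterion.TRACEABILITY,
        satisfied=False,
        weakness_in="description",
        notes="Trace matrix omits two derived fault-management requirements.",
    ),
]
```

Recording the architecture-versus-description distinction as an explicit field follows the SARB advice above to note where each weakness lies.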
Software architecture reviews are held during the software architecture formulation phase, before the software architecture is finalized. Generally, a software architecture review should be held prior to the software preliminary design review (PDR). The earlier the review board is involved in the software architecture development process, the more effective its inputs to the architecture development will be.
The SARB checklist found on the SARB sub-community of practice, accessible to NASA users on NEN, provides the following advice for software architecture reviews:
- Some of the basic questions to be answered during the review include:
  - Is it clear what the architecture needs to do?
  - Is the architecture capable of supporting what the system must do?
  - How does the architecture compare with previously implemented architectures, both successful and unsuccessful? In particular, does the architecture display superior qualities that should be incorporated on other missions and/or at other Centers?
  - Have all the requirements been defined?
  - Have all the stakeholders been identified to provide the essential design input specifying how the software is to be operated, how it is to execute its functions, and how it is to provide its products?
  - Have all the relevant quality attributes been identified and prioritized, and have they been characterized relative to the nature of the mission?
  - Has the source of all software elements been identified? Is it clear which portions of the system are custom-built, which are heritage/reused, and which are commercial off-the-shelf (COTS)? Does rationale exist explaining the project’s respective choices?
  - Is it clear whether the architecture will be compatible with the other system components?
  - Has the architecture been properly documented for future reference?
  - Does the architecture induce unnecessary complexity when addressing what must be done?
  - Is the proposed fault management approach appropriate for the system?
  - Is the architecture flexible enough to support maturation of the architects’ understanding of what the system must do?
  - Does the architecture support efficient code development and testing?
  - Have safety-critical elements of the architecture been identified properly?
  - Will the architecture support future systems that are natural extensions or expansions of the current system?
  - Does the architecture support performance requirements, including processing rates, data volumes, downlink constraints, etc.?
  - To enable effective reuse, does the proposed architecture support some degree of scalability?
- When putting together the review team, choose reviewers sufficiently familiar with the type of mission being supported to know what basic functions the software must implement.
- General guidance for architecture reviews:
  - Have a standard approach/format for presenting the architecture; use a format similar to a formal design review, for example.
  - Have a standard template for the reviews to ensure the required material is included in the presentation.
  - Presenters should clearly present the architecture and the rationale for their architecture choices.
The results of the software architecture review, including findings, concerns, and best practices, are captured in a report and presented by the review board to management and others who can improve future processes and ensure that identified concerns are addressed. Problems found during the software architecture review should be entered into the project’s closed-loop problem tracking or corrective action system to ensure they are resolved.
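As an illustration of what "closed-loop" means in practice, the sketch below models a review finding whose status can only reach closure after its resolution has been verified. All names here (`ReviewFinding`, `FindingStatus`, `close_finding`) are hypothetical; actual projects would use their Center's problem reporting or corrective action tooling.

```python
from dataclasses import dataclass
from enum import Enum


class FindingStatus(Enum):
    """A minimal closed-loop lifecycle: nothing is done until verified."""
    OPEN = "open"
    IN_WORK = "in work"
    RESOLVED = "resolved"  # a fix has been implemented
    CLOSED = "closed"      # the fix has been verified


@dataclass
class ReviewFinding:
    """One finding from the architecture review report."""
    identifier: str
    summary: str
    severity: str          # e.g., "concern" or "best practice"
    owner: str
    status: FindingStatus = FindingStatus.OPEN


def close_finding(finding: ReviewFinding, verification_passed: bool) -> None:
    """Close the loop: a finding may only be closed after its
    resolution has been independently verified."""
    if finding.status is not FindingStatus.RESOLVED:
        raise ValueError("Only a resolved finding can be submitted for closure.")
    if verification_passed:
        finding.status = FindingStatus.CLOSED
```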
Expanded criteria definitions
Category 1 projects are defined in NPR 7120.5E, NASA Space Flight Program and Project Management Requirements, as human space flight projects, projects with life-cycle cost exceeding $1B, or projects with significant radioactive material. 082
Category 2 projects are defined in NPR 7120.5E as projects that have life-cycle costs greater than $250M and less than $1B or have life-cycle costs less than $250M with a high priority level based on “the importance of the activity to NASA, the extent of international participation (or joint effort with other government agencies), the degree of uncertainty surrounding the application of new or untested technologies” and a Class A or Class B payload risk classification. 082
Class A payload risk classifications are defined in NPR 8705.4, Risk Classification for NASA Payloads, as payloads with high priority, very low (minimized) risk, very high national significance, very high to high complexity, greater than 5 year mission lifetime, high cost, critical launch constraints, no alternative or re-flight opportunities, and/or payloads where “all practical measures are taken to achieve minimum risk to mission success. The highest assurance standards are used.” 048
Class B payload risk classifications are defined in NPR 8705.4 as payloads with high priority, low risk, high national significance, high to medium complexity, two- to five-year mission lifetime, high to medium cost, medium launch constraints, infeasible or difficult in-flight maintenance, few or no alternative or re-flight opportunities, and/or payloads where “stringent assurance standards [are applied] with only minor compromises in application to maintain a low risk to mission success.”
Per NPR 8705.4, “The importance weighting assigned to each consideration is at the discretion of the responsible Mission Directorate.”
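Taken together, these definitions reduce the applicability question for requirement 4.2.4 to a simple decision rule. The following Python sketch encodes that rule for illustration only; the enum and function names are assumptions, and actual project categorization remains governed by NPR 7120.5 and NPR 8705.4.

```python
from enum import Enum
from typing import Optional


class ProjectCategory(Enum):
    """Project categories per NPR 7120.5."""
    CATEGORY_1 = 1
    CATEGORY_2 = 2
    CATEGORY_3 = 3


class PayloadRiskClass(Enum):
    """Payload risk classifications per NPR 8705.4."""
    A = "A"
    B = "B"
    C = "C"
    D = "D"


def architecture_review_required(
    category: ProjectCategory,
    risk_class: Optional[PayloadRiskClass],
) -> bool:
    """Requirement 4.2.4: a software architecture review is required
    for Category 1 projects, and for Category 2 projects whose payload
    risk classification is Class A or Class B."""
    if category is ProjectCategory.CATEGORY_1:
        return True
    if category is ProjectCategory.CATEGORY_2:
        return risk_class in (PayloadRiskClass.A, PayloadRiskClass.B)
    return False
```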
The Software Architecture Review Board sub-community of practice, accessible to NASA users on NASA Engineering Network (NEN), provides additional information, including guidance, checklists, and examples, for conducting software architecture reviews.
Additional guidance related to software architecture may be found in related requirements in this Handbook.
4. Small Projects
No additional guidance is available for small projects.
5. Resources
5.1 Tools
Tools to aid in compliance with this SWE, if any, may be found in the Tools Library on the NASA Engineering Network (NEN).
NASA users can find this library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community on NEN.
The list is informational only; it does not represent an “approved tool list,” nor does it represent an endorsement of any particular tool. Its purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to software architecture reviews:
- MER Spirit Flash Memory Anomaly (2004). Lesson Number 1483: "Shortly after the commencement of science activities on Mars, an MER rover lost the ability to execute any task that requested memory from the flight computer. The cause was incorrect configuration parameters in two operating system software modules that control the storage of files in system memory and flash memory. Seven recommendations cover enforcing design guidelines for COTS software, verifying assumptions about software behavior, maintaining a list of lower priority action items, testing flight software internal functions, creating a comprehensive suite of tests and automated analysis tools, providing downlinked data on system resources, and avoiding the problematic file system and complex directory structure." 557
- NASA Study of Flight Software Complexity. Lesson Number 2050: “Flight software development problems led NASA to study the factors that have led to the accelerating growth in flight software size and complexity. The March 2009 report on the NASA Study on Flight Software Complexity contains recommendations in the areas of systems engineering, software architecture, testing, and project management." 571