This wiki page is mirrored from the following attached file: NPR7150_2A_Class_A_and_Safety_Critical.docx
This file contains only the requirements associated with the title above. All notes and associated explanations have been removed. Each requirement is tagged with its NPR 7150.2A identifier (e.g., [SWE-001]).
P.2.3 This NPR shall be applied to software development, maintenance, retirement, operations, management, acquisition, and assurance activities started after its initial date of issuance [SWE-001].
1.2.1 The NASA Headquarters Office of the Chief Engineer shall lead, maintain, and fund a NASA Software Engineering Initiative to advance software engineering practices. [SWE-002]
1.2.2 Each Center shall maintain, staff, and implement a plan to continually advance its in-house software engineering capability and monitor the software engineering capability of NASA's contractors, as per NASA's Software Engineering Initiative Improvement Plan. [SWE-003]
1.2.3 The NASA Headquarters' Chief Engineer shall periodically benchmark each Center's software engineering capability against its Center Software Engineering Improvement Plan. [SWE-004]
1.2.4 Each Center shall establish, document, execute, and maintain software processes. [SWE-005]
1.2.5 To support compliance with NASA policy and facilitate the application of resources to mitigate risk, the NASA Headquarters' Chief Engineer, in coordination with the Chief, Safety and Mission Assurance, shall maintain a reliable list of the Agency's programs and projects containing software. [SWE-006]
2.2.1.1 The project shall develop software plan(s). [SWE-013]
2.2.1.2 For safety-critical software, the project shall develop a software safety plan. [SWE-130]
2.2.1.3 If a project is selected for software Independent Verification and Validation (IV&V) by the NASA Chief, Safety and Mission Assurance, the NASA IV&V program shall develop an IV&V Project Execution Plan (IPEP). [SWE-131] [If selected for IV&V]
2.2.2 The project shall implement, maintain, and execute the software plan(s). [SWE-014]
2.2.3 The project shall establish, document, and maintain at least one software cost estimate and associated cost parameter(s) that satisfies the following conditions: [SWE-015]
a. Covers the entire software life cycle.
b. Is based on selected project attributes (e.g., assessment of the size, functionality, complexity, criticality, and risk of the software processes and products).
c. Is based on the cost implications of the technology to be used and the required maturation of that technology.
2.2.4 The project shall document and maintain a software schedule that satisfies the following conditions: [SWE-016]
a. Coordinates with the overall project schedule.
b. Documents the interactions of milestones and deliverables between software, hardware, operations, and the rest of the system.
c. Reflects the critical path for the software development activities.
2.2.5 The project shall plan, track, and ensure project specific software training for project personnel. [SWE-017]
2.2.6 The project shall regularly hold reviews of software activities, status, and results with the project stakeholders and track issues to resolution. [SWE-018]
2.2.7 The project shall select and document a software development life cycle or model that includes phase transition criteria for each life-cycle phase (e.g., formal review milestones, informal reviews, software requirements review (SRR), preliminary design review (PDR), critical design review (CDR), test readiness reviews, customer acceptance or approval reviews). [SWE-019]
2.2.8.1 The project shall classify each system and subsystem containing software in accordance with the software classification definitions for Classes A, B, C, D, E, F, G, and H software in Appendix E. [SWE-020]
2.2.8.2 The project's software assurance organization shall perform an independent classification assessment. [SWE-132]
2.2.8.3 The project, in conjunction with the Safety and Mission Assurance organization, shall determine the software safety criticality in accordance with NASA-STD-8739.8. [SWE-133]
2.2.9 If a system or subsystem evolves to a higher software classification as defined in Appendix E, then the project shall update its plan to fulfill the added requirements per the Requirements Mapping Matrix in Appendix D. [SWE-021]
2.2.10 The project shall implement software assurance per NASA-STD-8739.8, NASA Software Assurance Standard. [SWE-022]
2.2.11 When a project is determined to have safety-critical software, the project shall ensure that the safety requirements of NASA-STD-8719.13, Software Safety Standard, are implemented by the project. [SWE-023]
2.2.12 When a project is determined to have safety-critical software, the project shall ensure the following items are implemented in the software: [SWE-134]
a. Safety-critical software is initialized, at first start and at restarts, to a known safe state.
b. Safety-critical software safely transitions between all predefined known states.
c. Termination performed by software of safety-critical functions is performed to a known safe state.
d. Operator overrides of safety-critical software functions require at least two independent actions by an operator.
e. Safety-critical software rejects commands received out of sequence, when execution of those commands out of sequence can cause a hazard.
f. Safety-critical software detects inadvertent memory modification and recovers to a known safe state.
g. Safety-critical software performs integrity checks on inputs and outputs to/from the software system.
h. Safety-critical software performs prerequisite checks prior to the execution of safety-critical software commands.
i. No single software event or action is allowed to initiate an identified hazard.
j. Safety-critical software responds to an off-nominal condition within the time needed to prevent a hazardous event.
k. Software provides error handling of safety-critical functions.
l. Safety-critical software has the capability to place the system into a safe state.
m. Safety-critical elements (requirements, design elements, code components, and interfaces) are uniquely identified as safety-critical.
n. Requirements are incorporated in the coding methods, standards, and/or criteria to clearly identify safety-critical code and data within source code comments.
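For illustration only, a minimal C sketch of how a few of the behaviors above might be realized; the state machine, function names, and sequence-count scheme are hypothetical, not drawn from the NPR. Items (a), (c)/(l), (d), (e), and (h) are annotated inline, and the unit is flagged as safety-critical in source comments per item (n).

```c
#include <stdbool.h>
#include <stdint.h>

/* SAFETY-CRITICAL: hypothetical actuator-enable logic (item n:
 * safety-critical code identified in source code comments). */

typedef enum { SYS_SAFE = 0, SYS_ARMED, SYS_ENABLED } sc_state_t;

static sc_state_t g_state;
static uint8_t    g_expected_seq;

/* (a) Initialize to a known safe state at first start and at restarts. */
void sc_init(void)
{
    g_state        = SYS_SAFE;
    g_expected_seq = 0;
}

/* (e) Reject commands received out of sequence. */
static bool seq_ok(uint8_t seq)
{
    if (seq != g_expected_seq)
        return false;
    g_expected_seq = (uint8_t)(seq + 1u);
    return true;
}

/* (d) First of two independent operator actions. */
bool sc_arm(uint8_t seq)
{
    if (!seq_ok(seq) || g_state != SYS_SAFE)
        return false;
    g_state = SYS_ARMED;
    return true;
}

/* (h) Prerequisite check: enable only from the ARMED state;
 * (d) this is the second, independent operator action. */
bool sc_enable(uint8_t seq)
{
    if (!seq_ok(seq) || g_state != SYS_ARMED)
        return false;
    g_state = SYS_ENABLED;
    return true;
}

/* (c)/(l) Terminate to, and place the system into, a known safe state. */
void sc_safe(void)
{
    g_state = SYS_SAFE;
}
```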
2.2.13 The project shall ensure that actual results and performance of software activities are tracked against the software plans. [SWE-024]
2.2.14 The project shall ensure that corrective actions are taken, recorded, and managed to closure when actual results and performance deviate from the software plans. [SWE-025]
2.2.15 The project shall ensure that changes to commitments (e.g., software plans) are agreed to by the affected groups and individuals. [SWE-026]
2.3.1 The project shall ensure that when a COTS, GOTS, MOTS, reused, or open source software component is to be acquired or used, the following conditions are satisfied: [SWE-027]
a. The requirements that are to be met by the software component are identified.
b. The software component includes documentation to fulfill its intended purpose (e.g., usage instructions).
c. Proprietary, usage, ownership, warranty, licensing rights, and transfer rights have been addressed.
d. Future support for the software product is planned.
e. The software component is verified and validated to the same level of confidence as would be required of the developed software component.
2.4.1 The project shall plan software verification activities, methods, environments, and criteria for the project. [SWE-028]
2.4.2 The project shall plan the software validation activities, methods, environments, and criteria for the project. [SWE-029]
2.4.3 The project shall record, address, and track to closure the results of software verification activities. [SWE-030]
2.4.4 The project shall record, address, and track to closure the results of software validation activities. [SWE-031]
2.5.1 The project shall ensure that software is acquired, developed, and maintained by an organization with a non-expired Capability Maturity Model Integration® for Development (CMMI-DEV) rating as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser as follows: [SWE-032]
For Class A software:
CMMI-DEV Maturity Level 3 Rating or higher for software, or CMMI-DEV Capability Level 3 Rating or higher in all CMMI-DEV Maturity Level 2 and 3 process areas for software.
For Class B software:
CMMI-DEV Maturity Level 2 Rating or higher for software, or CMMI-DEV Capability Level 2 Rating or higher for software in the following process areas:
a. Requirements Management.
b. Configuration Management.
c. Process and Product Quality Assurance.
d. Measurement and Analysis.
e. Project Planning.
f. Project Monitoring and Control.
g. Supplier Agreement Management (if applicable).
For Class C software:
The required CMMI-DEV Maturity Level for Class C software will be defined per Center or project requirements.
2.5.2 The project shall assess options for software acquisition versus development. [SWE-033]
2.5.3 The project shall define and document or record the acceptance criteria and conditions for the software. [SWE-034]
2.5.4 For new contracts, the project shall establish a procedure for software supplier selection, including proposal evaluation criteria. [SWE-035]
2.5.5 The project shall determine which software processes, activities, and tasks are required for the project. [SWE-036]
2.5.6 The project shall define the milestones at which the software supplier(s) progress will be reviewed and audited as a part of the acquisition activities. [SWE-037]
2.5.7 The project shall document software acquisition planning decisions. [SWE-038]
2.6.1.1 The project shall require the software supplier(s) to provide insight into software development and test activities; at a minimum, the following activities are required: monitoring integration, review of the verification adequacy, review of trade study data and results, auditing of the software development process, and participation in software reviews and systems and software technical interchange meetings. [SWE-039]
2.6.1.2 The project shall require the software supplier(s) to provide NASA with all software products and software process tracking information, in electronic format, including software development and management metrics. [SWE-040]
2.6.1.3 The project shall require the software supplier(s) to notify the project, in the response to the solicitation, as to whether open source software will be included in code developed for the project. [SWE-041]
2.6.1.4 The project shall require the software supplier(s) to provide NASA with electronic access to the source code developed for the project, including MOTS software and non-flight software (e.g., ground test software, simulations, ground analysis software, ground control software, science data processing software, and hardware manufacturing software). [SWE-042]
2.6.2.1 The project shall require the software supplier to track all software changes and non-conformances and provide the data for the project's review. [SWE-043]
2.6.2.2 The project shall require the software supplier(s) to provide software metric data as defined in the project's Software Metrics Report. [SWE-044]
2.6.2.3 The project shall participate in any joint NASA/contractor audits of the software development process and software configuration management process. [SWE-045]
2.6.2.4 The project shall require the software supplier(s) to provide a software schedule for the project's review and schedule updates as requested. [SWE-046]
2.6.2.5 The project shall require the software supplier(s) to make available, electronically, the software traceability data for the project's review. [SWE-047]
2.6.2.6 The project shall document in the solicitation the software processes, activities, and tasks to be performed by the supplier. [SWE-048]
3.1.1.1 The project shall document the software requirements. [SWE-049]
3.1.1.2 The project shall identify, develop, document, approve, and maintain software requirements based on analysis of customer and other stakeholder requirements and the operational concepts. [SWE-050]
3.1.1.3 The project shall perform software requirements analysis based on flowed-down and derived requirements from the top-level systems engineering requirements and the hardware specifications and design. [SWE-051]
3.1.1.4 The project shall perform, document, and maintain bidirectional traceability between the software requirement and the higher-level requirement. [SWE-052]
3.1.2.1 The project shall collect and manage changes to the software requirements. [SWE-053]
3.1.2.2 The project shall identify, initiate corrective actions, and track until closure inconsistencies among requirements, project plans, and software products. [SWE-054]
3.1.2.3 The project shall perform requirements validation to ensure that the software will perform as intended in the customer environment. [SWE-055]
3.2.1 The project shall document and maintain the software design. [SWE-056]
3.2.2 The project shall transform the allocated and derived requirements into a documented software architectural design. [SWE-057]
3.2.3 The project shall develop, record, and maintain a detailed design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested. [SWE-058]
3.2.4 The project shall perform and maintain bidirectional traceability between the software requirements and the software design. [SWE-059]
3.3.1 The project shall implement the software design into software code. [SWE-060]
3.3.2 The project shall ensure that software coding methods, standards, and/or criteria are adhered to and verified. [SWE-061]
3.3.3 The project shall ensure that results from static analysis tool(s) are used in verifying and validating software code. [SWE-135]
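As an illustration of the kind of defect static analysis is expected to surface (the NPR does not prescribe particular tools; cppcheck and the Clang Static Analyzer are common examples), a hypothetical C fragment containing an out-of-bounds write:

```c
#include <string.h>

/* Hypothetical fragment: "SAFE" plus its NUL terminator needs 5 bytes,
 * but the destination buffer holds only 4 -- an out-of-bounds write
 * that typical static analyzers report without running the code. */
void set_mode_name(void)
{
    char mode[4];
    strcpy(mode, "SAFE");
}
```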
3.3.4 The project shall ensure that the software code is unit tested per the plans for software testing. [SWE-062]
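A minimal sketch of a unit test per SWE-062, using only the C standard library's assert; the function under test and its limits are hypothetical stand-ins for a project's real units:

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical unit under test: a limit check used by fault protection. */
static int within_limits(double value, double lo, double hi)
{
    return value >= lo && value <= hi;
}

int main(void)
{
    /* Nominal, boundary, and off-nominal cases, as a test plan would call for. */
    assert( within_limits( 5.0, 0.0, 10.0));
    assert( within_limits( 0.0, 0.0, 10.0));  /* lower boundary */
    assert( within_limits(10.0, 0.0, 10.0));  /* upper boundary */
    assert(!within_limits(-0.1, 0.0, 10.0));
    assert(!within_limits(10.1, 0.0, 10.0));
    puts("all unit tests passed");
    return 0;
}
```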
3.3.5 The project shall provide a Software Version Description document for each software release. [SWE-063]
3.3.6 The project shall provide and maintain bidirectional traceability from software design to the software code. [SWE-064]
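One lightweight way to keep design-to-code traceability current (a convention sketched here, not a format the NPR mandates) is to tag each unit with the requirement and design identifiers it implements, so tooling can regenerate the trace matrix from the source tree; the SRS/SDD identifiers below are hypothetical:

```c
/* Traces to: SRS-0042 (software requirement), SDD-3.1.4 (design element).
 * Identifiers are hypothetical; a project script can harvest these tags
 * to regenerate the bidirectional design-to-code trace matrix. */
static double filter_rate(double raw_rate)
{
    static double state = 0.0;
    state += 0.1 * (raw_rate - state);  /* coefficient per SDD-3.1.4 */
    return state;
}
```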
3.3.7 The project shall validate and accredit software tool(s) required to develop or maintain software. [SWE-136]
3.4.1 The project shall establish and maintain: [SWE-065]
a. Software Test Plan(s).
b. Software Test Procedure(s).
c. Software Test Report(s).
3.4.2 The project shall perform software testing as defined in the Software Test Plan. [SWE-066]
3.4.3 The project shall ensure that the implementation of each software requirement is verified to the requirement. [SWE-067]
3.4.4 The project shall evaluate test results and document the evaluation. [SWE-068]
3.4.5 The project shall document defects identified during testing and track to closure. [SWE-069]
3.4.6 The project shall verify, validate, and accredit software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment. [SWE-070]
3.4.7 The project shall update Software Test Plan(s) and Software Test Procedure(s) to be consistent with software requirements. [SWE-071]
3.4.8 The project shall provide and maintain bidirectional traceability from the Software Test Procedures to the software requirements. [SWE-072]
3.4.9 The project shall ensure that the software system is validated on the targeted platform or high-fidelity simulation. [SWE-073]
3.5.1 The project shall document the software maintenance plans in a Software Maintenance Plan document. [SWE-074]
3.5.2 The project shall plan software operations, maintenance, and retirement activities. [SWE-075]
3.5.3 The project shall implement software operations, maintenance, and retirement activities as defined in the respective plans. [SWE-076]
3.5.4 The project shall complete and deliver the software product to the customer with appropriate documentation to support the operations and maintenance phase of the software's life cycle. [SWE-077]
3.5.5 The project shall deliver to the customer the as-built documentation to support the operations and maintenance phase of the software life cycle. [SWE-078]
4.1.1 The project shall develop a Software Configuration Management Plan that describes the functions, responsibilities, and authority for the implementation of software configuration management for the project. [SWE-079]
4.1.2 The project shall track and evaluate changes to software products. [SWE-080]
4.1.3 The project shall identify the software configuration items (e.g., software documents, code, data, tools, models, scripts) and their versions to be controlled for the project. [SWE-081]
4.1.4 The project shall establish and implement procedures designating the levels of control each identified configuration item must pass through; the persons or groups with authority to authorize changes and to make changes at each level; and the steps to be followed to request authorization for changes, process change requests, track changes, distribute changes, and maintain past versions. [SWE-082]
4.1.5 The project shall prepare and maintain records of the configuration status of configuration items. [SWE-083]
4.1.6 The project shall ensure that software configuration audits are performed to determine the correct version of the configuration items and verify that they conform to the documents that define them. [SWE-084]
4.1.7 The project shall establish and implement procedures for the storage, handling, delivery, release, and maintenance of deliverable software products. [SWE-085]
4.2.1 The project shall identify, analyze, plan, track, control, communicate, and document software risks in accordance with NPR 8000.4, Agency Risk Management Procedural Requirements. [SWE-086]
4.3.1 The project shall perform and report on software peer reviews/inspections for: [SWE-087]
a. Software requirements.
b. Software Test Plan.
c. Any design items that the project identified for software peer review/inspections according to the software development plans.
d. Software code as defined in the software and/or project plans.
4.3.2 The project shall perform and report on software peer reviews/inspections for: [SWE-137]
a. Software Development or Management Plan.
b. Software Configuration Management Plan.
c. Software Maintenance Plan.
d. Software Assurance Plan.
e. Software Safety Plan.
4.3.3 The project shall, for each planned software peer review/inspection: [SWE-088]
a. Use a checklist to evaluate the work products.
b. Use established readiness and completion criteria.
c. Track actions identified in the reviews until they are resolved.
d. Identify required participants.
4.3.4 The project shall, for each planned software peer review/inspection, record basic measurements. [SWE-089]
4.4.1 The project shall establish and document specific measurement objectives for their project. [SWE-090]
4.4.2 The project shall select and record the selection of specific measures in the following areas: [SWE-091]
a. Software progress tracking.
b. Software functionality.
c. Software quality.
d. Software requirements volatility.
e. Software characteristics.
4.4.3 The project shall specify and record data collection and storage procedures for their selected software measures and collect and store measures accordingly. [SWE-092]
4.4.4 The project shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures. [SWE-093]
4.4.5 The project shall report measurement analysis results periodically and allow access to measurement information by Center-defined organizational measurement programs. [SWE-094]
4.4.6 Each NASA Mission Directorate shall establish its own software measurement system to include the minimum reporting requirements in SWE-091. [SWE-095]
4.4.7 Each NASA Mission Directorate shall identify and document the specific measurement objectives, the chosen specific measures, the collection procedures, and storage and analysis procedures. [SWE-096]
4.5.1 The NASA Headquarters' Office of the Chief Engineer shall maintain an Agency-wide process asset library of applicable best practices. [SWE-098]
4.5.2 Each Center shall review the contents of the process asset library to identify those practices that may have direct applicability and value to its software activities. [SWE-099]
4.6.1 The NASA Headquarters' Office of the Chief Engineer and Center training organizations shall provide and fund training to advance software engineering practices and software acquisition. [SWE-100]
4.6.2 Each Center shall maintain and implement Software Training Plan(s) to advance its in-house software engineering capability and as a reference for its contractors. [SWE-101]
5.1.1.1 The Software Development or Management Plan shall contain: [SWE-102]
a. Project organizational structure showing authority and responsibility of each organizational unit, including external organizations (e.g., Safety and Mission Assurance, Independent Verification and Validation (IV&V), Technical Authority, NASA Engineering and Safety Center, NASA Safety Center).
b. The safety criticality and classification of each of the systems and subsystems containing software.
c. Tailoring compliance matrix for approval by the designated Engineering Technical Authority, if the project has any waivers or deviations to this NPR.
d. Engineering environment (for development, operation, or maintenance, as applicable), including test environment, library, equipment, facilities, standards, procedures, and tools.
e. Work breakdown structure of the life-cycle processes and activities, including the software products, software services, non-deliverable items to be performed, budgets, staffing, acquisition approach, physical resources, software size, and schedules associated with the tasks.
f. Management of the quality characteristics of the software products or services.
g. Management of safety, security, privacy, and other critical requirements of the software products or services.
h. Subcontractor management, including subcontractor selection and involvement between the subcontractor and the acquirer, if any.
i. Verification and validation.
j. Acquirer involvement.
k. User involvement.
l. Risk management.
m. Security policy.
n. Approval required by such means as regulations, required certifications, proprietary, usage, ownership, warranty, and licensing rights.
o. Process for scheduling, tracking, and reporting.
p. Training of personnel, including project unique software training needs.
q. Software life-cycle model, including description of software integration and hardware/software integration processes, software delivery, and maintenance.
r. Configuration management.
s. Software documentation tree.
t. Software peer review/inspection process of software work products.
u. Process for early identification of testing requirements that drive software design decisions (e.g., special system level timing requirements/checkpoint restart).
v. Software metrics.
w. Content of software documentation to be developed on the project.
x. Management, development, and testing approach for handling any commercial-off-the-shelf (COTS), government-off-the-shelf (GOTS), modified-off-the-shelf (MOTS), reused, or open source software component(s) that are included within a NASA system or subsystem.
5.1.2.1 The Software Configuration Management Plan shall contain: [SWE-103]
a. The project organization(s).
b. Responsibilities of the software configuration management organization.
c. References to the software configuration management policies and directives that apply to the project.
d. All functions and tasks required to manage the configuration of the software, including configuration identification, configuration control, status accounting, and configuration audits and reviews.
e. Schedule information, which establishes the sequence and coordination for the identified activities and for all events affecting the plan's implementation.
f. Resource information, which identifies the software tools, techniques, and equipment necessary for the implementation of the activities.
g. Plan maintenance information, which identifies the activities and responsibilities necessary to ensure continued planning during the life cycle of the project.
h. Release management and delivery.
5.1.3.1 The Software Test Plan shall include: [SWE-104]
a. Test levels (separate test effort that has its own documentation and resources, e.g., component, integration, and system testing).
b. Test types:
(1) Unit testing.
(2) Software integration testing.
(3) Systems integration testing.
(4) End-to-end testing.
(5) Acceptance testing.
(6) Regression testing.
c. Test classes (designated grouping of test cases).
d. General test conditions.
e. Test progression.
f. Data recording, reduction, and analysis.
g. Test coverage (breadth and depth) or other methods for ensuring sufficiency of testing.
h. Planned tests, including items and their identifiers.
i. Test schedules.
j. Requirements traceability (or verification matrix).
k. Qualification testing environment, site, personnel, and participating organizations.
5.1.4.1 The Software Maintenance Plan shall include: [SWE-105]
a. Plan information for the following activities:
(1) Maintenance process implementation.
(2) Problem and modification analysis.
(3) Modification implementation.
(4) Maintenance review/acceptance.
(5) Migration.
(6) Software Retirement.
(7) Software Assurance.
(8) Software Risk Assessment for all changes made during maintenance and operations.
b. Specific standards, methods, tools, actions, procedures, and responsibilities associated with the maintenance process. In addition, the following elements are included:
(1) Development and tracking of required upgrade intervals, including implementation plan.
(2) Approach for the scheduling, implementation, and tracking of software upgrades.
(3) Equipment and laboratories required for software verification and implementation.
(4) Updates to documentation for modified software components.
(5) Licensing agreements for software components.
(6) Plan for and tracking of operational backup software (e.g., backup flight software, backup to the primary operational software).
(7) Approach for the implementation of modifications to operational software (e.g., testing of software in development laboratory prior to operational use).
(8) Approach for software delivery process, including distribution to facilities and users of the software products and installation of the software in the target environment (including but not limited to, spacecraft, simulators, Mission Control Center, and ground operations facilities).
(9) Approach for providing NASA access to the software version description data (e.g., revision number, licensing agreement).
5.1.5.1 The Software Assurance Plan(s) shall be developed and documented per NASA-STD-8739.8, NASA Software Assurance Standard. [SWE-106]
5.1.6.1 The Center Software Training Plan shall include: [SWE-107] [One plan per Center]
a. Responsibilities.
5.1.7.1 The Center Software Engineering Improvement Plans shall include: [SWE-108] [One plan per Center]
a. Process improvement goal(s).
5.1.9.1 The Software Safety Plan(s) shall be developed per NASA-STD-8719.13, Software Safety Standard. [SWE-138]
5.2.1.1 The Software Requirements Specification shall contain: [SWE-109]
a. System overview.
b. CSCI requirements:
(1) Functional requirements.
(2) Required states and modes.
(3) External interface requirements.
(4) Internal interface requirements.
(5) Internal data requirements.
(6) Adaptation requirements (data used to adapt a program to a given installation site or to given conditions in its operational environment).
(7) Safety requirements.
(8) Performance and timing requirements.
(9) Security and privacy requirements.
(10) Environment requirements.
(11) Computer resource requirements:
(a) Computer hardware resource requirements, including utilization requirements.
(b) Computer software requirements.
(c) Computer communications requirements.
(12) Software quality characteristics.
(13) Design and implementation constraints.
(14) Personnel-related requirements.
(15) Training-related requirements.
(16) Logistics-related requirements.
(17) Packaging requirements.
(18) Precedence and criticality of requirements.
c. Qualification provisions (e.g., demonstration, test, analysis, inspection).
d. Bidirectional requirements traceability.
e. Requirements partitioning for phased delivery.
f. Testing requirements that drive software design decisions (e.g., special system level timing requirements/checkpoint restart).
g. Supporting requirements rationale.
5.2.2.1 The Software Data Dictionary shall include: [SWE-110]
a. Channelization data (e.g., bus mapping, vehicle wiring mapping, hardware channelization).
b. Input/Output (I/O) variables.
c. Rate group data.
d. Raw and calibrated sensor data.
e. Telemetry format/layout and data.
f. Data recorder format/layout and data.
g. Command definition (e.g., onboard, ground, test specific).
h. Effecter command information.
i. Operational limits (e.g., maximum/minimum values, launch commit criteria information).
5.2.3.1 The Software Design Description shall include: [SWE-111]
a. CSCI-wide design decisions/trade decisions.
b. CSCI architectural design.
c. CSCI decomposition and interrelationship between components:
(1) CSCI components:
(a) Description of how the software item satisfies the software requirements, including algorithms, data structures, and functional decomposition.
(b) Software item I/O description.
(c) Static/architectural relationship of the software units.
(d) Concept of execution, including data flow, control flow, and timing.
(e) Requirements, design and code traceability.
(f) CSCI's planned utilization of computer hardware resources.
(2) Rationale for software item design decisions/trade decisions including assumptions, limitations, safety and reliability related items/concerns or constraints in design documentation.
(3) Interface design.
5.2.4.1 The Interface Design Description shall include: [SWE-112]
a. Priority assigned to the interface by the interfacing entity(ies).
b. Type of interface (e.g., real-time data transfer, storage-and-retrieval of data) to be implemented.
c. Specification of individual data elements (e.g., format and data content, including bit-level descriptions of data interface) that the interfacing entity(ies) will provide, store, send, access, and receive.
d. Specification of individual data element assemblies (e.g., records, arrays, files, reports) that the interfacing entity(ies) will provide, store, send, access, and receive.
e. Specification of communication methods that the interfacing entity(ies) will use for the interface.
f. Specification of protocols the interfacing entity(ies) will use for the interface.
g. Other specifications, such as physical compatibility of the interfacing entity(ies).
h. Traceability from each interfacing entity to the system or CSCI requirements addressed by the entity's interface design, and traceability from each system or CSCI requirement to the interfacing entities that address it.
i. Interface compatibility.
j. Safety-related interface specifications and design features.
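Illustrating item (c)'s bit-level data descriptions: a hypothetical 16-bit status word encoded with explicit shifts and masks (bit-field structs have compiler-dependent layout, so explicit encoding is the portable choice). The field layout below is an assumption for this sketch, not taken from any actual interface design:

```c
#include <stdint.h>

/* Hypothetical 16-bit status word layout:
 *   bits 15..13  mode (3 bits)
 *   bit  12      armed flag
 *   bits 11..4   sequence count (8 bits)
 *   bits  3..0   spare
 */
static inline uint16_t encode_status(uint8_t mode, uint8_t armed, uint8_t seq)
{
    return (uint16_t)(((mode  & 0x7u) << 13) |
                      ((armed & 0x1u) << 12) |
                      ((uint16_t)seq  <<  4));
}
```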
5.2.5.1 The Software Change Request/Problem Report shall contain: [SWE-113]
a. Identification of the software item.
b. Description of the problem or change to enable problem resolution or justification for and the nature of the change, including assumptions/constraints and change to correct software error.
c. Originator of Software Change Request/Problem Report and originator's assessment of priority/severity.
d. Description of the corrective action taken to resolve the reported problem or analysis and evaluation of the change or problem, changed software configuration item, schedules, cost, products, or test.
e. Life-cycle phase in which problem was discovered or in which change was requested.
f. Approval or disapproval of Software Change Request/Problem Report.
g. Verification of the implementation and release of modified system.
h. Date problem discovered.
i. Status of problem.
j. Any safety-related aspects/considerations/impacts associated with the proposed change and/or identified problem.
k. Configuration of system and software when problem is identified (e.g., system/software configuration identifier or list of components and their versions).
l. Any workaround to the problem that can be used while a change is being developed or tested.
5.2.6.1 The Software Test Procedures shall contain: [SWE-114]
a. Test preparations, including hardware and software.
b. Test descriptions, including:
(1) Test identifier.
(2) System or CSCI requirements addressed by the test case.
(3) Prerequisite conditions.
(4) Test input.
(5) Instructions for conducting procedure.
(6) Expected test results, including assumptions and constraints.
(7) Criteria for evaluating results.
c. Requirements traceability.
d. Identification of test configuration.
5.2.7.1 The Software User Manual shall contain: [SWE-115]
a. Software summary, including: application, inventory, environment, organization, overview of operation, contingencies, alternate states and modes of operation, security, privacy, assistance, and problem reporting.
b. Access to the software: first-time user of the software, initiating a session, and stopping and suspending work.
c. Processing reference guide: capabilities, conventions, processing procedures, related processing, data back up, recovery from errors, malfunctions, emergencies, and messages.
d. Assumptions, limitations, and safety-related items/concerns or constraints.
e. Information that is unique or specific for each version of the software (e.g., new and modified features, new and modified interfaces).
5.2.8.1 The Software Version Description shall identify and provide: [SWE-116]
a. Full identification of the system and software (e.g., numbers, titles, abbreviations, version numbers, and release numbers).
b. Executable software (e.g., batch files, command files, data files, or other software needed to install the software on its target computer).
c. Software life-cycle data that defines the software product.
d. Archive and release data.
e. Instructions for building the executable software, including, for example, the instructions and data for compiling and linking and the procedures used for software recovery, software regeneration, testing, or modification.
f. Data integrity checks for the executable object code and source code.
g. Software product files (any files needed to install, build, operate, and maintain the software).
h. Open change requests and/or problem reports, including any workarounds.
i. Change requests and/or problem reports implemented in the current software version since the last Software Version Description was published.
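Item (f)'s data integrity checks are commonly implemented with a checksum such as CRC-32; below is a minimal bitwise CRC-32 (reflected IEEE 802.3 polynomial) in C, offered as a sketch rather than a mandated algorithm:

```c
#include <stdint.h>
#include <stddef.h>

/* Bitwise CRC-32 (reflected polynomial 0xEDB88320), computed over a
 * delivered file or image so the recipient can verify its integrity. */
uint32_t crc32(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; bit++) {
            if (crc & 1u)
                crc = (crc >> 1) ^ 0xEDB88320u;
            else
                crc >>= 1;
        }
    }
    return ~crc;
}
```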
5.3.1 Software Metrics Report.
The Software Metrics Report provides data to the project for the assessment of software cost, technical, and schedule progress. The Software Metrics Report shall contain as a minimum the following information tracked on a CSCI basis: [SWE-117]
a. Software progress tracking measures.
b. Software functionality measures.
c. Software quality measures.
d. Software requirement volatility.
e. Software characteristics.
5.3.2.1 The Software Test Report shall include: [SWE-118]
a. Overview of the test results:
(1) Overall evaluation of the software as shown by the test results.
(2) Remaining deficiencies, limitations, or constraints detected by testing (e.g., including description of the impact on software and system performance, the impact a correction would have on software and system design, and recommendations for correcting the deficiency, limitation, or constraint).
(3) Impact of test environment.
b. Detailed test results:
(1) Project-unique identifier of a test and test procedure(s).
(2) Summary of test results (e.g., including requirements verified).
(3) Problems encountered.
(4) Deviations from test cases/procedures.
c. Test log:
(1) Date(s), time(s), and location(s) of tests performed.
(2) Test environment, hardware, and software configurations used for each test.
(3) Date and time of each test-related activity, the identity of the individual(s) who performed the activity, and the identities of witnesses, as applicable.
d. Rationale for decisions.
5.3.3.1 The Software Peer Review/Inspection Report shall include: [SWE-119]
a. Identification information (including item being reviewed/inspected, review/inspection type (e.g., requirements inspection, code inspection), and review/inspection time and date).
b. Summary on total time expended on each software peer review/inspection (including total hour summary and time participants spent reviewing/inspecting the product individually).
c. Participant information (including total number of participants and participant's area of expertise).
d. Total number of defects found (including the total number of major defects, total number of minor defects, and the number of defects in each type such as accuracy, consistency, completeness).
e. Peer review/inspection results summary (i.e., pass, re-inspection required).
f. Listing of all review/inspection defects.
6.1.1 For those cases in which a Center or project desires a general exclusion from requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester shall submit a waiver for those exclusions or alternate requirements for approval by the NASA Headquarters' Chief Engineer with appropriate justification. [SWE-120]
6.1.2 Where approved, the requesting Center or project shall document the approved alternate requirement in the procedure controlling the development, acquisition, and/or deployment of the affected software. [SWE-121]
6.2.1 The designated Engineering Technical Authority(s) for requirements in this NPR, which can be waived at the Center level, shall be approved by the Center Director. [SWE-122]
6.3.1 The designated Center Engineering Technical Authority(s) for this NPR shall comply with NASA Headquarters' Office of the Chief Engineer's direction on roles and responsibilities for Engineering Technical Authority. [SWE-124]
6.3.2 Each project with software components shall maintain a compliance matrix against requirements in this NPR, including those delegated to other parties or accomplished by contract vehicles. [SWE-125]
6.3.3 The Engineering Technical Authority(s) for this NPR shall consider the following information when assessing waivers and deviations from requirements in this NPR: [SWE-126]
a. The NASA software inventory data on the project.
6.3.4 Centers and projects shall fully comply with the "shall" statements in this NPR that are marked with an "X" in Appendix D consistent with their software classification. [SWE-139]
6.3.5 When the requirement and software class are marked with a "P (Center)," Centers and projects shall meet the requirement with an approved non-null subset of the "shall" statement (or approved alternate) for that specific requirement. [SWE-140]
6.3.6 The NASA Headquarters' Office of the Chief Engineer shall review and have concurrence approval when a Center defines subsets of requirements denoted by "P (Center)" in the Requirements Mapping Matrix in Appendix D for the indicated classes of software. [SWE-127]
6.3.7 The Center-level Engineering Technical Authority shall keep records of projects' compliance matrices, waivers, and deviations against this NPR. [SWE-128]
6.3.8 The NASA Headquarters' Office of the Chief Engineer shall authorize appraisals against selected requirements in this NPR (including NASA Headquarters' Office of the Chief Engineer approved subsets and alternative sets of requirements) to check compliance. [SWE-129]