{alias:Entrance and Exit Criteria}

{tabsetup:Introduction|MCR|SRR|SwRR|MDR|SDR|PDR|CDR|PRR|SIR|TRR|SAR|ORR|FRR}

{div3:id=tabs-1}

This section identifies criteria for the entrance into and the successful completion of each of the 13 life cycle reviews from NPR 7123.1, Appendix G.

{note}The software requirements review (SwRR) is not included in 7123.1, but is often used in software projects, so it is included here. The following reviews were not included because they did not have any apparent correlation to 7150.2: Program/System Requirements Review, Program/System Definition Review, Post-Launch Assessment Review, Critical Event Readiness Review, Post-Flight Assessment Review, Decommission Review, Periodic Technical Review. (Tables G-1,2,15-19){note}

*{+}Material was also added and adapted from the following sources:+*

{refstable-topic}

{div3}
{div3:id=tabs-2}

h2. Mission Concept Review (MCR)

{info}The MCR affirms the mission need and examines the proposed mission's objectives and the concept for meeting those objectives. Key technologies are identified and assessed. It is an internal review that usually occurs at the cognizant system development organization. ROM budget and schedules are presented. (NPR 7120.5){info}
h3. !entrance.png! Entrance Criteria 

* Need for mission is clearly identified
* Concept of operations available
* Preliminary risk assessment available, including technologies and associated risk management/mitigation strategies and options
* Conceptual test and evaluation strategy available
* An MCR agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Preliminary technical plans available to achieve next phase
* Conceptual life-cycle available
* Top-level set of requirements identified to meet the mission objectives
* The mission is feasible
** A solution has been identified that is technically feasible
** A rough cost estimate is within an acceptable cost range
* Draft cost and schedule estimates are available
** As developed by SW developers and SW assurance personnel
* A technical search was done to identify existing assets / products that could satisfy the mission or parts of the mission
* Software inputs / contributions provided for:
** Preliminary Project Plan
** Preliminary SEMP
** Development and analysis of alternative concepts (showing at least one feasible)

h3. !check.png! Items Reviewed

* Mission goals and objectives
* Analysis of alternative concepts
* Preliminary development approaches and acquisition plans
* Concept of operations
* Risk assessments
* Conceptual test and evaluation strategy
* Technical plans to achieve next phase
* Conceptual life cycle
* Preliminary requirements
* Draft cost and schedule estimates
* Conceptual system design

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Technical planning is sufficient to proceed to next phase
** Risk and mitigation strategies have been identified and are acceptable based on technical risk assessments
** Cost and schedule estimates are credible
** Mission goals and objectives are clearly defined and stated, unambiguous, and internally consistent
** Conceptual system design meets mission requirements, and the various system elements are compatible
** Technology dependencies are understood, and alternative strategies for achievement of requirements are understood
* As applicable, agreement is reached that:
** Objectives are clearly understood and comprehensively defined
** Preliminary mission requirements are traceable to science objectives
** Operations concept clearly supports achievement of science objectives

{div3}
{div3:id=tabs-3}

h2. System Requirements Review (SRR)

{info}The SRR examines the functional and performance requirements defined for the system and the preliminary Program or Project Plan and ensures that the requirements and the selected concept will satisfy the mission. (NPR 7120.5)

* If not performing a SwRR, include SwRR criteria as part of SRR.
* For software-only projects, the SwRR serves as the SRR.
{info}

h3. !entrance.png! Entrance Criteria 

* Successful completion of MCR and responses made to all MCR Requests for Actions (RFAs), Review Item Discrepancies (RIDs)
* A preliminary SRR agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products required for this review made available to participants prior to SRR
* System requirements captured in format for review
* System requirements allocated to next lower level system (subsystems) -- preliminary allocation completed
* System level software functionality description completed
* System level interface requirements for software documented
* Updated concept of operations available
* Updated mission requirements available, if applicable
* Preliminary Hazards Analysis (PHA) available
* Software inputs / contributions completed for:
** Baselined Systems Engineering Management Plan (SEMP)
** Preliminary Project Plan
** System safety and mission assurance plan, including s/w classification
** Risk management plan
*** Updated risk assessment and mitigations (including Probabilistic Risk Assessment (PRA) as applicable)
** Technology Development Maturity Assessment Plan
** Logistics documentation (e.g., preliminary maintenance plan)
** Preliminary human rating plan, if applicable
** Initial document tree
* Lessons Learned
** Review of existing Lessons Learned (LL) from previous projects completed
** Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)
** Confirmation exists that Lessons Learned added to LL database

h3. !check.png! Items Reviewed

* System level requirements and preliminary allocation to next lower level system (subsystems)
* System level software functionality description
* System level interface requirements for software
* Concept of operations
* Mission requirements
* Preliminary Hazard Analysis (PHA)
* Preliminary approach for how requirements will be verified and validated down to the subsystem level
* Risk and mitigation strategies
* Acquisition strategy

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Process for allocation and control of requirements throughout all levels deemed sound; plan defined to complete the definition activity within schedule constraints
** Requirements definition is complete with respect to top-level mission and science requirements; interfaces with external entities and between major internal elements have been defined
** Preliminary allocation of system requirements to hardware, human, and software subsystems has been defined
** Requirements allocation and flow down of key driving requirements have been defined down to subsystems
** Preliminary approaches have been determined for how requirements will be verified and validated down to the subsystem level
** Major risks have been identified and technically assessed, and viable mitigation strategies have been defined
** Requirements and selected concept of operations will satisfy the mission
** System requirements, approved material solution, available product/process technology, and program resources are sufficient to proceed to the next lifecycle phase
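
Entrance and exit criteria like those above act as binary gates: the review proceeds only when every entrance item is satisfied, and closes only when the panel agrees every exit item is met. A minimal sketch of tracking such a gate (criteria text abbreviated; the data structure and names are hypothetical illustrations, not an SWEHB-mandated format):

```python
# Review gate tracking (illustrative only). Each criterion is either
# met or not; the review may start only when all entrance criteria are met.

srr_entrance = {
    "MCR complete, all RFAs/RIDs answered": True,
    "Agenda/success criteria agreed by team, PM, review chair": True,
    "System requirements captured in reviewable format": False,
}

def ready_for_review(criteria):
    """True only when every entrance criterion is satisfied."""
    return all(criteria.values())

def open_items(criteria):
    """List the criteria still blocking the review."""
    return [name for name, met in criteria.items() if not met]

# ready_for_review(srr_entrance) -> False; one open item remains
```

The same structure can be reused for exit/success criteria, with the panel's agreement recorded per item.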

{div3}
{div3:id=tabs-4}

h2. Software Requirements Review (SwRR)

{info} See definition of SRR.
\\
If not performing a SwRR, include SwRR criteria as part of SRR.
\\
For software-only projects, the SwRR serves as the SRR.{info}


h3. !entrance.png! Entrance Criteria  - General

* Successful completion of the previous review (typically SDR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
* A final SwRR agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products for this review made available to participants prior to SwRR
* Peer reviews completed: SMP, s/w requirements, V&V plans
* Preliminary concept of operations available for review

h3. !entrance.png! Entrance Criteria  - Plans

* Preliminary SDP / SMP updated for corresponding architectural design and test development activities
** Preliminary approach and acquisition strategy
** Preliminary identification of personnel (quantity; assignment duration; required skills) or reference to document with this info
** Organizational responsibilities and interfaces
** Preliminary schedules exist for all software to be developed including dependencies with other disciplines within the project
** Preliminary cost estimate
** Milestones defined / revised
** Processes and metrics defined for program success
** Management methods and controls for design & development identified
** Programming languages (if known), security requirements, operational and support concepts identified
** Training for project personnel identified
* Preliminary Software Quality Assurance Plan (SQAP) exists
* Preliminary Software Safety Plan exists (if the project has safety-critical software)
* Preliminary software V&V plan, including qualification requirements exists:
** Overall software verification strategy
** Software development and test environments, including processors, operating systems, communications equipment, simulators and their fidelity
** Test facilities, needs and capabilities
** Methodology for verifying the flowed down system requirements and acceptance criteria
** Test tool requirements and development plans
* Risk management plan updated
** Identification of risks that may impact cost, schedule, and technical goals completed
* Preliminary configuration management plan is available addressing:
** Configuration identification, change control, status accounting, and configuration audits
* IV&V plan available and IV&V assessment of s/w requirements, if reviewed

h3. !entrance.png! Entrance Criteria  - Requirements

* Preliminary allocation of system requirements to software available
* Preliminary software requirements (SRS)
** Complete for preliminary concept
** Requirements are consistent, feasible, testable, and traceable
** Test, delivery and quality requirements identified and understandable
* Functional requirements
** High-level requirements defined for each functional area
** Block diagram exists for the major software components in each functional area, their interfaces and data flows
** Relevant s/w operational modes defined (e.g., nominal, critical, contingency)
** Critical and/or controversial requirements identified, including safety-critical requirements, open issues, and areas of concern
** Requirements identified that need clarification or additional information
* Performance requirements
** Performance requirements for the software identified
** Description exists of critical timing relationships and constraints
* Software requirements and interface requirements have been analyzed and specified
* QA assessment of the requirements completed and ready for review
* Bidirectional traceability matrix
** Requirements traced to higher-level requirements
** Includes identification of verification methodology (e.g., test, demonstration, analysis, inspection)
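
A bidirectional traceability matrix of this kind can be checked mechanically: every software requirement must trace up to a known system requirement, every system requirement must have at least one child, and every requirement needs a verification method. The sketch below is a hypothetical illustration (the requirement IDs and dictionary layout are assumptions, not a prescribed format):

```python
# Minimal bidirectional-traceability check (illustrative only).
# Each software requirement traces up to a system requirement and carries a
# verification method (test, demonstration, analysis, inspection).

trace = {
    "SWE-REQ-001": {"parent": "SYS-REQ-010", "verify": "test"},
    "SWE-REQ-002": {"parent": "SYS-REQ-010", "verify": "analysis"},
    "SWE-REQ-003": {"parent": None, "verify": "test"},  # orphan: no parent
}
system_reqs = {"SYS-REQ-010", "SYS-REQ-020"}  # SYS-REQ-020 has no children

METHODS = {"test", "demonstration", "analysis", "inspection"}

def check_traceability(trace, system_reqs):
    # Downward gap: software requirements whose parent is not a system requirement.
    orphans = [r for r, t in trace.items() if t["parent"] not in system_reqs]
    # Upward gap: system requirements with no software requirement tracing to them.
    covered = {t["parent"] for t in trace.values()}
    untraced = sorted(system_reqs - covered)
    # Requirements without a recognized verification method.
    missing_verify = [r for r, t in trace.items() if t["verify"] not in METHODS]
    return orphans, untraced, missing_verify
```

Running `check_traceability(trace, system_reqs)` on the sample data flags SWE-REQ-003 as an orphan and SYS-REQ-020 as untraced.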

h3. !entrance.png! Entrance Criteria  - Design

* Preliminary high level software architecture defined
* Report of current computer resource estimates and margins (memory, bus, throughput) available for review
* Design constraints documented
* Design drivers exist:
** Explanation of design drivers and preliminary investigations made during the requirements process to determine reasonableness of the requirements, including preliminary decisions regarding software architecture, operating systems, reuse of existing software, and selection of COTS components
** Resource goals and preliminary sizing estimates (incl. timing and database storage) in the context of available hardware allocations; strategies for measuring and tracking resource utilization
** Initial Build Plan
* Review completed for technical and economic feasibility of allocation of functions at the (sub)system level to hardware, firmware, and software
* Software Interface Specifications (SISs - requirements portion) exist
* Software-related trade-off and design decisions completed and reviewed or preliminary results available, as applicable, for:
** Inherited capabilities
** New technologies
** Programming language selection
** Sizing and timing budget
** Design methods and tool selection
** Programming standards and conventions
** Database conceptual design

h3. !entrance.png! Entrance Criteria  - Analysis

* PHA/Software Assurance Classification Report (SACR), Software Safety Litmus Test available for review
* Make-buy decisions available and supported by analysis, if they exist
* Software-related analyses completed or preliminary results available, as applicable:
** Functional analyses
** Testability
** Operability
** Failure modes and effects analyses
** Reliability engineering
** Systems safety and hazards
** Life-cycle costs
** Security

h3. !check.png! Items Reviewed

* Concept of operations
* Requirements
** Preliminary system requirements allocation to software
** Software requirements (SRS)
*** Functional requirements
*** Performance requirements
** Software interface requirements
* Risk management plan
* Preliminary SW V&V plan
* Software QA Plan
* PHA, SW classification, litmus test results
* Design: constraints, strategy, trade-off decisions
* Results of technical and economic feasibility review and associated analyses
* Bidirectional traceability matrix
* IV&V plan and assessment of SW requirements
* QA assessment of requirements
* Computer resource estimates and margins
* CM plan
* SDP/SMP
* SW concept of operations
* Peer review results

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that plans and requirements are satisfactory and ready to proceed to the design phase:
** Software requirements determined to be clear, complete, consistent, feasible, traceable, testable
** SMP, software requirements, interface requirements, V&V plans are an adequate and feasible basis for architectural design activities and are approved, baselined and placed under configuration management
** Requirements and performance requirements defined, testable, and consistent with cost, schedule, risk, technology readiness, and other constraints
** System requirements, approved material solution, available product/process technology, and program resources form a satisfactory basis for proceeding into the development phase
** Milestones are verifiable and achievable
** Initial computer resource estimates are within margin limits; if not, plans for control of resource margins are deemed adequate to meet margins by PDR
* All SwRR RIDs and actions are documented with resolution plans and authorization received to proceed to software architecture design
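
The resource-margin criterion above reduces to simple arithmetic: margin is the fraction of the hardware allocation left unused by the current estimate, checked against a phase-dependent threshold. The sketch below is illustrative; the resource names, numbers, and the 50% threshold are assumed example values, not requirements from this document:

```python
# Computer resource margin check (illustrative only).

def margin(estimate, allocation):
    """Fraction of the allocation left unused by the current estimate."""
    return (allocation - estimate) / allocation

# Hypothetical current estimates vs. hardware allocations.
resources = {
    "memory_kb":      (1200, 4096),
    "bus_pct":        (30, 100),
    "throughput_ops": (700, 1000),
}
REQUIRED_MARGIN = 0.5  # assumed project threshold at this life cycle phase

# Resources whose remaining margin falls below the required threshold.
out_of_margin = {name: round(margin(est, alloc), 3)
                 for name, (est, alloc) in resources.items()
                 if margin(est, alloc) < REQUIRED_MARGIN}
```

With these sample numbers only the throughput estimate (70% of its allocation used, 30% margin) would be flagged for a margin-control plan.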

{div3}
{div3:id=tabs-5}

h2. Mission Definition Review (MDR)

{info}The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system.
\\
MDR is equivalent to SDR for robotic projects. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria 

* Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
* Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products for this review made available to participants prior to MDR
* System requirements document updated, if applicable
* Concept of operations updated, if applicable
* Mission requirements, goals, and objectives updated, if applicable
* Preliminary Software Development/Management Plan (SDP/SMP)
** Updated risk management plan, if applicable
** Updated risk assessment and mitigations (including Probabilistic Risk Assessment (PRA), as applicable)
** Updated project software classification(s)
** Cost and schedule data updated
* Preferred system solution definition exists, including major trades and options
* Logistics documentation exists (e.g., preliminary maintenance plan)
* Preliminary configuration management plan available for review
* Preliminary system safety analysis available
* System requirements traced to mission goals and objectives and to concept of operations
* Project software cost estimate exists and project has the ability to track software-related costs and assess those costs
* Preliminary Human Rating Plan exists, if applicable
* Software inputs / contributions completed for:
** Baselined Systems Engineering Management Plan (SEMP), if applicable
** Preliminary Project Plan
** System safety and mission assurance plan
** Technology Development Maturity Assessment Plan
** Updated initial document tree, if applicable
* Lessons Learned
** Review of existing Lessons Learned from previous projects completed
** Lessons Learned captured from software areas of the project (indicate the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)
** Confirmation exists that Lessons Learned added to Lessons Learned database

h3. !check.png! Items Reviewed

* System documentation, as applicable
** Architecture
** Updated system requirements document
** System level software functionality description
** System requirements traceability and preliminary allocation to software
* Preliminary system safety analysis
* Preferred system solution definition
* Mission requirements, goals, objectives, if applicable
* Concept of operations, if applicable
* SDP/SMP
* CM plan
* Acquisition strategy/plans
* Cost and schedule data
* Logistics documentation (e.g., preliminary maintenance plan)
* Initial document tree, if applicable
* Preliminary Human Rating Plan

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Overall concept is reasonable, feasible, complete, responsive to the mission requirements, and is consistent with system requirements and available resources (cost, schedule, mass, and power)
** Software design approaches and operational concepts exist and are consistent with the requirements set
** Requirements, design approaches, and conceptual design will fulfill the mission needs within the estimated costs
** Major risks have been identified and technically assessed, and viable mitigation strategies have been defined
** System level requirements are clearly and logically allocated to software

{div3}
{div3:id=tabs-6}

h2. System Definition Review (SDR)

{info} The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5)
\\
MDR is equivalent to SDR for robotic projects.{info}

h3. !entrance.png! Entrance Criteria 

* Successful completion of the previous review (typically SRR) and responses made to all SRR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
* Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products for this review made available to participants prior to SDR
* Updated concept of operations
* Preliminary system requirements allocation to software
* Software requirements (SRS)
** Complete for preliminary concept
** Consistent, feasible, testable, and traceable
** Identify test, delivery and quality requirements, and are understandable
* Functional requirements
** High-level requirements for each functional area
** Block diagram of the major software components in each functional area, their interfaces and data flows
** Definition of relevant software operational modes (e.g., nominal, critical, contingency)
** Critical and/or controversial requirements, including safety-critical requirements, open issues, and areas of concern
** Requirements needing clarification or additional information
* Traceability Matrix (bidirectional)
** Requirements to higher-level requirements
** Requirements to build and system level tests
* Performance requirements
** Performance requirements for the software
** Critical timing relationships and constraints
* Software Interface Specifications (SISs - requirements portion)
* Software requirements and interface requirements have been analyzed and specified
* Computer resource estimates and margins (memory, bus, throughput)
* Software Quality Assurance Plan (SQAP)
* Software QA organization structured with independent reporting relationship outside development group
* Updated PHA/Software Assurance Classification Report (SACR), Software Safety Litmus Test
* Review for technical and economic feasibility completed for allocation of functions at the (sub)system level to hardware, firmware, and software
* Design Constraints
* Design strategy
** Explanation of design drivers and design decisions that have been made, including software architecture, operating systems, reuse of existing software, and selection of COTS components
** Resource goals and preliminary sizing estimates (incl. timing and database storage) in the context of available hardware allocations; strategies for measuring and tracking resource utilization
** Initial Build Plan
* Risk management plan
** Risks that may impact cost, schedule and technical goals completed
* Configuration Management Plan addressing:
** Configuration identification, change control, status accounting, and configuration audits
* SDP / SMP updated for corresponding architectural design and test development activities
** Personnel identified (quantity, names, assignment duration, required skills)
** Organizational responsibilities and interfaces
** All computer programs identified, their development schedules compatible, their dependencies evident in schedules, their classifications identified/updated, and supporting resource allocations made
** Updated cost estimate
** Milestones are verifiable and achievable
** Schedules for development of all computer programs, and procedures for monitoring and reporting their status
** Processes and metrics for program success
** Management methods and controls for design & development
** Programming languages, security requirements, operational and support concepts identified
** Preliminary high level software architecture
* Qualification requirements
** Overall software test strategy, including the test levels (unit, integration, build, and system-level testing), test types (interface, load/stress, regression), and test tools
** Software development and test environments, including processors, operating systems, communications equipment, simulators and their fidelity
** Test facilities, needs and capabilities
** Methodology for verifying the system requirements and acceptance criteria
** Test tool requirements and development plans
* Preliminary software V&V plan
* Peer reviews completed: SMP, s/w requirements, V&V plans, preliminary s/w system architectural design (if identified for peer review/inspection in s/w development plans)
* Make-buy decisions supported by analysis
* Analyses completed, as applicable:
** Functional analyses
** Testability
** Operability
** Failure modes and effects analyses
** Reliability engineering
** Systems safety and hazards
** Life-cycle costs
** Security
* Software-related trade-off and design decisions completed and reviewed, as applicable, for:
** Inherited capabilities
** New technologies
** Programming language selection
** Sizing and timing budget
** Design methods and tool selection
** Programming standards and conventions
** Database conceptual design
* IV&V plan discussion and IV&V assessment of software requirements, if reviewed

h3. !exit.png! Exit/Success Criteria

* Software requirements determined to be clear, complete, consistent, feasible, traceable, testable
* SMP, software requirements, interface requirements, V&V plans are adequate and feasible basis for architectural design activities and are approved, baselined and placed under configuration management
* Requirements and performance requirements defined, testable, and consistent with cost, schedule, risk, technology readiness, and other constraints
* System requirements, approved material solution, available product/process technology, and program resources form a satisfactory basis for proceeding into the development phase
* All SwRR RIDs and actions are documented with resolution plans and authorization received to proceed to software architecture design
* Milestones are verifiable and achievable
* Initial computer resource estimate within margin limits; if not, plans for control of resource margins is deemed adequate to meet margins by PDR
{div3}{div3:id=tabs-5}

h2. {anchor:_Toc286592511}Mission Definition Review (MDR)

h3. !entrance.png! Entrance Criteria

* Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
* Preliminary agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products made available to participants prior to SDR (noted in this list)
* {color:#000000}System architecture{color}\*\* {color:#000000}Flow down to all functional elements of the system{color}
* {color:#000000}Updated system requirements document, if applicable{color}
* {color:#000000}System software functionality description{color}
* {color:#000000}Updated concept of operations, if applicable{color}
* {color:#000000}Updated mission requirements, goals, and objectives, if applicable{color}
* Software Development/Management Plan (SDP/SMP)
** Updated risk management plan, if applicable
** Updated risk assessment and mitigations (including Probabilistic Risk Assessment  (PRA), as applicable)
** Project Software Classification(s)
* Technology Development Maturity Assessment Plan
* Preferred system solution definition, including major trades and options
* Updated cost and schedule data
* Logistics documentation (e.g., preliminary maintenance plan)
* Configuration management plan
* Updated initial document tree, if applicable
* Preliminary system safety analysis
* Other specialty disciplines, as required
* Traceability of system requirements to mission goals and objectives and to concept of operations
* Preliminary System requirements allocation to the next lower level system
* Project software cost estimate, and how software related expenditures will be collected and reported by life cycle phases
* Preliminary Human Rating Plan, if applicable
* Software inputs / contributions to
** Systems Engineering Management Plan (SEMP), if applicable
** Project Plan
** System safety and mission assurance plan

h3. !exit.png! Exit/Success Criteria

* Overall concept is reasonable, feasible, complete, responsive to the mission requirements, and is consistent with system requirements and available resources (cost, schedule, mass, and power)
* Software design approaches and operational concepts exist and are consistent with the requirements set
* Requirements, design approaches, and conceptual design will fulfill the mission needs within the estimated costs
* Major risks identified and technically assessed, and viable mitigation strategies defined
* System level requirements are clearly and logically allocated to software
{div3}{div3:id=tabs-6}

h2. {anchor:_Toc286592512}System Definition Review (SDR)


h3. !entrance.png! Entrance Criteria

* Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
* Preliminary agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products made available to participants prior to SDR (noted in this list)
* System architecture, including software
* Preferred software solution definition, including major tradeoffs and options
* Updated baselined documentation, as required
* Preliminary functional baseline (with supporting trade-off analyses and data)
* Preliminary system software functional requirements
* Updated risk management plan (could be part of SDP/SMP)
** Updated software risk assessment and mitigations (including Probabilistic Risk Assessment (PRA), as applicable)
* Updated SDP/SMP
** Updated technology development, maturity, and assessment plan
** Updated cost and schedule data
** Work Breakdown Structure
* Updated logistics documentation
* Software verification and validation plan
* Software requirements document(s)
* Interface requirements documents (including software)
* Technical resource utilization estimates and margins
* Updated preliminary software safety analysis
* Project Software Data Dictionary
* Project Software Configuration Management Plan
* Project Software Assurance Plan
* Project Software Maintenance Plan
* Software inputs / contributions to
** Systems Engineering Management Plan (SEMP) changes, if any
** Based on system complexity, updated human rating plan
** Flow down of system requirements to all software functional elements of the system


\\

h3. !exit.png! Exit/Success Criteria

* Software requirements, including mission success criteria and any sponsor-imposed constraints, are defined and form the basis for the proposed conceptual design
* All software technical requirements are allocated and the flow down to subsystems is adequate; requirements, design approaches, and conceptual design will fulfill the mission needs consistent with the available resources (cost, schedule, throughput, and sizing)
* Requirements process is sound and can reasonably be expected to continue to identify and flow detailed requirements in a manner timely for development
* Technical approach is credible and responsive to the identified requirements
* Technical plans have been updated, as necessary
* Tradeoffs are completed, and those planned for Phase B adequately address the option space
* Significant development, mission, and safety risks are identified and technically assessed, and a process and resources exist to manage the risks
* Adequate planning exists for the development of any enabling new technology
* Operations concept is consistent with proposed design concept(s) and in alignment with the mission requirements
* All allocated requirements are verifiable and traceable to their corresponding system level requirement
* Preliminary verification approaches are agreed upon
* Requisite level of detail and resources are available to support the acquisition and development plan within existing constraints
* A software system is defined that satisfies all of the system requirements assigned to software
* All of these software system requirements are traceable to mission objectives, the concept of operations, or interface requirements
* Monitoring processes/practices are in place to create the software system within planned technical, schedule, cost, effort, and quality capabilities
{div3}{div3:id=tabs-7}

h2. {anchor:_Toc286592513}Preliminary Design Review (PDR)


h3. !entrance.png! Entrance Criteria

* Successful completion of the SDR or MDR and responses made to all SDR or MDR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
* Preliminary agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products made available to participants prior to PDR (noted in this list)
* Updated baselined documentation, as required
* Updated technology development maturity assessment plan
* Updated risk assessment and mitigation
* Updated cost and schedule data
* Updated logistics documentation, as required
* Applicable technical plans (e.g., technical performance measurement plan, payload-to-carrier integration plan, producibility / manufacturability program plan, reliability program plan, quality assurance plan)
* Applicable standards
* Interface control documents
* Software V&V Plan
* Technical resource utilization estimates and margins
** Storage and memory resource allocations developed, assigning those resources to each software segment in the architecture
* Updated SDP/SMP
** Work Breakdown Structure
* Preliminary Traceability Matrix to CSCI level, including V&V trace
** Safety-critical requirements highlighted
** Requirements allocated to components of the architecture (to CSCI level)
* SDD and Traceability Matrix review by test team completed and SDD updated as needed
* SMP updated for the corresponding detailed design activities
* Software inputs or contributions to the updated Project Plan
* Supplier documentation
** Software Data Dictionary(s)
** Software Classification(s)
** Software Development or Management Plan(s) \[with V&V separate\]
** Software Configuration Management Plan(s)
** Software Assurance Plan(s)
** Software Maintenance Plan(s)
* Revised SRS
** Software requirements to CSCI level
** Subsystem and lower-level technical requirements
** Requirements for reuse of existing software, reuse analysis
** Performance requirements, including memory, bus, CPU requirements
** Quality requirements, e.g., reliability, usability, or maintainability requirements
** Safety requirements
** Security requirements
** Derived requirements
* Revised Operational Concepts, as applicable
** Normal operations scenarios
** Fault detection, isolation and recovery (FDIR) strategy
** Hazard reduction strategies
* Lessons Learned
** Review of existing Lessons Learned from previous projects
** Lessons Learned captured from software areas of the project; indicate the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects
** Confirmation that Lessons Learned added to Lessons Learned database
* Trade studies
** Addressing COTS, reuse, etc.
** Trade-off analysis and data supporting design, as required
** Documented Alternative Design Solutions and Selection Criteria
* Documented Solutions, Analysis, Decision, and Rationale
** Inherited capabilities identified and compatible with the designs
* Preliminary Software Design Document (SDD)
** Subsystem design specifications for each configuration item (h/w and s/w)
** Completed definition of the software architecture and preliminary database design description, as applicable
** External interfaces and end-to-end data flow
** Design drivers (e.g., performance, reliability, usability, hardware considerations)
** Overview of software architecture, including context diagram
** List of subsystems, tasks, or major components -- e.g., user interface, database, task management
** Functional allocations, descriptions of major modules, and internal interfaces
** Safety considerations in the design elements and interfaces
** Design verification approach, e.g., prototyping, inspection, peer review
** Architectural design verified via operational scenarios to include required functionality, operating modes, and states
* Safety analyses and plans
** Matrix showing each subsystem/task/component's software classification (per NPR 7150.2), its safety classification (per NASA-STD-8719.13B), the rationale for the classifications, and the status of the classifications' approval by Software Assurance and management
** Updated PHA, Software Safety Litmus Test, if necessary
** Approved SMP/ PHA/Software Assurance Classification Report (SACR)
* Analyses completed:
** Partitioning analysis (modularity)
** Executive control and Start/Recovery
** Control and Data flow analysis
** Operability
** Failure modes and effects analyses
* Results of prototyping factored into architectural design
* Prototype software, if necessary
* Critical components identified and trial coding scheduled
* Human engineering aspects of design addressed with solutions acceptable to potential users
* Developmental tools and facility requirements identified and plans made and actions taken to ensure their availability when needed
* Test tools and facility requirements identified with plans and actions to ensure their availability when needed
* Test group involved in requirements and design analysis
* Security and supportability requirements factored into the design
* Metrics established and gathered to measure software development progress
* Procedures and tools developed for mechanizing management and configuration management plans
* Configuration Control Board established for software (and change control procedures working)
* Configuration management system understood by those who must use it
* Library established for storing, controlling and distributing software products; library procedures understood and working
* Independent software quality assurance group formed and contributing as a team member to the design and test activities
* Interdisciplinary teams working design issues that cross (sub)system component boundaries (software, hardware, etc.)
* Peer reviews completed: SRS, software architectural design (if identified for s/w peer review/inspection in s/w development plans), integration test plans
* Status of change requests

h3. !exit.png! Exit/Success Criteria

* Top-level requirements including mission success criteria, Technical Performance Measures (TPMs), and any sponsor-imposed constraints are agreed upon, finalized, stated clearly, and consistent with preliminary design
* Flow down of verifiable requirements is complete and proper or, if not, an adequate plan exists for timely resolution of open items; requirements are traceable to mission goals and objectives
** All supplier software requirements are verifiable
* Preliminary design is expected to meet the functional and performance requirements at an acceptable level of risk
* Definition of technical interfaces is consistent with overall technical maturity and provides an acceptable level of risk
* Adequate technical interfaces are consistent with the overall technical maturity and provide an acceptable level of risk
* Adequate technical margins exist with respect to TPMs
* Any required new technology has been developed to an adequate state of readiness, or back-up options exist and are supported to make them a viable alternative
* Project risks are understood and credibly assessed; plans, process, and resources exist to effectively manage them
* Operational concept is technically sound, includes (where appropriate) human factors, and includes flow down of requirements for its execution
* All RIDs/actions are completed and customer approval to proceed to detailed design phase
* Proposed design approach has sufficient maturity to proceed to final design
** Subsystem requirements, subsystem preliminary design, results of peer reviews, and plans for development, testing and evaluation form a satisfactory basis for proceeding into detailed design and test procedure development
* SMP, the software architectural design, and integration test plans are adequate and feasible to support software detailed design
* Products (listed above) are approved, baselined and placed under configuration management
* Software inputs / contributions to
** Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) adequately addressed in preliminary designs and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis, and failure modes and effects analysis) have been approved
** Management processes used by the mission team are sufficient to develop and operate the mission
** Cost estimates and schedules indicate that the mission will be ready to launch and operate on time and within budget, and that the control processes are adequate to ensure remaining within allocated resources


{div3}{div3:id=tabs-8}

h2. {anchor:_Toc286592514}Critical Design Review (CDR)


h3. !entrance.png! Entrance Criteria

* Successful completion of the previous review (typically PDR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
* Preliminary agenda, success criteria, and charge to the board have been agreed to by technical team, project manager, and review chair
* Technical products made available to participants prior to CDR (noted in this list)
* Updated baselined documents, as required
* Technical data package (e.g., integrated schematics, Spares provisioning list, interface control documents, engineering analyses, and specifications)
* Updated Technology Development Maturity Assessment Plan
* SMP updated for implementation and unit test activities
** Updated Work Breakdown Structure
** Updated cost and schedule data
* Progress against software management plans
* Plan for milestone and peer reviews, walkthroughs, and external reviews
* Documentation plan, including each document's status and when it will be baselined
* Software requirements, the management process (including documents used and produced), and the V&V plan are baselined
* Preliminary NPR 7150.2 compliance matrix
* Design process, including methodology and standards used, design documentation produced, inspections and reviews
* Implementation process, including standards, review process, problem reporting, unit test, and integration
* Management procedures and tools for measuring and reporting progress available and working
* Software measurements of planned versus actual product size, cost, schedule, effort, and defects
* Procedures established and working for software quality assurance; quality is an integral part of the product being produced
* Updated logistics documentation
* Staffing-up problems being addressed and contingency plans in place
* IT Security Requirements (Mission-specific)
* Software design document(s) (including interface design documents, detailed design and unit test)
* Command and telemetry list
* Final Design Solution, Evaluation, and Rationale
** Documented Make, Buy, and/or Reuse, Analysis, Criteria, and Rationale
** Reused/heritage software or functionality from previous projects; necessary modifications
* Final Architecture Definition
** System design diagram (e.g., Level 0 data flow diagram or UML)
** For each task in the system design diagram:
*** Design diagrams for the task
*** Description of functionality and operational modes
*** Safety considerations addressed in the design
*** Resource and utilization constraints (e.g., CPU, memory); how the software will adapt to changing margin constraints; performance estimates
*** Data storage concepts and structures
* Preferred software solution defined, including major tradeoffs and options
* Baselined documentation updated, as required
* Preliminary functional baseline (with supporting trade-off analyses and data) available
* Preliminary system software functional requirements available
* As applicable, risk management plan updated (could be part of SDP/SMP)
** Updated software risk assessment and mitigations (including Probabilistic Risk Assessment (PRA), as applicable)
* As applicable, SDP/SMP updated
** Updated technology development, maturity, and assessment plan
** Updated cost and schedule data
** Work Breakdown Structure
* Preliminary software safety analysis available for review
* Project software data dictionary available
* Preliminary project software assurance plan available
* As applicable, updated project software maintenance plan available
* Software inputs / contributions completed for:
** Systems Engineering Management Plan (SEMP) changes, if any
** Based on system complexity, updated human rating plan
** Flow down of system requirements to all software functional elements of the system
** Requirements process
** Technical approach

h3. !check.png! Items Reviewed

* System architecture, including software
* Preferred software solution with tradeoffs and options
* Preliminary functional baseline
* Preliminary system software functional requirements
* Risk management plan, as applicable
* SDP/SMP, as applicable
* Preliminary software V&V plan, as applicable
* Software requirements documents
* Interface requirements documents, including SW
* Technical resource utilization estimates and margins
* Software safety analysis
* Software data dictionary
* Software CM plan
* Software QA plan

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Software requirements, including mission success criteria and any sponsor-imposed constraints, are defined and form the basis for the proposed conceptual design
** System level requirements are flowed down to software
** All software technical requirements are allocated and flow down to subsystems is adequate; they are verifiable, and traceable to their corresponding system level requirement; requirements, design approaches, and conceptual design will fulfill the mission needs consistent with the available resources (cost, schedule, throughput, and sizing)
** Technical plans have been updated, as necessary, including risk management plan, SDP/SMP, V&V plan, software maintenance plan, QA plan, CM plan
* Review panel agrees that:
** Tradeoffs are completed, and those planned for Phase B adequately address the option space
** Adequate planning exists for the development of any enabling new technology
** Significant development, mission, and safety risks are identified and technically assessed, and a process and resources exist to manage the risks
** Operations concept is consistent with proposed design concept(s) and in alignment with the mission requirements
** Requisite level of detail and resources are available to support the acquisition and development plan within existing constraints
* Review panel agrees that:
** All of these software subsystem requirements are traceable to mission objectives, the concept of operations, or interface requirements
** Monitoring processes/practices are in place to create the software subsystem within planned technical, schedule, cost, effort, and quality capabilities
* Preliminary verification approaches are agreed upon

{div3}
{div3:id=tabs-7}

h2. Preliminary Design Review (PDR)

{info}The PDR demonstrates that the preliminary design meets all system requirements with acceptable risk and within the cost and schedule constraints and establishes the basis for proceeding with detailed design. It shows that the correct design option has been selected, interfaces have been identified, and verification methods have been described. Full baseline cost and schedules, as well as risk assessments, management systems, and metrics are presented. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria - General

* Successful completion of the SDR or MDR and responses made to all SDR or MDR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
* Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
* Technical products for this review made available to participants prior to PDR
* Baselined documentation updated, as required
* Risk assessment and mitigation updated
* Cost and schedule data baselined
* Peer reviews completed: SRS, software architectural design (if identified for SW peer review/inspection in SW development plans), integration test plans
* Lessons Learned
** Review of existing Lessons Learned (LL) from previous projects completed
** Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)
** Confirmation exists that Lessons Learned added to LL database

h3. !entrance.png! Entrance Criteria - Plans

* Applicable technical plans (e.g., technical performance measurement plan, payload-to-carrier integration plan, producibility / manufacturability program plan, reliability program plan, baselined quality assurance plan) available
* SDP/SMP baselined
** Work Breakdown Structure
** For the corresponding detailed design activities
* Metrics established and gathered to measure software development progress
* Logistics documentation (e.g., maintenance plan) updated, as required
* Software inputs or contributions completed for:
** Updated Project Plan
** Updated Technology Development Maturity Assessment Plan
* Configuration management plan baselined
* Configuration Control Board established for software (and change control procedures working)
* Configuration management system understood by those who must use it
* Procedures and tools developed for mechanizing management and configuration management plans
* Supplier documentation available for review:
** Software Data Dictionary(s)
** Software Classification(s)
** SDP/SMP \[with V&V separate\]
** Software configuration management plan(s)
** Software assurance plan(s)
** Software maintenance plan(s)
* Test tools and facility requirements identified with plans and actions to ensure their availability when needed
* Development environment ready (e.g., h/w diagram, operating system(s), compilers, DBMS, tools)
** Developmental tools and facility requirements identified and plans made and actions taken to ensure their availability when needed
** If relevant, new compiler validated and producing acceptable object code for the target machine
** Tools needed for software implementation completed, qualified, installed and accepted, and team trained in their use
** Facilities for software implementation in place, operating, ready for use
* Library established for storing, controlling and distributing software products; library procedures understood and working
* Independent software quality assurance group formed and contributing as a team member to the design and test activities

h3. !entrance.png! Entrance Criteria - Requirements

* Preliminary traceability matrix to CSCI level exists, including V&V trace
** Safety-critical requirements highlighted
** Requirements allocated to components of the architecture (to CSCI level)
* SRS baselined:
** Software requirements to CSCI level
** Subsystem and lower-level technical requirements
** Requirements for reuse of existing software, reuse analysis
** Performance requirements, including memory, bus, CPU requirements
** Quality requirements, e.g., reliability, usability, or maintainability requirements
** Safety requirements
** Security requirements
** Derived requirements
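The bidirectional trace called for above (system requirements down to CSCI-level software requirements, and software requirements out to V&V cases, with safety-critical items highlighted) can be mechanized with a short script. The sketch below is illustrative only: the requirement IDs and the `sys_to_sw`/`sw_to_vv` structures are invented, not mandated by NPR 7150.2.

```python
# Minimal sketch of a bidirectional traceability check (illustrative data).
# Flags software requirements with no parent system requirement and
# requirements (including safety-critical ones) with no V&V trace.

sys_to_sw = {            # system requirement -> allocated software requirements
    "SYS-001": ["SWE-001", "SWE-002"],
    "SYS-002": ["SWE-003"],
}
sw_to_vv = {             # software requirement -> verification/validation cases
    "SWE-001": ["TC-010"],
    "SWE-002": [],       # gap: no V&V trace yet
    "SWE-003": ["TC-020"],
}
safety_critical = {"SWE-002"}   # highlighted per the entrance criteria

def trace_gaps(sys_to_sw, sw_to_vv, safety_critical):
    allocated = {sw for sws in sys_to_sw.values() for sw in sws}
    orphans = sorted(set(sw_to_vv) - allocated)      # no parent system req
    unverified = sorted(sw for sw, vv in sw_to_vv.items() if not vv)
    unverified_safety = sorted(set(unverified) & safety_critical)
    return orphans, unverified, unverified_safety

orphans, unverified, unverified_safety = trace_gaps(
    sys_to_sw, sw_to_vv, safety_critical)
print("Orphan software requirements:", orphans)
print("Requirements lacking V&V trace:", unverified)
print("Safety-critical without V&V:", unverified_safety)
```

In practice the same checks would run against the project's requirements-management tool export rather than hand-built dictionaries.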

h3. !entrance.png! Entrance Criteria - Design

* Applicable standards available to the review team
* Preliminary interface control documents available for review
* Technical resource utilization estimates and margins ready for review
** Storage and memory resource allocations developed, assigning those resources to each software segment in the architecture
* Solutions, analysis, decision, and rationale documented
** Inherited capabilities identified and compatible with the designs
* Security and supportability requirements factored into the design
* Trade studies completed
** Addressing COTS, reuse, etc.
** Trade-off analysis and data supporting design, as required
** Alternative design solutions and selection criteria
* Results of prototyping factored into architectural design
* Preliminary Software Design Document (SDD) created:
** Completed definition of the software architecture and preliminary database design description, as applicable
** External interfaces and end-to-end data flow
** Design drivers (e.g., performance, reliability, usability, hardware considerations)
** Overview of software architecture, including context diagram
** List of subsystems, tasks, or major components -- e.g., user interface, database, task management
** Functional allocations, descriptions of major modules, and internal interfaces
** Safety considerations in the design elements and interfaces
** Design verification approach, e.g., prototyping, inspection, peer review
** Architectural design (baselined) verified via operational scenarios to include required functionality, operating modes, and states
* Results available from evaluations of prototype software, if necessary to evaluate design
* Human engineering aspects of design addressed with solutions acceptable to potential users
* SDD and traceability matrix review by test team completed and SDD updated as needed
* Critical components identified and trial coding scheduled
* Confirmation exists that
** The test group participated in requirements and design analysis
** Interdisciplinary teams are working design issues that cross (sub)system component boundaries (software, hardware, etc.)
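The storage/memory allocation item in the list above lends itself to a simple margin computation per software segment. A minimal sketch: the segment names, budget numbers, and the 20% reserve threshold are all assumptions for illustration, not values from any NASA standard.

```python
# Illustrative memory-margin check per software segment (all numbers invented).
# Margin = (allocation - estimate) / allocation; flag segments below a threshold.

allocations_kib = {"exec": 512, "gnc": 1024, "telemetry": 256}   # budgeted
estimates_kib   = {"exec": 400, "gnc": 980,  "telemetry": 128}   # current estimate

def margins(allocations, estimates):
    return {seg: (allocations[seg] - estimates[seg]) / allocations[seg]
            for seg in allocations}

def below_threshold(margin_by_seg, threshold=0.20):   # assumed 20% reserve
    return sorted(seg for seg, m in margin_by_seg.items() if m < threshold)

m = margins(allocations_kib, estimates_kib)
print({seg: f"{v:.0%}" for seg, v in m.items()})
print("Segments under margin:", below_threshold(m))
```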

h3. !entrance.png! Entrance Criteria - Analysis

* Safety analyses and plans baselined:
** Matrix showing each subsystem/task/component's software classification (per NPR 7150.2A), its safety classification (per NASA-STD-8719.13B), the rationale for the classifications, and the status of the classifications' approval by Software Assurance and management
** Updated PHA, Software Safety Litmus Test, if necessary
** Approved SMP/ PHA/Software Assurance Classification Report (SACR)
* Analyses completed:
** Partitioning analysis (modularity)
** Executive control and Start/Recovery
** Control and Data flow analysis
** Operability
** Preliminary failure modes and effects analyses
* Operational Concepts revised, as applicable, and baselined
** Normal operations scenarios
** Fault detection, isolation and recovery (FDIR) strategy
** Hazard reduction strategies
* Status of change requests available for review
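The classification matrix described above (software class per NPR 7150.2, safety class per NASA-STD-8719.13, rationale, and approval status) can be kept in any tabular form; one sketch in Python follows. The component names, classes, and statuses here are hypothetical examples, not real project data.

```python
# Illustrative software classification/safety matrix with an approval check.
# Fields mirror the entrance-criteria matrix: software class, safety class,
# rationale, and approval status (all values here are invented examples).

matrix = [
    {"component": "fdir",      "sw_class": "A", "safety": "safety-critical",
     "rationale": "controls hazard responses", "approved": True},
    {"component": "telemetry", "sw_class": "C", "safety": "not safety-critical",
     "rationale": "ground display only",       "approved": False},
]

def unapproved(matrix):
    """Return components whose classifications still await SA/management approval."""
    return [row["component"] for row in matrix if not row["approved"]]

print("Awaiting approval:", unapproved(matrix))
```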

h3. !check.png! Items Reviewed

* Risk assessment and mitigation
* Safety analysis and plans
* Cost and schedule data
* Logistics documentation (e.g., maintenance plan)
* Technical plans (e.g., QA plan, performance measurement plan)
* Interface control documents
* Software V&V plan
* Resource utilization estimates and margins
* SDP/SMP
* Bidirectional traceability matrix
* Software design documents
* Supplier documentation
* Requirements documents
* Concept of operations
* Trade studies
* Documented solutions, analysis, decisions and rationale
* Completed analyses
* Prototype software, if applicable
* Plans for development and test tools and facilities
* Software development progress metrics
* CM plan
* Peer review results / proof of completion
* Status of change requests

h3. !exit.png! Exit/Success Criteria

* Top-level requirements including mission success criteria, Technical Performance Measures (TPMs), and any sponsor-imposed constraints are agreed upon, finalized, stated clearly, and consistent with preliminary design
* Review panel agrees that:
** Flow down of verifiable requirements is complete and proper or, if not, an adequate plan exists for timely resolution of open items; requirements are traceable to mission goals and objectives
** All supplier software requirements are verifiable
** Preliminary design is expected to meet the functional and performance requirements at an acceptable level of risk
** Definition of technical interfaces is consistent with overall technical maturity and provides an acceptable level of risk
** Adequate technical interfaces are consistent with the overall technical maturity and provide an acceptable level of risk
** Adequate technical margins exist with respect to TPMs
* Review panel agrees that:
** Any required new technology has been developed to an adequate state of readiness, or back-up options exist and are supported to make them a viable alternative
** Project risks are understood and credibly assessed; plans, process, and resources exist to effectively manage them
** Operational concept is technically sound, includes (where appropriate) human factors, and includes flow down of requirements for its execution
** Proposed design approach has sufficient maturity to proceed to final design
** Subsystem requirements, subsystem preliminary design, results of peer reviews, and plans for development, testing and evaluation form a satisfactory basis for proceeding into detailed design and test procedure development
** SMP, the software architectural design, and integration test plans adequate and feasible to support software detailed design
* All RIDs/actions are completed or have closure plans and customer approval received to proceed to detailed design phase
* Products from this review are approved, baselined and placed under configuration management
* Approval received for software inputs / contributions:
** Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) is adequately addressed in preliminary designs and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis, and failure modes and effects analysis)
** Management processes used by the mission team are sufficient to develop and operate the mission
** Cost estimates and schedules indicate that the mission will be ready to launch and operate on time and within budget, and that the control processes are adequate to ensure remaining within allocated resources

{div3}
{div3:id=tabs-8}

h2. Critical Design Review (CDR)

{info}The CDR demonstrates that the maturity of the design is appropriate to support proceeding with full scale fabrication, assembly, integration, and test, and that the technical effort is on track to complete the flight and ground system development and mission operations in order to meet mission performance requirements within the identified cost and schedule constraints. Progress against management plans, budget, and schedule, as well as risks assessments are presented. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria - General

* Successful completion of the previous review (typically PDR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
* Final agenda, success criteria, and charge to the board have been agreed to by technical team, project manager, and review chair
* Technical products for this review made available to participants prior to CDR
* Baselined documents updated, as required
* Peer reviews for software and rework accomplished, as defined in the s/w and/or project plans
* NPR 7150.2 compliance matrix baselined
* Lessons Learned
** Review of existing Lessons Learned (LL) from previous projects completed
** Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)
** Confirmation exists that Lessons Learned added to LL database

h3. !entrance.png! Entrance Criteria  - Plans

* SDP/SMP updated for implementation and unit test activities
** Updated Work Breakdown Structure
** Updated cost and schedule data
* Progress against software management plans available for review
* Plan exists for milestone and peer reviews, walkthroughs, and external reviews
* Documentation plan exists, including each document's status and when it will be baselined
* Software requirements, management process, including documents used and produced, and V&V plan updated and baselined
* Logistics documentation (e.g., maintenance plan) updated
* Staffing-up problems being addressed, contingency plans in place
* Independent verification and validation (IV&V) plans and status available for review
* Management procedures and tools for measuring and reporting progress available and working
* Software measurements available for review (planned versus actual for product size, cost, schedule, effort, and defects)
* Procedures established and working for software quality assurance, with quality an integral part of the product being produced
* Implementation process exists, incl. standards, review process, problem reporting, unit test, integration
* Supplier documentation available for review:
** Software Design Description(s)
** Interface Design Description(s)
** Updated Supplier Software Verification and Validation Plan(s)
** Preliminary Supplier Software Test Procedure(s)
* Systems and subsystem certification plans and requirements exist (as needed)
* Changes since PDR available for review:
** Updated product assurance and software safety plans and activities
** System safety analysis with associated verifications
* Status of configuration management processes since PDR available, including discrepancy reporting and tracking (development and post-release)
* Build plan exists
* Build test timeline and ordered list of components and requirements to be tested in each build ready for review
* Test group trained and prepared to evaluate the code using their facilities and tools
* Software is ready for testing / activation
* Coding, integration, and test plans and procedures available for review
** Test plans baselined
** Preliminary integration and test procedures
** Test team roles, functions, support required are defined
** Test levels described (e.g., unit testing, integration testing, software system testing) -- description, who executes, test environment, standards followed, verification methodologies
** Testing preparation and execution activities planned, incl. testing of reused/heritage software, if applicable
** Test environments described for each test level \-- diagram and description of tools, testbeds, facilities
* Plans available for review:
** Launch site operations plan
** Checkout and activation plan
** Disposal plan (including decommissioning or termination)
* Delivery, installation, maintenance processes planned
* System and acceptance testing defined -- operational scenarios to be tested, including stress tests and recovery testing, if applicable
* Acceptance process exists -- reviews (e.g., Acceptance Test Readiness Review, Acceptance Test Review), approval, and signoff processes
* Acceptance criteria baselined
* Software inputs / contributions completed for:
** Updated Project Plan
** Updated technology development maturity assessment plan
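The measurement criterion above (planned versus actual for size, cost, schedule, effort, and defects) can be illustrated with a minimal sketch. The metric names, numbers, and the 10% reporting threshold below are hypothetical examples, not values from NPR 7150.2 or any project:

```python
# Illustrative sketch: compare planned vs. actual software measurements and
# flag variances beyond a reporting threshold. All names and numbers are
# hypothetical examples, not values from any NASA document.

def measurement_variances(planned, actual, threshold=0.10):
    """Return {metric: fractional variance} for metrics exceeding threshold."""
    flagged = {}
    for metric, plan in planned.items():
        act = actual.get(metric)
        if act is None or plan == 0:
            continue
        variance = (act - plan) / plan
        if abs(variance) > threshold:
            flagged[metric] = round(variance, 3)
    return flagged

planned = {"size_ksloc": 120, "effort_staff_months": 40, "defects_open": 25}
actual = {"size_ksloc": 150, "effort_staff_months": 42, "defects_open": 60}

print(measurement_variances(planned, actual))
# size and open defects exceed the 10% threshold; effort (5%) does not
```

A report like this supports the "management procedures and tools for measuring and reporting progress" criterion by making deviations visible at review time.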

h3. !entrance.png! Entrance Criteria  - Requirements

* Changes to IT security requirements since PDR available for review (Mission-specific)
* SRS updated to the Computer Software Unit (CSU) level
* Traceability matrix updated (to CSU level)
* Verification exists that detailed designs cover the requirements
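The traceability criteria above (SRS and traceability matrix maintained to the CSU level, detailed designs covering the requirements) amount to a bidirectional check: every requirement maps to at least one design unit, and every design unit maps back to a requirement. A minimal sketch, with hypothetical requirement and CSU identifiers:

```python
# Illustrative sketch: check bidirectional traceability between software
# requirements and design units (CSUs). All IDs are hypothetical.

def trace_gaps(req_to_csu, all_reqs, all_csus):
    """Return (requirements with no design coverage, CSUs with no requirement)."""
    traced_reqs = {r for r, csus in req_to_csu.items() if csus}
    traced_csus = {c for csus in req_to_csu.values() for c in csus}
    return sorted(all_reqs - traced_reqs), sorted(all_csus - traced_csus)

req_to_csu = {"SRS-001": ["CSU-A"], "SRS-002": ["CSU-A", "CSU-B"], "SRS-003": []}
untraced, orphans = trace_gaps(req_to_csu, {"SRS-001", "SRS-002", "SRS-003"},
                               {"CSU-A", "CSU-B", "CSU-C"})
print(untraced, orphans)  # SRS-003 lacks design coverage; CSU-C traces to no requirement
```

Either gap list being non-empty would be flagged before CDR entrance.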

h3. !entrance.png! Entrance Criteria  - Design

* Technical data package (e.g., integrated schematics, spares provisioning list, interface control documents, engineering analyses, and specifications) available for review
* Design process exists, including methodology and standards used, design documentation produced, inspections and reviews
* Software design document(s) baselined (including interface design documents, detailed design and unit test)
* Command and telemetry list available for review
* Final design solution, evaluation, and rationale available
** Documented make, buy, and/or reuse, analysis, criteria, and rationale
** Reused/heritage software or functionality from previous projects; necessary modifications
* Final architecture definition available
* Subsystem/component context diagram available
* Data flow diagrams available
* Software subsystem design diagram available (e.g., Level 0 data flow diagram or UML)
** For each task in the software subsystem design diagram:
** Design diagrams for the task
** Description of functionality and operational modes
** Safety considerations addressed in the design
** Resource and utilization constraints (e.g., CPU, memory); how the software will adapt to changing margin constraints; performance estimates
** Data storage concepts and structures
* Input and output data and formats identified
* Interrupts and/or exception handling available, including event, FDC, and error messages
* IT Security features (design features) identified
* Detailed description of software operation and flow exists
* Operational limits and constraints identified
* Technical resource utilization estimates and margins updated
** Detailed timing and storage allocation compiled
* Algorithms exist sufficient to satisfy their requirements
* Failure detection and correction (FDC) requirements, approach, and detailed design available for review
* Trial code analyzed and designs modified accordingly
* Designs comprising the software completed, peer reviewed and placed under change control
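The resource criteria in the list above (utilization estimates and margins, detailed timing and storage allocation) reduce to a simple computation of remaining margin against each allocation. A minimal sketch; the resources, allocations, estimates, and the 20% margin target are hypothetical:

```python
# Illustrative sketch: compute remaining margin for technical resources
# (e.g., CPU, memory) from estimated utilization vs. allocation.
# Allocations, estimates, and the 20% margin target are hypothetical.

def margins(allocated, estimated):
    """Return {resource: fractional margin remaining against allocation}."""
    return {r: round((allocated[r] - estimated[r]) / allocated[r], 3)
            for r in allocated}

allocated = {"cpu_pct": 70.0, "ram_mib": 256.0}
estimated = {"cpu_pct": 56.0, "ram_mib": 230.4}

for resource, margin in margins(allocated, estimated).items():
    status = "OK" if margin >= 0.20 else "REVIEW"
    print(f"{resource}: {margin:.1%} margin ({status})")
```

Tracking margins this way also shows how the software can adapt to changing margin constraints as estimates firm up between PDR and CDR.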

h3. !entrance.png! Entrance Criteria  - Analysis

* Analyses completed:
** Algorithm accuracy
** Critical timing and sequence control
** Undesired event handling
** Operability
** Failure modes and effects analyses
* Final status and results of analyses ready for review
* HA / Software Assurance Classification Report (SACR) updated, if necessary
* Subsystem-level and preliminary operations safety analyses exist
* Risk assessment and mitigation updated
* Reliability analyses and assessments updated
* Operational Concepts updated
* Product build-to specifications exist for each hardware and software configuration item, along with supporting trade-off analyses and data
* Status of change requests available for review

h3. !entrance.png! Entrance Criteria  - Other

* Software requirement verification recording, monitoring, and current status available for review -- databases and test reports; sample test verification matrix
* Preliminary operations handbook created
* Programmer's manual drafted
* User's manual drafted
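The sample test verification matrix mentioned above pairs each requirement with its verification method and current status. A minimal sketch of such a matrix; the requirement IDs, methods, and statuses are hypothetical:

```python
# Illustrative sketch: a sample requirement verification matrix of the kind
# referenced above (requirement vs. verification method and status).
# Requirement IDs, methods, and statuses are hypothetical.

matrix = [
    # (requirement, verification method, status)
    ("SRS-001", "Test",       "Verified"),
    ("SRS-002", "Analysis",   "Open"),
    ("SRS-003", "Inspection", "Verified"),
    ("SRS-004", "Test",       "Open"),
]

def open_items(rows):
    """Return requirements whose verification is not yet closed."""
    return [req for req, _method, status in rows if status != "Verified"]

print(f"{'Req':10}{'Method':12}{'Status'}")
for req, method, status in matrix:
    print(f"{req:10}{method:12}{status}")
print("Open:", open_items(matrix))
```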

h3. !check.png! Items Reviewed

* Baselined documents
* Technical data package
* SDP/SMP
* Progress against software management plans
* Plan and status for reviews
* Documentation plan
* NPR 7150.2 compliance matrix
* Design and implementation processes
* Status of management procedures and tools
* Software measurements
* Logistics documentation (e.g., maintenance plan)
* Status of any staffing problems
* Software design document(s)
* Command and telemetry list
* Final Design Solution, Evaluation, and Rationale
* Final Architecture Definition
* Software subsystem design diagram
* Data flow diagrams
* Identification and formats of input and output data
* Interrupts and/or exception handling, including event, FDC, and error messages
* IT Security features (design features)
* Detailed description of software operation and flow
* Operational limits and constraints
* Technical resource utilization estimates and margins
** Detailed timing and storage allocation compiled
* Analyses completed:
** Algorithm accuracy
** Critical timing and sequence control
** Dimensional analysis (such as consistency of array dimensions)
** Singularity Analysis (such as division by zero)
** Undesired event handling
** Operability
** Failure modes and effects analyses
* Final status and results of analyses
* Algorithms sufficient to satisfy their requirements
* Failure detection and correction (FDC) requirements, approach, and detailed design
* Subsystem/component context diagram
* Status of trial code analysis and design modifications
* Supplier documentation
** Software Design Description(s)
** Interface Design Description(s)
** Updated Supplier Software Verification and Validation Plan(s)
** Preliminary Supplier Software Test Procedure(s)
* Peer reviews for software and rework accomplished, as defined in the s/w and/or project plans
* Designs comprising the software completed, peer reviewed and placed under change control
* SRS to Computer Software Unit (CSU) level
* Updated Traceability Matrix (to CSU level)
* Verification that detailed designs cover the requirements
* Product Assurance and Software Safety plans and activities
* System safety analysis with associated verifications
* Updated HA / Software Assurance Classification Report (SACR), if necessary
* Subsystem-level and preliminary operations safety analyses
* Updated risk assessment and mitigation
* Updated reliability analyses and assessments
* Independent verification and validation (IV&V) plans and status
* Systems and subsystem certification plans and requirements (as needed)
* Configuration Management (CM) processes, including discrepancy reporting and tracking (development and post-release)
* Development environment (e.g., h/w diagram, operating system(s), compilers, DBMS, tools)
* If relevant, new compiler validated and producing acceptable object code for the target machine
* Tools needed for software implementation completed, qualified, installed and accepted, and team trained in their use
* Facilities for software implementation in place, operating, ready for use
* Build plan
* Product build-to specifications for each hardware and software configuration item, along with supporting trade-off analyses and data
* Coding, integration, and test plans and procedures
* V&V plan (including requirements and specification)
* Test team roles, functions, support required are defined
* Software Test Plan (integration and test procedure outlines)
* Test procedures
* Test levels (e.g., unit testing, integration testing, system testing) -- description, who executes, test environment, standards followed, verification methodologies
* Testing preparation and execution activities, incl. testing of reused/heritage software, if applicable
* Build test timeline and ordered list of components and requirements to be tested in each build
* Test environments for each test level \-- diagram and description of tools, testbeds, facilities
* Test group trained and prepared to evaluate the code using their facilities and tools
* Software for testing / activation
* Software requirement verification recording, monitoring, and current status -- databases and test reports; sample test verification matrix
* System and acceptance testing -- operational scenarios to be tested, including stress tests and recovery testing, if applicable
* Acceptance process -- reviews (e.g., Acceptance Test Readiness Review, Acceptance Test Review), approval, and signoff processes
* Acceptance criteria
* Delivery, Installation, Maintenance
** Disposition of source code and tools, handling of load images, installation of databases, etc.
** Version identification and documentation
** Maintenance plan, if applicable, including disposition of COTS components (source code, licenses, etc.)
** Close-out and archive of software products
* Launch site operations plan
* Checkout and activation plan
* Disposal plan (including decommissioning or termination)
* Preliminary Operations Handbook
* Revised Draft of Programmer's Manual
* Draft of User's Manual
* Lessons Learned
** Review of existing Lessons Learned from previous projects
** Lessons Learned captured from software areas of the project; indicate the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects
** Confirmation that Lessons Learned added to Lessons Learned database
* Status of change requests
* Software inputs / contributions to updated Project Plan

h3. !exit.png! Exit/Success Criteria

* All supplier software requirements have been mapped to the software design
* All elements of the design are compliant with functional and performance requirements
** Detailed design is expected to meet requirements with adequate margins at acceptable level of risk
* Interface control documents are sufficiently matured to proceed with fabrication, assembly, integration, and test, and plans are in place to manage any open items
* High confidence exists in the product baseline, and adequate documentation exists or will exist in a timely manner to allow proceeding with coding, integration, and test
* Product verification and product validation requirements and plans are complete
** Verification approach is viable, and will confirm compliance with all requirements
* Testing approach is comprehensive, and planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into next phase
* Adequate technical and programmatic margins and resources exist to complete development within budget, schedule, and risk constraints
* Risks to mission success are understood and credibly assessed, and plans and resources exist to effectively manage them
* Approval received for software inputs / contributions:
** Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) have been adequately addressed in system and operational designs, and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis and failure modes and effects analysis) have been approved
* Management processes used by the project team are sufficient to develop and operate the mission
* High priority RIDs against the SDD are closed/actions are completed, and customer approval received to proceed to next phase
* Approved readiness to proceed with software implementation and test activities
* SMP, software detailed designs, and unit test plans are an adequate and feasible basis for the implementation and test activities
* Products (listed above) are approved, baselined, and placed under configuration management
{div3}
{div3:id=tabs-9}

h2. Production Readiness Review (PRR)

{info}The PRR is held for projects developing or acquiring multiple similar or identical flight and/or ground support systems. The purpose of the PRR is to determine the readiness of the system developer(s) to efficiently produce (build, integrate, test, and launch) the required number of systems. The PRR also evaluates how well the production plans address the system's operational support requirements. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria 

* Significant production engineering problems encountered during development are resolved
* Design documentation is adequate to support production
* Production plans and preparation are adequate to begin fabrication
* Production-enabling products and adequate resources are available, allocated, and ready to support end product production
* Production risks and mitigations identified
* Schedule reflects production activities

h3. !check.png! Items Reviewed

* Design documentation
* Production plans and preparation
* Production risks and mitigations
* Schedule

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** System requirements are fully met in final production configuration
** Adequate measures are in place to support production
** Design-for-manufacturing considerations ensure ease and efficiency of production and assembly
** Risks are identified, credibly assessed, and characterized, and mitigation efforts defined
** Alternate sources for resources identified, as appropriate
** Required facilities and tools are sufficient for end product production
** Specified special tools and test equipment are available in proper quantities
** Production and support staff are qualified
** Production engineering and planning are sufficiently mature for cost-effective production
** Production processes and methods are consistent with quality requirements
** Qualified suppliers are available for materials that are to be procured
* Delivery schedules are verified

{div3}
{div3:id=tabs-10}

h2. System Integration Review (SIR)

{info}The SIR evaluates the readiness of the project to start flight system assembly, test, and launch operations. V&V Plans, integration plans, and test plans are reviewed. Test articles (hardware/software), test facilities, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria 

* Integration plans and procedures completed and approved
* Segments and/or components available for integration
* Mechanical and electrical interfaces verified against the interface control documentation
* All applicable functional, unit-level, subsystem, and qualification testing conducted successfully
* Integration facilities, including clean rooms, ground support equipment, electrical test equipment, and simulators ready and available
* Support personnel adequately trained
* Handling and safety requirements documented
* All known system discrepancies identified and disposed in accordance with agreed-upon plan
* All previous design review success criteria and key issues satisfied in accordance with agreed-upon plan
* Quality control organization is ready to support integration effort

h3. !check.png! Items Reviewed

* Integration plans and procedures
* Interface control documentation
* Functional, unit-level, subsystem, and qualification test results/proof of completion
* Test preparation (facilities, tools, equipment, personnel)
* Handling and safety requirements
* V&V plans, test plans

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Adequate integration plans and procedures are completed and approved for the system to be integrated
** Previous component, subsystem, and system test results form a satisfactory basis for proceeding to integration
** Integration procedures and work flow have been clearly defined and documented
** Review of integration plans, as well as procedures, environment, and configuration of items to be integrated, provides a reasonable expectation that integration will proceed successfully
** Integration personnel have received appropriate training in integration and safety procedures
* Risk level is identified and accepted by program/project leadership, as required

{div3}
{div3:id=tabs-11}

h2. Test Readiness Review (TRR)

{info}The TRR ensures that the test article (hardware/software), test facility, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7123.1){info}

h3. !entrance.png! Entrance Criteria  - General

* All TRR-specific materials, such as test plans, test cases, procedures, and version description document available to all participants prior to TRR
* Updated baselined documentation available (from previous reviews -- SwRR, SRR, PDR, CDR)
* Required documents are in the state/status required; any required deviations or waivers are in place and approved
* All known system discrepancies identified and disposed in accordance with agreed-upon plan
* Software cost estimate updated, and software-related expenditures collection and reporting by life cycle phase available
* Test schedule updated:
** Current system test status
** Issues and concerns
** Test schedule
* Schedules for integration and test established and are reasonable based on results of unit testing
* Lessons Learned
** Plans to capture any lessons learned from test program are documented
** Review of existing Lessons Learned from previous projects completed
** Lessons Learned captured from software areas of the project (indicate the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)

h3. !entrance.png! Entrance Criteria  - Plans

* Objectives of testing and testing approach clearly defined and documented, and supported by:
** Test plans, test cases, procedures, environment, and configuration of test item(s) support those objectives
* Configuration of system under test defined and agreed to; all interfaces placed under configuration management or defined in accordance with an agreed-to plan, and version description document available to TRR participants
* Applicable functional, unit-level, subsystem, system, and qualification testing conducted successfully; results available
* Updated requirements and design documentation
* Test contingency planning accomplished, and all personnel trained
* Software Build from CM
** Operational software ready for testing
* Informal Dry Run completed without errors
* Requirements Analysis and Traceability Reports (with possible RIDs)
* Code Analysis and Assessment Results (including SCRs, RIDs, etc.)
* Description of System Test Approach
* Validation of operations and user's manuals
* Configuration for system testing
* Summary of Quality Assurance (QA) activities used during development
* Successful audit of the VDD (such as FSW) including fixes
* All required test resources, people (including a designated test director), facilities, test articles, test instrumentation, and other test enabling products identified and available to support required tests
* Facilities and tools for integration and test ready, qualified, validated, and available for operational use including test engineering products (test cases, procedures, tools, etc.), test beds, simulators, and models
* Roles and responsibilities of all test participants defined and agreed to and all personnel have been trained
* Supplier Software Version Description(s) available
* Metric data and reports (implementation and test) ready for review
* SDP/SMP updated for integration and test activities
* IV&V report/status - if applicable
* Any current risks, issues, or requests for action (RFAs) that require follow-up and how they will be tracked to closure ready for review
* Risk analysis and risk list updated and associated risk management plan updated
* Test plan includes test scenarios:
** User-defined scenarios to test interactive or operator-oriented software
** Safety critical scenarios
** For all software/system requirements defined in the bidirectional traceability matrix
** Performance checks at the limit of ranges specified for the requirements and operational scenarios, including test limitations and constraints
* Test case structure established that identifies per test case:
** Software requirements to be tested and SW entities to be exercised
** Required inputs
** Facilities and test tools required, setup, and required qualifications
** Limitations of the test environment
** Verification of standards compliance
* Test plan updated for integration and test activities
* Software test procedures baselined:
** Defined build process
** Safety critical software test considerations
** Defined software test success criteria
** Defined CM process and procedures used for testing
** Process for capturing test data and storing it
** Test procedure red-line process
** Process for restarting a test if error found during testing
** Discrepancy Reporting System
** Process for tracking Test Progress
** Role of Quality Assurance including redlining and QA witnessing role and responsibilities
** Any safety and security issues relevant to the testing activity
** All workarounds and non-functioning software components
** Time required for testing; include schedule and analysis of time needed on various environments / test beds / spacecraft
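
The per-test-case structure called for above can be captured as a simple record. The sketch below is illustrative only; the field names and identifiers (e.g., "TC-042", "SRS-017") are hypothetical examples, not drawn from NPR 7150.2:

```python
# Illustrative sketch of a per-test-case record: requirements tested,
# software entities exercised, required inputs, facilities/tools, and
# environment limitations. All names here are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    requirements: list[str]              # software requirements to be tested
    sw_entities: list[str]               # software entities to be exercised
    required_inputs: list[str]
    facilities_and_tools: list[str]      # including setup and qualifications
    environment_limitations: list[str] = field(default_factory=list)
    verifies_standards_compliance: bool = False

tc = TestCase(
    case_id="TC-042",
    requirements=["SRS-017"],
    sw_entities=["cmd_handler"],
    required_inputs=["nominal command sequence"],
    facilities_and_tools=["flight software testbed"],
)
print(tc.case_id, tc.requirements)  # TC-042 ['SRS-017']
```

A structured record like this makes it straightforward to confirm, before the review, that every test case identifies its requirements, inputs, tools, and environment limitations.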

h3. !entrance.png! Entrance Criteria  - Requirements

* All requirements included in baselined test procedure document and uniquely identified and traceable in the updated bidirectional traceability matrix (includes necessary corrections due to discrepancy reports)
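
A bidirectional traceability matrix can be checked mechanically. The sketch below is a hypothetical example (the requirement and test IDs are invented) of flagging requirements with no tests, tests tracing to no requirement, and forward links missing their backward counterpart:

```python
# Hypothetical sketch of a bidirectional traceability check between
# requirements and test procedures. IDs are illustrative examples only.

def check_bidirectional_trace(req_to_tests, test_to_reqs):
    """Return untested requirements, orphan tests, and inconsistent links."""
    untested_reqs = [r for r, tests in req_to_tests.items() if not tests]
    orphan_tests = [t for t, reqs in test_to_reqs.items() if not reqs]
    # Consistency: every forward link must appear as a backward link
    inconsistent = [
        (r, t)
        for r, tests in req_to_tests.items()
        for t in tests
        if r not in test_to_reqs.get(t, set())
    ]
    return untested_reqs, orphan_tests, inconsistent

req_to_tests = {"SRS-001": {"TP-10"}, "SRS-002": set()}
test_to_reqs = {"TP-10": {"SRS-001"}, "TP-99": set()}

untested, orphans, bad_links = check_bidirectional_trace(req_to_tests, test_to_reqs)
print(untested)   # ['SRS-002'] -> requirement with no test coverage
print(orphans)    # ['TP-99']   -> test tracing to no requirement
```

Running a check like this against the updated matrix gives objective evidence that the entrance criterion is met, including corrections made in response to discrepancy reports.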

h3. !entrance.png! Entrance Criteria  - Design

* All previous design review success criteria and key issues satisfied in accordance with agreed-upon plan

h3. !entrance.png! Entrance Criteria  - Analysis

* Outstanding software change requests (SCRs) ready for review
* Code inspection results available for review
* Results of testing completed to date available for review:
** Objectives of tests
** Expected results defined
** Confirm all steps in the test runs are documented
** Results including safety critical test results
** Tests performed
** Successful tests
** Known problems, issues
** Deviations
** Waivers

h3. !entrance.png! Entrance Criteria  - Other

* Software Test Process
** Build and System Test Methodology
** Electrostatic Discharge considerations
** Safety critical software verification considerations
** Software test standards (including use of CM)
** CM process and Procedures used for testing and how each was verified prior to usage
** Process for capturing test data and storing it in the CM system
** Test procedure red-line process
** If/how a test can be resumed if error found during testing
** Discrepancy Reporting System
** Process for tracking Test Progress
** Role of Quality Assurance including redlining and QA witnessing role and responsibilities
** Any safety and security issues relevant to the testing activity
** All workarounds and non-functioning software components
** Time required for testing; include schedule and analysis of time needed on various environments / testbeds / Spacecraft
* List of all Requirements Documents relevant to Acceptance testing
* Acceptance Test Readiness
** Process for analysis of Test Results including the division of responsibility
** Acceptance Test testbed (environment) setup (hardware)
** Setup and use of Simulators or other Test tools and their required qualifications
** Limitations of the testbed (environment)
** Tests that require hardware for verification
** Description, at a high level, of what each test does, how long it lasts, and any special circumstances
** IV&V report/status - if applicable
** Preparedness for Acceptance Testing
** Requests For Action (RFAs)
** Decision to proceed to Acceptance Testing
* SMP updated for integration and test activities
* Updated software cost estimate, and software-related expenditures collected and reported by life cycle phase
* Test Schedule
** Current system test status
** Plans for Acceptance Test
** Acceptance Test acceptance criteria
** Issues and Concerns
* Schedules for integration and test established and are reasonable based on results of unit testing
* Tests reusable for regression testing
* Expected results defined
* Completed evaluations (in conjunction with unit testing):
** Verification of computations using nominal data
** Verification of computations using stress data
** Verification of output options and formats
** Exercise of executable statements in units at least once
** Test of options at branch points
** Verification of standards compliance
* Completed evaluations (in conjunction with s/w integration and test):
** Verification of performance throughout anticipated range of operation conditions including nominal, abnormal, failure and degraded mode situations
** Verification of performance throughout anticipated range of operating conditions as various strings of units are linked together and various modes are exercised
** Verification of end-to-end functional flows and database linkages
** Exercise of logic switching and executive control options at least once
* Risk analysis and risk list updated and associated risk management plan prepared
* Databases for integration and test have been created and validated
* Test network showing interdependencies among test events and planned time deviations for these activities prepared
* Lessons Learned
** Plans to capture any lessons learned from test program are documented
** Review of existing Lessons Learned from previous projects
** Lessons Learned captured from software areas of the project; indicate the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects
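
The test network of interdependent test events mentioned above can be checked for circular dependencies and ordered for scheduling. The sketch below uses the Python standard library's graphlib, with invented event names standing in for a real project's test events:

```python
# Hypothetical test network: each test event maps to the set of events
# it depends on. A topological sort yields a feasible execution order
# and raises CycleError if the interdependencies are circular.
from graphlib import TopologicalSorter, CycleError

test_network = {
    "unit_tests": set(),
    "sw_integration": {"unit_tests"},
    "system_test": {"sw_integration"},
    "acceptance_dry_run": {"system_test"},
}

try:
    order = list(TopologicalSorter(test_network).static_order())
    print(order)  # ['unit_tests', 'sw_integration', 'system_test', 'acceptance_dry_run']
except CycleError as err:
    order = []
    print("Circular dependency in test network:", err)
```

Planned time deviations can then be attached to each event in the resulting order to build the test schedule.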

h3. !exit.png! Exit/Success Criteria

* User-defined scenarios developed to test interactive or operator-oriented software
* Peer reviews completed for implementation and tests to be performed, as defined in the software and/or project plans
* Adequate test plans are completed and approved to proceed for the system under test
* Adequate identification and coordination of required test resources are completed
* Previous component, subsystem, and system test results form a satisfactory basis for proceeding into planned tests
* Risk level is identified and accepted by program/competency leadership as required
* Objectives of testing have been clearly defined and documented, and review of all test plans, as well as procedures, environment, and configuration of test item, provide a reasonable expectation that objectives will be met
* Test cases have been reviewed and analyzed for expected results, and results are consistent with test plans and objectives
* Test personnel have received appropriate training in test operation and safety procedures
* Provisions have been made should test levels or system response exceed established limits or if the system exceeds its expected range of response
* Software is ready to be tested
* Formal dry test run completed
* SMP, software implementations, and test plans are an adequate and feasible basis for integration and test activities
* Products (listed above) are approved, baselined and placed under configuration management
{div3}{div3:id=tabs-12}

h2. {anchor:_Toc286592518}System Acceptance Review (SAR)

{info}The SAR verifies the completeness of the specific end item with respect to the expected maturity level and to assess compliance to stakeholder expectations. The SAR examines the system, its end items and documentation, and test data and analyses that support verification. It also ensures that the system has sufficient technical maturity to authorize its shipment to the designated operational facility or launch site. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria 

* A final agenda coordinated (nominally)
* Technical products for this review made available to participants prior to SAR
* Acceptance test readiness available for review
** Process for analysis of Test Results including the division of responsibility
** Acceptance Test testbed (environment) setup (hardware)
** Setup and use of Simulators or other Test tools and their required qualifications
** Limitations of the testbed (environment)
** Tests that require hardware for verification and/or human input
** Description, at a high level, of what each test does, how long it lasts, and any special circumstances
** IV&V report/status - if applicable
** Preparedness for Acceptance Testing
** Requests For Action (RFAs)
** Decision to proceed to Acceptance Testing
* Results available from SARs conducted at major suppliers
* Transition to production and/or manufacturing plan exists
* Product verification results / final test reports available
* Product validation results available
* Acceptance plans and acceptance criteria
* Documentation exists to confirm that the delivered system complies with the established acceptance criteria
* Documentation exists to confirm that the system will perform properly in the expected operational environment
* Technical data package updated to include all test results
* Certification package available for review
* Risk assessment and mitigations updated
* Previous milestone reviews successfully completed
* Metrics data and reports available for review
* Remaining liens or unclosed actions and plans for closure available for review
* Waivers and deviations available for review
* Software build has been updated
* Functional audit (FCA) completed
* Software presentation prepared (for AR):
** Software overview
** Project System Diagram
** Functional software overview
** Software products/artifacts
** Software traceability matrix examples
** STPr/SVVPr status
** Open RIDs
** Open SCRs
** Software summary and recommendations
* Lessons Learned
** Review of existing Lessons Learned from previous projects completed
** Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)
** Confirmation exists that Lessons Learned were added to the Lessons Learned database

h3. !check.png! Items Reviewed

* Test readiness information
* Results of the SARs conducted at the major suppliers
* Transition to production and/or manufacturing plan
* Product verification results / test reports
* Product validation results
* Baselined Software Build
* Certification package
* Documentation that the delivered system complies with the established acceptance criteria
* Documentation that the system will perform properly in the expected operational environment
* Technical data package
* Risk assessment and mitigation
* Hazard report
* Results/proof of completion for previous milestone reviews
* Remaining liens or unclosed actions and plans for closure
* Waivers/deviations
* Metrics Data and Reports

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Required tests and analyses are complete and indicate that system will perform properly in expected operational environment
** Risks are known and manageable
** Software system meets established acceptance criteria
** Required safe shipping, handling, and checkout procedures are complete and ready for use
** Required operational plans and procedures are complete and ready for use
** Technical data package is complete and reflects delivered system, including software user's manual and version description document
** All applicable lessons learned for organizational improvement and system operations are captured
** Software system has sufficient technical maturity to authorize shipment to designated operational facility or launch site

{div3}
{div3:id=tabs-13}

h2. {anchor:_Toc286592519}Operational Readiness Review (ORR)

{info}The ORR examines the actual system characteristics and the procedures used in the system or product's operation and ensures that all system and support (flight and ground) hardware, software, personnel, and procedures are ready for operations and that user documentation accurately reflects the deployed state of the system. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria 

* All validation testing completed
* Test failures and anomalies from validation testing resolved and results incorporated into all supporting and enabling operational products
* All operational supporting and enabling products (e.g., facilities, equipment, documents, updated databases) that are necessary for the nominal and contingency operations have been tested and delivered/installed at the site(s) necessary to support operations
* Operations manual complete
* Physical audit (PCA) completed
* Software inputs / contributions completed for:
** Training provided to users and operators on correct operational procedures for system
** Ground systems readiness
*** Diagram describing main functionality for project, how parts interact, and main flow of data between major functional parts
*** Problem reporting and change request process for discrepancy reports (DR), enhancement reports (ER), and database change requests (DCR)
*** Current DR, ER, DCR status, including historical trend data and details on current open DRs, ERs, DCRs
*** Key parts of system, their current operational readiness, and how verified
*** Any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
*** Key interactions with other systems, their operational readiness, and how verified; any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
*** Outstanding items that need to be completed before readiness is achieved along with scheduled date
* Software maintenance plan completed:
** When software is frozen for launch and what types of fixes will be approved for implementation under a freeze
** How CCB will handle software changes or bug fixes close to launch
* Mission Operations Center (MOC) readiness:
** MOC software readiness for all systems and how verified; any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
** Testing that was done, results, criticality of problems encountered, how problems will be resolved, and schedule for correction/verification of any fix
** Current status of procedures that will be used by the MOC; how tested, results, and schedule for correction/verification of any fix
* Flight Software Maintenance Process planned:
** Outstanding items that need to be completed before readiness is achieved along with scheduled date
** Confirmation that flight software table loads and code patch testing successfully completed on all processors, including all possible on-board media (e.g., RAM, {term:EEPROM})
* Science planning and processing system readiness is available for review, as applicable:
** Diagram describing science data processing products and general timelines involved
** Diagram describing science system context (relationship of main Mission Operations Center, Mission Planning Office, Science Validation Facility, ground stations, interconnecting networks, and the main science data instrument teams)
** Description of these main components in high-level detail, including planning and processing functions; any special cases for launch, in-orbit checkout, end of mission, etc.; description of testing, results, and issues done to verify and validate these components
** Summary of all testing done, results, and outstanding issues for science data processing
* Safety and security issues status available for review:
** Software issues with safety, how addressed, and current status
** Software issues with security, how addressed, and current status
* Simulations status available for review:
** Number and main details for simulations by subsystem exercised, for example: Launch, Attitude Control System, Command & Data Handling, Communication, Flight Software, Power System Electronics, Mission Operations Center, Pre-Launch, and others deemed important for project
** Outstanding issues from simulation testing, schedule impact, workarounds, and risks; for workarounds, when the problem/issue will be permanently fixed
* Contingencies and constraints available for review:
** State of Contingency Flow Chart Book and any planned updates
** List of current constraints on system, state of database that details these constraints, and any outstanding actions that need to be taken
** Audits that were done and against what areas to verify constraints
** Operational problem escalation process
** Operational emergency notification process including telephone numbers to be called
* Status of documentation readiness available for review:
** Version Description Document(s); location and any outstanding issues
** Baselined Software User's Manual; location and any outstanding issues
** Software Operations Plan; location and any outstanding issues
** Software Maintenance Plan; location and any outstanding issues
** Planned software retirement activities; location and any outstanding issues
* Lessons Learned
** Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what it was, and its applicability to future projects)
** Confirmation exists that Lessons Learned were added to the Lessons Learned database
* Status of work remaining available for review:
** All launch-critical work that needs to be completed along with expected completion date

h3. !check.png! Items Reviewed

* Validation test results/proof of completion
* Status of test failures and anomalies from validation testing
* Status of all testing, delivery, and installation for operational supporting and enabling products necessary for nominal and contingency operations
* Status of software user's manual
* Status of operations manual
* Software Maintenance Plan
* Science Planning and Processing System Readiness
* Safety and Security Issues
* Number and main details for simulations by subsystem exercised and any open issues
* Contingencies and constraints
* Status of documentation readiness
* Work Remaining

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** System, including any enabling products, is determined to be ready to be placed in operational status
** All applicable lessons learned for organizational improvement and systems operations have been captured
** All waivers/deviations and anomalies have been closed
** Systems hardware, software, personnel, and procedures are in place to support operations
** All project and support (flight and ground) h/w, s/w, and procedures are ready for operations and user documentation accurately reflects the deployed state of the entire system

* RFA and RID reports generated, as needed, as result of this ORR

{div3}
{div3:id=tabs-14}

h2. {anchor:_Toc286592520}Flight Readiness Review (FRR)

{info}The FRR examines tests, demonstrations, analyses, and audits that determine the system's readiness for a safe and successful flight/launch and for subsequent flight operations. It also ensures that all flight and ground hardware, software, personnel, and procedures are operationally ready. (NPR 7120.5){info}

h3. !entrance.png! Entrance Criteria 

* Certification received that flight operations can safely proceed with acceptable risk
* System and support elements confirmed as properly configured and ready for flight
* Interfaces compatible and function as expected
* System state supports a launch "go" decision based on go/no-go criteria
* Flight failures and anomalies from previously completed flights and reviews resolved and results incorporated into all supporting and enabling operational products.
* System configured for flight
* Tests, demonstrations, analyses, audits support flight readiness

h3. !check.png! Items Reviewed

* Open items and waivers/deviations
* System and support elements configuration confirmation
* Status of interface compatibility and functionality
* System state
* Status of failures and anomalies from previously completed flights and reviews, including resolution and incorporation of results into all supporting and enabling operational products
* System configuration
* Tests, demonstrations, analyses, audits
* Software user's manual

h3. !exit.png! Exit/Success Criteria

* Review panel agrees that:
** Flight vehicle is ready for flight
** Software is deemed acceptably safe for flight (i.e., meeting the established acceptable risk criteria or documented as being accepted by the PM and Designated Governing Authority (DGA))
** Flight and ground software elements are ready to support flight and flight operations
** Interfaces are checked and found to be functional
** Open items and waivers/deviations have been examined and found to be acceptable
** Software contributions to all open safety and mission risk items have been addressed
** Operators are ready and work-arounds have been fully vetted
** Software user's manual is ready and available to be used for testing

{div3}

{tabclose}