7.09 - Entrance and Exit Criteria

Background

This guidance provides the maximum set of life cycle review entrance and exit criteria for software projects and should be tailored for the project class. This guidance is a summarized collection of material from the following core documents: NPR 7123.1, Appendix G 041; version D of NPR 7120.5 (since superseded by version E) 082; and Center Procedures.

This guidance includes three types of information for each review:

  1. Entrance criteria – Activities and products that are to be completed before the review can begin.
  2. Materials for the Review – Items to be reviewed during the review and used to confirm exit criteria; this information is typically available a couple of weeks before the review.
  3. Exit criteria – Decisions and actions to be completed before the review is considered complete.

This guidance is focused on the responsibilities of the software engineering community throughout the project life cycle reviews. Therefore, the guidance includes reviews and products which are the primary responsibility of the software engineering community as well as software engineering community contributions to system activities and products, such as the Project Plan.

Note that different mission types (e.g., robotic vs. human) can have different life cycles and, therefore, different sets of life cycle reviews that apply.

This material considers a software project to be a system of systems as well as a single subsystem within the larger project. "System of systems" refers to a software project that includes software subsystems that perform functions allocated to them. Just as a project allocates requirements to hardware, software, external components, etc., software projects allocate software requirements to software subsystems.

This material has been reviewed by the Software Working Group and the Office of the Chief Engineer.

Source of Content

In addition to the resources in the list below, information was pulled and consolidated based on repetition across Center Process Asset Libraries (PALs) and documents (Ames Research Center (ARC), Jet Propulsion Laboratory (JPL), Goddard Space Flight Center (GSFC), Marshall Space Flight Center (MSFC), Stennis Space Center (SSC)).


Mission Concept Review (MCR)

The MCR affirms the mission need and examines the proposed mission's objectives and the concept for meeting those objectives. Key technologies are identified and assessed. It is an internal review that usually occurs in the cognizant system development organization. ROM (Rough Order of Magnitude) budget and schedules are presented. (NPR 7120.5 082)

MCR Entrance Criteria

  • The need for the mission is clearly identified.
  • Concept of operations available.
  • Preliminary risk assessment available, including technologies and associated risk management/mitigation strategies and options.
  • A Mission Concept Review (MCR) agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair.
  • Preliminary technical plans and a conceptual life cycle available.
  • A top-level set of requirements is identified to meet the mission objectives.
  • The mission is feasible.
    • A solution has been identified that is technically feasible.
    • A rough cost estimate is within an acceptable cost range.
  • Draft cost and schedule estimates are available.
    • As developed by software (SW) developers and SW assurance personnel.
  • A technical search was done to identify existing assets/products that could satisfy the mission or parts of the mission.
  • Software inputs/contributions provided for:
    • Preliminary Project Plan.
    • Preliminary Systems Engineering Management Plan (SEMP).
    • Development and analysis of alternative concepts (showing at least one feasible).

Software Assurance:

  • Software assurance point of contact for the project has been identified
  • Software assurance personnel have reviewed the materials available for the review:
    • The top-level requirements
    • Mission concept of operations
    • Preliminary technical plans and the conceptual life cycle
    • Preliminary risk assessment
  • Software assurance personnel confirm that the software portions of the MCR Entrance Criteria are met prior to the review

MCR Items Reviewed

  • Mission goals and objectives.
  • Analysis of alternative concepts.
  • Preliminary development approaches and acquisition plans.
  • Concept of operations.
  • Risk assessments.
  • Technical plans and conceptual lifecycle to achieve the next phase.
  • Preliminary requirements.
  • Draft cost and schedule estimates.
  • Conceptual system design.
  • Software process root cause analysis results.
  • SA analysis showing uncovered software code percentage (see the sketch below).
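
The uncovered-code percentage above is a simple ratio against total executable lines. A minimal sketch of the calculation, assuming line-level counts are available from the project's coverage tooling (all names and numbers below are illustrative, not from this guidance):

    # Illustrative only: percentage of code not exercised by testing.
    covered_lines = 41_200   # example: lines executed at least once under test
    total_lines = 50_000     # example: total executable lines in the build

    uncovered_pct = (1 - covered_lines / total_lines) * 100
    print(f"Uncovered code: {uncovered_pct:.1f}%")   # Uncovered code: 17.6%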

Software Assurance:

  • Attends review to gain an understanding of mission
  • Record and submit RIDs (Review Item Discrepancies)/RFAs (Requests for Action) on risks or issues identified

MCR Exit/Success Criteria

  • Review panel agrees that:
    • Technical planning is sufficient to proceed to the next phase.
    • Risk and mitigation strategies have been identified and are acceptable based on technical risk assessments.
    • Cost and schedule estimates are credible.
    • Mission goals and objectives are clearly defined and stated, unambiguous, and internally consistent.
    • Conceptual system design meets mission requirements, and the various system elements are compatible.
    • Technology dependencies are understood, and alternative strategies for the achievement of requirements are understood.
  • As applicable, agreement is reached that:
    • Preliminary mission requirements are traceable to science objectives.
    • The operations concept clearly supports the achievement of science objectives.

Software Assurance:

  • Has gained an understanding of mission goals, objectives, preliminary requirements, and operations concept
  • Confirms that all issues and risks have been recorded
  • Agrees with the resolution of RIDs/RFAs submitted

System Requirements Review (SRR)

The SRR examines the functional and performance requirements defined for the system and the preliminary Program or Project Plan and ensures that the requirements and the selected concept will satisfy the mission. (NPR 7120.5 082)

  • If not performing a Software Requirements Review (SwRR), include SwRR criteria as part of SRR.
  • For software-only projects, the SwRR serves as the SRR.

SRR Entrance Criteria

  • Successful completion of Mission Concept Review (MCR) and responses made to all MCR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs).
  • A preliminary SRR agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair.
  • Technical products required for this review made available to participants prior to SRR.
  • System requirements captured in format for review.
  • System requirements allocated to the next lower level system (subsystems) – preliminary allocation completed.
  • System-level software functionality description completed.
  • System-level interface requirements for software documented.
  • Updated concept of operations available.
  • Updated mission requirements available, if applicable.
  • Preliminary Hazard Analysis (PHA) available.
  • Software inputs/contributions completed for:
    • Baselined Systems Engineering Management Plan (SEMP).
    • Preliminary Project Plan.
    • System safety and mission assurance plan, including software classification.
    • Risk management plan.
      • Updated risk assessment and mitigations (including Probabilistic Risk Assessment (PRA) as applicable).
    • Preliminary human rating plan, if applicable.
    • Initial document tree.

Software Assurance:

  • Confirm RFAs and RIDs from MCR have been satisfactorily resolved
  • Have identified the software assurance personnel for the project and associated required training
  • Have reviewed materials available prior to milestone review for:
    • System requirements captured and allocated to the next lower level
    • System-level software functionality description
    • System-level interface requirements for software
    • Updated mission requirements and concept of operations
  • Have reviewed the trace matrix for the top-level requirements to the next lower level allocation
  • Have done a software classification and safety criticality assessment and coordinated them with the software development/management team. If the software classification was generated by software engineering, software assurance reviews the classification, confirms it, and concurs
  • Have worked with system engineering and safety personnel to develop Preliminary Hazard Analysis
  • Have provided initial contributions for the systems safety plan
  • Preliminary software assurance plan has been developed per the SA Standard
  • Have reviewed software inputs for:
    • Systems Engineering Management Plan
    • Preliminary Project Plan (or Software Project Plan if the project is software only)
    • System safety and mission assurance plan
    • Risk management plan
    • Preliminary Human Rating Plan, if applicable
  • Prior to the SRR, confirm that criteria for SRR have been met, particularly those relating to software.

SRR Items Reviewed

  • System-level requirements and preliminary allocation to the next lower level-system (subsystems).
  • System-level software functionality description.
  • System-level interface requirements for the software.
  • Concept of operations.
  • Mission requirements.
  • Preliminary Hazard Analysis (PHA).
  • Preliminary approach for how requirements will be verified and validated down to the subsystem level.
  • Risk and mitigation strategies.
  • Acquisition strategy.
  • Preliminary Software and Software Assurance schedule
  • Software cost estimate
  • The preliminary set of software and software assurance processes

Software Assurance:

  • Attend review to gain an understanding of mission including:
    • Requirements allocation to lower-level subsystems
    • Risk and mitigation strategies
    • System-level software functionality and interfaces
    • Acquisition strategy
    • Updated concept of operations
  • Record and submit RIDs/RFAs on any risks or issues, including the following:
    • Feasibility of overall project schedule
    • The preliminary allocation of system requirements to hardware, human, and software systems
    • The requirements allocation and flow down to subsystems
    • Verification approaches
    • The Concept of Operations presented to satisfy the mission requirements

SRR Exit/Success Criteria

  • Review panel agrees that:
    • Process for allocation and control of requirements throughout all levels is deemed sound; a plan is defined to complete the definition activity within schedule constraints.
    • Requirements definition is complete with respect to top-level mission and science requirements; interfaces with external entities and between major internal elements are defined.
    • Requirements allocation and flow down of key driving requirements are defined down to subsystems including hardware, software, and human.
    • Preliminary approaches have been determined for how requirements will be verified and validated down to the subsystem level.
    • Major risks have been identified and technically assessed, and viable mitigation strategies defined.
    • Requirements and selected concepts of operations will satisfy the mission.
    • System requirements, approved material solution, available product/process technology, and program resources are sufficient to proceed to the next life cycle phase.

Software Assurance:

  • Has gained an understanding of review material
  • Agrees with the resolution of RIDs/RFAs submitted; tracks them to closure
  • Agrees that plans reviewed/presented are satisfactory to continue with the development
  • Concurs with the review panel’s assessment of the review’s success
  • Confirms that all issues and risks have been recorded
  • Understands system Preliminary Hazard Analysis and the areas where software might be involved

Software Requirements Review (SwRR)

See the definition of SRR.

  • If not performing a Software Requirements Review (SwRR), include SwRR criteria as part of SRR.
  • For software-only projects, the SwRR serves as the SRR.

SwRR Entrance Criteria - General

  • Successful completion of the previous review and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs).
  • A final Software Requirements Review (SwRR) agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair.
  • Technical products for this review made available to participants prior to SwRR.
  • Peer reviews completed on technical products as needed.
  • Preliminary concept of operations available for review.

Software Assurance:

  • Should have reviewed the technical products (e.g. Software Management Plan, software requirements, and the verification and validation plans) prior to any peer reviews
  • Should have reviewed the preliminary Concept of Operations
  • Software Assurance personnel have received necessary training (both SA general and project-specific)
  • May have performed audits and/or participated in peer reviews of the above material
  • Confirm that all RIDs/RFAs from the previous review have been addressed

SwRR Entrance Criteria - Plans

  • Preliminary Software Development Plan (SDP)/Software Management Plan (SMP) updated for corresponding architectural design and test development activities, including, as appropriate, the items from the 7.18 documentation guidance.
    • Preliminary cost estimate.
    • A preliminary schedule, including milestones, exists for all software to be developed.
    • Preliminary Software Configuration Management Plan
  • Preliminary Software Quality Assurance Plan (SQAP) exists.
  • Preliminary Software Safety Plan exists (if the project has safety-critical software).
    • Overall software verification strategy.
    • Test facilities, needs, and capabilities.
    • Methodology for verifying the flowed down system requirements and acceptance criteria.
    • Test tool requirements and development plans.
  • The risk management plan updated.
  • Independent Verification and Validation (IV&V) plan available and IV&V assessment of software requirements, if reviewed

Software Assurance:

  • Have completed reviews of the preliminary plan(s), including:
    • Software development/management/assurance plans
    • Configuration management plan
    • Risk management Plan
    • Overall verification strategy
    • Processes and metrics planned for software
    • Preliminary software safety plans
  • Have reviewed the Software Requirements Mapping Matrix and confirmed SA TA signature.
  • Have reviewed the preliminary cost estimate and schedule – looking for feasibility.
  • Have completed SA requirements mapping matrix for SA requirements
  • Have completed the preliminary Software Assurance Plan. See topic 7.18 in the Software Engineering Handbook for document contents of the Software Assurance Plan.
  • Have a preliminary software and software assurance schedule
  • Identify safety-critical software components by completing a preliminary software hazard analysis

SwRR Entrance Criteria - Requirements

  • The preliminary allocation of system requirements to software available
  • Preliminary software requirements (SRS) including, as appropriate, the items from the 7.18 documentation guidance, and the following: 
    • Block diagram exists for the major software components in each functional area, their interfaces and data flows
    • Relevant software operational modes defined (e.g., nominal, critical, contingency).
    • Critical and/or controversial requirements identified, including safety-critical requirements, open issues, and areas of concern.
    • Requirements identified that need clarification or additional information.
  • Performance requirements for the software identified.
    • A description of critical timing relationships and constraints exists.
  • Software requirements and interface requirements have been analyzed and specified.
  • Quality assurance assessment of the requirements completed and ready for review.
  • Bidirectional traceability matrix (a traceability-check sketch follows this list).
    • Requirements traced to higher-level requirements.
    • Includes identification of verification methodology (e.g., test, demonstration, analysis, inspection).
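
Bidirectional traceability of the kind called for above can be checked mechanically. A minimal sketch, using made-up requirement IDs and a simple dictionary representation (not a prescribed format), that flags orphan software requirements, system requirements with no flow-down, and requirements lacking a verification method:

    # Illustrative only: check bidirectional traceability and verification coverage.
    # Maps child (software) requirements to their parent (system) requirements.
    child_to_parent = {
        "SWE-001": ["SYS-010"],
        "SWE-002": ["SYS-010", "SYS-012"],
        "SWE-003": [],                     # orphan: traces to no parent
    }
    # Verification method assigned to each software requirement.
    verification = {"SWE-001": "test", "SWE-002": "analysis"}

    system_reqs = {"SYS-010", "SYS-011", "SYS-012"}

    orphans = [r for r, parents in child_to_parent.items() if not parents]
    traced_parents = {p for parents in child_to_parent.values() for p in parents}
    childless = system_reqs - traced_parents          # parents with no flow-down
    unverified = [r for r in child_to_parent if r not in verification]

    print("Orphan software requirements:", orphans)                       # ['SWE-003']
    print("System requirements with no flow-down:", sorted(childless))    # ['SYS-011']
    print("Requirements without a verification method:", unverified)      # ['SWE-003']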

Software Assurance:

  • Software Assurance analysis has been completed on the software requirements (including analysis of traceability).
    • SA performs requirements analyses and prepares results for the milestone review
    • SA participates in software requirements peer reviews and prepares feedback
    • SA reviews the bidirectional traceability
    • SA safety analysis of the software safety requirements has been completed.
  • Results of SA requirements analyses and peer reviews are ready for review, including any risks or issues found.
  • Updates to (or generation of) hazard analysis reports have been made for detailed requirements, where needed
  • SA has reported (or is ready to report) the results/findings of any audits performed by SA
  • Have verified that the entrance criteria for the SwRR have been met

SwRR Entrance Criteria - Design and Analysis

  • Preliminary high-level software architecture defined.
  • Report on current computer resource estimates and margins (memory, bus, throughput) available for review.
  • Design constraints documented.
  • Design drivers exist:
    • Explanation of design drivers and preliminary investigations made during the requirements process to determine the reasonableness of the requirements, including preliminary decisions regarding software architecture, operating systems, reuse of existing software, and selection of commercial-off-the-shelf (COTS) components.
    • Resource goals and preliminary sizing estimates (including timing and database storage) in the context of available hardware allocations; strategies for measuring and tracking resource utilization.
  • Review completed for the technical and economic feasibility of allocation of functions at the (sub)system level to hardware, firmware, and software.
  • Software-related trade-off and design decisions completed and reviewed or preliminary results available, as applicable.
  • Software Interface requirements defined.
  • Preliminary Hazard Analysis (PHA)/Software Assurance records of the Software Classification, and Software Safety Criticality available for review.

Software Assurance:

  • Confirms that a high-level software architecture exists and is reasonable to satisfy requirements
  • Confirms that early software planning has considered:
    • Documentation of design constraints and drivers
    • Knowledge of resource estimates and margins
    • Estimates of software size and timing requirements
    • Software interface requirements
    • Software trade-off and feasibility studies
  • Confirms that any long-lead procurement items have been selected and procurement started
  • Confirms that all applicable entry criteria for the review have been met
  • Has independently performed software classification or concurred with project determination
  • Has reviewed or participated in producing preliminary hazard analysis
  • Has independently performed determination of safety-critical software
  • Has confirmed other items listed above exist and has reviewed them in preparation for the review

Note: In many projects these design items are not done until after the SwRR, at the beginning of the preliminary design period.

SwRR Items Reviewed

  • Concept of operations.
  • Preliminary system requirements allocation to the software.
  • All software requirements, including those in the SRS, the interface documents, and the system requirements.
  • Risk management plan.
  • Preliminary software verification and validation (V&V) planning.
  • Software quality assurance (QA) plan.
  • Preliminary Hazard Analysis (PHA), software classification, litmus test results.
  • Design: constraints, strategy, trade-off decisions.
  • Results of technical and economic feasibility review and associated analyses.
  • Bidirectional traceability matrix.
  • Independent verification and validation (IV&V) plan and assessment of software requirements.
  • QA assessment of requirements.
  • Computer resource estimates and margins.
  • Software Configuration Management (CM) plan.
  • Software Development Plan (SDP)/Software Management Plan (SMP).
  • Software concept of operations.
  • Peer review results.

Software Assurance:

  • Review materials prior to review, attend review
  • Be prepared to present the SA assessment of the requirements and progress so far on the project. (Include results of any audits/analysis done). If this information is not presented at the review, it should be made available to the project manager.
  • Submit RIDS/RFAs on any identified issues or areas of risk
  • Implement any RIDs/RFAs submitted on SA Plan

SwRR Exit/Success Criteria

  • Review panel agrees that plans and requirements are satisfactory and ready to proceed to the design phase:
    • Software requirements determined to be clear, complete, consistent, feasible, traceable, and testable.
    • Software Management Plan (SMP), software requirements, interface requirements, and verification and validation (V&V) planning form an adequate and feasible basis for architectural design activities and are approved, baselined, and placed under configuration management (CM).
    • Requirements and performance requirements defined, testable, and consistent with cost, schedule, risk, technology readiness, and other constraints.
    • System requirements, approved material solution, available product/process technology, and program resources form a satisfactory basis for proceeding into the development phase.
    • Milestones are verifiable and achievable.
    • Initial computer resource estimates are within margin limits; if not, plans for control of resource margins are deemed adequate to meet margins by preliminary design review (PDR).
  • All Software Requirements Review (SwRR) Review Item Discrepancies (RIDs) and actions are documented with resolution plans and authorization received to proceed to software architecture design.

Software Assurance:

  • Confirm that all issues and risks, RFAs, RIDs have been recorded
  • Agree with the review panel that the project is ready to proceed into development
  • Agree with resolutions of RIDs/RFAs and consider any action plans reasonable and practical within schedule constraints. Track closure of RIDs/RFAs.
  • Review and update the SA Plan if necessary

Mission Definition Review (MDR)

The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5 082)

  • MDR is equivalent to SDR for robotic projects.

MDR Entrance Criteria

  • Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review made available to participants prior to MDR
  • The following items are updated, if applicable:
    • System requirements document
    • Concept of operations
    • Mission requirements, goals, and objectives
    • Risk management plan
    • Risk assessment and mitigations
    • Project software classification(s)
    • Cost and schedule data (including software)
    • Software Assurance Plan
  • Preferred system solution definition exists, including major trades and options
  • Logistics documentation exists (e.g., preliminary maintenance plan)
  • Preliminary system safety analysis available
  • System requirements traced to mission goals and objectives and to the concept of operations
  • Preliminary Human Rating Plan exists, if applicable

Software Assurance:

  • Confirm that all RIDs/RFAs from SwRR have been resolved
  • Coordinate with the project on any changes in software classification and software criticality
  • Review or participate in system preliminary safety analysis
  • Review updates to requirements, concept of operations, and plans; verify changes are in alignment
  • Complete contributions to system safety and mission assurance plans
  • Confirm that the entrance criteria are met
  • Confirm system requirements are traced to mission goals and objectives and to the concept of operations
  • Be prepared to present the SA assessment of progress so far on the project, including the results of any audits/analysis done. If this information is not presented at the review, it should be made available to the project manager.
  • Review the materials submitted for the milestone review
  • Review and update the Preliminary SA Plan, as necessary

MDR Items Reviewed

  • System documentation, as applicable
    • Architecture
    • Updated system requirements document
    • System-level software functionality description
    • System requirements traceability and preliminary allocation to software
    • Preliminary system safety analysis
    • Preferred system solution definition
  • Mission requirements, goals, objectives, if applicable
  • Concept of operations, if applicable
  • SDP/SMP
  • CM plan
  • Acquisition strategy/plans
  • Cost and schedule data
  • Logistics documentation (e.g., preliminary maintenance plan)
  • Initial document tree, if applicable
  • Preliminary Human Rating Plan if applicable
  • Updated SA Plan

Software Assurance:

  • Reviews the documentation for the review; attends review
  • Summarizes or presents any SA analysis or audit results from this phase
  • Submits any RIDs/RFAs for identified issues or risks

MDR Exit/Success Criteria

  • Review panel agrees that:
    • The overall concept is reasonable, feasible, complete, responsive to the mission requirements, and is consistent with system requirements and available resources (cost, schedule, mass, and power)
    • Software design approaches and operational concepts exist and are consistent with the requirements set
    • Requirements, design approaches, and conceptual design will fulfill the mission needs within the estimated costs
    • Major risks have been identified and technically assessed, and viable mitigation strategies have been defined
    • System-level requirements are clearly and logically allocated to software

Software Assurance:

  • Confirms all risks and issues are documented
  • Confirms RFAs/RIDs for SA Plan are documented
  • Agrees with review panel that the items above have been adequately addressed
  • Agrees with resolutions or action plans for RFAs/RIDs; tracks them to closure

System Definition Review (SDR)

The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5 082)

  • MDR is equivalent to SDR for robotic projects.

SDR Entrance Criteria

  • Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review made available to participants prior to SDR
  • Preferred software solution defined including major tradeoffs and options
  • Baselined documentation updated, as required
  • Preliminary functional baseline (with supporting trade-off analyses and data) available
  • Preliminary system software functional requirements available
  • As applicable, risk management plan updated (could be part of SDP/SMP)
    • Updated software risk assessment and mitigations (including Probabilistic Risk Assessment (PRA), as applicable)
  • As applicable, SDP/SMP updated
    • Updated technology development, maturity, and assessment plan
    • Updated cost and schedule data
    • Work Breakdown Structure
  • Preliminary software safety analysis available for review
  • Project software data dictionary available
  • Preliminary project software assurance plan available
  • As applicable, updated project software maintenance plan available
  • Software inputs/contributions completed for:
    • Flow down of system requirements to all software functional elements of the system
    • Requirements process
    • Technical approach

Software Assurance:

  • Confirm that all RIDs/RFAs from SwRR have been resolved
  • Coordinate with the project on any changes in software classification
  • Review or participate in system preliminary safety analysis
  • Confirm that:
    • The preferred software solution is defined, including major tradeoffs and options
    • The functional baseline is available; system software functional requirements exist and have been flowed down into software functional elements of the system
    • The requirements process is in place
    • SDP/SMP has been updated (as applicable), including risk management, cost, schedule, work breakdown structure, technical management, and plans for monitoring performance
  • Confirm other baselined documentation has been updated as needed; check for alignment of documentation
  • Complete contributions to system safety and mission assurance plans
  • Complete the software assurance plan
  • Confirm that the entrance criteria are met
  • Review materials submitted for the review

SDR Items Reviewed

  • System architecture, including software
  • Preferred software solution with tradeoffs and options
  • Preliminary functional baseline
  • Preliminary system software functional requirements
  • The risk management plan, as applicable
  • SDP/SMP, as applicable
  • Preliminary software verification and validation (V&V) planning, as applicable
  • Software requirements documents
  • Interface requirements documents, including SW
  • Technical resource utilization estimates and margins
  • Software safety analysis
  • Software data dictionary
  • Software configuration management (CM) plan
  • Software quality assurance (QA) plan

Software Assurance:

  • Reviews materials for review; attend the review
  • Summarizes or presents any SA analysis or audit results from this phase
  • Submits any RIDs/RFAs for identified issues or risks

SDR Exit/Success Criteria

  • Review panel agrees that:
    • Software requirements, including mission success criteria and any sponsor-imposed constraints, are defined and form the basis for the proposed conceptual design
    • System-level requirements are flowed down to software
    • All software technical requirements are allocated, and flow down to subsystems is adequate; requirements are verifiable and traceable to their corresponding system-level requirement; requirements, design approaches, and conceptual design will fulfill the mission needs consistent with the available resources (cost, schedule, throughput, and sizing)
    • Technical plans have been updated, as necessary, including a risk management plan, SDP/SMP, V&V planning, software maintenance plan, QA plan, CM plan
    • Tradeoffs are completed, and those planned for Phase B adequately address the option space
    • Adequate planning exists for the development of any enabling new technology
    • Significant development, mission, and safety risks are identified and technically assessed, and a process and resources exist to manage the risks
    • Operations concept is consistent with the proposed design concept(s) and in alignment with the mission requirements
    • The requisite level of detail and resources are available to support the acquisition and development plan within existing constraints
    • All software subsystem requirements are traceable to mission objectives, the concept of operations, or interface requirements
    • Monitoring processes/practices are in place to create a software subsystem within planned technical, schedule, cost, effort, and quality capabilities
  • Preliminary verification approaches are agreed upon

Software Assurance:

  • Confirm all risks and issues are documented
  • Confirm RIDs/RFAs for the SA Plan are implemented
  • Agree with the review panel that the items above have been adequately addressed
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure

Preliminary Design Review (PDR)

The PDR demonstrates that the preliminary design meets all system requirements with acceptable risk and within the cost and schedule constraints and establishes the basis for proceeding with detailed design. It shows that the correct design option has been selected, interfaces have been identified, and verification methods have been described. Full baseline cost and schedules, as well as risk assessments, management systems, and metrics are presented. (NPR 7120.5 082)

PDR Entrance Criteria - General

  • Successful completion of the SDR or MDR and responses made to all SDR or MDR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review made available to participants prior to PDR
  • Baselined documentation updated, as required
  • Risk assessment and mitigation updated
  • Cost and schedule data baselined
  • Peer reviews completed: SRS, software architectural design (if identified for SW peer review/inspection in SW development plans), integration test plans
  • Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)

Software Assurance:

  • Confirm that RFAs and RIDs from the previous review have been resolved and any resulting actions are complete
  • Participate in or review materials for peer reviews
  • Confirm other general entrance criteria have been completed

PDR Entrance Criteria - Plans

  • Applicable technical plans (e.g., technical performance measurement plan, payload-to-carrier integration plan, producibility/manufacturability program plan, reliability program plan, baselined quality assurance plan) available
  • SDP/SMP updated, if appropriate
    • Work Breakdown Structure
    • For the corresponding detailed design activities
  • Metrics established and gathered to measure software development progress (a planned-versus-actual sketch follows this list)
  • Logistics documentation (e.g., maintenance plan) updated, as required
  • Configuration management plan baselined
  • Configuration Control Board established for software (and change control procedures working)
  • Procedures and tools developed for mechanizing management and configuration management plans
  • Supplier documentation available for review, if applicable:
    • Software Data Dictionary(s)
    • Software Classification(s)
    • SDP/SMP [with verification and validation (V&V) separate]
    • Software configuration management plan(s)
    • Software assurance plan(s)
    • Software maintenance plan(s)
  • Test tools and facility requirements identified with plans and actions to ensure their availability when needed
  • Development environment ready (e.g., hardware diagram, operating system(s), compilers, DBMS, tools)
    • Developmental tools and facility requirements identified and plans made and actions taken to ensure their availability when needed
    • Tools needed for software implementation completed, qualified, installed and accepted, and the team trained in their use
    • Facilities for software implementation in place, operating, ready for use
  • Software quality assurance group formed and contributing as a team member to the design and test activities
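
Progress measures such as those above are commonly reported as planned-versus-actual variances. A minimal sketch, with invented metric names, numbers, and a 10% flag threshold purely for illustration:

    # Illustrative only: planned-vs-actual variance for progress metrics.
    metrics = {
        # metric: (planned, actual) -- example values only
        "size_ksloc":   (120.0, 131.5),
        "effort_pm":    (48.0, 54.0),      # person-months
        "defects_open": (25, 40),
    }

    for name, (planned, actual) in metrics.items():
        variance_pct = (actual - planned) / planned * 100
        flag = "  <-- investigate" if abs(variance_pct) > 10 else ""
        print(f"{name}: planned={planned}, actual={actual}, "
              f"variance={variance_pct:+.1f}%{flag}")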

Software Assurance:

  • Review/Assess plans including updated SDP/SMP, configuration plans, risk management plans, V&V plans, maintenance plans; identify any issues or inconsistencies
  • Confirm issues and/or inconsistencies in software plans are addressed
  • Confirm that development tools, computers, facilities for implementation are in place for development
  • Confirm tools for testing and facilities are planned and will be in place when needed
  • Confirm implementation team is trained on tools, and development environment and is familiar with the requirements and their allocation to components
  • Confirm configuration management processes and tools are in place
  • Conduct software assurance as an independent contributor to the quality of the design and test
  • Confirm other plan entrance criteria are in place and review items are available

PDR Entrance Criteria - Requirements

  • Preliminary traceability matrix to CSCI (Computer Software Configuration Item) level exists, including V&V trace
    • Safety-critical requirements highlighted
    • Requirements allocated to components of the architecture (to CSCI level)
  • SRS (baselined after SwRR/SRR) updated, if appropriate

Software Assurance:

  • Assess the requirements traceability to the CSCI level, to the lower level software elements, and to the V&V tests, if available
  • Confirm that the requirements that flow from the hazard analysis reports are tracked and included in the architectural design
  • Confirm that all safety requirements, performance requirements, security requirements, and derived requirements, as well as requirements to be satisfied by Off-the-Shelf (OTS) are documented in the requirements baseline and included in the architectural design
  • Confirm that the Software Requirements Specification has been baselined, review updates if necessary

PDR Entrance Criteria - Design

  • Applicable standards available to the review team
  • Preliminary interface control documents available for review
  • Technical resource utilization estimates and margins ready for review (a margin-calculation sketch follows this list)
    • Storage or memory resource allocations developed, assigning those resources to each software segment in the architecture
  • Design Solutions, analysis, decision, and rationale documented
    • Inherited capabilities identified and compatible with the designs
  • Security and supportability requirements factored into the design
  • Trade studies completed
    • Addressing COTS (Commercial Off The Shelf), reuse, etc.
    • Trade-off analysis and data supporting design, as required
    • Alternative design solutions and selection criteria
  • Results of prototyping factored into the architectural design
  • Preliminary Software Design Document (SDD) including, as appropriate, the items from the 7.18 documentation guidance
  • Results available from evaluations of prototype software, if necessary to evaluate design
  • Human engineering aspects of design addressed with solutions acceptable to potential users
  • SDD and traceability matrix review by test team completed and SDD updated as needed
  • Critical components identified and trial coding scheduled
  • Confirmation exists that
    • The test group participated in requirements and design analysis
    • Interdisciplinary teams are working design issues that cross (sub)system component boundaries (software, hardware, etc.)
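
Resource margins of the kind called for above are typically computed against the hardware allocation for each resource. A minimal sketch, assuming per-resource allocations and current estimates are known (the resource names and numbers are illustrative only; margin thresholds are project-specific and not prescribed here):

    # Illustrative only: remaining margin for each technical resource.
    resources = {
        # resource: (allocation, current_estimate) -- example values only
        "memory_mb": (512.0, 310.0),
        "cpu_pct":   (100.0, 62.0),
        "bus_kbps":  (1000.0, 725.0),
    }

    for name, (allocation, estimate) in resources.items():
        margin_pct = (allocation - estimate) / allocation * 100
        print(f"{name}: estimate={estimate}, allocation={allocation}, "
              f"margin={margin_pct:.1f}%")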

Software Assurance:

  • Review or participate in design team meetings and peer reviews
  • Confirm that Preliminary Interface Control Documents, Preliminary Software Design Document, and a completed definition of the software architecture and preliminary database design description are available
  • Review architectural and design documentation and confirm that the following have been considered:
    • Technical resource utilization
    • Storage or memory resource allocations for each software segment
    • Inherited capabilities
    • Design Solutions, analysis, decision, and rationale
    • Preliminary database design description, as applicable
    • Overview of software architecture, including context diagram
    • List of subsystems or major components
    • Functional allocations, descriptions of major modules, and internal interfaces
    • Security and supportability requirements
    • External interfaces and end-to-end data flow
    • Safety considerations in design elements and interfaces
    • Design verification approach/methods
    • Human engineering aspects of the design
  • Confirm the preliminary SDD has the content prescribed in the 7.18 documentation guidance for an SDD
  • Review other architectural documentation and participate in peer reviews to understand design and design decisions

PDR Entrance Criteria - Analysis

  • Safety analyses and plans baselined:
    • Matrix showing each subsystem/task/component's software classification (per NPR 7150.2), its safety classification (per NASA-STD-8739.8A), the rationale for the classifications, and the status of the classifications' approval by Software Assurance and management (a completeness-check sketch follows this list)
    • Updated PHA, with software safety criticality determined, if necessary
    • Approved SMP/PHA/Software Assurance Classification
  • Analyses for the following completed, as appropriate:
    • Partitioning analysis (modularity)
    • Executive control and Start/Recovery
    • Control and Data flow analysis
    • Operability
    • Preliminary failure modes and effects analyses
  • Operational Concepts revised, as applicable, and baselined
    • Normal operations scenarios
    • Fault detection, isolation and recovery (FDIR) strategy
    • Hazard reduction strategies
  • Status of change requests available for review
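
The classification matrix described above lends itself to a simple completeness check. A minimal sketch using invented component names and a record structure that mirrors the fields in the bullet above (nothing here is a prescribed format):

    # Illustrative only: confirm every component has approved classifications.
    components = [
        # (name, sw class per NPR 7150.2, safety classification,
        #  rationale documented?, approved by SA and management?)
        ("guidance_task",  "B", "safety-critical",     True,  True),
        ("telemetry_task", "C", "not safety-critical", True,  False),
        ("ui_console",     "D", None,                  False, False),
    ]

    for name, sw_class, safety, has_rationale, approved in components:
        problems = []
        if safety is None:
            problems.append("no safety classification")
        if not has_rationale:
            problems.append("missing rationale")
        if not approved:
            problems.append("approval pending")
        print(f"{name} (Class {sw_class}):",
              "; ".join(problems) if problems else "complete")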

Software Assurance:

  • Perform a requirements analysis (see the requirements analysis description (SAANALYSIS) in Section 7.18: Documentation Guidance of this Handbook)
  • Perform a preliminary design analysis (see the description in Section 7.18: Documentation Guidance of this Handbook)
  • Review the software classifications and safety criticality determinations and update as necessary
  • Review the updated Preliminary Hazard Analysis and Hazard Reports to identify any new safety-critical software and requirements
  • The safety-related analysis may include Fault Tree Analysis (FTA), or Preliminary Failure Modes and Effects Analysis (See Topics 8.7 and 8.5, respectively, in SA tab of Software Topics in this Handbook)
  • Be prepared to report on the audits and analyses performed during the preliminary design phase, either at the review or in a report to project management
  • Confirm that all the input criteria for the review have been completed

PDR Items Reviewed

  • Risk assessment and mitigation
  • Safety and assurance analysis and plans
  • Cost and schedule data
  • Logistics documentation (e.g., maintenance plan)
  • Technical plans (e.g., QA plan, performance measurement plan)
  • Interface control documents
  • Software V&V planning
  • Resource utilization estimates and margins
  • SDP/SMP
  • Bidirectional traceability matrix
  • Software design documents
  • Supplier documentation
  • Requirements documents
  • Concept of operations
  • Trade studies
  • Documented solutions, analysis, decisions and rationale
  • Completed analyses
  • Prototype software, if applicable
  • Plans for development and test tools and facilities
  • Software development progress metrics
  • CM plan
  • Peer review results/proof of completion
  • Status of change requests

Software Assurance:

  • Reviews material for review; attends review
  • Summarizes or presents any SA analysis or audit results from the preliminary design phase
  • Submits any RIDs/RFAs for identified issues or risks

PDR Exit/Success Criteria

  • Top-level requirements including mission success criteria, Technical Performance Measures (TPMs), and any sponsor-imposed constraints are agreed upon, finalized, stated clearly, and consistent with preliminary design
  • Review panel agrees that:
    • Flow down of verifiable requirements is complete and proper or, if not, an adequate plan exists for timely resolution of open items; requirements are traceable to mission goals and objectives
    • All supplier software requirements are verifiable
    • Preliminary design is expected to meet the functional and performance requirements at an acceptable level of risk
    • Definition of technical interfaces is consistent with overall technical maturity and provides an acceptable level of risk
    • Adequate technical margins exist with respect to TPMs
    • Any required new technology has been developed to an adequate state of readiness, or back-up options exist and are supported to make them a viable alternative
    • Project risks are understood and credibly assessed; plans, process, and resources exist to effectively manage them
    • The operational concept is technically sound, includes (where appropriate) human factors, and includes flow down of requirements for its execution
    • The proposed design approach has sufficient maturity to proceed to final design
    • Subsystem requirements, subsystem preliminary design, results of peer reviews, and plans for development, testing and evaluation form a satisfactory basis for proceeding into detailed design and test procedure development
    • SMP, the software architectural design, and integration test plans adequate and feasible to support software detailed design
  • All RIDs/actions are completed or have closure plans and customer approval received to proceed to the detailed design phase
  • Products from this review are approved, baselined and placed under configuration management
  • Approval received for software inputs/contributions:
    • Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) is adequately addressed in preliminary designs and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis, and failure modes and effects analysis)
    • Management processes used by the mission team are sufficient to develop and operate the mission
    • Cost estimates and schedules indicate that the mission will be ready to launch and operate on time and within budget and that the control processes are adequate to ensure remaining within allocated resources

Software Assurance: 

  • Baseline Software Assurance Plan
  • Baseline NASA-STD-8739.8 compliance matrix
  • Confirm NPR 7150.2 compliance matrix is complete and approved
  • Confirm that cost, schedule, Software CM Plan, Software Requirements Specification, Software Design Description, and Operations Concept are baselined
  • Agree with the review panel that the items above have been adequately addressed
  • Confirm that all issues, risks, RFAs/RIDs are documented
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure

Critical Design Review (CDR)

The CDR demonstrates that the maturity of the design is appropriate to support proceeding with full-scale fabrication, assembly, integration, and test and that the technical effort is on track to complete the flight and ground system development and mission operations in order to meet mission performance requirements within the identified cost and schedule constraints. Progress against management plans, budget, and schedule, as well as risk assessments are presented. (NPR 7120.5 082)

CDR Entrance Criteria - General

  • Successful completion of the previous review (typically PDR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review made available to participants prior to CDR
  • Baselined documents updated, as required
  • Peer reviews for software and rework accomplished, as defined in the software and/or project plans
  • NPR 7150.2 compliance matrix baselined
  • Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)

Software Assurance:

  • Confirm NPR 7150.2 compliance matrix is approved and baselined
  • Confirm that any lessons learned to date have been added to the LL database
  • Attend or review any software peer reviews of design material
  • Confirm that all RFAs/RIDs from PDR have been successfully completed

CDR Entrance Criteria - Plans

  • Previously baselined software documentation updated as appropriate
    • Updated cost and schedule data
  • Progress against software management plans available for review
  • Software requirements, management process, including documents used and produced, and verification and validation (V&V) planning updated and baselined
  • Staffing-up problems being addressed, contingency plans in place
  • Independent verification and validation (IV&V) plans and status available for review, if applicable
  • Management procedures and tools for measuring and reporting progress available and working
  • Software measurements available for review (on planned and actual regarding product size, cost, schedule, effort, and defect)
  • Procedures established and working for software quality assurance, with quality an integral part of the product being produced
  • The implementation process exists, including standards, review process, problem reporting, unit test, and integration
  • Supplier documentation available for review, if applicable:
    • Software Design Description(s)
    • Interface Design Description(s)
    • Updated Supplier Software V&V Plan(s)
    • Preliminary Supplier Software Test Procedure(s)
    • Systems and subsystem certification plans and requirements exist (as needed)
  • Changes since PDR available for review:
    • Updated product assurance and software safety plans and activities
    • System safety analysis with associated verifications
    • Status of configuration management processes since PDR available, including discrepancy reporting and tracking (development and post-release)
  • Build plan exists (a minimal build-plan sketch follows this list), including:
    • Test timeline and ordered list of components and requirements to be tested in each build ready for review
    • Test group trained and prepared to evaluate the code using their facilities and tools
  • Coding, integration, and test plans and procedures available for review
    • Test levels described (e.g., unit testing, integration testing, software system testing) – description, who executes, test environment, standards followed, verification methodologies
    • Testing preparation and execution activities planned, including testing of reused/heritage software, if applicable
    • Test environments described for each test level – diagram and description of tools, testbeds, facilities
  • Preliminary plans available for review:
    • Launch site operations plan
    • Checkout and activation plan
    • Disposal plan (including decommissioning or termination)
  • Delivery, installation, maintenance processes planned
  • Preliminary system and acceptance testing defined – operational scenarios to be tested, including stress tests and recovery testing, if applicable
  • Acceptance process exists – reviews (e.g., Acceptance Test Readiness Review, Acceptance Test Review), approval, and signoff processes
  • Acceptance criteria baselined
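
A build plan like the one described above can be captured as an ordered list of builds, each naming the components it adds and the requirements exercised, which also makes gaps easy to spot. A minimal sketch with invented identifiers:

    # Illustrative only: ordered build plan with per-build test scope.
    builds = [
        {"name": "Build 1", "components": ["exec", "fsw_io"],
         "requirements_tested": ["SWE-001", "SWE-004"]},
        {"name": "Build 2", "components": ["guidance_task"],
         "requirements_tested": ["SWE-002", "SWE-003"]},
    ]

    # Confirm every planned requirement appears in some build's test scope.
    planned = {"SWE-001", "SWE-002", "SWE-003", "SWE-004", "SWE-005"}
    tested = {r for b in builds for r in b["requirements_tested"]}
    print("Requirements not scheduled in any build:", sorted(planned - tested))
    # -> ['SWE-005']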

Software Assurance:

  • Review updates to baselined software plans (e.g. SDP, SMP, V&V plans)
  • Assess whether requirements management processes and configuration management processes are in place and working
  • Assess whether management procedures and tools for assessing progress are in place and working
  • SA measurement process has been established and is working; measures are being collected, analyzed, and reported on
  • Confirm supplier documentation exists and review as appropriate
  • Confirm that procedures and standards for implementation, review process, problem reporting, unit test, integration are in place
  • Confirm planning for testing has been addressed including test procedures, levels of testing, test team training, test environment and facility, test tools and accreditation plans, acceptance process, acceptance criteria
  • If applicable, confirm launch site operations plan, checkout and activation plan and decommissioning plans are in place
  • Update software assurance and software safety plans, if needed

CDR Entrance Criteria - Requirements

  • Changes to IT security requirements since PDR available for review (Mission-specific)
  • SRS updated to the Computer Software Unit (CSU) level
  • Traceability matrix updated (to CSU level)
  • Verification exists that detailed designs cover the requirements

Software Assurance:

  • Confirm any changes to IT security requirements are available; review changes
  • Assess the updates to the Software Requirements Specification
  • Assess the traceability matrix updates: Be sure SRS updates are included
  • Verify that the detailed designs cover the requirements

CDR Entrance Criteria - Design

  • Technical data package (e.g., integrated schematics, spares provisioning list, interface control documents, engineering analyses, and specifications) available for review
  • Design process exists, including methodology and standards used, design documentation produced, inspections and reviews
  • Software design document(s) baselined (including interface design documents, detailed design, and unit test)
  • Command and telemetry list available for review
  • The final design solution, evaluation, and rationale available
    • Reused/heritage software or functionality from previous projects; necessary modifications
  • Final architecture definition available
  • Subsystem/component context diagram available
  • Data flow diagrams available
  • Software subsystem design diagram available (e.g., Level 0 data flow diagram or Unified Modeling Language (UML))
  • For each task in the software subsystem design diagram
    • Design diagrams for the task
    • Description of functionality and operational modes
    • Safety considerations addressed in the design
  • Resource and utilization constraints (e.g., CPU, memory); how the software will adapt to changing margin constraints; performance estimates
  • Data storage concepts and structures
  • Input and output data and formats identified
  • Interrupts and/or exception handling available, including event, FDC, and error messages
  • IT Security features (design features) identified
  • A detailed description of software operation and flow exists
  • Operational limits and constraints identified
  • Technical resource utilization estimates and margins updated
    • Detailed timing and storage allocation compiled
  • Algorithms exist sufficient to satisfy their requirements
  • Failure detection and correction (FDC) requirements, approach, and detailed design available for review
  • Trial code analyzed and designs modified accordingly
  • Designs comprising the software completed, peer-reviewed and placed under change control

Software Assurance:

  • Review/Assess all design diagrams and design documentation - Confirm existence and completeness (see next section for design analysis)
  • Attend design reviews
  • Confirm safety and security considerations are included in the design
  • Update hazard analysis reports and safety plan, based on detailed design
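
For illustration only, a minimal sketch of one common FDC pattern referenced in the entrance criteria above: a limit check with a persistence counter so transients do not trip the response. The monitored parameter, limit, persistence value, and corrective response are all hypothetical, not a prescribed design.

```python
class FdcMonitor:
    """Limit check with a persistence counter to filter transients."""

    def __init__(self, limit, persistence):
        self.limit = limit              # detection threshold
        self.persistence = persistence  # consecutive violations before acting
        self.count = 0

    def sample(self, value):
        """Return True when the corrective response should be commanded."""
        self.count = self.count + 1 if value > self.limit else 0
        return self.count >= self.persistence

# Hypothetical monitored parameter and readings
battery_temp = FdcMonitor(limit=45.0, persistence=3)
for reading in [44.0, 46.1, 46.3, 47.0]:
    if battery_temp.sample(reading):
        print(f"FDC trip at {reading} degC: log event message, command safe response")
```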

 CDR Entrance Criteria - Analysis

  • Analyses completed:
    • Algorithm accuracy
    • Critical timing and sequence control
    • Undesired event handling
    • Operability
    • Failure modes and effects analyses
  • Final status and results of analyses ready for review
  • Hazard analysis / Software Assurance Classification updated, if necessary
  • Subsystem-level and preliminary operations safety analyses exist
  • Risk assessment and mitigation updated
  • Reliability analyses and assessments updated
  • Operational Concepts updated
  • Product build-to specifications exist for each hardware and software configuration item, along with supporting trade-off analyses and data
  • Status of change requests available for review

Software Assurance:

  • Review and update risks
  • Perform design analysis (SADESIGN) and prepare results for reporting (See Section 7.18: Documentation Guidance in this Handbook)
  • Perform safety analysis (See Software Safety Analysis Topic in SA Tab of Topics in this Handbook) and assist with security analysis; prepare results for reporting
  • Update safety criticality determinations, software classifications, if necessary
  • Perform failure modes and effects analyses (See Topic 8.5 in SA tab of Topics in this Handbook), if necessary
  • Review updated documentation and analyses performed by software engineering

 CDR Entrance Criteria - Other

  • Software requirement verification recording, monitoring, and current status available for review – databases and test reports; sample test verification matrix
  • Preliminary operations handbook created
  • Programmer's manual drafted
  • User's manual drafted

Software Assurance:       

  • Confirm draft manuals above exist, if planned

CDR Items Reviewed

  • Baselined documents
  • Technical data package
  • SDP/SMP
  • Progress against software management plans
  • Plan and status for reviews
  • Documentation plan
  • NPR 7150.2 compliance matrix
  • Design and implementation processes
  • Status of management procedures and tools
  • Software measurements
  • Logistics documentation (e.g., maintenance plan)
  • Status of any staffing problems
  • Software design document(s)
  • Command and telemetry list
  • Final Design Solution, Evaluation, and Rationale
  • Final Architecture Definition
  • Software subsystem design diagram
  • Data flow diagrams
  • Identification and formats of input and output data
  • Interrupts and/or exception handling, including event, FDC, and error messages
  • IT Security requirements and features
  • A detailed description of software operation and flow
  • Operational limits and constraints
  • Technical resource utilization estimates and margins
  • Status and results of analyses
  • Algorithms sufficient to satisfy their requirements
  • Failure detection and correction (FDC) requirements, approach, and detailed design
  • Subsystem/component context diagram
  • Status of trial code analysis and design
  • Supplier documentation
  • Status of SW designs and requirements coverage verification
  • SRS
  • Bidirectional Traceability Matrix
  • Status of software QA and safety plans, procedures, activities
  • Hazard analysis / Software Assurance Classification Report (SACR), if necessary
  • Risk assessment and mitigation
  • Reliability analyses and assessments
  • IV&V plans and status
  • Systems and subsystem certification plans and requirements (as needed)
  • CM processes
  • Status of the development environment and personnel training
  • Build plan
  • Product build-to specifications along with supporting trade-off analyses and data
  • Coding, integration, and test plans and procedures
  • V&V planning
  • Build test timeline and ordered list of components and requirements to be tested in each build
  • Launch site operations plan
  • Checkout and activation plan
  • Disposal plan
  • Preliminary Operations Handbook
  • Draft of Programmer's Manual
  • Draft of User's Manual
  • Status of change requests

Software Assurance:          

  • Review the CDR review materials and updates
  • Be prepared to present SA status and the results of SA analyses, audits, SA metrics, and analysis of software metrics; SA should be able to provide a general "state of the software project" assessment
  • Submit RFAs/RIDs on any identified issues, risks

CDR Exit/Success Criteria

  • Review panel agrees that:
    • All supplier software requirements have been mapped to the software design
    • All elements of the design are compliant with functional and performance requirements (detailed design is expected to meet requirements with adequate margins at acceptable level of risk)
    • Interface control documents are sufficiently matured to proceed with fabrication, assembly, integration, and test, and plans are in place to manage any open items
    • Product verification and product validation requirements and plans are complete; verification approach is viable, and will confirm compliance with all requirements
    • Management processes used by the project team are sufficient to develop and operate the mission
    • Testing approach is comprehensive, and planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into next phase
    • Adequate technical and programmatic margins and resources exist to complete development within budget, schedule, and risk constraints
    • Risks to mission success are understood and credibly assessed, and plans and resources exist to effectively manage them
    • SDP/SMP, software detailed designs, and unit test plans are an adequate and feasible basis for the implementation and test activities
    • High confidence exists in the product baseline, and adequate documentation exists or will exist in a timely manner to allow proceeding with coding, integration, and test
  • Approval received for software inputs / contributions:
    • Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) have been adequately addressed in system and operational designs, and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis and failure modes and effects analysis) have been approved
  • High priority RIDs against the SDD are closed/actions are completed and customer approval received to proceed to next phase
  • Approved readiness to proceed with software implementation and test activities
  • Products from this review are approved, baselined, and placed under configuration management

Software Assurance:

  • Confirm that all issues, risks, RFAs/RIDs are documented
  • Agree with the review team that the exit criteria listed above have been met and the project is ready to move into the next phase of development
    • Agree that the review panel items above have been adequately addressed
    • Agree with resolutions or action plans for RFAs/RIDs; track them to closure

Production Readiness Review (PRR)

The PRR is held for projects developing or acquiring multiple similar or identical flight and/or ground support systems. The purpose of the PRR is to determine the readiness of the system developer(s) to efficiently produce (build, integrate, test, and launch) the required number of systems. The PRR also evaluates how well the production plans address the system's operational support requirements. (NPR 7120.5 082)

Jump to: Entrance Criteria | Items Reviewed | Exit/Success Criteria

 PRR Entrance Criteria

  • Significant production engineering problems encountered during development are resolved
  • Design documentation is adequate to support production
  • Production plans and preparation are adequate to begin fabrication
  • Production-enabling products and adequate resources are available, allocated, and ready to support end product production
  • Production risks and mitigations identified
  • Schedule reflects production activities

Software Assurance:

  • Confirm that production engineering problems encountered during development are resolved and any production risks and mitigations have been identified
  • Confirm that design documentation, production plans, and preparation are adequate to support production
  • Confirm all necessary resources are available and ready to support production
  • Confirm the schedule is reasonable for production

PRR Items Reviewed

  • Design documentation
  • Production plans and preparation
  • Production risks and mitigations
  • Schedule

Software Assurance:         

  • Review all materials provided for the review
  • Submit RFAs/RIDs on any identified issues, risks

PRR Exit/Success Criteria

  • Review panel agrees that:
    • System requirements are fully met in the final production configuration
    • Adequate measures are in place to support production
    • Design-for-manufacturing considerations ensure ease and efficiency of production and assembly
    • Risks are identified, credibly assessed, and characterized, and mitigation efforts defined
    • Alternate sources for resources identified, as appropriate
    • Required facilities and tools are sufficient for end-product production
    • Specified special tools and test equipment are available in proper quantities
    • Production and support staff are qualified
    • Production engineering and planning are sufficiently mature for cost-effective production
    • Production processes and methods are consistent with quality requirements
    • Qualified suppliers are available for materials that are to be procured
  • Delivery schedules are verified

Software Assurance:

  • Agree with the panel that exit criteria have been met
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure
  • Confirm that all issues, risks, RFAs/RIDs are documented

System Integration Review (SIR)

The SIR evaluates the readiness of the project to start flight system assembly, test, and launch operations. V&V Planning, integration plans, and test plans are reviewed. Test articles (hardware/software), test facilities, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7120.5 082)

Jump to: Entrance Criteria | Items Reviewed | Exit/Success Criteria

 SIR Entrance Criteria

  • Integration plans and procedures completed and approved
  • Segments and/or components available for integration
  • Mechanical and electrical interfaces verified against the interface control documentation
  • All applicable functional, unit-level, subsystem and qualification testing conducted successfully
  • Integration facilities, including clean rooms, ground support equipment, electrical test equipment, and simulators ready and available
  • Support personnel adequately trained
  • Handling and safety requirements documented
  • All known system discrepancies identified and disposed of in accordance with the agreed-upon plan
  • All previous design review success criteria and key issues satisfied in accordance with the agreed-upon plan
  • Quality control organization is ready to support integration effort

Software Assurance:

  • Confirm integration plans and procedures are in place and approved
  • Confirm segments and components are ready for integration; all facilities are ready and available
  • Confirm all mechanical and electrical interfaces have been verified; confirm all applicable qualification testing has been successfully completed
  • Confirm support personnel are adequately trained
  • Assess all handling and safety requirements; identify any issues or risks
  • Confirm that all previous design review success criteria and key issues have been satisfied per the previously agreed-upon plan
  • Confirm all known system discrepancies have been identified and disposed of in accordance with the agreed-upon plan
  • Confirm software assurance/control organization is prepared to support integration

SIR Items Reviewed

  • Integration plans and procedures
  • Interface control documentation
  • Functional, unit-level, subsystem, and qualification test results/proof of completion
  • Test preparation (facilities, tools, equipment, personnel)
  • Handling and safety requirements
  • V&V planning, test plans

Software Assurance:         

  • Review all documentation; attend the review
  • Submit RFAs/RIDs for any identified risks or issues

SIR Exit/Success Criteria

  • Review panel agrees that:
    • Adequate integration plans and procedures are completed and approved for the system to be integrated
    • Previous component, subsystem, and system test results form a satisfactory basis for proceeding to integration
    • Integration procedures and workflow have been clearly defined and documented
    • Review of integration plans, as well as procedures, environment, and configuration of items to be integrated, provides a reasonable expectation that integration will proceed successfully
    • Integration personnel have received the appropriate training in integration and safety procedures
  • The risk level is identified and accepted by program/project leadership, as required

Software Assurance:

  • Agree with the panel that exit criteria have been met
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure
  • Confirm that all issues, risks, RFAs/RIDs are documented

Test Readiness Review (TRR)

The TRR ensures that the test article (hardware/software), test facility, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7123.1 041)

Jump to: Entrance Criteria - General | Entrance Criteria - Plans | Entrance Criteria - Requirements | Entrance Criteria - Design | Entrance Criteria - Analysis | Entrance Criteria - Other | Items Reviewed | Exit/Success Criteria

 TRR Entrance Criteria - General

  • All TRR-specific materials, such as test plans, test cases, procedures, and version description document available to all participants prior to TRR
  • Updated baselined documentation available (from previous reviews – SwRR, SRR, PDR, CDR)
  • Required documents are in the state/status required; any required deviations or waivers are in place and approved
  • All known system discrepancies identified and disposed of in accordance with the agreed-upon plan
  • Software cost estimate updated, and the collection and reporting of software-related expenditures by life cycle phase available
  • Test schedule updated and reasonable based on the results of unit testing
  • Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)

Software Assurance:

  • Confirm that all baselined documents from previous reviews are available
  • Confirm any waivers or deviations are approved
  • Confirm TRR-specific materials are available (e.g., test plans, test procedures, test cases, version description document, test schedule, test status, test data)
  • Confirm all known discrepancies are identified and addressed in an agreed-upon manner
  • Assess schedules for reasonableness
  • Confirm lessons learned are captured from the software areas of the project
  • Confirm all input criteria for TRR have been met

 TRR Entrance Criteria - Plans

  • Objectives of testing and testing approach clearly defined and documented and supported by:
    • Test plans, test cases, procedures, environment
    • Defined configuration of test item(s)
  • Interfaces placed under configuration management or defined in accordance with an agreed-to plan
  • All required test resources, people (including a designated test director), facilities, test articles, test instrumentation, and other test enabling products identified and available to support required tests
  • Facilities and tools for integration and test ready, qualified, validated, and available for operational use, including test engineering products (test cases, procedures, tools, etc.), testbeds, simulators, and models
  • Roles and responsibilities of all test participants defined and agreed to and all personnel have been trained
  • Software Version Description(s) available
  • Metric data and reports (implementation and test) ready for review
  • IV&V report/status - if applicable
  • Any current risks, issues, or requests for action (RFAs) that require follow-up and how they will be tracked to closure ready for review
  • Risk analysis and risk list updated and associated risk management plan updated
  • The test plan includes test scenarios:
    • User-defined scenarios to test interactive or operator-oriented software
    • Safety-critical scenarios
    • Security scenarios
    • For all software/system requirements defined in the bidirectional traceability matrix
    • Performance checks at the limit of ranges specified for the requirements and operational scenarios, including test limitations and constraints
  • Test case structure established that identifies, per test case (a minimal record sketch follows this list):
    • Software requirements to be tested and SW entities to be exercised
    • Required inputs
    • Facilities and test tools required, setup, and required qualifications
    • Limitations of the test environment
  • Test plan updated for integration and test activities
  • Software test procedures baselined, including safety criticality and security considerations:
    • Defined CM process and procedures used for testing and problem reporting/resolution activities
    • Process for capturing test data and storing it
    • Role of Quality Assurance including redlining and QA witnessing role and responsibilities
    • Any safety and security issues relevant to the testing activity
    • All workarounds and non-functioning software components
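
For illustration only, a minimal per-test-case record capturing the fields the test case structure above calls for. The field names, types, and example values are assumptions for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    requirements: list[str]   # software requirements to be tested
    entities: list[str]       # SW entities exercised
    inputs: dict              # required inputs
    facilities: list[str]     # facilities/test tools, setup, qualifications
    limitations: list[str] = field(default_factory=list)  # environment limits

# Hypothetical example record
tc = TestCase(
    case_id="TC-042",
    requirements=["SRS-101", "SRS-103"],
    entities=["CSU-GNC-01"],
    inputs={"attitude_rate_deg_s": 0.25},
    facilities=["flatsat testbed (qualification current)"],
    limitations=["no flight-like thermal environment"],
)
print(tc.case_id, "covers", ", ".join(tc.requirements))
```

Keeping test cases in a structured form like this also makes the traceability and coverage checks discussed elsewhere in this topic straightforward to automate.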

Software Assurance:

  • Confirm all plans have been updated as needed and are ready for testing, including:
    • Test Plans, Test Procedures, Test Cases with applicable data, Test Scenarios, Test Environment
    • Version Description Document
    • Any safety-critical and security considerations
  • Confirm processes and procedures are in place for testing:
    • Configuration management of test artifacts
    • Procedures used for testing; capturing test results; reporting discrepancies
  • Confirm resources needed and facility are prepared for testing
  • Have an agreed-upon plan for SA review of testing (witnessing, sampling, etc.)
  • Have assessed bidirectional traceability from requirements to test procedures/cases

 TRR Entrance Criteria - Requirements

  • All requirements included in baselined test procedure document and uniquely identified and traceable in the updated bidirectional traceability matrix (includes necessary corrections due to discrepancy reports)

Software Assurance:          

  • Confirm that every requirement has corresponding test(s) in the test procedures document, including changes due to discrepancy reports, and that all requirements are up to date in the traceability matrix (a minimal bidirectional check is sketched below)
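
For illustration only, a minimal bidirectional check under hypothetical identifiers: every requirement should trace forward to at least one test case, and every test case should trace back to at least one requirement.

```python
# Hypothetical traceability data: forward (requirement -> tests) and
# backward (test -> requirements) links from the bidirectional matrix
req_to_tests = {
    "SRS-101": ["TC-042", "TC-043"],
    "SRS-102": [],            # gap: requirement with no test
    "SRS-103": ["TC-042"],
}
test_to_reqs = {
    "TC-042": ["SRS-101", "SRS-103"],
    "TC-043": ["SRS-101"],
    "TC-099": [],             # gap: test tracing to no requirement
}

untested = sorted(r for r, tests in req_to_tests.items() if not tests)
untraced = sorted(t for t, reqs in test_to_reqs.items() if not reqs)
print("Requirements with no test:", untested)
print("Tests tracing to no requirement:", untraced)
```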

 TRR Entrance Criteria - Design

  • All previous design review success criteria and key issues satisfied in accordance with the agreed-upon plan

Software Assurance:

  • Confirms all previous review criteria have been satisfied and any key issues resolved in accordance with the agreed-upon plan

 TRR Entrance Criteria - Analysis

  • Outstanding software change requests (SCRs) ready for review
  • Code inspection results available for review
  • Results of testing completed to date available for review:
    • Objectives of tests
    • Expected results defined
    • Known problems, issues
    • Deviations, waivers

Software Assurance:

  • Confirm findings from all previous analyses have been resolved as agreed-upon
  • Participate in code reviews and walkthroughs, test plan and test procedures peer reviews
    • Confirm adherence to standards
  • Review results of testing completed to date, any outstanding software change requests
    • Confirm any problems/issues have been documented

 TRR Entrance Criteria - Other

  • Software build created from CM and ready for testing
  • Applicable functional, unit-level, subsystem, system, and qualification testing conducted successfully
  • Informal dry run completed without errors
  • Validation of operations and users manuals completed
  • Successful functional configuration audit (FCA) of the version description document (VDD) (such as FSW) including fixes
  • Tests reusable for regression testing exist
  • Databases for integration and test have been created and validated
  • Test network prepared showing interdependencies among test events and planned time deviations for these activities (see the ordering sketch after this list)
  • Verification of computations using nominal and stress data
  • Verification of performance throughout the anticipated range of operating conditions including nominal, abnormal, failure and degraded mode situations
  • Verification of end-to-end functional flows and database linkages
  • Exercise of logic switching and executive control options at least once

Software Assurance:

  • Confirm audits or assessments of documentation and processes have been performed throughout the implementation
  • Confirm software build is ready for testing and previous testing has been completed successfully
  • Conduct or participate in a functional configuration audit of the VDD
  • Confirm any necessary verification of computations has been completed (a minimal nominal/stress test sketch follows this list)
  • Confirm end-to-end functional flows and database linkages will be tested
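
For illustration only, a minimal sketch of verifying a computation with nominal inputs, inputs at the limits of its specified range, and stress inputs beyond it, in the spirit of the nominal/stress and range-limit checks called for above. The function under test and its specified range are hypothetical, and pytest is assumed as the test runner.

```python
import pytest

def commanded_rate(error_deg: float) -> float:
    """Toy control law under test: output clamped to +/- 2.0 deg/s."""
    return max(-2.0, min(2.0, 0.5 * error_deg))

@pytest.mark.parametrize("error_deg,expected", [
    (0.0, 0.0),      # nominal
    (1.0, 0.5),      # nominal
    (4.0, 2.0),      # at the specified upper limit
    (-4.0, -2.0),    # at the specified lower limit
    (1000.0, 2.0),   # stress: far beyond range, output must still clamp
])
def test_commanded_rate(error_deg, expected):
    assert commanded_rate(error_deg) == pytest.approx(expected)
```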

TRR Items Reviewed

  • Test preparation
    • Test plans, test cases, scenarios, databases, procedures, environment, expected results, and configuration of test item(s)
    • Software build ready for testing
    • Resources (people, facilities, tools, etc.)
    • Test schedule
    • Test contingency planning
    • Test network
  • Results for all testing completed to date
  • Interfaces
  • Software V&V Plan
  • VDD and VDD audit results
  • Software change requests
  • Bidirectional traceability matrix
  • Current risks, issues, or requests for action (RFAs)
  • Baselined documentation from previous reviews
  • Requirements and design
  • Status of quality assurance (QA) activities
  • Status of known system discrepancies
  • Software cost estimate and expenditures report
  • Supplier Software VDD(s)
  • Requirements Analysis and Traceability Reports
  • Code Analysis and Assessment Results
  • Metric Data and Reports
  • Operations and users manuals
  • Completed evaluations of unit and integration tests
  • Risk analysis, list, management plan

Software Assurance:          

  • Attend the review and examine the documentation related to it
  • Submit any RFAs/RIDs on identified issues or risks

TRR Exit/Success Criteria

  • Review panel agrees that:
    • Peer reviews completed for implementation and tests to be performed, as defined in the software plans
    • Adequate identification and coordination of required test resources are completed
    • Previous component, subsystem, and system test results form a satisfactory basis for proceeding into planned tests
    • All the entrance criteria have been met
    • Test cases have been reviewed and analyzed for expected results, and results are consistent with test plans and objectives
    • Test personnel have received the appropriate training in test operation and safety and security procedures
    • Provisions have been made should test levels or system response exceed established limits or if the system exceeds its expected range of response
    • Software is ready to be tested
    • Requirements, software implementations, and test plans are an adequate and feasible basis for integration and test activities
  • A formal dry test run completed
  • Adequate test plans are completed and approved to proceed for the system under test
  • The risk level associated with testing is identified (during TRR) and accepted by the appropriate program/competency leadership, as required
  • Products from this review are approved, baselined and placed under configuration management

Software Assurance:

  • Confirm any risks, issues, RFAs/RIDs are documented
  • Agree with the panel that exit criteria have been met
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan to close them
  • Track RFAs/RIDs to closure

System Acceptance Review (SAR)

The SAR verifies the completeness of the specific end item with respect to the expected maturity level and assesses compliance with stakeholder expectations. The SAR examines the system, its end items and documentation, and test data and analyses that support verification. It also ensures that the system has sufficient technical maturity to authorize its shipment to the designated operational facility or launch site. (NPR 7120.5 082)

Jump to: Entrance Criteria | Items Reviewed | Exit/Success Criteria

 SAR Entrance Criteria

  • A final agenda coordinated (nominally)
  • Technical products for this review made available to participants prior to SAR
  • Acceptance test readiness available for review
    • Process for analysis of Test Results
    • Acceptance Test testbed (environment) setup (hardware)
    • Setup and use of Simulators or other Test tools and their required qualifications
    • Limitations of the testbed (environment)
    • Tests that require hardware for verification and/or human input
    • Description, at a high level, of what each test does, how long it lasts, and any special circumstances
    • IV&V report/status - if applicable
    • Preparedness for Acceptance Testing
    • Requests For Action (RFAs)
    • The decision to proceed to Acceptance Testing
  • Results available from SARs conducted at major suppliers
  • Transition to production and/or manufacturing plan exists
  • Product verification results / final test reports available
  • Product validation results available
  • Acceptance plans and acceptance criteria
  • Documentation exists to confirm that the delivered system complies with the established acceptance criteria
  • Documentation exists to confirm that the system will perform properly in the expected operational environment
  • Technical data package updated to include all test results
  • Certification package available for review
  • Risk assessment and mitigations updated
  • Previous milestone reviews successfully completed
  • Metrics data and reports available for review
  • Remaining liens or unclosed actions and plans for closure available for review
  • Waivers and deviations available for review
  • Software build has been updated
  • Functional configuration audit (FCA) completed
  • Software presentation prepared (for SAR):
    • Software overview
    • Project System Diagram
    • Functional software overview
    • Software products/artifacts
    • Software traceability matrix examples
    • Software Test Procedures status
    • Open RIDs
    • Open SCRs
    • Software summary and recommendations
  • Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)

Software Assurance:

  • Confirm that all previous milestone reviews have been successfully completed, including SARs conducted by suppliers
  • Confirm system acceptance testing has been successfully completed, with product verification and validation results available for review
  • Conduct or participate in a functional configuration audit
  • Review metrics data analysis
  • Review any open RIDs, software change reports, issues, or unclosed actions
  • Confirm software documentation has been updated as needed and the following is available for review: project system design, functional software overview, software products, traceability matrix examples, status and results of previous testing, technical data package, open RIDs, SCRs, lessons learned
  • Review the review materials in advance

SAR Items Reviewed

  • Test readiness information
  • Results of the SARs conducted at the major suppliers
  • Transition to production and/or manufacturing plan
  • Product verification results/test reports
  • Product validation results
  • Baselined Software Build
  • Certification package, if acquiring software or using COTS/GOTS/MOTS/OpenSource
  • Documentation that the delivered system complies with the established acceptance criteria
  • Documentation that the system will perform properly in the expected operational environment
  • Technical data package
  • Risk assessment and mitigation
  • Hazard report
  • Results/proof of completion for previous milestone reviews
  • Remaining liens or unclosed actions and plans for closure
  • Waivers/deviations
  • Metrics Data and Reports

Software Assurance:         

  • Review materials prepared for the review; attend the review
  • Submit any RFAs/RIDs on identified issues or risks

SAR Exit/Success Criteria

  • Review panel agrees that:
    • Required tests and analyses are complete and indicate that the system will perform properly in the expected operational environment
    • Risks are known and manageable
    • Software system meets established acceptance criteria
    • Required safe shipping, handling, checkout procedures are complete and ready for use
    • Required operational plans and procedures are complete and ready for use
    • Technical data package is complete and reflects the delivered system, including software user's manual and version description document
    • All applicable lessons learned for organizational improvement and system operations are captured
    • The software system has sufficient technical maturity to authorize shipment to designated operational facility or launch site

Software Assurance:

  • Confirm issues, risks, RFAs, RIDs are documented
  • Agree with the panel that the exit criteria have been met
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan to close them
  • Track RFAs/RIDs to closure

Operational Readiness Review (ORR)

The ORR examines the actual system characteristics and the procedures used in the system or product's operation and ensures that all system and support (flight and ground) hardware, software, personnel, and procedures are ready for operations and that user documentation accurately reflects the deployed state of the system. (NPR 7120.5 082)

Jump to: Entrance Criteria | Items Reviewed | Exit/Success Criteria

 ORR Entrance Criteria

  • All validation testing completed
  • Test failures and anomalies from validation testing resolved and results incorporated into all supporting and enabling operational products
  • All operational supporting and enabling products (e.g., facilities, equipment, documents, updated databases) that are necessary for the nominal and contingency operations have been tested and delivered/installed at the site(s) necessary to support operations
  • Operations manual complete
  • Physical audit (PCA) completed
  • Software inputs/contributions completed for:
    • Training provided to users and operators on correct operational procedures for system
    • Ground systems readiness
      • Diagram describing the main functionality for the project, how parts interact, and the main flow of data between major functional parts
      • Problem reporting and change request process for discrepancy reports (DR), enhancement reports (ER), Database change requests (DCR)
      • Current DR, ER, and DCR status, including historical trend data and details on currently open DRs, ERs, and DCRs
      • Key parts of the system, their current operational readiness, and how verified
      • Any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
      • Key interactions with other systems, their operational readiness, and how verified; any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
      • Outstanding items that need to be completed before readiness is achieved along with the scheduled date
  • Software maintenance plan completed
    • When software is frozen, what types of fixes will be approved for implementation under a freeze, etc.
    • How the change control board (CCB) will handle software changes or bug fixes
  • Science planning and processing system readiness are available for review, as applicable:
    • Diagram describing science data processing products and general timelines involved
    • Diagram describing science system context (relationship of main Mission Operations Center, Mission Planning Office, Science Validation Facility, Ground stations, interconnecting networks, and the main science data Instrument teams)
    • Description of these main components in high-level detail, including planning and processing functions and any special cases for launch, in-orbit checkout, end of mission, etc.; description of the testing done to verify and validate these components, with results and issues
    • Summary of all testing done, results, and outstanding issues for Science Data Processing
  • Safety and security issues status available for review:
    • Software issues with safety, how addressed, and current status
    • Software issues with security, how addressed, and current status
  • Simulations status available for review:
    • The number and main details for simulations by subsystem exercised, for example, Launch, Attitude Control System, Command & Data Handling, Communication, Flight Software, Power System Electronics, Mission Operations Center, Pre-Launch, others deemed important for the project
    • Outstanding issues from simulation testing, schedule impact, workarounds, and risks; for workarounds, when the problem/issue will be permanently fixed
  • Contingencies and constraints available for review:
    • State of Contingency Flow Chart Book and any planned updates
    • List of current constraints on the system, state of the database that details these constraints, and any outstanding actions that need to be taken
    • Audits that were done and against what areas to verify constraints
    • Operational problem escalation process
    • Operational emergency notification process including telephone numbers to be called
  • Status of documentation readiness available for review:
    • Version Description Document(s); its location, and any outstanding issues
    • Baselined Software User's Manual; its location, and any outstanding issues
    • Software Operations Plan; its location, and any outstanding issues
    • Software Maintenance Plan; its location, and any outstanding issues
    • Planned software retirement activities; location, and any outstanding issues
  • Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)
  • Status of work remaining available for review:
    • All critical work that needs to be completed along with the expected completion date

Software Assurance:

  • Confirm that all entrance criteria for review have been met (or are not applicable)
  • Confirm all verification/validation activities have been completed and anomalies addressed in software and documentation
  • Confirm all operational and supporting products have been tested and delivered
  • Confirm all documentation has been updated and delivered, including maintenance manual, software user guide, operations manual
  • Conduct or participate in a physical configuration audit
  • Confirm that operational processes and maintenance processes have been defined, including operational problem escalation, emergency notification, retirement activities, CCB processes, DR/PR process
  • Confirm complete system status is available for review: constraints and contingencies, status of outstanding issues, security and safety issue status, simulation status, science planning system status, training status
  • Prepare to report on status and findings of any audits/assessments and metrics analysis done by SA

ORR Items Reviewed

  • Validation test results/proof of completion
  • Status of test failures and anomalies from validation testing
  • Status of all testing, delivery, and installation for operational supporting and enabling products necessary for nominal and contingency operations
  • Status of software user's manual
  • Status of operations manual
  • Software Maintenance Plan
  • Science Planning and Processing System Readiness
  • Safety and Security Issues
  • The number and main details for simulations by subsystem exercised and any open issues
  • Contingencies and constraints
  • Status of documentation readiness
  • Work Remaining

Software Assurance:

  • Review materials prepared for review; attend review
  • Submit RIDs/RFAs on any identified issues, risks
  • Track RIDs/RFAs to closure

ORR Exit/Success Criteria

  • Review panel agrees that:
    • The system, including any enabling products, is ready to be placed in operational status
    • All applicable lessons learned for organizational improvement and systems operations have been captured
    • All waivers/deviations and anomalies have been closed
    • Systems hardware, software, personnel, and procedures are in place to support operations
    • All project and support hardware, software, and procedures are ready for operations, and user documentation accurately reflects the deployed state of the entire system
  • RFA and review item discrepancy (RID) reports generated, as needed, as a result of this ORR

Software Assurance:

  • Confirm that all issues, risks, RIDs/RFAs have been documented
  • Agree with the panel that exit criteria have been met
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan to close them
  • Track RFAs/RIDs to closure

Flight Readiness Review (FRR)

The FRR examines tests, demonstrations, analyses, and audits that determine the system's readiness for a safe and successful flight/launch and for subsequent flight operations. It also ensures that all flight and ground hardware, software, personnel, and procedures are operationally ready. (NPR 7120.5 082)

Jump to: Entrance Criteria | Items Reviewed | Exit/Success Criteria

 FRR Entrance Criteria

  • Certification received that flight operations can safely proceed with acceptable risk
  • System and support elements confirmed as properly configured and ready for flight
  • Interfaces compatible and function as expected
  • System state supports a launch "go" decision based on go/no-go criteria
  • Flight failures and anomalies from previously completed flights and reviews resolved and results incorporated into all supporting and enabling operational products
  • A system configured for flight
  • Tests, demonstrations, analyses, and audits support flight readiness

Software Assurance:

  • Confirm that the entrance criteria for the review have been met
  • Confirm that software assurance has signed off on the certification package
  • Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems
  • Confirm that system and supporting elements are properly configured and ready to support flight

FRR Items Reviewed

  • Open items and waivers/deviations
  • System and support elements configuration confirmation
  • Status of interface compatibility and functionality
  • System state
  • Status of failures and anomalies from previously completed flights and reviews
  • System configuration
  • Tests, demonstrations, analyses, audits
  • Software user's manual

Software Assurance:

  • Review materials prepared for review; attend review
  • Submit RIDs/RFAs on identified issues or risks
  • Track RIDs/RFAs to closure

FRR Exit/Success Criteria

  • Review panel agrees that:
    • Flight vehicle is ready for flight
    • Software is deemed acceptably safe for flight (i.e., meeting the established acceptable risk criteria or documented as being accepted by the PM and Designated Governing Authority (DGA))
    • Flight and ground software elements are ready to support flight and flight operations
    • Interfaces are checked and found to be functional
    • Open items and waivers/deviations have been examined and found to be acceptable
    • Software contributions to all open safety and mission risk items have been addressed
    • Operators are ready and workarounds have been fully vetted
    • Software user's manual is ready and available to be used for testing

Software Assurance:

  • Confirm that all issues, risks, RIDs/RFAs have been documented
  • Agree with the panel that exit criteria have been met and sign the FRR documentation
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan to close them
  • Track RFAs/RIDs to closure




