7.7 - Acquisition Guidance

1. Purpose

This document provides guidance for projects implementing the NPR 7150.2A requirements that address software acquisition, including [SWE-015], [SWE-032], [SWE-033], [SWE-037], [SWE-038], [SWE-045], [SWE-046], [SWE-047], [SWE-048], and [SWE-102]. This guidance is intended for all persons responsible for the software acquisition process, from the planning stages through contract close-out.

1.1 Roles

  • Project Manager: Approve the procurement plan
  • Software Project Lead: Prepare the procurement plan; prepare statement of work (SOW) software requirements and software data requirements for the contract; monitor execution of the contract
  • System Engineer: Conduct trade studies and engineering analyses
  • Contracting Officer (CO): Prepare contracts
  • Contracting Officer Technical Representative (COTR): Support contract preparation and monitor technical performance under the contract
  • Software Technical Authority: Prior to contract release, verify that the SOW includes the complete flow down of the Agency and Center software requirements [recommended practice]


2. Planning

Before software acquisition can be carried out, a need must be identified for which a solution is required. During the planning stage, various solutions to address the identified need are evaluated; the possible options include:

  1. In-house development/service
  2. Contracted development/service
  3. Acquire an off-the-shelf (OTS) product
  4. Use/enhance existing product/service

If the solution to the need will involve software, NPR 7150.2 applies and the acquisition planning guidance below should be applied:

  1. Define the scope of the system of interest.
  2. Identify the goals and objectives for the software portion of the system.
  3. Identify the technical requirements (functional, operational, performance).
  4. Perform "make or buy" market research/trade studies to determine if an OTS solution exists:
    • Establish criteria (and a plan) for the studies:
      • Technical requirements
      • NPR 7150.2 classification
      • Constraints and limitations (cost, schedule, resources)
      • Use past studies, known alternatives, existing make/buy criteria
    • Conduct studies.
      • Assess potential products and technologies
      • Assess how well technical requirements are addressed
      • Assess estimated costs, including support
      • Identify risks (delivery, safety, development practices used by supplier, supplier track record, etc.)
      • Assess provider business stability, past performance, ability to meet maintenance requirements, etc.
    • Identify in-house capabilities to meet the need:
      • Assess availability of existing products which could meet the need or be modified to meet the need
      • Assess availability of qualified personnel for development or modification activities
      • Assess estimated costs (time, personnel, materials, etc.), including support
        • Use past projects as basis, where appropriate
      • Identify risks
    • Determine if solution will be custom made, an existing product, or a modified existing product.
    • Review the COTS/GOTS/MOTS ([SWE-027]) topic in this handbook for additional guidance and considerations.
  5. Identify any acquisition risks based on requirements and "make or buy" decisions.
  6. Create at least one government software cost estimate ([SWE-015]) for this work.
  7. Document analysis:
    • Expected classification of the software to be acquired
    • Availability of in-house staff and funding resources
    • Availability of the software product(s)
    • Projected licensing and support costs
    • List of potential suppliers
    • Security considerations
    • Potential risks related to supplier's viability and past performance
  8. Document solution choice and basis for that choice:
    • Estimate of in-house vs. acquisition costs (including OTS solutions and any associated costs for requirements not met by the OTS solution)
    • Comparison of cost estimates to available funding
    • Risk assessment
    • Assumptions, observations, rationale, determining factors
    • Significant issues, impacts of each option
    • If solution is in-house development/service, exit this procedure
    • If solution is to acquire product/service, continue tailoring as needed based on development under contract or purchase OTS solution
    • Other planning decisions resulting in best overall value to NASA
    • Description of chosen acquisition strategy
  9. Identify stakeholders based on requirements and "make or buy" decisions:
    • Those directly concerned with, or affected by, the acquisition decision.
    • May include management, the project team, procurement, customers, end users, and suppliers.
  10. Ensure the acquisition team includes an organization from NASA (the acquirer) with an appropriate (see [SWE-032]) non-expired CMMI rating, as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser*
  11. Report analysis and resulting decision to appropriate stakeholders.
  12. Document lessons learned for future acquisition activities.
  13. Develop acquisition schedule, including solicitation, supplier selection, supplier monitoring, and product acceptance and transition to operations, as appropriate.
  14. Develop acquisition plan using Center-specific template.
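The "make or buy" comparison in steps 4 through 8 ultimately reduces to scoring each option against weighted criteria. As a minimal sketch (in Python, since this guidance mandates no particular tooling), assuming purely illustrative criteria, weights, and scores:

```python
# Illustrative weighted-criteria scoring for a "make or buy" trade study.
# The criteria, weights, and scores below are hypothetical examples, not
# values prescribed by NPR 7150.2 or this guidance.

CRITERIA_WEIGHTS = {
    "technical_fit": 0.40,  # how well technical requirements are addressed
    "cost": 0.30,           # estimated cost, including support (higher = cheaper)
    "schedule": 0.15,       # ability to meet delivery dates
    "risk": 0.15,           # supplier track record, safety (higher = lower risk)
}

# Scores on a 1-5 scale for each candidate option.
options = {
    "in-house development": {"technical_fit": 5, "cost": 2, "schedule": 3, "risk": 4},
    "contracted development": {"technical_fit": 4, "cost": 3, "schedule": 4, "risk": 3},
    "OTS product": {"technical_fit": 3, "cost": 5, "schedule": 5, "risk": 3},
}

def figure_of_merit(scores):
    """Weighted sum of criteria scores for one option."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

ranked = sorted(options, key=lambda name: figure_of_merit(options[name]), reverse=True)
for name in ranked:
    print(f"{name}: {figure_of_merit(options[name]):.2f}")
```

The documented basis for the decision in step 8 would record the weights, scores, and rationale behind them, not just the final ranking.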

*When a project acquires either class A or class B software, at a minimum, personnel from an organization that has a non-expired CMMI Maturity Level (ML) 3 or ML2 rating, respectively, in the Supplier Agreement Management (SAM) process area are required to support the acquiring organization during the acquisition planning process in the software area. This ensures the project is supported by a smart buyer knowledgeable about the CMMI-based software best practices resident within the supplier's engineering capability. This SAM-only alternative allows a Center or prime contractor that might procure only small amounts of class A and class B software, and that may not have a full CMMI ML2 or ML3 rating, to acquire this type of software and to participate in a project that requires its use.


3. Solicitation, Selection, Award

Once the planning activities for software acquisition have been completed and the decision has been made to acquire the software or software development services, a selection process needs to be followed to choose the best provider for the project. This process typically begins with development of a Statement of Work (SOW). The following recommendations should be considered as part of this process. Additionally, a SOW checklist is included in the Tools section of this guidance document.

1. Develop solicitation, including SOW:

  • Acceptance criteria
  • Solicitation constraints
  • Proper requirements ([SWE-048]) from the software development perspective:
    • Software classification (from NPR 7150.2) and safety criticality (from the Software Safety Litmus Test)
    • Technical requirements
    • Development standard to be followed, if any
    • Development life cycle to be followed, or indication that developer can choose appropriate life cycle
    • Surveillance activities (and acquirer involvement), including monitoring activities, reviews, audits ([SWE-039]), decision points, meetings, etc. ([SWE-045])
    • Management and support requirements (project management, schedule and schedule updates ([SWE-046]), configuration management, non-conformance and change tracking ([SWE-043]), risk management, metrics collection ([SWE-044]), IV&V support, required records, traceability records, electronic records ([SWE-047]) and code access ([SWE-042]), V&V, etc.)
    • Requirements for maintenance, support, updates, new versions, training to be included in life cycle and cost estimates
    • Concise task and deliverable descriptions, including delivery format ([SWE-040])
    • Media format for code deliverables ([SWE-040])
    • Templates or Data Item Descriptions (DID) for documentation deliverables
    • Complete set of deliverables with delivery dates, review periods, and acceptance procedures for each
    • Time period for responses to review findings, including making changes
    • Data Requirements Documents for deliverables, if appropriate
    • Government and contractor proprietary, usage, ownership, warranty, data, and licensing rights, including transfer
    • Requirement to include notice of use of open source software ([SWE-041]) in developed code
    • OTS software requirements ([SWE-027]) (identify which requirements are met by OTS s/w, provide OTS s/w documentation such as usage instructions, etc.)
    • List of all mandatory NASA software development standards and DIDs, as applicable
    • Requirements for non-expired CMMI rating as measured by a Lead Appraiser certified by the Software Engineering Institute (SEI) ([SWE-032]) (see the Useful Tools section below for sample text for the solicitation)

Note: Acquisition should not simply levy NPR 7150.2 as a whole on a potential supplier, as it contains some NASA institutional requirements. If a project is acquiring software development services for class A through H software, the project should levy only the applicable minimal set of supplier requirements, plus additions that address specific risk concerns. Requirements that are the responsibility of the Agency, Center, or Headquarters should not be levied on a contractor, as they will cause confusion and unnecessary expense.

If the class of software and the safety critical designation are known when the SOW is written, the project can levy, via a compliance matrix, the project/system specific set of NPR 7150.2 requirements to be satisfied by the contractor. If the class and/or safety critical designation are not yet known, the SOW should list applicable requirements for each class and safety critical designation with instructions to comply accordingly when a class and safety critical designation are determined.

In the case of requirements marked P(Center), the responsible NASA Center and project supply the applicable elements of these requirements for inclusion in the SOW.

The full list of project related requirements can be found in the compliance matrices found in this handbook.
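A compliance matrix of this kind is mechanical enough to filter with a script. The sketch below (Python) assumes a hypothetical, made-up applicability table, NOT the actual NPR 7150.2 matrix; given a software class and a safety-criticality determination, it returns the subset of requirements to levy in the SOW:

```python
# Illustrative compliance-matrix filter. The applicability mappings below are
# hypothetical placeholders, NOT the actual NPR 7150.2 requirements matrix.

# requirement id -> (classes it applies to, applies only if safety-critical?)
MATRIX = {
    "SWE-032": ({"A", "B"}, False),            # e.g., CMMI rating requirement
    "SWE-037": ({"A", "B", "C"}, False),
    "SWE-134": ({"A", "B", "C", "D"}, True),   # e.g., a safety-critical-only item
}

def applicable_requirements(sw_class, safety_critical):
    """Return the requirement IDs to levy for a given class and criticality."""
    return sorted(
        req for req, (classes, safety_only) in MATRIX.items()
        if sw_class in classes and (not safety_only or safety_critical)
    )

print(applicable_requirements("C", safety_critical=False))  # ['SWE-037']
```

If the class or safety-criticality is not yet known when the SOW is written, the same table supports listing the applicable set per class, as described above.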

2. Ensure proper review of SOW before delivery to procurement/contracts official:

  • Technical Authority to ensure proper flow down of NPR 7150.2 requirements
  • Coordinate with the Safety and Mission Assurance Office to ensure all QA requirements, clauses, and intended delegations are identified and included.

3. Identify potential suppliers.

4. Distribute solicitation package.

5. Evaluate proposals (typically via an evaluation team) based on selection criteria, including:

  • Cost estimation comparisons
  • Evaluation of how well proposed solutions meet the requirements (including interface and technology requirements, NPR 7150.2 requirements)
  • Staff available
  • Past performance
  • Software engineering and management capabilities
  • Prior expertise on similar projects
  • Available resources (facilities, hardware, software, training, etc.)
  • CMMI ratings
    • Check the SEI Published Appraisal Results (PARs) to confirm non-expired rating (http://sas.sei.cmu.edu/pars)
    • Be sure to check the scope of the organization holding the CMMI rating to confirm the rating is held by the specific organization submitting the proposal
  • Other factors relevant to the project

6. Select supplier/contractor and document basis for selection.

7. Negotiate and finalize contract:

  • Based on SOW
  • Identify and include management reviews and meetings, such as:
    • Formal reviews, such as those found in NPR 7123.1 and NPR 7120.4
    • Technical reviews
    • Progress reviews
    • Peer reviews (see Software Peer Reviews and Inspection topic guidance in this handbook)
    • Software quality assurance meetings
    • System integration test and verification meetings
    • System safety meetings
    • Configuration management meetings
    • Other relevant reviews for this project
  • Consider including in the contract provisions (with a description of the method to be used) for verification of:
    • Contractor handling of requirements changes
    • Accuracy of contractor transformation of high-level requirements into software requirements and detailed designs
    • Interface specifications between the contractor's product and systems external to it
    • Adequacy of contractor's risk management plan and its implementation in accordance with the required activities in the project Software Risk Management Plan
    • Adequacy of the contractor's integration and test plan and its implementation in accordance with the required activities in the project Software Integration and Test Plan
    • Adequacy of the contractor's configuration management plan and its implementation in accordance with the required activities in the project Software Configuration Management Plan
  • Consider for inclusion in the contract the content and frequency of progress reports and metrics submissions
  • Consider for inclusion in the contract identification of quality records to be maintained by the supplier
  • Consider for inclusion in the contract the delivery process and how it will be accomplished; if incremental development and delivery agreed upon, state how the validation process works (e.g., incremental validation) and whether it requires integration and test with software/hardware products developed by acquirer and/or other contractors or organizations (other institutes, universities, etc.)
  • Consider for inclusion in the contract a policy for maintaining the software after delivery: who is responsible for maintenance of the software, tools, testbeds, and documentation updates
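One of the evaluation criteria above is a non-expired CMMI rating held by the specific proposing organization. A small sketch of that currency and level check follows; the three-year validity window reflects common SEI appraisal practice (confirm against the published appraisal record for the organization in question), and the dates and levels are invented for illustration:

```python
# Illustrative check that a supplier's CMMI rating is current ("non-expired")
# and sufficient for the software class. The three-year validity assumption
# and all dates/levels here are examples; verify against the SEI Published
# Appraisal Results for the specific proposing organization.
from datetime import date, timedelta

APPRAISAL_VALIDITY = timedelta(days=3 * 365)  # assumed three-year validity

def rating_is_current(appraisal_date, today):
    """True if the appraisal has not aged past the assumed validity window."""
    return today <= appraisal_date + APPRAISAL_VALIDITY

def meets_class_requirement(maturity_level, sw_class):
    """Class A requires ML3 or higher; class B requires ML2 or higher."""
    required = {"A": 3, "B": 2}
    return maturity_level >= required.get(sw_class, 0)

print(rating_is_current(date(2009, 6, 1), today=date(2011, 1, 15)))  # True
print(meets_class_requirement(2, "A"))  # False
```

Remember also to confirm the organizational scope of the rating, since an appraisal held by a parent company does not necessarily cover the division submitting the proposal.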

4. Monitoring and Quality Assurance

Once the provider has been chosen, the acquisition process moves into a monitoring role. The following guidance should be included when establishing the process for provider monitoring and quality assurance:

  1. Provide technical requirements interpretation for contractor.
  2. Ensure contractor requirements documents meet original intent.
  3. Evaluate contractor progress with respect to cost.
  4. Periodically monitor contractor skill mix to ensure agreed-upon skills and experience levels are being provided.
  5. Oversee government-furnished equipment (GFE) to ensure equipment and information are provided in a timely manner.
  6. Periodically assess contractor processes to ensure conformance to process requirements stated in the contract.
  7. Review and assess adequacy of contractor-provided documentation and ensure contractor implementation of feedback; consider using Formal Inspections to accomplish this task.
    Track status, considering the following example questions:
    • Is the contractor meeting their staffing plan?
    • Have the project and the contractor met the user's needs?
    • Does the contractor have stable, educated staff?
    • Does the contractor's project have adequate resources (e.g., adequate staffing and computer resources)?
    • Is there realistic planning/budgeting in place?
    • Is the build plan being met?
    • Does the contractor have a good understanding of what is required?
    • Are the requirements stable?
    • Is the completion of designed functionality visible?
    • Is the evolving capability and performance of the contractor's product likely to impact development on the acquirer side of the interface?
    • Are integration and testing proceeding as planned?
    • Is contractor cost/schedule performance on target?
    • Is contractor developing a quality product?
  8. Provide regular status reviews to higher-level management on contractor progress.
  9. Regularly assess status of identified risks and provide reports during management reviews.
  10. Software engineering should provide technical review to the level required to enhance the probability of mission success (see the Useful Tools section below for a list of areas to consider for software engineering technical review).
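Several of the status questions above (cost and schedule performance, build-plan progress) are commonly answered with basic earned-value arithmetic. The sketch below uses standard earned-value terms (BCWS, BCWP, ACWP) with hypothetical figures; the 0.95 alert threshold is an illustrative choice, not a value mandated by this guidance:

```python
# Illustrative earned-value check for contractor cost/schedule monitoring.
# The BCWS/BCWP/ACWP figures and the 0.95 threshold are hypothetical.

def performance_indices(bcws, bcwp, acwp):
    """Return (CPI, SPI).

    CPI = BCWP / ACWP  (cost performance index; >1 means under cost)
    SPI = BCWP / BCWS  (schedule performance index; >1 means ahead of schedule)
    """
    return bcwp / acwp, bcwp / bcws

cpi, spi = performance_indices(bcws=500.0, bcwp=450.0, acwp=480.0)
print(f"CPI={cpi:.3f}, SPI={spi:.3f}")  # CPI=0.938, SPI=0.900

flags = []
if cpi < 0.95:
    flags.append("cost overrun trend")
if spi < 0.95:
    flags.append("behind schedule")
print(flags)
```

Flags of this kind feed the regular status reviews to higher-level management and the risk reports during management reviews.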

5. Contract Administration

In addition to monitoring the selected provider's progress and quality, contract administration activities are also carried out for the project. The following guidance should be included when establishing the process for contract administration:

  1. Regularly assess contractor financial data and invoices against budget.
  2. Work with Contracting Officer to ensure timely resolution of any contract-related issues.
  3. Work with Contracting Officer to ensure that needed modifications to contract terms and conditions are addressed in a timely manner, primarily those affecting schedule, costs, services/products, resources (people, facilities), and deliverables.
  4. Periodically evaluate contractor performance in manner consistent with contract and provide documented evaluation to Contracting Officer.

6. Product Acceptance and Control

Once the provider is ready to deliver the software product, the acquirer should have a process in place for review and acceptance of the product. The following guidance should be included when establishing the process for product acceptance:

  1. Review deliverables based on agreed-upon acceptance criteria (or generally accepted standards if specific criteria have not been established), document results, and work with contractor to resolve acceptance issues.
    • Typically, an acceptance test plan is created addressing the following:
      • Acquirer and contractor roles and responsibilities
      • Defined Test Strategy
      • Defined Test Objectives
      • Defined Acceptance Criteria
      • Developed Test Scenarios
      • Developed Test Scripts
      • Developed Test Matrix
      • Time and Resources Estimate
      • Approval Cycle
      • Strategy for post-delivery problem resolutions
    • Once approved, the test plan is executed and results are documented:
      • Select Test Tools
      • Select and Train Team Members
      • Execute the Test Plan (Manual and Automated Methods)
      • Track Test Progress
      • Regression Test
      • Document Test Results
      • Resolve Problems
  2. Place formal deliverables under configuration control.
  3. After acceptance of delivered products, support transition to an operational and/or maintenance environment.
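Tracking progress through the acceptance test plan above is largely bookkeeping over the test matrix. A minimal roll-up sketch follows; the scenario names and the all-scenarios-pass acceptance criterion are invented for illustration:

```python
# Illustrative acceptance-test progress roll-up. The scenario names and the
# "every scenario executed and passed" criterion are hypothetical examples;
# real acceptance criteria come from the agreed-upon acceptance test plan.
from collections import Counter

results = {
    "nominal command sequence": "pass",
    "fault injection / safing": "fail",
    "regression after patch": "pass",
    "interface timing": "not run",
}

tally = Counter(results.values())
executed = tally["pass"] + tally["fail"]
print(f"executed {executed}/{len(results)}, passed {tally['pass']}")

# Acceptance here requires every scenario executed and passed.
accepted = tally["pass"] == len(results)
print("accept" if accepted else "do not accept; resolve problems and regression test")
```

Documenting a roll-up like this at each delivery supports both the acceptance decision and the post-delivery problem-resolution strategy called for in the plan.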

7. Contract Close-Out

The final acquisition step is to close out the contract. The following guidance should be included when establishing the process for contract close-out:

  1. Verify satisfaction of all contract terms and conditions, considering the following sample questions:
    • Has the contract period of performance expired (level of effort type contract)?
    • Have all deliverables been delivered (completion type contract)?
    • Have all CDRL Items been delivered and accepted?
    • Was the contractor's performance of the SOW acceptable?
    • If the contract involved patent rights, has the final patent report been filed?
    • Has the final invoice been received?
  2. Verify return of all GFE, as appropriate.
  3. Complete final reports as requested by Contracting Officer.
  4. Provide final contractor performance evaluation to Contracting Officer.
  5. Capture Lessons Learned, if not captured earlier in the project life cycle.

8. Useful Tools

The documents below are tools collected from various Centers that have been deemed good practices, or practices that work well and produce good results. They are included here as aids for carrying out the software acquisition process.

8.1 CMMI Rating Language for RFP

If a project wants to procure the development of class A, B, or C software, it must levy the associated requirements for which the project has responsibility and also clearly specify that the contractor meet the CMMI maturity level requirements associated with the class. Below are examples of wording that could be used in a statement of work to describe the CMMI maturity level requirements:

For class A software:

The contractor responsible for the acquisition, development, or maintenance of class A software shall have a non-expired Capability Maturity Model Integration® for Development (CMMI-DEV) rating, as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser, of CMMI-DEV Maturity Level 3 or higher for software, or a CMMI-DEV Capability Level 3 rating or higher in all CMMI-DEV Maturity Level 2 and 3 process areas for software.

For class B software:

The contractor responsible for the acquisition, development, or maintenance of class B software shall have a non-expired Capability Maturity Model Integration® for Development (CMMI-DEV) rating, as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser, of CMMI-DEV Maturity Level 2 or higher for software, or a CMMI-DEV Capability Level 2 rating or higher for software in the following process areas:

  a. Requirements Management
  b. Configuration Management
  c. Process and Product Quality Assurance
  d. Measurement and Analysis
  e. Project Planning
  f. Project Monitoring and Control
  g. Supplier Agreement Management (if applicable)

For class C software, the project can minimally choose to pass down this requirement in accordance with its Center's procedures related to class C, as long as it provides some action that determines whether the contractor's capability to develop software meets or exceeds the need. As many NASA contractors are already at CMMI ML2 or higher, projects may alternatively choose to simply require ML2 in their RFP for class C software.

If a contractor chooses to subcontract the development of class A, B, or C software, then the subcontractor is required to have a CMMI ML3 (for class A), ML2 (for class B), or Center-specified (for class C) rating in the Supplier Agreement Management process area, and the contractor must levy the project-related requirements on the subcontractor.

8.1.1  General Example

The Contractor and its subcontractors' organizations associated with software development responsibilities shall be at Software Engineering Institute Capability Maturity Model Integration (CMMI-DEV) Maturity Level 3 (Staged Representation) or higher prior to the Preliminary Design Review. This requirement does not apply to common unmodified commercial-off-the-shelf software procured for the project.

8.1.2  Examples for RFP Information Technology Management section

Example 1: Information Technology (IT) Management

For IT applications, other than mission-specific flight and non-flight software, the Contractor shall use Commercial-off-the-Shelf and existing Government-off-the-Shelf products where cost effective to NASA.  All IT applications, other than mission-specific flight and non-mission flight software shall comply with NASA requirements as outlined in NPR 7150.2, NASA Software Engineering Requirements for the appropriate software classes, limited to classes E, F, and G, and as applicable for the project, MPR 2800.4, and NPR 2830.1 NASA Enterprise Architecture Procedures.

Example 2: Information Technology Management

For IT applications, other than mission-specific software, the Contractor shall:

  • Where cost effective to NASA, use Commercial-Off-The-Shelf (COTS) and existing Government-Off-The-Shelf (GOTS) products.

  • Ensure compatibility with existing NASA applications and systems.

  • Comply with NASA requirements for NPR 7150.2, NASA Software Engineering Requirements for the appropriate software classes, limited to classes E, F, and G, and as applicable for the project.

8.1.3  Examples for RFP Software section

Example 1: Embedded Software (Firmware)

The Contractor shall develop and maintain software in accordance with NPR 7150.2, NASA Software Engineering Requirements, for the appropriate software classes and as applicable for the project, and NASA-STD-8739.8, NASA Software Assurance Standard (Chapters 6 and 7).

Example 2: Software Engineering

a. The Contractor shall define, design, develop, test, qualify, integrate, verify, validate, deliver, and maintain all software. The plans for accomplishing this work shall be documented in DRD ___, Software Development Plan.

b. The Contractor shall justify the reuse of existing software, modification of existing software, and the development of new software in DRD ___, Software Development Plan.

c. The Contractor shall, under Project direction, participate in coordinating with the NASA IV&V Facility in accordance with NASA-STD-8739.8, NASA Software Assurance Standard (Chapters 6 and 7), to plan for the participation of the NASA IV&V Facility in the software development life cycle activities.

d. The Contractor and its subcontractors' organizations associated with software development responsibilities shall be at Software Engineering Institute Capability Maturity Model Integration (CMMI-DEV) Maturity Level 3 (Staged Representation) or higher prior to the Preliminary Design Review. This requirement does not apply to commercial-off-the-shelf software procured for the Project.

e. The Contractor shall develop, update, and maintain all software and software development tools under configuration management in accordance with DRD ___, Software Configuration Management Plan.

f. The Contractor shall develop and maintain electronic Software Development Folders for all flight, ground, and test software per DRD ___, Software Development Folder.

g. The Contractor shall use the following guidance document(s) for the development of all software document deliverables:

  • Project Classification Matrix (use as guidance in interpreting flight software classification definitions in NPR 7150.2)

h. The Contractor shall use the following standards for designing, developing, and testing all software:

  • NPR 7150.2 NASA Software Engineering Requirements
  • NASA-STD-8739.8, NASA Software Assurance Standard (chapters 6 and 7)

Example 3

The contractor shall define, design, code, test, integrate, and qualify the software. The contractor shall treat the software component of firmware, which consists of computer programs and data loaded into a class of memory that cannot be dynamically modified by the computer during processing, as software for the purposes of this statement of work. The scope of this activity applies to the reuse of existing software, modification of existing software, and/or development of new software.

The contractor shall provide information and access to products under development to provide the government with insight into software development and test activities, including monitoring integration and verification adequacy, auditing the software development process, and participating in all software and system reviews. The contractor shall support the implementation of the overall risk management process as well as program status and progress reviews for the software development process. The contractor shall support software Technical Interchange Meetings and other status meetings, as required, to facilitate government insight into the software development. The government insight may include government civil servant insight, government support contractor insight, and independent verification and validation review.

The contractor shall perform Peer Reviews on Software Requirements Specifications, Software Test Plans, and selected design and code items, and provide results to the government via Software Inspection/Peer Review Reports. The contractor shall maintain software metrics and provide Software Metrics Reports in accordance with the DRD.

The contractor shall provide the government web-based electronic access (with access control) to intermediate and final software products (including code) and software process tracking information including software development and management metrics.

The software development shall comply with NPR 7150.2 NASA Software Engineering Requirements as applicable for the project.

8.1.4  Examples for RFP EGSE Software section

Example 1: EGSE Software

a. The Contractor shall perform the design, code, verification, validation, and delivery of all EGSE code and executables in response to the EGSE Subsystem Requirements Document.

b. The Contractor shall develop and maintain software in accordance with NPR 7150.2, NASA Software Engineering Requirements, for the appropriate software classes and as applicable for the project, and NASA-STD-8739.8, NASA Software Assurance Standard (Chapters 6 and 7).

8.2 Recommended Technical Review Activities List

Areas to consider for Software Engineering technical review consist of the following:

  • Perform independent assessments of software systems engineering, software processes, software products, software integration, and software test analyses
  • Review all mission critical software products
  • Perform software schedule and resource assessments and analyses
  • Develop software technical oversight plans
  • Coordinate any software related issues with the project
  • Participate in reviews and Technical Interchange Meetings
  • Perform periodic audits on pre-defined process(es)
  • Chair a board or serve as a board member, or Review Item Disposition (RID) writer, at a formal review
  • Participate in resolution and closure of issues
  • Develop independent models to check and compare vendor data
  • Perform evaluations of software products (software documentation, code, etc.)
  • Serve as Software Technical Authority responsible for acquired software products
  • Planning and Project Support:
    • Support and coordinate software trade studies
    • Assess software development processes
    • Support review of system level requirements specifications
    • Support development and review of system level verification and validation test plans
    • Verify compliance with Software Development Plan(s)
    • Verify compliance with software quality and configuration management plans
    • Participate in project documentation reviews
    • Support risk management activities
    • Participate in project and software developer Review Boards, Technical Interchange Meetings, Working Groups and telecons
    • Participate in developer's daily and/or weekly software development activities to maintain knowledge of software development progress
    • Identify and track software metrics
    • Review and assess schedule of the software development activities
    • Provide a status of the developer's software progress, metrics and any problems to the project
    • Conduct periodic site visits as needed to attain knowledge of software development progress
    • Review and assess the content and completeness of instrumentation and command control list (engineering integration database)
  • Requirements analysis:
    • Verify absence of problems and risk items associated with requirements:
      • Documentation standards used and properly applied
      • System requirements clearly organized
      • Even emphasis and levels of detail
      • Consistent identification schemes
      • Clear and concise requirement statements
      • Good sentence structure
      • Good word selection, unambiguous terms
    • Track growth in size and complexity of requirements to identify positive/negative trends
    • Estimate variances in schedule and costs based on requirements size & completeness
    • Support software requirements problem and issue resolution
    • Review and assess the interface specifications and data
    • Verify software requirements traceability
    • Support software requirements walkthroughs
    • Support evaluation of potential requirements changes and associated impacts through the life of the project
  • Design Phase:
    • Support review of preliminary and detailed design specifications (DDS)
    • Support software design problem and issue resolution
    • Verify traceability of design to software requirements
    • Support design walkthroughs
  • Code analysis:
    • Track growth and complexity of source code modules across builds
    • Rank source code modules according to their relative risk, as determined by:
      • Percent of internal documentation
      • Overly large files or modules
      • Use of unstructured programming constructs
      • High decision or calling complexity
      • Unused or "dead" code
      • Poor implementation, if applicable
      • Compliance with program coding standards
    • Develop and maintain knowledge of code functionality
    • Present code functionality to subsystem teams for validity checks
    • Support code development and integration testing
    • Support software code problem and issue resolution
    • Support developer code walkthroughs
  • Test Phase:
    • Support development and review of test plans, test procedures and test cases
    • Support Test Readiness Review (TRR):
      • Review and identify discrepancies in software documentation
      • Support final closure of discrepancies
    • Support software test problem and issue resolution
    • Support Computer Software Configuration Item (CSCI) integration and test activities
    • Review software test reports
  • Software problem report & effort data analyses:
    • Analyze Problem Reports and present understandable graphical summaries
    • Track error detection and correction rates
    • Assess adequacy of test program
    • Detect schedule risks early
    • Predict effective completion date
  • Software Metrics:
    • Help project office identify applicable software metrics
    • Review and assess the software metric data provided by the contractor
    • Develop, maintain and report software insight metric data to the project
  • Software Independent Verification and Validation (IV&V) support:
    • Perform software criticality assessments
    • Perform software risk assessments
    • Develop software IV&V project plans
    • Develop software IV&V statements of work
    • Support projects in review of all software IV&V products
    • Provide expertise and assistance to the projects in resolution and implementation of any software IV&V recommendations
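The module-ranking criteria under "Code analysis" above (internal documentation, module size, decision complexity, dead code) can be partially automated. The sketch below is illustrative only: the thresholds, weights, and keyword list are assumptions for C-like source, not values prescribed by NPR 7150.2 or any NASA procedure.

```python
# Sketch: rank source modules by simple risk indicators (illustrative only).
# Thresholds and weights are hypothetical, not from NPR 7150.2.

def decision_count(source: str) -> int:
    """Rough decision-point count: branching keywords in C-like source."""
    keywords = ("if", "for", "while", "case", "&&", "||", "?")
    tokens = source.replace("(", " ").replace(")", " ").split()
    return sum(tokens.count(k) for k in keywords)

def risk_score(source: str) -> int:
    """Higher score = higher relative risk."""
    lines = source.splitlines()
    loc = len(lines)
    comments = sum(1 for ln in lines if ln.strip().startswith(("//", "/*", "*")))
    score = 0
    if loc > 400:                      # overly large module
        score += 2
    if decision_count(source) > 20:    # high decision complexity
        score += 2
    if loc and comments / loc < 0.10:  # sparse internal documentation
        score += 1
    return score

def rank_modules(modules: dict) -> list:
    """Return module names ordered from highest to lowest relative risk."""
    return sorted(modules, key=lambda name: risk_score(modules[name]),
                  reverse=True)
```

In practice, commercial or open-source static analysis tools compute these indicators (e.g., cyclomatic complexity) far more rigorously; the point here is only that the ranking activity lends itself to repeatable, scripted measurement across builds.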
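The "Software problem report & effort data analyses" activities above (tracking error detection and correction rates, predicting an effective completion date) can be illustrated with a naive backlog projection. The windowing and the linear net-closure-rate model below are assumptions for the sketch, not NASA guidance.

```python
from typing import Optional

# Sketch: problem-report trend analysis (illustrative; the rate model is
# a deliberate simplification of real defect-arrival modeling).

def backlog_history(opened, closed):
    """Cumulative open-defect backlog per reporting period."""
    backlog, history = 0, []
    for o, c in zip(opened, closed):
        backlog += o - c
        history.append(backlog)
    return history

def weeks_to_zero(opened, closed, window: int = 4) -> Optional[float]:
    """Project weeks until the backlog clears, using the recent net
    closure rate. Returns None if the backlog is not shrinking
    (a schedule risk: detection is outpacing correction)."""
    history = backlog_history(opened, closed)
    recent = history[-window:]
    if len(recent) < 2:
        return None
    net_rate = (recent[0] - recent[-1]) / (len(recent) - 1)
    if net_rate <= 0:
        return None
    return history[-1] / net_rate
```

A rising backlog or a `None` projection late in the test program is exactly the kind of early schedule-risk signal these analyses are meant to surface to the project.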

8.3 Statement of Work Checklist

This checklist was taken directly from the Langley Research Center Statement of Work (SOW) Review Procedure, LMS-CP-5523 Rev. B, and includes practices recognized by the Office of the Chief Engineer (OCE) as working well for NASA. See the NASA Agency PAL for the latest version of this checklist.

Note: Items in gray text are provided as examples and explanatory guidance.
For additional guidance and examples on developing a Statement of Work, see:
URL: http://sw-eng.larc.nasa.gov/docs/statements_of_work.html
and LPR 5000.2, "Procurement Initiator's Guide", Sections 12 and 13.

8.3.1 Editorial Checklist

a. Is the SOW requirement in the form: "Who" shall "Do What"? E.g., "The Contractor shall (perform, provide, develop, test, analyze, or other verb followed by a description of what)."
Example SOW requirements:

  1. The Contractor shall design the XYZ flight software...
  2. The Contractor shall operate the ABC ground system...
  3. The Contractor shall provide maintenance on the following...
  4. The Contractor shall report software metrics monthly ...
  5. The Contractor shall integrate the PQR instrument with the spacecraft...

b. Is the SOW requirement a simple sentence that contains only one requirement? Compound sentences that contain more than one SOW requirement need to be split into multiple simple sentences. (For example, "The Contractor shall do ABC and perform XYZ" should be rewritten as: "The Contractor shall do ABC. The Contractor shall perform XYZ.")
c. Is the SOW composed of simple, cohesive paragraphs, each covering a single topic? Paragraphs containing many requirements should be divided into sub-paragraphs for clarity.
d. Has each paragraph and subparagraph been given a unique number or letter identifier? Is the numbering / lettering correct?
e. Is the SOW requirement in the active rather than the passive voice? Passive voice leads to vague statements. (For example, state: "The Contractor shall hold monthly management review meetings..." instead of "Management review meetings shall be held monthly ...")
f. Is the SOW requirement stated positively as opposed to negatively? (i.e., replace statements such as "The Contractor shall not exceed the budgetary limits specified..." with "The contractor shall comply with the budgetary limits specified...")
g. Is the SOW requirement grammatically correct?
h. Is the SOW requirement free of typographic errors, misspellings, and punctuation errors?
i. Have all acronyms been defined in an Acronym List or spelled out in the first occurrence?
j. Have the quantities, delivery schedules, and delivery method been identified for each deliverable within the SOW or a separate attachment/section?
k. Has the content of documents to be delivered been defined in a separate attachment/section and submitted with the SOW?
l. Has the file format of each electronic deliverable been defined? (e.g., Microsoft Project, Adobe Acrobat PDF, National Instruments LabVIEW VIs)
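Several of the editorial checks above, notably items a ("Who shall Do What" form) and b (one requirement per sentence), lend themselves to simple automated screening. The sketch below is not part of LMS-CP-5523; the patterns are assumptions, and a flagged sentence is a candidate for human review, not a confirmed defect.

```python
import re

# Sketch: automated screening for SOW editorial checklist items a and b.

REQ_FORM = re.compile(r"^The Contractor shall \w+", re.IGNORECASE)

def check_requirement(sentence: str) -> list:
    """Return a list of editorial flags for one SOW requirement sentence."""
    flags = []
    if not REQ_FORM.match(sentence.strip()):
        flags.append('not in "The Contractor shall <verb> ..." form')
    # Heuristic: a compound requirement joins two actions with "and"
    # after "shall" (e.g., "... shall do ABC and perform XYZ").
    if re.search(r"\bshall\b.*\band\b.*\b(perform|provide|develop|test)\b",
                 sentence, re.IGNORECASE):
        flags.append("possible compound requirement; consider splitting")
    return flags
```

Running such a screen before the formal SOW review lets the review team spend its time on content questions (Section 8.3.2) rather than mechanical form.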

8.3.2 Content Checklist

a. Are correct terms used to define the requirements?
1. Shall = requirement (binds the contractor)
2. Should = goal (leaves decision to contractor; avoid using this word)
3. May = allowable action (leaves decision to contractor; avoid using this word)
4. Will = facts or declaration of intent by the Government (use only in referring to the Government)
5. Present tense (e.g., "is") = descriptive text only (avoid using in requirements statements; use "shall" instead)
6. NEVER use 'must'
b. Is the scope of the SOW clearly defined? Is it clear what you are buying?
c. Is the flow and organizational structure of the document logical and understandable? (See LPR 5000.2 "Procurement Initiator's Guide", Section 12 for "helpful hints".) Is the text compatible with the title of the section it's under? Are sub-headings compatible with the subject matter of a heading?
d. Is the SOW requirement clear and understandable?
1. Can the sentence only be understood one way?
2. Will all terminology used have the same meaning to different readers without definition? Has any terminology for which this is not the case been defined in the SOW? (e.g., in a Definitions section or Glossary.)
3. Is it free from indefinite pronouns ("this", "that", "these", "those") without clear antecedents? (e.g., replace statements such as "These shall be inspected on an annual basis." with "The fan blades shall be inspected on an annual basis.")
4. Is it stated concisely?
e. Have all redundant requirements been removed? Redundant requirements can reduce clarity, increase ambiguity, and lead to contradictions.
f. Is the requirement consistent with other requirements in the SOW, without contradicting itself, without using the same terminology with different meanings, without using different terminology for the same thing?
g. If the SOW includes the delivery of a product (as opposed to just a services SOW):
1. Are the technical product requirements in a separate section or attachment, apart from the activities that the contractor is required to perform? The intent is to clearly delineate between the technical product requirements and requirements for activities the contractor is to perform. (E.g., separate SOW statements "The contractor shall" from technical product requirement statements such as "The system shall" and "The software shall".)
2. Are references to the product and its sub-elements in the SOW at the level described in the technical product requirements?
3. Is the SOW consistent with and does it use the same terminology as the technical product requirements?
h. Is the SOW requirement free of ambiguities? Make sure the SOW requirement is free of vague terms. (For example, "as appropriate", "any", "either", "etc.", "and/or", "support", "necessary", "but not limited to", "be capable of", "be able to")?
i. Is the SOW requirement verifiable? Make sure the SOW requirement is free of unverifiable terms. For example, "flexible", "easy", "sufficient", "safe", "ad hoc", "adequate", "accommodate", "user-friendly", "usable", "when required", "if required", "appropriate", "fast", "portable", "light-weight", "small", "large", "maximize", "minimize", "optimize", "robust", "quickly", "easily", "clearly", other "ly" words, other "ize" words.
j. Is the SOW requirement free of implementation constraints? SOW requirements should state WHAT the contractor is to do, NOT HOW they are to do it. For example, "The Contractor shall design the XYZ flight software" states WHAT the contractor is to do, while "The Contractor shall design the XYZ software using object-oriented design" states HOW the contractor is to implement the activity of designing the software. In addition, too low a level of decomposition of activities can result in specifying how the activities are to be done, rather than what activities are to be done.
k. Is the SOW requirement stated in such a way that compliance with the requirement is verifiable? Does a means exist to measure or otherwise assess its accomplishment? Can a method for verifying compliance with the requirement be defined (e.g., described in a Quality Assurance Surveillance Plan)?
l. Is the background material clearly labeled as such (i.e., included in the background section of the SOW if one is used)?
m. Are the assumptions able to be validated and restated as requirements? If not, the assumptions should be deleted from the SOW. Assumptions should be recorded in a document separate from the SOW.
n. Is the SOW complete, covering all of the work the contractor is to do?
1. Are all of the activities necessary to develop the product included? (E.g., system, software, and hardware activities for the following: requirements, architecture, and design development; implementation and manufacturing; verification and validation; integration testing and qualification testing.)
2. Are all safety, reliability, maintainability (e.g., mean time to restore), availability, quality assurance, and security requirements defined for the total life of the contract?
3. Does the SOW include a requirement for the contractor to have a quality system (e.g., ISO certified), if one is needed?
4. Are all of the necessary management and support requirements included in the SOW? (For example, project management; configuration management; systems engineering; system integration and test; risk management; interface definition and management; metrics collection, reporting, analysis and use; acceptance testing; NASA Independent Verification and Validation support tasks.)
5. Are clear Performance Standards included and sufficient to measure contractor performance? (e.g., systems, software, hardware, and service performance standards for the following: schedule, progress, size, stability, cost, resources, and defects.) See Guidance on System and Software Metrics for Performance-Based Contracting at: http://sw-eng.larc.nasa.gov/docs/statements_of_work.html for more information and examples on Performance Standards.
6. Are all of the necessary service activities included? (For example, transition to operations, operations, maintenance, database administration, system administration, data management.)
7. Are all of the Government surveillance activities included? (For example, project management meetings; decision points; requirements and design peer reviews for systems, software, and hardware; demonstrations; test readiness reviews; other desired meetings (e.g., Technical Interchange Meetings); collection and delivery of metrics for systems, software, hardware, and services (e.g. to provide visibility into development progress and cost); electronic access to technical and management data; access to subcontractors and other team members for the purposes of communication.)
8. Are the Government requirements for contractor inspection and testing addressed, if necessary?
9. Are the requirements for contractor support of Government acceptance activities addressed, if necessary?
o. Does the SOW only include contractor requirements? It should not include Government requirements.
p. Does the SOW give the contractor full management responsibility and hold them accountable for the end result?
q. Is the SOW sufficiently detailed to permit a realistic estimate of cost, labor, and other resources required to accomplish each activity?
r. Are all deliverables identified (e.g., status, financial, product deliverables)? The following are examples of deliverables that are sometimes overlooked: management and development plans; technical progress reports that identify current work status, problems and proposed corrective actions, and planned work; financial reports that identify costs (planned, actual, projected) by category (e.g., software, hardware, quality assurance); products (e.g., source code, Maintenance/User Manual, test equipment); and discrepancy data (e.g., defect reports, anomalies). All deliverables should be specified in a separate document except for technical deliverables which should be included in the SOW (e.g. hardware, software, prototypes, etc.).
s. Does each technical and management deliverable track to a paragraph in the SOW? Each deliverable should have a corresponding SOW requirement for its preparation (e.g., the SOW identifies the title of the deliverable in parenthesis after the task requiring the generation of the deliverable).
t. Are all reference citations complete?
1. Is the complete number, title, and date or version of each reference specified?
2. Does the SOW reference the standards and other compliance documents in the proper SOW paragraphs?
3. Is the correct reference document cited and is it referenced at least once?
4. Is the reference document either furnished with the SOW or available at a location identified in the SOW?
5. If the referenced standard or compliance document is only partially applicable, does the SOW explicitly and unambiguously reference the portion that is required of the contractor?
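The vague and unverifiable terms called out in content checklist items h and i can be scanned for mechanically. The word lists below are excerpted from those items, not exhaustive, and a hit only marks text for reviewer attention.

```python
import re

# Sketch: scan SOW text for weak terms from content checklist items h and i.

VAGUE_TERMS = ["as appropriate", "and/or", "etc.", "but not limited to",
               "be capable of", "be able to"]
UNVERIFIABLE_TERMS = ["flexible", "easy", "sufficient", "adequate",
                      "user-friendly", "usable", "robust", "quickly",
                      "easily", "maximize", "minimize", "optimize"]

def find_weak_terms(text: str) -> list:
    """Return the weak terms found in the text, in checklist order."""
    lowered = text.lower()
    hits = []
    for term in VAGUE_TERMS + UNVERIFIABLE_TERMS:
        # Word-boundary guards so "monthly" does not match "only", etc.
        pattern = r"(?<![A-Za-z])" + re.escape(term) + r"(?![A-Za-z])"
        if re.search(pattern, lowered):
            hits.append(term)
    return hits
```

Checklist item i also flags whole classes of words (other "ly" and "ize" words); those are better caught by a human reviewer, since a pure suffix match would flag legitimate terms.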

8.3.3 Critical and/or Complex Requirements Checklist

Note: The checklist items below may be duplicative of items included earlier in this Appendix but are summarized here to specifically identify what is required for critical and/or complex procurements.
a. Does the SOW include the name or identification of all critical and/or complex items (i.e., specifications [e.g., IEEE Standards, NFPA Standards], drawings, process requirements [e.g., LMS-CPs], inspection instructions, and other relevant technical data, as applicable)?
b. Are the requirements for design, test, examination, inspection, and related instructions for acceptance by the Government included in the SOW where applicable?
c. Are the requirements for test specimens (e.g., production method, number, storage conditions) included in the SOW, if applicable? These specimens could be used by the Government for design approval, inspection, investigation, or auditing.


8.4 Example Templates

The following NASA Data Item Descriptions (DIDs) are listed as sample templates for the documentation templates called for during the solicitation portion of the software acquisition process. Center Process Asset Libraries (PALs) should be consulted for DIDs and Data Requirements Documents (DRDs) relevant to a specific NASA Center.


8.4.1 NASA-STD-2100-91

NASA DIDs are defined in the NASA-STD-2100-91 Software Documentation Standard, which is available at http://satc.gsfc.nasa.gov/assure/docstd.html. The NASA DIDs provide a format for a documentation set, including what needs to be addressed in each section.
MASTER DOCUMENTATION DATA ITEM DESCRIPTIONS

  • NASA-DID-000 Software Documentation Set DID
  • NASA-DID-999 Template DID

MANAGEMENT PLAN DATA ITEM DESCRIPTIONS

  • NASA-DID-M000 Management Plan DID
  • NASA-DID-M100 Acquisition Activities Plan DID
  • NASA-DID-M200 Development Activities Plan DID
  • NASA-DID-M210 Training Development Plan DID
  • NASA-DID-M300 Sustaining Engineering and Operations Activities Plan DID
  • NASA-DID-M400 Assurance Plan DID
  • NASA-DID-M500 Risk Management Plan DID
  • NASA-DID-M600 Configuration Management Plan DID
  • NASA-DID-M700 Delivery and Operational Transition Plan DID


PRODUCT SPECIFICATION DATA ITEM DESCRIPTIONS

  • NASA-DID-P000 Product Specification DID
  • NASA-DID-P100 Concept DID
  • NASA-DID-P200 Requirements DID
  • NASA-DID-P300 Architectural Design DID
  • NASA-DID-P400 Detailed Design DID
  • NASA-DID-P410 Firmware Support Manual DID
  • NASA-DID-P500 Version Description DID
  • NASA-DID-P600 User's Guide DID
  • NASA-DID-P700 Operational Procedures Manual DID

ASSURANCE AND TEST PROCEDURES DATA ITEM DESCRIPTIONS

  • NASA-DID-A000 Assurance and Test Procedures DID
  • NASA-DID-A100 Assurance Procedures DID
  • NASA-DID-A200 Test Procedures DID

MANAGEMENT, ENGINEERING, AND ASSURANCE REPORTS DATA ITEM DESCRIPTIONS

  • NASA-DID-R000 Management, Engineering, and Assurance Reports DID
  • NASA-DID-R001 Certification Report
  • NASA-DID-R002 Audit Report
  • NASA-DID-R003 Inspection Report
  • NASA-DID-R004 Discrepancy (NRCA) Report
  • NASA-DID-R005 Engineering Change Proposal
  • NASA-DID-R006 Lessons Learned Report
  • NASA-DID-R007 Performance/Status Reports
  • NASA-DID-R008 Assurance Activity Report
  • NASA-DID-R009 Test Report
  • NASA-DID-R010 Waiver/Deviation Request
  • NASA-DID-R011 Review Report

8.4.2 Center DIDs and DRDs

The following DIDs and DRDs are samples available from center PALs. Consult your own center PAL for templates relevant to work performed for your center.



Marshall Space Flight Center Templates
Available from http://spi.msfc.nasa.gov/templates.html and the individual Project Asset sections of the Marshall Space Flight Center PAL.

  • Software Configuration Management Plan
  • Software Test Report (STR) Template
  • Unit Test Procedure Template

Goddard Space Flight Center Templates
Available from http://software.gsfc.nasa.gov/ispaindx.cfm.

  • Software Management Plan/Product Plan (SMP/PP) for Class A, B, & C Software
  • ISD Software Management Plan/Product Plan (SMP/PP) for Class D&E Software
  • Version Description Document
  • Template for the Software Quality Assurance Plan
  • Configuration Management Plan Template
  • Other templates in progress or not available publicly

9. References

  1. Glenn Research Center, Software Acquisition Statement of Work Guideline, GRC-SW-7150.14, 2010.
  2. Office of Procurement, Langley Research Center, Prepare Presolicitation Documents, Revision O-1, LMS-OP-4509, 2009.
  3. Langley Research Center, Statement of Work (SOW) Review Procedure, LMS-CP-2253 Rev. B, 2006.
  4. "Product Requirements Development and Management Procedure", 5526_7-21-06_Req_RevA_generic-R1V0, 2006.
  5. Goddard Space Flight Center, Process for Conducting a Make/Buy Analysis, 580-SP-075-01, 2009.
  6. Goddard Space Flight Center, WBS Checklist Tool, 2007.
  7. Jet Propulsion Laboratory, Software Supplier Agreement Management Plan.
  8. Polydys, Wisseman, "Software Assurance: Five Essential Considerations for Acquisition Officials", STSC Crosstalk, May 2007.
  9. Ward, Elm, "A Method for Reasoning About an Acquisition Strategy", Software Engineering Institute (SEI), 2005.
  10. Adams, Eslinger, Owens, Rich, "Software Acquisition Best Practices: 2004 Edition", 3rd OSD Conference on the Acquisition of Software-Intensive Systems.