7.7 - Acquisition Guidance

1. Purpose

This document provides guidance for projects implementing the NPR 7150.2A requirements that address software acquisition, including SWE-033, SWE-037, SWE-038, SWE-045 through SWE-048, and SWE-102. This guidance is intended for all persons responsible for the software acquisition process, from the planning stages through contract close-out.

1.1 Roles

  • Project Manager: Approve procurement plan
  • Software Project Lead: Prepare procurement plan; monitor execution of contract
  • System Engineer: Conduct trade studies and engineering analyses
  • Contracting Officer (CO): Prepare contracts
  • Contracting Officer Technical Representative (COTR): Prepare contracts
  • Technical Authority: Review SOW


2. Planning

Software acquisition begins with an identified need that requires a solution. During the planning stage, candidate solutions to that need are evaluated; the options include:

  • In-house development/service
  • Contracted development/service
  • Acquire OTS product
  • Use/enhance existing product/service

If the solution to the need will involve software, NPR 7150.2A applies and the acquisition planning guidance below should be applied:

  1. If not already done, define the scope of the system of interest.
  2. If not already done, identify the goals and objectives for the software portion of the system.
  3. Identify the technical requirements (functional, operational, performance).
  4. Perform "make or buy" market research/trade studies to determine if an OTS solution exists (a weighted-scoring sketch appears at the end of this list):
    • Establish criteria (and a plan) for the studies:
      • Technical requirements
      • NPR 7150.2A classification
      • Constraints and limitations (cost, schedule, resources)
      • Use past studies, known alternatives, existing make/buy criteria
    • Conduct studies.
      • Assess potential products and technologies
      • Assess how well technical requirements are addressed
      • Assess estimated costs, including support
      • Identify risks (delivery, safety, development practices used by supplier, supplier track record, etc.)
      • Assess provider business stability, past performance, ability to meet maintenance requirements, etc.
    • Identify in-house capabilities to meet the need:
      • Assess availability of existing products which could meet the need or be modified to meet the need
      • Assess availability of qualified personnel for development or modification activities
      • Assess estimated costs (time, personnel, materials, etc.), including support
        • Use past projects as basis, where appropriate
      • Identify risks
    • Determine if solution will be custom made, an existing product, or a modified existing product.
    • Review the COTS/GOTS/MOTS guidance in the NPR 7150.2A handbook for additional considerations.
  5. Identify any acquisition risks based on requirements and "make or buy" decisions.
  6. Document analysis:
    • Expected classification of the software to be acquired
    • Availability of in-house staff and funding resources
    • Availability of the software product(s)
    • Projected licensing and support costs
    • List of potential suppliers
    • Security considerations
    • Potential risks related to supplier's viability and past performance
  7. Document solution choice and basis for that choice:
    • Estimate of in-house vs. acquisition costs (including OTS solutions and any associated costs for requirements not met by the OTS solution)
    • Comparison of cost estimates to available funding
    • Risk assessment
    • Assumptions, observations, rationale, determining factors
    • Significant issues, impacts of each option
    • If solution is in-house development/service, exit this procedure
    • If solution is to acquire product/service, continue tailoring as needed based on development under contract or purchase OTS solution
    • Other planning decisions resulting in best overall value to NASA
    • Description of chosen acquisition strategy
  8. Identify stakeholders based on requirements and "make or buy" decisions:
    • Those directly concerned with, or affected by, the acquisition decision.
    • May include management, the project team, procurement, customers, end users, and suppliers.
  9. Report analysis and resulting decision to appropriate stakeholders.
  10. Document lessons learned for future acquisition activities.
  11. Develop acquisition schedule, including solicitation, supplier selection, supplier monitoring, and product acceptance and transition to operations, as appropriate.
  12. Develop acquisition plan using center-specific template.
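
The trade studies in step 4 and the documented solution choice in steps 6 and 7 are easier to review when criteria, weights, and scores are captured in a consistent form. The Python sketch below is one minimal weighted-scoring approach; the criteria, weights, and scores shown are hypothetical placeholders, not values prescribed by NPR 7150.2A, and should be replaced with the project's own make/buy criteria, constraints, and cost estimates.

    # Minimal weighted-scoring sketch for a "make or buy" trade study.
    # The criteria, weights, and scores below are hypothetical placeholders,
    # not values prescribed by NPR 7150.2A.

    WEIGHTS = {
        "technical_fit": 0.35,      # how well technical requirements are met
        "lifecycle_cost": 0.25,     # development plus licensing and support
        "schedule": 0.15,
        "supplier_record": 0.15,    # stability, past performance, maintenance
        "risk": 0.10,               # delivery, safety, development practices
    }

    # Scores: 1 (poor) through 5 (excellent) for each alternative under study.
    ALTERNATIVES = {
        "in-house development": {"technical_fit": 5, "lifecycle_cost": 2,
                                 "schedule": 3, "supplier_record": 4, "risk": 3},
        "contracted development": {"technical_fit": 4, "lifecycle_cost": 3,
                                   "schedule": 4, "supplier_record": 3, "risk": 3},
        "COTS product": {"technical_fit": 3, "lifecycle_cost": 5,
                         "schedule": 5, "supplier_record": 4, "risk": 2},
    }

    def weighted_score(scores):
        """Combine criterion scores into a single weighted figure of merit."""
        return sum(WEIGHTS[criterion] * score for criterion, score in scores.items())

    if __name__ == "__main__":
        ranked = sorted(ALTERNATIVES.items(),
                        key=lambda item: weighted_score(item[1]), reverse=True)
        for name, scores in ranked:
            print(f"{name:25s} {weighted_score(scores):.2f}")

The resulting ranking is only an input to the documented decision in step 7; the assumptions, weights, and any non-quantified factors should be recorded alongside it.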

3. Solicitation, Selection, Award

Once the planning activities for software acquisition have been completed and the decision has been made to acquire the software or software development services, a selection process needs to be followed to choose the best provider for the project. This process typically begins with development of a Statement of Work (SOW). The following recommendations should be considered as part of this process. Additionally, a SOW checklist is included in the Tools section of this guidance document.

1. Develop solicitation, including SOW:

  • Acceptance criteria
  • Solicitation constraints
  • Proper requirements from the software development perspective:
    • Software classification (from NPR 7150.2A) and safety criticality (from the Software Safety Litmus Test)
    • Technical requirements
    • Development standard to be followed, if any
    • Development lifecycle to be followed, or indication that developer can choose appropriate lifecycle
    • Surveillance activities (and acquirer involvement) including monitoring activities, reviews, audits, decision points, meetings, etc.
    • Management and support requirements (project management, schedule and schedule updates, configuration management, non-conformance and change tracking, risk management, metrics collection, IV&V support, required records, traceability records, electronic records and code access, V&V, etc.)
    • Requirements for maintenance, support, updates, new versions, training to be included in lifecycle and cost estimates
    • Concise task and deliverable descriptions, including delivery format
    • Media format for code deliverables
    • Templates or Data Item Descriptions (DID) for documentation deliverables
    • Complete set of deliverables with delivery dates, review periods, and acceptance procedures for each
    • Time period for responses to review findings, including making changes
    • Data Requirements Documents for deliverables, if appropriate
    • Government and contractor proprietary, usage, ownership, warranty, data, and licensing rights, including transfer
    • Requirement to include notice of use of open source software in developed code
    • OTS software requirements (identify which requirements are met by OTS s/w, provide OTS s/w documentation such as usage instructions, etc.)
    • List of all mandatory NASA software development standards and DIDs, as applicable
    • Requirements for non-expired CMMI rating as measured by a Lead Appraiser certified by the Software Engineering Institute (SEI) (see the Useful Tools section below for sample text for the solicitation)

Acquisition should not simply levy NPR 7150.2A as a whole on a potential supplier, because it contains some NASA institutional requirements. If a project is acquiring software development services for Class A through H software, the project should levy only the applicable minimal set of supplier requirements, plus additions that address specific risk concerns. Requirements that are the responsibility of the Agency, Center, or Headquarters should not be levied on a contractor, as they will cause confusion and unnecessary expense.

If the class of software and the safety critical designation are known when the SOW is written, the project can levy, via a compliance matrix, the project/system specific set of NPR 7150.2A requirements to be satisfied by the contractor. If the class and/or safety critical designation are not yet known, the SOW should list applicable requirements for each class and safety critical designation with instructions to comply accordingly when a class and safety critical designation are determined.

In the case of requirements marked P(Center), the responsible NASA Center and project supply the applicable elements of these requirements for inclusion in the SOW.

The full list of project-related requirements can be found in the compliance matrices in this handbook. A minimal sketch of filtering such a matrix by software class follows.
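
When the class and safety-critical designation are known, the project-specific matrix can be assembled mechanically from a requirements applicability table. The Python sketch below illustrates that filtering, assuming a simple tabular representation; the requirement IDs (SWE-EXAMPLE-1, SWE-EXAMPLE-2) and applicability codes shown are placeholders, not actual NPR 7150.2A entries, which must be taken from the NPR 7150.2A compliance matrix.

    # Minimal sketch of filtering a requirements applicability table by software
    # class to produce the contractor-facing compliance matrix for a SOW.
    # The requirement IDs and applicability codes below are placeholders only;
    # take the authoritative values from the NPR 7150.2A compliance matrix.

    # Applicability per class: "X" = required, "P" = required with conditions
    # (e.g., P(Center)), "" = not applicable. Values shown are illustrative.
    REQUIREMENTS = [
        {"id": "SWE-EXAMPLE-1", "text": "Plan software acquisition activities.",
         "applicability": {"A": "X", "B": "X", "C": "X", "D": "P", "E": ""}},
        {"id": "SWE-EXAMPLE-2", "text": "Maintain bidirectional traceability.",
         "applicability": {"A": "X", "B": "X", "C": "P", "D": "", "E": ""}},
    ]

    def contractor_matrix(software_class):
        """Return the subset of requirements applicable to the given class."""
        return [r for r in REQUIREMENTS if r["applicability"].get(software_class)]

    if __name__ == "__main__":
        for req in contractor_matrix("C"):
            code = req["applicability"]["C"]
            print(f'{req["id"]} [{code}]: {req["text"]}')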

2. Ensure proper review of SOW before delivery to procurement/contracts official:

  • Technical Authority to ensure proper flow down of NPR 7150.2A requirements
  • Coordinate with the Safety and Mission Assurance Office to ensure all QA requirements, clauses, and intended delegations are identified and included

3. Identify potential suppliers.

4. Distribute solicitation package.

5. Evaluate proposals (typically an evaluation team), based on selection criteria, including:

  • Cost estimation comparisons
  • Evaluation of how well proposed solutions meet the requirements (including interface and technology requirements, NPR 7150.2A requirements)
  • Staff available
  • Past performance
  • Software engineering and management capabilities
  • Prior expertise on similar projects
  • Available resources (facilities, hardware, software, training, etc.)
  • CMMI ratings
    • Check the SEI Published Appraisal Results (PARs) to confirm non-expired rating (http://sas.sei.cmu.edu/pars)
    • Be sure to check the scope of the organization holding the CMMI rating to confirm the rating is held by the specific organization submitting the proposal
  • Other factors relevant to the project

6. Select supplier/contractor and document basis for selection.

7. Negotiate and finalize contract:

  • Based on SOW
  • Identify and include management reviews and meetings, such as:
    • Formal reviews, such as those found in NPR 7123.1 and NPR 7120.4
    • Technical reviews
    • Progress reviews
    • Peer reviews (see Peer Reviews and Inspection topic guidance in this handbook)
    • Software quality assurance meetings
    • System integration test and verification meetings
    • System safety meetings
    • Configuration management meetings
    • Other reviews relevant to this project
  • Consider including in the contract provisions (with a description of the method to be used) for verification of:
    • Contractor handling of requirements changes
    • Accuracy of contractor transformation of high-level requirements into software requirements and detailed designs
    • Interface specifications between the contractor's product and systems external to it
    • Adequacy of contractor's risk management plan and its implementation in accordance with the required activities in the project Software Risk Management Plan
    • Adequacy of the contractor's integration and test plan and its implementation in accordance with the required activities in the project Software Integration and Test Plan
    • Adequacy of the contractor's configuration management plan and its implementation in accordance with the required activities in the project Software Configuration Management Plan
  • Consider for inclusion in the contract the content and frequency of progress reports and metrics submissions
  • Consider for inclusion in the contract identification of quality records to be maintained by the supplier
  • Consider for inclusion in the contract the delivery process and how it will be accomplished; if incremental development and delivery are agreed upon, state how the validation process works (e.g., incremental validation) and whether it requires integration and test with software/hardware products developed by the acquirer and/or other contractors or organizations (other institutes, universities, etc.)
  • Consider for inclusion in the contract a policy for maintaining the software after delivery: who is responsible for maintenance of the software, tools, testbeds, and documentation updates

4. Monitoring and Quality Assurance

Once the provider has been chosen, the acquisition process moves into a monitoring role. The following guidance should be included when establishing the process for provider monitoring and quality assurance:

  1. Provide technical requirements interpretation for contractor.
  2. Ensure contractor requirements documents meet original intent.
  3. Evaluate contractor progress with respect to cost (a cost/schedule tracking sketch appears at the end of this list).
  4. Periodically monitor contractor skill mix to ensure agreed-upon skills and experience levels are being provided.
  5. Oversee government-furnished equipment (GFE) to ensure equipment and information are provided in a timely manner.
  6. Periodically assess contractor processes to ensure conformance to process requirements stated in the contract.
  7. Review and assess the adequacy of contractor-provided documentation and ensure contractor implementation of feedback; consider using Formal Inspections to accomplish this task.
    Track status considering the following example questions:
    • Is the contractor meeting their staffing plan?
    • Have the project and the contractor met the user's needs?
    • Does the contractor have stable, educated staff?
    • Does the contractor's project have adequate resources (e.g., adequate staffing and computer resources)?
    • Is there realistic planning/budgeting in place?
    • Is the build plan being met?
    • Does the contractor have a good understanding of what is required?
    • Are the requirements stable?
    • Is the completion of designed functionality visible?
    • Is the evolving capability and performance of the contractor's product likely to impact development on the acquirer side of the interface?
    • Are integration and testing proceeding as planned?
    • Is contractor cost/schedule performance on target?
    • Is contractor developing a quality product?
  8. Provide regular status reviews to higher-level management on contractor progress.
  9. Regularly assess status of identified risks and provide reports during management reviews.
  10. Software engineering should provide technical review to the level required to enhance the probability of mission success (see the Useful Tools section below for a list of areas to consider for software engineering technical review).
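
Item 3 above calls for evaluating contractor progress with respect to cost, and items 8 and 9 for regular status and risk reporting. One common approach, assuming the contract requires earned-value style reporting (planned value, earned value, actual cost), is to compute cost and schedule performance indices each reporting period. The Python sketch below is illustrative only; the figures and the 0.95 alert threshold are hypothetical and not prescribed by NPR 7150.2A.

    # Minimal sketch of tracking contractor cost/schedule performance from
    # periodic status data, assuming earned-value figures are reported.
    # CPI = EV / AC (cost efficiency); SPI = EV / PV (schedule efficiency).
    # The reported figures and the 0.95 alert threshold are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class StatusReport:
        period: str
        planned_value: float   # budgeted cost of work scheduled to date ($K)
        earned_value: float    # budgeted cost of work performed to date ($K)
        actual_cost: float     # actual cost of work performed to date ($K)

        @property
        def cpi(self):
            return self.earned_value / self.actual_cost

        @property
        def spi(self):
            return self.earned_value / self.planned_value

    reports = [
        StatusReport("month 1", planned_value=120, earned_value=115, actual_cost=118),
        StatusReport("month 2", planned_value=250, earned_value=230, actual_cost=255),
    ]

    ALERT_THRESHOLD = 0.95  # flag indices below this for management review

    for report in reports:
        flag = "  <-- review" if min(report.cpi, report.spi) < ALERT_THRESHOLD else ""
        print(f"{report.period}: CPI={report.cpi:.2f} SPI={report.spi:.2f}{flag}")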

5. Contract Administration

In addition to monitoring the selected provider's progress and quality, contract administration activities are carried out for the project. The following guidance should be included when establishing the process for contract administration:

  1. Regularly assess contractor financial data and invoices against budget.
  2. Work with Contracting Officer to ensure timely resolution of any contract-related issues.
  3. Work with the Contracting Officer to ensure needed modifications to contract terms and conditions are addressed in a timely manner, primarily those affecting schedule, costs, services/products, resources (people, facilities), and deliverables.
  4. Periodically evaluate contractor performance in manner consistent with contract and provide documented evaluation to Contracting Officer.

6. Product Acceptance and Control

Once the provider is ready to deliver the software product, the acquirer should have a process in place for review and acceptance of the product. The following guidance should be included when establishing the process for product acceptance:

  1. Review deliverables based on agreed-upon acceptance criteria (or generally accepted standards if no criteria established), document results, and work with contractor to resolve acceptance issues.
    • Typically, an acceptance test plan is created addressing the following:
      • Acquirer and contractor roles and responsibilities
      • Defined Test Strategy
      • Defined Test Objectives
      • Defined Acceptance Criteria
      • Developed Test Scenarios
      • Developed Test Scripts
      • Developed Test Matrix
      • Time and Resources Estimate
      • Approval Cycle
      • Strategy for post-delivery problem resolutions
    • Once approved, the test plan is executed and results are documented (a simple results-tracking sketch appears at the end of this list):
      • Select Test Tools
      • Select and Train Team Members
      • Execute the Test Plan (Manual and Automated Methods)
      • Track Test Progress
      • Regression Test
      • Document Test Results
      • Resolve Problems
  2. Place formal deliverables under configuration control.
  3. After acceptance of delivered products, support transition to an operational and/or maintenance environment.
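
Executing the acceptance test plan produces results that need to be tracked against the agreed-upon acceptance criteria before delivery is accepted. The Python sketch below shows one simple way to record and summarize those results; the test IDs, criteria, and problem report numbers are hypothetical.

    # Minimal sketch for tracking acceptance test results against agreed-upon
    # acceptance criteria. The test IDs, criteria, and problem report numbers
    # below are hypothetical placeholders.

    acceptance_results = [
        {"test_id": "AT-001", "criterion": "Meets functional requirement FR-12",
         "status": "pass", "problem_report": None},
        {"test_id": "AT-002", "criterion": "Completes nominal scenario in under 5 s",
         "status": "fail", "problem_report": "PR-0042"},
        {"test_id": "AT-003", "criterion": "Recovers from simulated power cycle",
         "status": "pass", "problem_report": None},
    ]

    def acceptance_summary(results):
        """Summarize pass/fail counts and collect open problem reports."""
        passed = sum(1 for r in results if r["status"] == "pass")
        open_prs = [r["problem_report"] for r in results if r["status"] == "fail"]
        return {"passed": passed, "failed": len(results) - passed,
                "open_problem_reports": open_prs}

    if __name__ == "__main__":
        summary = acceptance_summary(acceptance_results)
        print(summary)
        # Accept delivery only when every criterion passes or the remaining
        # issues have been formally dispositioned with the contractor.
        print("Ready to accept:", summary["failed"] == 0)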

7. Contract Close-Out

The final acquisition step is to close out the contract. The following guidance should be included when establishing the process for contract close-out:

  1. Verify satisfaction of all contract terms and conditions, considering the following sample questions:
    • Has the contract period of performance expired (level of effort type contract)?
    • Have all deliverables been delivered (completion type contract)?
    • Have all CDRL Items been delivered and accepted?
    • Was the contractor's performance of the SOW acceptable?
    • If the contract involved patent rights, has the final patent report been filed?
    • Has the final invoice been received?
  2. Verify return of all GFE, as appropriate.
  3. Complete final reports as requested by Contracting Officer.
  4. Provide final contractor performance evaluation to Contracting Officer.
  5. Capture Lessons Learned, if not captured earlier in the project lifecycle.

8. Useful Tools

The documents below are tools collected from various Centers that have been found to work well and produce good results. They are included here as aids for carrying out the software acquisition process.

8.1 Recommended Technical Review Activities List

Areas to consider for Software Engineering technical review consist of the following:

  • Performing independent assessment of software systems engineering, software processes, software products, software integration, and software test analyses
  • Reviewing all mission critical software products
  • Software schedule and resource assessments and analyses
  • Development of software technical oversight plans
  • Coordination of any software related issues with the project
  • Participate in reviews and Technical Interchange Meetings
  • Perform periodic audits on pre-defined process(es)
  • Chair board or serve as board member, or Review Item Disposition (RID) writer, at a formal review
  • Participate in resolution and closure of issues
  • Independent models to check and compare vendor data
  • Perform evaluations of software products (software documentation, code, etc.)
  • Serve as Software Technical Authority responsible for acquired software products
  • Planning and Project Support:
    • Support and coordinate software trade studies
    • Assess software development processes
    • Support review of system level requirements specifications
    • Support development and review of system level verification and validation test plans
    • Verify compliance with Software Development Plan(s)
    • Verify compliance with software quality and configuration management plans
    • Participate in project documentation reviews
    • Support risk management activities
    • Participate in project and software developer Review Boards, Technical Interchange Meetings, Working Groups and telecons
    • Participate in developer's daily and/or weekly software development activities to maintain knowledge of software development progress
    • Identify and track software metrics
    • Review and assess schedule of the software development activities
    • Provide a status of the developer's software progress, metrics and any problems to the project
    • Conduct periodic site visits as needed to attain knowledge of software development progress
    • Review and assess the content and completeness of instrumentation and command control list (engineering integration database)
  • Requirements analysis:
    • Verify absence of problems and risk items associated with requirements:
      • Documentation standards used and properly applied
      • System requirements clearly organized
      • Even emphasis and levels of detail
      • Consistent identification schemes
      • Clear and concise requirement statements
      • Good sentence structure
      • Good word selection, unambiguous terms
    • Track growth in size and complexity of requirements to identify positive/negative trends
    • Estimate variances in schedule and costs based on requirements size & completeness
    • Support software requirements problem and issue resolution
    • Review and assess the interface specifications and data
    • Verify software requirements traceability
    • Support software requirements walkthroughs
    • Support evaluation of potential requirements changes and associated impacts through the life of the project
  • Design Phase:
    • Support review of preliminary and detailed design specifications (DDS)
    • Support software design problem and issue resolution
    • Verify traceability of design to software requirements
    • Support design walkthroughs
  • Code analysis:
    • Track growth and complexity of source code modules across builds
    • Rank source code modules according to their relative risk, as determined by:
      • Percent of internal documentation
      • Overly large files or modules
      • Use of unstructured programming constructs
      • High decision or calling complexity
      • Unused or "dead" code
      • Poor implementation, if applicable
      • Compliance with program coding standards
    • Develop and maintain knowledge of code functionality
    • Present code functionality to subsystems for validity
    • Support code development and integration testing
    • Support software code problem and issue resolution
    • Support developer code walkthroughs
  • Test Phase:
    • Support development and review of test plans, test procedures and test cases
    • Support TRR:
      • Review and identify discrepancies in software documentation
      • Support final closure of discrepancies
    • Support software test problem and issue resolution
    • Support CSCI integration and test activities
    • Review software test reports
  • Software problem report and effort data analyses (a simple trend sketch appears at the end of this list):
    • Analyze Problem Reports and present understandable graphical summaries
    • Track error detection and correction rates
    • Assess adequacy of test program
    • Detect schedule risks early
    • Predict effective completion date
  • Software Metrics:
    • Help project office identify applicable software metrics
    • Review and assess the software metric data provided by the contractor
    • Develop, maintain and report software insight metric data to the project
  • Software Independent Verification and Validation (IV&V) support:
    • Perform software criticality assessments
    • Perform software risk assessments
    • Develop software IV&V project plans
    • Develop software IV&V statements of work
    • Support projects in review of all software IV&V products
    • Provide expertise and assistance to the projects in resolution and implementation of any software IV&V recommendations
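
The problem report and effort data analyses listed above (error detection and correction rates, schedule risk, completion prediction) can begin with simple arithmetic on periodic counts. The Python sketch below assumes weekly open/close counts of problem reports are available from the contractor; the counts and the linear backlog projection are illustrative only.

    # Minimal sketch of problem report trend analysis: detection and correction
    # rates per week, plus a naive linear projection of when the open backlog
    # reaches zero. The weekly counts below are hypothetical.

    weekly_opened = [12, 15, 9, 7, 5, 4]   # new problem reports per week
    weekly_closed = [3, 8, 10, 9, 8, 7]    # problem reports closed per week

    open_backlog = sum(weekly_opened) - sum(weekly_closed)

    # Average net burn-down over the most recent weeks.
    recent = 3
    net_burn_down = (sum(weekly_closed[-recent:]) - sum(weekly_opened[-recent:])) / recent

    print(f"Open problem reports: {open_backlog}")
    if net_burn_down > 0:
        print(f"Projected weeks to clear backlog: {open_backlog / net_burn_down:.1f}")
    else:
        print("Backlog is not shrinking; flag as a schedule risk.")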

8.2 Statement of Work Checklist

This checklist was taken directly from the Langley Research Center Statement of Work (SOW) Review Procedure, LMS-CP-5523 Rev. B, and includes practices recognized by OCE as practices that work very well for NASA. See the <Lee: insert link to document > for the latest version of this checklist.

Note: Items in gray text are provided as examples and explanatory guidance.
For additional guidance and examples on developing a Statement of Work, see http://sw-eng.larc.nasa.gov/docs/statements_of_work.html and LPR 5000.2, "Procurement Initiator's Guide", Sections 12 and 13.
8.2.1 Editorial Checklist
a. Is the SOW requirement in the form: "Who" shall "Do What"? E.g., "The Contractor shall (perform, provide, develop, test, analyze, or other verb followed by a description of what)."
Example SOW requirements:

  1. The Contractor shall design the XYZ flight software...
  2. The Contractor shall operate the ABC ground system...
  3. The Contractor shall provide maintenance on the following...
  4. The Contractor shall report software metrics monthly...
  5. The Contractor shall integrate the PQR instrument with the spacecraft...

b. Is the SOW requirement a simple sentence that contains only one requirement? Compound sentences that contain more than one SOW requirement need to be split into multiple simple sentences. (For example, "The Contractor shall do ABC and perform XYZ" should be rewritten as: "The Contractor shall do ABC. The Contractor shall perform XYZ.")
c. Is the SOW composed of simple, cohesive paragraphs, each covering a single topic? Paragraphs containing many requirements should be divided into sub-paragraphs for clarity.
d. Has each paragraph and subparagraph been given a unique number or letter identifier? Is the numbering / lettering correct?
e. Is the SOW requirement in the active rather than the passive voice? Passive voice leads to vague statements. (For example, state: "The Contractor shall hold monthly management review meetings..." instead of "Management review meeting shall be held monthly ...")
f. Is the SOW requirement stated positively as opposed to negatively? (i.e., replace statements such as "The Contractor shall not exceed the budgetary limits specified..." with "The contractor shall comply with the budgetary limits specified...")
g. Is the SOW requirement grammatically correct?
h. Is the SOW requirement free of typographic errors, misspellings, and punctuation errors?
i. Have all acronyms been defined in an Acronym List or spelled out in the first occurrence?
j. Have the quantities, delivery schedules, and delivery method been identified for each deliverable within the SOW or a separate attachment/section?
k. Has the content of documents to be delivered been defined in a separate attachment/section and submitted with the SOW?
l. Has the file format of each electronic deliverable been defined? (e.g., Microsoft – Project, Adobe – Acrobat PDF, National Instruments – Labview VIs)
8.2.2 Content Checklist
a. Are correct terms used to define the requirements?
1. Shall = requirement (binds the contractor)
2. Should = goal (leaves decision to contractor; avoid using this word)
3. May = allowable action (leaves decision to contractor; avoid using this word)
4. Will = facts or declaration of intent by the Government (use only in referring to the Government)
5. Present tense (e.g., "is") = descriptive text only (avoid using in requirements statements; use "shall" instead)
6. NEVER use 'must'
b. Is the scope of the SOW clearly defined? Is it clear what you are buying?
c. Is the flow and organizational structure of the document logical and understandable? (See LPR 5000.2 "Procurement Initiator's Guide", Section 12 for "helpful hints".) Is the text compatible with the title of the section it's under? Are sub-headings compatible with the subject matter of a heading?
d. Is the SOW requirement clear and understandable?
1. Can the sentence only be understood one way?
2. Will all terminology used have the same meaning to different readers without definition? Has any terminology for which this is not the case been defined in the SOW? (e.g., in a Definitions section or Glossary.)
3. Is it free from indefinite pronouns ("this", "that", "these", "those") without clear antecedents? (e.g., replace statements such as "These shall be inspected on an annual basis." with "The fan blades shall be inspected on an annual basis.")
4. Is it stated concisely?
e. Have all redundant requirements been removed? Redundant requirements can reduce clarity, increase ambiguity, and lead to contradictions.
f. Is the requirement consistent with other requirements in the SOW, without contradicting itself, without using the same terminology with different meanings, without using different terminology for the same thing?
g. If the SOW includes the delivery of a product (as opposed to just a services SOW):
1. Are the technical product requirements in a separate section or attachment, apart from the activities that the contractor is required to perform? The intent is to clearly delineate between the technical product requirements and requirements for activities the contractor is to perform. (E.g., separate SOW statements "The contractor shall" from technical product requirement statements such as "The system shall" and "The software shall".)
2. Are references to the product and its sub-elements in the SOW at the level described in the technical product requirements?
3. Is the SOW consistent with and does it use the same terminology as the technical product requirements?
h. Is the SOW requirement free of ambiguities? Make sure the SOW requirement is free of vague terms. (For example, "as appropriate", "any", "either", "etc.", "and/or", "support", "necessary", "but not limited to", "be capable of", "be able to")?
i. Is the SOW requirement verifiable? Make sure the SOW requirement is free of unverifiable terms. For example, "flexible", "easy", "sufficient", "safe", "ad hoc", "adequate", "accommodate", "user-friendly", "usable", "when required", "if required", "appropriate", "fast", "portable", "light-weight", "small", "large", "maximize", "minimize", "optimize", "robust", "quickly", "easily", "clearly", other "ly" words, other "ize" words. (A minimal automated first-pass check for items h and i appears at the end of this checklist.)
j. Is the SOW requirement free of implementation constraints? SOW requirements should state WHAT the contractor is to do, NOT HOW they are to do it. For example, "The Contractor shall design the XYZ flight software" states WHAT the contractor is to do, while "The Contractor shall design the XYZ software using object-oriented design" states HOW the contractor is to implement the activity of designing the software. In addition, too low a level of decomposition of activities can result in specifying how the activities are to be done, rather than what activities are to be done.
k. Is the SOW requirement stated in such a way that compliance with the requirement is verifiable? Does a means exist to measure or otherwise assess its accomplishment? Can a method for verifying compliance with the requirement be defined (e.g., described in a Quality Assurance Surveillance Plan)?
l. Is the background material clearly labeled as such (i.e., included in the background section of the SOW if one is used)?
m. Are the assumptions able to be validated and restated as requirements? If not, the assumptions should be deleted from the SOW. Assumptions should be recorded in a document separate from the SOW.
n. Is the SOW complete, covering all of the work the contractor is to do?
1. Are all of the activities necessary to develop the product included? (E.g., system, software, and hardware activities for the following: requirements, architecture, and design development; implementation and manufacturing; verification and validation; integration testing and qualification testing.)
2. Are all safety, reliability, maintainability (e.g., mean time to restore), availability, quality assurance, and security requirements defined for the total life of the contract?
3. Does the SOW include a requirement for the contractor to have a quality system (e.g., ISO certified), if one is needed?
4. Are all of the necessary management and support requirements included in the SOW? (For example, project management; configuration management; systems engineering; system integration and test; risk management; interface definition and management; metrics collection, reporting, analysis and use; acceptance testing; NASA Independent Verification and Validation support tasks.)
5. Are clear Performance Standards included and sufficient to measure contractor performance? (e.g., systems, software, hardware, and service performance standards for the following: schedule, progress, size, stability, cost, resources, and defects.) See Guidance on System and Software Metrics for Performance-Based Contracting at: http://sw-eng.larc.nasa.gov/docs/statements_of_work.html for more information and examples on Performance Standards.
6. Are all of the necessary service activities included? (For example, transition to operations, operations, maintenance, database administration, system administration, data management.)
7. Are all of the Government surveillance activities included? (For example, project management meetings; decision points; requirements and design peer reviews for systems, software, and hardware; demonstrations; test readiness reviews; other desired meetings (e.g., Technical Interchange Meetings); collection and delivery of metrics for systems, software, hardware, and services (e.g. to provide visibility into development progress and cost); electronic access to technical and management data; access to subcontractors and other team members for the purposes of communication.)
8. Are the Government requirements for contractor inspection and testing addressed, if necessary?
9. Are the requirements for contractor support of Government acceptance activities addressed, if necessary?
o. Does the SOW only include contractor requirements? It should not include Government requirements.
p. Does the SOW give the contractor full management responsibility and hold them accountable for the end result?
q. Is the SOW sufficiently detailed to permit a realistic estimate of cost, labor, and other resources required to accomplish each activity?
r. Are all deliverables identified (e.g., status, financial, product deliverables)? The following are examples of deliverables that are sometimes overlooked: management and development plans; technical progress reports that identify current work status, problems and proposed corrective actions, and planned work; financial reports that identify costs (planned, actual, projected) by category (e.g., software, hardware, quality assurance); products (e.g., source code, Maintenance/User Manual, test equipment); and discrepancy data (e.g., defect reports, anomalies). All deliverables should be specified in a separate document except for technical deliverables which should be included in the SOW (e.g. hardware, software, prototypes, etc.).
s. Does each technical and management deliverable track to a paragraph in the SOW? Each deliverable should have a corresponding SOW requirement for its preparation (e.g., the SOW identifies the title of the deliverable in parenthesis after the task requiring the generation of the deliverable).
t. Are all reference citations complete?
1. Is the complete number, title, and date or version of each reference specified?
2. Does the SOW reference the standards and other compliance documents in the proper SOW paragraphs?
3. Is the correct reference document cited and is it referenced at least once?
4. Is the reference document either furnished with the SOW or available at a location identified in the SOW?
5. If the referenced standard or compliance document is only partially applicable, does the SOW explicitly and unambiguously reference the portion that is required of the contractor?
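
Several of the editorial and content checks above (the "The Contractor shall ..." form, one requirement per sentence, no "must", no vague or unverifiable terms) lend themselves to an automated first pass before manual review. The Python sketch below is one such pass, assuming the SOW requirement statements are available as plain text; the word lists are abbreviated from items h and i and are not exhaustive, and the heuristics do not replace the full checklist.

    # Minimal first-pass checker for SOW requirement statements, based on the
    # editorial and content checklists above. It flags candidates for manual
    # review; it does not replace the full checklist. Word lists are abbreviated.

    import re

    VAGUE_TERMS = ["as appropriate", "etc.", "and/or", "but not limited to",
                   "be capable of", "support", "necessary"]
    UNVERIFIABLE_TERMS = ["flexible", "easy", "sufficient", "adequate",
                          "user-friendly", "robust", "quickly", "minimize"]

    def check_sow_statement(statement):
        """Return a list of findings for one SOW requirement statement."""
        findings = []
        text = statement.lower()
        if not re.search(r"\bthe contractor shall\b", text):
            findings.append('not in the "The Contractor shall ..." form')
        if len(re.findall(r"\bshall\b", text)) > 1:
            findings.append("may contain more than one requirement")
        if re.search(r"\bmust\b", text):
            findings.append('uses "must" instead of "shall"')
        for term in VAGUE_TERMS:
            if term in text:
                findings.append(f'vague term: "{term}"')
        for term in UNVERIFIABLE_TERMS:
            if re.search(rf"\b{re.escape(term)}\b", text):
                findings.append(f'unverifiable term: "{term}"')
        return findings

    if __name__ == "__main__":
        sample = "The Contractor shall provide adequate support as appropriate."
        for finding in check_sow_statement(sample):
            print("-", finding)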

8.2.3 Critical and/or Complex Requirements Checklist
Note: The checklist items below may duplicate items included earlier in this checklist but are summarized here to specifically identify what is required for critical and/or complex procurements.
a. Does the SOW include the name or identification of all critical and/or complex items (i.e., specifications (e.g., IEEE standards, NFPA standards), drawings, process requirements (e.g., LMS-CPs), inspection instructions, and other relevant technical data, as applicable)?
b. Are the requirements for design, test, examination, inspection, and related instructions for acceptance by the Government included in the SOW where applicable?
c. Are the requirements for test specimens (e.g. production method, number, storage conditions) included in the SOW if applicable? These specimens could be used by the Government for design approval, inspection, investigation or auditing.


8.3 Example Templates

The following NASA Data Item Descriptions (DIDs) are listed as sample templates for the documentation templates called for during the solicitation portion of the software acquisition process. Center Process Asset Libraries (PALs) should be consulted for DIDs and Data Requirements Documents (DRDs) relevant to a specific NASA center.
8.3.1 NASA-STD-2100-91
NASA DIDs are defined in the NASA-STD-2100-91 Software Documentation Standard, which is available at http://satc.gsfc.nasa.gov/assure/docstd.html. The NASA DIDs provide a format for a documentation set, including what needs to be addressed in each section.
MASTER DOCUMENTATION DATA ITEM DESCRIPTIONS

  • NASA-DID-000, Software Documentation Set DID
  • NASA-DID-999, Template DID

MANAGEMENT PLAN DATA ITEM DESCRIPTIONS

  • NASA-DID-M000, Management Plan DID
  • NASA-DID-M100, Acquisition Activities Plan DID
  • NASA-DID-M200, Development Activities Plan DID
  • NASA-DID-M210, Training Development Plan DID
  • NASA-DID-M300, Sustaining Engineering and Operations Activities Plan DID
  • NASA-DID-M400, Assurance Plan DID
  • NASA-DID-M500, Risk Management Plan DID
  • NASA-DID-M600, Configuration Management Plan DID
  • NASA-DID-M700, Delivery and Operational Transition Plan DID


PRODUCT SPECIFICATION DATA ITEM DESCRIPTIONS

  • NASA-DID-P000, Product Specification DID
  • NASA-DID-P100, Concept DID
  • NASA-DID-P200, Requirements DID
  • NASA-DID-P300, Architectural Design DID
  • NASA-DID-P400, Detailed Design DID
  • NASA-DID-P410, Firmware Support Manual DID
  • NASA-DID-P500, Version Description DID
  • NASA-DID-P600, User's Guide DID
  • NASA-DID-P700, Operational Procedures Manual DID

ASSURANCE AND TEST PROCEDURES DATA ITEM DESCRIPTIONS

  • NASA-DID-A000, Assurance and Test Procedures DID
  • NASA-DID-A100, Assurance Procedures DID
  • NASA-DID-A200, Test Procedures DID

MANAGEMENT, ENGINEERING, AND ASSURANCE REPORTS DATA ITEM DESCRIPTIONS

  • NASA-DID-R000, Management, Engineering, and Assurance Reports DID
  • NASA-DID-R001, Certification Report
  • NASA-DID-R002, Audit Report
  • NASA-DID-R003, Inspection Report
  • NASA-DID-R004, Discrepancy (NRCA) Report
  • NASA-DID-R005, Engineering Change Proposal
  • NASA-DID-R006, Lessons Learned Report
  • NASA-DID-R007, Performance/Status Reports
  • NASA-DID-R008, Assurance Activity Report
  • NASA-DID-R009, Test Report
  • NASA-DID-R010, Waiver/Deviation Request
  • NASA-DID-R011, Review Report

8.3.2 Center DIDs and DRDs
The following DIDs and DRDs are samples available from center PALs. Consult your own center PAL for templates relevant to work performed for your center.

Marshall Space Flight Center Templates
Available from http://spi.msfc.nasa.gov/templates.html and the individual Project Asset sections of the Marshall Space Flight Center PAL.

  • Software Configuration Management Plan
  • Software Test Report (STR) Template
  • Unit Test Procedure Template

Goddard Space Flight Center Templates
Available from http://software.gsfc.nasa.gov/ispaindx.cfm.

  • Software Management Plan/Product Plan (SMP/PP) for Class A, B, & C Software
  • ISD Software Management Plan/Product Plan (SMP/PP) for Class D&E Software
  • Version Description Document
  • Template for the Software Quality Assurance Plan
  • Configuration Management Plan Template
  • Other templates in progress or not available publicly

9. References

  1. Software Acquisition Statement of Work Guideline, SEPG-SWACQ-PRC-1, Glenn Research Center
  2. Prepare Presolicitation Documents, LMS-OP-4509, Langley Research Center
  3. Statement of Work (SOW) Review Procedure, LMS-CP-5523, Langley Research Center
  4. Product Requirements Development and Management Procedure, LMS-CP-5526, Langley Research Center
  5. Process for Conducting a Make/Buy Analysis, 580-SP-075-01, Goddard Space Flight Center, http://software.gsfc.nasa.gov/AssetsApproved/PA2.1.1.1.doc
  6. WBS Checklist Tool, Goddard Space Flight Center, http://software.gsfc.nasa.gov/toolsDetail.cfm?selTool=1.2.4.0
  7. Software Supplier Agreement Management Plan, Jet Propulsion Laboratory
  8. Software Assurance: Five Essential Considerations for Acquisition Officials, Mary Linda Polydys, Stan Wisseman, STSC Crosstalk, July 2005
  9. A Method for Reasoning About an Acquisition Strategy, Mary Catherine Ward, Joseph P. Elm, Software Engineering Institute (SEI), 2005
  10. Software Acquisition Best Practices: 2004 Edition, Adams, Eslinger, Owens, Rich, 3rd OSD Conference on the Acquisition of Software-Intensive Systems