SWE-035 - Supplier Selection

1. Requirements

2.5.4 For new contracts, the project shall establish a procedure for software supplier selection, including proposal evaluation criteria.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E and Safety Critical are labeled "SO if D-E." This means that for Classes D through E, this requirement applies only to the safety-critical aspects of the software. Class H is labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.

Applicability matrix values per class (SC = Safety Critical, NSC = Not Safety Critical): A: SC X, NSC X; B: SC X, NSC X; C: SC *, NSC X; D: SC *, NSC not applicable; E: SC *, NSC not applicable; F: X; G: X; H: P (Center).

2. Rationale



When choosing a supplier to create software, it is important to use a consistent evaluation process for all potential suppliers. 


An established evaluation process includes criteria by which all proposals are weighed allowing the results to be compared equally and as objectively as possible. A process with pre-set criteria helps ensure that each proposal is evaluated and the final choice made based on the most important features and capabilities required for project success.


3. Guidance

The base set of suppliers may come from a variety of sources, including market analyses of software suppliers, pre-existing supplier lists, or simply the set of respondents to a request for proposals (RFP). 

"In some organizations, acquirers may solicit proposals from a limited number of suppliers to reduce their cost and efforts for the solicitation. Acquirers should, however, ensure that they include suppliers who are capable of meeting the requirements and that a sufficient number of suppliers are included to provide a competitive environment. This competition enhances the leverage of the acquirer in achieving its objectives (e.g., providing different approaches to meeting requirements). In some cases, the organization pre-qualifies preferred suppliers from which an acquirer can choose provided the preferred suppliers meet the specific needs of the project. Choosing from preferred suppliers can greatly reduce the effort and time required for solicitation.

"Depending on applicable regulations and project characteristics, the acquirer can determine to pursue a sole-source acquisition rather than a competitive bid. Acquirers should document the rationale for determining potential suppliers, particularly in the case of sole-source selection." (SWEREF-328)

An established procedure and set of evaluation criteria is used to select the most qualified supplier for a new contract. The selection procedure includes the evaluation criteria as well as the method for evaluating proposals. Supplier selection decisions "must be carefully managed in accordance with regulations governing the fairness of the selection process." (SWEREF-273)



Note that SWE-027, SWE-032, and SWE-041 contain criteria for certain types of software that are to be included in applicable RFPs, as well as the evaluation criteria.


Supplier selection procedure

The selection procedure may be documented in a source selection plan that contains the following suggested sections:

  • Roles and Responsibilities.
  • Facilities/Security capabilities.
  • Criteria for Selection.
  • Geographic Location.
  • Staff available to work on the contract.
  • Proposal Requirements.
  • Selection Process. (SWEREF-062)

Additionally, the selection procedure normally includes a source selection authority (SSA) as appropriate for the size or priority of the project (SWEREF-002). The SSA makes the final supplier selection using input from a selection/evaluation team. Members of the selection team are typically chosen and confirmed well before proposals arrive for evaluation. Members typically include technical experts, a contracting specialist, and software assurance. Having software assurance on the team is "essential not only for establishing appropriate Software Assurance requirements, but also in evaluating potential contractors and ensuring that secure software is delivered." (SWEREF-301)



The results of the selection procedure, including notes regarding advantages, disadvantages, and scores for each potential supplier, need to be documented and maintained.


If the selection process includes a period for questions or a period for negotiations with potential suppliers before a selection is made, those processes and any bounding regulatory restrictions that apply should be included in the process documentation. (SWEREF-328)

The NASA Systems Engineering Handbook (SWEREF-273) includes the following proposal evaluation advice:

  • "Give adequate weight to evaluating the capability of disciplines that could cause mission failure (e.g., hardware, software, thermal, optics, electrical, mechanical).
  • "Conduct a pre-award site visit of production/test facilities that are critical to mission success.
  • "Distinguish between "pretenders" (good proposal writers) and "contenders" (good performing organizations). Pay special attention to how process descriptions match relevant experience and past performance. While good proposals can indicate good future performance, lesser quality proposals usually predict lesser quality future work products and deliverables.
  • "Assess the contractor's Systems Engineering Management Plan (SEMP) and other items submitted with the proposal based on evaluation criteria that include quality characteristics (e.g., complete, unambiguous, consistent, verifiable, and traceable)."

Proposal evaluation criteria



Evaluation criteria are used to rate or score proposals received in response to a solicitation. Evaluation criteria for selecting a supplier must appear in the solicitation. 


Consider the following possible criteria:

  • Cost estimation comparisons.
  • Evaluation of how well proposed solutions meet the requirements (including interface and technology requirements, NPR 7150.2 requirements, and others in the solicitation).
  • Technical approach.
  • Available staff and associated skills.
  • Past performance including how well cost, schedule, performance, and technical requirements were met.
  • Customer satisfaction.
  • Software engineering and management capabilities.
  • Prior expertise on similar projects (domain expertise).
  • Available resources (facilities, hardware, software, training, etc.).
  • Delivery processes and procedures.
  • Process maturity.
  • Capability Maturity Model Integration (CMMI) ratings (see SWE-032).
    • Check the Software Engineering Institute (SEI) Published Appraisal Results (PARs) to confirm a non-expired rating (http://sas.sei.cmu.edu/pars).
    • Be sure to check the scope of the organization holding the CMMI rating to confirm the rating is held by the specific organization submitting the proposal.
  • Total ownership and life-cycle costs.
  • Intellectual property rights.
  • Use of Open Source Software (see SWE-041) and COTS (Commercial Off the Shelf), GOTS (Government Off the Shelf), and MOTS (Modified Off the Shelf) (see SWE-027).
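The criteria above are typically combined into a weighted scoring matrix so that proposals can be compared equally and as objectively as possible. The sketch below illustrates the idea; the criteria names, weights, scores, and supplier names are purely illustrative and not drawn from NPR 7150.2 or any Center procedure:

```python
# Illustrative weighted-scoring sketch for comparing supplier proposals.
# A real source selection plan defines its own criteria, weights, and
# scoring scale; these values are hypothetical.

# Weights sum to 1.0; each proposal is scored 0-10 per criterion.
WEIGHTS = {
    "technical_approach": 0.30,
    "past_performance": 0.25,
    "cost": 0.20,
    "process_maturity": 0.15,
    "staff_skills": 0.10,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-10) into a single weighted total."""
    if set(scores) != set(WEIGHTS):
        raise ValueError("proposal must be scored against every criterion")
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

proposals = {
    "Supplier A": {"technical_approach": 8, "past_performance": 6,
                   "cost": 7, "process_maturity": 9, "staff_skills": 7},
    "Supplier B": {"technical_approach": 7, "past_performance": 9,
                   "cost": 8, "process_maturity": 6, "staff_skills": 8},
}

# Rank suppliers by weighted total, highest first, for the selection authority.
ranking = sorted(proposals, key=lambda s: weighted_score(proposals[s]),
                 reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(proposals[name]):.2f}")
```

In practice the per-criterion scores and the notes behind them would be retained as part of the documented selection results described above.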

Additional evaluation considerations may be found in the supplier evaluation checklist in IEEE Std 1062-1998, IEEE Recommended Practice for Software Acquisition (SWEREF-213), which contains questions for consideration specific to:

  • Financial soundness.
  • Experience and capabilities.
  • Development and control processes.
  • Technical assistance.
  • Quality practices.
  • Maintenance service.
  • Product usage.
  • Product warranty.
  • Costs.
  • Contracts.


Note: Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to supplier selection.


See Topic 7.03 - Acquisition Guidance in this Handbook for additional guidance and a broader discussion on software acquisition. The references in this topic may also provide additional guidance on creating a procedure for supplier selection.

If supplier selection includes COTS/GOTS/MOTS products, see SWE-027 for guidance relevant to this type of software and software suppliers.

If supplier selection includes Open Source Software products, see SWE-041 for guidance relevant to this type of software and software suppliers.

Additional guidance related to acquisition and supplier selection may be found in the following related requirements in this Handbook:


  • SWE-027: Use of Commercial, Government, and Legacy Software (COTS, GOTS, MOTS, etc.)
  • SWE-032: CMMI Levels for Class A, B, and C Software
  • SWE-033: Acquisition vs. Development Assessment
  • SWE-038: Acquisition Planning
  • SWE-041: Open Source Software Notification




4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.


5. Resources


  1. Acquisition Guidance topic in this Handbook (https://nasa7150.onconfluence.com/display/7150/7.7+-+Acquisition+Guidance).
  2. Defense Acquisition University, "Defense Acquisition Guidebook" (http://at.dod.mil/docs/DefenseAcquisitionGuidebook.pdf), 2010.
  3. Office of Procurement, LaRC, Prepare Presolicitation Documents, Revision O-1, LMS-OP-4509, 2009.
  4. Polydys, M. and Wisseman, S., "Software Assurance: Five Essential Considerations for Acquisition Officials," CrossTalk, The Journal of Defense Software Engineering (http://www.crosstalkonline.org/storage/issue-archives/2007/200705/200705-0-Issue.pdf), May 2007. Accessed April 1, 2011.
  5. Jet Propulsion Laboratory, "Software Supplier Agreement Management Plan Template," available via the NASA Software Process Asset Library (https://nen.nasa.gov/web/software/nasa-software-process-asset-library-pal).
  6. Software Engineering Institute, "CMMI for Acquisition, Version 1.3," CMU/SEI-2010-TR-032 (http://www.sei.cmu.edu/reports/10tr032.pdf), 2010.
  7. NASA, "NASA Systems Engineering Handbook," NASA/SP-2007-6105, Rev 1 (http://www.ap233.org/ap233-public-information/reference/20080008301_2008008500.pdf), 2007.
  8. IEEE Computer Society, "IEEE Recommended Practice for Software Acquisition," IEEE Std 1062-1998 (http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=741938), 1998. A user account is needed to access IEEE standards via the NASA Technical Standards System (http://standards.nasa.gov/).
Div
idtabs-6

6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

Inheritance Review of the Mars Phoenix Flight System, Public Lessons Learned Entry 1807 (http://www.nasa.gov/offices/oce/llis/imported_content/lesson_1807.html): "Despite the unusually large percentage of the Phoenix design and hardware that was inherited from previous Mars spaceflight projects, the format used for Phoenix project system and subsystem Inheritance Reviews (IRs) proved adequate to mitigate the risk within technical and programmatic constraints. A mission assurance checklist provided acceptance criteria to validate the flight worthiness of each subsystem. Consider using the Phoenix Inheritance Review format as a model for future missions that feature substantial inheritance. Plan carefully for the collection, analysis, and eventual archiving of records documenting the system and subsystem pedigree."

  1. "Soliciting the participation of the spacecraft system contractor in evaluating the system compatibility of the inherited or commercial off-the-shelf (COTS) product functionality with project Level 1 and Level 2 requirements.
  2. "Conducting a mission assurance review and system engineering review in concert with the subsystem IRs.
  3. "Utilizing a mission assurance checklist that provided acceptance criteria to validate the flight worthiness of each subsystem. The checklist was derived from the form (Hardware Review & Certification Record) that JPL uses to assess the risk to flight hardware posed by mechanical or electrical integration with the system (Reference (3)).
  4. "Providing the project with a recommended course of action (e.g., modification or additional testing) in cases where a subsystem did not meet the checklist's acceptance criteria." (SWEREF-570)