2.5.4 For new contracts, the project shall establish a procedure for software supplier selection, including proposal evaluation criteria.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Classes C through E and Safety Critical are labeled "SO if D-E." This means that for Classes D and E, this requirement applies only to the safety-critical aspects of the software.
Class H is labeled "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to satisfy this requirement.
An established evaluation process includes criteria by which all proposals are weighed, allowing the results to be compared as equally and objectively as possible. A process with pre-set criteria helps ensure that each proposal is evaluated, and the final choice made, based on the most important features and capabilities required for project success.
The base set of suppliers may come from a variety of sources, including market analyses of software suppliers, pre-existing supplier lists, or simply the set of respondents to a request for proposals (RFP).
"In some organizations, acquirers may solicit proposals from a limited number of suppliers to reduce their cost and efforts for the solicitation. Acquirers should, however, ensure that they include suppliers who are capable of meeting the requirements and that a sufficient number of suppliers are included to provide a competitive environment. This competition enhances the leverage of the acquirer in achieving its objectives (e.g., providing different approaches to meeting requirements). In some cases, the organization pre-qualifies preferred suppliers from which an acquirer can choose provided the preferred suppliers meet the specific needs of the project. Choosing from preferred suppliers can greatly reduce the effort and time required for solicitation."
Supplier selection procedure
The selection procedure may be documented in a source selection plan.
The results of the selection procedure, including notes regarding advantages, disadvantages, and scores for each potential supplier, need to be documented and maintained.
Proposal evaluation criteria
Evaluation criteria are used to rate or score proposals received in response to a solicitation. Evaluation criteria for selecting a supplier must appear in the solicitation.
Consider the following possible criteria:
- Cost estimation comparisons.
- Evaluation of how well proposed solutions meet the requirements (including interface and technology requirements, NPR 7150.2 requirements, and others in the solicitation).
- Technical approach.
- Available staff and associated skills.
- Past performance including how well cost, schedule, performance, and technical requirements were met.
- Customer satisfaction.
- Software engineering and management capabilities.
- Prior expertise on similar projects (domain expertise).
- Available resources (facilities, hardware, software, training, etc.).
- Delivery processes and procedures.
- Process maturity.
- Capability Maturity Model Integration (CMMI) ratings (see SWE-032).
  - Check the Software Engineering Institute (SEI) Published Appraisal Results (PARs) to confirm the rating has not expired (http://sas.sei.cmu.edu/pars).
  - Confirm the scope of the rating covers the specific organization submitting the proposal.
- Total ownership and life-cycle costs.
- Intellectual property rights.
- Use of Open Source Software (see SWE-041) and COTS, GOTS, and MOTS (see SWE-027).
- Financial soundness.
- Experience and capabilities.
- Development and control processes.
- Technical assistance.
- Quality practices.
- Maintenance service.
- Product usage.
- Product warranty.
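The criteria above can be combined into a simple weighted scoring matrix so that all proposals are rated against the same pre-set criteria and compared objectively. The sketch below is illustrative only: the criterion names, weights, and 0-10 rating scale are assumptions for the example, not values prescribed by NPR 7150.2 or this Handbook. A project's source selection plan would define its own criteria and weights.

```python
# Hypothetical weighted scoring matrix for proposal evaluation.
# Criterion names, weights, and the 0-10 scale are illustrative assumptions.

CRITERIA = {  # pre-set weights for each criterion; should sum to 1.0
    "cost": 0.30,
    "technical_approach": 0.25,
    "past_performance": 0.20,
    "process_maturity": 0.15,
    "domain_expertise": 0.10,
}

def score_proposal(ratings: dict[str, float]) -> float:
    """Weighted sum of per-criterion ratings (each rated on a 0-10 scale)."""
    missing = CRITERIA.keys() - ratings.keys()
    if missing:
        raise ValueError(f"unrated criteria: {sorted(missing)}")
    return sum(weight * ratings[name] for name, weight in CRITERIA.items())

def rank_suppliers(proposals: dict[str, dict[str, float]]) -> list[tuple[str, float]]:
    """Return (supplier, total score) pairs, highest score first."""
    return sorted(
        ((name, score_proposal(ratings)) for name, ratings in proposals.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Example: two hypothetical proposals rated against the same criteria.
proposals = {
    "Supplier A": {"cost": 7, "technical_approach": 9, "past_performance": 8,
                   "process_maturity": 9, "domain_expertise": 6},
    "Supplier B": {"cost": 9, "technical_approach": 6, "past_performance": 7,
                   "process_maturity": 5, "domain_expertise": 8},
}
for supplier, total in rank_suppliers(proposals):
    print(f"{supplier}: {total:.2f}")
```

Because every proposal is scored against the same fixed weights, the per-criterion ratings and totals can be recorded directly as the documented results of the selection procedure, including each supplier's relative strengths and weaknesses.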
Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to supplier selection.
See Topic 7.3 - Acquisition Guidance in this Handbook for additional guidance and a broader discussion on software acquisition. The references in this topic may also provide additional guidance on creating a procedure for supplier selection.
If supplier selection includes Open Source Software products, see SWE-041 for guidance relevant to this type of software and software suppliers.
Additional guidance related to acquisition and supplier selection may be found in the following related requirements in this Handbook:
- Use of Commercial, Government, and Legacy Software (COTS, GOTS, MOTS, etc.)
- CMMI Levels for Class A, B, and C Software
- Acquisition vs. Development Assessment
- Open Source Software Notification
4. Small Projects
No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.
6. Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
Inheritance Review of the Mars Phoenix Flight System. Lessons Learned Entry 1807: "Despite the unusually large percentage of the Phoenix design and hardware that was inherited from previous Mars spaceflight projects, the format used for Phoenix project system and subsystem Inheritance Reviews (IRs) proved adequate to mitigate the risk within technical and programmatic constraints. A mission assurance checklist provided acceptance criteria to validate the flight worthiness of each subsystem. Consider using the Phoenix Inheritance Review format as a model for future missions that feature substantial inheritance. Plan carefully for the collection, analysis, and eventual archiving of records documenting the system and subsystem pedigree."