(Revised 9/27/2012 through PROCLTR 2012-50)
DLAD PGI APPENDIX C
SOURCE SELECTION PROCEDURES
[September 18, 2012]
Chapter 1. Source Selection.
1.1 Purpose
1.2 Best-Value Continuum
1.3 Applicability
1.4 Source Selection Team Roles and Responsibilities
1.5 Program Management/Requirements Office Roles and Responsibilities
Chapter 2. Pre-Solicitation Activities.
2.1 Conduct Acquisition Planning
2.2 Develop a Source Selection Plan
2.3 Develop the Request For Proposal (No DLA Text)
2.4 Release the Request For Proposal (No DLA Text)
Chapter 3. Evaluation and Decision Process.
3.1 Evaluation Activities
3.2 Documentation of Initial Evaluation Results
3.3 Award without Discussions
3.4 Discussion Process
3.5 Final Proposal Revisions
3.6 Documentation of Final Evaluation Results
3.7 Conduct and Documentation of Comparative Analysis
3.8 Best-Value Decision
3.9 Source Selection Decision Document
3.10 Debriefings
Chapter 4. Documentation Requirements.
4.9 Rationale and Justification
4.23 Clearance Documentation
Chapter 5. Definitions.
5.23 Proposal Analysis Report (PAR)
5.24 Uncertainty
DLA Supplement to DoD SSP Appendices
Appendix A. Lowest Price Technically Acceptable Source Selection Process (No DLA Text)
Appendix B. Debriefing Guide (No DLA Text)
Attachments:
1. Source Selection Non-Disclosure Agreement
2. Conflict of Interest Statement
3. Source Selection Information Cover Sheet
4. Source Selection Plan Template
5. Initial/Final Technical Evaluation Report Template
11. Past Performance/Performance Confidence Assessment Report Template and Hints
12. Past Performance Questionnaire
13. Proposal Analysis Report Template
14. Comparative Analysis Report (CAR) and Award Recommendation Template
15. Source Selection Decision Document (SSDD) Template, Preparation Tips and Checklist
The following procedures are mandatory for all DLA competitive negotiated acquisitions not meeting the exemptions defined in DoD Source Selection Procedures (DoD SSP), paragraph 1.3, Applicability. These procedures supplement and describe how DLA will accomplish certain elements of the DoD SSP requirements. In every instance, users are responsible for verifying the regulatory guidance in the FAR, DFARS, and DLAD.
These procedures also provide sample templates that may be used in preparation of source selection-related documentation and the execution of source selection activities. The templates may be tailored based on the Best Value Continuum methodology used, complexity, and dollar amount of the acquisition.
The DoD Source Selection Procedures are referenced at DFARS 215.300 and are available at http://www.acq.osd.mil/dpap/policy/policyvault/USA007183-10-DPAP
This mandatory guidance sets forth supplemental procedures for conducting competitively negotiated source selections using Federal Acquisition Regulation (FAR) Part 15 Contracting By Negotiation procedures within the Defense Logistics Agency (DLA), and follows the numbering convention of the Department of Defense Source Selection Procedures (DoD SSP). DoD SSP sections are included only if there is DLA supplementation.
● Tradeoff Source Selection Process.
○ Performance Price Tradeoff (PPT) is a simplified best-value source selection strategy in which the only non-price evaluation factor is past performance, permitting a tradeoff between price and performance in reaching the award decision. This strategy permits recognition of the good performer and thereby minimizes the risk of awarding to a contractor that will not perform at an acceptable level. PPT also allows for SSA discretion in awarding to higher rated performers over lower rated performers if the price differential is warranted and considered to be best value. The PPT process can be used for any competitive negotiated acquisition for which it is unnecessary to distinguish levels of technical merit among the proposals to make an award decision. PPT is not appropriate for use with sole source buys or technically complex acquisitions. There are two basic PPT models, those with technical proposals and those without.
■ Without technical proposals. This simpler model is structured without the use of technical evaluation factors and submission of technical proposals. The assessment of recent and relevant past performance, resulting in a performance confidence assessment rating, is based on the results of surveys sent to customers identified by the respective offerors and other sources of information available to the contracting officer.
■ With technical proposals. The second model includes technical evaluation factors and/or subfactors that must be considered to ensure the offeror can satisfy certain minimum requirements. The factors/subfactors are evaluated on an acceptable/unacceptable, pass/fail, or similar basis, and are used only to determine technical acceptability and not as part of the tradeoff. As with the first model, the assessment of recent and relevant past performance, resulting in a performance confidence assessment rating, is based on the results of surveys sent to customers identified by the respective offerors and other sources of information available to the contracting officer.
This procedure is not a stand-alone document and therefore shall be used in conjunction with FAR Part 15, as supplemented, to include the DoD SSP, related law, regulation and policy. This procedure is applicable to all negotiated competitive acquisitions using FAR Part 15 procedures regardless of source selection approach taken within the best-value continuum. Compliance with this procedure is not required for acquisitions exempted in the DoD SSP paragraph 1.3. Proposed deviations or waivers from the DoD SSP shall be forwarded to HQ DLA Acquisition and Policy Division (J71) for review and processing to the Director, Defense Procurement and Acquisition Policy in accordance with DLAD Subpart 1.490. Proposed deviations or waivers from these DLA supplemental procedures shall be forwarded to HQ DLA Acquisition and Policy Division (J71) for review and processing to the Director, DLA Acquisition.
1.4 Source Selection Team (SST) Roles and Responsibilities.
1.4.1. The Source Selection Authority (SSA).
1.4.1.1. Appointment of SSA. DLAD 15.303 shall be followed for appointment of the SSA.
1.4.1.2. SSA Responsibilities.
1.4.1.2.2. When the SSA is the procuring contracting officer (PCO), he/she may also fulfill a role as a member of the SSEB.
1.4.1.2.3. When the SSA is the PCO, establish an SST as needed. The size and composition of the SST and SSEB are scalable and should be based on the technical complexity and dollar threshold of the acquisition. SST members who are contracting activity personnel may perform multiple roles, such as evaluation of the cost/price proposal and past performance. Evaluation teams and members shall be identified in the source selection plan (SSP).
1.4.1.2.6. Templates for the Source Selection Non-Disclosure Agreement (NDA) and Conflict of Interest Statement are provided at Attachments 1 and 2. The NDA may be used for an individual acquisition or as a blanket NDA that covers all acquisitions that may be reviewed. The blanket NDA may be valid for 12 months or for as long as the individual is assigned to the position.
1.4.1.2.6.1 All members of and non-counsel advisors to Source Selection Evaluation Boards, Technical Evaluation Boards, and Source Selection Advisory Councils who will have access to source selection information shall sign individual NDAs for each acquisition in which they participate. Counsel advising source selection boards and councils may sign a blanket NDA covering all source selections for which they provide advice.
1.4.1.2.6.2 A blanket NDA may be used for:
● Everyone in the local contracting office who participates in or has access to source selection sensitive information.
● Review personnel at the local and HQ levels above the SSEB/TEB and SSAC.
● A standing SSAC used by a field activity for multiple procurements, at the discretion of the field activity.
A blanket NDA may be valid for 12 months or for as long as the individual is in the position. PLFAs and other organizations that use blanket NDAs shall establish and issue local procedures to manage the blanket NDAs.
1.4.1.2.6.3 Senior DLA officials, such as the J3, J6, J7, General Counsel, Vice Director, and Director, should not be required to sign NDAs, although members of their staffs should if they will have access to source selection/procurement sensitive information. NDAs should also not be required from non-DLA participants in DPAP Peer Reviews, unless the head of the peer review team requires them.
The NDA shall be executed by individuals upon assignment to the SST. A Conflict of Interest Statement must be executed by each member of the SST after receipt of proposals (when the offerors/subcontractors are known) and before accessing a proposal (or performing their respective role).
1.4.1.2.9. Approve the SSP before solicitation release and approve subsequent revisions to the plan.
1.4.1.2.10. Document the best-value decision using a Source Selection Decision Document (SSDD). The SSDD should be tailored based on the size and complexity of the acquisition. An SSDD template, preparation tips, and checklist are found at Attachment 15.
1.4.2 PCO.
1.4.2.2. PCO responsibilities. The PCO shall:
1.4.2.2.2. For the acquisition of services over $10,000,000, ensure a copy of the Requirements Certification, as required by Section 863 of the FY2011 National Defense Authorization Act (NDAA), is included in the contract file and is briefed to the Acquisition Strategy Review Panel (ASRP). The certification is required for both DLA and non-DLA requirements.
1.4.2.2.3. In conjunction with the SSEB or evaluation team, ensure all information to be transmitted using electronic communications is carefully reviewed prior to release to prevent improper/unauthorized release of electronic documents that contain sensitive or embedded source selection files. If email is used to transmit source selection information, it shall be encrypted and the subject line shall include the phrase “Source Selection Information - See FAR 2.101 and 3.104”. See Attachment 3 for the Source Selection Information Coversheet.
1.4.2.2.4. Maintain evaluation material and any related supporting information developed by any member of the SST that has been presented in any form to the SSA as an official record that must not be altered. Updates, revisions, or changes to that evaluation information must be captured in subsequent documentation in a way that the original records remain distinct. Evaluation materials are considered working papers prior to their disclosure to the SSA. These working papers may be changed or modified by their author, as necessary, in order to support the evaluation process.
1.4.2.2.4.1. Ensure that any requests for source selection delegations are properly accomplished and documented in the source selection file.
1.4.2.2.9. Ensure appropriate contract clearances are obtained and documented in accordance with DLAD 1.690-4.
1.4.3. Source Selection Advisory Council (SSAC).
1.4.3.1 Establishment and Role of the SSAC.
The makeup and size of the SSAC may be tailored based on the estimated value and complexity of the acquisition.
1.4.3.1.2. An SSAC is not required for acquisitions using the lowest price technically acceptable (LPTA) process.
1.4.3.3 Responsibilities of SSAC.
1.4.3.3.1. The SSAC Chairperson shall:
1.4.3.3.1.2. Ensure all SSAC members are knowledgeable of their responsibilities.
1.4.3.3.1.3. Review the SSP and any revisions in conjunction with the SSAC prior to SSA approval.
1.4.3.3.1.4. Review significant acquisition actions requiring SSA approval, such as the decision to establish a competitive range and enter into discussions, in conjunction with the SSAC.
1.4.4. SSEB.
1.4.4.1. Composition of the SSEB.
1.4.4.1.1. The makeup and size of the SSEB may be tailored to the acquisition based on the estimated value and complexity of the acquisition. When the PCO fulfills the role and responsibilities of the SSA, the PCO shall appoint, as appropriate, evaluation teams and personnel to conduct the technical, past performance and cost evaluations. The lead functional member can simultaneously serve as an evaluation team member and SSEB Chair. Personnel may serve on multiple teams and/or perform multiple functions as determined by the SSA, except that teams involved with cost or price evaluation will normally not include non-contracting personnel who are also on a technical evaluation team unless the CCO has determined that it is appropriate to disclose cost information to the technical evaluators IAW DLAD 15.305(a)(4). The evaluation teams shall be identified in the SSP, and the team members may either be identified in the SSP or in separate appointment documents.
1.4.4.2. Responsibilities of the SSEB.
1.4.4.2. Prepare Evaluation Notices (EN) for the PCO’s use in discussions and review contractor responses. More on ENs can be found in Chapter 3, paragraph 3.2.1.1 of this guide. EN templates can be found at Attachment 10.
1.4.4.2.1. SSEB Chairperson shall:
1.4.4.2.1.2. After approval of the SSP, document any subsequent SSEB personnel replacement/additions in an addendum to the SSP. The replacement/addition of SSEB personnel requires SSA approval unless the SSA delegates approval authority to the SSEB or SSAC Chairperson.
1.4.4.2.1.5. Review ENs prior to their being provided to the PCO.
1.4.4.2.1.8. If the SSEB is established during the acquisition planning phase of the procurement, be responsible for establishing an effective liaison with the requiring office to ensure requirements are adequately addressed within the requirements documents.
1.5 Program Management/Requirement Office Roles and Responsibilities.
1.5.1.1. For the acquisition of services over $10,000,000, the program management or requirements office is responsible for providing the PCO with a copy of the “Requirements Certification” as required by Section 863 of Public Law 111-383 (2011 NDAA).
Sample Source Selection Organization [figure not reproduced]
2.1 Conduct Acquisition Planning.
2.1.1. Acquisition Planning.
2.1.1.3. Acquisition of Services. See DLAD 1.690, Contract Clearance and Oversight for specific reviews and approvals required for the acquisition of services. See paragraph 1.5 for requiring office certification of requirements for services over $10,000,000.
2.1.1.4. Independent Management Review. See DLAD 1.170, Peer Reviews, and 1.690, Contract Clearance and Oversight, for specific reviews and approvals required for peer reviews, the integrated acquisition review board (IARB), and the acquisition strategy review panel (ASRP).
2.1.3. Acquisition Strategy.
If an acquisition strategy briefing is required, it must identify the evaluation factors, the proposed SST’s organizational structure, risk, and all known unique aspects of the evaluation. An acquisition strategy may require a business case analysis (BCA).
The DOD Product Support BCA Guidebook, dated April 2011 is located at: https://eworkplace.dla.mil/sites/col11/DLA_Land_and_Maritime_PBL/Shared%20Documents/DoD%20PBL%20Policy/BCA%20Guidebook%20April%202011.pdf
2.2 Develop a Source Selection Plan.
The PCO and, if applicable, the SSEB chair (with assistance from SSEB members as necessary) shall prepare the SSP. An SSP template is at Attachment 4. This document should be tailored based on the size and complexity of the acquisition. The PCO shall maintain the SSP after approval (see DoD SSP paragraph 1.4.2.2.4). Subsequent proposed changes to the SST, to include the SSAC members when used, shall be documented in an attachment or addendum to the SSP (see 1.4.2.1.2.). The SSP should be submitted sufficiently in advance of the planned acquisition action to permit review and approval by the SSA and early establishment of the SST.
2.2.5. See DLAD 15.304 for required evaluation factors.
2.2.10. Identify and explain requested or approved deviations and delegations.
2.3 Develop the Request For Proposal. No DLA text.
2.4 Release the Request For Proposal. No DLA text.
3.1.1.1 If the solicitation requires a price realism analysis, this analysis must also be fully documented.
3.1.1.2 The cost/price or cost or price headings in the source selection documents/templates should be tailored based on what is being evaluated. If it is any variation of fixed price, just “price” should be used wherever the document or template states “cost/price”.
3.1.3. Past Performance Evaluation.
Relevancy is a threshold question when considering past performance, not a separate element of past performance. Relevancy shall not be described as a subfactor, and the DoD SSP Table 4 past performance relevancy ratings are not included in the final evaluation ratings for award, but instead are used to formulate the final past performance confidence ratings as set out in DoD SSP Table 5.
The personnel or team evaluating past performance should review the information provided for each of the referenced contracts to make the determination of its relevancy to the current requirement. In some cases, referenced contracts as a whole may be similar to the current contract, while in others, only portions of referenced contracts may be relevant. An example where only a portion of the past performance would be relevant is evaluation of the contractor’s management, planning, and scheduling of subcontractors where the solicitation requirement is for subcontract management skills.
Once relevancy has been determined, the evaluation team then reviews the past performance assessments and any other past performance information reflecting how well the contractor performed on the referenced contracts. The past performance assessments or information may be in many forms and come from a variety of sources. They could be numerical scores, as with Past Performance Information Retrieval System (PPIRS) Statistical Reporting (delivery and quality) or DLA’s Automated Best Value System (delivery and quality), or adjectival ratings on past performance questionnaires or PPIRS Report Cards. The evaluation team must analyze this information and determine a rating of how well the contractor performed on each referenced contract.
The SSP will include a detailed approach to evaluating the quality of past performance; this approach is not required to be stated in the solicitation. Relevant considerations, such as degree of conformance to contractual requirements, responsiveness and effectiveness in addressing issues, quality of customer service, and conformance to cost/price requirements or goals, should be used in the qualitative past performance evaluation. The qualitative past performance evaluation, combined with the relevancy (as defined in the solicitation) and recency (as defined in the solicitation) of the past performance, results in the final confidence assessment.
The adjectival ratings and definitions provided below are one example of how to analyze and assign a rating of how well the contractor performed on the referenced contracts. The evaluation team may elect to establish other descriptions.
Performance Level | Description
Outstanding | The contractor’s performance meets contractual requirements and exceeds many requirements to the Government’s benefit. The contractual performance was accomplished with few minor problems for which corrective actions taken by the contractor were highly effective. 10% under cost (not counting Government-directed cost growth).
Good | The contractor’s performance meets contractual requirements and exceeds some requirements to the Government’s benefit. The contractual performance was accomplished with some minor problems for which corrective actions taken by the contractor were effective. 5% under cost (not counting Government-directed cost growth).
Acceptable | The contractor’s performance meets contractual requirements. The contractual performance contained some minor problems for which corrective actions taken by the contractor appear satisfactory or were satisfactory. At cost (not counting Government-directed cost growth).
Marginal | Performance does not meet some contractual requirements. The contractual performance reflects a serious problem for which the contractor has not yet identified corrective actions, or the contractor’s proposed actions appear only marginally effective or were not fully implemented. Up to 20% over cost (not counting Government-directed cost growth).
Unacceptable | Performance does not meet most contractual requirements, and recovery is not likely in a timely manner. The contractual performance contains serious problem(s) for which the contractor’s corrective actions appear ineffective or were ineffective. More than 20% over cost (not counting Government-directed cost growth).
Not Applicable | Rating not available or provided.
The performance confidence assessment is normally provided at an overall level after evaluating aspects of the specified contracts’ past performance and other past performance information, focusing on recent performance that is relevant to the solicitation requirements. Below is a methodology process flow to show how the information flows from the individual contract evaluations into the single performance confidence assessment rating for the past performance factor.
METHODOLOGY PROCESS FLOW FOR PAST PERFORMANCE EVALUATION (SAMPLE)
 | Subfactor/Factor 1 | Subfactor/Factor 2 | Subfactor/Factor 3 | Overall Factor (if subfactors are used) | Cost (if being reviewed)
Contract A (Relevancy) | Very Relevant | Relevant | Very Relevant | Very Relevant | Very Relevant
Contract A (Performance) | Outstanding | Acceptable | Outstanding | Outstanding | Outstanding
Contract B (Relevancy) | Relevant | Very Relevant | Very Relevant | Very Relevant | Very Relevant
Contract B (Performance) | Acceptable | Acceptable | Acceptable | Acceptable | Acceptable
Contract C (Relevancy) | Relevant | Relevant | Somewhat Relevant | Relevant | Relevant
Contract C (Performance) | Good | Good | Acceptable | Good | Acceptable
Contract D (Relevancy) | Somewhat Relevant | Very Relevant | Relevant | Relevant | Somewhat Relevant
Contract D (Performance) | Good | Acceptable | Acceptable | Acceptable | Acceptable
Contract E (Relevancy) | Relevant | Somewhat Relevant | Somewhat Relevant | Somewhat Relevant | Somewhat Relevant
Contract E (Performance) | Acceptable | Acceptable | Acceptable | Acceptable | Acceptable
Note: Example assumes all three technical subfactors are equal.
3.1.3.2. Previous source selections or contractor performance assessments, if the information is recent and relevant.
3.2 Documentation of Initial Evaluation Results.
3.2.1. The SSEB will prepare “Discussion” evaluation notices (ENs) for potential use by the contracting officer during discussion (see paragraph 3.4 below) whenever a proposal aspect may not meet requirements, is unclear, or is unacceptable because the proposal either does not address a requirement or addresses it in a manner not allowed by the solicitation. The contracting officer will issue discussion ENs only to offerors in the competitive range. An EN template can be found in Attachment 10 of this guide.
3.2.1.1. Evaluation Notices (ENs).
3.2.1.1.1. ENs are used to track: a specific point of discussion; a specific proposal weakness or deficiency (see FAR 15.001); or an area requiring more clarification. Each EN is given a unique number to ensure offerors’ responses can be tracked to the appropriate Government concern. Additionally, ENs can be compiled into a spreadsheet format for each offeror to facilitate tracking and presentation; using a spreadsheet or EN-response matrix is a best practice and is highly recommended.
3.2.1.1.2. ENs and offeror responses to ENs shall be kept with the contract file until contract closeout. They may be referenced should questions arise during contract performance as to what was understood by both parties at time of award.
3.3 Award without Discussions.
3.3.3. The PCO shall obtain contract clearance approval in accordance with DLAD 1.690 prior to the SSA making the decision to award without discussions.
3.4.1. Through discussions, Government evaluators obtain the necessary information from offerors within the competitive range to resolve outstanding issues. A “Clarification” or “Communications” EN should be used to advise an offeror of adverse past performance, and may be used IAW FAR 15.306(a)(2) if discussions will not be held, or before establishment of the competitive range IAW FAR 15.306(b)(1)(i). Do not provide names of individuals providing information about a contractor’s past performance. The issuance of Clarification and Communication ENs should not result in the revision of an offeror’s proposal.
3.4.2. If requested by the contracting officer, the SSEB shall prepare the competitive range briefing, which shall be reviewed by the SSAC (if established for the procurement). The contracting officer establishes the competitive range IAW FAR 15.306(c)(1), and, if requested by the SSA or as required by the SSP, will brief the SSA for approval of the competitive range determination.
3.4.3. The PCO may provide offerors in the competitive range with their own initial ratings and results of their own initial pricing analysis or total evaluated price. When interim ratings and pricing analysis are provided prior to requesting final proposal revisions, the ratings shall reflect the results of discussions with the offeror.
3.4.3.1. When addressing adverse past performance with an offeror, the names of individuals who provided information about the offeror’s past performance shall not be identified.
3.5 Final Proposal Revisions (No DLA Text).
3.6 Documentation of Final Evaluation Results.
3.6.1. A Proposal Analysis Report (PAR) will be used to document the results of the SSEB’s evaluation. The PAR shall be prepared by the SSEB. The SSEB Chairperson and PCO (and the SSAC Chairperson if an SSAC is used) shall sign the completed PAR (following review by the SSAC, if used). The PAR evaluates each proposal against solicitation requirements and NOT other proposals. A tailorable PAR template is available at Attachment 13 of this guide.
3.6.1.1. When the SSA is other than the PCO, a mandatory source selection decision briefing is prepared by the SSEB (with review by the SSAC, if used). A tailorable Award Recommendation template is available at Attachment 14 of this guide. The SSEB will brief the resolution of prior deficiencies and weaknesses, and the SSA may discuss with the team any issues or remaining questions regarding the offerors’ proposals. The briefing should contain matrices displaying the color and/or adjectival ratings for the technical factor(s) and subfactor evaluations and, if rated separately, a technical risk rating; the past performance evaluation; and the cost and/or price analysis for all offerors. The briefing should also contain supporting narrative in bullet form characterizing all strengths, deficiencies, weaknesses, and performance confidence information to be considered by the SSA regarding the comparison of offerors’ proposals and past performance. If a tradeoff decision is required, strengths, deficiencies, weaknesses, and risk ratings shall be assessed for their potential benefit to, or undesirable impact upon, the Government. Also include those positive and negative aspects which affect the performance confidence assessment, if assigned. Finally, address the proposed cost or price assessment. As a minimum, the following information should also be briefed:
● Recap of distinguishing aspects of this acquisition
● Funding issues
● Contractual considerations
● Exceptions to Terms and Conditions
● Recap of factors and relative importance
● Evaluation criteria for each factor/subfactor
● Summary of offerors’ proposed approaches
● The Source Selection Advisory Council’s (if used) analysis
● The source selection recommendation of the SSET or SSAC, if used, and any minority opinion.
3.6.2. The PAR will include, if applicable, any minority opinion(s).
3.7 Conduct and Documentation of Comparative Analysis.
3.7.3. A tailorable Comparative Analysis Report and Award Recommendation template is provided at Attachment 14.
3.8.1. The PCO shall obtain contract clearance approval prior to the SSA making a source selection decision in accordance with DLAD 1.690. [Note: This does not prohibit the creation of a draft SSDD.]
3.8.2.1. The rationale and justification for business decisions and non-cost or price tradeoff determinations must be reasonable, consistent with the evaluation factors listed in the solicitation, and adequately documented. “Adequately documented” means written point-by-point qualitative comparisons of each offer against the solicitation’s source selection criteria, and the rationale for any business judgments or tradeoffs. (Specific references, to include page and paragraph numbers, are useful during review.) The rationale must include a comparative analysis of offerors’ relative strengths and weaknesses in all factors and subfactors and their advantages or disadvantages to the Government. There is no requirement to give credit for special features of a proposal if it has been reasonably determined and documented that such features will not make a meaningful contribution or better satisfy Government needs, or otherwise are of no significant benefit to the Government. The decision documentation must include the rationale for a determination that a particular feature or aspect of a proposal does or does not provide a significant benefit to the Government.
3.8.2.2. The SSA may use one of the following procedures in making the Source Selection Decision:
● The SSA solely performs the comparative analysis. This normally occurs when the PCO is also the SSA. The SSA will evaluate the proposals, document the analysis, and perform the comparative analysis of the cost/price and non-cost factors of each offer, including the tradeoff analysis if required. The SSA will write the SSDD.
● The PCO drafts the SSDD. The PCO, using input from the SSEB and SSAC, when applicable, will perform the comparative evaluation and document the file on the cost/price and non-cost factors. The PCO will also draft the SSDD for the SSA. The SSA must perform an independent review and evaluation. If the SSA concurs with the contracting officer’s recommendation, the SSA will adopt that decision by signing the document. The SSA may reject the PCO’s draft and direct revisions to the SSDD, edit the draft SSDD, or write the SSDD to reflect the SSA’s independent conclusion.
3.9 Source Selection Decision Document (SSDD).
3.9.1. An SSDD shall be prepared for all source selections. In those instances where information is contained in other acquisition documents, the SSDD should refer to the original document, with a copy attached. A tailorable SSDD Template, Preparation Tips, and Checklist are provided at Attachment 15. As a minimum, the SSDD shall:
● Describe the solicited requirement.
● Specify the number of offerors included in the competitive range if award is being made following discussions.
● Name the offerors.
● List non-cost evaluation factors and subfactors along with their relative importance to each other and to cost/price as stated in the solicitation.
● List each offeror’s overall factor ratings and subfactor ratings. Include cost/price in the listing.
● Provide a narrative comparing the non-cost rating of each proposal, to include the strengths and weaknesses (if not awarding using LPTA).
● Describe the business justification, and/or cost benefit analysis of the best value decision.
● Not contain conclusory statements, partial comparisons, or generalizations. For instance, use quantities or percentages, rather than the generalized “many.”
3.10.1. Preparing for the Debriefing. No debriefing shall be given to any offeror unless it has been scripted and reviewed by the designated review authorities prior to presentation.
3.10.2. Each debriefing shall be documented using a debriefing summary. As a minimum, the debriefing summary shall include:
● Topics covered (may refer back to debriefing slides or script), and
● A record of offeror questions and Government’s responses to the questions.
4.9 Rationale and justification. The rationale and justification for evaluation results and assignment of interim ratings will be completely and contemporaneously documented and included in the source selection file. This documentation includes the evaluation worksheets and summaries, and is in addition to information regarding the final evaluation results and ratings to be documented in the PAR. The PAR and award recommendation shall be included in the source selection file. Tailorable SSET evaluation templates including the Initial/Final Technical Evaluation Report, Rating Team Worksheet, Analysis Worksheet, Subfactor Summary, Evaluation Notice, Past Performance Report, Proposal Analysis Report, Comparative Analysis Report and Award Recommendation, Source Selection Decision Document, and Debriefing Certificate are provided as attachments.
4.23 Clearance documentation. Clearance documentation in addition to the determination to award without discussions or the final proposal revision request approval and award (if required) shall be included in the contract file.
5.23 Proposal Analysis Report (PAR). The PAR is the narrative report prepared by the SSEB that fully documents the results of the evaluation of each proposal.
5.24 Uncertainty. An uncertainty is a doubt regarding the accuracy or completeness of information in a proposal that may prevent a determination of whether an aspect of the proposal meets a material performance or capability requirement. It requires additional information from the offeror to further explain the proposal before the evaluator can complete the review and analysis, and should result in the issuance of an Evaluation Notice (EN).
Lowest Price Technically Acceptable Source Selection Process
(No DLA Text)
Debriefing Guide
(No DLA Text)
NON-DISCLOSURE AGREEMENT
Name: ________________________ Grade: ____________________
Job Title:______________________ Organization: ______________
Source Selection: _____________________________________________
Date: _______________________________________________________
1. I acknowledge I have been assigned to the source selection (or position) indicated above. I am aware that unauthorized disclosure of source selection or proprietary information could damage the integrity of this procurement and that the disclosure of such information to unauthorized persons could subject me to adverse action under applicable law and regulation.
2. I will not divulge, publish, or reveal by word, conduct, or any other means, such information or knowledge, except as necessary to do so in the performance of my official duties related to this source selection and in accordance with the laws of the United States or as specifically authorized by an authorized individual. I acknowledge that the information I receive will be given only to persons specifically granted access to the source selection information by an authorized individual or by DLAD 3.104-3(a), and may not be further divulged without specific prior written approval from an authorized individual.
4. If, at any time during the source selection process, my participation might result in a real, apparent, possible, or potential conflict of interest, I will immediately report the circumstances to the Source Selection Authority.
5. Check the applicable block:
☐ I have submitted a current OGE Form 450, Executive Branch Confidential Financial Disclosure Report, as required by DODD 5500.07, Standards of Conduct.
☐ I am not required to submit an OGE Form 450.
☐ I have submitted a current SF 278, Executive Branch Personnel Public Financial Disclosure Report, as required by DODD 5500.07, Standards of Conduct.
☐ I am not required to submit an SF 278.
or
☐ I am a non-Government employee. I have signed a proprietary information non-disclosure agreement that has been included in the contract between my firm and the Government that precludes me from divulging any protected information and proprietary data to which I may gain access during the evaluation of proposals.
Neither I, nor anyone in my immediate family, have any financial interest in any company involved in this acquisition as either a prime contractor or as a subcontractor.
Signature: ___________________________ Date: ____________
CONFLICT OF INTEREST STATEMENT
Source Selection ________________________________________ (Insert acquisition name and RFP #)
Please review the list of prime contractors and their subcontractors who are offering proposals in response to the request for proposal (RFP) for the acquisition identified above with the procuring contracting officer (PCO). After reviewing the list, check the appropriate boxes, fill in the information requested, and sign:
I certify that, to the best of my knowledge, neither I nor anyone in my immediate family possesses any financial interest in any company, parent, or subsidiary that provided a bid/offer/proposal on the acquisition identified above now being considered by the Source Selection Evaluation Board (SSEB) or Source Selection Advisory Council (SSAC) of which I am a member or advisor. Should any company in which I or my immediate family has a financial interest submit a bid/offer/proposal, I will immediately reveal such interest to the SSEB/SSAC Chairperson and the PCO. (Please note that a financial interest in a company described in this paragraph that is valued at less than $15,000 is not disqualifying and need not be reported on this form. The $15,000 threshold is aggregated across you, your spouse, and your minor children, if any.)
OR
I do possess a financial interest in a company that is proposing on or is involved in the acquisition identified above now being considered by the SSEB or SSAC of which I am a member or advisor. (If you have checked this box, provide a description of your financial interests on this form or as an attachment to the form.)
I further acknowledge my obligation to disclose any friendships; family or social relationships; past, present, or planned employment relationships; or any other type of relationship, such as housing or transportation arrangements, which might be perceived as compromising my independent judgment in connection with this Source Selection. (Disclose any such matters on this form or as an attachment to this form.)
Name (print): ______________________________________________________
Organization: _________________________________________________ Phone: ________________
Signature: ____________________________________________________ Date: _________________
SOURCE SELECTION PLAN TEMPLATE
A Source Selection Plan (SSP) is required for all best-value, negotiated, competitive acquisitions under FAR Part 15, regardless of dollar value of the acquisition or source selection approach utilized.
At a minimum, the SSP shall address the nine sections identified in the template (see DoD SSP paragraph 2.2 and DFARS PGI 215.303(b)(2)(C)); however, the template should otherwise be tailored to clearly represent the program or requirements and, when applicable, the particular phase of the action being addressed.
Note on using template: Information printed in parenthetical italics within the template should be deleted from the final SSP before printing/obtaining signatures.
Note: Include the legend “Source Selection Information -- See FAR 2.101 and 3.104.” at the top and bottom of each page of the document.
COVER PAGE
SOURCE SELECTION PLAN FOR
(insert Program or Requirement name)
__________________________
Procuring Contracting Officer
PRINTED NAME: __________________________
POSITION/TITLE: __________________________
OFFICE SYMBOL: __________________________
TELEPHONE: __________________________
DATE SIGNED: __________________________
REVIEWED: (only if SSAC used)
______________________________________
Source Selection Advisory Council Chairperson
Printed Name:__________________
Position/Title:________________
Office Symbol:_________________
Telephone:______________________
Email Address:______________________
Date Signed:____________________
RECOMMENDED FOR APPROVAL:
______________________________________
Source Selection Evaluation Board Chairperson
Printed Name:_____________________
Position/Title: ____________________
Office Symbol:_____________________
Telephone: ________________________
Email Address:___________________
Date Signed:_______________________
REVIEWED: ___________________________
General Counsel/ Legal Advisor
Printed Name:____________________
Position/Title:____________________
Office Symbol:_____________________
Telephone:_______________________
Email Address:___________________
Date Signed:_____________________
APPROVED: ___________________________
Source Selection Authority
Printed Name:______________________
Position/Title:_______________________
Office Symbol:______________________
Telephone:_________________________
Email Address:______________________
Date Signed: _____________________
TABLE OF CONTENTS
SECTION TITLE (Subsections may be added if required)
1.0 BACKGROUND AND OBJECTIVES
2.0 ACQUISITION STRATEGY
3.0 SOURCE SELECTION TEAM (SST)
4.0 COMMUNICATIONS
5.0 EVALUATION FACTORS AND SUBFACTORS
6.0 DOCUMENTATION
7.0 SCHEDULE OF EVENTS
8.0 NON-GOVERNMENT PERSONNEL
9.0 SECURING SOURCE SELECTION MATERIALS
SSP Attachment 1 – Source Selection Organization
SSP Attachment 2 – SSAC Membership (if used)
SSP Attachment 3 – SSEB Membership
SSP Attachment 4 – Relevant excerpts from Section L
SSP Attachment 5 – Relevant excerpts from Section M
1.0 BACKGROUND AND OBJECTIVES.
(Include a brief description of the requirement, a summary of the objectives, and any reference to applicable guidance.)
2.0 ACQUISITION STRATEGY.
(Provide a summary of the approved acquisition strategy, including the approval date. Reference the approved acquisition plan, acquisition strategy review panel (if required), and the market research report for a more detailed discussion.)
An acquisition strategy review panel was conducted on _______ (date).
The simplified acquisition plan or acquisition plan was approved on _______ (date).
3.0 SOURCE SELECTION TEAM (SST).
(Describe the organizational structure and identify the various roles and responsibilities of each element of the SST, such as the SSEB, the SSAC, the Procuring Contracting Officer [PCO], and the SSA, during the phases of the source selection. List members and advisors by name, position title, company affiliation, if applicable, or by functional area.)
The source selection organization chart is at SSP Attachment 1.
3.1. Source Selection Authority (SSA).
____________________ (Insert name, title, and office symbol) is the Source Selection Authority (SSA) for this acquisition. (If the SSA is other than the PCO, reference the appointment letter and date).
3.2. Source Selection Advisory Council (SSAC).
The SSAC will be chaired by: ______________. (Insert name of SSAC chair). SSP Attachment 2 lists the recommended members of the SSAC.
3.3 Source Selection Evaluation Board (SSEB).
The SSEB will be chaired by: _______________. (Insert name of SSEB chair.) SSP Attachment 3 lists the recommended Procuring Contracting Officer (PCO), team chairperson(s) and members of the SSEB and advisors (if any).
3.4 Source Selection Team Roles and Responsibilities.
Roles and responsibilities of the SSA, SSAC, and SSEB (and applicable Chairpersons, Members, and Advisors) are specified in FAR 15.303 and DoD Source Selection Procedures Section 1.4, as supplemented. (Insert any additional verbiage desired.)
4.0 COMMUNICATIONS.
(Describe the process and controls for communication with industry as well as internal Government team communication, to include the use of email during the source selection. Outline the security measures that will be utilized to ensure the information is protected as source selection information (see FAR 2.101 and FAR 3.104).)
The PCO shall serve as the sole focal point for all solicitation-related inquiries from actual or prospective offerors. Government personnel and support contractors (if applicable) will not engage in communications with industry concerning the source selection unless the PCO has authorized such communications.
Meetings among Government personnel listed in SSP Attachments 2 and 3 concerning the source selection will be held _______(describe anticipated frequency, i.e., regular basis, daily, or as-needed. Also describe the location and security measures). These meetings will be predominantly for proposal evaluation, documentation preparation, and decision-making.
(Discuss the types of communications that are authorized by the SSA for the source selection.)
During the source selection, exchanges with industry may include oral presentations by offerors (if applicable), clarifications, communications, and discussions as defined in FAR 15.306, as supplemented. All such exchanges with industry will be documented in the source selection file. Exchanges with industry may be _______________ (indicate written via encrypted e-mail/facsimile and/or U.S. Postal delivery, face-to-face meetings, or other means as applicable), and specified controls to preserve the integrity of the source selection process, as described herein, shall be adhered to. Controls may include _____________ (describe the types of controls to be used).
5.0 EVALUATION FACTORS AND SUBFACTORS.
5.1 Solicitation Provisions.
(Identify and describe the evaluation factors, subfactors, their relative order of importance; the importance of all non-cost factors to the cost or price factor; and the evaluation process, including specific procedures, evaluation worksheets, etc. to be used in evaluating and documenting the evaluation of proposals. Attach the most current and relevant portions of Sections L and M to preclude inconsistencies between the SSP and RFP. )
See SSP Attachment 4 “Excerpts from Section L - Instructions, Conditions, and Notices to Offerors,” and SSP Attachment 5 “Excerpts from Section M - Evaluation Factors for Award.” These documents describe the instructions for proposal preparation, the factors and subfactors and their relative importance, and the evaluation criteria.
(Note: Ensure these provisions in the solicitation are verbatim from Attachments 4 and 5. Recommend addressing in this paragraph the procedures involved when changes are made to Section M and/or L, if applicable, and whether or not the SSP should be amended. For example: “Changes to Section L by solicitation amendment do not require amendment of the SSP; however, changes in Section M require amendment of the SSP and approval by the SSA before a solicitation amendment may be issued.”)
5.2 Evaluation Process.
The SSET will strictly adhere to FAR 15.3, as supplemented by DFARS, DLAD 15.3, and DLAD PGI 15.3, and the evaluation process and criteria stated in SSP Attachment 5, Section M – Evaluation Factors for Award, during evaluation of proposals.
(Use the above statement, or tailor the sample language below to discuss your evaluation criteria and procedures.)
Technical/Technical Risk, Past Performance (tailor as appropriate for your source selection strategy), and Cost or Price are the factors that will be evaluated. Their evaluation will be fully supported by narrative findings. The narrative findings will identify strengths, deficiencies, weaknesses, and significant weaknesses associated with each Technical evaluation factor/subfactor, as applicable. The narrative summary for the Past Performance factor will describe the recency, relevancy, and quality of past work efforts.
(Note: Tailor 5.2.1 depending upon which Technical Rating Evaluation methodology your source selection is using, if applicable)
5.2.1 Combined Technical/Risk Rating.
A color and/or an adjectival rating will be assigned in accordance with DoD SSP paragraph 3.1.2.1, Table 1 to reflect the degree to which the offeror’s proposed approach meets or does not meet the minimum performance or capability requirements (factor or subfactors, when established) through an assessment of the strengths, weaknesses, deficiencies, and risks of the proposal.
5.2.1. Separate Technical/Risk Ratings.
The offeror’s technical proposal will be rated separately from the risk associated with its technical approach. A color and/or adjectival rating will be assigned in accordance with DoD SSP paragraph 3.1.2.2, Table 2 to reflect the quality of the offeror’s technical solution for meeting the minimum performance or capability requirements (factor or subfactors, when established) through an assessment of the strengths, weaknesses, and deficiencies of the proposal. The separate technical risk rating will be assigned in accordance with DoD SSP paragraph 3.1.2.2.2, Table 3 to assess the weaknesses associated with the offeror’s proposed approach as it relates to accomplishing the requirements of the solicitation. Evaluators will make an independent judgment of the probability of success, the impact of failure, and the acceptability of the offeror’s proposed risk mitigation solutions when assessing proposal risk.
5.2.1 Technical Ratings (LPTA).
The offeror’s proposal will be evaluated against the Government’s minimum requirements as stated in the solicitation to determine whether the proposal is acceptable or unacceptable, using the ratings and descriptions outlined in the DoD Source Selection Procedures, Table A-1.
5.2.1 Technical Ratings (Performance Price Trade Off (PPT)) (Technical Proposal Required).
The offeror’s proposal will be evaluated against the Government’s minimum technical requirements to determine whether the proposal is acceptable or unacceptable using the ratings and descriptions outlined in the DoD Source Selection Procedures Table A-1.
5.2.1 Technical Ratings (Performance Price Trade Off (PPT)) (Technical Proposal Not Required).
The offeror’s proposal will be evaluated as to whether it complies with the terms and conditions of the solicitation.
5.2.2 Past Performance.
(Modify for source selections in which the PCO will conduct the past performance evaluation.)
A past performance evaluation team within the SSEB will conduct a past performance evaluation that examines an offeror's recent, relevant past performance record to assess the government’s confidence in the offeror’s ability to perform as proposed. The past performance evaluation will consider the following relevant factors (list relevant factors, such as the number and severity of problems, the effectiveness of any corrective actions taken, and the offeror's overall performance record). This will be assessed at the Past Performance factor level using the Performance Confidence ratings and descriptions in DoD Source Selection Procedures, Tables 4 and 5. (Modify as required)
(If the past performance factor is instead to be rated on an acceptable or unacceptable basis, substitute the following for the sentence above)
Past Performance will be rated on an “acceptable” or “unacceptable” basis at the Past Performance factor level using the ratings in the DoD Source Selection Procedures, Table A-2.
Potential sources of performance data are Government sources such as _____________ and/or non-Government sources, such as __________.
(Include as appropriate for your acquisition. See DoD Source Selection Procedures paragraph 3.1.3.2 and paragraph A.3 for sources of past performance information.)
5.2.3 Cost or Price (Select appropriate).
(Normally, if the contract type is FFP, cost realism is not required. Address the basis for evaluating cost/price (i.e., whether it is based on acquisition cost, total evaluated price, most-probable cost, or most-probable life cycle cost). Per FAR 15.305(a)(1): "Normally, competition establishes price reasonableness. Therefore, when contracting on a firm-fixed-price or fixed-price with economic price adjustment basis, comparison of the proposed prices will usually satisfy the requirement to perform a price analysis, and a cost analysis need not be performed. In limited situations, a cost analysis (see 15.403-1(c)(1)(i)(B)) may be appropriate to establish reasonableness of the otherwise successful offeror's price.
When contracting on a cost-reimbursement basis, evaluations shall include a cost realism analysis to determine what the Government should realistically expect to pay for the proposed effort, the offeror's understanding of the work, and the offeror's ability to perform the contract. Cost realism analyses may also be used on fixed-price incentive contracts or, in exceptional cases, on other competitive fixed-price-type contracts [see 15.404-1(d)(3)].")
Price will be evaluated in accordance with the terms of the solicitation. Price will always be evaluated for reasonableness. (Tailor, as appropriate, if you will also be evaluating cost realism, most-probable cost, etc., by adding "In this procurement, price will also be evaluated for" followed by the specific matters that will be evaluated.)
(Carefully review and consider whether additional price evaluation criteria are necessary for your particular acquisition. For example, if range quantities are being utilized, this will need to be addressed.)
(Include only if appropriate)
Section M of the solicitation addresses whether reviews and/or plant visits will be conducted and how the visits will be used in the evaluation process. Commonly, reviews (e.g., on-site Software Capability Evaluation) and plant visits are conducted by the Program Manager, Lead Engineer, PCO, and Cost/Price Analyst. These visits are usually conducted during discussions with an offeror. The results of any reviews/visits should be briefed to the SSA during the final briefing as part of the Technical factor evaluation.
6.0 DOCUMENTATION.
(See DoD SSP paragraph 2.2.6 for minimum required documents.)
After the final evaluation of proposals against the factors and subfactors is completed and documented, the SSEB chairperson, in conjunction with the PCO, will prepare a Proposal Analysis Report (PAR) for the SSA’s analysis. The SSAC (change to SSEB if an SSAC is not used and the SSA has specifically requested that the SSEB perform the comparative analysis, with or without an award recommendation; the SSA’s signature on this SSP will document that request) will perform a comparative analysis of the proposals and provide a written report with the results of that analysis and an award recommendation to the SSA. The following documents will be prepared during the course of this source selection: (Identify the types of documents that will be prepared.)
7.0 SCHEDULE OF EVENTS.
The following schedule of significant events delineates the steps that will be accomplished during this source selection.
(List the major acquisition activities and projected completion dates. The sample table below contemplates discussions and must be specifically tailored as appropriate for your acquisition to ensure it is streamlined, yet achievable; supports proper and full compliance with source selection procedures; and meets overall program schedules. The list of events below is not exhaustive. If you are conducting a demonstration or in-plant review, include those dates. If you reserve the right to award without discussions, consider adding an additional column to indicate what the schedule would be if award is made without discussions.
The dates shown are planning estimates only, may not include preparatory time for specific events, additional time for pre-briefs, mandatory source selection team training, or activities such as peer reviews, integrated acquisition review boards, multi-functional independent review team [MIRT] reviews, or legal reviews.)
* These dates are based on the receipt of ________ (number) proposals.
(Note: if small business set-aside, insert "small business size challenge" after event 18, allowing 5 business days and renumbering remaining events.)
Sample Schedule.
Event | Date*
1. SSA Delegation Approved |
2. Business Clearance (Review and Approval) | 17 - 27 days for DLA HQs or DPAP ($1B)
3. SSA Approves Source Selection Plan |
4. Formal Solicitation Release |
5. Proposals Received | 30 - 60 days after Solicitation Release
6. Initial Evaluations Completed | Allow X to X days per proposal
7. SSEB Initial Evaluation/Competitive Range Briefing to SSA (if briefing requested) | X to X days after Initial Evaluations Complete
8. Release ENs | Within 1 or 2 days after Event #6
9. Receive Responses to ENs | Normally not more than 10 days after Event #7
10. Evaluate EN Responses | Normally not more than 2 days per proposal
11. Final Proposal Revision (FPR) Release Briefing to SSA (if briefing requested) | Within X days after Event #9
12. Issue FPR Requests | Within X days after Event #10
13. FPRs Received | 7 - 10 days after Event #11
14. Finalize SSEB Evaluation / Prepare Briefing to SSAC (if briefing requested) | Within X days after Event #12
15. Draft Proposal Analysis Report (PAR) | Work incrementally as information becomes available
16. Contract Clearance (Review and Approval) | Allow 1 to 2 days per proposal; dependent on the number of proposals and complexity of action
17. Finalize SSAC Analysis and Brief SSA (if briefing requested) | Within X-X days after Event #15
18. SSA Decision (See Note) | 3 to 5 days after Event #16
19. Finalize Source Selection Decision Document (SSDD) | 1 to 3 days after Event #17
20. Business Clearance (Review and Approval) | 17 - 27 days for DLA HQs or DPAP ($1B)
21. 1279 Report | 3 days, or more, prior to contract award
22. Contract Award | At least 3 to 5 days after Event #18
23. Debriefings (if requested) | See FAR 15.503, 15.505, and 15.506 for time frames after Event #19
8.0 NON-GOVERNMENT PERSONNEL.
(When considering the use of non-Government personnel to advise in source selections, program managers and their teams must be aware of and comply with the restrictions in FAR 9.505-4(b) and FAR 37.203(d). These sections provide the guidelines for use of non-Government personnel to support the source selection, the responsibility of the Head of the Contracting Activity (HCA) to make reasonable efforts to identify Government personnel who can support the source selection, and the requirement for the HCA to complete a determination that no Government personnel are available before using non-Government personnel and approving their use.)
Non-Government advisors will (or “will not”) be used. (If non-Government advisors are not applicable, delete the remaining part of this section.) Their expertise is required to support evaluation of _______________ (identify the functional disciplines, e.g., system engineering, integration, configuration management, data management, quality, software capability, supportability, or test and evaluation) relative to the acquisition.
Individual and Company names and company addresses of non-Government advisors are identified at Attachment #:
Authority to use non-Government personnel to assist in this source selection was granted by (Name/Title) on (Date).
8.2 Release of Proposal Information to Non-Government Advisors and Notification to Offerors.
The release of proposal information to non-Government advisors will be subject to the controls outlined in DoD Source Selection Procedures, paragraph 1.4.5.2. Offeror proprietary information will not be provided to non-Government advisors unless the solicitation notifies offerors that the non-Government advisor contractors/personnel may be given access to the proprietary information; the non-Government advisor contract provides for protection of proprietary information; and the contractor and its employees who will have access to proprietary data have signed and provided adequate nondisclosure agreements.
If the proprietary information includes information covered by DFARS 227.7103-7, the non-Government advisors must comply with that section before receiving the information. A provision will be included in the solicitation providing notice to prospective offerors that contractor personnel will be used and the manner in which they will be used, and providing the offeror an opportunity to object to the release of proposal information.
Non-Government advisors are prohibited from proposal rating, ranking, or recommending the selection of a source (they also must not have any financial interests with any of the offerors). Also, they are not normally permitted to participate in discussions. Non-Government advisors are not normally permitted to participate in Government decision making meetings such as SSAC (or SSEB) sessions, or SSA briefings, unless invited by the chairperson(s) to be present during a particular portion of the meeting when they may be called upon to provide specific technical information; their participation will be limited to providing the results of the application of their particular expertise and answering questions, and source selection information will not be disclosed to them except as necessary to allow them to provide their analysis. Use of non-Government personnel shall be in accordance with FAR 9.505-4(b), FAR 37.203(d).
8.4.1 Non-Government Technical Advisors: The PCO or SSEB chair will ensure that appropriate OCI clauses are included in the contracts under which non-Governmental technical advisors will provide support to this source selection. The OCI clauses require the companies and individual non-Government advisors to protect offeror proprietary data and Government source selection information and prohibit the companies from otherwise participating as an offeror, a subcontractor, or as a consultant to an offeror/subcontractor in relation to this acquisition.
8.4.2. OCI Issues not related to 8.4.1: OCI issues must be resolved, if possible, during discussions before FPR request, and all OCI matters must be resolved prior to award. Resolution may include determining that an offeror with an OCI is ineligible for award. Refer to FAR 9.506 for further guidance.
(Note: Contracts providing the SSET non-Government source selection advisor(s) should be reviewed by the PCO to ensure the appropriate OCI clauses are contained in the contracts prior to appointment.)
8.5 Notification to Offerors (If non-Government personnel will be utilized).
A provision shall be included in the solicitation to provide notice to prospective offerors that contractor personnel will be used and the manner in which they will be used, and to provide the offeror an opportunity to object to the release of proposal information.
If a competing offeror objects to the release of their proposal information to any non-Government advisor, the PCO shall make a determination whether the non-Government advisor(s) shall be permitted to participate in the source selection. The PCO shall inform the objecting offeror of the final determination. If the PCO determines that use of a non-Government advisor is necessary, the objecting offeror may either withdraw its objection or be determined ineligible for award.
9.0 SECURING SOURCE SELECTION MATERIALS.
(Detail the PCO’s plan for securing all source selection materials throughout the evaluation process.)
Source selection materials will be secured in accordance with the following plan: (Describe)
SSP Attachment - SSAC Membership (Use if SSAC is used)
Name Position/Title Organization
Chairperson
Member Office Symbol
Member Small Business
Member Financial Management (when applicable)
Member Engineering/Technical (when applicable)
Member Customer (when applicable)
Member Contracting
Member Logistics/Supply Planner (when applicable)
Advisor Legal
Advisor Source Selection Advisor (if applicable)
Advisor T&E (Test & Evaluation) (when applicable)
SSP Attachment – SSEB Membership
(Modify to document the source selection evaluation team if an SSEB is not used.)
Name Position/Title Organization
Chairperson
Procuring Contracting Officer (PCO)
Technical Evaluation Team
Subfactor Lead(s) Office Symbol
Member(s) and Advisor(s)
*Non-Government Advisor (if used) Company name (when applicable)
Contracts/Cost Evaluation Team (Cost/Price team may be separate from the Contracts team)
Contracting Officer
Buyer/Contract Specialist
Cost/Price Analyst
Financial Management (when applicable)
*Non-Government Advisor (if used) Company name (when applicable)
Past Performance Evaluation Team
Chairperson and Member(s)
*Advisors (when applicable)
*Advisor(s) Contracting, Financial Management, Legal, Customer,
Engineering/Technical, T&E (Test & Evaluation), Small Business,
Logistics/Supply Planner, Source Selection Advisor (if applicable)
*These individuals serve as key advisors to the SSEB and do not evaluate or rate proposals.
SSP Attachment
Excerpts from Section L - Instructions, Conditions, and Notices to Offerors
(Insert relevant excerpts that are or will be in Section L exactly as written in the solicitation)
SSP Attachment
Excerpts from Section M - Evaluation Factors for Award
(Insert relevant excerpts that are or will be in Section M exactly as written in the solicitation)
(Written by the Technical Evaluation Team. This is a template that should be tailored for your requirement.)
INITIAL/FINAL TECHNICAL EVALUATION REPORT
PROGRAM NAME, SOLICITATION #
1. This report serves to identify the initial/final (NOTE: if you are not opening discussions, there is just one report; if you open discussions, you will prepare an initial report, then a final report after discussions are concluded) technical evaluation results for the above referenced requirement. XX (insert number) proposals were received and were reviewed by the Technical Evaluation Team pursuant to the RFP provisions, “Instructions to Offerors” and “Evaluation Basis for Award.” Proposals were submitted by: (insert/list names). (Note: If using a trade-off and evaluating risk and assigning ratings, you MUST use the approved DoD Source Selection Procedures definitions and color and/or adjectival ratings.)
2. Offerors were instructed in the RFP provision, “Instructions to Offerors” to submit certain, specific technical information in Volume X (insert #) of their proposals. The Technical Team reviewed each proposal to ensure that the information was submitted as specified in the provision. The information required in the “Instructions to Offerors” provision of the RFP relative to the technical proposals is as follows: (pull in the exact language from your solicitation from Section L&M – what is shown below is a sample)
Volume II – Technical Proposal (Sample language)
a. Proposals shall be clear, concise, and include sufficient detail for effective evaluation. Offerors shall cross reference the requirements document, by paragraph number, to the corresponding proposal paragraph which addresses the referenced item. The proposal should not simply rephrase or restate the Government’s requirements, but rather shall provide convincing rationale to address how the offeror intends to meet these requirements. Offerors shall assume that the Government has no prior knowledge of their facilities and experience, and will base its evaluation on the information presented in the offeror’s proposal.
b. The following information shall be provided and will be evaluated to assess the proposed technical approach in accordance with Evaluation Basis for Award. The technical proposal shall:
(1) Describe the offeror’s capability and proposed approach to accomplishing the requirements set forth in (identify the documents provided with the solicitation).
(2) Identify any risks associated with the approach and actions to be taken to mitigate that risk.
3. The Technical Team then evaluated each proposal in accordance with the criteria set forth in the RFP Provision, “Evaluation Basis for Award.” Each technical proposal was evaluated on an individual basis to determine the technical acceptability or unacceptability of the technical proposal (if LPTA) or to determine the color/adjectival ratings depicting the offeror’s ability to meet the established Technical Requirements/ Capability factor/sub-factors and corresponding RFP requirements (if a Full Trade-off). The criteria were as follows: (pull in the exact language from your solicitation – what is shown below is merely a sample)
Technical Acceptability. Each offeror’s technical proposal was evaluated to determine if the offeror provided a sound, compliant approach meeting the requirements of (describe your requirements documents) and demonstrated a thorough knowledge and understanding of those requirements and their associated risks. Technical risk was a part of the criteria considered in the acceptability decision. Technical risk assessed the risk associated with the offeror’s proposed approach as it related to accomplishing the requirements of this solicitation. Offers were determined unacceptable, even though the minimum requirements were met, if the proposed approach posed too great a risk.
OFFEROR ABC ASSESSMENT: (Example --- Do this for each offeror.) State the offeror’s name, address, and size. If the solicitation contained a combination set-aside/unrestricted portion, clearly identify the portion of the proposal that the offeror is proposing. This section should first provide a description of what is offered in the proposal. In general, it’s probably preferable to use the adjectival rating approach rather than the color approach, because the colors still have to be spelled out as words in documents, and because the adjectival ratings are instantly understandable whereas the colors may require reference to the chart to see their relative “goodness”.
It should address in basic terms how the offeror approached the requirements as set forth in the “Instructions to Offerors” of the RFP for the technical factor or subfactors being evaluated. If a specific aspect of one offeror’s proposal is mentioned, it should be addressed for all offerors. Unique aspects of each proposal should be addressed. If an offeror failed to address any of the stated technical criteria, a statement to that effect should be made. (For example, “Evaluation Notice (EN) X-1 will be issued to request missing information regarding paragraph 3.5 of the SOW.”)
The narrative should provide sufficient detail to understand and support the basis for the overall technical acceptability/unacceptability determination for a LPTA source selection, or, for the assigned color and/or adjectival rating for the Technical factor or sub-factor for a Full Trade-off source selection.
For a Full Trade-off only, if the offeror is BLUE/Outstanding, cite the exceptional aspects of the proposal that merit a BLUE/Outstanding rating and the associated benefit to the Government.
If the offeror is determined to have a PURPLE/Good rating, briefly summarize how the proposal meets the requirements and indicates a thorough approach and understanding of the requirements.
If the offeror is GREEN/Acceptable, a statement such as, “the proposal meets requirements and indicates an adequate approach and understanding of the requirements”, may be made.
If the assigned rating is YELLOW/Marginal, briefly summarize which evaluation standards were not clearly met and why.
If a RED/Unacceptable rating is assigned, identify which minimum requirements were not met and address the scope of the proposal revision necessary to correct the problem.
If either a Yellow/Marginal or Red/Unacceptable rating is assigned, include a sentence regarding issuance of an EN: “EN X-2 will be issued to request the offeror to clearly identify how his proposed approach meets the requirement of paragraph 3.9 of the SOW”.
This narrative should explain the specific issues driving the rating (for a Full Trade-off) or the overall technical acceptability/unacceptability determination. If ENs are issued, state the purpose of each EN, address the response to each EN, and include a statement which indicates whether or not the EN has been closed or remains open.
Technical risks (also defined as weaknesses or deficiencies) associated with cost, schedule, and performance or technical aspects of the proposal must be explained. Risks may occur as a result of a particular technical approach, manufacturing plan, the selection of certain materials, processes, equipment, etc. The prime’s proposed subcontract arrangements may also impact technical risk. In a full trade-off, technical risk may be combined with the Technical evaluation or may be a separate evaluation factor (see DoD Source Selection Procedures, Methodology 1 and Methodology 2).
The narrative must explain what aspect of the proposal drives the risk, and must also explain how the risk will impact the schedule, cost, or performance. The evaluator must convey in the narrative the logic behind the thought process. The words must support the ratings that are assigned.
OFFEROR ABC EVALUATION NOTICE(S): List all of the ENs issued for this offeror along with a brief title or description of each EN. This allows you to visually see the number of ENs associated with this offeror. If you don’t list the ENs here, then list them in the narrative assessment above. A matrix with weaknesses, significant weaknesses, and deficiencies against the appropriate ENs may also be used. Include whether the offeror responded and, if it did, whether the response resolved the issue. List here, or attach the list and refer to the attachment.
OFFEROR ABC SUMMARY: Summarize the ratings that are assigned. If a LPTA source selection, summarize the results of your evaluation and state whether or not the offeror is determined to be technically acceptable, technically unacceptable, or reasonably susceptible of being made acceptable. If a full trade-off source selection, summarize the color ratings and technical risk ratings assigned to each sub-factor.
(Signature and Date)_______________
TECHNICAL WRITER’S NAME and TITLE
COORDINATION: (As applicable)
SSET CHAIRPERSON(S): (Signature and Date)______________
PCO: (Signature and Date)______________
Buyer/Contracting Officer (CO):
1- If “L” Instructions to Offerors identified page limits on the technical proposal, review each offeror’s technical proposal and remove any extra pages before providing it to the technical team.
2- Have members of the technical team sign the non-disclosure agreement.
Technical Team:
3- Read “L” Instructions to Offerors and “M” Evaluation Basis for Award. Keep these readily available.
4- Consider developing a matrix for the technical acceptability or Technical Capability (Full Trade-off) evaluation based on what you told offerors in “L” Instructions to Offerors” and “M” Evaluation Basis for Award. This will help to ensure you evaluate each factor/subfactor identified in the solicitation.
5- Evaluate each offeror’s technical proposal and document the results on the matrix or other such initial working papers. Prepare Evaluation Notices (ENs) for clarifications, weaknesses and deficiencies. (If awarding without discussions, ENs can only be issued for clarifications.)
6- If applicable, consider developing a matrix for technical risk associated with the offeror’s proposed technical approach. This should include both the offeror’s and the Government’s perceived risks. Evaluate each offeror’s technical risk (combined with technical or separately) based on the definitions identified in “Section M, Evaluation Basis for Award” and the DoD SSP. Prepare evaluation notices (ENs) associated with risk or other issues in the proposals. (If awarding without discussions, ENs can only be issued for clarifications.)
7- If discussions are proposed for use, prepare a preliminary technical evaluation report addressing each evaluation factor/subfactor, with supporting rationale.
a. Obtain KO review of technical evaluation report and all ENs.
b. Prepare briefing charts to provide the Source Selection Authority (SSA) with the preliminary evaluation results and obtain approval to open discussions/issue ENs.
c. Participate in Initial Evaluation/Competitive Range Briefing to SSA.
8- Provide ENs to the contracting officer (KO).
9- Evaluate responses to ENs. Repeat steps 8-9, if necessary, until clarifications/discussions are concluded.
10- If discussions were held and FPRs received and evaluated, prepare final technical evaluation report. Obtain KO review of technical evaluation report.
a. Prepare briefing charts for the final decision briefing to obtain the SSA’s award decision.
b. Participate in Final Decision Briefing to SSA.
11- Participate in debriefings, as necessary.
12- If a protest is received involving a technical aspect, the Technical team will work with the KO as necessary in the preparation of a response. (See #16 below.)
Other Responsibilities:
13- The Past Performance Evaluation Team (PPET) may request assistance from the Technical team regarding the review of present/past performance efforts submitted by an offeror. The Technical team would then review the technical aspects of the submitted effort(s) in order to facilitate the PPET determination of relevancy.
14- If authorized by the KO, the price analyst may request assistance from the Technical team regarding an offeror’s proposed price. The Technical team would then review the offeror’s proposed price relative to the offeror’s proposed technical approach in order to facilitate the price analyst’s determination of price reasonableness/balance and realism (if applicable).
Miscellaneous:
15- Your technical evaluation report should be concise and detailed with supporting rationale. The KO will excerpt language/detail from your report to include in the Source Selection Decision Document (SSDD) as well as in the response to any protest received involving a technical aspect of the source selection decision.
This template is provided as Informational Guidance. It is an example and, depending upon the activity, may require three separate worksheets (i.e., one for technical, one for past performance, and one for cost/price).
OFFEROR:
☐ INITIAL EVALUATION ☐ FINAL EVALUATION
TECHNICAL/TECHNICAL CAPABILITY
TECHNICAL RATING: ☐ BLUE (Outstanding) ☐ PURPLE (Good) ☐ GREEN (Acceptable) ☐ YELLOW (Marginal) ☐ RED (Unacceptable)
NARRATIVE: [Include strengths, uncertainties, and deficiencies (material failure to meet a Government requirement). Explain how the proposal exceeds or fails to meet the requirement. If it exceeds the requirement, explain how it benefits the Government.]
RISK RATING (IF EVALUATED SEPARATELY): Indicate a risk rating of low, moderate, high, or unacceptable for each subfactor, if used, and include weaknesses and deficiencies. Explain how each increase of risk contributes to the rating, considering the potential for disruption of schedule, increased cost, or degradation of performance: ☐ LOW ☐ MODERATE ☐ HIGH
PAST PERFORMANCE ASSESSMENT: ☐ SUBSTANTIAL CONFIDENCE ☐ SATISFACTORY CONFIDENCE ☐ LIMITED CONFIDENCE ☐ NO CONFIDENCE ☐ UNKNOWN CONFIDENCE
NARRATIVE:
PRICE / COST (Select appropriate)
TOTAL PRICE / COST: $ _______________
NARRATIVE:
EXCHANGES WITH OFFERORS
______________________________________          ______________________________________
SIGNATURE (Contracting Officer)                 SIGNATURE (Lead Evaluator)
This template is provided as Informational Guidance. It is an example.
Source Selection:
Evaluator: Offeror:
Factor: Subfactor:
TECHNICAL
Component of Performance or Capability Requirement:
What is Offered:
TECHNICAL RATING
How Proposal Exceeds, Meets, or Fails to Meet Performance or Capability Requirements:
Strengths: (Start narrative with "Strengths:")
Significant Weakness: (Start narrative with “Significant Weakness:”)
Weaknesses: (Start narrative with "Weaknesses:")
Deficiencies: (Start narrative with "Deficiencies:")
Uncertainties: (Areas requiring additional information)
RISK RATING (IF EVALUATED SEPARATELY FROM TECHNICAL CAPABILITY)
Significant Weakness: (Start narrative with “Significant Weakness:”)
Weaknesses: (Start narrative with "Weaknesses:")
Deficiencies*: (Start narrative with “Deficiencies:”)
Mitigation: (Start narrative with "Mitigation:")
Evaluation Notice Required?
* A deficiency could be a result of a significant weakness (or combination of weaknesses) that is very likely to cause unmitigated disruption of schedule, drastically increased cost or severely degraded performance.
This template is provided as Informational Guidance. It is an example.
___ Initial Summary ___ Pre-Final Proposal Revision Summary ___ Final Summary
Source Selection:
Author:
Offeror:
Factor:
Subfactor:
Proposal Description:
TECHNICAL
Technical Rating:
Strengths Details:
Strengths Summary:
Deficiencies Details:
Deficiencies Summary:
Uncertainties (Areas Requiring Additional Info):
RISK Rating: (if evaluated separately)
Deficiencies Detail*:
Deficiencies Summary*:
Mitigation Efforts/Weaknesses Details:
Mitigation Efforts/Weaknesses Summary:
Comments:
Reviewed by:
* A deficiency could be a result of a significant weakness (or combination of weaknesses) that is very likely to cause unmitigated disruption of schedule, drastically increased cost or severely degraded performance.
(Template provided as Informational Guidance. )
EVALUATION NOTICE (EN) # ______
PROGRAM/ACQUISITION NAME/SOLICITATION #
_____FAR 15.306(a) Clarification* Offeror_______________________
_____FAR 15.306(b) Communications*
_____FAR 15.306 (c)/(d) Discussions
_____Deficiency
* Government will not accept proposal revisions as a result of Clarification or Communication exchanges.
Request for Proposal Reference: (Specify Request for Proposal paragraph number, Section M and Section L reference, etc.)
Factor: __________________________
Subfactor: _________________________
Proposal Reference: (Specify offeror’s document, proposal volume, paragraph, and page number)
SUMMARY: [Description of issue in question and specific request for additional/supplemental information needed to clarify or correct the issue. Include references to the solicitation if necessary.]
OFFEROR RESPONSE: (Response page limit:____)
(NOTE: EVALUATOR ASSESSMENT OF OFFEROR RESPONSE: Address assessment in the appropriate evaluation report (Past Performance = PCAG Report, Mission Capability/Technical = Technical Report))
----------(Block off/remove the below information before mailing to the offeror.)----------
(For Clarifications/Communications when the SSA is someone other than the PCO, OR when the SSA has granted the PCO approval authority to release when awarding without discussions.)
PCO Approval: ________________________ Date: _________________
OR
(If the SSA is the PCO, OR if opening discussions and the SSA is someone other than the PCO)
SSA Approval: _________________________ Date: _________________
(Approval of ENs by the SSA can be gained by signature of each individual EN or via signature of a memo placed on top of all ENs)
PAST PERFORMANCE/PERFORMANCE CONFIDENCE ASSESSMENT REPORT
(written by the Past Performance Evaluation Team/Group)
(Program Name and Request for Proposal (RFP) Number)
[INCLUDE ON EACH PAGE: Source Selection Information – See FAR 2.101 and 3.104
For Official Use Only]
1.0. GENERAL.
1.1. BACKGROUND.
This requirement covers the (include summary description of the program and the deliverables).
1.2. EVALUATION METHODS.
The offerors were requested to submit, in the Past/Present Performance Volume of their proposal, information that they considered relevant in demonstrating their ability to perform the (describe efforts/functions of the current requirement that are being evaluated) required by the subject acquisition. The offerors were requested to provide data on (insert number of contracts) previous or present efforts performed within the last (insert the time frame constituting recent past performance identified in the RFP) by their firm as well as any critical subcontractor. A critical subcontractor is defined as a subcontractor who (insert from RFP).
The Past Performance team utilized the information provided by the offerors along with available Contractor Performance Assessment Reports (CPAR), information obtained from other performance databases, and, if applicable, information obtained from other sources in evaluating the performance of the contractors (provide the details). Also, questionnaires were sent to and interviews conducted with appropriate personnel regarding the selected contracts (also state if the questionnaire was included as an Attachment to the RFP). The responses received to the questionnaires and interviews were utilized in evaluating the past performance of each offeror and critical subcontractor, if applicable.
The Past Performance team performed an assessment of the relevancy of the data provided and obtained. Relevancy was defined in the RFP as: same/similar scope is (insert definition from the RFP), size/magnitude is (insert definition from the RFP), and complexity is (insert definition from the RFP).
The relevancy rating scheme from the DoD SSP was used:
● VERY RELEVANT: Present/past performance effort involved essentially the same scope and magnitude of effort and complexities this solicitation requires.
● RELEVANT: Present/past performance effort involved similar scope and magnitude of effort and complexities this solicitation requires.
● SOMEWHAT RELEVANT: Present/past performance effort involved some of the scope and magnitude of effort and complexities this solicitation requires.
● NOT RELEVANT: Present/past performance effort involved little or none of the scope and magnitude of effort and complexities this solicitation requires.
The Past Performance team review was based on the offeror’s past and present performance as it relates to the probability of successfully accomplishing the current proposed effort. The confidence assessment was accomplished in accordance with the DoD SSP. The Past Performance team’s evaluation and assessment of each offeror is addressed in Sections 2.1 through (fill-in will be dependent on the number of offers being evaluated) of this report.
The DoD SSP confidence assessment ratings and definitions were used, as follows:
Rating Definition
Substantial Confidence Based on the offeror’s recent/relevant performance record, the government has a high expectation that the offeror will successfully perform the required effort.
Satisfactory Confidence Based on the offeror’s recent/relevant performance record, the government has a reasonable expectation that the offeror will successfully perform the required effort.
Limited Confidence Based on the offeror’s recent/relevant performance record, the government has a low expectation that the offeror will successfully perform the required effort.
No Confidence Based on the offeror’s recent/relevant performance record, the government has no expectation that the offeror will be able to successfully perform the required effort.
Unknown Confidence No recent/relevant performance record is available or the offeror’s performance record is so sparse that no meaningful confidence assessment rating can be reasonably assigned.
1.3. CONFIDENCE ASSESSMENT RATING.
In accordance with DoD SSP, a confidence assessment rating has been assigned to each offeror, as detailed below.
2.0. CONFIDENCE ASSESSMENT ANALYSIS.
Past/Present Performance data was received from and evaluated on the following offerors: (List offerors and critical subcontractors; include, as applicable, division name(s), CAGE codes, and addresses, and indicate if any of them were eliminated from the competitive range.)
2.1. OFFEROR: (List the name and address of offeror and any critical subcontractors.)
2.1.1. Offeror’s Proposed Present and Past Performance Information.
(Offeror’s name) provided information on (insert number) contracts which they considered relevant to this effort. The Past Performance team determined that (insert number) of these contracts are very relevant, (insert number) of these contracts are relevant and (insert number) of these contracts are somewhat relevant to the current effort.
If a critical subcontractor is assessed, provide the following: (Offeror’s name) also provided information on (insert number) contracts for their subcontractor (Critical Subcontractor’s name) which they considered relevant to this effort. The Past Performance team determined that (insert number) of these contracts are very relevant, (insert number) of these contracts are relevant and (insert number) of these contracts are somewhat relevant to the current effort.
(Offeror’s name) is responsible for (discuss the role of the offeror for the current acquisition). (Critical Subcontractor’s name) as a critical subcontractor is responsible for (discuss the role of the subcontractor for the current acquisition).
Identification of contracts, rationale for determination of relevancy, and a synopsis of the data reviewed on each contract follows:
(For each Contract, include the following:)
Contract Number:
Prime Contractor Name: (Name prime contractor, if other than the offeror.)
Program Name and Description:
Period of Performance: (Start-Finish)
Dollar Value: (Current $/ Maximum $)
Relevance: Include the assigned relevancy rating and a narrative discussion of the significance and relevance of the contract. If the offeror was a subcontractor or teaming contractor on this contract, so state and define the effort that was performed and its relevance to the current effort. If this contract was performed by a proposed subcontractor to the prime contractor for the current effort, state and define the effort that was performed and its relevance to the current effort.
(Provide the following for each contract:)
Questionnaires Sent:
Responses Received:
Interviews Conducted:
Synopsis of Performance Information Received:
2.1.2. Compliance with FAR 52.219-8 and/or -9 (Utilization of Small Businesses/Subcontracting Plan). Address compliance with FAR 52.219-8 and/or -9 for each of the submitted efforts.
2.1.3. Contractor Performance Assessment Reports (CPAR). (If no relevant CPARS/PPIRS-RC data is available, simply state: No relevant CPAR data was available for this offeror. If CPAR/ PPIRS-RC data is available, provide description of CPAR and rationale for relevancy to current effort).
(Example: CPAR data was received on (insert number) contracts:
Contract Number:
Under this contract (offeror’s name) performed (provide detail). The effort performed under this contract is (insert relevancy rating) to the current requirement, in that (describe relevance). CPAR period covered is (date to date).
Insert a chart similar to the sample below providing the ratings for the specific contract, by contract number:
FACTOR/PERIOD           3/08-3/09   4/09-3/10   4/10-3/11
Technical Performance   Rating      Rating      Rating
Schedule Control        Rating      Rating      Rating
Quality                 Rating      Rating      Rating
Price                   Rating      Rating      Rating
(Repeat this same type information for each CPARS report received)
2.1.4. Other Sources of Information.
(Cite any other sources of past performance information that were used, such as the offeror’s required spreadsheet listing all efforts, Defense Contractor Review List, Preaward Surveys, DCMA input, other performance databases, etc. State the relevance of the information and give a synopsis of the information obtained. If critical subcontractors are applicable, repeat paragraphs 2.1.1 through 2.1.4 for each critical subcontractor.)
2.1.5. Evaluation Notices (ENs) Issued and Analysis.
(List all ENs that were issued to the offeror that are relevant to past performance, summarize the offeror’s response to them, and give a summary of the analysis and conclusions of the Past Performance team. ENs should be issued on any negative information.)
2.1.6. Strengths.
List strengths. (If critical subcontractors are involved, reflect both the prime and the critical subcontractor(s) strengths.)
2.1.7. Weaknesses.
List weaknesses. (If critical subcontractors are involved, reflect both the prime and the critical subcontractor(s) weaknesses.)
2.1.8. Assessment: Identify Confidence Level
Based on the evaluation performed by the Past Performance team, the Government has XXX confidence the offeror (or team as a whole) can perform the proposed effort. (Provide specific detail with supporting rationale for the assigned confidence assessment rating, at a summary level. If critical subcontractors are involved, make sure you have clearly identified the role both the prime and the critical subcontractor will play, and whether or not each has successfully demonstrated their role. Discuss any evaluation of the team in the aggregate.)
2.2. OFFEROR:
(Provide the same information as presented above for each offeror evaluated.)
SIGNATURES:
________________________
Past Performance Team Chair (If applicable)
________________________
PCO
PAST PERFORMANCE/PERFORMANCE CONFIDENCE
ASSESSMENT TEAM/GROUP (PCAG) Evaluation Hints
The following information is an overview of the past performance evaluation and is not intended to be a checklist that encompasses all situations/actions to be taken. Follow the DoD SSP and Section M of the solicitation.
1. Getting Started
● Re-read the RFP language, requirements documents, etc.
● Each PCAG member needs to have a copy of the relevant sections of the DoD SSP and Sections L/M. You must know and follow these.
● Each member should read all past performance proposal volumes. Working by consensus (team) can be difficult - but “group think” may be better than individuals doing the jobs separately.
* Splitting up the workload has its drawbacks: the Past Performance/PCAG Chair would have to reconcile different writing styles, different interpretations of feedback, etc.
2. Know your milestones/time constraints
● Set up internal milestones for the past performance evaluation, e.g. questionnaires returned, interviews complete, initial report complete, competitive range briefing, etc….
● Be sure to follow up at critical junctures so you have time to issue ENs if negative feedback/trends are reported.
3. Read the Offeror’s Past Performance Proposal Volume/Submission
● Make sure you have what was requested in the RFP. If the offeror failed to give you what was required, prepare an EN. If the data is missing entirely, consult with Legal concerning whether or not what is being requested constitutes opening discussions.
● When you read the feedback from the offeror’s customers, you may need to compare what the offeror claimed in its past performance volume against what the customer reported, and reconcile any discrepancies.
● Remember, the Government doesn’t expect perfection - but when an offeror had problems, we want to know they instituted corrective action to fix the root cause and how effective any corrective action was. You may need to consult with the government technical point of contact to verify. Be objective, fair, and consistent in evaluating here, so that we recognize improvements in the most recent contract performance when/if any.
4. Seek to obtain performance information from other sources
● Contact the cognizant contracting officer or request access to the Past Performance Information Retrieval System- Report Card (PPIRS-RC) if necessary to obtain access to the Contractor Performance Assessment Reports (CPAR). Ensure you include the name of cognizant contracting officer and a clear justification as to why you are requesting access to the PPIRS-RC.
* Remember, if you get a CPAR, you must evaluate /document where/extent it is relevant to your requirement.
* Also, if it has been seen/signed/commented on by the offeror already, you can use all information (positive and negative) without further need to clarify with the offeror.
● Look at the spreadsheet of all efforts the offeror was required to submit.
● Call DCMA and talk to the Administrative Contracting Officer, the Pre-Award Monitor, or the Industrial Specialist.
● Utilize other performance collection data bases when appropriate.
5. The definitions of “recent” and “relevant” in your RFP will be utilized to assign a relevancy rating for each contract about which information is received. This includes contracts that have PPIRS ratings or contracts being evaluated from the offeror’s spreadsheet, even if they did not submit a FACTS sheet for each contract.
● As you read/evaluate the offerors’ info/feedback, you will need to document, in every case, how/why/to what extent you rate each contract as either very relevant, relevant, somewhat relevant or not relevant. If you need help in determining relevancy, get it from the experts on your teams (technical, cost, contracts).
● If a contract is Not Relevant for any of the criteria, you need not gather any performance data. However, most contracts submitted for evaluation will have at least some thread of relevancy to the current contract.
6. Just getting in 2-4 questionnaires per contract is not the past performance team’s final objective. In most cases, you will have to follow up with the respondents for one reason or another.
● Some POCs won’t provide the details supporting their input provided on the questionnaire itself - but will offer invaluable details in an interview.
● Not all respondents are equal. Consider such things as:
* Some respondents are closer to the daily performance of the offeror than are others.
* It’s not necessary (or reasonable) for everyone giving feedback on Contract XXX to agree on every questionnaire/interview question - but you should see an overall consensus that supports one objective conclusion about that offeror’s performance.
● Interviews help round out the written word in the questionnaire, but CAUTION: customers tire QUICKLY if you call today with a question, next week with another question, etc. When you call these questionnaire/interview POCs, be thorough and tax their patience as little as possible.
7. Preparing your Past Performance Evaluation Team/PCAG report
● Always give the offeror an opportunity to respond to negative feedback or trend data via an EN. It gives them an opportunity to tell you about corrective actions taken.
● REMEMBER, the Past Performance Evaluation Team/PCAG gathers, interprets, and reports data - it does not compare offerors or make award decisions; the SSA does that.
8. Past Performance Team/PCAG Chair Responsibilities
● Must review everything on every offeror, all questionnaires, interviews, and all performance assessments written by the other Past Performance Evaluation Team/PCAG members, and reconcile any differences.
* Ensure the evaluation is consistent, complete and auditable, rational, fair and impartial, compliant with all RFP terms/conditions and the DoD SSP, and error free.
● Brief the Past Performance Team/PCAG portion at each juncture (competitive range and any other SSA briefing), and usually brief or support the debriefings.
● Keep the process on track; the chair is ultimately responsible and accountable to the Contracting Officer and Source Selection Authority.
PRESENT/PAST PERFORMANCE QUESTIONNAIRE
RFP Number
Please provide the following information about yourself:
Name/Signature and Role Relative to Contract (e.g. Buyer, Program Manager):__________________________________________________________________________
Agency or Business: _________________________________________________________________
Address: ___________________________________________________________________________
Telephone Number __________________ Fax Number _________________________
Return your completed questionnaire by email, fax, or mail. (PCO Note: tailor to match your Section L language.) See Section L of the solicitation for the appropriate email address, mailing address, or fax number. Notice: When completed, this questionnaire will be considered source selection sensitive information IAW FAR 3.104.
PART I. SPECIFIC PROGRAM INFORMATION.
Contractor/Business: _____________________________________________
CAGE Code: __________ Program Title and Brief Description: _________________________________________________________________________________________________________________________________________
Role in the Program/Work Performed As: _____Prime _____Subcontractor _____Vendor/Supplier
Contract Number: _________________ Number of Years: Basic:________ Option: _________
Contract Type(s): List all applicable contract types, i.e. Firm-Fixed-Price, Time & Materials, Cost, etc.__________________________________________________________________________
PART II. GENERAL PROGRAM INFORMATION.
Period of Performance:
1. Original Schedule (assuming all options exercised): Beginning Date _________ through ____________
2. Current Schedule (assuming all options exercised): Beginning Date _________ through ____________
3. Reason for difference (if applicable) ____________________________________________
(WHEN COMPLETED) SOURCE SELECTION INFORMATION – SEE FAR 2.101 AND 3.104. FOR OFFICIAL USE ONLY
Contract Dollar Value:
1. Original MAX Contract $ Value (assuming all options are exercised): _____________________
2. Current MAX $ Value (assuming all options exercised): _______________________________
3. Primary Causes of Changes: _______________________________________________________
Note: In Part III., should your response be other than “satisfactory,” please provide supporting documentation in “Additional Remarks” on page 4 of 5.
PART III. PERFORMANCE ASSESSMENTS. (Questions should be tailored to match your factors and subfactors.)
Please check the appropriate rating for each of the following questions:
N/A=Not Applicable U=Unsatisfactory M=Marginal S=Satisfactory V=Very Good E=Exceptional
| | N/A | U | M | S | V | E |
| 1. Extent to which the Company’s products and/or services met the specification/performance requirements: | | | | | | |
| 2. Configuration management (first product delivered same as last): | | | | | | |
| 3. Quality of completed product: | | | | | | |
| 4. Overall management of subcontracting efforts: | | | | | | |
| 5. Quality of technical manuals or commercial manuals: | | | | | | |
| 6. Customer’s satisfaction with warranty response times and corrective actions: | | | | | | |
| 7. Adequate number of dedicated resources for your program: | | | | | | |
| 8. Timely recognition and notification of administrative, engineering, and production problems affecting the program: | | | | | | |
| 9. Company performed independently without significant customer direction/oversight: | | | | | | |
| 10. Monitoring of program schedules and critical milestones: | | | | | | |
| 11. Completed work on time: | | | | | | |
| 12. Company demonstrated positive responsiveness to unscheduled requirements or contract changes: | | | | | | |
| 13. Ability to meet forecasted cost and perform within contract costs: | | | | | | |
| 14. Ability to meet delivery dates: | | | | | | |
| 15. Ability to meet or exceed small business and small disadvantaged business goals set forth in the approved subcontracting plan: | | | | | | |
16. Has action been initiated to cancel or terminate the contract for default? If yes, explain. _______________________________________________________________________________
17. Have there been any disputes/claims relative to the contract? If yes, explain.
_______________________________________________________________________________
18. Describe the contractor’s or company’s strong and/or weak points identified as a result of technical performance and any technical performance risk identified during the life of the contract. ________________________________________________________________________________
19. If given a choice, would you award to this contractor again? If not, please explain. ___________________________________________________________________________________
20. Do you feel you “got what you paid for”? Please explain. ______________________________________________________________________________
21. Additional Remarks.
_____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________
PRESENT/PAST PERFORMANCE QUESTIONNAIRE
ASSESSMENT RATING SYSTEM
(Note: Ratings and definitions are from the DoD CPARS Policy Guide and shall be used when questionnaires are used).
EXCEPTIONAL (E): Performance meets contractual requirements and exceeds many requirements to the Government’s benefit. The contractual performance being assessed was accomplished with few minor problems for which corrective actions taken by the contractor were highly effective.
VERY GOOD (V): Performance meets contractual requirements and exceeds some requirements to the Government’s benefit. The contractual performance being assessed was accomplished with some minor problems for which corrective actions taken by the contractor were effective.
SATISFACTORY (S): Performance meets contractual requirements. The contractual performance being assessed contains some minor problems for which corrective actions taken by the contractor appear or were satisfactory.
MARGINAL (M): Performance does not meet some contractual requirements. The contractual performance being assessed reflects a serious problem for which the contractor has not yet identified corrective actions or the contractor’s proposed actions appear/were only marginally effective or were not fully implemented.
UNSATISFACTORY (U): Performance does/did not meet most contractual requirements and recovery is/was not likely in a timely manner. The contractual performance being assessed contains serious problem(s) for which the contractor’s corrective actions appear or were ineffective.
NOT APPLICABLE (N/A): Did not apply to this acquisition; or, the questionnaire respondent has no knowledge of, and/or did not observe, the contractor’s performance in this area.
NOTICE TO QUESTIONNAIRE RESPONDENTS: Please do not include this page when returning the completed questionnaire.
PROPOSAL ANALYSIS REPORT (PAR) TEMPLATE
The following template is designed to assist Source Selection Evaluation Boards (SSEB) in preparing a Proposal Analysis Report (PAR) to document the evaluation results for the Source Selection Authority (SSA). You must tailor the content of your PAR to meet the requirements of your program/acquisition and the needs of the SSA. The amount of detail to include in each section of the PAR depends on several things, most notably the complexity of the proposals and the SSA’s preferences. The key point is to provide the SSA with sufficient information to compare offerors, make an award decision, and create an administrative record of the source selection. Discussing the SSA’s expectations ahead of time can avoid frustration and rework later.
The PAR includes the results of final discussions, Final Proposal Revisions (FPR), and other considerations from the SSEB. The SSEB should have the PAR substantially completed at, or prior to, the decision briefing (if a briefing is required by the SSA). The finalized PAR and the separate Comparative Analysis Report and Award Recommendation will be presented to the SSA prior to the SSA signing the Source Selection Decision Document.
(include on each page)
SOURCE SELECTION INFORMATION – SEE FAR 2.101 AND 3.104.
FOR OFFICIAL USE ONLY
PROPOSAL ANALYSIS REPORT (PAR) OUTLINE
1. INTRODUCTION
1.1 Discussion of Requirement
1.2 Source Selection Procedures
1.3 Milestone Schedule
1.4 Evaluation Criteria
1.5 Past Performance
1.6 Cost/Price
1.7 Offerors
2. DESCRIPTION OF PROPOSALS
2.1 (Insert Name of Offeror A)
2.1.1 Key Technical Features
2.1.2 Key Contract Features
2.2 (Insert Name of Offeror B)
2.2.1 Key Technical Features
2.2.2 Key Contract Features
Repeat as necessary for each offeror
3. EVALUATION RESULTS
3.1 (Insert Name of Offeror A)
3.1.1 TECHNICAL AND TECHNICAL RISK RATINGS FACTOR (Offeror A)
3.1.1.1 Subfactor 1 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.1.1.2 Subfactor 2 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.1.1.3 Subfactor 3 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.1.1.4 Subfactor 4 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.1.1.5 Subfactor 5 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.1.1.6 Subfactor 6 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.1.2 PAST PERFORMANCE FACTOR (Offeror A)
3.1.2.1 Data Gathered
3.1.2.2 Programs/Contracts Evaluated
3.1.2.3 Performance Confidence Assessment (Offeror A)
Strong Points, Positive Performance
Weak Points, Negative Performance
3.1.2.4 Summary
3.1.3 COST/PRICE (Offeror A)
3.1.3.1 Summary of Proposed and Evaluated Cost/Price
3.1.3.2 Other Government Costs (OGC)
3.1.3.3 Identified Risk
3.1.3.4 Summary
3.2 (Insert Name of Offeror B)
3.2.1 TECHNICAL AND TECHNICAL RISK RATINGS FACTOR (Offeror B)
3.2.1.1 Subfactor 1 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.2.1.2 Subfactor 2 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.2.1.3 Subfactor 3 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.2.1.4 Subfactor 4 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.2.1.5 Subfactor 5 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.2.1.6 Subfactor 6 – (Insert subfactor title)
Strengths:
Uncertainties:
Deficiencies:
Weaknesses:
Risk:
3.2.2 PAST PERFORMANCE FACTOR (Offeror B)
3.2.2.1 Data Gathered
3.2.2.2 Programs/Contracts Evaluated
3.2.2.3 Performance Confidence Assessment
Strong Points, Positive Performance
Weak Points, Negative Performance
3.2.2.4 Summary
3.2.3 COST/PRICE FACTOR (Offeror B)
3.2.3.1 Summary of Proposed and Evaluated Cost/Price
3.2.3.2 Other Government Costs (OGC)
3.2.3.3 Identified Risk
3.2.3.4 Summary
Repeat Evaluation Results for each additional offeror as necessary
4. COMPARATIVE ANALYSIS OF OFFERS
4.1 MISSION CAPABILITY (TECHNICAL AND RISK RATINGS) FACTOR
The following is a comparative analysis of the technical aspects of all offerors’ proposals.
4.1.1 Subfactor 1 – (Insert subfactor title)
4.1.2 Subfactor 2 – (Insert subfactor title)
4.1.3 Subfactor 3 – (Insert subfactor title)
4.1.4 Subfactor 4 – (Insert subfactor title)
4.1.5 Subfactor 5 – (Insert subfactor title)
4.1.6 Subfactor 6 – (Insert subfactor title)
4.1.7 Overall Comparison of Mission Capability Factor
4.2 PAST PERFORMANCE FACTOR
4.3 COST OR PRICE FACTOR
4.3.1 Summary of Proposed and Evaluated Cost/Price for Each Offeror
4.3.2 Summary of Other Government Costs (OGC) for Each Offeror
4.3.3 Summary of Identified Schedule Risk for Each Offeror
4.3.4 Summary
5. CONTRACTUAL CONSIDERATIONS
5.1 Results of Questions and Answers
5.2 Results of Evaluation Notice (EN) Clarifications and Communications
5.3 Results of the EN Discussions
5.4 Differences in Proposal Features
6. SIGNATURE PAGE
SSEB Chair
SSAC Chair (if applicable)
Attachment – List of Government Advisory Reports
(Insert Program/Acquisition Title)
PROPOSAL ANALYSIS REPORT (PAR)
1. INTRODUCTION.
1.1 Discussion of Requirement.
(Describe your requirement.)
(Describe any outside influences or time pressures that may have affected the source selection such as, procurement priority, funding limitations and so forth. Briefly summarize the results of any required market research.)
1.2 Source Selection Procedures.
The SSEB conducted this source selection in accordance with (IAW) Federal Acquisition Regulation (FAR) Part 15, as supplemented, the approved (insert Program/Acquisition Title) Source Selection Plan (SSP) (insert date), and the Request for Proposal (RFP) (insert RFP number), dated (insert date).
(Provide an overview of your evaluation process. Discuss significant events such as proposal receipt, any Briefings conducted, Evaluation Notices (ENs), discussions, Requests for Final Proposal Revisions, Final Proposal Receipt, etc.)
1.3 Milestone Schedule.
(Include the milestone schedule from the acquisition that has been part of the approved strategy. This should include the projection up to the point of award.)
1.4 Evaluation Criteria.
The SSA approved the basis for contract award and evaluation factors by approving the SSP. The same basis for contract award and evaluation factors was provided to offerors in the RFP. The factors and subfactors used to perform the evaluation were: (Must be tailored for your source selection [e.g., when using Methodology 1 versus Methodology 2 (see paragraphs 3.1.2.1 and 3.1.2.2 of the DoD Source Selection Procedures) or when evaluating factors/subfactors on an acceptable/unacceptable basis (see Appendix A.2 of the DoD Source Selection Procedures)]. List factors and subfactors in order of relative importance.)
Factor: Technical and Risk Ratings
Subfactor 1: (Insert Subfactor title)
Subfactor 2: (Insert Subfactor title)
Repeat as necessary for each additional Subfactor
Factor: Past Performance
Factor: Cost or Price
(Describe the relative importance of factors and subfactors. IAW FAR 15.304, include a statement about the relative importance of cost/price in comparison to all other evaluation factors, when combined, and be sure to discuss in the evaluation narrative the rating distinctions caused by the relative order of importance of the evaluation criteria)
1.5 Past Performance.
The Past Performance Evaluation Team assessed the Government’s confidence in the offeror’s ability to fulfill solicitation requirements while meeting schedule, budget, and performance quality constraints. The assessment focused on the offeror’s demonstrated performance in specific areas, relevancy and significance of the data, and recency of the data. The team based its assessment on a subjective evaluation of available present/past performance information. The team determined its level of confidence in each offeror’s ability to perform as proposed. The performance confidence assessment is at the factor level. (Include words describing the criteria the team used to determine relevancy and recency – this could be a copy of the definitions used in the RFP.)
The team used (identify the number) sources of past performance data for the risk assessment: 1) Past/Present Performance as provided by the offerors; 2) Contractor Performance Assessment Reports (CPARs); 3) Questionnaires sent to cognizant Government Program Managers, Administrative Contracting Officers, Procurement Contracting Officers, and contractors; and 4) (List any additional sources.)
1.6 Cost/Price.
The offeror’s final proposed costs or prices were evaluated for the following: (include price reasonableness and other considerations as applicable). (Describe the cost/price analysis techniques used and their appropriateness in determining price reasonableness. Provide an assessment of balanced pricing, if specified in the solicitation.) The contract type is (insert contract type).
(Address techniques used to evaluate cost or price such as Total Evaluated Price or Probable Cost, if required. Explain what was included in the Cost or Price Factor, such as “Production and delivery at target quantities, etc.)
1.7 Offerors
(Insert number of offerors) offerors submitted proposals in response to the (insert Program Title) RFP. (Insert number) were included in the competitive range for purposes of discussions.
The offerors and their subcontractors or teaming partners are: (Insert the name and location of each offeror and their subcontractors or teaming partners, and indicate whether each was included in the competitive range.)
2. DESCRIPTION OF PROPOSALS.
The following discussion of all proposals remaining in the competitive range is a summary of the overall proposal as a result of all evaluations from the initial through to the Final Proposal Revision (FPR) for each offeror. This information represents a summary of the salient aspects of each proposal. A complete and thorough analysis of each proposal is contained in the following attached reports:
● Technical Evaluation Team (TET) Report – Final, dated (insert date)
● Performance Confidence Assessment Group (PCAG) Report – Final, dated (insert date)
● Pricing Report – Final, dated (insert date)
The following summary chart provides a synopsis of the final ratings associated with each offeror:
(It is a good idea to include a summary chart depicting the final evaluation, by factor and offeror, allowing the SSAC and SSA to quickly compare the offerors’ proposals.)
(The purpose of this portion of the PAR is to provide a general description of each offeror’s proposal; it does not contain evaluation results. It documents that the SSEB has read and understood the proposals. Present the information by offeror, not by factor.)
2.1 (Insert Name of Offeror A)
2.1.1 Key Technical Features
(Describe the key technical, performance, management, etc. features of Offeror A’s proposal)
2.1.2 Key Proposal Features.
(Insert name of Offeror A) proposal includes the following:
(Use bullet statements to describe unique features of proposal from Offeror A, such as subcontractors or teaming arrangements, including any critical subcontractors whose support is relied upon for a successful effort, show the delivery schedule or period of performance including schedules for the basic and all options.)
2.2 (Insert Name of Offeror B)
2.2.1 Key Technical Features
2.2.2 Key Proposal Features.
(Repeat as necessary for each offeror.)
3. EVALUATION RESULTS
3.1 (Insert Name of Offeror A)
3.1.1 TECHNICAL AND RISK FACTOR
3.1.1.1 Subfactor 1 – (Insert subfactor title)
(Insert a description of what Offeror A proposes and summarize the SSEB evaluation of that proposal. Identify the adjectival and/or color rating. Identify any strengths or deficiencies that support the adjectival and/or color rating. Identify weaknesses and relate them to risk.)
Strengths:
(Use separate paragraphs to identify strengths, as defined in DoD Source Selection Procedures, Chapter 5)
Deficiencies:
(Use separate paragraphs to identify deficiencies, as defined in DoD Source Selection Procedures, Chapter 5.)
Weaknesses:
(Use separate paragraphs to identify weaknesses and significant weaknesses, as applicable, as defined in DoD Source Selection Procedures, Chapter 5.)
Risk:
(Insert Technical Risk Rating [LOW, MODERATE, or HIGH, as defined in DoD Source Selection Procedures, Table 3] and explain why.)
(Repeat as necessary for each additional Subfactor.)
3.1.1.2 Subfactor 2 – (Insert subfactor title)
Strengths:
Deficiencies:
Weaknesses:
Risk:
(Repeat as necessary for each additional Subfactor.)
3.1.2 PAST PERFORMANCE FACTOR (Offeror A)
3.1.2.1 Data Gathered.
(Insert Offeror A’s Name) proposed (insert quantity) relevant contracts. The Past Performance Evaluation Team located (insert quantity) additional contracts of which (insert quantity) were relevant contracts. A total of (insert quantity) contracts were deemed relevant and evaluated for (Insert Offeror A’s Name).
The prime offeror, major subcontractors, and their respective involvement, are listed in the tables below:
Prime: (Insert Name of Offeror A) (List that portion of the effort the prime performs)
Subcontractors: (Insert Name of Subcontractor) (List that portion of the effort this sub performs)
(Repeat for each major subcontractor)
3.1.2.2 Contracts Evaluated
The Past Performance Evaluation Team evaluated the following programs/contracts performed by (Insert Name of Offeror A): (List by title major programs/contracts performed and evaluated by the team.)
The team relied upon the sources of data described in paragraph 3.1.2.1, “Data Gathered,” above to assign performance confidence ratings for each offeror. (Describe the evaluation process consistent with the DoD SSP and Section M definitions.) In the event adverse data was reflected in a questionnaire, the SSEB, through the PCO, provided the contractor an opportunity to respond if the contractor had not previously been offered the opportunity.
The team used the following considerations in assigning the performance confidence ratings to each offeror (Note: tailor these considerations for your source selection): the offeror’s overall work record; the number and severity of problems; the effectiveness of any corrective actions; and programmatic considerations such as product similarity, complexity, contract type, and phase of the program. The offerors’ relevancy and consolidated confidence ratings, with their strong points, weak points, and supporting rationale, follow:
3.1.2.3 Performance Confidence Assessment -
(Insert Name of Offeror A) was assigned a performance confidence assessment of (insert Substantial Confidence, Satisfactory Confidence, Limited Confidence, No Confidence, or Unknown Confidence, as appropriate). (The discussion should identify the sources of past performance information, whether from PPIRS/CPARS or questionnaires. Note: avoid an “averaging fallacy”; the relevancy of a very relevant contract is not diminished by several somewhat relevant contracts. For example: the team analyzed a total of [insert quantity for each category] very relevant, relevant, somewhat relevant, or not relevant contracts. Of the [insert quantity by relevancy] contracts evaluated, CPARs existed on [insert quantity by relevancy] contracts. The CPARs reflected ratings ranging from [insert color or narrative performance rating] to [insert color or narrative performance rating]. Approximately [insert percentage] of the CPARs reflected ratings ranging from [insert color or narrative performance rating] to [insert color or narrative performance rating], and the remaining [insert percentage] reflected ratings of [insert color or narrative performance rating]. Include a similar summary of the review and assessment of the questionnaire responses received, whether written or telephonic, including efforts to ensure responses were received, for each category of very relevant, relevant, or somewhat relevant contracts.)
3.1.2.3.1 Strong Points, Positive Performance
The Past Performance Evaluation Team identified the following positive aspects of performance:
(Use bullet statements to list strong points.)
3.1.2.3.2 Weak Points, Negative Performance
The Past Performance Evaluation Team identified the following negative aspects of performance:
(Use bullet statements to list weak points.)
3.1.2.3.3 Summary
Based on the information identified above, the Past Performance Evaluation Team assigned the performance confidence assessment of (insert Substantial Confidence, Satisfactory Confidence, Limited Confidence, No Confidence, or Unknown Confidence, as appropriate). (Explain and discuss in one or two paragraphs the significant strong and weak points, causes, and corrective action taken by the contractor. Summarize how the team aggregated the data discussed in previous paragraphs to support the overall confidence assessment assigned.)
3.1.3 COST/PRICE FACTOR (Offeror A)
3.1.3.1 Summary of Proposed and Evaluated Cost/Price
Below is a summary of proposed and evaluated prices:
(Summarize proposed and evaluated costs/prices)
A more detailed breakout of the evaluated cost/price follows:
(Provide detailed breakout of the type of evaluation, as appropriate.)
3.1.3.2 Summary
The cost/price is (select as appropriate: fair and reasonable, realistic, etc., as required by the RFP; or not fair and reasonable, not realistic, etc.) based on (provide rationale).
3.1.4 COMMUNICATIONS/DISCUSSIONS (Offeror A)
3.1.4.1 Results of EN Clarifications and Communications
(Briefly summarize the number of Clarification and Communication ENs and the procedures used.)
3.1.4.2 Results of the EN Discussions
(Briefly summarize the number of Discussion ENs and the procedures used. Describe the discussion process used, the charts presented, and whether updated decision briefing charts showing the results of the discussions were provided to the offeror.)
3.1.4.3 Contract Features
(Briefly describe key contract features that were discussed and whether or not all critical issues have been resolved. Unique clauses or features should be addressed, and any exceptions to terms and conditions must be discussed. A suggested final closing paragraph follows:)
The final proposal received from (insert name of Offeror A) has no significant exceptions to the contract provisions. No waivers or deviations to standard FAR/DFARS/DLAD clauses were requested by (insert name of Offeror A). A contract with (insert name of Offeror A) is awardable and executable.
(Repeat paragraph 3.1 for each offer.)
This report represents the SSEB’s assessment of proposals for the (insert Program title). The evaluation was conducted at (insert site location) between (insert date) and (insert date). This document, together with the SSAC’s (or SSEB’s, as applicable) Comparative Analysis Report, and the decision briefing presented on (insert date if a decision briefing was/is to be held) are offered in support of the SSA’s source selection decision.
(Printed Name) Date: ____________________________
Title SSEB CHAIR
(Printed Name) Date: ____________________________
CONTRACTING OFFICER
APPROVED (if SSAC used):
(Printed Name) Date: ____________________________
Title SSAC Chairperson
Attachment A – List of Government Advisory Reports
(Advisory reports received including audit, field pricing, technical, or any other report. Include the respective report number and date if applicable. If informal assistance was used briefly state the scope and the recommendations given.)
Comparative Analysis Report and Award Recommendation Template
This template (tailored as appropriate) is provided to assist the Source Selection Advisory Council (SSAC) in preparation of the written comparative analysis of proposals and award recommendation to the Source Selection Authority (SSA). The template may also be used by the Source Selection Evaluation Board (SSEB) for acquisitions that do not utilize an SSAC under circumstances whereby the SSA has specifically requested (or the Source Selection Plan requires) that the SSEB provide a comparative analysis of proposals and an award recommendation.
Delete all instructional text (in italic font) and other inapplicable text prior to printing and signing the document. Some text is provided as example language; tailor it as required.
Ensure that each page is properly marked:
Source Selection Information – See FAR 2.101 and 3.104
For Official Use Only
(COMPARATIVE ANALYSIS REPORT TEMPLATE)
(Program/Acquisition Title)
COMPARATIVE ANALYSIS REPORT AND AWARD RECOMMENDATION
(Include a comparative analysis of all offers received that were included in the competitive range. If a competitive range determination was not made, this report must address all offerors. If any offerors were excluded from the competitive range, the rationale should be included. The analysis shall identify strengths, deficiencies, and weaknesses, as well as the resulting evaluation ratings. Include a discussion of the results of the past performance evaluation and the cost/price evaluation. When completed, this report should contain an overall, integrated assessment of technical and associated technical risk ratings, past performance, and cost or price. This report shall also provide the source selection recommendation and rationale. In the event that there is significant disagreement among the SSAC members regarding the recommendation, a minority opinion shall be documented and presented to the SSA as part of this comparative analysis. Reference the relative order of importance of the evaluation criteria, from the attached PAR, to establish that the evaluation team and the SSA applied the relative order of importance of the evaluation criteria when the comparative analysis was accomplished.)
1.0 INTRODUCTION SUMMARY.
1.1 DLA (insert organization) received (insert #) proposals in response to the (insert acquisition name and solicitation number), as listed in the table below.
| Offeror | Business Size | Proposed (e.g., Unrestricted/Set-Aside) | CAGE Code |
1.2 After consideration of the Source Selection Evaluation Board (SSEB) chair or Source Selection Advisory Council (SSAC) (if used) recommendations, the Source Selection Authority (SSA) approved on (insert date) the Contracting Officer’s determination that (insert #) offerors would be eliminated from further consideration and that (insert #) offerors would be retained in the competitive range (names marked with * in the table above). The offerors eliminated from further consideration were notified on (insert date) that they would not remain in the evaluation for award of this requirement.
1.3 The solicitation identified (insert #) factors for evaluation of proposals: insert appropriate factors and weighting identified in the solicitation, e.g.: Technical, Past Performance, Small Business Participation Plan and Price. The non-price factors are listed in descending order of importance with the first two non-price factors, when combined, being significantly more important than the last non-price factor. The non-price factors, when combined, are significantly more important than price. As non-price evaluation factors become more equal, the evaluated price becomes more important.
1.4 The following were utilized and considered (insert appropriate references here; for example: the Proposal Analysis Report (PAR) at Attachment (insert #) and the Source Selection Evaluation Board (SSEB) evaluation report(s) at Attachments (insert # and include dates of final reports)) in accordance with the evaluation criteria stated in RFP Section M.X.1 in performing the comparative assessment and award recommendations identified herein.
2. SEPARATE TECHNICAL AND TECHNICAL RISK RATINGS. (Used when DoD Source Selection Procedures, Methodology 2, Separate Technical/Risk Rating Process, paragraph 3.1.2.2, is followed.) (If there is a single Technical factor, it would be listed and no subfactors listed.)
The following is a comparative analysis of the technical aspects of all offerors’ proposals.
2.1 Factor/Subfactor 1 – (Insert subfactor title) (A table may be used to reflect ratings if desired).
(Name of offeror A) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 1. (Name of offeror A) was assigned (select one: ☐ LOW ☐ MODERATE ☐ HIGH) risk. (Summarize in a few sentences why the offeror received each of the ratings.)
(Name of offeror B) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 1. (Name of offeror B) was assigned (select one: ☐ LOW ☐ MODERATE ☐ HIGH) risk. (Summarize in a few sentences why the offeror received each of the ratings.)
The primary differences among offerors were: (Use separate paragraphs to compare and contrast offerors. Discuss how they met or exceeded requirements and how it will be advantageous to the Government during contract performance, and compare advantages and disadvantages between offerors). Due to the differences identified above, (insert name of offeror) had the strongest proposal for subfactor (Insert subfactor title).
2.2 Subfactor 2 – (Insert subfactor title.) (A table may be used to reflect ratings if desired.)
(Name of offeror A) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 2. (Name of offeror A) was assigned (select one: ☐ LOW ☐ MODERATE ☐ HIGH) risk. (Summarize in a few sentences why the offeror received each of the ratings.)
(Name of offeror B) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 2. (Name of offeror B) was assigned (select one: ☐ LOW ☐ MODERATE ☐ HIGH) risk. (Summarize in a few sentences why the offeror received each of the ratings.)
(Repeat as necessary for each subfactor and offeror)
The primary differences among offerors were: (Use separate paragraphs to compare and contrast offerors. Discuss how they met or exceeded requirements and how it will be advantageous to the Government during contract performance, and compare advantages and disadvantages between offerors). Due to the differences identified above, (insert name of offeror) had the strongest proposal for subfactor (Insert subfactor title).
2. COMBINED TECHNICAL AND TECHNICAL RISK RATINGS. (Used when DoD Source Selection Procedures, Methodology 1, Combined Technical/Risk Rating Process, paragraph 3.1.2.1. is followed.) The following is a comparative analysis of the technical aspects of all offerors’ proposals.
2.1 Subfactor 1 – (Insert subfactor title) (A table may be used to reflect ratings if desired).
(Name of offeror A) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 1. (Summarize in a few sentences why the offeror received each of the ratings.)
(Name of offeror B) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 1. (Summarize in a few sentences why the offeror received each of the ratings.)
2.2 Subfactor 2 – (Insert subfactor title) (A table may be used to reflect ratings if desired).
(Name of offeror A) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 2. (Summarize in a few sentences why the offeror received each of the ratings.)
(Name of offeror B) was rated (select one: ☐ BLUE/OUTSTANDING ☐ PURPLE/GOOD ☐ GREEN/ACCEPTABLE ☐ YELLOW/MARGINAL ☐ RED/UNACCEPTABLE) for subfactor 2. (Summarize in a few sentences why the offeror received each of the ratings.)
(Repeat as necessary for each subfactor and offeror)
The primary differences among offerors were: (Use separate paragraphs to compare and contrast offerors. Discuss how they met or exceeded requirements and how it will be advantageous to the Government during contract performance, and compare advantages and disadvantages between offerors).
Due to the differences identified above, (insert name of offeror) had the strongest proposal for subfactor (Insert subfactor title).
(Repeat as necessary for each subfactor)
2.3 Overall Comparison of Technical and Technical Risk Factors. (This section must discuss the relative order of importance at both the factor and subfactor level (if such a distinction is drawn at the subfactor level).) A summary of the proposal color/adjectival Technical ratings and Technical Risk ratings (if using Methodology 2), or of the Combined Technical and Risk ratings (if using Methodology 1), for Technical subfactors 1, X… is shown below.
OFFERORS A, B, C, etc.; FACTOR/SUBFACTORS 1, 2, etc.
Combined Technical/Risk Rating | Offeror A | Offeror B | Offeror C | Repeat as Required |
Subfactor 1 | Rating | Rating | Rating | |
Subfactor 2 | Rating | Rating | Rating | |
Subfactor 3 | Rating | Rating | Rating | |
Repeat as required | Rating | Rating | Rating | |
Use if combined Technical and Risk Rating methodology is used.
Separate Technical and Risk Ratings | Offeror A | Offeror B | Offeror C | Repeat as Required |
Subfactor 1 | Rating | Rating | Rating | |
Risk | Rating | Rating | Rating | |
Subfactor 2 | Rating | Rating | Rating | |
Risk | Rating | Rating | Rating | |
Subfactor 3 | Rating | Rating | Rating | |
Risk | Rating | Rating | Rating | |
Repeat as required | Rating | Rating | Rating | |
Risk | Rating | Rating | Rating | |
Use if separate Technical and Risk Ratings methodology is used.
3. PAST PERFORMANCE FACTOR.
(Summarize the Performance Confidence Assessment for each offeror.)
The following is a summary and comparative analysis of the Performance Confidence Assessment for each offeror.
(Insert offeror A name) was assessed as (select one: ☐ SUBSTANTIAL CONFIDENCE ☐ SATISFACTORY CONFIDENCE ☐ LIMITED CONFIDENCE ☐ NO CONFIDENCE ☐ UNKNOWN CONFIDENCE (Neutral)) because (summarize in a few sentences why the offeror received the rating, including the relevance rating).
(Insert offeror B name) was assessed as (select one: ☐ SUBSTANTIAL CONFIDENCE ☐ SATISFACTORY CONFIDENCE ☐ LIMITED CONFIDENCE ☐ NO CONFIDENCE ☐ UNKNOWN CONFIDENCE (Neutral)) because (summarize in a few sentences why the offeror received the rating, including the relevance rating).
(Repeat as necessary for each offeror or, if Past Performance was evaluated on an “Acceptable/Unacceptable” basis use the following example)
The following is a summary and comparative analysis of the Performance Confidence Assessment for each offeror.
(Insert offeror A name) was assessed as (select one: ☐ ACCEPTABLE ☐ UNACCEPTABLE) because (summarize in a few sentences why the offeror received the rating).
(Insert offeror B name) was assessed as (select one: ☐ ACCEPTABLE ☐ UNACCEPTABLE) because (summarize in a few sentences why the offeror received the rating).
(Repeat as necessary for each offeror)
4. COST OR PRICE FACTOR.
4.1 Summary of Proposed and Evaluated Cost/Price for Each Offeror
Below is a summary of proposed and evaluated costs/prices:
(Summarize proposed and evaluated costs/prices for each offeror)
4.2 Summary
Offeror (insert name of offeror) has the best cost or price because (state rationale). (Summarize the determination that the cost or price is fair and reasonable.)
5. DIFFERENCES IN CONTRACT FEATURES. (Briefly describe key contract features that were discussed and whether or not all critical contractual issues have been resolved. Unique clauses or features should be addressed and any exceptions to terms and conditions must be discussed. Suggest a final closing paragraph such as:)
The final proposals received from each offeror have no significant differences in the contract provisions. No waivers or deviations to standard FAR/DFARS/DLAD clauses were requested by any offeror. All contracts are awardable and executable.
6. SOURCE SELECTION RECOMMENDATION. (Provide a source selection recommendation for the SSA to consider. The recommendation is from the SSAC, if used, or the SSEB (if specifically requested by the SSA or required by the SSP). Rationale for the recommendation shall be based upon the evaluation criteria of the solicitation. In the event that there is significant disagreement among the SSAC (or SSEB) members regarding the recommendation, a minority opinion shall be provided with sufficient information for the SSA to fully consider the minority view. The source selection recommendation must also address the relative order of importance of the evaluation criteria in order to prove that the evaluation team understood and applied the relative order of importance of the evaluation criteria. Otherwise, the recommendation might be suspect and result in a challenge by an unsuccessful offeror alleging failure to follow the stated evaluation structure.) This report represents an integrated “best value” assessment of proposals for the (insert Program title). The evaluation was conducted at the (insert site location) between (insert date) and (insert date). This document, the attached Proposal Analysis Report, and the decision briefing presented on (insert date, if held) are offered in support of the SSA’s source selection decision.
_________________________________ Date: ____________________________
Printed Name Title SSAC Chairperson
Source Selection Decision Document (SSDD)
Template, Preparation Tips and Checklist
Current guidance for preparing an SSDD is in FAR 15.308, as supplemented (see DLAD PGI [add appropriate reference to the applicable section in Appendix C]). An SSDD shall be prepared for all FAR Part 15 source selections, regardless of dollar value or source selection approach utilized.
The Source Selection Authority (SSA) is responsible for documenting his/her independent, integrated, comparative assessment and decision; however, in most cases, the SSAC Chairperson (when an SSAC is used), the SSEB Chairperson, and the PCO will collaboratively prepare the SSDD for the SSA’s signature. Some SSAs may prefer to write the SSDD themselves using the reports and analyses prepared by others, particularly if the PCO is the SSA. In any case, assistance from legal counsel is highly recommended in drafting the SSDD.
Delete all instructional text (in italic font) and other non-applicable text prior to printing and signing the document. Some text is provided as example language; tailor that text as required.
(Ensure the following marking is on each page)
SOURCE SELECTION INFORMATION – SEE FAR 2.101 AND 3.104.
FOR OFFICIAL USE ONLY
SOURCE SELECTION DECISION DOCUMENT
(insert PROGRAM/ACQUISITION NAME), (insert RFP NUMBER)
(Note: Information/placeholders printed in parenthetical italics should not be left within the SSDD.)
1. DETERMINATION. This source selection was conducted in accordance with FAR 15.3, as supplemented, and the XXX Program/Acquisition Source Selection Plan dated xxx. As the Source Selection Authority (SSA), after extensive review of the documentation and in consultation with the Source Selection Advisory Council (SSAC) (if used) and Source Selection Evaluation Board (SSEB), I have determined that the proposal submitted by XXX offers the best overall value to satisfy the stated requirements for the XXX Program. My selection is based upon an integrated assessment of the proposals submitted by the following offerors: (List offerors included in competitive range at time of FPRs or all offerors if discussions were not held.)
2. DESCRIPTION OF REQUIREMENT. (Provide a brief description of the products/services being procured.)
3. EVALUATION PROCESS/METHOD. (List the factors, and subfactors, when established, consistent with RFP Section M and include their importance to one another and when combined, compared to cost or price; explain rating method(s) used - colors, adjectival, rankings.) The specific criteria against which the competing proposals were evaluated consisted of the following Factors and Subfactors (when subfactors are established).
4. PROPOSAL EVALUATION. (List the offerors included in the competitive range. List each offeror’s overall factor ratings and subfactor ratings. Discuss the non-cost Technical Factors (and subfactors, when used), Technical rating, and Technical Risk rating comparative analysis (Methodology 1 or 2, in accordance with the DoD Source Selection Procedures). Note: The analysis is not comparative for LPTA acquisitions because the LPTA process does not permit tradeoffs between price and non-price factors. Discuss the comparative analysis of the Performance Confidence Assessment at the factor level. The past performance evaluation rating, if used, is not comparative for LPTA acquisitions because the LPTA process does not permit tradeoffs between price and non-price factors. Discuss the cost/price factor--this is usually just termed “Price” on fixed-price and LPTA acquisitions.)
5. RECOMMENDATION FROM THE SSAC. (If an SSAC is used, or SSEB, if the SSA specifically requested a recommendation from the SSEB. Include rationale for any disagreement with the recommendation(s)/minority opinion(s).)
6. AWARD DECISION. (Discuss justification for the following summary statement. Describe the tradeoffs made between the factors/subfactors of each offeror's proposal within the context of the order of importance of the factors/subfactors described in Section M and the resulting price-technical tradeoff. Provide the rationale for tradeoff decisions IAW [reference applicable section in Appendix C]). In summary, based on my integrated assessment of all proposals in accordance with the evaluation criteria for the XXX Program, it is my decision that the proposal submitted by XXX represents the best overall value to the Government. I direct contract award to XXX.
_______________________ _______
Name Date
Title
Source Selection Authority
SSDD PREPARATION TIPS
Suggested DO’s
1. Re-read Section M of the RFP and then the evaluation documentation, including contents of any briefings that were conducted.
2. Write the SSDD for four audiences: the GAO, the U.S. Court of Federal Claims, the offerors, and the media. These audiences may never read it, but write the SSDD so that any one of them could understand it if they did.
3. State the evaluation factors, their subfactors (if any), and their relative importance and ensure consistency with Section M.
4. State the relative importance to cost or price of all other factors when combined. Ensure this is consistent with Section M.
5. For each Technical or subfactor rating, identify each offeror’s strengths and deficiencies within the proposal and then explain how the strengths and deficiencies resulted in the final rating, using the definitions of those ratings contained in the DoD Source Selection Procedures, Table 1, 2, and 3 as applicable.
6. For each Technical subfactor risk rating, identify the weaknesses and significant weaknesses of the approach, if any. (If weaknesses and/or significant weaknesses were resolved during discussions, there is no need to discuss them in the SSDD.) For each Technical risk rating, identify the weaknesses, significant weaknesses, and deficiencies of the approach, if any.
7. Adequately address the impact of past performance and its relative order of importance with respect to all of the evaluation criteria. Ensure this is consistent with Section M.
8. In teaming arrangements, such as mentor-protégé relationships, joint ventures, or subcontracting, consider a partner’s relevant past performance only to the degree of the role that partner will play in performance of the contract.
9. Discuss the cost or price evaluation. Explain how the price(s) were determined fair and reasonable, costs were determined realistic, or other price/cost analysis results reached, as applicable. Use comparative language about which offeror was X% more/less than the others.
10. Discuss those discriminators that make one offeror better than another in terms of benefit to the government. Be as detailed and focused upon discriminators as the source selection results allow. If something was not a discriminator then say so and also state why it was not.
11. In a tradeoff evaluation process, be sure to explicitly explain in the SSDD why a proposal with a higher evaluated cost or price and higher non-price rating was not worth the additional cost or price if the SSA is selecting a proposal with a lower evaluated cost and lower technical rating or past performance rating. Merely stating the conclusion that the higher rating does not justify the higher price is not sufficient – there must be a detailed analysis and rationale supporting the tradeoff discussing the benefits of the competing proposal compared to their price difference. Note that a tradeoff analysis is not required when award will be made to the lowest priced, highest rated offeror, but if another offeror is rated higher than the proposed lowest price awardee in one or more factors or subfactors, a tradeoff analysis should be included.
12. If award will be to a higher priced, higher rated proposal, explain in sufficient detail why the perceived benefits of a higher priced proposal offer the best value. Explain, with supporting evidence, what is worth the additional money.
13. Show the SSA’s thought process and reasons behind the comparative analysis. Include declaration of thinking/intent on the part of the SSA. For example: I selected; I thought; I determined; I reviewed; etc.
14. The summary must be complete and accurate. It must “track” with the contents of the Proposal Analysis Report (PAR)/Comparative Analysis Report (CAR) if the SSA agrees with the conclusions of the SSEB/SSAC. It is meant to very quickly put in words the best of the key discriminators used by the SSA to reach their decision.
15. Have source selection experts (e.g. Legal, Acquisition Support Team, etc.) review and provide assistance/advice on the SSDD.
Suggested DON’TS:
1. Cut and paste from another SSDD. All SSDDs are unique because all acquisitions and all RFPs are different. It is good to understand what a good SSDD looks like but it is inappropriate to “fill in the blanks” from a different procurement or sample.
2. Quantify by assigning numerical scores to the evaluation factor ratings. The rating definitions in the DoD Source Selection Procedures are qualitative and subjective, not numerical.
3. Focus the discussion on only one offeror. The SSDD compares assessments of the successful offeror against the others. Even if there are a large number of offerors, the SSDD must make an assessment of the relative standing of the offerors based upon an application of the relative order of importance of the evaluation criteria.
4. Confuse the Technical rating (color/adjectival) with the Technical Risk rating when using Methodology 2, Separate Technical/Risk Rating Process (DoD Procedures paragraph 3.1.2.2). While color/adjectival rating focuses on the strengths and deficiencies of the proposed approach submitted in response to the requirement (the quality of the offeror’s technical solution for meeting the Government’s requirements), the Technical Risk rating considers potential for disruption of schedule, increased costs, degradation of performance, the need for increased Government oversight, or the likelihood of unsuccessful performance.
5. Use color or adjectival ratings that are inconsistent with the terms used in the standard definitions for those terms in the DoD Source Selection Procedures. Consult with counsel if it appears that the standard rating definitions may not be appropriate for a particular source selection.
6. Identify or list weaknesses, significant weaknesses, or deficiencies without discussing them and their importance to the thought process.
7. Treat an unknown performance confidence assessment favorably or unfavorably. (Don’t disqualify an offeror for having an unknown confidence rating.) However, in a comparative assessment offerors who have more positive and recent/relevant past performance may be considered to offer greater benefit to the government than offerors with an unknown confidence rating. If past performance is used as an evaluation factor within the LPTA process, it shall be rated on an “acceptable” or “unacceptable” basis. In this context, “unknown” shall be considered “acceptable”.
SSDD PREPARATION CHECKLIST
1. Is the SSDD written as a stand-alone document without relying on references to other reports and analyses for information used to support the SSA’s decision? All necessary information supporting the decision should be contained within the SSDD.
2. Does the SSDD tell a complete story and is it clear and concise? Do paragraphs flow logically?
3. Are all of the factors and subfactors impacting the decision process identified in the SSDD and are they identical to the RFP? Are those subfactors that did not impact the decision acknowledged? (For example, “negligible differences among the offerors” or “this did not impact my decision.”)
4. Do the conclusions for each evaluation factor and award decision link directly to the evaluation factors in the RFP?
5. Does the SSDD compare offerors against each other (for example, “I have decided [Offeror ABC’s] approach to the subcontracting plan subfactor was better than [Offeror XYZ’s or all other offerors’] because [Offeror ABC] proposed/discussed/resolved/identified/possessed …”)?
6. If the situation is applicable, does the rationale explain why and how the additional benefits and advantages justify a best value award to someone other than the lowest priced offeror?
7. Does this explanation demonstrate a reasonable, certain, and non-arbitrary rationale?
8. If the SSA disagreed with the evaluation team recommendations, does the SSDD explain why?
9. Is the SSDD fully traceable to the evaluation briefing charts and Proposal Analysis Report (PAR) (if used)? There must be total consistency between the RFP, the evaluation, the PAR and the SSDD or a full explanation of any inconsistency.
10. Does the SSDD clearly state that the SSA followed the stated relative order of importance of the evaluation criteria, and was it interwoven in the comparative analysis discussion in the SSDD? There must be an up-front statement of the relative order of importance of the evaluation criteria in the SSDD.
11. Did legal counsel review the SSDD?
SSDD ADMINISTRATIVE PREPARATION TIPS
1. Font = Times New Roman; Size = 12 (suggested). One-inch margins all around. Justified margins or regular margins are fine, but do not mix.
2. Spell out acronyms at first use. This one is important to ensure a common understanding of the key items in the source selection. Do not assume that the reader knows what you are talking about technically or administratively. If necessary, put in a very short explanation of the term especially if it is a key discriminator.
3. Check your numbers, percentages, math, etc. at least twice yourself and then have a third party check them one more time. Attention to detail (getting the small stuff correct the first time) is important because it instills confidence in the quality of the workmanship.
4. Do not needlessly repeat items within the document except for the summary. Expect that readers can look forward or back for the referenced material.
5. Put the ratings in all capital letters (BLUE) to make them stand out. Put factors and subfactors (Technical) in first-letter capitals to make them stand out as well.
6. Add page breaks and use titles to set things off from each other. For example, it should be clear to the reader which factor or subfactor is being detailed.
7. Be consistent throughout the document in format and treatment of the offerors and discussions.
8. Go with the flow – recognize that each reviewer up the chain will recommend their own wording. Don’t let it frustrate you. Remember that ultimately the SSDD needs to say what the SSA thought/felt when he/she made the decision.
SOURCE SELECTION TEAM DEBRIEFING CERTIFICATE
The Source Selection Team Debriefing Certificate may be used to document and validate that appropriate Source Selection Team personnel have been debriefed, have returned all source selection material, and have been reminded of their nondisclosure responsibilities. The debriefing may be conducted by the SSAC Chairperson (if an SSAC was used), SSEB Chairperson, Past Performance Team Chairperson, Cost/Pricing Team Chairperson, or PCO, as appropriate. Copies of the certificates shall be placed in the source selection file. Recommend attaching the debriefing certificate to the NDA. This document may be combined with the NDA.
Source Selection Team Debriefing Certificate
I have been debriefed orally by __________ as to my obligation to protect all information to which I have had access during this source selection concerning solicitation number (enter solicitation number). I no longer have any material pertinent to this source selection in my possession except material that I have been authorized in writing to retain by the SSA. I will not discuss, communicate, transmit, or release any information orally, in writing, or by any other means to anyone after this date unless specifically authorized to do so by a duly authorized representative of the United States Government.
SIGNATURE: ________________________________DATE: _____________________