Contract: W91QUZ-06-D-0016

Contract Advantages

SAIC is your key source for leading-edge technologies that can carry our ITES-2S clients into the 21st century and beyond.

The SAIC ITES-2S team provides our customers with a deep, broad range of core competencies, experience, and skill sets, allowing us to deliver the innovative and cost-effective solutions you require.

The ITES-2S IDIQ Vehicle established by the U.S. Army Computer Hardware, Enterprise Software, and Solutions (CHESS) allows SAIC to provide the U.S. Army and any federal agency with a full range of information technology services and solutions to support agency enterprise infrastructure and infostructure goals.

SAIC's e-Kompass is a secure, web-based integrated data environment that empowers our Army and government clients to obtain necessary information on their task orders 24 hours a day.

We strive to meet your evolving needs—from the foot soldier to space.

About the Contract

The U.S. Army CHESS/Army Contracting Agency-Information Technology, E-Commerce, and Commercial Contracting Center (ITEC4) has awarded the SAIC team a prime contract for the ITES-2S vehicle. This IDIQ contract supports the Army enterprise infrastructure and infostructure goals with a full range of innovative, world-class information technology solutions at a reasonable price, consistent with Department of Defense (DoD) and Department of the Army policies on standardization and interoperability.

The ITES-2S performance-based IDIQ contract is a decentralized vehicle that any federal ordering official can utilize. There are currently 104 labor categories supporting 10 task areas, and most labor categories are further divided into three skill levels (associate, intermediate, and senior). See ITES-2S Task Areas & Job Descriptions (58k MS Excel file). The task areas are as follows:

  • Business process reengineering
  • Information systems security
  • Information assurance
  • Information technology services
  • Enterprise design, integration, and consolidation
  • Education and training
  • Program and project management
  • Systems operation and maintenance
  • Network management
  • Network support

This vehicle is very diverse in that it permits Cost Plus Fixed Fee, Cost Plus Award Fee, Firm Fixed Price, Time and Materials, and Fixed Price Award Fee task orders, allowing customers maximum flexibility in meeting their particular requirements. It has a total contract ceiling value of $20 billion and a nine-year contract duration (three-year base plus three two-year options). All security levels are allowed, from unclassified to Top Secret/SCI. Work can be performed worldwide.

About e-Kompass

e-Kompass is a web-enabled, database-centric system for management of task orders, Army resources, business processes, contracts, and delivery orders across the Army enterprise. It works with all contract types and with multiple instances of contracts.

e-Kompass' powerful engine, called COMeT—Common Operations and Management eBusiness Toolkit—is currently being utilized throughout the Department of Defense, federal government, and commercial industry. View a list of our clients.

SAIC's e-Kompass provides dedicated task order management support to the Program Executive Office Enterprise Information Systems (PEO EIS) and the U.S. Army CHESS. e-Kompass will guide our valued customers to their goals by providing:

  • Program/task-level management
  • Performance-based metrics
  • Task area descriptions
  • Guidelines and support
  • General information
  • Defined roles and responsibilities
  • Metrics for managing success

Clients

e-Kompass' powerful engine, COMeT, is currently utilized by the following organizations across the Department of Defense, federal government, and commercial industry:

  • Defense Information Systems Agency (DISA)
  • U.S. Army (OPTARSS)
  • U.S. Strategic Command
  • Military Sealift Command (MSC)
  • Naval Weapons Center (NAVSEA), Crane, Indiana
  • Space and Naval Warfare Systems Center (HQ) (SPAWAR)
  • ITESS Contract Management
  • Lear Siegler Services, Inc.


Procurement Process

PBSA Lessons Learned and Some Observations

In this section, SAIC shares several lessons learned, ideas, and other observations regarding the pre-award procurement/acquisition and post-award execution of PBSA task orders relevant to the ITES-2S contract. We leave the full tutorial and educational treatment on PBSA to other forums.

  • PBSA is the Preferred Method for Acquiring Services
    • The Federal Acquisition Regulation [FAR Section 37.102, Policy] states: “Performance-based contracting is the preferred method for acquiring services. When acquiring services, including those under supply contracts, agencies must…use performance-based contracting methods to the maximum extent practicable.” Additionally, the FAR prescribes policies, applicability (including excluded services), required elements, and the preferred contract type, firm-fixed-price, when using PBSA.
  • PBSA Benchmark for FY 05
    • Agencies should apply PBSA methods to 40 percent of eligible service actions over $25,000 (including contracts, task orders, modifications, and options) awarded in fiscal year (FY) 2005, as measured in dollars. See OFPP Memorandum, September 7, 2004 (173k PDF file*).

Establishing a task order that satisfies the spirit and the intent of PBSA and that meets the objectives of FAR 37.102 can be challenging. During the last decade as PBSA has been defined and refined, SAIC and our ITES-2S team members have had many experiences with contracts and task orders that were both good and bad examples of PBSA. From these we have learned many lessons.

Statement of Work

The statement of work (SOW) specifies the government's minimum requirements for services to be performed under the resulting task order. The SOW is one of three optional work statement methods—the others being the Performance Work Statement and the Statement of Objectives—that may be used in an ITES-2S Task Order Request.

The ITES-2S Ordering Guide (1.2M MS Word file) describes the SOW with a specified format and an annotated example provided in Attachment 2.

SOWs are the traditional and historically more common means for the government to define requirements for services similar to those within the scope of the ITES-2S contract. Normally, an SOW specifies a "best-effort" task that is ordered on a level-of-effort or time-and-materials basis. These methods do not easily support the principles of PBSA.

*Note: PDF documents are viewed using Acrobat® Reader®.

Performance Work Statement

The performance work statement (PWS) specifies the government's requirements for services to be performed under the resulting task order. The PWS is one of three optional work statement methods—the others being the statement of objectives and the statement of work—that may be used in an ITES-2S Task Order Request.

The ITES-2S Ordering Guide (1.2M MS Word file) describes the PWS in paragraph 4 of Chapter 3 with an annotated example provided in Attachment 3, including a section for a Quality Assurance Surveillance Plan.

The use of a PWS (with included Quality Assurance Surveillance Plan [QASP]) is the task order request method that most closely adheres to the principles of Performance-Based Services Acquisition. A well-crafted package accomplishes the following:

  • Requirements (or outcomes) define the work in measurable, mission-related terms.
  • Measurable performance standards are tied to requirements or outcomes.
  • A plan and methodology are included for measuring performance against standards.
  • Performance incentives promote contractor achievement of desired outcomes and/or performance objectives.

Key elements of the PWS that are essential to supporting the PBSA paradigm are discussed in the sections that follow.



Developing the PWS

Many variants on the Performance Work Statement format exist on the web and in various training courses. The ITES-2S Ordering Guide (1.2M MS Word file) provides a template for the PWS as used on this contract.

In contrast to a classic SOW, the PWS focuses on what the contractor is to accomplish, rather than how to do the work. This shift is key to establishing a PBSA task.

Use of Templates—"Re-use," a term from software engineering, is very applicable to developing a PWS. If possible, find a good, well-crafted PWS with strong performance-oriented characteristics from a similar task activity and use it as a starting point for writing your agency's requirements. This can expedite the PWS writing process.

Writing Style—Check all statements within the PWS to make sure that directions to the contractor are eliminated and that all "how" statements are removed. The emphasis is on capturing technical requirements in terms of outcomes desired, rather than details that often devolve into "direction," which should be avoided.

Roles and Responsibilities—The PWS should clearly state the roles and responsibilities of the government. This clarity matters because responsibility for service delivery generally belongs to the contractor unless otherwise stated.



Performance Requirements

FAR 2.101 defines "performance-based contracting" as structuring all aspects of an acquisition around the purpose of the work to be performed, with the contract requirements set forth in clear, specific, and objective terms with measurable outcomes, as opposed to either the manner by which the work is to be performed or broad and imprecise statements of work.

Performance requirements (required services) should focus on the outcomes or end results required rather than the “how” of accomplishing this. A good performance requirement statement directly addresses one or more problems (needs) of the Requiring Activity.

Performance Requirement Examples for Types of Services

ITES-2S Task Area | Performance Requirement(s)
Business Process Reengineering (BPR) | All key managers are interviewed and inputs integrated into the current system definitions.
Information Systems Security | Systems installed shall meet the specified security and vulnerability standards. System backup and disaster recovery plans comply with X.
Information Technology Services—Engineering Lifecycles | Super-user workstations must be replaced at two-year intervals. Software development organization shall be CMMI® Level 5 registered.
Program/Project Management | Deliver required reports. Assess customer satisfaction for each service. Provide timely notification of issues or other items requiring government action.
Strategic Enterprise IT Policy and Planning |
Systems Operation and Maintenance | Hardware and software that are declared surplus must be disposed of within 60 days. Call center must be available 24 hours per day, 7 days per week, 365 days per year. Provide preventive maintenance.
Incidental Construction | Comply with all health, safety, and environmental regulations and directives.



Performance Standards

A "performance standard" is a characteristic of a "performance requirement" (service or product) that indicates the quality of the outcome achieved and that can be measured. Each performance requirement should have a performance standard. For some services, more than one performance standard may be appropriate if they each individually measure a unique facet of the quality of service being delivered. Conversely, in some cases, one performance standard may apply to several services in an aggregate or composite manner.

Performance standards define the measures of success in delivering the performance requirement. Normally, a standard is expressed in terms of a metric that measures the service and a period of time over which the metric is measured and evaluated.

Metrics Selection—It is important that a performance indicator metric:

  • Is quantifiable/measurable in a numerical or countable manner.
  • Focuses on the critical services provided rather than ancillary items or intermediate results.
  • Stresses the performance of the contractor and not the specific methods or processes used.
  • Is relatively similar to historical values and experience.
  • Minimizes judgmental evaluations.
  • Represents how well the services are meeting the standard.
  • Is realistic—sometimes this may be a compromise.
  • Is unambiguous—each element of the performance standard should be well defined in the context of the task order and the services being ordered.

For example, the number of aircraft departures could be compared to the number of departures scheduled in a period of time to determine the number of cancelled or delayed flights. Then, a percentage (error rate) for delayed flights can be determined. The performance standard would be based on the percentage of cancelled or delayed flights in the given period.
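To make the arithmetic concrete, here is a minimal sketch (Python) of this calculation. The flight counts are illustrative values, not data from any actual task order.

    # Minimal sketch: computing an error rate for delayed or cancelled
    # flights over a measurement period. Numbers are illustrative.
    scheduled_departures = 400   # departures scheduled in the period
    actual_departures = 388      # departures that took place as scheduled

    missed = scheduled_departures - actual_departures
    error_rate = missed / scheduled_departures * 100
    print(f"Delayed/cancelled: {missed} of {scheduled_departures} "
          f"({error_rate:.1f}% error rate)")
    # A performance standard might then read: "No more than 3 percent of
    # scheduled departures delayed or cancelled per month."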

Defects and Exceptions—Surveillance of a performance standard is based on a measurement (count) of defects or exceptions against the metric used to express the standard. Usually, perfection would be zero (0) defects.

Metric Periods—Normally, a performance standard includes a measurement period (per day, per item, per month) for the term of the task order.

Qualitative and Subjective Metrics—It should be noted that most PBSA tasks will have some performance standards that are best evaluated in terms of "qualitative" or “subjective” measures. These often are a substitute for hard, quantifiable metrics when these metrics are not readily identifiable. If qualitative measures are included, they should be defined such that they result in quantifiable measurements or indicators.

Daily Management Metrics—Well-managed organizations rely heavily upon process metrics and similar data to accomplish daily management of their operations. Often these metrics are numerous, detailed, perhaps complicated, esoteric, and of value principally to those managing the processes. If they were not of value, an efficient organization would cancel the measurement immediately! This data is often routine, such as personnel attendance, security logs, help-desk trouble ticket categorization attributes, vendor payment timeliness, etc. These metrics play a key role in the contractor or organization effectively executing its processes, which ultimately produce the required services or products for the customer.

Undoubtedly there is a relationship between daily management metrics that measure a process and those that adequately measure satisfaction of specific requirements in accordance with a performance standard. It is important to recognize the difference between "common" daily management metrics and those that must play a role in measuring final accomplishment of requirements. A contractor's daily management metrics should be evident in its various plans (e.g., PMP, QAP) and in the processes it uses, especially in a CMM® Level 5 or ISO 9001:2000-registered organization.

The following table provides several additional examples of performance standards that are germane to typical required services within the scope of ITES-2S. These examples illustrate the concepts but may not be applicable to any specific customer’s requirement.

Examples of Performance Standards

Performance Requirement | Performance Standard | Comment
Provide continuous operations of mainframe systems and services | Availability 98 percent during the hours 0800-1600. One two-hour maintenance outage scheduled monthly. |
Comply with applicable site environmental, health, safety, and fire regulations | All incidents are reported. No major violations. Less than one minor violation per month. | Illustrates multiple standards for one requirement
Document all product shipments | In accordance with accepted management plan |
Assure that customers (call center operations) are satisfied with quality of service | No more than 5 percent of callers surveyed express dissatisfaction with service |
Note: CMM and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

Quality Assurance Surveillance Plan (QASP)

The Quality Assurance Surveillance Plan defines the government's expectations, how the performance requirements (products or services) will be monitored and evaluated, and the incentives applicable to the task order. The ITES-2S Ordering Guide (1.2M MS Word file) describes the QASP in paragraph 4 of Chapter 3, with an annotated example provided in Attachment 3.

The QASP is a government-owned document. It should be the government's plan for surveillance of contractor performance during the task order's life. The government (or a selected third party) executes the QASP. The contractor has significant roles in the surveillance process, but these are limited primarily to making measurements and reporting information in accordance with the QASP.

The QASP, as used in a PBSA Task Order, should not be confused with a Quality Assurance Plan, Quality Control Plan, Process Improvement Plan, or other plan. These are usually contractor-produced separate plans for internal use by the contractor.

In a PBSA task, the contractor is contractually responsible for quality assurance, and this responsibility is motivated through various incentives (or penalties). The government's focus is on assessing performance through the defined evaluation methods to assure that the government is getting what it is paying for.

Normally, the requiring agency develops the QASP and PWS for inclusion in the Task Order Request. Alternatively, the requesting agency may develop a statement of objectives and request, through the Task Order Request, that contractors propose a PWS and QASP (and other TO proposal elements).

Additional topics on the QASP are covered in the sections that follow.

Developing the QASP

The QASP should be developed in conjunction with the PWS. Following the ITES-2S Ordering Guide (1.2M MS Word file), four sections are inserted directly from the PWS, the title remains essentially the same, and three sections are developed to complete the QASP, as shown in the table below.

Relating PBSA Elements of PWS and QASP for ITES-2S Task Order Request

Performance Work Statement | Relationship | Quality Assurance Surveillance Plan
1. Project Title | Same | 1. Task Order Title
2. Background | | n/a
3. Scope | Inserted into | 4. Scope of Performance
4. Applicable Documents | | n/a
5. Performance Requirements | Inserted into | 2. Work Requirements
6. Performance Standard | Inserted into | 5. Performance Standards
n/a | Developed | 3. Primary Method of Surveillance
n/a | Developed | 6. Acceptable Quality Level (AQL)
n/a | Developed | 7. Evaluation Method
7. Incentives | Inserted into | 8. Incentives (Positive and/or Negative)
8. Deliverables and Delivery Schedule | | n/a
9. GFE/GFI | | n/a
10. Place of Performance | | n/a
11. Period of Performance | | n/a
12. Security | | n/a
13. QASP | | n/a
Note: Usually, the QASP is developed by the government. But in PBSA, and in the ITES-2S contract when the Task Order Request package includes an SOO, the contractor is requested to respond with a PWS, including a QASP.



Methods of Surveillance

A good surveillance method will have the following characteristics:

  • It is "transparent," meaning that all elements of the measurement, data collection, metrics determination, and reporting are understood, documented, and auditable.
  • Contractors provide all necessary measurements, tools, and reporting effort.
  • The government execution of surveillance is straightforward and does not duplicate contractor or government quality assurance efforts.
  • Surveillance efforts are scheduled or planned for specific time periods.
  • Results aid the government in determining courses of action with the contractor.
Examples of Surveillance Methods

Surveillance Method | Examples | Comments
100 Percent Inspections* | Government will review each CDRL for completeness within 7 days | Often costly, but sometimes necessary for health and safety
Random Sampling* | Random sampling of help-desk transaction log showing calls and closure information (monthly) | Statistically based method. Applicable to recurring tasks. Generally requires large numbers of items to measure.
Periodic Inspection* | Sample the first four trouble logs every M/W/F for analysis | Planned sampling at specific dates or time intervals
Customer Input* | Customer satisfaction survey. Survey end users on ease of use of newly produced documentation. | Depends on firsthand information from customers. Qualitative method used in lieu of quantitative methods. Supplements or complements quantitative standards.
Contractor Self-Reporting* | Contractor/subcontractor revenues by socio-economic categories to report compliance with Small Business Plan. Network availability metrics (failures, operating hours, etc.). | Can be the most efficient method, since it relies on the contractor's in-place infrastructure and systems to measure performance indicators and results. The keys to making this work are effective partnering in the task and transparency of the contractor's methods; the reporting must be auditable.
Trend Analysis | Conducted every quarter | Provides for more continual assessment of ongoing performance
Independent (third-party) audit or assessment | ISO 9001:2000 assessment | Relies on third-party methods and standards
* Surveillance method identified in the ITES-2S Ordering Guide (1.2M MS Word file).
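To make the random-sampling method concrete, the sketch below (Python) draws a monthly sample from a help-desk transaction log and counts tickets that missed a closure-time standard. The log layout, sample size, and 24-hour standard are illustrative assumptions, not values from the Ordering Guide.

    # Minimal sketch: random-sampling surveillance of a help-desk log.
    import random

    # (ticket_id, hours_to_closure) records for one month -- fabricated
    # here with an exponential distribution purely for illustration.
    log = [(i, random.expovariate(1 / 10)) for i in range(1, 1201)]

    SAMPLE_SIZE = 60          # tickets inspected this month (assumed)
    STANDARD_HOURS = 24       # standard: close each ticket within 24 hours

    sample = random.sample(log, SAMPLE_SIZE)
    defects = sum(1 for _, hours in sample if hours > STANDARD_HOURS)
    defect_rate = defects / SAMPLE_SIZE * 100
    print(f"{defects} of {SAMPLE_SIZE} sampled tickets missed the "
          f"{STANDARD_HOURS}-hour standard ({defect_rate:.1f}%)")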



Acceptable Quality Level (AQL)

The AQL states the level of variation that a contractor may have from a performance standard. MIL-STD-105D states that an AQL is "the maximum percent defective (or the maximum number of defects per hundred units) that, for purposes of sampling inspection, can be considered satisfactory as a process average." While this standard and its definition pre-date the current PBSA methods by many years, it still provides good general guidance for approaching development of AQL and related factors for service contracts—whether based on sampling, inspection, or other surveillance methods.

Acceptable Performance—For a service, the AQL expresses the maximum (worst case) deviation from the stated performance standard that a customer is willing to accept during a defined measurement period. Performance with fewer deviations than the AQL is desirable (in most cases) but not required. Performance with more deviations is not desirable, but not necessarily the basis for total rejection of the service.

For example, let's assume that a requiring agency specifies end-to-end e-mail delivery within 10 minutes of the end user sending the message, 95 percent of the time (5 percent error rate). Therefore:

  • E-mail delivery is the "performance requirement" (service).
  • 10 minutes (for each message) is the "performance standard."
  • Less than or equal to 5 percent (failures) is the "acceptable quality level."
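Applying the numbers above, a minimal sketch (Python) of how measured delivery times might be checked against this standard and AQL follows; the delivery times are fabricated for illustration.

    # Minimal sketch: evaluating the e-mail example's AQL.
    delivery_minutes = [2, 4, 12, 3, 7, 15, 1, 6, 5, 9, 8, 3, 2, 11, 4,
                        6, 7, 2, 3, 5]          # fabricated measurements

    STANDARD = 10          # minutes: the performance standard per message
    AQL = 0.05             # 5 percent allowable failure rate

    failures = sum(1 for t in delivery_minutes if t > STANDARD)
    failure_rate = failures / len(delivery_minutes)
    print(f"{failures}/{len(delivery_minutes)} messages late "
          f"({failure_rate:.1%}); AQL met: {failure_rate <= AQL}")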

Determining AQL—The initial values of AQL are determined after the required services and associated performance standards are determined. Specific quality levels can be based on historical or current performance, mission support requirements, direction from higher command, and other sources. Significant considerations:

  • Each required service should have one or more performance standards.
  • Each performance standard should have an acceptable quality level.
  • Determine the level of performance variation that the government will permit during execution of the task order.
  • Review each standard to make sure that delivery of the service at that standard meets the needs of the government.
  • Determine the methods and application of positive incentives if the contractor exceeds the performance standard (acceptable level) and this adds value or benefit to the requiring agency.
  • Determine the methods and application of negative incentives if the contractor does not meet the acceptable level.

Cost/Benefit Trade-off—Selecting an AQL for most performance standards represents a trade-off between allowable defects (or customer satisfaction) and the likely costs of service delivery.

[Figure: Notional curve of service cost versus defect level, with the AQL marked as the dividing point. For information on this chart, contact Program Manager Michael J. Kwak at 703-676-8836.]

The figure above illustrates a range of defects (horizontal axis) against a notional expression of service cost (vertical axis). As illustrated, the designated AQL allows for a level of defects against the standard. Performance with fewer defects (left of the AQL) would probably mean greater customer satisfaction and acceptance of the service. A number of defects in excess of the AQL (to the right) would cause lesser satisfaction, reduced value, and possibly rejection. A performance standard may be absolute, allowing for no variation, or it may have a range of acceptable (tolerable) variation. Variations will occur. An AQL of 0 percent (perfect performance against the standard) is very expensive to pursue and rarely, if ever, achieved.

Measuring Defects or Accomplishment—Most measurements are based on identifying defects or failures within a lot. Often, it is easiest to express "perfection" as zero defects and "less than perfection" as some recognized number of defects. Performance standards and AQLs may be based on this approach.

Alternatively, we often think in terms of accomplishment of a goal, therefore the goal is thought of in terms of 100 percent, with defects dropping the accomplishment level down. For example, we often think more in terms of "availability" of a network or other service rather than the "downtime" experienced.

Either way of expressing a performance standard is acceptable. It is important that, for a specific standard, AQL, and resulting incentive, the terminology and methodologies are documented consistently with the approach chosen.

Service Level Needs—Setting the AQL for a new task order that essentially continues a previously delivered service should be fairly straightforward. The AQL should remain relatively consistent with past experience (unless that service level has been unacceptable). The AQL should be more stringent (fewer defects) than historical levels only when changed requirements are clearly driving a change in service levels, and probably cost.

Sometimes, though, the AQL must be an "intelligent guess" because no historical basis is available. In other cases, a quantitative measure is not evident, and the standard may be judgmental or subjective, such as the appearance of a physical area.

The following table illustrates the relationship of "availability" to "outages." We often hear of quality at SIX 9s (99.9999 percent), a very difficult standard to achieve. When establishing metrics, one must consider the reality of the standard and AQL being set. On an annual basis, can the user live with an outage, or total of all outages, of more than 0.5256 minutes, TOTAL? When establishing percentages or other factors, work through the math to determine what that means on a daily, weekly, or monthly basis. And, is this a realistic standard to be buying?

Outage Per Year

Availability | Minutes | Hours | Days
90% | 52,560 | 876 | 36.5
99% | 5,256 | 87.6 | 3.65
99.9% | 525.6 | 8.76 | 0.365
99.99% | 52.56 | 0.876 | 0.0365
99.999% | 5.256 | 0.0876 | 0.00365
99.9999% | 0.5256 | 0.00876 | 0.000365
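The table values follow from simple arithmetic: the permitted outage is the unavailable fraction of the 525,600 minutes in a year. The short sketch below (Python) reproduces the table and can be adapted to work through any proposed availability standard.

    # Minimal sketch: converting an availability percentage into the total
    # outage it permits per year, reproducing the table above.
    MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600

    for availability in (90, 99, 99.9, 99.99, 99.999, 99.9999):
        outage_min = (1 - availability / 100) * MINUTES_PER_YEAR
        print(f"{availability:>9}%  {outage_min:>12.4f} min/yr  "
              f"{outage_min / 60:>10.5f} h/yr  "
              f"{outage_min / 1440:>10.6f} days/yr")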

In the case of a guess without a sound basis, it would be wise to establish a commitment for the requiring agency to negotiate revised performance standards and/or AQLs with the contractor after some period of stable operation.

[Figure: Range of quality levels on the performance evaluation curve. For information on this chart, contact Program Manager Michael J. Kwak at 703-676-8836.]

Range of Quality Levels
While the AQL is a key point in the evaluation of contractor performance, there are additional points on the evaluation curve that may be significant for evaluation, assessment, incentives, and customer satisfaction. These are illustrated in the chart above and summarized in the table below.

Quality Levels for Consideration in AQL and Incentives

Quality Level | Usage | Comment
"0" Zero Defect | No deviation from standard | Perfection (no defects). Usually more costly. Rarely achieved. Essential in areas of health and safety.
Exceeds Needs | Exceeds all needs and economic value. | No additional incentive planned.
Exceeds AQL | Range of service quality that is better than the AQL but does not exceed the customer needs. | Establishes a range above the AQL in which incentives are applicable to encourage performance at this level.
Acceptable Quality Level | Maximum allowable level of deviation from performance standard. | The only required level for PBSA.
Degraded Service | Service delivery at less than the AQL (greater number of defects) but still above a level that is deemed unacceptable. | Disincentive (negative incentive) applies.
Unacceptable | Service delivery below the tolerable range. | Rejection of the service.



Evaluation Methods

Example Evaluation Methods

Performance Standard | Evaluation Method | Comment
Tier 3 problem reports are tracked until closure. | Review list of Tier 3 problems for problem log time and closure information. Compare problem resolution logs from Tier 3 vendors to problem reports. |
No more than 5% of callers surveyed express dissatisfaction with service. | Review and validate customer surveys. Conduct independent surveys. |
Note: Sometimes two or more evaluation methods may be used to assess contractor performance.



Incentives (Positive and/or Negative)

Accountability in PBSA lies in how the measured metrics translate into incentives for the contractor. The structure of the incentive arrangement (positive or negative) varies widely. The key to whatever arrangement is made is to have quantifiable results, gathered in a mutually agreed-upon manner, that identify the earned incentive. Both the requiring agency and the contractor must be accountable for actions taken in PBSA. The more these actions can be quantified and placed into the PBSA metrics, the clearer the relationship will be.

AQL Link to Incentives—Incentives and/or disincentives should be linked in an unambiguous manner to the AQL and performance metrics to define what actions are taken when the contractor deviates (plus or minus) from the AQL.

The AQL often equates to a neutral incentive, meaning the price offered (and base fee in award fee plans) is not adjusted at this level. But the requiring agency may want to establish a means of giving incentives to contractors to exceed the AQL. Conversely, in situations where there are more than a few issues or problems with quality and the contractor does not satisfy some or all of the established performance standards, there should be a mechanism for application of negative incentives (also known as "disincentives").

The key question for those establishing incentives is how to translate the significance or criticality of a service into an incentive. Incentives should be directly linked to the value of the required service. This value may be based on price, performance, schedule or other factors. Incentives can be monetary (financial) or non-monetary (non-financial).

Monetary Incentives—As proposed, ITES-2S task orders should be based on a basic or bid level of incentive for the contractor. Monetary incentives are used to encourage a contractor to achieve a higher level of performance (better than the AQL) that the requiring agency desires. Conversely, should the contractor not achieve the AQL, then negative incentives, also known as service credits or penalties, may be levied.

There are several means of implementing monetary incentives under ITES-2S which may include the following:

  • Additional monetary incentives can be considered and incorporated through use of Fixed-Price Award Fee (positive).
  • Price may be reduced proportional to the level of degradation of service.

Cautionary note: As noted in the ITES-2S Ordering Guide (1.2M MS Word file), all maximum financial incentives should be included in initial committed task order funding to assure prompt payment when earned.

Non-Monetary Incentives—While monetary incentives are most desirable, these may not be practical in all cases. And, nothing precludes offering both monetary and non-monetary incentives for outstanding performance. Several representative non-monetary incentives include the following:

  • Award term—Extension of period of performance or automatic exercise of options
  • Reduced task order surveillance, e.g., less frequent measurements, reduced documentation requirements, etc.
  • Government will submit positive performance evaluations to external agencies, such as the NIH Contractor Performance System
  • Increased payment frequency—For example, SAIC records labor (a major component of most ITES-2S task orders) on a bi-weekly basis in conjunction with our payroll and labor distribution systems. Invoicing for labor on a bi-weekly basis has a significant positive impact on corporate cash flow if the government pays in a timely manner.
  • Revise task order schedules for future milestones or deliverables following contractor suggestions that might benefit the contractor in workload management, staffing, reducing conflicts in schedule, etc.
  • Reduced oversight and surveillance such as sample size and/or frequency.
  • Reduced reporting requirements, e.g., eliminate paper copy of all routine reporting and deliverables with email distribution accepted as the approved method.
  • ITES-2S Evaluation of Contractor's Task Order Performance with no less than a satisfactory rating on each item and an overall YES recommendation on item 7.
  • Letters of commendation and other recognition of contractor personnel and teams.
Examples of Incentives Linked to Performance Standards

Performance Standard | AQL | Incentive | Comments
Help desk available 24x7 | No more than 1% of calls are answered in more than 45 seconds. | 0.50% to 1.00%: no incentive. 0.00% to 0.49%: price increased 2% for period. 1.00% to 5.00%: price decreased 2% for period. |
Submit Final Test Report within 15 days of completion of XXX | On-schedule delivery. No deviation. | No deviation: no incentive. 1 deviation (late report): price decreased 1% for period. | Negative incentive (penalty) only
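As an illustration of how such bands might be applied mechanically, the sketch below (Python) maps a measured help-desk metric to the price adjustments in the first table row. The band boundaries in the source table touch at 1.00%, so the treatment of that exact value is an assumption; actual task order language would govern.

    # Minimal sketch: mapping a measured metric (percent of calls answered
    # in more than 45 seconds) to the incentive bands in the table above.
    def price_adjustment(pct_slow_calls: float) -> float:
        """Return the price adjustment for the period, as a percentage."""
        if pct_slow_calls < 0.50:        # 0.00% to 0.49%: exceeds the AQL
            return +2.0                  # price increased 2% for period
        if pct_slow_calls <= 1.00:       # 0.50% to 1.00%: meets the AQL
            return 0.0                   # no incentive
        if pct_slow_calls <= 5.00:       # above 1.00% to 5.00%: degraded
            return -2.0                  # price decreased 2% for period
        return float("nan")              # beyond 5%: treat as unacceptable

    for measured in (0.3, 0.8, 2.5):
        print(f"{measured:.2f}% slow calls -> "
              f"{price_adjustment(measured):+.1f}% price adjustment")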

Statement of Objectives

The Statement of Objectives (SOO), as the name implies, specifies the government's objectives to be achieved under the resulting task order. The SOO is one of three optional work statement methods—the others being the performance work statement and the statement of work—that may be used in an ITES-2S Task Order Request.

The ITES-2S Ordering Guide (1.2M MS Word file) describes the SOO in Chapter 3, paragraph 5, and with an annotated example provided in Attachment 4.

The SOO is an emerging approach in PBSA. It is being used more frequently, in part, because it requires competing bidders to develop a PWS as part of the proposal process. It effectively reverses the usual arrangement: the contractor proposes a PWS, including a QASP, rather than the government providing these as part of the Task Order Request. This approach is in line with commercial best practices, in which the agreement (contract) tends to focus on the outcomes or final products rather than the methods used.

A Task Order Request based on the SOO gives contractors significantly more flexibility in proposing creative and innovative products and services. Also, the contractor has significant freedom in the proposed performance standards, AQL, and incentives for the task order. The key here is "proposed." If selected to perform the task order, then the requiring agency negotiates task order measures, metrics, and incentives as documented in the PWS/QASP with the contractor.

Perhaps more so than with a PWS or SOW, this approach encourages partnership between the government and the contractor in the resulting task order.



Developing the SOO

An SOO should be easy to develop; it is typically less than 10 pages and often much shorter. The challenge, of course, is finding those perfect, succinct words that adequately summarize the objectives of the requiring agency that a contractor will satisfy under an ITES-2S task order.

Several suggestions for consideration while developing the SOO and in handling the subsequent acquisition: 

  • The SOO summarizes key outcomes expected from the services being ordered.
  • The SOO must guide the bidders with regard to all applicable constraints.
  • The SOO must, through the introduction or task order request documentation, make sure that all requiring agency and end-user expectations are clearly known by the bidders.
  • Consider a due diligence period; this might extend the period for proposal responses.
  • Review all objectives to make sure that they remain consistent with the stated scope or mission of the SOO.



Contractor-Provided PWS and QASP

The Task Order Award Process for ITES-2S includes a newer approach rooted in the PBSA objectives. The requiring agency can develop a statement of objectives as part of the task order request, and contractors are requested to submit a performance work statement with a draft quality assurance surveillance plan as part of their technical and cost proposals. This approach truly provides the flexibility that PBSA intends to give contractors.

Several key considerations in this approach include:

  • The proposed PWS and QASP elements dealing with measures, metrics, and incentives must be negotiated between the requiring agency and the contractor prior to task order award.
  • The QASP becomes a government document, and the government retains the responsibility for execution of the document.
  • The government must carefully consider and document all constraints and expectations as part of the task order request to help guide the bidders to respond with acceptable proposed solutions.
  • Consider a due diligence period during the proposal process to permit contractors a period of dialog with the requiring agency to clarify objectives, constraints, and needs.

Additional Topics

Mapping Required Services to Incentives

Incentives and/or disincentives should be linked in an unambiguous manner to the AQL and performance metrics to define what actions are taken when the contractor deviates (plus or minus) from the AQL. The PWS and QASP define all elements that form a direct linkage between the requiring agency's performance requirements and the application of incentives for the contractor. These are expressed in textual form in various sections of the PWS and QASP in accordance with the ITES-2S Ordering Guide (1.2M MS Word file). These linkages may not be as evident as they should be.

SAIC recommends development of a table to manage this data BEFORE the PWS and QASP are developed. This table, albeit a bit cumbersome for large requirements, is an excellent way of managing the critical data that underlies the PWS and QASP, making relationships evident from the beginning and throughout development of the task order requirements.

The requiring agency must consider all elements carefully to assure a sound PBSA strategy for the task order. Besides providing the data management discipline that supports assembling all parts of the task order request, this table also helps make the documents internally consistent and complete.
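One way to keep that linkage data consistent is to hold it in a simple structure and check it for completeness before drafting. The sketch below (Python) is a minimal illustration; the field names and example values are assumptions, not terms from the Ordering Guide.

    # Minimal sketch: managing requirement-to-incentive linkage data
    # before drafting the PWS and QASP.
    from dataclasses import dataclass

    @dataclass
    class PerformanceLine:
        requirement: str    # required service (PWS)
        standard: str       # performance standard (PWS/QASP)
        aql: str            # acceptable quality level (QASP)
        surveillance: str   # primary method of surveillance (QASP)
        incentive: str      # positive and/or negative incentive (QASP)

    lines = [
        PerformanceLine(
            requirement="Help desk available 24x7",
            standard="Calls answered within 45 seconds",
            aql="No more than 1% of calls exceed 45 seconds",
            surveillance="Contractor self-reporting from call logs",
            incentive="Price +/- 2% for period per measured band",
        ),
    ]

    # Completeness check: every requirement carries all linked elements.
    for line in lines:
        missing = [f for f in ("standard", "aql", "surveillance", "incentive")
                   if not getattr(line, f)]
        assert not missing, f"{line.requirement}: missing {missing}"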

Constraints

One of the objectives of PBSA is to provide greater flexibility for contractors in accomplishing the required services (outcomes) of a task order. But, the approaches, methodologies, processes, etc., employed by a contractor often must be constrained by the operating environment of the requiring agency and end users. These constraints are unavoidable, so they should be documented in the PWS.

Typical Constraints to be Considered

Category | Examples | Comments
Deliverables and Schedule | Site survey report. Monthly status report. | Document in the PWS the list of deliverables with applicable CDRL and DID identification. If services or products are needed by specific dates, that information should be documented.
Skills or Personnel Requirements | Field service personnel must have a current Top Secret clearance. | Generally avoid these except where required for compliance with directives.
Government-Furnished Information | Legacy system documentation. Critical design documentation (produced by incumbent). | Anything that the government knows it must provide should be listed. Anything that is to be provided in a limited or constrained manner should be listed and noted with limitations or constraints.
Government-Furnished Equipment | Server farm per attached configuration |
Critical Risks | Areas needing management attention, special skills, or limited resources. | Expressed as part of the skills requirements, GFI, GFE, or perhaps as part of the evaluation criteria.

Applicable Documentation—Directives may include technical orders, regulations, military specifications, government documents, programmatic documents, manuals, or industry documents. Following analysis, these are documented in the "Applicable Documents" section of the SOW or PWS, or in the "Operating or Programmatic Constraints" section of the SOO. The key here is that the requiring agency must determine the specific applicability of each item. Several suggested considerations:

  • Collect information on all agency directives, including those at the local (end-user) level.
  • Make sure that all identified items are completely cited, i.e., title, identification nomenclature, version or date, etc.
  • Avoid directives that specify "how" work is to be done or note that these sections of the directives are not mandatory for the contractor. (Often directives are written assuming that the government would be performing the work, therefore the procedural "how" aspects may not be applicable under a PBSA.)
  • Narrow any citation to the specific sections of a directive that are applicable to the contractor’s performance and exclude sections that are not applicable.

Workloads—Historical and Projected

Workloads, in a broad sense, are characteristics of a service. Typically the data is very significant in the sizing, design, or scope of a service (or product) being acquired. Workload data may include such things as numbers of items ordered (per day, per week, per month, etc.), average rates for performance, current defect rates, total numbers of lines of code to be maintained, etc. This represents a large universe of information that can be documented as applicable to the services being ordered.

For example, in call center operations the historical data is usually a good basis for forecasting routine call loads and as an indicator of the peak or surge requirements experienced over time.

Task order requests often represent a new order to continue (expand or change) services that are already being provided by the government or a contractor. The SOW/PWS/SOO will define the government's requirements going forward, but it often omits valuable detailed information regarding the actual workloads involved.

Workload data is often closer to the daily management metrics than to the metrics that ultimately form performance standards. As an organization transitions from classical services acquisition to PBSA, it rarely has historical data captured in a manner consistent with the new view of required services and outcomes.

Without this data, responders to the task order request must make assumptions about workload, both historical and projected. Providing historical (and projected) workload data gives bidders a better understanding of the magnitude of the requirement. Some might say that this data "levels the playing field," giving all bidders an equal opportunity to respond to a task order request; so much the better for the government, since more valid bids mean more competition and better value.

While a tenet of PBSA is shifting additional responsibility and risk to contractors, an unfortunate side effect is predictably higher pricing for the services offered in consideration of perceived risk. Where solid, germane workload data can be made available, bidders will be able to "sharpen the pencil," both in terms of price and quality offered.

Comprehensive and accurate workload data directly contributes to the quality of a PBSA solicitation: both the task order request and the task order proposals (and pricing) from contractors. Workload analysis can also be very useful in building a surveillance plan, especially with regard to sizing the surveillance effort against quality/cost tradeoffs.

Evolution of Performance Standards, AQL, and Incentives

Metrics, and all the processes that surround them, must be viewed as evolving: "sunsetting" or canceling those no longer needed, revising others, and creating new ones where needed. This is especially true for longer tasks. Typical drivers for this include:

  • Task changes, e.g., a project progresses through a development/sustaining lifecycle such that the services being performed are changing significantly.
  • Process improvements may make initial standards “too easy.”
  • Issues or external factors may make initial standards unrealistic.
  • Customer objectives change.

A process for mutual agreement and change should be put in place. The QASP is an excellent place to describe a process for this evolution. This affects the standards, metrics, AQL, and potentially the incentives (or disincentives) that are awarded based on the metrics. It is important to identify changes to metrics and incentive structures well enough in advance for the contractor to prepare to be evaluated against the new metrics.

In some circumstances, performance standards may not be readily definable at the time the Task Order Request and PWS/QASP are prepared. Potentially, the Task Order could specify a subtask to develop performance standards, metrics, and AQL for the task or for a follow-on to the task.

Learning More About PBSA


Caution: Searching the web for information on PBSA, samples of successful documents, examples of metrics, etc., is a good way to enhance your initial training in PBSA and related topics. But, you must always consider the sources of information and the audience to whom the information is presented.

And how correct or accurate is the information? The OFPP's "Guide to Best Practices for Performance-Based Service Contracting" (October 1998) has been rescinded yet many web sites still link to the document. It does contain good process information that can be of use today, but the document itself no longer reflects all current practices such as use of a Statement of Objectives.
