
May 12, 2020

Benchmarking as a Basis of Estimate?

Posted by Unison

When using predictive models, any cost estimating professional will tell you that it's best to back up your estimates with either Validation or Benchmarking.  Validation involves modeling projects against known actuals to prove the models can regularly predict a project's cost outcome within a certain degree of accuracy.  Usually this involves modeling many projects until there is a level of comfort that following the same process will continue to yield very accurate results.

Benchmarking involves calibrating specific project parameters against past historical cost outcomes, and then using those parameters to extrapolate forward-looking estimates. It is critical to examine a project's past performance to understand how an organization might operate in the future. As no two organizations are alike, analyzing and using past project data is the key to data-driven estimating.
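To make the validation step concrete, here is a minimal Python sketch of the accuracy check described above; the project names, costs, and tolerance are hypothetical illustration data, not TruePlanning output.

```python
# Compare modeled costs to known actuals across completed projects
# and flag each one that lands outside an accuracy tolerance.

TOLERANCE = 0.10  # accept predictions within +/-10% of actuals (assumed)

projects = [
    # (project, modeled cost, actual cost) -- hypothetical data
    ("Avionics Unit A", 1_080_000, 1_000_000),
    ("Avionics Unit B", 2_350_000, 2_500_000),
    ("Chassis Rev 2",     640_000,   600_000),
]

for name, modeled, actual in projects:
    error = (modeled - actual) / actual
    status = "PASS" if abs(error) <= TOLERANCE else "FAIL"
    print(f"{name}: {error:+.1%} {status}")
```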

In TruePlanning® Hardware projects, we usually think about benchmarking the key cost drivers known as Manufacturing Complexity for Structure or Electronics (MCPLXS/E), but just about any input can be tuned as part of a benchmarking process.  In a single Hardware Component, one could benchmark MCPLXS/E to a known Production Cost, and then tune Engineering Complexity to correct Development activities.  Within specific activities like Production Manufacturing, one could modify the Material Index to adjust the balance between labor and materials, and at the lowest level, one could modify multipliers for every activity and every resource in the Worksheet Set to make every cost object match actuals, hour for hour and dollar for dollar.
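As a rough illustration of benchmarking a single input to a known cost, here is the general pattern: solve for the complexity value that makes a cost model reproduce an actual. TruePlanning's internal equations are proprietary, so the `modeled_cost` function below is a hypothetical stand-in; only the calibration loop is the point.

```python
# Single-parameter calibration sketch: bisect on a complexity value
# until a (stand-in) cost model matches a known actual cost.

def modeled_cost(complexity: float, weight_lbs: float) -> float:
    """Hypothetical power-law production-cost model (not TruePlanning's)."""
    return 850.0 * weight_lbs * complexity ** 1.2

def calibrate(actual_cost: float, weight_lbs: float,
              lo: float = 0.1, hi: float = 20.0, tol: float = 1e-6) -> float:
    """Bisect on complexity until the model reproduces the actual cost."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if modeled_cost(mid, weight_lbs) < actual_cost:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Example: back out the complexity implied by a $1.2M actual on a 150 lb unit.
cplx = calibrate(actual_cost=1_200_000, weight_lbs=150)
print(f"calibrated complexity: {cplx:.3f}")
```

Bisection works here because the stand-in model is monotonic in complexity; any standard root-finder would serve the same purpose.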

Now, you would need very detailed actuals in order to benchmark a project at this level, and the actuals would need to be organized in the same manner as the estimates are developed for the comparison to make any sense.  And benchmarking every cost object at such a granular level would require all of the cost data to be well organized, and probably a great degree of automation.
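A small sketch of that organization problem, assuming hypothetical WBS codes and charge records: roll raw actuals up to the same cost objects the estimate uses, so the two can be compared line for line.

```python
# Roll raw charge records up to the cost objects used in the estimate.

from collections import defaultdict

# Raw charge records: (WBS code, dollars) -- hypothetical data
actuals = [
    ("1.1.1", 42_000), ("1.1.1", 13_500),
    ("1.1.2", 88_000),
    ("1.2.1", 27_250), ("1.2.1", 4_800),
]

# Map WBS codes to the cost objects used in the model (assumed mapping).
wbs_to_cost_object = {
    "1.1.1": "Chassis Structure",
    "1.1.2": "Power Supply Electronics",
    "1.2.1": "Integration & Test",
}

rollup = defaultdict(float)
for wbs, dollars in actuals:
    rollup[wbs_to_cost_object[wbs]] += dollars

for cost_object, total in rollup.items():
    print(f"{cost_object}: ${total:,.0f}")
```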

Some estimators may never have the opportunity to access data this detailed, and for some it may not be completely necessary.  Maybe benchmarking to a high-level total cost by component will suffice for predicting the total cost of a similar item in the future.

How far you go depends on how much data you have, how much time it takes, and what you want to use it for. Ultimately, if you want to build a statistically significant database of benchmarks, you need to find trends in the data points, and analyzing trends at the resource-multiplier level can be very tedious.  Regardless, there is now a strong demand for data-driven estimates, and the more clearly you can tie your model inputs to historical data, the better you can defend and justify your estimate.
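As a hedged sketch of what trend-finding across benchmarks might look like, here is a simple least-squares fit of calibrated complexity values against a driver such as parts count, then an extrapolation for a new design; the data points are invented for illustration.

```python
# Fit a line through calibrated complexity values from past projects
# and use it to suggest a starting value for a new estimate.

import numpy as np

# (parts count, calibrated complexity) pairs -- hypothetical benchmarks
parts_count = np.array([120, 240, 310, 450, 600])
complexity  = np.array([5.1, 5.9, 6.4, 7.2, 8.0])

slope, intercept = np.polyfit(parts_count, complexity, 1)

# Extrapolate a starting complexity for a new 380-part design.
new_parts = 380
predicted = slope * new_parts + intercept
print(f"suggested complexity for {new_parts} parts: {predicted:.2f}")
```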

With each release of TruePlanning, Unison Cost Engineering is working to make benchmarking (or calibration) easier and more reusable (note the addition of the Calibration option in the input sheet in TP 16.1, and the Regression Analysis capabilities in TrueFindings®), and in some cases we are working in partnership with customers to help them transform raw data, automate calibration, and eventually find trends across data points that can be extrapolated from in future estimates.  In a perfect world, we would use those benchmarks as a defensible Basis of Estimate in the proposal process.

