Even with unbiased judgments and project calculations, an insidious bias will still be present in an approved-projects portfolio. The screening and ranking process used to select projects causes an optimizer's curse (OC) effect (Smith and Winkler 2006): a portfolio's true outcome value will tend to fall below the mean of the forecast distribution. Stated another way, we should expect to be disappointed in the value of the eventual outcomes. This optimizer's curse effect can be huge: the true portfolio value might be only half of the value forecast in the usual way.
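A small Monte Carlo sketch illustrates the effect. The population parameters and project counts below are hypothetical, chosen only for illustration: each project's value estimate is individually unbiased, yet selecting the top-ranked projects by estimate produces a portfolio whose forecast exceeds its true value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 100 candidate projects with true NPVs drawn
# from a common distribution, and unbiased but noisy estimates of each.
n_projects, n_selected = 100, 10
true_npv = rng.normal(loc=10.0, scale=5.0, size=n_projects)
estimate = true_npv + rng.normal(loc=0.0, scale=5.0, size=n_projects)

# Screening: fund the 10 projects with the highest estimated NPV.
top = np.argsort(estimate)[-n_selected:]

print(f"Mean forecast of funded portfolio:   {estimate[top].mean():.1f}")
print(f"Mean true value of funded portfolio: {true_npv[top].mean():.1f}")
# Ranking by estimate preferentially selects projects whose estimation
# error happens to be positive, so the funded portfolio's forecast
# systematically exceeds its true value.
```

Running this with different seeds shows the same pattern: the gap is a property of the selection process, not of any bias in the individual estimates.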
This paper describes a correction process based upon Bayesian inversion obtained with Monte Carlo simulation. Estimate/Actual (E/A) distributions characterize component evaluation errors; the mean and standard deviation of the E/A distribution measure the quality of judgments and estimates. Examples demonstrate the value of better information in project or asset evaluation, portfolio optimization, and competitive bidding.
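One way to sketch Bayesian inversion by Monte Carlo is shown below. The distributional assumptions are hypothetical (a lognormal population of actual values and a lognormal E/A error factor with mean 1.0 and standard deviation 0.4), not the paper's calibrated inputs: we simulate many (actual, estimate) pairs and approximate the posterior mean of the actual value by conditioning on simulated estimates near the observed one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population of actual project values (lognormal, median 10).
n = 1_000_000
actual = rng.lognormal(mean=np.log(10.0), sigma=0.6, size=n)

# Hypothetical E/A error factor: mean 1.0 (unbiased), sd 0.4, lognormal.
ea_mean, ea_sd = 1.0, 0.4
sigma2 = np.log(1.0 + (ea_sd / ea_mean) ** 2)   # lognormal log-variance
mu = np.log(ea_mean) - sigma2 / 2.0             # lognormal log-mean
ea = rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=n)

# Each simulated evaluation: estimate = actual x (E/A error factor).
estimate = actual * ea

def corrected_value(x, window=0.05):
    """Monte Carlo posterior mean of actual, given an estimate near x."""
    mask = np.abs(estimate - x) < window * x
    return actual[mask].mean()

# A project estimated at 25, well above the population's central value,
# is revised downward toward the population when the estimate's noise
# is taken into account.
print(round(corrected_value(25.0), 1))
```

The conditioning-by-window step is a crude but transparent stand-in for more efficient inversion schemes; with normal or lognormal assumptions the same posterior mean can be obtained in closed form as a shrinkage toward the population mean.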
The premise of our approach to correcting for the optimizer's curse is straightforward, though implementing the method requires judging the population of evaluated projects, or at least assessing the shape of this population's distribution. We offer a process and guidelines for correcting for the optimizer's curse in both project and portfolio value calculations. Correcting estimates for systematic biases restores luster to the recommended expected-value-maximizing decision policy. We also tie back to the winner's curse phenomenon (Capen et al. 1971) experienced in competitive bidding and show how it affects bid optimization.
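The winner's curse can be demonstrated with the same simulation machinery. In this hypothetical setup, several bidders hold unbiased but noisy estimates of a tract's common value and each bids its own estimate; the winner is then the bidder with the largest positive estimation error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical common-value auction: the tract is worth exactly 100 to
# every bidder, but each bidder sees an unbiased estimate with sd 20.
true_value, n_bidders, n_auctions = 100.0, 8, 10_000
estimates = true_value + rng.normal(0.0, 20.0, size=(n_auctions, n_bidders))

# Naive bidding: each bidder bids its own estimate, so the highest
# estimate wins every auction.
winning_estimate = estimates.max(axis=1)

print(f"Average winning estimate: {winning_estimate.mean():.0f}")
# The winner systematically overestimates: winning selects for positive
# error, so the average winning estimate exceeds the true value of 100.
```

Bid optimization therefore has to condition on winning, discounting a naive value estimate before bidding, which parallels the optimizer's curse correction applied to selected projects.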