Improved Framework for Measuring the Magnitude and Impact of Biases on Project Evaluation
- Malik K. Alarfaj (Texas A&M University) | Duane A. McVay (Texas A&M University)
- Society of Petroleum Engineers
- SPE Reservoir Evaluation & Engineering
- Publication Date: August 2019
- Document Type: Journal Paper
- © 2019 Society of Petroleum Engineers
- Keywords: calibration, biases, uncertainty assessment, probabilistic
Several authors over several decades (Capen 1976; Brashear et al. 2001; Rose 2004) have observed that the petroleum industry has consistently performed below expectations. Although this has been painfully obvious during the industry downturn beginning in 2014, available evidence suggests that even when the industry is profitable (e.g., during the decade before the most-recent downturn), it still performs substantially below expectations and its potential (Nandurdikar 2014). Many attribute this underperformance to cognitive biases in project evaluation, resulting in poor project valuation and selection. McVay and Dossary (2014) presented a simplified framework to estimate the cost of underestimating uncertainty. They demonstrated that chronic overconfidence and optimism (estimated distributions of project value are too narrow and shifted positively), which are common in the industry, produce substantial disappointment (realized portfolio values being less than estimated values), also common in the industry.
In this work, we generalized the McVay and Dossary (2014) framework to include full estimated distributions (e.g., normal or lognormal), instead of the truncated distributions they used. In addition, we extended their framework to model underconfidence (estimated distributions too wide) and demonstrate that underconfidence is just as detrimental to portfolio performance as overconfidence. Decision error will be minimized and portfolio value will be maximized only when there is no bias in project estimation (i.e., neither overconfidence nor underconfidence and neither optimism nor pessimism). We compared the value gained from reducing biases with that from reducing uncertainty and found that reducing biases consistently generates more value than reducing uncertainty.
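The mechanism behind this disappointment can be illustrated with a small Monte Carlo experiment (a toy sketch under our own assumptions, not the paper's framework): projects are funded when their estimated NPV is positive, and disappointment is the estimated minus realized value of the funded portfolio. Even unbiased estimates produce some disappointment through selection alone (the optimizer's curse of Smith and Winkler 2006); optimism compounds it.

```python
import numpy as np

rng = np.random.default_rng(7)

def portfolio_disappointment(optimism=0.0, n_projects=2000, noise_sd=1.0):
    """Estimated-minus-realized value of the selected portfolio.

    Projects are funded when their estimated NPV is positive. `optimism`
    shifts every estimate upward; even with optimism = 0, selecting on
    noisy estimates yields positive disappointment (the optimizer's curse).
    """
    true_npv = rng.normal(0.0, 1.0, n_projects)                      # unknown true values
    est_npv = true_npv + rng.normal(optimism, noise_sd, n_projects)  # noisy, possibly biased estimates
    funded = est_npv > 0.0                                           # fund what looks profitable
    return est_npv[funded].sum() - true_npv[funded].sum()

print(portfolio_disappointment(optimism=0.0))   # positive: selection effect alone
print(portfolio_disappointment(optimism=0.5))   # larger: optimism compounds it
```

All names and parameter values here are illustrative; the published framework works with full estimated distributions rather than this single-threshold selection rule.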
Using either framework, operators can quantitatively measure biases—overconfidence, underconfidence, optimism, and pessimism—from lookbacks (comparing actual performance with probabilistic forecasts) and calibration plots. Once aware of the direction and magnitude of these biases, operators can eliminate them in new forecasts through a combination of internal adjustment of uncertainty assessments (by means of training or ongoing feedback) and external adjustment of assessments (using measurements of bias from calibration results).
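The lookback idea can be sketched as follows (our own illustration, assuming each probabilistic forecast is summarized as a normal distribution; all names are ours). For a well-calibrated forecaster, the forecast-CDF value of each realized outcome is uniform on [0, 1]: values crowding the tails indicate overconfidence, values crowding the middle indicate underconfidence, and a shift toward 0 indicates optimism (actuals falling in the low end of the forecast distributions).

```python
import math
import random

def normal_cdf(x, mu, sd):
    """CDF of a normal(mu, sd) distribution evaluated at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def calibration_values(forecasts, actuals):
    """Forecast-CDF value of each realized outcome (uniform on [0,1] if calibrated)."""
    return [normal_cdf(a, mu, sd) for (mu, sd), a in zip(forecasts, actuals)]

# Synthetic lookback: the forecaster reports sd = 0.5 while outcomes vary with sd = 1.0
random.seed(1)
actuals = [random.gauss(0.0, 1.0) for _ in range(500)]
forecasts = [(0.0, 0.5)] * len(actuals)          # too-narrow forecasts: overconfidence
u = calibration_values(forecasts, actuals)
tail_fraction = sum(v < 0.1 or v > 0.9 for v in u) / len(u)
print(f"fraction of actuals in the outer forecast deciles: {tail_fraction:.2f}")
# A calibrated forecaster would put about 0.20 of the actuals there;
# overconfidence inflates this fraction well above 0.20.
```

Plotting the sorted calibration values against the uniform quantiles gives the calibration plot from which the direction and magnitude of bias can be read and then removed by external adjustment.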
- File Size: 2 MB | Number of Pages: 23
Alarfaj, M. K. 2018. Measuring and Improving the Reliability of Probabilistic Assessments in Petroleum Engineering. PhD dissertation, Texas A&M University, College Station, Texas (August 2018).
Begg, S. H. and Bratvold, R. B. 2008. Systematic Prediction Errors in Oil and Gas Project and Portfolio Selection. Presented at the SPE Annual Technical Conference and Exhibition, Denver, 21–24 September. SPE-116525-MS. https://doi.org/10.2118/116525-MS.
Begg, S. H., Welsh, M. B., and Bratvold, R. B. 2014. Uncertainty vs. Variability: What’s the Difference and Why is it Important? Presented at the SPE Hydrocarbon Economics and Evaluation Symposium, Houston, 19–20 May. SPE-169850-MS. https://doi.org/10.2118/169850-MS.
Bradley, E. L. 2006. Overlapping Coefficient. In Encyclopedia of Statistical Sciences, Vol. 6, second edition, ed. S. Kotz, C. B. Read, N. Balakrishnan, et al. Hoboken, New Jersey: Wiley-Interscience.
Brashear, J. P., Becker, A. B., and Faulder, D. D. 2001. Where Have All the Profits Gone? J Pet Technol 53 (6): 20–73. SPE-73141-JPT. https://doi.org/10.2118/73141-JPT.
Capen, E. C. 1976. The Difficulty of Assessing Uncertainty. J Pet Technol 28 (8): 843–850. SPE-5579-PA. https://doi.org/10.2118/5579-PA.
Fondren, M. E., McVay, D. A., and Gonzalez, R. A. 2013. Applying Calibration to Improve Uncertainty Assessment. Presented at the SPE Annual Technical Conference and Exhibition, New Orleans, 30 September–2 October. SPE-166422-MS. https://doi.org/10.2118/166422-MS.
Haynes and Boone. 2016. Haynes and Boone Oil Patch Bankruptcy Monitor, http://www.haynesboone.com/-/media/files/energy_bankruptcy_reports/oil_patch_bankruptcy_monitor.ashx?la=en&hash=D2114D98614039A2D2D5A43A61146B13387AA3AE (accessed 17 June 2019).
Hdadou, H. and McVay, D. A. 2014. The Value of Assessing Uncertainty in Oil and Gas Portfolio Optimization. Presented at the SPE Hydrocarbon Economics and Evaluation Symposium, Houston, 19–20 May. SPE-169836-MS. https://doi.org/10.2118/169836-MS.
Hubbard, D. W. 2014. How to Measure Anything: Finding the Value of Intangibles in Business, third edition. Hoboken, New Jersey: Wiley.
McVay, D. A. 2015. Industry Needs Re-Education in Uncertainty Assessment. J Pet Technol 67 (2): 72–76. SPE-0215-0072-JPT. https://doi.org/10.2118/0215-0072-JPT.
McVay, D. A. and Dossary, M. N. 2014. The Value of Assessing Uncertainty. SPE Econ & Mgmt 6 (2): 100–110. SPE-160189-PA. https://doi.org/10.2118/160189-PA.
Merrow, E. W. 2012. Oil and Gas Industry Megaprojects: Our Recent Track Record. Oil and Gas Fac 1 (2): 38–42. SPE-153695-PA. https://doi.org/10.2118/153695-PA.
Nandurdikar, N. 2014. Wanted: A New Type of Business Leader to Fix E&P Asset Developments. J Pet Technol 66 (10): 15–19. SPE-1014-0015-JPT. https://doi.org/10.2118/1014-0015-JPT.
Nandurdikar, N. S. and Wallace, L. 2011. Failure to Produce: An Investigation of Deficiencies in Production Attainment. Presented at the SPE Annual Technical Conference and Exhibition, Denver, 30 October–2 November. SPE-145437-MS. https://doi.org/10.2118/145437-MS.
Rose, P. R. 2004. Delivering on Our E&P Promises. The Leading Edge 23 (2): 165–168. https://doi.org/10.1190/1.1651465.
Smith, J. E. and Winkler, R. L. 2006. The Optimizer’s Curse: Skepticism and Postdecision Surprise in Decision Analysis. Management Science 52 (3): 311–322. https://doi.org/10.1287/mnsc.1050.0451.
Welsh, M. B., Begg, S. H., and Bratvold, R. B. 2007. Modelling the Economic Impact of Common Biases on Oil and Gas Decisions. Presented at the SPE Annual Technical Conference and Exhibition, Anaheim, California, 11–14 November. SPE-110765-MS. https://doi.org/10.2118/110765-MS.
Xu, C. and Bell, L., eds. 2016. Big Losses Reported in 4Q 2015 on Writedowns, Low Oil Prices. Oil & Gas J. 114 (4a).