Abstract
The use of proxy models for optimisation of expensive functions has demonstrated its value in many industries since the 1990s. Within reservoir engineering, similar techniques have been used for over a decade for history matching, both in commercial tools and in in-house software.
In addition to efficient history matching, proxy models have a distinct advantage when performing uncertainty quantification of probabilistic forecasts. Markov Chain Monte Carlo (MCMC) methods cannot realistically be applied directly to reservoir simulations, and even fast proxy models can fail dramatically to represent the range of uncertainty adequately if implemented without due care.
A pitfall of proxy models is that they are often regarded as ‘black boxes’ whose quality is difficult to measure. Engineers prefer to deal with deterministic simulation models that they can evaluate and understand.
The main pitfall of simple random walk MCMC techniques, which have begun to appear within reservoir engineering workflows, is a reliance on theoretical properties that are not observed in practical implementations. This can give rise to gross errors that are not generally appreciated by practitioners. Advances in recent years within the field of Bayesian statistics have significantly improved this situation, but these advances have not yet been disseminated within the oil and gas industry.
This paper describes the limitations of random walk MCMC techniques which are currently used for reservoir prediction studies, and shows how Hamiltonian MCMC techniques, together with an efficient implementation of proxy models, can lead to a more reliable and validated probabilistic uncertainty quantification, whilst also generating a suitable ensemble of deterministic reservoir models. Scientific comparison studies are performed for both an analytical case and a realistic reservoir simulation case to demonstrate the validity of the approach.
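To illustrate the distinction between the two sampling strategies compared in this paper, the following is a minimal, self-contained sketch (not the authors' implementation, and not tied to any reservoir proxy) contrasting random walk Metropolis with Hamiltonian Monte Carlo on a toy two-dimensional correlated Gaussian target; all function names and tuning parameters here are illustrative assumptions.

```python
# Toy comparison: random walk Metropolis vs. Hamiltonian Monte Carlo (HMC)
# on a strongly correlated 2-D Gaussian posterior.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: zero-mean Gaussian with strong correlation between parameters.
cov = np.array([[1.0, 0.95], [0.95, 1.0]])
cov_inv = np.linalg.inv(cov)

def log_post(x):
    return -0.5 * x @ cov_inv @ x

def grad_log_post(x):
    return -cov_inv @ x

def random_walk_metropolis(n_samples, step=0.3):
    """Isotropic Gaussian proposals; small steps mix slowly on correlated targets."""
    x = np.zeros(2)
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(2)
        if np.log(rng.uniform()) < log_post(prop) - log_post(x):
            x = prop
        samples[i] = x
    return samples

def hmc(n_samples, step=0.15, n_leapfrog=20):
    """Leapfrog integration of Hamiltonian dynamics yields distant proposals with high acceptance."""
    x = np.zeros(2)
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        p = rng.standard_normal(2)                  # resample momentum
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * step * grad_log_post(x_new)  # half step for momentum
        for _ in range(n_leapfrog - 1):
            x_new += step * p_new                   # full step for position
            p_new += step * grad_log_post(x_new)    # full step for momentum
        x_new += step * p_new
        p_new += 0.5 * step * grad_log_post(x_new)
        # Metropolis accept/reject on the total Hamiltonian (potential + kinetic energy).
        current_h = -log_post(x) + 0.5 * p @ p
        proposed_h = -log_post(x_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < current_h - proposed_h:
            x = x_new
        samples[i] = x
    return samples

rw = random_walk_metropolis(5000)
hm = hmc(5000)
print("random walk sample covariance:\n", np.cov(rw.T))
print("HMC sample covariance:\n", np.cov(hm.T))
```

On targets of this kind, the gradient-informed HMC trajectories typically recover the correlation structure with far fewer effective samples than the random walk, which is the qualitative behaviour the comparison studies in this paper examine on an analytical case and a realistic reservoir simulation case.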
The benefit of this methodology is that it allows asset teams to manage reservoir decisions effectively, using a robust and validated understanding of uncertainty. It lays the scientific foundations for the next generation of uncertainty tools and workflows.