Abstract

Geostatistical algorithms are widely used to integrate data such as seismic amplitudes, well logs and core measurements into reservoir models. However, approaches to efficiently integrate dynamic/production data into these models are largely lacking. Production data differ from static data (such as porosity, permeability, amplitude, etc.) primarily because they are non-linearly related to the connectivity characteristics of the reservoir. In this paper we develop a gradual deformation methodology for integrating two-phase production data, yielding a suite of reservoir models conditioned to both the static and the dynamic data. We utilize the Sequential Indicator Simulation algorithm within a non-stationary Markov chain to iteratively update the realizations until a history match is obtained. The methodology is tested on synthetic 2-D and 3-D reservoirs.
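The gradual deformation idea referenced above can be illustrated, for the simpler Gaussian case, by the classical parameterization that combines two independent realizations with cosine/sine weights so that the result remains a valid realization for any value of the deformation parameter. The sketch below is a minimal, hypothetical Python illustration of that idea: the misfit function, grid size and number of chain steps are placeholders, and the scheme developed in this paper applies the deformation within Sequential Indicator Simulation rather than to a Gaussian field directly.

```python
import numpy as np

def gradual_deformation(z1, z2, t):
    """Combine two independent standard Gaussian fields into a new field
    that remains standard Gaussian for any deformation parameter t.

    z(t) = z1*cos(pi*t) + z2*sin(pi*t); t = 0 returns z1, t = 0.5 returns z2.
    """
    return z1 * np.cos(np.pi * t) + z2 * np.sin(np.pi * t)

def misfit(field):
    # Placeholder for a flow simulation plus a least-squares comparison
    # against observed production data (hypothetical stand-in).
    return float((field.mean() - 0.1) ** 2)

# Illustrative outer loop: search the 1-D deformation parameter that
# minimizes the production-data misfit, then chain with a fresh field.
rng = np.random.default_rng(0)
nx, ny = 50, 50

z_current = rng.standard_normal((nx, ny))
for chain_step in range(5):
    z_new = rng.standard_normal((nx, ny))       # independent realization
    ts = np.linspace(-1.0, 1.0, 21)             # candidate deformation parameters
    candidates = [gradual_deformation(z_current, z_new, t) for t in ts]
    z_current = min(candidates, key=misfit)     # best field seeds the next chain step
```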

Introduction

Reservoir flow simulation is used to forecast oil and gas production profiles corresponding to different development scenarios. The reservoir models on which the flow simulations are performed are themselves uncertain because of the sparse information available to construct them. This uncertainty is smallest at the well locations and greatest away from the wells.

Hence the production profile for a particular development scheme cannot be predicted exactly. Geostatistical simulation algorithms such as Sequential Gaussian Simulation (SGS) and Sequential Indicator Simulation (SIS) are widely used to generate multiple equi-probable reservoir models. Each model is conditioned to the available static data such as seismic, well logs and cores, and the set of reservoir models quantifies the uncertainty stemming from the lack of complete information about the reservoir.
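For illustration, the following is a minimal, heavily simplified sketch of the sequential indicator idea for a binary facies: cells are visited along a random path, a local probability is estimated from nearby informed cells, and a facies code is drawn from it. The inverse-distance weighting is a crude stand-in for indicator kriging, and the grid size, well locations and global proportion are hypothetical; the sketch is only meant to show how several equi-probable realizations honoring the same well data can be generated.

```python
import numpy as np

def sis_realization(nx, ny, wells, global_prop, search_radius, seed):
    """Very simplified sequential indicator simulation for a binary facies.

    wells: dict {(i, j): 0 or 1} of conditioning data (e.g. facies from logs/core).
    Inverse-distance weighting of nearby informed cells stands in for
    indicator kriging, so only the sequential logic is illustrated.
    """
    rng = np.random.default_rng(seed)
    grid = -np.ones((nx, ny), dtype=int)
    for (i, j), v in wells.items():
        grid[i, j] = v                                   # honor well data exactly
    path = [(i, j) for i in range(nx) for j in range(ny) if grid[i, j] < 0]
    for k in rng.permutation(len(path)):                 # random visiting order
        i, j = path[k]
        i0, i1 = max(0, i - search_radius), min(nx, i + search_radius + 1)
        j0, j1 = max(0, j - search_radius), min(ny, j + search_radius + 1)
        num = den = 0.0
        for ii in range(i0, i1):                         # informed neighbors only
            for jj in range(j0, j1):
                if grid[ii, jj] >= 0:
                    w = 1.0 / ((ii - i) ** 2 + (jj - j) ** 2)
                    num += w * grid[ii, jj]
                    den += w
        p1 = num / den if den > 0 else global_prop       # local P(facies = 1)
        grid[i, j] = int(rng.random() < p1)              # draw from the local ccdf
    return grid

# A small ensemble of equi-probable realizations, all honoring the same wells.
wells = {(5, 5): 1, (20, 40): 0, (40, 10): 1}
realizations = [sis_realization(50, 50, wells, 0.3, 4, seed=s) for s in range(10)]
```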

Historic production data contain valuable information about the connectivity characteristics of the reservoir. However, constraining reservoir models to measurements of pressure and of oil, gas and water rates at different times is considerably more complicated, owing to the non-linear relationship between the dynamic responses and the model parameters. The procedure of adjusting a reservoir model so that it correctly reproduces the production history is known as history matching. It is a very time-consuming process and may require several months of work by the reservoir engineer. To alleviate this problem, various attempts have been made to automate the history matching process.

History matching is an ill-posed problem, and the parameter set that minimizes the deviation from the data is non-unique. Mathematically, the history matching problem can be posed in an optimization context, i.e., as the minimization of a complex least-squares objective function in a parameter space populated by multiple local minima. Two broad approaches for solving the problem are:

  • Trial-and-error methods: Run repeated flow simulations on multiple reservoir models and retain only those models that reproduce the historic production characteristics within some acceptable tolerance (a minimal misfit-screening sketch follows this list).

  • Gradient-based methods: Trial-and-error methods are computationally inefficient. Gradient-based methodsI,II, on the other hand, capitalize on the nature of the physical relationship between the observed flow response and the model parameters to expedite convergence of the optimization process.

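As an illustration of the optimization framing above, the sketch below defines a weighted least-squares misfit between simulated and observed production responses and a trial-and-error screening loop that retains only realizations within a tolerance. The simulator call, observation vector, weights and tolerance are hypothetical placeholders.

```python
import numpy as np

def misfit(simulated, observed, sigma):
    """Weighted least-squares deviation between simulated and observed
    production responses (rates, pressures) over the history period."""
    return float(np.sum(((simulated - observed) / sigma) ** 2))

def screen_models(models, run_flow_simulation, observed, sigma, tolerance):
    """Trial-and-error screening: keep only the realizations whose simulated
    history falls within the acceptable misfit tolerance."""
    accepted = []
    for model in models:
        simulated = run_flow_simulation(model)   # expensive forward simulation
        if misfit(simulated, observed, sigma) <= tolerance:
            accepted.append(model)
    return accepted

# Hypothetical usage: `models` would be the geostatistical realizations from the
# previous step, and `run_flow_simulation` would wrap a reservoir simulator call.
```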