The existence of multiple processors on a network or a parallel machine enables sets of reservoir simulations to be performed. It is then possible to build up a model of the reservoir as a response surface, expressed as a function of input variables tied to symbols in the user's data definition. By controlling these variables from a master program and applying experimental design methods over a series of runs, we can model the behavior of the simulation and so predict its response to further engineering requests. Sequential designs are particularly relevant when a bank of processors is available: once the basic number of runs required to parameterize the reservoir response functions has been performed, additional simulations may be used to obtain error bars on quantities such as recoverable oil.
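
As an illustration of this workflow, the following minimal Python sketch fits a quadratic response surface to simulator outputs at the points of a small factorial design; the run_simulation() wrapper, the two coded input variables, and the polynomial form are hypothetical placeholders rather than the paper's implementation.

```python
import itertools
import numpy as np

def run_simulation(porosity_mult, perm_mult):
    """Hypothetical stand-in for one reservoir simulation run; returns a
    scalar response such as recoverable oil."""
    return 100.0 + 20.0 * porosity_mult + 15.0 * perm_mult - 5.0 * porosity_mult * perm_mult

# Three-level full factorial design over two coded input variables.
levels = [-1.0, 0.0, 1.0]
design = np.array(list(itertools.product(levels, levels)))

# Run the simulator at each design point; in practice these runs would be
# farmed out in parallel to the available processors by the master program.
responses = np.array([run_simulation(x1, x2) for x1, x2 in design])

# Fit a quadratic response surface:
#   y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([
    np.ones(len(design)),
    design[:, 0], design[:, 1],
    design[:, 0] * design[:, 1],
    design[:, 0] ** 2, design[:, 1] ** 2,
])
coeffs, *_ = np.linalg.lstsq(X, responses, rcond=None)

def surface(x1, x2):
    """Evaluate the fitted response surface at a new input point."""
    return coeffs @ np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
```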

Once the response surface exists, both automatic optimization and large-scale risk analysis predictions may be performed at high speed.
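
A short sketch of how a fitted surface might then serve as a fast surrogate for risk analysis and optimization; the coefficients, input distributions, and bounds below are illustrative assumptions, not results from the study.

```python
import numpy as np
from scipy.optimize import minimize

def surface(x1, x2):
    """Placeholder fitted response surface (e.g. recoverable oil) over two
    coded input variables; the coefficients here are illustrative."""
    return 100.0 + 20.0 * x1 + 15.0 * x2 - 5.0 * x1 * x2 - 3.0 * x1 ** 2

# Risk analysis: sample the uncertain inputs and evaluate the surrogate,
# which is far cheaper than rerunning the simulator itself.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=0.5, size=(100_000, 2))
predictions = surface(samples[:, 0], samples[:, 1])
p10, p50, p90 = np.percentile(predictions, [10, 50, 90])
print(f"P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f}")

# Optimization: find the inputs that maximize the surrogate within bounds.
result = minimize(lambda x: -surface(x[0], x[1]), x0=[0.0, 0.0],
                  bounds=[(-1.0, 1.0), (-1.0, 1.0)])
print("optimal coded inputs:", result.x)
```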

This approach combines well with multiple realization geological modeling. Running a number of geological realizations enables the error involved in simulating a given engineering scenario to be quantified; this in turn provides an estimate of the uncertainty on all the predictions of the study.
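
One possible way to quantify that error, sketched below under the assumption of a hypothetical simulate_scenario() wrapper, is to run the same engineering scenario on every geological realization and report the spread of the predicted quantity.

```python
import numpy as np

def simulate_scenario(realization_seed):
    """Stand-in for a full simulation of one engineering scenario on one
    geological realization; returns e.g. recoverable oil."""
    rng = np.random.default_rng(realization_seed)
    return 100.0 + rng.normal(scale=8.0)

# Run the scenario across every realization (in parallel on the available
# processors) and summarize the spread as an uncertainty estimate.
recoveries = np.array([simulate_scenario(seed) for seed in range(20)])
print(f"mean={recoveries.mean():.1f}  std={recoveries.std(ddof=1):.1f}")
print("P10/P50/P90:", np.percentile(recoveries, [10, 50, 90]))
```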

By monitoring the results of the simulations interactively, and measuring the quality of the history match obtained with each, it is possible to condition subsequent runs, for example by reducing the use of realizations that consistently yield poor matches.
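
A minimal sketch of one such conditioning scheme, assuming accumulated history-match misfits per realization and an exponential weighting that the paper does not specify: realizations with consistently poor matches receive a smaller share of the next batch of runs.

```python
import numpy as np

# Accumulated history-match misfit (e.g. sum of squared residuals against
# observed production) for each geological realization so far; placeholder values.
misfit = np.array([12.0, 45.0, 8.0, 150.0, 30.0])

# Convert misfits to likelihood-style weights and normalize.
weights = np.exp(-0.5 * misfit / misfit.min())
weights /= weights.sum()

# Allocate the next batch of simulation slots in proportion to the weights,
# so poorly matching realizations are used less often. The rounding is
# approximate; a supervisor would adjust to the exact batch size.
next_batch = 10
allocation = np.round(next_batch * weights).astype(int)
print("runs allocated per realization:", allocation)
```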

We describe software to perform such multiple realization studies, acting as an interactive supervisor through a simple, open PVM (Parallel Virtual Machine) interface to a reservoir simulator.
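
The sketch below shows the shape of such a supervisor's dispatch loop only in outline: a local process pool stands in for the bank of processors, and run_case() is a hypothetical wrapper around one simulator invocation, since the actual software communicates with the simulator through PVM messages.

```python
from multiprocessing import Pool

def run_case(case):
    """Stand-in for launching the simulator on one case and returning its
    key results (e.g. a history-match misfit)."""
    realization, variables = case
    misfit = sum(v * v for v in variables) + realization  # placeholder
    return realization, misfit

if __name__ == "__main__":
    # Initial queue of cases: (realization index, controllable input variables).
    cases = [(r, (0.1 * r, -0.05 * r)) for r in range(8)]

    with Pool(processes=4) as pool:
        # The supervisor monitors results as they arrive and could condition
        # what enters the queue next (e.g. drop poorly matching realizations).
        for realization, misfit in pool.imap_unordered(run_case, cases):
            print(f"realization {realization}: misfit {misfit:.2f}")
```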
