Abstract
In the oil industry, most reservoir studies are routinely run with commercial reservoir simulators developed about 30 years ago. These simulators were conceived for serial computers, logically structured grids, moderate permeability contrasts, and moderate physical complexity; parallel computing was only added at later stages. This is crucial during reservoir model construction: computational time prevents real-life simulation grids from hosting more than one or two million active cells, so grid coarsening and property upscaling still play a key role. Often, this acts as a hard constraint in EOR/IOR simulation, which usually demands increasing accuracy to resolve heterogeneity and structural complexity.
In the last decade, many reservoir simulators have been developed specifically for parallel architectures, adopting flexible formulations and more robust linear solvers. This provides the opportunity to drastically speed up reservoir simulations and to handle large models without upscaling. The next step is to use the new simulators as enablers of accurate, high-resolution models. In this framework, the Company is implementing a step change in reservoir simulation, deploying a new-generation high-resolution simulator for its most critical and complex assets. At the same time, the Company has put into operation a 4 Pflop/s High Performance Computing (HPC) system to more effectively support exploration and reservoir activities.
This paper documents a study carried out to highlight the benefits to reservoir modeling attained by combining the high-resolution reservoir simulator with the top-class HPC facility. The main target of the study was to provide indications about the size of the models that these new technologies make manageable in real-life applications, where model complexity may stem from geology, the number of wells, field development schemes, and physical processes.
For this purpose, two case studies were selected, namely a brown field and a green field. The brown field is a gas-condensate carbonate reservoir with 30 years of history and 25 years of forecast, both periods characterized by a large number of wells and a mix of primary depletion and gas injection. The green field, instead, is a very heterogeneous, faulted, undersaturated oil reservoir with a 15-year development plan based on water flooding.
Several reservoir models, ranging from millions to hundreds of millions of active cells, were built and simulated in different parallel configurations. The components of the workflow (modeling packages, simulator, and HPC configurations) were stretched to run large-scale models within typical reservoir-study time scales, where simulations are expected to last less than one day. The models were deliberately chosen to be larger and more complex than those currently used operationally, with the purpose of identifying potential limitations imposed by hardware and software that may impact future generations of simulation models, thus ensuring that simulator development, model development, and hardware requirements remain aligned.
The present work indicated that models with tens of millions of cells can now be easily simulated by combining HPC systems and high-resolution reservoir simulators. However, moving giga-cell reservoir models from papers to the engineer's desk will require further improvements in simulation technology, with emphasis on scalability and optimal management of multi-core architectures.