Reservoir characterization is a process that spans the life of a reservoir to support economic decision-making. This paper discusses the computing, communication and operational systems that support the process of optimizing the return on investments in a reservoir. These systems employ high-performance computing, interactive visualization, global telecommunications, shared earth models and other technologies. The current status and trends of these technologies are reviewed.
The value of a model, be it of a reservoir or an automobile, is as a heuristic. A model combines measured data, judgment calls and preconceptions into a representation useful for decision-making. Information from all sources increases our understanding of the reservoir, and that understanding is captured in our reservoir models. The parties with economic interests in the reservoir work to quantify and to minimize the risks arising from reservoir unknowns. It is essential that the physical reservoir models used by an asset management team be capable of providing the required information. The operational costs of acquiring and assimilating new data into the models must also be addressed.
The process of reservoir characterization entails the construction of a representation of geometry, topology, and materials. This process is an integral part of the business goal of optimizing return on investments in reservoir assets. The majority of publications address geological and geophysical issues in the creation of better dynamic models. The focus here is on systems and operational topics that must be addressed when considering reservoir characterization in the broader context of asset optimization.
High-performance computing (HPC) and interactive visualization are two technologies regarded as standard components in the larger process of integrated reservoir optimization. There is a slight semantic play here, in that the definition of "High" in HPC changes from year to year. The reason that these technologies (and others - see below) are required is simply that the data volumes, geoscience subtleties, and complexity of the static reservoir structure and its production dynamics combine to create an enormous computational task. The good news is that the computing and telecommunications industries are embroiled in constant, furious competition to bring us the performance, functionality, and usability that we need at ever-decreasing prices.
The goal of reservoir characterization is to provide sound models to support production decision-making, which requires integrating data from diverse sources. The overall business goal is to optimize return on investment; this translates into having a reservoir model good enough to predict and to plan production and treatments. In addition to the traditional data inputs to this process, newer techniques are emerging, such as 4D seismic monitoring of fluid boundaries. The goal also includes minimizing the costs associated with data acquisition and decision-making. An often hidden cost is computing systems support for data handling and data integration: poor data management or inefficient data integration reduces any advantage gained from possessing those data.
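One data-integration chore of the kind alluded to above is reconciling measurements keyed by well identifier from independent sources. The following is a minimal, hypothetical sketch (the well names, depths, and tolerance are illustrative assumptions, not values from this paper) showing how mismatched sources surface as conflicts that require costly manual review:

```python
def integrate_by_well(log_tops, seismic_tops, tolerance_m=5.0):
    """Merge per-well horizon tops picked from well logs with tops
    picked from seismic data, flagging wells where the two sources
    disagree by more than a depth tolerance (in meters)."""
    merged, conflicts = {}, []
    for well, log_top in log_tops.items():
        seis_top = seismic_tops.get(well)
        if seis_top is None:
            # Only one source available: use it, but record provenance.
            merged[well] = {"top_m": log_top, "source": "log only"}
        elif abs(seis_top - log_top) <= tolerance_m:
            # Sources agree within tolerance: average them.
            merged[well] = {"top_m": (log_top + seis_top) / 2,
                            "source": "both"}
        else:
            # Disagreement beyond tolerance: hidden cost of manual rework.
            conflicts.append(well)
    return merged, conflicts

# Hypothetical inputs: horizon depth (m) per well from each source.
log_tops = {"W-1": 2100.0, "W-2": 2150.0, "W-3": 2200.0}
seismic_tops = {"W-1": 2103.0, "W-2": 2180.0}
merged, conflicts = integrate_by_well(log_tops, seismic_tops)
# W-1 reconciles, W-3 has log data only, W-2 is flagged for review.
```

Even this toy case shows where the hidden cost arises: every well in `conflicts` consumes staff time before the data can add value to the model.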
Oilfield processing tasks have traditionally demanded the largest computers.