Summary

Managing complexity and technological complexification is a necessity in today's business environment. This paper outlines a method to increase value addition significantly by multidisciplinary reservoir studies. In this context, value addition refers to a positive impact on a business decision. The approach ensures a level of complexification in line both with business questions at hand and the realities of reservoirs.

Sparse well control, seismic uncertainties, imperfect geologic models, time constraints, and software and computing-hardware limitations represent some common reservoir realities. The process model detailed in the paper uses these apparent shortcomings to moderate (i.e., guide) the level of complexification. Several project examples illustrate the implementation of the process model. The paper is an extension of three previous investigations1–3 that deal with issues of method and uncertainty in reservoir-performance forecasting.

Introduction

Multidisciplinary teams and data have become the standard 1990s methods for addressing large-scale reservoir-management issues. Concurrently, reservoir simulation has assumed the role of "knowledge manager" for ever-growing quantities of information.

The paper pursues three basic questions:

  1. How can we maximize the value added from integrated reservoir studies,

  2. How can we achieve a pragmatic balance between business objectives/timetables and problem complexification, and

  3. How best can we use the technology dividend provided by the explosion of computing power?

Primarily because of their size, Saudi Arabian fields amplify the significance of these three questions.

What has emerged is the realization that reservoir simulation needs to provide a proper demarcation between scientific and business objectives to remain business-relevant.

The discussion that follows consists of two main parts. First, we present an analysis of complexity in general and reservoir systems in particular. This is followed by a process model (i.e., parallel planning plus) and a set of principles that link business needs, reservoir realities, and simulation in the context of multidisciplinary studies.

The following definitions will facilitate the discussion.

Complex (adjective): Composed of interconnected parts.

Complexity: The state of being intricate. The degree of interconnection among various parts.

Complexification: The process of adding incremental levels of complexity to a system.

Detail vs. Dynamic Complexity

A vast array of multisourced information makes up reservoir systems (Fig. 1). Reservoir simulation is our attempt to link the "detail complexity" of such a system to the "dynamic complexity"4,5 expected in business decisions. In this regard, a systems-engineering perspective on reservoir management is highly relevant. Senge4 defines two types of complexity: detail and dynamic. Detail complexity entails defining individual ingredients in fine detail, while dynamic complexity refers to the dynamic, often unpredictable, outcomes of the interactions among the individual components. Senge4 states that "the real leverage in most management situations lies in understanding dynamic complexity, not detail complexity." This is precisely true for many of the questions facing reservoir-management project teams in the industry. Deciding when to initiate an EOR project or a pattern realignment, or how to develop a field, are typical dynamic complexity problems.

Relative-permeability data, field-management strategies, or wellbore hydraulics are examples of detail complexity. Geologic, geostatistical, and reservoir-simulation models are also examples of detail complexity, but represent higher orders of organization. Interestingly, reservoir-simulation models have a dual function: first, as an organizer of detail complexity, and, second, as a tool for interpreting dynamic complexity (a distinction from geologic models).

Technological complexification is the process of adding incremental levels of detail complexity to a system to represent its dynamic complexity more rigorously. Each one of the components depicted in Fig. 1 offers an avenue of complexification. Perhaps ironically, every component also carries an element of uncertainty.

New technologies are adding significantly to the detail complexity available to multidisciplinary teams. One can see that advances in computing technology, for instance, play a role in the cycle of complexification that Fig. 2 shows. As we acquire more computing power, we can build more complex models, which will further delineate the questions being addressed, calling for more computing power, and so on. The real question, however, is whether we are in fact getting a better answer to the questions posed. Or, alternatively, are we making a difference?

Multidisciplinary studies are vulnerable to the tendency toward maximal detail complexity. As one of the constituent disciplines (e.g., seismic, geostatistics) produces a more detailed reservoir representation, the pressure mounts for the other disciplines to match the level of complexification in their respective areas. However, for many reservoir problems, we may have a nonlinear relationship between dynamic and detail complexity (Fig. 3). As the number of detail complexity elements rises, the number of interactions among the elements proliferates. Any one of these interactions can be a showstopper. For example, reservoir-simulation models constructed at the detail level (i.e., scale) of geocellular models can become numerically unstable or prohibitively central-processing-unit (CPU) intensive; either way, a nonsolution.
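The proliferation of interactions noted above can be made concrete with a simple combinatorial model. The sketch below is illustrative and not from the paper; it assumes, as a simplification, that every pair of detail complexity elements can potentially interact, so the interaction count grows quadratically while the element count grows only linearly. The helper name is hypothetical.

```python
def potential_interactions(n_elements: int) -> int:
    """Hypothetical helper: count pairwise interactions among
    n detail-complexity elements, assuming every pair can interact.
    This is n * (n - 1) / 2, i.e., quadratic growth."""
    return n_elements * (n_elements - 1) // 2

# A tenfold increase in elements yields roughly a hundredfold
# increase in potential interactions.
for n in (10, 100, 1000):
    print(n, potential_interactions(n))
```

Even under this idealized assumption, the nonlinearity is apparent: adding detail elements multiplies, rather than adds to, the opportunities for one interaction to become a showstopper.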

Complexification vs. Error Expectations

The reservoir system depicted in Fig. 1 does not represent a controlled data environment; i.e., we are not operating in a setting where we can control the quality and quantity (sufficiency) of data. Therefore, in reservoir systems, the concept of "garbage in/garbage out," when taken literally, is an oxymoron. There is always some contamination (error or uncertainty) in one of the detail complexity elements. Thus, we need to redefine our mission as "given the data environment as is, what is an acceptable error, and what is an appropriate level of complexification?"
