One of the chief challenges facing the E&P industry is to increase the recovery of hydrocarbons from the existing asset base. With access to new resources becoming more difficult and several major mature provinces facing production declines, awareness is growing that we need to manage our assets differently. Recovery optimization is therefore a growing issue, with significant implications for future capital, technology and workforce competence development.
One approach to this challenge is the "Smart Field" (a.k.a. "digital oil field") concept, which introduces advanced systems engineering and optimization concepts to asset management. This includes new IT-enabled ways of collaborative working and tight integration of surface and subsurface technologies. In this paper we focus on novel approaches and technology for detecting and monitoring subsurface changes, an area often referred to as reservoir surveillance or reservoir monitoring. On the basis of examples we will argue that a paradigm shift in reservoir management is called for to ensure optimal field development.
Traditional approaches to recovery optimization are too multi-faceted to do justice to in a few introductory remarks. However, there is ample evidence that faster adoption of technological advances would improve hydrocarbon recovery for many types of fields, and that the value-to-investment ratio (VIR) for several types of new technology far exceeds unity. In the past decade, the increased role of IT in processing, integrating and visualizing data has provided many examples. Workflow process integration, encapsulation of knowledge in a shared earth model, advanced 3D visualization, increasingly detailed reservoir models, and remotely controlled operation centers are being adopted by most operators.
Another trend, which seems to be progressing at a slower pace, is the growing awareness that the above IT-enabled advances, in spite of their undisputed value, are insufficient. For instance, more detailed reservoir simulation will certainly contribute to a better representation of available data, and industry thrusts to build more precise models, using tens of millions of grid blocks, should therefore pay off. However, the question whether the available ("traditional") data are sufficient to turn precision into accuracy (an important distinction) is asked less frequently than it should be.