Smart Fields and Intelligent Wells are the buzzwords currently dominating the petroleum industry. Many papers have been published on the implementation of intelligent wells, but little has been published on the actual application of the continuously measured data.
Engineers face many different problems when attempting to use real time data for real time optimization, to name just a few:
Moving huge amounts of data
Many engineering software packages are not able to handle high frequency data
The central question is: how can the real time data stream be used for continuous asset optimization, and is there any economic value in it?
The first part of this SPE paper provides a general introduction to current problems and how they can be solved.
The second part will give an actual example of the implementation of an Automated Reservoir Surveillance System1 in the Medusa field, Gulf of Mexico (GOM). Different levels of implementation have been identified:
Data management and integration
Real time rate allocation
The first three points have been successfully implemented; the remaining point is currently in progress. A detailed view is given of the goals to be achieved in the end - in particular, how to actually use real time data for continuous monitoring and optimization.
The uniqueness of this approach lies in the continuous, automatic comparison of measured values with estimates based on computational models. If the two values deviate from each other, the engineer is notified by an alarm system. This powerful new approach can be seen as the first step towards a Smart Field implementation.
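The comparison-and-alarm idea can be sketched in a few lines. The function below is a minimal illustration, not the paper's implementation: all names, the relative tolerance, and the requirement of a sustained deviation before alarming are illustrative assumptions.

```python
# Minimal sketch (assumed logic, not the paper's actual system):
# compare a measured series against model estimates and flag
# deviations that persist for several consecutive samples.

def deviation_alarms(measured, estimated, rel_tol=0.05, min_run=3):
    """Return start indices where |measured - estimated| exceeds
    rel_tol of the estimate for at least min_run consecutive samples."""
    alarms = []
    run = 0
    for i, (m, e) in enumerate(zip(measured, estimated)):
        if e != 0 and abs(m - e) / abs(e) > rel_tol:
            run += 1
            if run == min_run:          # sustained deviation -> raise alarm once
                alarms.append(i - min_run + 1)
        else:
            run = 0                     # back within tolerance, reset counter
    return alarms

# Example: bottomhole pressure (psi), model estimate vs. gauge reading
est = [3000.0] * 10
meas = [3000, 3005, 2990, 3200, 3210, 3205, 3215, 3000, 2995, 3001]
print(deviation_alarms(meas, est))  # -> [3]: sustained deviation starts at sample 3
```

In practice the model estimate would come from a calibrated well or reservoir model, and the alarm would feed a notification system rather than a print statement.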
Real time data can assist the engineer in managing the asset. Continuously measured data reveal more information than sporadic measurements, shortening the learning curve and leading to a faster, better understanding of the field's behavior. This SPE paper does not focus on the hardware aspect of intelligent wells or smart fields. Instead, we demonstrate the integration of continuously measured data and how this process can assist the engineer in day-to-day work. It is about Automated Reservoir Surveillance (ARS).
Today, almost all deepwater offshore wells are equipped with a basic set of sensors - bottomhole and wellhead pressure and temperature gauges, to mention only the most common ones. These sensors typically measure at frequencies ranging from seconds to minutes, depending on the requirements. The huge amount of production data generated in a single month makes manual manipulation of the data impossible. In most cases the bottomhole pressure is extracted from the real time data historian for pressure transient analysis, while all the other measured parameters remain unused in the high frequency database.
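One common way to make such a high-frequency stream digestible for engineering software is to downsample it, for example to hourly averages. The sketch below is purely illustrative (the timestamp format and bucket width are assumptions, not from the paper):

```python
# Illustrative sketch: collapse a second-level sensor stream to hourly
# averages so downstream tools that cannot handle high-frequency data
# can still consume it. Not the paper's actual data-management system.
from collections import defaultdict

def hourly_average(samples):
    """samples: iterable of (unix_timestamp_seconds, value).
    Returns {hour_start_timestamp: mean value} for each hour seen."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[ts - ts % 3600].append(value)   # group by hour start
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}

# 1 Hz wellhead-pressure readings collapsed to one value per hour
stream = [(t, 1500.0 + (t % 2)) for t in range(0, 7200)]
print(hourly_average(stream))  # -> {0: 1500.5, 3600: 1500.5}
```

Real historians offer such aggregation natively; the point is only that some reduction step usually sits between the raw stream and the engineering application.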
A search of the SPE library reveals many papers on how to implement an intelligent well or how to use the generated data at a very high level, but little has been said about the most important task in the middle: how to manage the real time data and bring it to the engineer.
De Jonge and Stundner2 presented an approach that uses data mining for automated reservoir surveillance; that work provides the theoretical background for the real-life example presented here.
The Real Time Optimization Technical Interest Group3 gave a very good definition of this topic.
Saputelli et al.4 went one step further and introduced the Self-Learning Reservoir. Such a field theoretically requires no human intervention - it is self-adapting.