One of the goals of reservoir characterization, particularly in mature reservoirs, is to identify unswept regions containing high oil or gas saturation for targeted infill drilling or enhanced recovery. Given the dense spatial sampling associated with time-lapse seismic surveys, there is a potential to map detailed interwell saturation changes in the reservoir. However, seismic attributes such as travel time and amplitude changes are only indirectly related to reservoir saturation changes, whereas production data such as the water-cut response at wells are directly related to saturation variations. An effective integration of the two forms of data is necessary to reliably characterize the saturation distribution, and this remains an important unresolved problem. A fundamental difficulty is the burden of reservoir simulation: the use of conventional finite-difference reservoir simulators renders coupled inversions of seismic and production data computationally challenging.

Recently, streamline-based flow simulation models have demonstrated significant potential for integrating dynamic data into high-resolution reservoir models. The streamline approach has relied on computing and matching the ‘travel time’ of fluid fronts at the wells and can be naturally extended to directly relate time-lapse images of a multiphase front to reservoir properties. In particular, saturation, porosity, and pressure variations in the reservoir will induce changes in seismic attributes such as ‘amplitude’ or ‘travel time’. These changes can be incorporated into streamline-based history matching using appropriate rock physics models. The result will be interwell saturation maps that are constrained by both production and seismic data. A key advantage of our approach is its computational efficiency, which makes it ideally suited for large-scale field applications.
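The ‘travel time’ matched in streamline methods is the time of flight of the fluid front along each streamline, i.e. the integral of arc length over interstitial speed. A minimal Python sketch of this discretized integral (the segment lengths and speeds below are hypothetical illustrative values, not the paper's implementation):

```python
def travel_time(segment_lengths, interstitial_speeds):
    """Time of flight along one streamline: sum of ds / |v| over segments."""
    return sum(ds / v for ds, v in zip(segment_lengths, interstitial_speeds))

# Uniform speed along five 10-m segments: front arrives at t = 25.0
base = travel_time([10.0] * 5, [2.0] * 5)

# Halving the speed on the first segment (e.g. lower permeability there)
# delays the front arrival to t = 30.0
slow = travel_time([10.0] * 5, [1.0, 2.0, 2.0, 2.0, 2.0])
```

History matching then adjusts reservoir properties until the computed arrival times agree with the observed front arrivals at the wells.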

At the Bay Marchand Field in the Gulf of Mexico, both time-lapse seismic and water-cut observations are integrated to infer permeability variations within the reservoir. Some 40 years of historical production data are available at the Bay Marchand Field. In all, water-cut data from eight wells are used to constrain permeability variations in the reservoir. The streamline-based inversion is able to match the observations in 15 iterations, approximately four hours of computation for this 20,000-cell 3D model. The time-lapse seismic data are related to the reservoir saturation and pressure changes through Gassmann's equation and laboratory-derived relationships. The time-lapse seismic data reveal large-scale amplitude variations that appear to correlate with saturation changes in the reservoir.
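The Gassmann step replaces the pore fluid and predicts the resulting change in the saturated-rock bulk modulus, which in turn drives the seismic amplitude change. A minimal sketch of this fluid substitution; the moduli and porosity below are generic illustrative values, not Bay Marchand rock properties:

```python
def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from Gassmann's equation.

    k_dry : dry-frame bulk modulus (GPa)
    k_min : mineral (grain) bulk modulus (GPa)
    k_fl  : pore-fluid bulk modulus (GPa)
    phi   : porosity (fraction)
    """
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def fluid_modulus(k_brine, k_oil, s_w):
    """Reuss (Wood's) average of brine and oil moduli at water saturation s_w."""
    return 1.0 / (s_w / k_brine + (1.0 - s_w) / k_oil)

# Illustrative values in GPa (assumed, not field-calibrated)
k_dry, k_min, phi = 6.0, 36.0, 0.25
k_brine, k_oil = 2.8, 1.0

# Water sweep from S_w = 0.2 to S_w = 0.8 stiffens the pore fluid,
# raising the saturated bulk modulus and hence the seismic response
k_sat_before = gassmann_ksat(k_dry, k_min, fluid_modulus(k_brine, k_oil, 0.2), phi)
k_sat_after = gassmann_ksat(k_dry, k_min, fluid_modulus(k_brine, k_oil, 0.8), phi)
dk = k_sat_after - k_sat_before
```

The sign and magnitude of `dk` across the reservoir is what the time-lapse amplitude map is compared against; the shear modulus is unchanged by fluid substitution under Gassmann's assumptions.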


Time-lapse monitoring is maturing as a technology and is increasingly used for reservoir management and monitoring purposes. Guidelines for its application are also becoming clear based on industry experience.1 The methodology is advancing rapidly, and new techniques are under development that allow discrimination between pressure and fluid-saturation changes over time.2,3,4

To date, time-lapse seismic has primarily served as a monitoring tool: time-lapse seismic observations are used to map changes in reservoir saturation and pressure. The next logical step is to use time-lapse seismic data to characterize the reservoir, that is, to infer reservoir permeability and porosity heterogeneity. Such time-lapse-derived models of reservoir permeability aid in optimizing secondary recovery of by-passed oil and gas, and the additional information provided by the time-lapse data will result in more accurate predictions of the performance of future production efforts.

So far, there have been very few attempts at formal reservoir characterization using 3D time-lapse observations.5,6 By formal reservoir characterization, we mean some manner of inversion of the time-lapse field data for reservoir flow properties. Time-lapse reservoir characterization is hampered by computational difficulties. Typically, finding a reservoir model that is compatible with a set of saturation and pressure changes, as would be derived from the time-lapse data, requires a significant number of reservoir simulations. Because each reservoir simulation may take hours, if not days, of CPU time on a workstation, formal inversion can be prohibitively expensive. For example, stochastic methods, such as the simulated annealing approach used by Huang et al.,6 require hundreds, if not thousands, of reservoir simulations.
