Abstract
The Dynamic Numerical Simulation in Flowlines (DNSFL) is an alternative tool well suited to handling Dynamic Models at Fine Scale. This capability is particularly relevant for the case studied here: a large multilayered waterflooding process developed in a giant field with laterally extensive reservoirs and considerable facies and stratigraphic variations.
The DNSFL performs these tasks well because it decouples reservoir geometry and heterogeneity from the transport equations, solving convection-dominated flow problems faster and with greater computational efficiency. This makes it possible to build models with finer spatial discretization and, therefore, to better represent reservoir heterogeneity.
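To make the decoupling idea concrete, the sketch below shows the basic streamline principle in a minimal form: once the pressure field defines the streamlines, saturation transport reduces to a one-dimensional problem along each streamline in the time-of-flight coordinate. This is an illustrative sketch only, not the DNSFL implementation; the function names, Corey-type relative permeabilities, viscosities, and grid parameters are assumptions chosen for the example.

```python
import numpy as np

def fractional_flow(sw, mu_w=1.0, mu_o=5.0):
    """Buckley-Leverett fractional flow with simple Corey-type relative permeabilities.
    (Assumed property model for illustration only.)"""
    krw = sw ** 2
    kro = (1.0 - sw) ** 2
    return (krw / mu_w) / (krw / mu_w + kro / mu_o)

def solve_along_streamline(n_cells=200, total_tof=1.0, t_end=0.5, cfl=0.5):
    """Solve the 1D saturation transport equation dSw/dt + dF(Sw)/dtau = 0
    along a single streamline parameterized by time of flight (tau),
    using explicit first-order upwinding. All parameters are illustrative."""
    dtau = total_tof / n_cells
    sw = np.zeros(n_cells)          # initial water saturation along the streamline
    sw_inj = 1.0                    # injected water saturation at the inlet
    t = 0.0
    while t < t_end:
        dt = cfl * dtau             # simple stable step assuming unit characteristic speed
        f = fractional_flow(np.concatenate(([sw_inj], sw)))
        sw -= (dt / dtau) * (f[1:] - f[:-1])   # upwind flux difference
        np.clip(sw, 0.0, 1.0, out=sw)
        t += dt
    return sw

if __name__ == "__main__":
    profile = solve_along_streamline()
    print("Cells behind the water front (Sw > 0.1):", int((profile > 0.1).sum()))
```

Because each streamline is solved independently of the 3D grid geometry, the transport step scales with the number of streamlines rather than with the full cell count, which is why convection-dominated problems can be handled on much finer static grids.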
The analyzed field consists of fluvio-lacustrine deposits arranged in nine normally graded sand-clay cycles (only eight of which were modeled) forming partially connected reservoirs, with 250 active wells in commingled production and water injection, and a long, detailed history of simultaneous primary and secondary events.
In a previous paper (SPE 94815), a Streamline-based Global History Matching of this field was presented. That process made it possible to calibrate the Geological Model, to develop a clear conceptualization of the current primary and secondary production mechanisms and of the field's productive behavior, and to evaluate the geostatistical and Upscaling procedures to be applied in defining the Simulation Model.
This paper illustrates how the Integral Model obtained through detailed Streamline-based History Matching is used for Waterflooding Design Optimization, Surveillance, and Monitoring, showing that these principles are key to understanding reservoir performance and to identifying opportunities that will improve ultimate recovery.
During the detailed History Matching process, CPU runtimes of around 200 minutes were achieved with a 1,225,000-grid-cell Model and 190 timesteps (quarterly at the start and monthly thereafter), running on a Pentium 4 PC with a 3.2 GHz CPU and 2 GB of RAM. This demonstrated that it is possible to work with a large Streamline-based Model in relatively short processing times.