This paper describes a methodology to improve facies model robustness through the systematic integration of well data ("hard" data) and outcrop analog studies ("soft" data), both implemented as conditioning data in alternative numerical methods. The methodology is illustrated by a case study of a fluvial reservoir analog in the Triassic of the Lodeve basin (France).

In this context, most reservoir grids are populated by kriging-based stochastic algorithms, supported by geostatistical tools controlling lateral continuity, vertical organization, or simple facies arrangements. While these tools are sufficient to handle simple environments with smooth facies transitions, all are strongly affected by sampling bias, giving a skewed representation of the sedimentological model and, consequently, a skewed numerical model. Moreover, these statistics-based methods are generally ineffective at representing complex fluvial systems. While well data and geostatistical tools are necessary to elaborate a robust conceptual model, a secondary constraint must be used in parallel to better capture the characteristics of the depositional system.
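As a minimal illustration of this sampling effect (not part of the case study; all dimensions and proportions below are hypothetical), the sketch places laterally discontinuous channel bodies on a synthetic cross-section and compares the exhaustive channel proportion with the proportion seen by a handful of vertical wells. Any purely statistics-based simulation conditioned on the well histogram inherits whatever bias the wells introduce.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 2D cross-section: 0 = floodplain background, 1 = channel sand.
# Channel bodies are drawn as elongated rectangles so the field is spatially
# correlated, as in a real fluvial interval (all dimensions illustrative only).
nx, nz = 400, 60
field = np.zeros((nz, nx), dtype=int)
for _ in range(12):                               # 12 channel bodies
    cx, cz = rng.integers(0, nx), rng.integers(0, nz)
    half_w, half_t = rng.integers(15, 40), rng.integers(1, 4)
    field[max(cz - half_t, 0):cz + half_t, max(cx - half_w, 0):cx + half_w] = 1

# Sample at a few vertical "wells" only, as in a typical subsurface dataset.
well_x = [50, 150, 250, 350]
well_samples = field[:, well_x].ravel()

p_true = field.mean()           # exhaustive channel proportion (outcrop-scale truth)
p_wells = well_samples.mean()   # channel proportion seen by the wells
print(f"exhaustive channel proportion : {p_true:.2f}")
print(f"well-derived channel proportion: {p_wells:.2f}")
# Because the channel bodies are laterally discontinuous, a few wells can markedly
# over- or under-estimate the proportion; a simulation conditioned only on the well
# statistics then reproduces that skewed figure across the whole grid.
```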

The methodology developed here reviews the fundamental steps of the facies modeling process: data analysis, conceptual model elaboration coupled with a "back to outcrop" process and, finally, numerical implementation and its associated quality control procedures. The study is supported by an exhaustive dataset comprising 6 drillholes including one cored (42 m), conventional well-log data (6 wells), GeoRadar profiles (10 lines), CCAL (>150 samples), and a fully exposed outcrop cliff 200 m in length. The systematic data review defines which facies can be identified at the wells, how these facies are vertically stacked, and how they evolve laterally. These observations help consolidate a conceptual sedimentological model able to predict the facies partitioning enhanced by hydraulic mechanisms. To fill the gap between concept elaboration and facies modeling implementation, a "back to outcrop" step is fundamental to provide critical elements that directly impact the robustness of geological models. These include (but are not limited to) geobody dimensions, interactions and preservation, at an intermediate scale between wells (<1 m) and seismic (>50 m). Finally, these observations are implemented as external constraints in a concept-based algorithm (nested Boolean, multi-point statistics) to capture more precisely the rules governing the depositional model.
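To illustrate how outcrop-derived geobody dimensions can act as external constraints in a concept-based algorithm, the sketch below runs a Boolean-style placement loop: channel objects whose widths and thicknesses are drawn from (hypothetical) outcrop-measured distributions are dropped into a grid until a target channel proportion from the conceptual model is reached. It is a simplified stand-in, not the nested Boolean or multi-point statistics implementations of a production package, and it omits conditioning to well picks and geobody interaction/preservation rules for brevity.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical outcrop-derived constraints (all values illustrative):
# channel width and thickness statistics measured on the cliff face, plus a
# target channel proportion from the conceptual sedimentological model.
width_m  = dict(mean=45.0, sd=12.0)      # channel widths, metres
thick_m  = dict(mean=2.5,  sd=0.8)       # channel thicknesses, metres
target_p = 0.30                          # target channel proportion

# Simple 2D grid (1 m cells) standing in for a cross-section of the reservoir grid.
nx, nz = 500, 80
grid = np.zeros((nz, nx), dtype=int)

def draw_dim(spec):
    """Draw a positive dimension (metres) from a normal parameterised by the outcrop stats."""
    return max(1.0, rng.normal(spec["mean"], spec["sd"]))

# Boolean-style object placement: keep adding channel rectangles whose sizes honour
# the outcrop distributions until the simulated proportion reaches the target.
while grid.mean() < target_p:
    w = int(draw_dim(width_m))
    t = int(draw_dim(thick_m))
    x0 = rng.integers(0, nx)
    z0 = rng.integers(0, nz)
    grid[z0:z0 + t, x0:x0 + w] = 1

print(f"simulated channel proportion: {grid.mean():.2f} (target {target_p:.2f})")
```

A production workflow would additionally condition object placement to the well facies picks and honour the interaction and preservation rules observed on the outcrop; those steps are the main added value of the concept-based algorithms discussed in the paper.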

The final critical step consists of discussing the strengths, limitations and uncertainties associated with these alternative methods. Indeed, the back-to-outcrop process acts as an absolute quality control procedure, highlighting where algorithms or methods are insufficiently constrained to capture the depositional model. The observations extracted during this process allow continuous improvement, with the final objective of drastically improving the robustness of the geological model and its associated forecasts.
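A minimal sketch of such a quality control loop is given below: summary statistics of a simulated model are compared against outcrop-derived targets, and any statistic that diverges beyond a tolerance is flagged for the next modeling iteration. The statistic names, values and tolerance are purely illustrative and are not taken from the case study.

```python
def back_to_outcrop_qc(simulated, outcrop_targets, tol=0.05):
    """
    Minimal quality-control check: compare summary statistics of a simulated facies
    model against outcrop-derived targets and flag any that diverge beyond `tol`
    (relative deviation). Both arguments are plain dicts of statistic name -> value.
    """
    report = {}
    for name, target in outcrop_targets.items():
        value = simulated.get(name)
        if value is None:
            report[name] = "missing from simulation"
            continue
        deviation = abs(value - target) / abs(target)
        report[name] = "OK" if deviation <= tol else f"FLAG (off by {deviation:.0%})"
    return report

# Hypothetical numbers for illustration only.
outcrop_targets = {"channel_proportion": 0.30, "mean_width_m": 45.0, "mean_thickness_m": 2.5}
simulated_stats = {"channel_proportion": 0.27, "mean_width_m": 62.0, "mean_thickness_m": 2.4}

for stat, verdict in back_to_outcrop_qc(simulated_stats, outcrop_targets, tol=0.10).items():
    print(f"{stat:20s}: {verdict}")
```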
