Abstract
In the field of geosciences for oil and gas, a disconcerting question is rarely tackled: can we correctly model and predict what has not been properly understood? The answer seems natural and obvious. The capability of a geological modeling workflow to correctly predict reservoir petrophysical properties in 3D space therefore requires that the geological processes driving the present-day organization of these properties be correctly identified and characterized before the modeling phase.
In the case of carbonate reservoirs, whether marine or lacustrine, the difficulty of prediction is largely linked to the multiplicity of processes that generated the reservoir heterogeneity. Reliably distributing the final properties resulting from a geological history shaped both by different organizational logics and by dependence on successive inheritances is an immense challenge with major technical and economic consequences. Claiming to overcome it, as is often done in industrial studies, despite a lack of concepts, techniques, tools, or time, may reflect a lack of lucidity. Yet reservoir geologists are sometimes constrained by a "good enough" approach, in which an incomplete understanding and modeling effort is weighed against the potential loss of revenue caused by miscalculated reserves.
Nevertheless, reservoir modeling remains a crucial step in the reservoir appraisal process, especially when limited dynamic data are available. For several decades, this modeling has relied on geostatistical techniques whose strength lies more in their ability to honor the available hard data than in the quality and reliability of extrapolation beyond the drilled zones. These techniques are known not to allow validation of the geological hypotheses underlying the stochastic simulations, and multi-scenario approaches are usually lacking in the evaluation of reserves and resources attached to discoveries.