Abstract
As reservoir flow simulation increasingly informs existing operations and field development decisions, rigor, fitness and consistency should be imposed on the calibration of reservoir flow models to dynamic data through history matching. Although a wealth of history matching techniques exists in the petroleum literature, proposing novel algorithms or sharing case studies, this literature seldom guides the modeler in fit-for-purpose reservoir model calibration for an operating asset. To evaluate the applicability of these diverse techniques against the standards required for reservoir management, an internal study was performed to benchmark four assisted history matching (AHM) techniques commonly promoted in the oil and gas industry. The techniques were vetted against a comprehensive suite of modeling requirements for multiple asset classes, integrating a variety of historical dynamic data types through the calibration of reservoir properties that control flow behavior from the field scale to the inter-well scale. The methods benchmarked were: (1) Design of Experiments (DoE)-based, (2) Ensemble Kalman Filter and Ensemble Smoother, (3) Genetic Algorithm and (4) Generalized Travel Time Inversion. This manuscript focuses solely on the DoE-based technique.
To benchmark the techniques consistently, a set of standards was defined against which each was evaluated to determine its suitability for widespread history matching applications. The standards comprise: the capacity to parameterize (and therefore calibrate) a diversity of reservoir flow model attributes, the capacity to integrate different types of dynamic data, the level of independence from the flow simulator and the capability to provide probabilistic outcomes for predictive uncertainty assessment. Of the four techniques, the DoE-based approach uniquely satisfied all requirements. Its history matching workflow has the flexibility to incorporate any form of reservoir model parameter and to assimilate a history matching error metric for any individual or group of historical data types; the benchmarking therefore established the DoE-based technique as unambiguously the most compliant with generic asset modeling requirements. The approach was also identified as the most straightforward, both theoretically and in practical computation, and therefore applicable to the broadest range of practitioners. Perhaps most importantly, the approach demonstrated the capacity for accurate quantification of uncertainty (or non-uniqueness) in reservoir quality resulting from an exhaustive, although approximate, exploration of model parameter space and the associated history matching error metric(s).
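For concreteness, one common form of such a history matching error metric, offered here as an illustrative assumption rather than the specific metric used in the benchmarking study, is a weighted sum of squared mismatches over data types, wells and report times:

$$
E(\mathbf{m}) \;=\; \sum_{d} W_{d} \sum_{w} \sum_{t}
\left( \frac{y_{d,w,t}^{\mathrm{sim}}(\mathbf{m}) - y_{d,w,t}^{\mathrm{obs}}}{\sigma_{d,w,t}} \right)^{2},
$$

where $\mathbf{m}$ is the vector of calibration parameters, $W_{d}$ weights data type $d$ (e.g., rates, pressures, water cut), $y^{\mathrm{sim}}$ and $y^{\mathrm{obs}}$ are the simulated and observed values for well $w$ at time $t$, and $\sigma_{d,w,t}$ is the corresponding measurement tolerance.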
This manuscript compiles the results and insights gained from benchmarking the DoE-based technique into a comprehensive assisted history matching workflow. The workflow is designed for generality while providing best practices that guide the modeler in fit-for-purpose application. Limitations of the workflow are also recognized. Key components include: selection and screening of calibration parameters based on statistical significance, development of surrogate models to characterize the relationship between the parameters and the simulated historical data being integrated into the reservoir model, use of the surrogate models for exhaustive yet efficient exploration of parameter space to identify (non-unique) history matched models, and (deterministic or probabilistic) selection of discrete reservoir models for use in forecast-based decision making. Each step of the AHM workflow is presented from a conceptual and applied perspective, and field applications are provided to demonstrate key concepts. Although the applications presented include two deepwater Gulf of Mexico assets, the workflow and the insights provided were developed from benchmarking across a diverse suite of asset types.
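To make these workflow steps concrete, the following is a minimal Python sketch of a DoE-based loop of the kind described above: a space-filling experimental design over a few calibration parameters, a quadratic proxy (surrogate) fit to the simulated misfit, Monte Carlo exploration of the proxy, and acceptance of non-unique history matched models below a misfit threshold. The parameter names, ranges and the `run_simulator` stand-in are illustrative assumptions only; they are not the authors' implementation or the field applications discussed in the paper.

```python
# Minimal sketch of a DoE-based assisted history matching loop.
# Hypothetical parameters and a mocked simulator; for illustration only.
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)

# 1. Calibration parameters and ranges (hypothetical): kv/kh multiplier,
#    fault transmissibility multiplier, aquifer strength multiplier.
lower = np.array([0.01, 0.0, 0.5])
upper = np.array([1.00, 1.0, 5.0])

def run_simulator(x):
    """Placeholder for a reservoir flow simulation returning a history
    matching misfit (e.g., a weighted error over rates and pressures)."""
    return ((x[0] - 0.3) ** 2 + 2.0 * (x[1] - 0.6) ** 2
            + 0.5 * (x[2] - 2.0) ** 2 + 0.05 * rng.normal())

# 2. Space-filling experimental design (Latin hypercube) over parameter space.
design = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(n=50), lower, upper)
misfits = np.array([run_simulator(x) for x in design])

# 3. Quadratic polynomial surrogate (proxy) fit by least squares.
def quad_features(X):
    cols = [np.ones(len(X))]
    for i in range(X.shape[1]):
        cols.append(X[:, i])
        for j in range(i, X.shape[1]):
            cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(quad_features(design), misfits, rcond=None)

# 4. Exhaustive yet efficient exploration of parameter space on the surrogate.
samples = rng.uniform(lower, upper, size=(200_000, 3))
pred = quad_features(samples) @ coef

# 5. Accept non-unique history matched models below a misfit threshold
#    (here, the 1st percentile of the proxy prediction); discrete models
#    for forecasting could then be drawn from this accepted set.
accepted = samples[pred < np.quantile(pred, 0.01)]
print(f"{len(accepted)} candidate history matched models")
```

In practice the proxy would be validated against additional simulation runs before being trusted for parameter space exploration, and the accepted set would be ranked or clustered to select deterministic or probabilistic (e.g., percentile) models for forecasting.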