Abstract
Increased complexity in reservoir models, advanced understanding of fluid flow physics, and improved computational capabilities continuously push E&P companies to improve their reservoir simulation tools and associated workflows. This paper describes a robust framework for handling two critical steps in the development of integrated simulation workflows: simulator testing and the release process to end users companywide. The selection of benchmark models, coverage of functionalities, and stress testing of the simulator are also discussed.
Ideally, simulation results should not change with new versions of the simulation software, but this is rarely the case. With the addition of new features, bug fixes, and updates, some existing functionalities can behave differently: sometimes an improvement over previous results, and at other times an unintentional side effect leading to errors. The simulator testing process starts with automated regression tests covering a wide range of features and functionalities, evaluated against comprehensive benchmark criteria and success/failure metrics. Regression tests are followed by different levels of investigation to identify and resolve potential problems within each individual software component, the integrated workflows, or the simulation model itself. A web-based continuous integration tool, which collects code changes and creates new builds, combined with test automation, reduced the turnaround time for identifying issues introduced by code changes. It also streamlined and prioritized the scheduling and investigation of regression tests based on the criticality of features under development, and enabled focused testing of user-preferred workflows.
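As a minimal illustration of such a benchmark comparison, the sketch below checks a new build's results against stored reference profiles using a maximum-relative-deviation metric. The case names, data, and 2% tolerance are hypothetical and chosen for illustration; they are not the paper's actual criteria.

```python
# Sketch of an automated regression check, assuming simulation results
# are available as time series of field quantities (e.g., oil rate)
# keyed by benchmark case name. Names, data, and tolerance below are
# hypothetical illustrations.

def max_relative_deviation(benchmark, candidate):
    # Largest pointwise relative deviation of the new build's results
    # from the stored benchmark profile.
    return max(
        abs(c - b) / max(abs(b), 1e-12)
        for b, c in zip(benchmark, candidate)
    )

def regression_check(cases, tolerance=0.02):
    """Classify each case as pass or fail for a new simulator build."""
    passed, failed = [], []
    for name, (benchmark, candidate) in cases.items():
        dev = max_relative_deviation(benchmark, candidate)
        (passed if dev <= tolerance else failed).append((name, dev))
    return passed, failed

# Toy example: two benchmark models with field oil-rate profiles.
cases = {
    "FIELD_A_history_match": ([1000.0, 950.0, 900.0], [1001.0, 949.0, 902.0]),
    "FIELD_B_waterflood":    ([500.0, 520.0, 540.0], [500.0, 560.0, 540.0]),
}
passed, failed = regression_check(cases)
print("PASS:", [name for name, _ in passed])
print("FAIL:", [(name, round(dev, 3)) for name, dev in failed])
```

In a continuous integration setting, a check of this kind would run automatically on each new build, with failed cases escalated to the deeper levels of investigation described above.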
Due to increasing operational requirements, growing reservoir complexity, and the need to share models with partners, several types of simulation workflows are in use within the company, all of which require proper testing before each new release. Previously, the sheer volume and monotony of the regression test process caused frequent human errors; automation has not only reduced those errors but also freed up quality time for in-depth analysis of the issues shortlisted by the automated process.
The paper details the testing and release framework using several real field simulation models, where the automated process resulted in timely resolution of issues, a seamless transition to newer versions of the software and workflows, and reliable results from the simulation tools. The framework also facilitated collaboration and coordination among cross-functional teams and provided updates on release status and software changes to ensure a smooth and successful release.