Based on historical production data, decline curve analysis (DCA) can be used to monitor production, identify potential problems, and predict well performance, life, and economics. Accurate production history matching is crucial for economic analysis of future operations and for decision making. Widely used decline models, such as the Arps rate-time relations and their variations, fit predefined equations and often fail for shale gas and oil wells, because most of the production data from these wells exhibit fracture-dominated flow regimes and rarely reach late-time flow regimes. Applying such models can therefore misrepresent the trend of the decline curves and produce poor matches and unreliable production forecasts. A suitable data-driven model that incorporates physical or operating parameters can be greatly beneficial and serve as the basis for decline analysis and prediction.
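To make the baseline concrete, the Arps rate-time relations referenced above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the parameter values (`qi`, `di`, `b`) are hypothetical and chosen only to show the shapes of the exponential and hyperbolic declines.

```python
import numpy as np

def arps_rate(t, qi, di, b):
    """Arps rate-time relation: hyperbolic for 0 < b < 1,
    exponential in the limit b -> 0 (harmonic at b = 1)."""
    t = np.asarray(t, dtype=float)
    if b == 0.0:
        return qi * np.exp(-di * t)               # exponential decline
    return qi / (1.0 + b * di * t) ** (1.0 / b)   # hyperbolic decline

# Hypothetical parameters: initial rate qi, initial decline di (1/day)
t = np.arange(0, 365, 30)
q_exp = arps_rate(t, qi=1000.0, di=0.005, b=0.0)
q_hyp = arps_rate(t, qi=1000.0, di=0.005, b=0.8)
```

Note that the hyperbolic curve declines more slowly than the exponential one at late time, which is one reason fitting a single predefined Arps form to fracture-dominated shale data can bias the long-term forecast.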

This paper presents a method for automatic history matching and decline analysis of shale production data based on machine learning, which can be applied effectively to production surveillance and process automation. The approach is based on time-series (TS) analysis and neural networks (NNs) and is extended to cases where operational parameters, such as bottom-hole pressure (BHP), are available. The proposed TS and NN models were applied to production data from a Barnett gas well. The historical production data were divided into two parts: the first was used to train the NN model, and the second was used to verify the accuracy of the predictions for each input parameter. The results were analyzed and compared with classic Arps decline models.
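The train-then-verify workflow described above can be sketched as a chronological split of the rate history into a training segment and a verification segment, with lagged rates as NN inputs. This is an assumed framing for illustration only: the paper does not specify the network architecture, the lag window, or the split ratio, and the rate series here is synthetic (generated from a hyperbolic decline plus noise) rather than the Barnett well data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic daily gas rates standing in for historical well data (hypothetical)
t = np.arange(730.0)
rate = 1000.0 / (1.0 + 0.8 * 0.005 * t) ** (1.0 / 0.8)
rate = rate + rng.normal(0.0, 5.0, size=rate.size)  # measurement noise

# Frame as supervised learning: predict the next rate from k lagged rates
k = 7
X = np.column_stack([rate[i : i + len(rate) - k] for i in range(k)])
y = rate[k:]

# Chronological split: the first part trains the NN,
# the second part verifies the prediction accuracy
split = int(0.7 * len(y))
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train / 1000.0, y_train / 1000.0)   # crude scaling for stability
pred = model.predict(X_test / 1000.0) * 1000.0
```

Keeping the split chronological (rather than random) matters here: shuffling would leak future decline behavior into training and overstate forecast accuracy.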
