Corrosion of a well is the degradation of the well structure or its components due to internal or external factors acting on the well. Corrosion monitoring is a key element in ensuring a healthy well life, especially downhole (DH) tubular integrity, and wells are therefore corrosion logged to verify casing integrity. This paper demonstrates a novel approach that utilizes machine learning techniques to prioritize wells for corrosion logging, ensuring that critical wells are attended to in a timely manner and avoiding potential well integrity hazards.

The proposed method involves a multi-step process incorporating data preprocessing, feature engineering, and ensemble modeling. A custom-built feature extraction module automatically generates relevant predictors from raw data inputs. Hyperparameter tuning optimizes the model configuration through Bayesian optimization and grid search, while L1/L2 penalties and dropout regularization combat overfitting.
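The paper does not publish its implementation, but the tuning step described above can be sketched with standard tools. The following is a minimal, hypothetical illustration using scikit-learn: a soft-voting ensemble of an L1/L2-penalized logistic regression and a random forest, with grid search over the penalty type and ensemble size. The synthetic data and all parameter grids are stand-ins, not the authors' actual features or settings.

```python
# Hypothetical sketch: ensemble model with grid-search hyperparameter
# tuning and L1/L2 regularization, trained on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for engineered well features (well age, fluid
# chemistry, completion details, etc.)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Soft-voting ensemble: penalized linear model + tree ensemble
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(solver="saga", max_iter=5000)),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    voting="soft",
)

# Grid search over penalty type/strength and forest size
grid = GridSearchCV(
    ensemble,
    param_grid={
        "lr__penalty": ["l1", "l2"],   # L1 vs. L2 regularization
        "lr__C": [0.1, 1.0],           # inverse penalty strength
        "rf__n_estimators": [100, 200],
    },
    cv=3,
)
grid.fit(X_train, y_train)
print(grid.best_params_, round(grid.score(X_test, y_test), 3))
```

In practice, Bayesian optimization (e.g. via Optuna or scikit-optimize) would replace or complement the exhaustive grid when the search space grows, which matches the mixed tuning strategy the paper describes.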

This methodology has been piloted and used to generate a corrosion logging priority rank for all wells in the area, using a dataset of historical logs comprising millions of corrosion data points. Evaluation results demonstrate strong performance, with average accuracies of 85.2% on internal validation and 82.5% on external testing. Sensitivity analyses highlight the relative contributions of individual predictor variables, allowing for targeted interventions and optimized resource allocation.
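One common way to carry out the kind of sensitivity analysis described above is permutation importance: shuffle each predictor in turn and measure the resulting drop in model accuracy. The sketch below is a generic, hypothetical illustration on synthetic data, not the authors' analysis.

```python
# Hypothetical sketch: permutation-based sensitivity analysis ranking
# predictor contributions, on synthetic stand-in data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=6, n_informative=3,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

model = GradientBoostingClassifier(random_state=1).fit(X_tr, y_tr)

# Shuffle each feature in turn and record the mean accuracy drop
result = permutation_importance(model, X_te, y_te, n_repeats=10,
                                random_state=1)
for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature_{i}: {result.importances_mean[i]:.3f}")
```

Predictors with the largest accuracy drop are the ones driving the priority ranking, which is what makes targeted interventions possible.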

This paper presents a new approach to improving well integrity assessment and optimizing resources efficiently: data-driven models predict corrosion logs and prioritize wells for corrosion logging and metal-loss anomaly monitoring to safeguard casing integrity. These models not only capture historical trends of changing corrosion inside the wellbore, but also account for spatially distributed environmental trends that are not routinely measured.
