Narrow annuli are frequently encountered in drilling operations such as Casing while Drilling and Liner Drilling. The hydraulics of narrow annuli is a relatively new research topic in drilling, and current analytical solutions have limited applicability to complex flow regimes affected by casing motion, pipe rotation, eccentricity, and cuttings. The objective of this research is therefore to develop data-driven statistical learning models that can effectively predict pressure loss for given operating conditions. The data for the proposed supervised learning were obtained from large-scale experiments conducted on a narrow-annulus wellbore configuration in the LPAT (Low Pressure Ambient Temperature) flow loop at the Tulsa University Drilling Research Projects (TUDRP) group. Exploratory visualizations were used to determine the relationships between operational parameters and pressure drop. Resampling methods, such as cross-validation and bootstrapping, were used to split the dataset into training and test data. Shrinkage and decomposition techniques were applied to make the multivariate regression more robust. Different algorithms were compared to determine the best model in terms of lowest mean squared error (MSE) on test-data predictions and interpretability.

Multivariate exploratory plots were used for data inference. Relationships between the different factors and annular pressure drop were mostly linear. As expected, pressure drop increased with increasing flow rate, inclination angle, and ROP, and for non-Newtonian polymeric fluids. Principal Component Analysis (PCA) was performed to reduce the dimensionality of the dataset. Approximately 98% of the variance in the data was explained by five principal components, and the resulting model produced an MSE of less than 1% of the median pressure drop. Although the PCA regression model performed well on the test data, the final model was more difficult to interpret because it performs no feature selection and produces no coefficient estimates for the original features. Therefore, Partial Least Squares (PLS) regression was used, which offers better interpretability because its components are supervised by the feature-outcome relationship.

Shrinkage methods, Lasso and Ridge regression, were also used. These methods add a penalty term to least-squares regression to trade a small increase in bias for a reduction in variance. Cross-validation was used to select the penalty term that gave the lowest MSE. Both methods produced competitive MSEs and performed better than PCA and PLS regression. In conclusion, Lasso regression performed best, with the lowest error and good interpretability.
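
To illustrate the dimension-reduction step of the workflow described above, the following is a minimal sketch in Python with scikit-learn. The data file, feature names, train/test split fraction, and the choice of five components are assumptions for illustration only and are not taken from the study's experimental matrix.

```python
# Illustrative sketch of the PCR / PLS comparison; file and column names are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

# Hypothetical dataset of flow-loop measurements.
df = pd.read_csv("annulus_flow_loop.csv")
X = df[["flow_rate", "rpm", "rop", "inclination", "eccentricity", "fluid_yield_point"]]
y = df["annular_pressure_drop"]

# Hold out a test set; model selection is driven by performance on held-out data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Principal Component Regression: standardize, project onto 5 components, then fit OLS.
pcr = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
pcr.fit(X_train, y_train)
pcr_mse = mean_squared_error(y_test, pcr.predict(X_test))

# Partial Least Squares: components are chosen using the response, not variance alone.
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
pls_mse = mean_squared_error(y_test, pls.predict(X_test))

print(f"PCR test MSE: {pcr_mse:.3f}   PLS test MSE: {pls_mse:.3f}")
```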
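
The shrinkage step can be sketched in the same way. The snippet below reuses the hypothetical training and test sets from the previous sketch; the candidate penalty range and the 10-fold cross-validation setting are assumptions, not values reported in the study.

```python
# Illustrative sketch: the penalty weight (alpha) for Lasso and Ridge is selected
# by cross-validation, and the fitted models are compared on held-out test MSE.
import numpy as np
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

alphas = np.logspace(-4, 2, 50)  # candidate penalty terms (illustrative range)

lasso = make_pipeline(StandardScaler(), LassoCV(alphas=alphas, cv=10))
ridge = make_pipeline(StandardScaler(), RidgeCV(alphas=alphas, cv=10))

for name, model in [("Lasso", lasso), ("Ridge", ridge)]:
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name} test MSE: {mse:.3f}")

# Lasso can shrink some coefficients exactly to zero, which is what gives it
# better interpretability than PCR or PLS.
lasso_coef = lasso.named_steps["lassocv"].coef_
print("Non-zero Lasso coefficients:", int(np.count_nonzero(lasso_coef)))
```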
