Harvesting vast amounts of data has long been identified as an enabler of operational performance. The measurement of key performance indicators is a routine practice in well construction, but systematically analyzing performance statistically against a large data bank of offset wells is not common, and performing that statistical analysis in real time is rarer still. With the introduction of edge computing, in which devices capable of complex analytical functions operate in physical proximity to sensors and operations, this practice can be realized. Two case studies are presented: rate of penetration (ROP) and amount of vibration per run.

Hypothesis testing is a statistical method in which a sampled dataset is compared against an idealized or status quo model. This model is built from many samples drawn from a population, and the characteristics of the population are inferred from those samples. The models are built in data centers, where large amounts of data are available, and are then transferred to an edge device in the field. The device collects real-time data and compares the results to the status quo model. In the two cases presented, hypothesis testing was used to determine maximum and minimum levels of ROP and downhole vibration. This information is used to gauge the effectiveness of new drilling practices, technologies, or methodologies. Because the calculations are performed in real time, changes to drilling practices can be adopted quickly.
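The workflow above can be sketched as a one-sample z-test: a population model (mean and standard deviation) is computed offline from offset-well data, and the edge device tests whether a run's real-time summary falls outside that status quo model. This is a minimal illustration, not the paper's implementation; the function name, threshold, and numeric values are all assumptions.

```python
import math

def z_test(sample_mean, n, pop_mean, pop_std):
    """One-sample z-test of a run's mean against the offset-well model.

    Returns the z statistic and a two-sided p-value computed from the
    standard normal CDF (via the error function)."""
    z = (sample_mean - pop_mean) / (pop_std / math.sqrt(n))
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Status quo model built offline from offset-well data (illustrative values).
POP_MEAN_ROP, POP_STD_ROP = 55.0, 12.0   # ft/hr

# Real-time run summary collected on the edge device (illustrative values).
run_mean_rop, n_samples = 63.0, 40

z, p = z_test(run_mean_rop, n_samples, POP_MEAN_ROP, POP_STD_ROP)
if p < 0.05:
    print(f"ROP differs from the status quo model (z={z:.2f}, p={p:.4f})")
else:
    print(f"ROP is within the population norm (z={z:.2f}, p={p:.4f})")
```

A run whose mean falls inside the model's normal range would yield a large p-value, matching the paper's finding that some supposedly poor (or improved) wells were in fact within the population norm.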

The ROP case was performed at a US operating unit; the vibration case was performed in a Middle East unit. The models showed what the improvement targets should be. It was revealing to find that wells thought to be poor performers were actually well within the population norm. Conversely, wells thought to be good performers because new drilling methods were used actually fell within the population model, suggesting that the new methods had not affected performance. By performing this analysis on the edge device, operations can make changes early enough that results fall outside the status quo model and deliver real performance improvements.

The paper presents the novel use of statistical models calculated in data centers in conjunction with real-time operations. Similar approaches exist in technical and physics modeling, in which models are produced in the office and used in the field. However, building models for operations management from a large bank of offset data and performing the analysis in the field with real-time data is not a common practice. This paper shows both the technology and the statistical methods that provide a valid scientific framework for operational performance improvement.
