Data quality has been a persistent problem in drilling data for decades. The development of data transfer standards has, to some extent, helped achieve better data quality and data transport. In the early stages of WITSML, poor data quality was a concern, and in this paper we look at the various steps that have been taken to improve it.

Sensor technology has improved considerably in recent years, with fieldbus options that allow remote calibration and diagnostics. In addition, calibration routines have been streamlined, and range checks can be implemented at the point of acquisition. The data acquisition software now has built-in quality control that addresses errors in manual data input. We have also developed software at the rig site that performs several data quality checks in the database.
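A range check at the point of acquisition can be sketched as follows. The sensor names and physical ranges here are illustrative assumptions, not values from any specific acquisition system:

```python
# Illustrative sensor ranges; real systems would load these from
# calibration/configuration data, and the names are hypothetical.
SENSOR_RANGES = {
    "hookload_kN": (0.0, 5000.0),
    "standpipe_pressure_kPa": (0.0, 70000.0),
    "block_position_m": (-5.0, 60.0),
}

def range_check(sensor: str, value: float) -> bool:
    """Return True if the reading falls within the configured
    physical range for the given sensor."""
    lo, hi = SENSOR_RANGES[sensor]
    return lo <= value <= hi
```

Readings that fail the check can be flagged or rejected before they ever enter the database, which is cheaper than cleaning them up downstream.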

After acquisition, the data is converted and transferred to a centrally hosted WITSML 1.4.1.1 server. Here, several applications perform quality assurance on the data, for example checking for data gaps. In addition, the data flow is monitored 24/7 from an operations center before the data is consumed by several applications.
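One of the server-side checks mentioned above is detecting data gaps. A minimal sketch, assuming time-indexed log data and an illustrative maximum sampling interval, might look like this:

```python
from datetime import datetime, timedelta

def find_gaps(timestamps, max_interval=timedelta(seconds=10)):
    """Return (start, end) pairs where consecutive samples are further
    apart than max_interval, i.e. candidate data gaps.

    `timestamps` is assumed to be a sorted list of datetimes; the
    10-second default is an assumption, not a WITSML requirement."""
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > max_interval:
            gaps.append((prev, curr))
    return gaps
```

Flagged intervals can then be escalated to the operations center for investigation or backfill.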

We have been working closely with one operator for several years to improve the processes around WITSML data deliveries. To ensure agreement on what data is to be delivered, this company has established electronic order forms that are sent to us for quality checking before the section starts. This operator has also developed a sophisticated data quality monitoring system that produces KPI scores linked to the SLA.
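As a simplified illustration of how such a KPI might be computed, consider a score based on the fraction of ordered curves actually delivered. The operator's real scoring system is more sophisticated; this function and its inputs are purely hypothetical:

```python
def delivery_kpi(expected_curves, received_curves):
    """Illustrative KPI: percentage of ordered curves that were
    delivered, as a 0-100 score. The real SLA-linked scoring
    combines many more factors (gaps, latency, sample rate, etc.)."""
    if not expected_curves:
        return 100.0
    received = set(received_curves)
    delivered = sum(1 for c in expected_curves if c in received)
    return 100.0 * delivered / len(expected_curves)
```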

We also present results from research on using statistics to uncover abnormal sensor responses in acquired data. Statistics show that data quality is improving even as the amount of data acquired from a single rig increases year by year.
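The paper does not specify the statistical method used; one simple approach to flagging abnormal sensor response, shown here as an assumed example, is a z-score outlier test against the series' own mean and standard deviation:

```python
from statistics import mean, stdev

def abnormal_indices(values, threshold=3.0):
    """Return the indices of samples whose z-score exceeds the
    threshold -- a basic statistical check for abnormal sensor
    response. The 3-sigma threshold is a common default, not a
    value taken from the paper."""
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]
```

In practice such checks are often computed over a sliding window so that the baseline tracks the current drilling conditions.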
