The amount of data collected in the information age has grown to a barely manageable volume. Currently available technologies are already capable of transmitting the readings of any sensor to locations worldwide at high frequency and with almost no delay. With this ever-increasing flow of data, the need for criteria to measure and evaluate data quality is more pressing than ever, as this data forms the basis for many critical business decisions. This paper addresses these problems and outlines the essential steps of a successful data and quality management strategy:

  • Quality control and improvement

  • Data quality benchmarking

  • Accessibility of controlled data

Simple but highly effective signal processing algorithms are presented to ensure that the data lies within the valid value range, that outliers are removed, and that missing values are substituted where possible. More complex control instances may not be correctable fully by automation; here a human expert is still required. To minimize the workload for engineers, a processing engine produces smart alarms whenever an automated data correction is not possible. If no correction is possible at all, the questionable portion of the data must be flagged as invalid, or even deleted, to prevent misuse and wrong conclusions. Because this decision depends on the application, good data management is essential to guarantee a minimum standard for each individual case.
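As an illustration of such a processing chain, the sketch below applies the steps described above to a one-dimensional sensor series: a value-range check, a median-based outlier test, interpolation over short gaps, and invalid flags for gaps too long to correct automatically. The function name, thresholds, and the choice of a median/MAD outlier criterion are assumptions made for illustration, not the implementation presented in the paper.

```python
import numpy as np


def quality_control(values, valid_min, valid_max, z_threshold=3.0, max_gap=5):
    """Clean a 1-D sensor series: range check, outlier removal, gap filling.

    Samples that cannot be corrected automatically are flagged invalid so a
    human expert can review them (e.g. triggered by a smart alarm).
    """
    data = np.asarray(values, dtype=float)
    invalid = np.zeros(data.size, dtype=bool)

    # 1. Range check: values outside the physically possible range are removed.
    out_of_range = (data < valid_min) | (data > valid_max)
    data[out_of_range] = np.nan

    # 2. Outlier removal: discard samples far from the median (robust z-score).
    median = np.nanmedian(data)
    mad = np.nanmedian(np.abs(data - median))
    if mad == 0:
        mad = 1.0
    outliers = np.abs(data - median) / (1.4826 * mad) > z_threshold
    data[outliers] = np.nan

    # 3. Gap filling: substitute missing values by linear interpolation.
    missing = np.isnan(data)
    idx = np.arange(data.size)
    if missing.any() and (~missing).any():
        data[missing] = np.interp(idx[missing], idx[~missing], data[~missing])

    # 4. Flagging: runs of missing samples longer than max_gap cannot be
    #    corrected reliably -> mark them invalid for expert review.
    run = 0
    for i, was_missing in enumerate(missing):
        run = run + 1 if was_missing else 0
        if run > max_gap:
            invalid[i - run + 1 : i + 1] = True

    return data, invalid
```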

The final step is to publish and present the data at the appropriate level of detail to the different parts of the corporation. The key requirement is that every individual has easy and fast access to the data needed for decision-making. In this context, the appropriate resolution, as well as the most efficient use of data processing time, is of critical interest. A unique solution for browsing large volumes of drilling data is presented.
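As one generic way to serve a long log at an appropriate resolution, the sketch below reduces a series to per-bucket minimum/maximum pairs so that millions of samples can be rendered at screen resolution without transferring the full data set. The function name and parameters are illustrative assumptions; this is a common downsampling technique, not the specific browsing solution presented in the paper.

```python
import numpy as np


def downsample_minmax(depth, values, n_buckets=1000):
    """Reduce a long drilling log (sorted by depth) to per-bucket (min, max) pairs.

    Keeping the minimum and maximum per bucket preserves peaks and troughs,
    so a coarse preview still shows the features an engineer would look for.
    """
    depth = np.asarray(depth, dtype=float)
    values = np.asarray(values, dtype=float)

    # Assign each sample to one of n_buckets equal-width depth intervals.
    edges = np.linspace(depth[0], depth[-1], n_buckets + 1)
    bucket = np.clip(np.searchsorted(edges, depth, side="right") - 1, 0, n_buckets - 1)

    summary = []
    for b in range(n_buckets):
        mask = bucket == b
        if mask.any():
            summary.append((edges[b], np.nanmin(values[mask]), np.nanmax(values[mask])))
    return summary  # list of (bucket_start_depth, min_value, max_value)
```

A viewer can request progressively finer bucket counts as the user zooms in, so only the detail currently visible has to be processed and transmitted.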
