The paper discusses how data quality influences workflows and decision-making in drilling and completions, and examines the use of semi-automated processes for quality assurance. With poor data, additional steps are required and workflows must be repeated. Controlled tests suggest that, even in relatively simple situations, small changes or omissions can have a significant influence on the efficiency or outcome of the work.

In earlier work, the quality of any data stream has been described in terms of identity, presence, measurement frequency, accuracy, continuity, units and associated metadata. For some of these, a degree of self-checking is possible by applying simple algorithms to the data stream to detect presence and bounds, with alarms to alert the operator if limits are breached. In other cases, such as the change in drag and torque with depth, the stream must be checked against a trend, called a pseudo-log, determined from the physics. These calculations are performed by "smart agents" operating directly in real time on the WITSML data feed from the rig. The paper describes the early work on developing smart agents to address data quality and the structure of the associated toolkit, which can be used to construct more complex agents from a wider selection of data sources, including system-generated ones. The computational resources required are also discussed.
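As a minimal sketch of what such an agent-style check might look like, the Python below applies presence, bounds, and pseudo-log trend tests to a stream of depth-indexed samples. The names (`Sample`, `check_stream`), the thresholds, and the linear pseudo-log are illustrative assumptions rather than the paper's actual toolkit; a real agent would derive these samples from the parsed WITSML feed.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

@dataclass
class Sample:
    depth_m: float  # measured depth of the sample (m)
    value: float    # e.g. hookload (kN) or torque (kN.m)

def check_stream(samples: Iterable[Sample],
                 lo: float, hi: float,
                 pseudo_log: Optional[Callable[[float], float]] = None,
                 tolerance: float = 0.15,
                 max_gap_m: float = 5.0) -> list[str]:
    """Return alarm messages for presence, bounds, and trend violations."""
    alarms: list[str] = []
    prev_depth: Optional[float] = None
    for s in samples:
        # Presence: flag gaps larger than the expected sampling interval.
        if prev_depth is not None and s.depth_m - prev_depth > max_gap_m:
            alarms.append(f"gap of {s.depth_m - prev_depth:.1f} m "
                          f"before {s.depth_m:.1f} m")
        prev_depth = s.depth_m
        # Bounds: flag values outside the physically plausible range.
        if not (lo <= s.value <= hi):
            alarms.append(f"value {s.value:.1f} out of [{lo}, {hi}] "
                          f"at {s.depth_m:.1f} m")
        # Trend: compare against the physics-based pseudo-log, if supplied.
        if pseudo_log is not None:
            expected = pseudo_log(s.depth_m)
            if expected and abs(s.value - expected) / abs(expected) > tolerance:
                alarms.append(f"value {s.value:.1f} deviates >{tolerance:.0%} "
                              f"from pseudo-log {expected:.1f} "
                              f"at {s.depth_m:.1f} m")
    return alarms

# Hypothetical torque stream checked against an assumed linear pseudo-log.
stream = [Sample(100.0, 10.2), Sample(105.0, 10.8), Sample(150.0, 25.0)]
print(check_stream(stream, lo=0.0, hi=50.0, pseudo_log=lambda d: 0.1 * d))
```

On this made-up stream, the last sample raises both a presence alarm (a 45 m gap) and a trend alarm (roughly 67% above the assumed linear pseudo-log), illustrating how one agent can raise several classes of alarm from a single stream.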

The increase in digital data, combined with a shortage of skilled personnel, makes manual assurance of all data streams neither practical nor cost-effective. Since current applications are not tolerant of errors and omissions, a step change in data quality is needed if more automated workflows are to be achieved. Greater assurance of the data at source and an improved understanding of the workflows will help.