The development of remote operational centers has improved data quality through an increased focus on data acquisition and the real-time use of data.

Data quality has different connotations for the various participants in the oilfield. For the technical staff delivering and receiving data, the main focus is data streaming and the continuous flow of data. Scientists focus on the accuracy of each data point. Most surface sensors output milliamp signals that are converted to engineering units through calibration. In reservoir measurements, the main reservoir properties, porosity and permeability, are not measured directly but are derived from other sources.
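
Because much of the data-quality discussion starts at the sensor, a minimal sketch of a linear 4-20 mA calibration is given below; the current range and the 0-500 bar span are assumed example values, not figures from the paper.

def calibrate(ma_reading, ma_lo=4.0, ma_hi=20.0, eng_lo=0.0, eng_hi=500.0):
    # Linearly map a raw 4-20 mA sensor signal to engineering units.
    # The 0-500 bar span is an assumed example; real calibration
    # constants come from the sensor datasheet.
    fraction = (ma_reading - ma_lo) / (ma_hi - ma_lo)
    return eng_lo + fraction * (eng_hi - eng_lo)

print(calibrate(12.0))  # a 12 mA signal sits mid-range -> 250.0 bar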

In automation processes, data streaming must be flawless. Surveillance of data-streaming quality and accuracy is performed through remote operations. In most areas, the network round-trip time is too long to stream data from the well to a remote location and then stream a command or solution back fast enough for full closed-loop control.
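
To make the closed-loop timing constraint concrete, a hedged sketch of the round-trip budget check follows; the latency figures and the 500 ms deadline are hypothetical values chosen only to illustrate why remote closed-loop control often fails the timing test.

def closed_loop_feasible(uplink_ms, processing_ms, downlink_ms, deadline_ms):
    # A remote closed loop is feasible only if the full round trip
    # (well -> remote center -> well) fits inside the control deadline.
    return (uplink_ms + processing_ms + downlink_ms) <= deadline_ms

# Hypothetical satellite link to an offshore well: 600 ms each way plus
# 100 ms of processing exceeds a 500 ms control deadline.
print(closed_loop_feasible(600, 100, 600, 500))  # False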

During acquisition, aggregation, distribution, and finally visualization, there is room for changes in the data points. These changes arise from uncertainty, stacking, filtering, unpacking, transmission, and any other data-handling process involved in data sharing.
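
As one concrete illustration of such a change, the short sketch below applies a simple moving-average filter; the window length and sample values are assumptions for the example, but they show how filtering alters the individual points that downstream consumers receive.

def moving_average(samples, window=3):
    # Smoothing reduces noise, but every output point now differs
    # from the raw measurement that produced it.
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

raw = [100.0, 101.0, 150.0, 102.0, 100.0]  # one noise spike
print(moving_average(raw))  # the 150.0 spike is smeared across its neighbors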

For the digital oil field, data are evaluated in two settings: in real time and after the event. The interpretation is performed either remotely or at the wellsite. There is room for improvement in all areas, depending on the objectives of the process:

• Automation in the operational phase

• Interpretation based on a model update

• Automated quality control (a minimal sketch follows this list)

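On the last point, a minimal sketch of what automated quality control on a data stream could look like is given below; the range limits and step threshold are assumed example values, and real limits would come from sensor and process specifications.

def quality_flags(samples, lo, hi, max_step):
    # Flag values that fall outside the valid range or jump more than
    # max_step from the previous sample (a simple spike test).
    flags = []
    for i, value in enumerate(samples):
        out_of_range = not (lo <= value <= hi)
        spike = i > 0 and abs(value - samples[i - 1]) > max_step
        flags.append("BAD" if out_of_range or spike else "OK")
    return flags

stream = [50.2, 50.4, 250.0, 50.5, -3.0]
print(quality_flags(stream, lo=0.0, hi=100.0, max_step=10.0))
# ['OK', 'OK', 'BAD', 'BAD', 'BAD']
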
This paper illustrates the differences and similarities between real-time operations and processes performed on the data afterward, and shows how combined local and remote operations enhance data quality. We follow up with suggestions for improvement in all areas on the path to the digital oil field.
