Big Data in the Digital Oilfield Requires Data Transfer Standards to Perform
- Philip Neri (Energistics Inc)
- Offshore Technology Conference, 30 April - 3 May 2018, Houston, Texas, USA
- Document Type: Conference Paper
- Copyright 2018, Offshore Technology Conference
- 4.5 Offshore Facilities and Subsea Systems, 1.10 Drilling Equipment, 1.6 Drilling Operations, 4.5.5 Installation Equipment and Techniques, 4 Facilities Design, Construction and Operation, 1.10.5 Offshore Drilling Units
- Metadata, Analytics, Standards, Transfer, Big Data
The digital oilfield is evolving: the number of sensors and measuring devices and the frequency of data sampling are both increasing dramatically, marking the onset of what is commonly referred to as Big Data. These increases are needed because the motivation for collecting and transmitting data has shifted from the remote observation of operations to the remote operation of increasingly automated systems, as well as the goal of reliable prediction tools that improve efficiency.
Much is written and discussed about how to handle Big Data in the context of data management and analytics. However, the task of transferring the digital information from the field to the back office is equally critical. It builds on advances in standards for data formats, metadata and transmission protocols that have been published to support the digital oilfield of the future. The first advance is a richer set of metadata, providing detailed information on the data source's characteristics, any transformations applied to the data (by whom, using what tools, etc.) and prior vetting. The second is replacing transmission protocols based on polling different servers with streaming processes that do away with inefficient interrogation, handshake and packaging overheads.
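The first advance, richer metadata, can be pictured as a provenance record attached to each transmitted channel. The sketch below is purely illustrative: all field names are hypothetical examples of the kinds of information described (source characteristics, transformations with who/what-tool attribution, vetting status), not the actual Energistics schema.

```python
# Hypothetical provenance metadata for one transmitted data channel.
# Field names are illustrative, not the Energistics standard's schema.
channel_metadata = {
    "mnemonic": "ROP",                     # channel identifier
    "unit": "m/h",                         # engineering unit
    "source": {                            # data source characteristics
        "sensor": "top-drive encoder",
        "sampling_rate_hz": 1.0,
    },
    "transformations": [                   # what was done, by whom, with what tool
        {
            "step": "despiking filter",
            "by": "service-provider edge application",
            "tool": "filter v2.1",
        },
    ],
    "vetted": True,                        # prior vetting status
}

# A consumer can check provenance before trusting the stream.
is_trusted = channel_metadata["vetted"] and channel_metadata["transformations"]
print(bool(is_trusted))  # True
```

Carrying this record alongside the measurements is what lets a back-office consumer judge data quality without contacting the rig.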
Implementing these improved standards allows operators to build solutions that involve different vendors. A test was put in place involving an active North Sea drilling operation, with data transferred to onshore observers both at the operator's facilities and on the premises of a service provider. Data streamed onto the screens of both observing parties less than one second after the event took place on the rig, compared with lag times of 15 to 20 seconds using polling methods. Bandwidth usage was roughly one-tenth that of previous methods, so more sensors could be added to the data transmittal.
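The polling-versus-streaming contrast behind these latency figures can be sketched in-process: a streaming subscription pushes each reading to the consumer the moment it is produced, instead of the consumer interrogating a server on a fixed interval. This is a minimal queue-based push model for illustration only, not the actual Energistics transfer protocol.

```python
import queue
import threading

def sensor_feed(channel, readings):
    """Producer: push each reading as soon as it occurs (no polling lag)."""
    for value in readings:
        channel.put(value)
    channel.put(None)  # sentinel marking end of stream

def subscriber(channel):
    """Consumer: receive readings the moment they arrive."""
    received = []
    while (value := channel.get()) is not None:
        received.append(value)
    return received

channel = queue.Queue()
readings = [2450.1, 2450.4, 2450.9]  # e.g. measured-depth samples
producer = threading.Thread(target=sensor_feed, args=(channel, readings))
producer.start()
result = subscriber(channel)
producer.join()
print(result)  # [2450.1, 2450.4, 2450.9]
```

Because nothing is re-requested or re-packaged per poll cycle, the per-reading overhead is only the message itself, which is the mechanism behind the bandwidth reduction the test observed.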
With more immediate access to a much broader and richer set of data, both at the operator's facilities and at the premises of many contractors, service providers in particular could make more educated assessments of developing situations on the rig. In several cases this avoided the need to send a team to the offshore rig, reducing the number of helicopter flights to the offshore installations and the people-on-board (POB) count, which contributed to lower HSE exposure and lower costs.
- File Size: 599 KB, Number of Pages: 6