Seismic data is among the earliest data acquired in a prospect evaluation, and it is utilized throughout the exploration and production stages of a prospect. With recent advances in the handling of big data, it is essential to re-evaluate best practices in the seismic data ecosystem. This paper presents an approach to leveraging advances in big data and cloud computing for the seismic data ecosystem, with the aim of providing an improved user experience.

This new seismic platform would be capable of handling, managing and delivering the full spectrum of seismic data varieties, from acquired field data to interpretation-ready processed data. The system should have the following capabilities:

  • Entitlement of the appropriate portion of data to each user according to their interest

  • Organization of seismic data by business unit

  • Data security, by sharing data only with legitimate users and groups

  • Direct or indirect integration with all data sources and applications that consume and/or generate data

  • Sharing of and collaboration on data within the company and/or across organizations, with shareholding partners, prospective seismic buyers (for trading and relinquishment), regulatory agencies, resource-certifying agencies, service providers, etc., even over limited network connectivity

  • Integration with, and data delivery to, the end-user applications where the seismic data will be utilized
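The entitlement and business-unit capabilities above can be sketched as a simple access-control check over a survey catalog. This is a minimal illustration, not the paper's implementation; the `SeismicSurvey`, `User` and `entitled_surveys` names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class SeismicSurvey:
    """Hypothetical catalog entry: a survey tagged with its business unit and entitled groups."""
    name: str
    business_unit: str
    allowed_groups: set = field(default_factory=set)

@dataclass
class User:
    name: str
    groups: set

def entitled_surveys(user: User, catalog: list) -> list:
    """Return only the surveys the user is entitled to, based on group membership."""
    return [s for s in catalog if user.groups & s.allowed_groups]

catalog = [
    SeismicSurvey("Block-7 3D", "North Sea BU", {"ns-interp", "ns-proc"}),
    SeismicSurvey("Delta 2D", "Gulf BU", {"gulf-interp"}),
]
alice = User("alice", {"ns-interp"})
print([s.name for s in entitled_surveys(alice, catalog)])  # ['Block-7 3D']
```

In a production platform this check would sit in front of every data-delivery path, so unauthorized users never see surveys outside their business unit.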

Implementation of the seismic ecosystem will enable:

  • Sharing of seismic data across the acquisition, quality control, data processing and interpretation user communities from one centralized store

  • Collaboration of stakeholders in real time over an encrypted network

  • Leveraging cloud and mobility technology for agility and interaction; the system will be connected and interactive while retaining the power of a complex high-performance computing infrastructure in the background

  • Data delivery and auditing for a wider and more diverse user community that consumes data from different platforms

  • Secure data access based on organizational business units, ensuring data does not fall into unauthorized hands

  • Reduction in seismic data turnaround time by reading and ingesting large volumes of data through parallel input/output operations

  • Improved data delivery and a map interface with contextual information drawn from the centralized data store

  • Augmentation of traditional workflows with machine learning and artificial intelligence, for example automated fault detection
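The parallel-ingestion point above can be illustrated by splitting a large seismic file into byte ranges and reading them concurrently. This is a minimal sketch assuming a generic binary volume; it does not parse SEG-Y headers, and the `parallel_ingest` function name is an assumption, not part of the paper's system.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def read_chunk(path: str, offset: int, size: int) -> bytes:
    """Read one byte range; each worker opens its own handle so reads proceed in parallel."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(size)

def parallel_ingest(path: str, workers: int = 4) -> bytes:
    """Split a large file into byte ranges and read them concurrently, preserving order."""
    total = os.path.getsize(path)
    chunk = -(-total // workers)  # ceiling division
    ranges = [(i * chunk, min(chunk, total - i * chunk))
              for i in range(workers) if i * chunk < total]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda r: read_chunk(path, *r), ranges)
    return b"".join(parts)
```

On a parallel file system or object store, the same byte-range pattern lets many workers ingest a multi-terabyte survey far faster than a single sequential reader.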

The proposed best practice aims to bring all of the disciplines working with seismic data to one centralized seismic data repository and enable them to consume and share seismic data from a big data lake. This approach is live and interactive compared to the traditional practice of archiving and restoring data in standalone applications.
