Abstract

In many jurisdictions, oil and gas extraction and the associated wastewater disposal have the potential to generate earthquakes, and operators must monitor seismicity to satisfy "traffic light system" protocols. However, seismic monitoring can yield many products that allow for a detailed understanding of the geological and geomechanical setting of the array, enabling proactive rather than reactive reservoir management. Densification of networks enables the application of different approaches that characterize the seismicity at higher resolution. Through a number of case studies, we show the connection between the density of the monitoring network and the quality of interpretive data products.

Induced-seismicity-monitoring networks can be broadly grouped into four categories: public networks, regional subscriber arrays, private local arrays, and microseismic arrays. The monitoring resolution of the network ultimately governs the utility of the resulting catalogs. Relating induced seismicity to the underlying process essentially requires upgrading the data products to a "next-generation" level. The data from the network need to facilitate a detailed understanding of critical geological structures and geomechanics of the study area. This goal is accomplished through both a densification of hardware and a higher order of event processing. High-precision locations delivered through relative relocation methodologies delineate slipping fault structures, often resolving previously unknown features. Moment tensor inversion processing also helps reveal the orientations of faults and pre-existing structures and provides information on stress in the region. The resolution of these data provides critical insight into understanding how a field is reacting and how to optimize the operation.

We illustrate the impact of network size on the resulting interpretations by presenting multiple case studies with different seismic monitoring resolutions, spanning regional-scale, local, and microseismic arrays related to oil and gas operations in North America. We aggregate the recorded data to obtain the highest-resolution catalog of earthquake hypocenters, delineating fault features. Moment tensors of the highest-quality events augment our understanding of the stress regime. The differences in the performance of each network and the accuracy of the data products reveal how the interpretation changes as additional data become available and data quality improves. In addition, we show how the outcome of the induced-seismicity forecasting model is significantly governed by the quality of the seismicity catalog.

Acknowledging that the primary goal of induced-seismicity networks is to assist operators in making operational decisions, providing additional interpretive data can help optimize the operation while also aiding in the development of effective risk-management strategies.
