Abstract
The geophysical world has been recording seismic waves for over a century, with the seismograph first used for oil and gas exploration in the early 1900s. We started shooting 3D seismic data half a century later, and since then both our acquisition methods and our ability to reconstruct waves propagating through the earth, aided by high-performance computing, have evolved tremendously. In particular, a major tranche of changes in seismic acquisition that began roughly a decade ago has pushed the envelope of what we can image. These recent developments include broader-bandwidth acquisitions, with particular emphasis on low frequencies for both land and marine surveys, and changes to sensors, sensor layouts and shooting patterns. These new acquisitions have refocused our emphasis on fundamentals in seismic processing, significantly advancing our ability to see the subsurface. Not surprisingly, some of the processing ideas formulated for these acquisitions are applicable beyond modern surveys. Combining these newer approaches with the pioneering ideas mastered by previous generations of geophysicists has allowed a fresh take on how large archives of legacy seismic data, sitting on shelves, can be improved to provide fresh insights for exploration. For the Middle East and North Africa region, where we often deal with ‘difficult’ seismic typically characterized by extremely high noise content, this re-look at older data has driven an evolution of workflows for vintage seismic data conditioning, yielding higher-quality datasets that increase confidence and reduce uncertainty for the plays, leads and prospects we pursue.
The data conditioning workflow involves a number of steps, mostly applied post-migration and often post-stack, and is applicable across a spectrum of data types from different sources and geological settings. Typically, the workflow is fine-tuned for each seismic dataset in a matter of days, or often ‘on-the-fly’, with implementation on Shell’s next-generation interpretation platform providing extremely rapid turnaround and, in many cases, a dramatic uplift in image quality. These results have demonstrably impacted decision-making in exploration and production, providing the ability, quite simply, to see with significantly more clarity what we could not see before.
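To make the idea of post-stack data conditioning concrete, the sketch below shows two generic noise-suppression steps of the kind often applied after migration: a zero-phase band-pass filter along the time axis followed by a lateral median filter across traces. This is a minimal illustration using standard scipy routines, not the specific workflow described in this paper; the filter order, passband, sample interval and kernel size are assumed values for demonstration only.

```python
# Illustrative post-stack conditioning sketch (assumed parameters,
# not the authors' workflow): band-pass plus lateral median filtering.
import numpy as np
from scipy.signal import butter, filtfilt, medfilt2d


def bandpass(section, low_hz, high_hz, dt):
    """Zero-phase Butterworth band-pass along the time axis.

    section : 2D array, shape (time samples, traces)
    dt      : sample interval in seconds
    """
    nyq = 0.5 / dt  # Nyquist frequency
    b, a = butter(4, [low_hz / nyq, high_hz / nyq], btype="band")
    # filtfilt applies the filter forward and backward -> zero phase
    return filtfilt(b, a, section, axis=0)


def condition(section, low_hz=5.0, high_hz=60.0, dt=0.004, kernel=(1, 5)):
    """Band-pass, then a lateral median filter to suppress
    trace-to-trace random noise while preserving flat events."""
    filtered = bandpass(section, low_hz, high_hz, dt)
    return medfilt2d(filtered, kernel_size=kernel)


# Synthetic demo: a flat "reflector" buried in random noise.
rng = np.random.default_rng(0)
nt, nx = 256, 64
data = 0.2 * rng.standard_normal((nt, nx))
data[128, :] += 1.0  # event at 512 ms (dt = 4 ms)
clean = condition(data)
print(clean.shape)  # output keeps the input geometry
```

In practice, each dataset would get its own passband and smoothing parameters, which is exactly the kind of per-survey fine-tuning the workflow description above refers to.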
We share examples of these workflows contributing to a number of projects, including an extensive joint Abu Dhabi National Oil Company (ADNOC) and Shell effort to rejuvenate the Exploration Portfolio of Abu Dhabi, working with a country-wide database of multi-vintage onshore and offshore datasets comprising approximately 2500 2D seismic lines and more than 50 3D seismic volumes.
Our workflows are grounded in past experience yet leverage the latest signal-processing innovations. They provide a step-change in our ability to rapidly investigate and interpret large volumes of challenging seismic data, and enable visualization of geological features indiscernible in the original seismic.