Recent evolutions in seismic data acquisition have led to ever-increasing improvements in resolution. Finer source and receiver sampling intervals, combined with a move away from receiver arrays, have enabled acquisition of fully sampled signal and noise fields. However, the potentially higher-fidelity data delivered by modern acquisition arrives masked by noise, demanding exceptionally careful processing and QC.
Unmigrated Qusawhira raw stacks show little or no coherent energy, indicating that noise content vastly outweighs signal. The Qusawhira processing flow employs successive algorithms to incrementally remove "layers" of noise generated by different mechanisms. At each stage, QC data were broken down into frequency bands to identify noise and signal signatures and to gauge the efficacy of the processes employed. Selective Pre-Stack Time Migrations (PSTM) were run to ensure the integrity of results. Applying the QC methodology outlined here revealed unexpected results in several cases.
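The frequency-band QC described above can be illustrated with a minimal sketch. This is not the survey's actual workflow: the traces are assumed to be single NumPy arrays, and the band edges shown are illustrative values, not the QC bands used on Qusawhira.

```python
import numpy as np

def split_into_bands(trace, dt, band_edges):
    """Split a trace into frequency bands for QC inspection.

    trace      : 1-D array of samples
    dt         : sample interval in seconds
    band_edges : list of (f_low, f_high) pairs in Hz (illustrative)

    Returns one band-limited copy of the trace per pair. If the pairs
    tile the spectrum without overlap, summing the bands reconstructs
    the original trace, so no energy is lost in the decomposition.
    """
    n = len(trace)
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    bands = []
    for f_lo, f_hi in band_edges:
        mask = (freqs >= f_lo) & (freqs < f_hi)
        bands.append(np.fft.irfft(spec * mask, n=n))
    return bands

# Example: low / mid / high bands for a 2 ms trace (Nyquist = 250 Hz)
rng = np.random.default_rng(0)
trace = rng.standard_normal(512)
low, mid, high = split_into_bands(trace, 0.002, [(0, 8), (8, 32), (32, 251)])
```

In practice a tapered filter would replace the boxcar mask to avoid ringing; the hard mask is kept here only so the complementary-sum property is exact.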
Acquisition utilizing Geophone Accelerometers (GACs) requires integration to velocity so that processes behave as expected in the velocity-processing world. Because the GAC amplitude response increases at 6 dB/octave relative to velocity, low frequencies are relatively suppressed. Correcting to velocity at a primary stage ensures that the low-frequency response can be thoroughly assessed. The acceleration-to-velocity correction on Qusawhira was considered a critical enabler for QC.
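The acceleration-to-velocity correction amounts to time integration, which in the frequency domain is division by \(i\omega\) and applies exactly the -6 dB/octave slope that compensates the accelerometer response. A minimal frequency-domain sketch (not the production implementation; DC handling and any pre-whitening are simplified assumptions):

```python
import numpy as np

def accel_to_velocity(trace, dt):
    """Integrate an acceleration trace to velocity in the frequency domain.

    Dividing the spectrum by i*omega applies a -6 dB/octave correction,
    compensating the accelerometer's +6 dB/octave amplitude response.
    The DC component is zeroed rather than divided (omega = 0 there).
    """
    n = len(trace)
    spec = np.fft.rfft(trace)
    omega = 2.0 * np.pi * np.fft.rfftfreq(n, d=dt)
    spec[1:] = spec[1:] / (1j * omega[1:])
    spec[0] = 0.0  # integration constant is indeterminate; discard DC
    return np.fft.irfft(spec, n=n)

# Example: integrating a(t) = 2*pi*f0*cos(2*pi*f0*t) should recover
# v(t) = sin(2*pi*f0*t) for a periodic 10 Hz test signal.
dt, n, f0 = 0.001, 1000, 10.0
t = np.arange(n) * dt
accel = 2.0 * np.pi * f0 * np.cos(2.0 * np.pi * f0 * t)
vel = accel_to_velocity(accel, dt)
```

A real flow would typically stabilise the division near DC (e.g. with a low-cut taper or a small regularisation term) rather than zero it outright; the hard zero is used here only to keep the sketch short.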
In this paper I critically review the initial seismic processing steps employed on the recently acquired Qusawhira "S4" survey and illustrate how seismic processing and QC must evolve with the changing nature of our data. Default processing parameters, including those hard-coded into software, should be reviewed to ensure they are appropriate for modern broadband data. QC migrations should be performed at an early stage to confirm that diffracted energy is not harmed.