Effective quality control of offshore survey operations has traditionally required the quality control representative to have broad technical knowledge across a range of disciplines, together with sound operational and commercial abilities.
During the last three years, however, increasing emphasis has been placed on new technical developments, many of which are based on the application of micro-computer technology. Specific examples are the use of on-board processing and the wider application of portable and laptop computer systems in a range of offshore operations.
In the short term, this has placed an increasing burden on the offshore representative, both in terms of workload and in the requirement to be computer literate and to have a basic knowledge of seismic processing. The benefits of these trends have already been significant and promise to grow as technology advances.
The paper will review current quality control procedures and comment on the impact of recent technological advances. Some ideas on likely development directions over the next three years will also be presented.
Shallow Gas determination, prior to drilling, is carried out using ‘Engineering Seismic’ survey methods.
Seismic acquisition data quality is fundamental to achieving this objective, since both the data processing methods and the accuracy of interpretation depend on the quality of the data obtained.
The recent application of workstation-based data analysis and interpretation has clearly demonstrated the influence of acquisition data quality on the ability to determine the risk of shallow gas with a high level of confidence.
The following pages summarise the five primary issues that influence acquisition data QC, suggest future trends, and consider their potential impact.
The primary issues covered in this paper are:
Seismic
Positioning
QC Data Analysis
Communications
Personnel
The often complex influences of Technical, Commercial and Operational constraints on the acquisition of high-quality data (Fig. 1) require careful management by the QC supervisor in order to achieve a successful seismic survey data set.
The following pages consider only the Technical aspects of QC and assume that no Commercial or Operational restrictions are imposed on the achievement of optimum data quality.
It is noted, however, that such restrictions are frequently responsible for significant compromises in data coverage and quality during routine rig-site surveys.
Traditional ‘Engineering Seismic’ surveys utilise a number of Interfaced and Computer-based systems (shown under ‘2’ and ‘3’ in Fig. 2) as the primary methods of monitoring and analysing the relevant data strings for each critical parameter of the survey.
These data strings usually provide, as output for QC, only line-by-line analyses in the form of separate listings of parameter values, with Mean/Average/S.D.-type histograms as line summaries.
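As a minimal sketch (not from the paper), the kind of per-line summary such a listing provides could be reproduced as follows; the cable-depth parameter, the readings and the bin width are all hypothetical:

```python
import statistics

def line_summary(values, bin_width=0.5):
    """Per-line QC summary: mean, standard deviation and a simple
    histogram of the kind printed in line-by-line parameter listings."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    hist = {}
    for v in values:
        lo = bin_width * (v // bin_width)  # lower edge of the bin
        hist[lo] = hist.get(lo, 0) + 1
    return mean, sd, hist

# Hypothetical cable-depth readings (metres) along one survey line
depths = [6.1, 6.3, 5.9, 6.0, 6.4, 6.2, 5.8, 6.1]
mean, sd, hist = line_summary(depths)
print(f"mean {mean:.2f} m, s.d. {sd:.2f} m")
for lo in sorted(hist):
    print(f"{lo:4.1f}-{lo + 0.5:4.1f} m: {'*' * hist[lo]}")
```

The limitation the paper notes is visible here: each line is summarised in isolation, with no cross-line or survey-wide view of the parameter.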
‘Exploration 3D’ seismic surveys have adopted, in addition to the above, Onboard Seismic Processing (‘1’) and special Interfaced Streamer/Source positioning (‘4’) systems as standard techniques, in order to measure data quality in terms of Stacked Data and Reflection Point Positions respectively.
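To illustrate why the streamer/source positioning feed matters (a sketch under the simplifying assumption of flat horizontal layering, not a description of the systems above), the reflection point for each source-receiver pair is conventionally taken at their midpoint, so both positions must be known to locate the subsurface coverage:

```python
def reflection_points(source_xy, receiver_xys):
    """Reflection (common-midpoint) positions for one shot, assuming
    flat horizontal layering: the midpoint of source and receiver."""
    sx, sy = source_xy
    return [((sx + rx) / 2.0, (sy + ry) / 2.0) for rx, ry in receiver_xys]

# Hypothetical shot: source at the origin, three streamer groups astern,
# with a small cross-line offset representing streamer feathering
groups = [(-100.0, 2.0), (-112.5, 2.3), (-125.0, 2.6)]
for x, y in reflection_points((0.0, 0.0), groups):
    print(f"reflection point at ({x:7.2f}, {y:4.2f}) m")
```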