Seismic Data Processing

The aim of seismic data processing is to create an accurate image of the subsurface by removing noise and repositioning seismic reflections from their recorded positions in time to their true subsurface locations in depth.

Geometry Merging

Geometry merging is the first step in seismic data processing. Three main geometry files, RPS, SPS and XPS, record how the data were acquired in the field.

Updating the headers of the raw seismic records with this field geometry is known as geometry merging.

SPS Files Loading

There are three main geometry files, known as SPS files:

  • RPS (Receiver Point Sequence): Contains information about the receivers in the geometry, including receiver line numbers, picket or station numbers, sensor code (e.g. G1 for geophone), receiver statics, receiver coordinates, and receiver elevations.
  • SPS (Shot Point Sequence): Contains details about the shot points, including shot line numbers, shot pickets, source code (e.g. E1 for explosives, V1 for vibroseis), shot hole depth, uphole time, shot statics, shot coordinates (x, y), shot surface elevations, shot record file number (FFID), Julian day and index information.
  • XPS (Cross Point Sequence): Describes the relationship between each source and its corresponding active receivers for that shot template in the acquisition geometry. The XPS file contains information on both shot and receiver lines, file numbers (FFID), shot pickets, and the active template of receiver stations (start picket to end picket).
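As a rough illustration of what loading one of these files involves, here is a minimal sketch of parsing a receiver record. It assumes a simplified whitespace-delimited layout invented for this example; real SPS files are fixed-column, so the field slicing must be adapted to the SPS revision actually used.

```python
# Hedged sketch: parse one simplified receiver (RPS-style) record.
# Real SPS files are fixed-column; this whitespace-delimited layout
# is an assumption made only for illustration.
from dataclasses import dataclass

@dataclass
class ReceiverPoint:
    line: int        # receiver line number
    station: int     # picket / station number
    code: str        # sensor code, e.g. "G1" for geophone
    static: float    # receiver static (ms)
    x: float         # easting
    y: float         # northing
    elevation: float

def parse_rps_line(text: str) -> ReceiverPoint:
    """Parse one simplified record: line station code static x y elev."""
    line, station, code, static, x, y, elev = text.split()
    return ReceiverPoint(int(line), int(station), code,
                         float(static), float(x), float(y), float(elev))

rec = parse_rps_line("1012 2045 G1 -4.0 512345.5 2298765.0 312.4")
```

A real loader would also validate the record type character and cross-check station ranges against the XPS templates.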

Raw Data Loading

Raw field records in SEG-Y or SEG-D format are loaded into the internal format of the processing software. A header QC at this stage shows only limited details in the raw gathers, such as channel set, sampling interval and total number of samples. The raw data do not contain geometry information for the shot template; you need to merge the SPS files with the raw data to update the seismic headers.
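The limited header QC mentioned above can be sketched for SEG-Y: a few key fields live in the 400-byte binary header that follows the 3200-byte EBCDIC header. The byte positions below follow SEG-Y rev1 (sample interval at file bytes 3217–3218, samples per trace at 3221–3222, format code at 3225–3226); production work would normally use a library such as segyio instead of hand-parsing.

```python
# Hedged sketch: pull a few QC fields from a SEG-Y binary file header
# (the 400-byte block after the 3200-byte EBCDIC header), big-endian int16.
import struct

def read_binary_header_qc(buf: bytes) -> dict:
    """buf is the 400-byte binary header."""
    dt_us, = struct.unpack_from(">h", buf, 16)  # sample interval, microseconds
    ns, = struct.unpack_from(">h", buf, 20)     # samples per data trace
    fmt, = struct.unpack_from(">h", buf, 24)    # 1 = IBM float, 5 = IEEE float
    return {"sample_interval_us": dt_us, "samples_per_trace": ns, "format": fmt}

# Synthetic header for demonstration: 2 ms sampling, 1501 samples, IEEE float.
hdr = bytearray(400)
struct.pack_into(">h", hdr, 16, 2000)
struct.pack_into(">h", hdr, 20, 1501)
struct.pack_into(">h", hdr, 24, 5)
qc = read_binary_header_qc(bytes(hdr))
```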

Shot Coordinate QC

Pick the first breaks, plot first-break time against offset, and propagate the picks to all shots. Deviations on the first-break plot indicate a shifted shot location, which may require coordinate corrections.
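The QC above can be sketched as a linear fit of first-break times against offset, flagging picks that deviate from the trend; the threshold value here is an illustrative assumption.

```python
# Sketch of a first-break QC: fit a straight line to picked first-break
# times vs. offset and flag picks that deviate by more than a threshold.
# A consistent shift of all picks for one shot suggests a mispositioned source.
import numpy as np

def flag_firstbreak_outliers(offsets, fb_times, threshold_ms=20.0):
    """Return a boolean mask of picks deviating from the linear trend."""
    slope, intercept = np.polyfit(offsets, fb_times, 1)
    residual = fb_times - (slope * offsets + intercept)
    return np.abs(residual) > threshold_ms

offsets = np.arange(100.0, 1100.0, 100.0)   # m
fb = offsets / 2.0 + 50.0                   # ~2000 m/s direct arrival, in ms
fb[4] += 60.0                               # one bad pick
mask = flag_firstbreak_outliers(offsets, fb)
```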

Header Verification

After geometry merging is completed, verify the updated headers in the raw shot gathers. Check the important headers such as shot, FFID, channel, offset, CDP, receiver station, statics, elevation, fold, and inline, crossline and sequence numbers.
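One concrete consistency check of this kind: recompute the source–receiver offset from the merged coordinates and compare it with the offset header. The tolerance here is an illustrative assumption.

```python
# Sketch of one header check after geometry merging: recompute the
# source-receiver offset from coordinates and compare with the header value.
import math

def offset_mismatch(sx, sy, rx, ry, header_offset, tol=1.0):
    """True if the header offset disagrees with the coordinates by > tol (m)."""
    computed = math.hypot(rx - sx, ry - sy)
    return abs(computed - abs(header_offset)) > tol

# 3-4-5 geometry: the computed offset is exactly 500 m.
ok = not offset_mismatch(0.0, 0.0, 300.0, 400.0, 500.0)
```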

Basic Signal Processing

Aux Channels

The first step is to remove the auxiliary (aux) channels; they can be identified by channel type and sorted out by sequence number.

Filtering

Apply a band-pass filter to eliminate unwanted frequencies in the seismic data. Band-limited filters are used to suppress the power-line frequency of 50 or 60 Hz. A notch filter is rarely used any more, because it removes the seismic signal at 50 Hz along with the power-line noise.
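A minimal zero-phase band-pass sketch using FFT masking with NumPy is shown below. Production filters use tapered responses (e.g. Butterworth or Ormsby) to avoid ringing; the hard spectral cut here is only to illustrate the idea, and the corner frequencies are illustrative.

```python
# Sketch: zero-phase band-pass by zeroing FFT bins outside the pass band.
import numpy as np

def bandpass(trace, dt, f_lo, f_hi):
    """Keep only frequencies between f_lo and f_hi (Hz); dt in seconds."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(trace))

dt = 0.002                                       # 2 ms sampling, Nyquist 250 Hz
t = np.arange(0, 1.0, dt)
trace = np.sin(2 * np.pi * 20 * t) + np.sin(2 * np.pi * 50 * t)  # signal + power line
clean = bandpass(trace, dt, 8.0, 40.0)           # 50 Hz component removed
```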

Spike Removal

Identify spikes with a sliding median analysis window and eliminate them using an RMS threshold. The causes of spikes in the data vary; they may be due to current leakage or poor geophone coupling.
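The median-window plus RMS-threshold idea can be sketched as below; the window length and threshold factor are illustrative assumptions that would be tested on real data.

```python
# Despiking sketch: compare each sample against a sliding-median estimate
# and replace samples that exceed an RMS-based threshold with the median.
import numpy as np

def despike(trace, win=11, k=4.0):
    """Replace samples deviating from the running median by > k * RMS."""
    pad = win // 2
    padded = np.pad(trace, pad, mode="edge")
    med = np.array([np.median(padded[i:i + win]) for i in range(len(trace))])
    rms = np.sqrt(np.mean((trace - med) ** 2))
    out = trace.copy()
    spikes = np.abs(trace - med) > k * rms
    out[spikes] = med[spikes]
    return out, spikes

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 500)
trace[100] = 50.0                 # a spike, e.g. from current leakage
clean, spikes = despike(trace)
```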

Amplitude Corrections

The amplitude losses of seismic reflection events can be compensated by applying a geometrical spreading correction using a suitable gain module.

Signal losses due to absorption and transmission can be compensated by estimating an inverse Q-factor and applying it in seismic migration.

Amplitude variations due to varying charge size during acquisition, and the effects of poor geophone coupling, can be compensated by surface-consistent amplitude balancing and trace equalisation.

In this process a surface-consistent amplitude balancing scalar is applied across the shot gathers and receiver stations, compensating for the effects of charge size and receiver coupling.

Avoid applying AGC, as it may destroy the surface-consistent amplitude variations in the seismic section.

These corrections are based on wave-propagation physics, ensuring that the compensated amplitudes accurately reflect the rock properties.
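The geometrical spreading correction can be sketched for the simplest case: in a constant-velocity medium amplitude decays as 1/r, which is proportional to 1/t, so a t-power gain restores it. Velocity-aware gains (e.g. proportional to t·v(t)²) are used in practice; the constant-velocity form here is an illustrative assumption.

```python
# Sketch of a geometrical (spherical) spreading correction via t-power gain.
import numpy as np

def spreading_gain(trace, dt, power=1.0):
    """Multiply each sample by t**power (t in seconds, t = dt at sample 0)."""
    t = dt * (np.arange(len(trace)) + 1)
    return trace * t ** power

dt = 0.004
decayed = 1.0 / (dt * (np.arange(1000) + 1))   # synthetic 1/t amplitude decay
restored = spreading_gain(decayed, dt)         # flat amplitude of 1.0
```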

Noise Removal

The strategy for removing different types of noise is very important during data processing. Identify at least two discriminating parameters, such as amplitude and frequency, or frequency and velocity, then choose a suitable denoise module to remove the noise. For example, ground roll is high-amplitude, low-frequency noise.

Frequency Analysis

Frequency analysis is done before and after removing any noise. We need to protect the low-frequency range of 3 to 10 Hz.

Static Corrections

Static corrections are applied to seismic data to remove the effects of topography (corrections for elevation variations) and to compensate for near-surface velocity anomalies caused by the weathering layer.

Applying static corrections is very important for estimating accurate velocities for seismic events at a given target time.

Datum

A datum is a reference level from which measurements (such as elevation, depth or time) are made during the survey or data processing.

MSL (Mean Sea Level) is the most commonly used datum for measuring elevation variations in land data acquisition; the elevation at MSL is zero.
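The elevation static that ties a station to the datum can be written as a one-line formula; the replacement velocity below is an illustrative assumption.

```python
# Sketch of an elevation static: the time shift that moves a source or
# receiver from its surface elevation to the datum, using a replacement
# velocity for the material between them.
def elevation_static_ms(elevation_m, datum_m, v_repl_ms=2000.0):
    """Static in ms; negative shift removes the extra travel time."""
    return -(elevation_m - datum_m) / v_repl_ms * 1000.0

# Receiver 100 m above a sea-level datum, replacement velocity 2000 m/s:
static = elevation_static_ms(100.0, 0.0)   # -50 ms
```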

Field vs Refraction Static

Near-surface velocity models for field statics are based on uphole surveys conducted at sparse locations, approximately 1 km apart. In refraction statics, by contrast, the direct arrival times (i.e. first-break times) are picked for every shot and an initial velocity model is derived from the first-break times and the corresponding offsets. The near-surface velocity estimated from the refraction model is much more densely sampled and more accurate, so statics calculated from the refraction model are generally more accurate than field statics.
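The core of the refraction approach can be sketched with the two-layer intercept-time method: the direct arrival gives the weathering velocity V0, the refracted branch gives the refractor velocity V1, and the refractor's intercept time gives the layer thickness. The velocities and intercept time below are illustrative.

```python
# Sketch of the intercept-time method used in refraction statics:
# depth z = t_i * V0 * V1 / (2 * sqrt(V1^2 - V0^2)) for a two-layer case.
import math

def refractor_depth(v0, v1, t_intercept):
    """Depth of the first refractor (metres if velocities in m/s, time in s)."""
    return t_intercept * v0 * v1 / (2.0 * math.sqrt(v1 ** 2 - v0 ** 2))

# Weathering layer V0 = 800 m/s over bedrock V1 = 2400 m/s, intercept 0.05 s:
z = refractor_depth(800.0, 2400.0, 0.05)   # roughly 21 m
```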

SC Amplitude Corrections

Apply amplitude corrections to ensure consistency and accuracy of signal amplitudes, normalising or scaling the data to a standard reference level. Surface-consistent amplitude balancing corrects for the effects of varying charge size in the field and for variations in receiver response between shots due to poor coupling.

No correction should be applied for the path effect, i.e. the amplitude changes caused by geological variations in the subsurface. The corrections are surface consistent: they compensate for source and receiver effects while preserving an accurate representation of subsurface variations. The technique must maintain the relative amplitudes of the data, both spatially and in time.

Corrections for Variations in Charge Size

Normalise the seismic traces to account for variations in charge size caused by the varying quantity of explosives used during acquisition. Extract the amplitude of each trace, compute an RMS amplitude scalar, and apply the scaling factor to each trace to standardise the energy released by the source. Two successive shot gathers with different charge sizes should show equal energy levels after the correction is applied.
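A simple RMS balancing step can be sketched as below. Scaling every trace to a common target RMS removes the bulk effect of a different charge size between shots; full surface-consistent balancing would further decompose the scalars into source and receiver terms, which is omitted here.

```python
# Sketch: scale every trace of a gather to a common target RMS level.
import numpy as np

def balance_rms(gather, target_rms=1.0):
    """gather: 2-D array (traces, samples). Returns a scaled copy."""
    rms = np.sqrt(np.mean(gather ** 2, axis=1, keepdims=True))
    return gather * (target_rms / rms)

rng = np.random.default_rng(1)
shot_a = rng.normal(0.0, 1.0, (24, 500))   # small charge
shot_b = rng.normal(0.0, 3.0, (24, 500))   # three times the amplitude level
a, b = balance_rms(shot_a), balance_rms(shot_b)   # now equal energy levels
```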

Corrections for Receiver due to Poor Coupling

Correct for variations in receiver coupling, which can affect the amplitude of the seismic signals. Extract the amplitude of each trace, compute an RMS amplitude scalar, and apply correction factors to compensate for differences in receiver characteristics (e.g. frequency response, sensitivity).

QC and verify the effectiveness of the amplitude corrections by comparing the corrected data with the original. The surface-consistent amplitude balancing can be iterated if necessary to fine-tune the correction parameters and optimise the corrected shot gathers.

Deconvolution

The primary aim of deconvolution in seismic data processing is to recover the reflectivity series of the seismic trace. Spiking deconvolution attempts to compress the wavelet to a spike, whereas predictive deconvolution attenuates the periodicity of reverberations and short-period multiples. Deconvolution ultimately improves temporal resolution by compressing the source wavelet, broadening the frequency band of the output data.

In predictive deconvolution, the gap or prediction lag is usually set equal to the first or second zero crossing of the autocorrelation function; this is the most common practice in industry. The operator length is typically designed to be ten times the prediction lag or more.

Although deconvolution is usually applied to pre-stack data trace by trace, it is not uncommon to design a single deconvolution operator and apply it to all the traces of a shot record. The deconvolution techniques used in conventional processing are based on optimum Wiener filtering.

Examine some of the individual reflections, compare them with the original trace, and note how the wavelet associated with the significant reflections is compressed and how the reverberations trailing behind each reflection are largely attenuated by deconvolution.

The autocorrelation of the input data is used in designing the deconvolution operator, so it is appropriate to examine the autocorrelation before and after deconvolution.

The operator length dictates the ability of deconvolution to remove reverberations and short-period multiples. The autocorrelation function is symmetric, so only one side needs to be computed to design the operator length and prediction lag (gap). Testing deconvolution parameters such as operator length, prediction lag (gap), and the calculation and application windows is very important before drawing conclusions.
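The Wiener design described above can be sketched end to end: build the one-sided autocorrelation, add a little prewhitening to the zero lag, solve the Toeplitz normal equations for a spiking operator, and convolve it with the trace. The operator length and prewhitening values are illustrative.

```python
# Sketch of Wiener spiking deconvolution from the autocorrelation.
import numpy as np

def spiking_decon(trace, op_len=20, prewhiten=0.001):
    """Design and apply a least-squares spiking operator."""
    full = np.correlate(trace, trace, mode="full")
    acorr = full[len(trace) - 1:len(trace) - 1 + op_len]   # one-sided lags
    R = np.array([[acorr[abs(i - j)] for j in range(op_len)]
                  for i in range(op_len)])                 # Toeplitz matrix
    R[np.diag_indices(op_len)] *= 1.0 + prewhiten          # prewhitening
    rhs = np.zeros(op_len)
    rhs[0] = trace[0]      # crosscorrelation of a spike at t=0 with the input
    f = np.linalg.solve(R, rhs)
    return np.convolve(trace, f)[:len(trace)]

# A decaying (minimum-phase) wavelet gets compressed towards a spike:
wavelet = 0.9 ** np.arange(50)
spiked = spiking_decon(wavelet)
```

After deconvolution the energy is concentrated near the first sample, i.e. the wavelet is compressed, which is exactly what the autocorrelation comparison in the text should confirm.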

Velocity Analysis

Velocity analysis in seismic data processing involves several steps to estimate the velocity model of subsurface layers accurately.

  • CMP Sorting: Sort the data from shot gathers into the common-midpoint versus offset domain.
  • Velocity semblance analysis: Compute semblance values over a range of velocities and time windows, and identify the semblance peaks to estimate the optimal stacking velocity. Semblance is a measure of the coherence or similarity between seismic traces at different offsets and travel times; semblance analysis helps identify the optimal velocity for stacking the data.
  • Constant Velocity Stacks (CVS): If semblance-based velocity analysis is not possible, the CVS method can be used instead. A range of constant velocities is used to create a set of stacks; these stacks are analysed and the velocity that best flattens the event is picked for each CDP location at a given time. CVS is often recommended for the first pass of velocity analysis.
  • NMO (Normal Moveout) correction: Correct for the effect of offset and velocity on the arrival times of seismic reflections. This correction aligns the reflection events horizontally on a time-offset panel and is applied before stacking the CMP traces.
  • Velocity picking: Identify the best velocity values at each analysis location, either manually or with automated picking algorithms.
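The NMO step above can be sketched for a constant velocity: for each output time t0 and offset x, read the input sample at t = sqrt(t0² + x²/v²) by interpolation, which flattens hyperbolic events. The synthetic geometry below is illustrative.

```python
# Sketch of NMO correction for one CMP gather (constant velocity).
import numpy as np

def nmo_correct(gather, offsets, dt, velocity):
    """gather: (n_traces, n_samples). Flattens hyperbolic moveout."""
    n_tr, n_s = gather.shape
    t0 = np.arange(n_s) * dt
    out = np.zeros_like(gather)
    for i, x in enumerate(offsets):
        t = np.sqrt(t0 ** 2 + (x / velocity) ** 2)   # NMO travel time
        out[i] = np.interp(t, t0, gather[i], left=0.0, right=0.0)
    return out

# Synthetic hyperbolic event at t0 = 0.4 s, v = 2000 m/s:
dt, v = 0.004, 2000.0
offsets = np.arange(0.0, 2000.0, 100.0)
gather = np.zeros((len(offsets), 500))
for i, x in enumerate(offsets):
    gather[i, int(round(np.sqrt(0.4 ** 2 + (x / v) ** 2) / dt))] = 1.0
corrected = nmo_correct(gather, offsets, dt, v)   # event flattened at 0.4 s
```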

Residual Statics Corrections

Residual statics estimation involves iteratively updating the static solution to minimise the difference between the observed seismic data and the arrival times predicted with the current velocity model.

The moveout in CMP gathers does not always conform to a perfect hyperbolic trajectory. This is due to near-surface velocity irregularities that cause static or dynamic distortions.

Lateral velocity variations caused by a complex overburden can even produce negative moveout, where a reflection event arrives on long-offset traces before it arrives on short-offset traces. Velocity analyses are therefore often repeated to improve the velocity picks.

  • Residual Statics Estimation: Compute residual statics by comparing the observed and predicted arrival times of seismic events after the initial static corrections.
  • Update Model: Update the velocity model based on the residual statics analysis.
  • Apply Correction: Remove the travel-time delays (static errors) and restore structural continuity.
  • Iterative Process: Iterate between static corrections and velocity model updates until the residual statics are minimised.
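The estimation step can be sketched with its core operation: crosscorrelate each trace with a pilot trace and take the lag of the correlation peak as that trace's residual static. The pilot and shift below are synthetic.

```python
# Sketch of residual statics estimation via crosscorrelation with a pilot.
import numpy as np

def residual_static_lag(trace, pilot):
    """Lag (in samples) that best aligns trace with pilot."""
    xcorr = np.correlate(trace, pilot, mode="full")
    return int(np.argmax(xcorr)) - (len(pilot) - 1)

pilot = np.zeros(200)
pilot[80:85] = [1.0, 2.0, 3.0, 2.0, 1.0]    # a reference event
shifted = np.roll(pilot, 7)                 # trace delayed by 7 samples
lag = residual_static_lag(shifted, pilot)   # recovers the 7-sample static
```

In a surface-consistent solver these per-trace lags would then be decomposed into source and receiver static terms.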

Seismic Data Regularisation

In some cases, it may be necessary to resample the merged geometry to ensure uniform spacing between shot and receiver locations, which can improve data processing and interpretation.

The objectives of data regularisation include:

  • Seamless merging of overlapping survey boundaries
  • Improved spatial sampling for migration and AVO analysis
  • Bin centring
  • Fold equalisation
  • Offset regularisation
  • Data interpolation to fill small gaps
  • Removal of the acquisition footprint

RMS Velocity

Root Mean Square (RMS) velocity is typically estimated on common-image gathers by fitting a velocity model that best represents the subsurface geological structure. Iterative velocity analysis is often employed to refine the model and achieve an accurate velocity estimate.
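A standard relation in this context is the Dix equation, which converts a pair of RMS velocities picked at two zero-offset times into the interval velocity between them; the numbers below are illustrative.

```python
# Sketch of the Dix equation: V_int = sqrt((V2^2*t2 - V1^2*t1) / (t2 - t1)).
import math

def dix_interval_velocity(v1, t1, v2, t2):
    """Interval velocity between zero-offset times t1 and t2 (t2 > t1)."""
    return math.sqrt((v2 ** 2 * t2 - v1 ** 2 * t1) / (t2 - t1))

# Vrms 2000 m/s at 1.0 s and 2200 m/s at 1.5 s:
v_int = dix_interval_velocity(2000.0, 1.0, 2200.0, 1.5)   # ~2553 m/s
```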

Seismic Migration

Migration in seismic data processing refers to repositioning seismic reflections from their recorded positions in time to their true subsurface locations in depth. Since seismic data are generated by sound waves propagating through the subsurface, migration aims to correct for the effects of wave propagation.

The ultimate aim of migration is to accurately image subsurface geological features by correcting for the effects of velocity variations and reflector dip. The most commonly used migration algorithm is Kirchhoff migration.
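The Kirchhoff idea can be sketched in a toy constant-velocity, zero-offset form: each image point collects the input amplitudes along its diffraction hyperbola, so a point diffractor's hyperbola collapses back to a point. Production Kirchhoff also applies obliquity and amplitude weights and anti-alias filters, all omitted here; the geometry below is synthetic.

```python
# Toy constant-velocity, zero-offset Kirchhoff migration sketch.
import numpy as np

def kirchhoff_migrate(data, dx, dt, v):
    """data: (n_traces, n_samples) zero-offset section; returns an image."""
    n_x, n_t = data.shape
    image = np.zeros_like(data)
    xs = np.arange(n_x) * dx
    for ix, x in enumerate(xs):              # image lateral position
        for iz in range(n_t):                # image depth, indexed by time sample
            z = iz * dt * v / 2.0            # depth of the image point
            t = 2.0 * np.sqrt(z ** 2 + (xs - x) ** 2) / v   # diffraction curve
            it = np.round(t / dt).astype(int)
            valid = it < n_t
            image[ix, iz] = data[np.arange(n_x)[valid], it[valid]].sum()
    return image

# A point diffractor at trace 20, time sample 60, recorded as a hyperbola;
# migration collapses it back to that location.
dx, dt, v = 25.0, 0.004, 2000.0
data = np.zeros((40, 120))
z0 = 60 * dt * v / 2.0
for ix in range(40):
    t = 2.0 * np.sqrt(z0 ** 2 + ((ix - 20) * dx) ** 2) / v
    it = int(round(t / dt))
    if it < 120:
        data[ix, it] = 1.0
image = kirchhoff_migrate(data, dx, dt, v)
```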

Post-Stack Processing

Post-stack processing is crucial to improving the quality of the migrated image. It involves removing residual noise and acquisition footprints.

EBCDIC HEADER

Prepare a detailed EBCDIC header providing information such as the acquisition parameters; the first and last CMP, grid parameters and coordinates; the processing steps; and other parameters such as spherical spreading Tpow, regular class, the scalars applied to the processed data, header byte locations, processed output type, data length, etc.

Report & Documents

Documenting each step of the processing workflow. Provide templates for processing parameters, settings, and results. Include any relevant metadata and interpretations.

Learn More About Seismic

Click 👉 Seismic Data Acquisition

Click 👉 Seismic Survey Geometry

Click 👉 Seismic Data Processing

Click 👉 Seismic Depth Imaging

Click 👉 Seismic Data Interpretations
