Am I Blue? Finding the Right (Spectral) Balance

Seismic interpreters have always desired to extract as much vertical resolution from their data as possible – and that desire has only increased with the need to accurately land horizontal wells within target lithologies that fall at or below the limits of seismic resolution.

Although we often think of boosting the higher frequencies, resolution is better measured in octaves: halving the lowest measured frequency adds as much bandwidth (one octave) as doubling the highest.
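As a quick worked example of this octave bookkeeping (a minimal sketch; the `octaves` helper is a name of our own choosing, not from the article):

```python
import math

def octaves(f_min, f_max):
    """Number of octaves spanned by a band: log2(f_max / f_min)."""
    return math.log2(f_max / f_min)

# An 8-120 Hz vibrator sweep spans about 3.9 octaves.
print(round(octaves(8.0, 120.0), 2))   # → 3.91

# Halving the low end to 4 Hz adds exactly one octave --
# the same gain as doubling the top end to 240 Hz.
print(round(octaves(4.0, 120.0), 2))   # → 4.91
```

This is why recovering even a few hertz at the low end of the spectrum can matter as much as the more familiar push toward higher frequencies.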

There are several reasons why seismic data are band-limited.

First, if a vibrator sweep ranges between 8 and 120 Hz, the only “signal” outside of this range resides in difficult-to-process (and usually undesirable) harmonics.

Dynamite and airgun sources may generate higher frequencies, but conversion of elastic energy to heat (intrinsic attenuation), along with scattering from rugose surfaces and thin-bed reverberations (geometric attenuation), attenuates the higher-frequency signal to a level where it falls below the noise threshold. Geophone and source arrays attenuate short-wavelength events where individual array elements experience different statics.

Processing also attenuates frequencies. Processors often need to filter out the lowest frequencies to attenuate ground roll and ocean-swell noise, while small errors in statics and velocities result in misaligned traces that, when stacked, preserve the lower frequencies but attenuate the higher ones.


Currently there are two approaches to spectral enhancement.

More modern innovations that have been given names such as “bandwidth extension,” “spectral broadening” and “spectral enhancement” are based on a model similar to deconvolution, which assumes the earth is composed of discrete, piecewise-constant impedance layers. Such a “sparse spike” assumption allows one to replace the wavelet with a spike, which in turn is replaced with a broader-band wavelet that often exceeds the bandwidth of the seismic source.

Model-based processing is common to reflection seismology and often provides excellent results – however, the legitimacy of the model needs to be validated, such as tying the broader band product to a well not used in the processing workflow.

We have found bandwidth extension algorithms to work well in lithified Paleozoic shale resource plays and carbonate reservoirs.

In contrast, bandwidth extension can work poorly in Tertiary Basins where the reflectivity sequence is not sparse, but rather represented by upward fining and coarsening patterns.

In this article, we review the more classical workflow of spectral balancing, constrained to fall within the source bandwidth of the data.

Spectral balancing was introduced early in digital processing during the 1970s and is now relatively common in the workstation environment.


As summarized in figure 1, the interpreter decomposes each seismic trace into a suite of 5-10 overlapping band-pass-filtered copies of the data. Each band-pass-filtered version of the trace is then scaled such that the energy within a long (e.g. 1,000-ms) window is similar down the trace.

This latter process is called automatic gain control, or AGC.

Once all the components are scaled to the same target value they are then added back together, providing a spectrally balanced output.
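The decompose-AGC-sum workflow above can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the authors' production code: the function name, the Butterworth filter choice and the sliding-RMS AGC are ours, and the band list and window length would be tuned to the data.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def spectrally_balance(trace, fs, bands, agc_len):
    """Classic spectral balancing sketch: split a trace into overlapping
    band-pass copies, AGC each copy to a common energy level, then sum.
    fs is the sampling rate in Hz; agc_len is the AGC window in samples
    (e.g. 500 samples for a 1,000-ms window at 2-ms sampling)."""
    out = np.zeros_like(trace)
    win = np.ones(agc_len) / agc_len
    for lo, hi in bands:
        # Zero-phase band-pass filter for this component.
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        comp = sosfiltfilt(sos, trace)
        # Sliding RMS over the AGC window; dividing by it equalizes
        # the energy of this band down the trace.
        rms = np.sqrt(np.convolve(comp**2, win, mode="same"))
        out += comp / np.maximum(rms, 1e-10)
    return out
```

A typical call might use `bands = [(5, 15), (15, 30), (30, 60), (60, 100)]` on a 2-ms-sampled trace (`fs=500.0`); because the AGC is applied band by band and trace by trace, this reproduces the amplitude-unfriendliness discussed later in the article.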

A more recent innovation introduced about 10 years ago is to add “bluing” to the output. In this latter case, one stretches the well logs to time, generates the reflectivity sequence from the sonic and density logs and then computes its spectrum. Statistically, such spectra are rarely “white,” with the same values at 10 Hz and 100 Hz, but rather “blue,” with larger-magnitude spectral components at higher (bluer) frequencies than at lower (redder) frequencies.

The objective in spectrally balancing then is to modify the seismic trace spectrum so that it approximates the well log reflectivity spectrum within the measured seismic bandwidth.

Such balancing is achieved by simply multiplying each band-pass-filtered and AGC’d component by exp(+βf), where f is the center frequency of the filter and β is a parameter obtained from the well logs that varies between 0.0 and 0.5 (black boxes in figure 1).
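Applied per band, the bluing step is just a scalar weight on each AGC’d component. A minimal sketch, taking the exp(+βf) form above at face value (the function name is ours; note that a sensible numerical value of β depends on whether f is in hertz or normalized frequency, so the small β below is purely illustrative):

```python
import numpy as np

def blued_component(component, f_center, beta):
    """Apply the bluing weight exp(+beta * f) to one band-pass-filtered,
    AGC'd component, where f_center is the filter's center frequency.
    beta is estimated by fitting the well-log reflectivity spectrum."""
    return component * np.exp(beta * f_center)

# Illustrative weights for components centered at 10, 30 and 90 Hz:
# higher-frequency bands are boosted relative to lower ones.
for f in (10.0, 30.0, 90.0):
    print(f, round(float(np.exp(0.01 * f)), 3))
```

Summing the blued components then yields an output whose spectrum rises toward the high end, mimicking the “blue” well-log reflectivity spectrum rather than a flat (“white”) one.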

There are several limitations to this classic workflow:

First, one balances the measured seismic data, which is the sum of the signal plus noise. Ideally, we want to balance the signal.

Second, since the filters are applied trace by trace, the process as a whole is not amplitude friendly and is therefore inappropriate as input to more quantitative, amplitude-sensitive analyses such as AVO and post-stack or prestack inversion.

Third, if the AGC window is too small or the statistics of the reflectivity sequence are insufficiently smooth (end-member examples would be coal-bed cyclothems and sabkha sequences), then reflectors of interest can be suppressed and artifacts created.

A fairly common means of estimating the spectrum of the signal is to cross-correlate adjacent traces to separate the part of the data that is laterally consistent (signal) from the part that is inconsistent (random noise). One then designs the spectral-balancing parameters (AGC coefficients) on the consistent part of the data.

Unfortunately, this approach is still not amplitude friendly and can remove geology if the spectra are not smooth.
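The cross-correlation idea can be sketched in the frequency domain: the cross-spectrum of two adjacent traces retains the spectrum they share, while noise that is uncorrelated from trace to trace averages toward zero. A minimal sketch under that simplifying assumption (the function name is ours):

```python
import numpy as np

def signal_spectrum_estimate(traces):
    """Estimate the amplitude spectrum of the laterally consistent
    signal by averaging cross-spectra of adjacent trace pairs.
    Random noise, being uncorrelated between traces, cancels in the
    average; the shared signal's power spectrum survives."""
    acc = 0.0
    for a, b in zip(traces[:-1], traces[1:]):
        acc = acc + np.fft.rfft(a) * np.conj(np.fft.rfft(b))
    # Cross-spectrum magnitude ~ |S|^2, so take a square root to
    # return an amplitude (not power) spectrum.
    return np.sqrt(np.abs(acc / (len(traces) - 1)))
```

In practice one would smooth this estimate before using it to design balancing coefficients; as noted above, the result is still not amplitude friendly.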


Figure 2 illustrates a more modern approach that can be applied to both post-stack and prestack migrated data volumes.

First, we suppress crosscutting noise using a structure-oriented filtering algorithm, leaving mostly signal in the data.

Next, the data are decomposed into time-frequency spectral components.

Finally, we compute a smoothed average spectrum.

If the survey has sufficient geologic variability within the smoothing window (i.e. no perfect “railroad tracks”), this spectrum will represent the time-varying source wavelet.

This single average spectrum is used to design a single time-varying spectral scaling factor that is applied to each and every trace. Geologic tuning features and amplitudes are thus preserved.
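The key amplitude-friendly idea, one scaler designed from the survey-average spectrum and applied identically to every trace, can be sketched as follows. This is a deliberately simplified, stationary (frequency-only) illustration; the workflow of figure 2 uses a time-frequency decomposition, so its scaler also varies with time. Function names and the smoothing choice are ours.

```python
import numpy as np

def design_single_scaler(traces, eps=1e-3):
    """Build ONE inverse-spectrum scaler from the average amplitude
    spectrum of all traces. Because the same scaler is applied
    everywhere, relative amplitudes between traces are preserved."""
    spectra = np.abs(np.fft.rfft(traces, axis=1))
    avg = spectra.mean(axis=0)
    # Light smoothing so local geologic detail (tuning, notches)
    # is not inverted away -- only the slowly varying wavelet is.
    kernel = np.ones(9) / 9
    smooth = np.convolve(avg, kernel, mode="same")
    return 1.0 / (smooth + eps * smooth.max())

def apply_scaler(traces, scaler):
    """Apply the single frequency-domain scaler to every trace."""
    spec = np.fft.rfft(traces, axis=1) * scaler
    return np.fft.irfft(spec, n=traces.shape[1], axis=1)
```

Because the operation is linear and identical for all traces, a reflector that is twice as strong as its neighbor before balancing remains twice as strong afterward, which is exactly the property AVO and inversion workflows require.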

We apply this workflow to a legacy volume acquired in the Gulf of Mexico:

  • Figures 3a and b show the average spectrum before and after spectral balancing.
  • Figures 3c and d show a representative segment of the seismic data where we see the vertical resolution has been enhanced.

 


 

Geophysical Corner

Geophysical Corner - Satinder Chopra
Satinder Chopra, award-winning chief geophysicist (reservoir), at Arcis Seismic Solutions, Calgary, Canada, and a past AAPG-SEG Joint Distinguished Lecturer began serving as the editor of the Geophysical Corner column in 2012.

Geophysical Corner - Kurt Marfurt
AAPG member Kurt J. Marfurt is with the University of Oklahoma, Norman, Okla.

Marcílio Matos is a research scientist for Signal Processing Research, Training and Consulting, and co-investigator for the Attribute Assisted Seismic Processing and Interpretation Consortium at the University of Oklahoma, Norman.


The Geophysical Corner is a regular column in the EXPLORER that features geophysical case studies, techniques and application to the petroleum industry.

