The processing geophysicist as a scientist and an artisan. Illustrating the complexity of seismic processing.

[Title photo: art or science?]

Exploration geophysics is a wide branch of Earth science, and geophysicists can call themselves geoscientists with good reason.

There are many specialties involved in the seismic exploration chain, and one of them is seismic processing and imaging. It is often regarded as an obscure field, characterized by a perplexing jargon of waves, ghosts, operators, migrations, anisotropy (and all the acronyms). It requires advanced physics and maths foundations, a firm theory, and efficient implementations of complex algorithms running on colossal computers to process enormous piles of data.

It is clearly science: a systematic, objective study of the processes occurring inside the Earth, and of its physical properties, through observations and experiments, as the scientific method prescribes. It is based on the general physical laws ruling the geophysical world.

As for seismic processing, we can say it is the science of transforming the acquired seismic data, which are just humble recordings of the vibrations of the Earth's surface, into images of the subsurface.

You could think that it is an exact science: that if we give the very same data to two different groups of geoscientists, we get the same final image twice.

Ahahah! Yes, like the two photos in the figure below.

[Figure: two photos of the Leaning Tower of Pisa]

If you start from modern, good-quality marine data, maybe you will not get such extreme differences; but when you look for small details (count the people in line to get into the Leaning Tower!), the differences can be crucial.

And with land data I have seen larger differences between processed images.

So why does this scientific process not give a unique result? Why is it not easily and deterministically reproducible? Because the information is hidden under a lot of noise, and because the processing is very complex: the seismic data go through tens of steps, each with many parameters, which can be combined in many different ways.
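Just to get a feel for the scale of the problem, here is a toy back-of-the-envelope calculation. The step count and the number of candidate settings per step are invented, purely for illustration:

```python
# Toy illustration only: the numbers are invented, not from a real project.
n_steps = 30          # a sequence has tens of steps
choices_per_step = 3  # assume just 3 reasonable settings per step

combinations = choices_per_step ** n_steps
print(f"{combinations:.2e} possible flows")  # ~2.06e+14

# Even testing a million flows per day would take centuries,
# so nobody can explore the parameter space exhaustively.
tests_per_day = 1_000_000
print(f"{combinations / tests_per_day / 365:.0f} years to test them all")
```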

We start with small tiles, partial views of the data: we calibrate them, remove the effects of the experiment, remove some noise, build an image, remove more noise, project and reconstruct the shape, remove yet more noise, until we get the final cube.

[Figure: all the data cubes along the processing]

We still talk about processing sequences, hinting at a series of steps run one after the other. But the true relationships between the steps are more complicated than a sequence: in land data, for example, the behaviour of the deconvolution is affected by the performance of the noise attenuation, which impacts the accuracy of the velocity modelling, which conditions the migration and all the other things that come after it.

We still draw very neat flowcharts with a sequence of steps, but in reality things are more interconnected and dependent on each other.

[Figure: processing steps drawn as a flowchart]
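As a rough sketch of that interconnection, we could describe a land sequence as a small directed graph rather than a chain, and ask which steps are touched when one early choice changes. The step names and dependencies below are made up for illustration, not a real production flow:

```python
# A toy dependency graph for a land processing flow (illustrative names only).
# An edge "a -> b" means: choices made in step a affect the behaviour of step b.
flow = {
    "noise_attenuation":  ["deconvolution", "velocity_modelling"],
    "deconvolution":      ["velocity_modelling", "migration"],
    "statics":            ["velocity_modelling", "migration"],
    "velocity_modelling": ["migration"],
    "migration":          ["post_migration_noise_attenuation", "stack"],
    "post_migration_noise_attenuation": ["stack"],
    "stack":              [],
}

def downstream(step, graph, seen=None):
    """Every step whose result can be touched by a change made in `step`."""
    seen = set() if seen is None else seen
    for nxt in graph.get(step, []):
        if nxt not in seen:
            seen.add(nxt)
            downstream(nxt, graph, seen)
    return seen

# Change the noise attenuation, and almost the whole flow is affected:
print(sorted(downstream("noise_attenuation", flow)))
```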

The complexity of the processing has increased a lot in recent years along with computing capabilities: we can use more realistic assumptions and physical models, and be more scientific in methods and theory. And flows now have loops, parallel branches and more complex shapes.

If you look at a modern sequence for full-azimuth 3D onshore seismic processing, the many interconnected steps have relationships that are layered and difficult even to represent. What you do at a certain step might produce unexpected consequences at a later step. With the growing size of the data and the immense number of possible combinations of parameters and steps, it becomes impossible to test and check everything.

A sequence can be written as a flow chart, but it has to be dynamic, because at the beginning you don't know what the data are like and how they will evolve: the processing of a land 3D (only in time!) looks more like this.

[Figure: a land full-azimuth 3D processing flow]
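Loops are one reason such a flow cannot be drawn as a straight line: velocity model building, for example, typically alternates imaging and model updates until the result stops improving. Here is a minimal, purely schematic toy of such a loop, with a single scalar "velocity" and stand-in functions, nothing like real software:

```python
# Purely schematic toy: a scalar "velocity" updated in a loop until the
# measured error is small enough. Real model building works on 3D models,
# but the control structure (a loop, not a straight line) is the point.

TRUE_VELOCITY = 2500.0   # m/s, the (in practice unknown) answer

def measure_residual(model_velocity):
    # Stand-in for measuring residual moveout on migrated gathers.
    return abs(TRUE_VELOCITY - model_velocity) / TRUE_VELOCITY

def update_model(model_velocity):
    # Stand-in for a tomographic update: move part of the way to the answer.
    return model_velocity + 0.5 * (TRUE_VELOCITY - model_velocity)

model = 2000.0           # initial guess
for iteration in range(10):
    residual = measure_residual(model)
    print(f"iteration {iteration}: model = {model:.0f} m/s, residual = {residual:.3f}")
    if residual < 0.01:  # gathers flat enough: stop looping
        break
    model = update_model(model)
```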

This complexity can become overwhelming. In some cases it can make us feel incapable of understanding what happens, and of predicting and controlling what will happen. People are then tempted to just test and see. But we should not forget that there are basic underlying reasons and objectives for every step and for their order.

But with such complexity, part of the processing is much closer to craftsmanship. The processing requires intuition and creativity: it is artistic in the sense that we create images (not particularly beautiful ones, but visual representations of an underground landscape). And the vicious intricacy of some software adds the need for manual dexterity.

Even when everything seems to go well and we think we have the best possible result, we just create an image of the subsurface, and this image can be ambiguous. The cube of the initial figure, when loaded and analysed, can be difficult to interpret. The structural geologist we work for may come back and tell us that it is an impossible cube; but we can also produce impossible triangles, impossible triangular zones.

[Figure: the impossible cube]

Sometimes, in the foothills for example, it is simply not possible to get a meaningful image: the limitations of the adopted method (in general, the seismic method; or, in particular, some assumptions in the workflow) will prevent you from getting an explicit image of the subsurface.

Processing is not easy: it requires theory, experience and intuition, but also patience, organisation and discipline. Data processors know that it takes years of study and practice to learn; they know that each new project is a challenge, and that in every project we can make mistakes.

But technology evolves, and maybe one day processing will become very easy, like assembling a bookshelf from IKEA. Like the one below.

[Figure: the final image, IKEA style]
