Optimizing Velocity Models for Modern 2D Programs in Frontier Areas – Do More With Less

Spectrum’s Milos Cvetkovic, Chris Benson, Cesar Arias, Laurie Geiger, Paolo Esestime, Leandro Gabioli and Laura Younker evaluate how changes in the seismic industry have resulted in the evolution of the research and development that support modern 2D exploration programs. This information was presented at SEG 2018 in Anaheim, California.

As the seismic industry redefines itself, modern 2D exploration programs and the research and development that support them are evolving as well. With the necessity for more economically efficient business models, the integration of geological and geophysical workflows has proven to be a successful strategy for large scale multi-client programs in frontier basins.

Using a holistic approach to the time and depth velocity model building part of the workflow, we have developed a robust methodology that yields reliable models while reducing the timeline. Here, we highlight improvements in regional 2D marine seismic programs, describing how we incorporate interpretation and non-seismic data early in the workflow to constrain the seismic velocity models and aid subsequent automatic velocity analysis. The examples shown are from large-scale 2D projects in the deep waters of the Santos and Campos basins offshore Brazil, and offshore Somalia.

Deep water offshore Brazil, Santos and Campos basins
The shallow waters of both the Santos and Campos basins have been heavily explored, while the deep water can still be treated as frontier, covered only by coarse regional 2D seismic lines (Figure 2). The seismic data available to us consist of ~17,000 km acquired in late 2016 with 12 km offsets, an 8 m source tow depth and a 15 m cable tow depth. Shot spacing is 25 m, with continuous recording and shooting at 10 s intervals, with the intention of separating individual shots during the data processing phase.
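As a quick sanity check on the stated geometry, the 25 m shot spacing combined with the 10 s shot interval implies the vessel speed. The derived numbers below are simple arithmetic, not figures from the abstract:

```python
# Consistency check of the stated acquisition geometry.
# Input values are from the text; outputs are derived, not quoted.
shot_spacing_m = 25.0    # m between shot points
shot_interval_s = 10.0   # s between shots (continuous recording)

vessel_speed_ms = shot_spacing_m / shot_interval_s   # 2.5 m/s
vessel_speed_kn = vessel_speed_ms / 0.514444         # ~4.9 knots, a typical tow speed
```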

Figure: Velocity models and respective PSDM images before (left) and after (right) the canyon-fill update. The colormap differs from the other figures in this abstract, as it exaggerates the small velocity variations needed to produce geologically plausible structures. Arrows point to the imaging of Late Jurassic carbonates after the final model modifications.

The pre-processing sequence can be described as conventional and includes designature, deghosting and demultiple steps. The exception is individual shot separation, or deblending, of the continuously recorded data, following the steps presented by Seher and Clarke (2016). To reduce project turnaround time, we make practical use of the available data by starting the initial model building immediately after designature and zero-phasing, concentrating on the shallow section above the multiples and continuous-recording noise. Once the data are further processed, we continue with the mid and deep sections of the model.
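The first step of that shot separation can be illustrated with a minimal sketch of "pseudo-deblending": cutting the continuous trace into fixed-length shot records. The sample rate, record length and impulse data below are assumptions for the illustration, not the project's parameters; with a listening time longer than the 10 s shot interval, each cut record still carries energy from the following shot, which is the blending noise the actual deblending step (Seher and Clarke, 2016) must remove:

```python
import numpy as np

fs = 500.0            # samples per second (assumed)
shot_interval = 10.0  # s between shots, from the acquisition description
record_len = 12.0     # s listening time (assumed; > shot interval, so records overlap)

# Synthetic continuous record: one impulse per shot as a stand-in for real data.
n_shots = 4
shot_times = np.arange(n_shots) * shot_interval
total_s = shot_interval * (n_shots - 1) + record_len
cont = np.zeros(int(total_s * fs))
for t in shot_times:
    cont[int(t * fs)] = 1.0

# Pseudo-deblending: slice the continuous trace into per-shot windows.
# Because record_len > shot_interval, early records also contain the
# impulse of the next shot -- the blending noise to be suppressed later.
n_samp = int(record_len * fs)
records = []
for t in shot_times:
    i0 = int(t * fs)
    rec = cont[i0:i0 + n_samp]
    records.append(np.pad(rec, (0, n_samp - len(rec))))
records = np.stack(records)
```

On this toy input, every record except the last contains two impulses: its own shot plus the blended energy of the shot that follows.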

For the PSTM velocity analysis we extended the approach presented by Esestime et al. (2016), incorporating all available seismic and non-seismic information to derive an initial model that we then refine with automatic picking. Compared to conventional velocity picking, this workflow saves a significant amount of time, as we cover ~17,000 km of line length over an area of ~60,000 km². We find that velocities picked manually during acquisition, and even in the processing center, are often inconsistent in quality in complicated geologic settings such as salt basins, and will rarely tie at line intersections (Figure 3a). However, we can still use these RMS models to generate an initial pre-stack migrated section for this streamlined workflow.

We interpret key regional horizons, in this case the water bottom and top of salt, and use base-of-salt horizons from legacy PSDM projects to create another model building surface. Although the underlying grid of legacy PSDM data comprises different vintages on an irregular and very sparse grid, the regional interpretation surfaces correlate relatively well on the new grid. We use a geostatistical distribution scheme to create a smooth 3D time interval velocity model that honors the picked velocities and distributes them along these regional horizons. The resulting velocity model is consistent and ties across the survey (Figures 3a and 3b).

We then update these velocities with a pass of vertical semblance-based auto-picking and use the result to generate a new PSTM image. Next, we refine the interpretation of the top and base of salt, adjust the 3D geostatistical model and run additional passes of auto-picking on migrated CDP gathers. Auto-picking in the salt and pre-salt sections does not produce stable results even with targeted heavy preconditioning, so these velocities are finalized with a loop back to the PSDM models.
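The vertical semblance auto-picking step can be sketched in miniature: scan trial RMS velocities at a given zero-offset time, evaluate hyperbolic semblance for each, and keep the maximum. All geometry and parameter values below are assumptions for a synthetic single-event gather, not the project's actual parameters:

```python
import numpy as np

dt = 0.004                              # s, sample interval (assumed)
nt = 500                                # samples per trace (2 s)
offsets = np.arange(0.0, 3100.0, 100.0)  # m, assumed offset range

def make_gather(t0, v_rms):
    """Synthetic CDP gather: one spike event on an NMO hyperbola."""
    g = np.zeros((len(offsets), nt))
    t = np.sqrt(t0**2 + (offsets / v_rms) ** 2)
    idx = np.round(t / dt).astype(int)
    ok = idx < nt
    g[np.arange(len(offsets))[ok], idx[ok]] = 1.0
    return g

def semblance(gather, t0, v):
    """Hyperbolic semblance for one (t0, v) trial: coherence of the
    amplitudes sampled along the corresponding moveout curve."""
    t = np.sqrt(t0**2 + (offsets / v) ** 2)
    idx = np.round(t / dt).astype(int)
    ok = idx < nt
    a = gather[np.arange(len(offsets))[ok], idx[ok]]
    den = len(a) * (a**2).sum()
    return (a.sum() ** 2) / den if den > 0 else 0.0

# Auto-pick: semblance is maximal where the trial velocity flattens the event.
gather = make_gather(t0=1.0, v_rms=2500.0)
v_scan = np.arange(2000.0, 3050.0, 50.0)
s = np.array([semblance(gather, 1.0, v) for v in v_scan])
v_pick = v_scan[np.argmax(s)]   # recovers 2500.0 on this synthetic
```

A production implementation would evaluate semblance in a short time window around each sample and across a dense (t0, v) grid; the picking principle is the same.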
Our new workflow significantly reduced the timeline for a PSTM project, as the regional interpretation does not require precise picking. Vertical semblance picking, horizon-based 3D ties and auto-picking are mostly automated, resource-light processes, where parameter setting and testing take most of the processors' time…
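The horizon-based tie step distributes sparse picked velocities laterally within horizon-bounded layers so the model ties at line intersections. The abstract's geostatistical scheme is more sophisticated than what can be shown here; the sketch below is a minimal stand-in using inverse-distance weighting within one layer, with all pick locations and velocity values made up for illustration:

```python
import numpy as np

def idw(x_picks, v_picks, x_query, power=2.0):
    """Inverse-distance-weighted lateral interpolation -- a simple stand-in
    for the geostatistical distribution scheme described in the text."""
    d = np.abs(np.asarray(x_query)[:, None] - np.asarray(x_picks)[None, :])
    w = 1.0 / (d ** power + 1e-12)
    return (w * np.asarray(v_picks)).sum(axis=1) / w.sum(axis=1)

# Sparse manual picks of one layer's interval velocity along a line (made-up values).
x_picks = np.array([0.0, 20.0, 60.0])         # km along line
v_picks = np.array([1800.0, 2100.0, 2400.0])  # m/s

# Fill the layer everywhere between its bounding horizons, so intersecting
# lines see the same smooth field instead of isolated manual picks.
x_line = np.linspace(0.0, 60.0, 7)
v_line = idw(x_picks, v_picks, x_line)
```

Because the weights form a convex combination, the interpolated field honors the picks at their locations and stays within the picked velocity range between them.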

Read the full extended abstract: Optimizing velocity model building for modern 2D programs in frontier areas – Doing more with less