Scientific Production
Journal Article
Fast Seismic Inversion Methods Using Ant Colony Optimization Algorithm
This letter presents ACOBBR-V, a new computationally efficient ant-colony-optimization-based algorithm tailored for continuous-domain problems. The ACOBBR-V algorithm is well suited for application to seismic inversion problems owing to its intrinsic features, such as the use of heuristics to generate the initial solution population and its ability to handle multiobjective optimization problems. Here, we show how the ACOBBR-V algorithm can be applied in two methodologies to obtain 3-D impedance maps from poststack seismic amplitude data. The first methodology follows the traditional approach of forward convolution of a reflectivity model with an estimated wavelet, where ACOBBR-V is used to estimate the appropriate wavelet as well as the reflectivity model. In the second methodology, we propose an even faster inversion algorithm based on inverse-filter optimization, where ACOBBR-V optimizes an inverse filter that, when deconvolved with the seismic traces, yields a reflectivity model similar to that found in well logs. This modeled inverse filter is then deconvolved with the entire 3-D seismic volume. In experiments, both methodologies are applied to a synthetic 3-D seismic volume. The results validate their feasibility and the suitability of ACOBBR-V as an optimization algorithm. They also show that the second methodology offers much higher convergence speed and greater effectiveness as a seismic inversion tool.
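ACOBBR-V itself is not spelled out in the abstract, but the first methodology rests on the standard convolutional forward model. Below is a minimal sketch of the objective function such a continuous-domain optimizer would minimize; the function names (`ricker`, `impedance_to_reflectivity`, `misfit`) and the Ricker wavelet choice are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ricker(f0, dt, n):
    """Ricker wavelet with peak frequency f0 (Hz), n samples at interval dt (s)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def impedance_to_reflectivity(z):
    """Normal-incidence reflectivity from an acoustic-impedance series."""
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def misfit(observed, impedance, wavelet):
    """L2 misfit between an observed trace and the synthetic trace from the
    convolutional model; a continuous-domain optimizer such as ACOBBR-V would
    minimize this over the impedance samples (and, in the first methodology,
    over the wavelet parameters as well)."""
    r = impedance_to_reflectivity(impedance)
    synthetic = np.convolve(r, wavelet, mode="same")
    return np.sum((observed - synthetic) ** 2)
```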
Journal Article
Migration velocity analysis using residual diffraction moveout in the poststack depth domain
Diffraction events contain more direct information on the medium velocity than reflection events. We have developed a method for migration velocity improvement and diffraction localization based on a moveout analysis of over- or undermigrated diffraction events in the depth domain. The method takes an initial velocity model as input and provides, as a result, an update to the velocity model together with diffraction locations in the depth domain. The algorithm is based on the focusing of remigration trajectories from incorrectly migrated diffraction curves. These trajectories are constructed by applying a ray-tracing-like approach to the image-wave equation for velocity continuation. The starting points of the trajectories are obtained by fitting an ellipse or hyperbola to the picked uncollapsed diffraction events in the depth-migrated domain. Focusing of the remigration trajectories indicates the approximate location of the associated diffractor, as well as local velocity attributes. Apart from the migration needed at each iteration, the method has a very low computational cost, but it relies on the identification and picking of uncollapsed diffractions. We tested the feasibility of the method on synthetic data examples from three simple constant-gradient models and on the Sigsbee2B data. Although we were able to build a complete velocity model in the Sigsbee2B example, we think of our technique as one for local velocity updating of a slightly incorrect model. Our tests showed that, within regions where the assumptions are satisfied, the method can be a powerful tool.
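The starting points of the remigration trajectories come from fitting a second-order curve to the picked, uncollapsed diffraction events. A minimal sketch of that fitting step, assuming a depth-domain moveout of the form z(x) = sqrt(z0^2 + q (x - x0)^2), hyperbola-like for q > 0 and ellipse-like for q < 0; the parameterization and the synthetic picks are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import curve_fit

def moveout(x, x0, z0, q):
    """Depth along an incorrectly migrated diffraction event: hyperbola-like
    for q > 0 (undermigrated), ellipse-like for q < 0 (overmigrated),
    collapsing to the apex depth z0 as q -> 0."""
    return np.sqrt(np.maximum(z0 ** 2 + q * (x - x0) ** 2, 0.0))

# synthetic picks along one uncollapsed diffraction tail
x = np.linspace(900.0, 1100.0, 21)
z_picks = moveout(x, 1000.0, 500.0, 0.3) + np.random.normal(scale=2.0, size=x.size)

# the least-squares fit recovers the apex (x0, z0) and the curvature q,
# which seed the remigration trajectories
p, _ = curve_fit(moveout, x, z_picks, p0=(950.0, 450.0, 0.1))
print("apex: x0 = %.1f m, z0 = %.1f m, q = %.3f" % tuple(p))
```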
Journal Article
Estimating quality factor from surface seismic data: A comparison of current approaches
The performances of the spectral ratio (SR), frequency centroid shift (FCS), and frequency peak shift (FPS) methods to estimate the effective quality factor Q are compared. These methods do not demand true-amplitude data, and their implementations follow an "as simple as possible" approach to highlight their intrinsic potentials and limitations. We use synthetic zero-offset seismic data generated with a simple layer-cake isotropic model. The methods can be ranked from simple to complex in terms of automation as FPS, FCS, and SR. This is a consequence of the facts that (i) peak identification consists basically of a sorting procedure, (ii) centroid estimation involves basically the evaluation of two well-behaved integrals, and (iii) implementation of the SR method involves at least choosing a usable frequency bandwidth and fitting a gradient. The methods can be ranked from robust to sensitive to noise content in the sequence SR, FCS, and FPS. This is a consequence of the facts that (i) the gradient estimate associated with the SR method averages out the noise content over the entire usable frequency bandwidth, (ii) in the presence of moderate-to-high noise levels, the centroid estimation is biased towards overestimating Q due to the noise contribution in the tail of the amplitude spectrum, and (iii) peak identification is unstable due to local noise fluctuations in the amplitude spectrum around the peak frequency. Regarding the stability of the estimates relative to the amount of attenuation, the SR and FCS methods show similar behaviours, whereas the FPS method performs worse. This is an indirect consequence of the sensitivity of the FPS method to noise content, because the higher the attenuation, the lower the signal-to-noise ratio. Finally, regarding the robustness of the methods to the presence of dipping layers, only the SR and FCS methods provide good estimates, at least for dips typical of non-faulted sedimentary layers, with the estimates obtained with the SR method being more accurate than those obtained with the FCS method. Except for automation complexity, which is less important than the performance of the methods, the SR method was superior or similar in performance to the FCS method in all scenarios we tried.
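The SR method favored by the comparison reduces to fitting a gradient to a log spectral ratio. A minimal sketch, assuming two windowed arrivals separated by a traveltime delta_t and a user-chosen usable band [fmin, fmax]; the function name and interface are illustrative.

```python
import numpy as np

def q_spectral_ratio(early, late, dt, delta_t, fmin, fmax):
    """Effective Q between two windowed arrivals via the spectral ratio:
    ln|A_late(f)/A_early(f)| = -pi * f * delta_t / Q + const, so Q follows
    from the gradient fitted over the usable band [fmin, fmax]."""
    n = max(len(early), len(late))
    f = np.fft.rfftfreq(n, dt)
    a1 = np.abs(np.fft.rfft(early, n))
    a2 = np.abs(np.fft.rfft(late, n))
    band = (f >= fmin) & (f <= fmax) & (a1 > 0.0) & (a2 > 0.0)
    slope, _ = np.polyfit(f[band], np.log(a2[band] / a1[band]), 1)
    return -np.pi * delta_t / slope
```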
Journal Article
On the elastic wave equation in weakly anisotropic VTI media
The knowledge of the wave equation is of fundamental importance for a sound understanding of the phenomena of wave propagation. However, it is unsatisfactory and inefficient to work with the full anisotropic wave equation in media that exhibit certain symmetries. We derive a specific elastic wave equation for weakly anisotropic VTI media by linearizing the expression of the stiffness tensor in terms of the Thomsen parameters. The resulting wave equation is a system of three coupled differential equations for the three components of the displacement vector. For δ = 0, the third equation becomes an independent equation for the third component of the particle displacement, identical to the isotropic situation, while the first two equations remain coupled. Using zero-order ray theory, we derive the associated eikonal and transport equations for q-P, q-SV, and q-SH waves. These are finally reduced to the pseudo-acoustic case, where the vertical S-wave velocity is set to zero. This allows for a better understanding of the pseudo-S-wave artefact in such media.
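For reference, linearization in the Thomsen parameters (ε, δ, γ) connects, at the ray-theory level, to the classical weak-anisotropy phase velocities of Thomsen (1986). The abstract does not quote them, but these are the standard forms consistent with such eikonal equations:

```latex
v_{P}(\theta)  \approx v_{P0}\left(1 + \delta \sin^2\theta \cos^2\theta + \epsilon \sin^4\theta\right), \\
v_{SV}(\theta) \approx v_{S0}\left(1 + \frac{v_{P0}^2}{v_{S0}^2}\,(\epsilon - \delta) \sin^2\theta \cos^2\theta\right), \\
v_{SH}(\theta) \approx v_{S0}\left(1 + \gamma \sin^2\theta\right),
```

where θ is the phase angle measured from the vertical symmetry axis. Note that the q-SV anisotropic term vanishes identically only for ε = δ (elliptical anisotropy), which is one way to see why a pseudo-S-wave artefact survives the pseudo-acoustic reduction.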
Journal Article
Using SVD filters for velocity analysis and ground-roll attenuation
This study investigates an adaptive filtering approach based on the singular value decomposition (SVD) to improve velocity analysis and ground-roll attenuation. SVD filtering is an adaptive multichannel filtering method in which each filtered seismic trace keeps a degree of coherence with its immediate neighboring traces. Before the adaptive filtering is applied, the seismogram is corrected using the normal moveout (NMO) method in order to flatten the primary reflections. The SVD filtering helps to strengthen the spatial coherence of reflectors. It works as a multichannel filter and can be applied by selecting a set of seismic traces taken from around the target trace. Thus, traces from different shots can be represented by a five-point areal operator, which we call the five-point cross operator. In this paper, we run this operator along the coverage map of the seismic survey. At each operator position, the filtered trace (at the center of the operator) is obtained by taking the first eigenimage or by adding the first few eigenimages. Thereby we enhance the coherence corresponding to the primary reflections to the detriment of the remaining events (ground roll, multiples, and other uncorrelated events), which are left in the other eigenimages. The method was tested on a seismic line from the Tacutu Basin, Brazil. The results show velocity spectra with better definition, as well as a post-stack section exhibiting better continuity of seismic reflections and lower noise, compared with the raw processing results (without SVD filtering).
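A minimal sketch of the eigenimage step, assuming a panel of NMO-corrected traces arranged as a time-by-traces matrix; the function name and the fixed rank k are illustrative, and assembling the five-point cross operator around each target trace is left out.

```python
import numpy as np

def svd_filter(panel, k=1):
    """Keep the first k eigenimages of a panel of NMO-corrected traces
    (rows: time samples, columns: traces). With primaries flattened, the
    leading eigenimages carry the laterally coherent reflection energy;
    ground roll, multiples, and uncorrelated noise are left in the
    discarded eigenimages."""
    u, s, vt = np.linalg.svd(panel, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k, :]
```

In the five-point cross operator described above, `panel` would hold the target trace and its four neighbors on the coverage map, and the filtered output is the center column of the reconstructed panel.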
Journal Article
True-amplitude single-stack redatuming
Based on the chaining of diffraction-stack migration and isochron-stack demigration, we derive a general true-amplitude Kirchhoff-type single-stack operator for 3D and 2.5D redatuming. It consists of performing a single weighted stack along adequately chosen stacking surfaces or lines. The corresponding traveltimes and weight functions can be calculated using quantities obtained from dynamic ray tracing. The operator simplifies when specified for zero-offset data. For simple types of media, we derive analytic expressions for the stacking lines and weight functions and demonstrate their functionality with numerical examples.
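The core numerical operation is a single weighted stack along a precomputed stacking line. A schematic 2D sketch follows, where `traveltime` and `weight` are hypothetical placeholders standing in for the ray-theoretical quantities the paper derives from dynamic ray tracing.

```python
import numpy as np

def single_stack(data, dt, traveltime, weight, xi_out, t_out):
    """Kirchhoff-type single-stack operator (2D sketch): one output sample
    at (xi_out, t_out) is a weighted sum of input samples picked along the
    stacking line t = traveltime(i, xi_out, t_out).
    data: array of shape (n_traces, n_samples);
    traveltime, weight: callables standing in for the true-amplitude
    quantities obtained from dynamic ray tracing."""
    out = 0.0
    for i in range(data.shape[0]):
        t = traveltime(i, xi_out, t_out)
        j = int(round(t / dt))
        if 0 <= j < data.shape[1]:
            out += weight(i, xi_out, t_out) * data[i, j]
    return out
```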
Journal Article
Symplectic scheme and the Poynting vector in reverse-time migration
We developed a new numerical solution for the wave equation that combines symplectic integrators and the rapid expansion method (REM). This solution can be used for seismic modeling and reverse-time migration (RTM). In seismic modeling and RTM, spatial derivatives are usually calculated by finite differences (FDs) or by the Fourier method, and the time evolution is normally obtained by a second-order FD approach. If the spatial derivatives are computed by higher-order FD schemes, the time step needs to be small enough to avoid numerical dispersion, thereby increasing the computational time. However, by using REM with the Fourier method for the spatial derivatives, the proposed method can propagate the wavefield with larger time steps. Moreover, if an appropriate number of expansion terms is chosen, the method is unconditionally stable and propagates seismic waves free of numerical dispersion. The symplectic numerical scheme provides the solution of the wave equation and its first time derivative at the current time step. Thus, the Poynting vector can also be computed during the time extrapolation process at very low computational cost. Based on the Poynting vector information, we also used a new methodology to separate the wavefield into its upgoing and downgoing components. Additionally, the Poynting vector components can be used to compute common gathers in the reflection-angle domain, and the stack of a few angle gathers can be used to eliminate the low-frequency noise produced by the RTM imaging condition. We numerically evaluated the applicability of the proposed method to extrapolate a wavefield with a time step larger than those commonly used by symplectic methods, as well as the efficiency of this new symplectic method combined with REM in handling the Poynting vector calculation.
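Since the symplectic step yields both the wavefield u and its time derivative at the same time level, the acoustic Poynting vector P = -(du/dt) grad(u) is a cheap by-product. A minimal sketch on a 2D snapshot; the grid-spacing handling and the up/down sign convention are illustrative assumptions, not the paper's exact methodology.

```python
import numpy as np

def poynting(u, u_t, dz, dx):
    """Acoustic Poynting vector P = -(du/dt) * grad(u) from a snapshot u
    and its time derivative u_t (both available at the same time level in
    the symplectic scheme). Returns (Pz, Px)."""
    du_dz, du_dx = np.gradient(u, dz, dx)
    return -u_t * du_dz, -u_t * du_dx

def split_up_down(u, pz):
    """Up/down separation from the sign of the vertical Poynting component
    (sign convention assumed here: positive z points downward)."""
    down = np.where(pz > 0.0, u, 0.0)
    up = np.where(pz <= 0.0, u, 0.0)
    return up, down
```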
Journal Article
Signal and traveltime parameter estimation using singular value decomposition
Signal detection and traveltime parameter estimation can be performed by computing a coherence function in a data window centered around a traveltime function defined by its parameters. We used the singular value decomposition of the data matrix, rather than the eigendecomposition of a covariance matrix, to review the most commonly used coherence measures. This resulted in a new reduced semblance coefficient defined from the first eigenimage, assuming that the signal amplitude was the same on all data channels (as in classical semblance). In a second signal model, the time signal was constant on each channel, but its amplitude changed from channel to channel. In that case, the semblance coefficient is the square of the first singular value divided by the energy of the data. Two normalized crosscorrelation coefficients derived from the first eigenimage can also be used as coherence measures: the normalized crosscorrelation of the spatial singular vector with a vector whose elements are all equal to one, and the normalized crosscorrelation of the temporal singular vector with the average time signal (the stacked trace). We defined a multiple signal classification (MUSIC) measure as the inverse of one minus any of the normalized coherence measures described above. To reduce the numerical range, we preferred to use log10 MUSIC. Numerical examples with different coherence measures applied to seismic velocity analysis of synthetic and real data revealed that the normalized crosscorrelation coefficients performed poorly and that log MUSIC gave no resolution enhancement on real data. The normalized eigenimage-energy coherence measure performed poorly on synthetic data but gave the best result for a simulated reflection with a polarity reversal. It also gave good time resolution on the real data. The classical semblance coefficient and the reduced semblance coefficient gave similar results, with the reduced semblance coefficient having better resolution.
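For concreteness, a minimal sketch of two of the measures discussed: classical semblance and the eigenimage-energy coherence (square of the first singular value over the data energy), plus the log10 MUSIC transform. Function names are illustrative.

```python
import numpy as np

def semblance(window):
    """Classical semblance of a data window (rows: time, columns: channels):
    energy of the stacked trace divided by the total energy times the
    number of channels."""
    n_ch = window.shape[1]
    stack = window.sum(axis=1)
    return np.sum(stack ** 2) / (n_ch * np.sum(window ** 2))

def eigenimage_energy(window):
    """Eigenimage-energy coherence: square of the first singular value over
    the energy of the data, the measure the abstract associates with a
    channel-dependent signal amplitude."""
    s = np.linalg.svd(window, compute_uv=False)
    return s[0] ** 2 / np.sum(s ** 2)

def log_music(coherence):
    """log10 of the MUSIC measure 1 / (1 - coherence), used to compress
    the numerical range."""
    return np.log10(1.0 / (1.0 - coherence))
```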
Journal Article
Entropic Regularization to Assist a Geologist in Producing a Geologic Map
Gravity and magnetic data measured on the Earth's surface or above it (collected from an aircraft flying at low altitude) can be used to assist geologic mapping by estimating the spatial density and magnetization distributions, respectively, presumed to be confined to the interior of a horizontal slab with known depths to the top and bottom. To estimate the density or magnetization distribution, we assume a piecewise-constant function defined on a user-specified grid of cells and invert the gravity or magnetic data using entropic regularization as a stabilizing function that allows the estimation of abrupt changes in the physical-property distribution. The entropic regularization combines the minimization of the first-order entropy measure with the maximization of the zeroth-order entropy measure of the solution vector. The aim of this approach is to detect sharp-bounded geologic units through the discontinuities in the estimated density or magnetization distributions. Tests conducted with synthetic data show that the entropic regularization can delineate discontinuous geologic units, allowing a better mapping of sharp-bounded (but buried) geologic bodies. We demonstrate the potential of the entropic regularization to assist a geologist in producing a geologic map by analyzing the magnetization distributions estimated from field magnetic data over a magnetic skarn in Butte Valley, Nevada, U.S.A. We show that it is an exoskarn, where the ion exchange between the intrusion and the host rock occurred along a limited portion of the southern border of the intrusion.
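A minimal 1D sketch of the two entropy measures the stabilizer combines; the paper works on a 2D grid of cells, and the exact normalization is an assumption here.

```python
import numpy as np

def shannon_entropy(v, eps=1e-12):
    """Shannon entropy measure of a non-negative vector after normalization
    to unit sum (eps guards against log of zero)."""
    p = v / (np.sum(v) + eps)
    return -np.sum(p * np.log(p + eps))

def entropic_measures(m):
    """Zeroth-order entropy q0 of the parameter vector and first-order
    entropy q1 of its adjacent differences (1D sketch). The entropic
    stabilizer maximizes q0 while minimizing q1, favoring abrupt,
    sharp-bounded physical-property distributions."""
    q0 = shannon_entropy(np.abs(m))
    q1 = shannon_entropy(np.abs(np.diff(m)))
    return q0, q1
```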
Journal Article
Reconstruction of geologic bodies in depth associated with a sedimentary basin using gravity and magnetic data
We present a comprehensive review of the most common gravity and magnetic interpretation methods for retrieving, in depth, the geometry of geologic bodies associated with a sedimentary basin. We identify three types of bodies: 1) the basement relief of the sedimentary basin, 2) salt bodies, and 3) mafic intrusions in a sedimentary section. For reconstructing the basement topography from gravity and/or magnetic data, we identify three groups of methods: automatic, spectral, and nonspectral methods. The reconstruction of salt bodies from gravity data usually relies on interactive forward modelling, but gravity inversion methods have recently been developed to interpret this kind of geologic environment. Finally, for reconstructing intrusive bodies using magnetic and/or gravity data, three strategies are employed to interpret mafic or ultramafic intrusions in a sedimentary section: automatic methods, interactive forward modelling, and inversion methods.