Scientific Production

**Journal Article**

**Offset-continuation stacking: Theory and proof of concept**

The offset-continuation operation (OCO) is a seismic configuration transform designed to simulate a seismic section as if obtained with a certain source-receiver offset, using data measured with another offset. Based on this operation, we have introduced the OCO stack, a multiparameter stacking technique that transforms 2D/2.5D prestack multicoverage data into a stacked common-offset (CO) section. Like the common-midpoint and common-reflection-surface stacks, the OCO stack does not rely on an a priori velocity model but provides velocity information itself. Because the OCO depends on the velocity model used in the process, the method can be combined with trial-stacking techniques for a set of models, thus allowing the extraction of velocity information. The algorithm stacks the data along so-called OCO trajectories, which approximate the common-reflection-point trajectory, i.e., the position of a reflection event in the multicoverage data as a function of source-receiver offset, in dependence on the medium velocity and the local event slope. These trajectories are the ray-theoretical solutions to the OCO image-wave equation, which describes the continuous transformation of a CO reflection event from one offset to another. Stacking along trial OCO trajectories for different values of average velocity and local event slope allows us to determine horizon-based optimal parameter pairs and a final stacked section at arbitrary offset. Synthetic examples demonstrate that the OCO stack works as predicted, almost completely removing random noise added to the data and successfully recovering the reflection events.
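The trial-stacking idea can be sketched in a few lines. The sketch below is not the OCO trajectory itself (which solves the OCO image-wave equation); as a simplified stand-in it stacks along plain hyperbolic moveout trajectories and keeps, for each zero-offset time, the trial velocity with the best semblance, which is the same search pattern the OCO stack applies to its trajectories. All names here are illustrative.

```python
import numpy as np

def trial_stack(data, t0_axis, offsets, velocities, dt):
    """Trial stacking with semblance selection.  For each zero-offset
    time t0 and each trial velocity v, amplitudes are collected along a
    hyperbolic trajectory t(x) = sqrt(t0**2 + (x/v)**2), and the
    (velocity, stack) pair with the highest semblance is kept.
    `data` has shape (n_time_samples, n_offsets)."""
    n_t = len(t0_axis)
    best_stack = np.zeros(n_t)
    best_vel = np.zeros(n_t)
    best_semb = np.full(n_t, -1.0)
    for v in velocities:
        for i, t0 in enumerate(t0_axis):
            t = np.sqrt(t0 ** 2 + (offsets / v) ** 2)  # trial trajectory
            idx = np.round(t / dt).astype(int)
            valid = idx < data.shape[0]
            if valid.sum() < 2:
                continue
            amps = data[idx[valid], np.nonzero(valid)[0]]
            # semblance: coherence of the amplitudes along the trajectory
            semb = amps.sum() ** 2 / (len(amps) * (amps ** 2).sum() + 1e-12)
            if semb > best_semb[i]:
                best_semb[i] = semb
                best_vel[i] = v
                best_stack[i] = amps.mean()
    return best_stack, best_vel, best_semb
```

On a synthetic gather containing one hyperbolic event, the semblance peaks at the event's zero-offset time for the correct trial velocity, which is the mechanism behind the horizon-based parameter extraction described above.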

**Journal Article**

**Estimation of quality factor based on peak frequency-shift method and redatuming operator: Application in real data set**

Quality-factor estimation and correction are necessary to compensate for the seismic energy dissipated during acoustic-/elastic-wave propagation in the earth. In this process, known as Q-filtering in the realm of seismic processing, the main goal is to improve the resolution of the seismic signal, as well as to recover part of the energy dissipated by anelastic attenuation. We have found a way to improve Q-factor estimation from seismic reflection data. Our methodology is based on the combination of the peak-frequency-shift (PFS) method and the redatuming operator. Our innovation lies in the way we correct traveltimes when the medium consists of many layers: the traveltime table used in the PFS method is corrected using the redatuming operator. This operation, performed iteratively, allows a more accurate estimation of the Q factor layer by layer. Applications to synthetic and real data (Viking Graben) demonstrate the feasibility of our analysis.
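The peak-frequency-shift step can be illustrated with a minimal sketch. It assumes a Ricker source wavelet, for which a commonly used relation links Q to the downshift of the peak frequency; the function names are illustrative, and the layer-by-layer redatuming correction of the traveltime table is not reproduced here.

```python
import numpy as np

def peak_frequency(signal, dt):
    """Peak of the amplitude spectrum, located by FFT argmax."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), dt)
    return freqs[np.argmax(spec)]

def q_from_peak_shift(fm, fp, t):
    """Q from the downshift of a Ricker wavelet's peak frequency after
    traveltime t, using the relation commonly quoted for the PFS method:
        Q = pi * t * fp * fm**2 / (2 * (fm**2 - fp**2))
    fm: source (initial) peak frequency; fp: observed peak frequency."""
    return np.pi * t * fp * fm ** 2 / (2.0 * (fm ** 2 - fp ** 2))
```

For example, a 30 Hz Ricker wavelet whose peak has shifted down to about 19 Hz after 1 s of propagation corresponds to a Q of roughly 50 under this relation.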

**Journal Article**

**An algorithm for wave propagation analysis in stratified poroelastic media**

The classic poroelastic theory of Biot, developed in the 1950s, describes the propagation of elastic waves through a porous medium containing a fluid. This theory has been extensively used in various fields dealing with porous media: seismic exploration, oil/gas reservoir characterization, environmental geophysics, earthquake seismology, etc. In this work we use the Ursin formalism to derive explicit formulas for the analysis of elastic-wave propagation through stratified 3D porous media, where the parameters of the medium are characterized by piecewise-constant functions of only one spatial variable, depth.

Keywords: poroelasticity; Biot system; low-frequency range; layered media; Ursin algorithm.

**Journal Article**

**Relief geometric effects on frequency-domain electromagnetic data**

A perpendicular transmitter-receiver coil arrangement used in frequency-domain electromagnetic (FDEM) surveys can deviate from its standard geometric definition due to the relief of the surveyed area, when combined with a large transmitter-receiver distance and a large transmitter loop. This happens because the local relief tilts the equivalent magnetic-moment axis away from the vertical and places the transmitter and receiver at different elevations. We study this effect by substituting the rugged relief with an inclined plane. We have developed a new formulation for the n-layered model that allowed us to investigate relief-geometry effects on FDEM data, restricting the analysis to the two-layer earth model and considering three cases of transmitter-receiver situations controlled by the relief model. These procedures proved very useful for comparing the resulting curves with those obtained for an inclined and a horizontal ground. The results show that small deviations in the verticality of the transmitter-loop axis or in the horizontality of the surficial plane cause significant deviations in the data, even for angles as small as 1°.

**Journal Article**

**How much averaging is necessary to cancel out cross-terms in noise correlation studies?**

We present an analytical approach to jointly estimate the correlation window length and the number of correlograms to stack in ambient noise correlation studies, so as to statistically ensure that noise cross-terms cancel out to within a chosen threshold. These estimates provide the minimum amount of data necessary to extract coherent signals in ambient noise studies using noise sequences filtered in a given frequency bandwidth. The inputs for the estimation process are (1) the variance of the cross-correlation energy density calculated over an elementary time length equal to the largest period present in the filtered data and (2) the threshold below which the noise cross-terms should fall in the final stacked correlograms. The theory explains how to adjust the required correlation window length and number of stacks when changing from one frequency bandwidth to another. In addition, it provides a simple way to monitor stationarity in the noise. The validity of the deduced expressions has been confirmed with numerical cross-correlation tests using both synthetic and field data.

Keywords: time-series analysis; interferometry.
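A tiny numerical experiment illustrates the cross-term decay the paper quantifies: correlating two independent noise sequences produces pure cross-terms, and their RMS in the stacked correlogram shrinks roughly like 1/sqrt(N) with the number of stacks. This is only an illustrative sketch, not a reproduction of the paper's statistics; names are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def stacked_crosscorr(n_stacks, win_len):
    """Average cross-correlation of two INDEPENDENT noise sequences.
    With no common signal, everything in the correlogram is a noise
    cross-term; stacking should drive its RMS toward zero roughly
    like 1/sqrt(n_stacks)."""
    acc = np.zeros(2 * win_len - 1)
    for _ in range(n_stacks):
        a = rng.standard_normal(win_len)
        b = rng.standard_normal(win_len)
        acc += np.correlate(a, b, mode="full")
    return acc / n_stacks

rms_10 = np.sqrt(np.mean(stacked_crosscorr(10, 512) ** 2))
rms_1000 = np.sqrt(np.mean(stacked_crosscorr(1000, 512) ** 2))
# 100x more stacks should cut the cross-term RMS by about a factor of 10
```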

**Journal Article**

**Automatic data extrapolation to zero offset along local slope**

Velocity-independent seismic data processing requires information about the local slope in the data. From estimates of the local time and space derivatives of the data, a total least-squares algorithm gives an estimate of the local slope at each data point. Total least squares minimizes the orthogonal distance from the data points (the local time and space derivatives) to the fitted straight line defining the local slope. This gives a more consistent estimate of the local slope than standard least squares because it accounts for uncertainty in both the temporal and spatial derivatives. The total least-squares slope estimate is the same as the one obtained from the structure tensor with a rectangular window function. The estimated local slope field is used to extrapolate all traces in a seismic gather to the smallest recorded offset without using velocity information. Extrapolation to zero offset is done using a hyperbolic traveltime function in which slope information replaces knowledge of the normal-moveout (NMO) velocity. The new method requires no velocity analysis, and there is little stretch effect. All major reflections and diffractions present at zero offset are reproduced in the output zero-offset section. Therefore, if multiple reflections are undesired in the output, they should be removed before data extrapolation to zero offset. The automatic method is sensitive to noise, so for poor signal-to-noise ratios, standard NMO velocities for primary reflections can be used to compute the slope field. Synthetic and field data examples indicate that, compared with standard seismic data processing (velocity analysis, mute, NMO correction, and stack), our method provides an improved zero-offset section in complex data areas.
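For a single analysis window, the TLS slope estimate described above reduces to an eigenvector problem on the structure tensor. A minimal sketch, assuming one dominant plane-wave event per window (the function name is illustrative):

```python
import numpy as np

def tls_slope(patch, dt, dx):
    """Total-least-squares estimate of a single local slope p for a
    data patch u(t, x).  A plane-wave event u(t - p*x) satisfies
    u_x + p*u_t = 0, so the gradient samples (u_t, u_x) scatter along
    a line of slope -p through the origin.  The TLS fit of that line
    is the principal eigenvector of the 2x2 structure tensor, which a
    rectangular window reduces to plain sums over the patch."""
    ut, ux = np.gradient(patch, dt, dx)
    a = np.sum(ut * ut)          # structure-tensor entries
    b = np.sum(ut * ux)
    c = np.sum(ux * ux)
    # angle of the principal eigenvector of [[a, b], [b, c]]
    theta = 0.5 * np.arctan2(2.0 * b, a - c)
    return -np.tan(theta)
```

Applied point-by-point with a sliding rectangular window, the same formula yields the slope field used for the extrapolation to zero offset.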

**Journal Article**

Parallel Scalability of a Fine-Grain Prestack Reverse Time Migration AlgorithmSeismic imaging has evolved significantly due to the high demand from the oil/gas industry for hardware technological advancements, boosting the development of more sophisticated algorithms. In order to deliver the quality and accuracy required, the execution of these algorithms may lead to time infeasible solutions. Aiming at performance improvement, this work conducted the parallelization of the core of a reverse time migration (RTM) algorithm. Furthermore, analysis such as speedup and efficiency was performed in order to assess the scalability of the proposed method. While the many parallelization efforts so far deal with coarse-grain approaches, this letter tackles the intrashot fine-grain parallelization of prestack RTM, which increases the overall concurrency degree of the algorithm. Results using 2-D synthetic data show that the proposed approach is scalable, which means that an increase in hardware resources and/or in problem size will lead to a proportional increase in speed and/or accuracy. |
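The intra-shot fine-grain parallelism lives in the wave-propagation stencil at the core of RTM: within one time step, every grid point is updated independently of the others. A minimal sketch of such a kernel, here expressed as a vectorized NumPy update rather than a genuinely threaded one, and not the letter's actual implementation:

```python
import numpy as np

def acoustic_step(p_prev, p_cur, vel2_dt2, dx):
    """One explicit time step of the 2D acoustic wave equation,
    second order in time and space.  Each grid point's update depends
    only on the previous two wavefields, never on this step's other
    outputs, so all points can be computed concurrently; that
    independence is the fine-grain unit of parallelism.
    vel2_dt2 = (velocity * dt)**2 (scalar or per-cell array)."""
    lap = np.zeros_like(p_cur)
    # 5-point Laplacian on the interior (boundaries held at zero)
    lap[1:-1, 1:-1] = (
        p_cur[2:, 1:-1] + p_cur[:-2, 1:-1]
        + p_cur[1:-1, 2:] + p_cur[1:-1, :-2]
        - 4.0 * p_cur[1:-1, 1:-1]
    ) / dx ** 2
    return 2.0 * p_cur - p_prev + vel2_dt2 * lap
```

In a real RTM kernel this per-point update is what gets mapped onto threads or GPU lanes; the time loop itself remains sequential because each step needs the previous two wavefields.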

**Journal Article**

**Mapping Neogene and Quaternary sedimentary deposits in northeastern Brazil by integrating geophysics, remote sensing and geological field data**

Neogene and late Quaternary sedimentary deposits, corresponding respectively to the Barreiras Formation and the Post-Barreiras Sediments, are abundant along the Brazilian coast. Such deposits are valuable for reconstructing sea-level fluctuations and recording tectonic reactivation along the passive margin of South America. Despite this relevance, much effort remains to be invested in discriminating these units in their various areas of occurrence. The main objective of this work is to develop and test a new methodology for semi-automated mapping of Neogene and late Quaternary sedimentary deposits in northeastern Brazil, integrating geophysical and remote sensing data. The central onshore Paraíba Basin was selected due to the recent availability of a detailed map based on the integration of surface and subsurface geological data. We used airborne gamma-ray spectrometry (i.e., potassium-K and thorium-Th concentrations) and morphometric data (i.e., relief dissection, slope and elevation) extracted from the digital elevation model (DEM) generated by the Shuttle Radar Topography Mission (SRTM). The procedures included: (a) data integration using geographic information systems (GIS); (b) exploratory statistical analyses, including the definition of parameters and thresholds for class discrimination for a set of sample plots; and (c) development and application of a decision-tree classification. Data validation was based on: (i) statistical analysis of geochemical and airborne gamma-ray spectrometry data consisting of K and Th concentrations; and (ii) map validation with the support of a confusion matrix, overall accuracy, and the quantity-disagreement and allocation-disagreement coefficients for accuracy assessment based on field points.

The concentration of K successfully separated the sedimentary units of the basin from Precambrian basement rocks. The relief-dissection morphometric variable allowed the discrimination between the Barreiras Formation and the Post-Barreiras Sediments. In addition, two units of the latter (i.e., PB1 and PB2) previously mapped in the field were promptly separated based on Th concentration. A regression analysis indicated that the relationship between the geophysical and geochemical values obtained for PB1, PB2 and the Barreiras Formation is significant (R² = 0.91; p-value < 0.05). Map validation presented a high overall accuracy of 84%, with a quantity-disagreement coefficient of 12% and an allocation-disagreement coefficient of 8%. These results indicate that the methodology applied in the central onshore Paraíba Basin can be successfully used for mapping the Barreiras Formation and Post-Barreiras Sediments in other areas of the Brazilian coast. The ability to rapidly and precisely map these units could reveal their geographic distribution along the northeastern coast of Brazil.
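The decision-tree step can be sketched as a few nested thresholds on the three discriminating variables described above. All threshold values below are hypothetical placeholders, since the real cutoffs were derived from the paper's sample-plot statistics and are not reproduced here; the branch directions are likewise illustrative.

```python
def classify_pixel(k, th, dissection,
                   k_max=1.5, diss_max=0.3, th_max=4.0):
    """Toy decision tree in the spirit of the paper's workflow:
    K separates basement from sediments, relief dissection separates
    Barreiras from Post-Barreiras, and Th splits the Post-Barreiras
    units PB1 and PB2.  All thresholds are HYPOTHETICAL placeholders.
    k: K concentration (%), th: Th concentration (ppm),
    dissection: relief-dissection index."""
    if k > k_max:
        return "basement"      # high K -> Precambrian basement rocks
    if dissection > diss_max:
        return "Barreiras"     # illustrative branch on relief dissection
    # Post-Barreiras: Th concentration separates its two units
    return "PB1" if th <= th_max else "PB2"
```

In the actual workflow this classifier runs over co-registered raster layers in a GIS, one pixel at a time, and its output map is what the confusion matrix validates.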

**Journal Article**

**Neotectonic reactivation of shear zones and implications for faulting style and geometry in the continental margin of NE Brazil**

The eastern continental margin of South America comprises a series of rift basins developed during the breakup of Pangea in the Jurassic–Cretaceous. We integrated high-resolution aeromagnetic, structural and stratigraphic data in order to evaluate the role of reactivation of ductile Neoproterozoic shear zones in the deposition and deformation of post-rift sedimentary deposits in one of these basins, the Paraíba Basin in northeastern Brazil. This basin corresponds to the last part of the South American continent to separate from Africa during the Pangea breakup. Sediment deposition in this basin occurred in the Albian–Maastrichtian, the Eocene–Miocene, and the late Quaternary. However, our investigation concentrates on the Miocene–Quaternary, which we consider the neotectonic period because it encompasses the last stress field, consisting of an E–W-oriented compression and a N–S-oriented extension. The basement of the basin forms a slightly seaward-tilted ramp capped by a late Cretaceous to Quaternary sedimentary cover ~100–400 m thick. Aeromagnetic lineaments mark the major steeply dipping, ductile E–W- to NE-striking shear zones in this basement. The ductile shear zones were mainly reactivated as strike-slip, normal and oblique-slip faults, resulting in a series of Miocene–Quaternary depocenters controlled by NE-, E–W-, and a few NW-striking faults. Faulting produced subsidence and uplift that are largely responsible for the present-day morphology of the valleys and tablelands in this margin. We conclude that Precambrian shear-zone reactivation controlled the geometry, orientation and deformation of the sedimentary deposits until the Neogene–Quaternary.

**Journal Article**

**GPR investigation of karst guided by comparison with outcrop and unmanned aerial vehicle imagery**

The increasing importance of carbonate rocks as aquifers, as oil reservoirs, and for urban problems demands detailed characterization of karst systems, a demand that can be partially satisfied with GPR imaging. However, imaging and interpreting karstified carbonate rocks is notoriously difficult due to the complex geometry of the dissolution features and the intrinsic limitations of GPR. One way forward is the direct comparison of GPR images with similar outcropping rocks. A joint study involving a 200 MHz GPR survey, unmanned aerial vehicle (UAV) imagery, and outcrop characterization is presented, aiming to improve the interpretation of sedimentary structures, fractures and karst structures in GPR images. The study area is a 500 m wide and 1000 m long carbonate outcrop of the Jandaíra Formation in the Potiguar Basin, Brazil, where sedimentary, fracture, and karst features can be directly investigated in both vertical and horizontal plan views. The key elements to interpret GPR images of karstified carbonate rocks are: (1) primary sedimentary structures appear in radargrams as unaltered imaged strata, but care must be taken when interpreting complex primary sedimentary features, such as those associated with bioturbation; (2) subvertical fractures may appear as consistent discontinuities in the imaged strata, forming complex structures such as negative flowers along strike-slip faults; (3) dissolution may create voids along subhorizontal layers, which appear in radargrams as relatively long amplitude shadow zones; and (4) dissolution may also create voids along subvertical fractures, appearing in radargrams as amplitude shadow zones with relatively large vertical dimensions, bounded by fractures.