Scientific Production

Journal Article
A wavefield domain dynamic approach: Application in reverse time migration
This paper proposes a novel technique to handle the wavefield domain involved in the procedures of seismic modeling, reverse-time migration (RTM), and full-waveform inversion (FWI). This method considers that the size of the wavefield domain varies with time, in other words, that it expands concomitantly with the propagation. However, in the geophysical literature, this dynamism has always been neglected, as the wavefield domain is constantly considered to be fixed, thus representing what we call a static approach (SA). This assumption may incur unnecessary use of available computational resources, thereby compromising application performance. Herein, we create a so-called dynamic approach (DA), capable of obtaining truly significant gains in terms of memory consumption and computational time. This new methodology is based on the application of an empirical filter that delimits the wavefront. This filter functions as a window and is applied at each timestep until the wavefront reaches the model's boundaries, selecting the area where the seismic wavefield exists. This approach brings the computational domain closer to the propagation domain, eliminating unnecessary work and thus reducing the cost of forward and backward propagation. We compare both approaches using the Pluto model. The seismic data generated from the Pluto model are very large, and it was not possible to use the static approach relying only on the random-access memory (RAM) of the available hardware. In order to perform the conventional RTM, we implement and compare the effective boundary technique for wavefield reconstruction with the RTM using the proposed dynamic approach. With the dynamic approach, it was possible to perform RTM of 2D seismic data obtained from the Pluto model using only the RAM of the computational nodes and without the need for reconstruction techniques.
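The core of the dynamic approach, as described above, is a window that grows with the wavefront so the finite-difference update only touches the region where the wavefield exists. The sketch below illustrates that idea under simplifying assumptions; the window-growth rule, the function names, and the second-order acoustic scheme are illustrative and are not the authors' implementation.

```python
# Minimal sketch of the dynamic-approach idea (not the authors' code): an active
# window around the wavefront is grown at each timestep so the finite-difference
# update covers only the region where the wavefield exists. The growth rule
# (max velocity * dt plus a small margin) is an illustrative assumption.
import numpy as np

def expand_window(win, vmax, dt, dx, shape, margin=2):
    """Grow the active window by the farthest distance the front can travel in dt."""
    grow = int(np.ceil(vmax * dt / dx)) + margin
    i0, i1, j0, j1 = win
    return (max(i0 - grow, 0), min(i1 + grow, shape[0]),
            max(j0 - grow, 0), min(j1 + grow, shape[1]))

def propagate(vel, src_pos, src_wavelet, dt, dx):
    """2D acoustic second-order scheme, updating only inside the dynamic window."""
    nz, nx = vel.shape
    prev, curr = np.zeros((nz, nx)), np.zeros((nz, nx))
    c2 = (vel * dt / dx) ** 2
    isrc, jsrc = src_pos
    win = (isrc, isrc + 1, jsrc, jsrc + 1)          # start at the source point
    for amp in src_wavelet:
        win = expand_window(win, vel.max(), dt, dx, (nz, nx))
        i0, i1, j0, j1 = win
        lap = np.zeros_like(curr)                   # Laplacian inside the window only
        lap[i0+1:i1-1, j0+1:j1-1] = (
            curr[i0:i1-2, j0+1:j1-1] + curr[i0+2:i1, j0+1:j1-1] +
            curr[i0+1:i1-1, j0:j1-2] + curr[i0+1:i1-1, j0+2:j1] -
            4.0 * curr[i0+1:i1-1, j0+1:j1-1])
        nxt = np.zeros_like(curr)
        nxt[i0:i1, j0:j1] = (2.0 * curr[i0:i1, j0:j1] - prev[i0:i1, j0:j1]
                             + c2[i0:i1, j0:j1] * lap[i0:i1, j0:j1])
        nxt[isrc, jsrc] += amp * dt * dt            # inject the source wavelet
        prev, curr = curr, nxt
    return curr
```

A full implementation would also allocate storage only for the window instead of the whole grid, which is where the memory savings reported in the paper come from.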

Journal Article
Multiphase flow mobility impact on oil reservoir recovery: An open-source simulation
This work uses Computational Fluid Dynamics (CFD) to simulate two-phase flow (oil and water) through a reservoir represented by a sandbox model. We investigated the influence on the flow of water having higher and lower mobility than oil. To accomplish this, we also developed a dedicated solver, with the appropriate equations and representative models, implemented on the open-source CFD platform OpenFOAM. In this solver, the black-oil model represented the oil. The results show that the Buckley–Leverett water-flood equation is a good approximation for the three-dimensional flow. We observe that the water front mixes to some extent with the oil and evolves according to an exponential law. Water with mobility lower than oil is not common; however, in this case, oil recovery is improved and the amount of injected water is reduced. The results comparing different mobilities show that a careful economic assessment should be performed before field development. We have shown that low water mobility can increase, as in the studied example, the water-front saturation from 0.57 to 0.73, yielding a substantial improvement in oil recovery. The reservoir simulation can provide all the process information needed to perform an economic assessment of an oil-field exploration.
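The dependence of the water-front saturation on the mobility ratio follows from the Buckley–Leverett fractional-flow curve and the Welge tangent construction. The sketch below illustrates that relationship; the Corey-type relative permeabilities, end-point values, and viscosities are illustrative assumptions, not the paper's reservoir data.

```python
# Hedged sketch of the Buckley-Leverett reasoning referenced above (not the
# paper's OpenFOAM solver): the fractional-flow curve f_w(S_w) depends on the
# water/oil mobility ratio, and the water-front saturation is the Welge tangent
# point drawn from the connate water saturation.
import numpy as np

def fractional_flow(sw, mu_w, mu_o, swc=0.2, sor=0.2, krw_max=0.3, kro_max=0.8, n=2.0):
    """Corey-type relative permeabilities (assumed) feeding the fractional flow."""
    s = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)
    krw = krw_max * s ** n
    kro = kro_max * (1.0 - s) ** n
    with np.errstate(divide="ignore", invalid="ignore"):
        fw = 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))
    return np.nan_to_num(fw)

def front_saturation(mu_w, mu_o, swc=0.2):
    """Welge construction: maximize the chord slope (f_w - 0) / (S_w - S_wc)."""
    sw = np.linspace(swc + 1e-4, 1.0, 2001)
    fw = fractional_flow(sw, mu_w, mu_o, swc=swc)
    slope = fw / (sw - swc)
    return sw[np.argmax(slope)]

# Lower water mobility (here, higher water viscosity) pushes the front
# saturation up, which is the mechanism behind the improved recovery above.
print(front_saturation(mu_w=1.0, mu_o=2.0))   # water more mobile than oil
print(front_saturation(mu_w=4.0, mu_o=2.0))   # water less mobile than oil
```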

Journal Article
Cloud-computing approach for an environmental, social, and corporate governance focus in universities and businesses
There is an increasing demand for high-performance computing in geophysical exploration applications, which implies more carbon emissions due to higher energy consumption. This, in turn, increases concern about the environmental and social impact that such consumption can generate. We show how cloud computing can handle these challenges simultaneously and thereby assist business leaders in their decision-making. Cloud computing is a paradigm in which users rent computing capacity from providers on a pay-as-you-go basis, thereby reducing the carbon footprint by up to 88%. The cloud can run software for years uninterrupted using the same capital required to acquire and run on-premises infrastructure, even if such infrastructure has over a thousand graphics processing units. However, managers must consider the challenges that arise from using the cloud, such as entrusting their data to a third-party server and the expenses accrued over the years, especially with storage.
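The capital-versus-pay-as-you-go comparison alluded to above reduces to simple cost arithmetic. The snippet below is a back-of-the-envelope sketch of that reasoning; all prices, power figures, lifetimes, and utilization levels are hypothetical placeholders, not values from the paper.

```python
# Hypothetical on-premises versus cloud cost comparison; every number below is
# a placeholder used only to illustrate the structure of the comparison.
def on_premises_cost(capex, power_kw, years, energy_price=0.15, opex_ratio=0.05):
    """Capital plus energy plus a flat maintenance fraction per year."""
    energy = power_kw * 24 * 365 * years * energy_price
    return capex + energy + capex * opex_ratio * years

def cloud_cost(hourly_rate, hours_per_year, years):
    """Pay-as-you-go: only the hours actually used are billed."""
    return hourly_rate * hours_per_year * years

# Example: a hypothetical GPU cluster used 30% of the time over 4 years.
print(on_premises_cost(capex=500_000.0, power_kw=40.0, years=4))
print(cloud_cost(hourly_rate=25.0, hours_per_year=0.30 * 8760, years=4))
```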

Journal Article
Faster and cheaper: How graphics processing units on spot-market instances minimize turnaround time and budget
Cloud computing is enabling users to instantiate and access high-performance computing clusters quickly. However, without proper knowledge of the type of application and the nature of the instances, it can become quite expensive. Our objective is to show that adequately choosing the instances provides a fast execution, which, in turn, leads to a low execution price under the pay-as-you-go model of cloud computing. We used graphics processing unit instances on the spot market to execute a seismic data-set interpolation job and compared their performance with regular on-demand central processing unit (CPU) instances. Furthermore, we explored how scaling could also improve the execution times at small price differences. The experiments have shown that, by using an instance with eight accelerators on the spot market, we obtain up to a 300 times speedup compared with the on-demand CPU options, while being 100 times cheaper. Finally, our results have shown that seismic-imaging processing can be sped up by an order of magnitude on a low budget, resulting in a faster and cheaper processing turnaround and enabling new possible imaging techniques.
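The trade-off between hourly price and runtime is simple arithmetic: a pricier instance that finishes far sooner still costs far less overall. The lines below sketch that calculation with hypothetical hourly rates and runtimes; only the 300x speedup figure comes from the abstract.

```python
# Hedged arithmetic behind the speedup-versus-price trade-off described above.
# Hourly prices and runtimes are hypothetical placeholders.
def job_cost(hourly_price, runtime_hours):
    return hourly_price * runtime_hours

cpu_runtime = 300.0                     # hours on an on-demand CPU instance (assumed)
gpu_runtime = cpu_runtime / 300.0       # a 300x speedup, as reported
cpu_cost = job_cost(hourly_price=1.0, runtime_hours=cpu_runtime)
gpu_cost = job_cost(hourly_price=3.0, runtime_hours=gpu_runtime)
print(cpu_cost / gpu_cost)              # ~100x cheaper despite a pricier instance
```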

Journal Article
Introduction of the Hessian in joint migration inversion and improved recovery of structural information using image-based regularization
Joint migration inversion (JMI) is a method based on one-way wave equations that aims at fitting seismic reflection data to estimate an image and a background velocity. The depth-migrated image describes the high spatial-frequency content of the subsurface and, in principle, is true amplitude. The background velocity model accounts mainly for the large spatial-scale kinematic effects of the wave propagation. Looking for a deeper understanding of the method, we briefly review the continuous equations that compose the forward-modeling engine of JMI for acoustic media and angle-independent scattering. Then, we use these equations together with the first-order adjoint-state method to arrive at a new formulation of the model gradients. To estimate the image, we combine the second-order adjoint-state method with the truncated-Newton method to obtain the image updates. For the model (velocity) estimation, in comparison to the image update, we reduce the computational cost by adopting a diagonal preconditioner for the corresponding gradient in combination with an image-based regularizing function. Based on this formulation, we build our implementation of the JMI algorithm. Our image-based regularization of the model estimate allows us to carry over structural information from the estimated image to the jointly estimated background model. As demonstrated by our numerical experiments, this procedure can help to improve the resolution of the estimated model and make it more consistent with the image.
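The image update described above follows a truncated-Newton pattern: the Newton system is solved only approximately with conjugate gradients, using Hessian-vector products supplied by the second-order adjoint state, while the velocity update relies on a cheaper diagonal preconditioner. The sketch below shows that generic pattern; the callables `grad` and `hessian_vec`, the CG settings, and the diagonal-scaling step are assumptions and not the authors' JMI engine.

```python
# Generic truncated-Newton step of the kind mentioned above (not the authors'
# JMI code): solve H dx = -g approximately by conjugate gradients, touching H
# only through Hessian-vector products.
import numpy as np

def truncated_newton_step(grad, hessian_vec, max_cg=10, tol=1e-3):
    """Return an approximate solution dx of H dx = -grad via CG iterations."""
    dx = np.zeros_like(grad)
    r = -grad.copy()          # residual of H dx = -g with dx = 0
    p = r.copy()
    rr = r @ r
    for _ in range(max_cg):
        hp = hessian_vec(p)
        alpha = rr / (p @ hp)
        dx += alpha * p
        r -= alpha * hp
        rr_new = r @ r
        if np.sqrt(rr_new) < tol * np.linalg.norm(grad):
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return dx

def preconditioned_velocity_update(grad_vel, diag_hessian, step, eps=1e-8):
    """Cheaper model update: scale the gradient by an approximate diagonal Hessian."""
    return -step * grad_vel / (np.abs(diag_hessian) + eps)
```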

Journal Article
Characterization of Seismic Noise in an Oil Field Using Passive Seismic Data from a Hydraulic Fracturing Operation
We use a 5-h-long experiment with 182 vertical 2-Hz velocity sensors deployed on the surface to characterize noise before and during a hydraulic fracturing monitoring experiment in the Potiguar Basin, NE Brazil. We observe that the seismic noise comes mainly from electromagnetic induction and machinery vibration near the wellhead and, within 2 km of the array center, from pumpjacks, pipelines, roads, and industrial facilities. We investigate the origin of the main recorded noise features using amplitude-decay analysis and beamforming. We also report a resonance, present only when injection takes place, that is most likely associated with body-wave energy coming from the treatment wellhead area. To assess the utility of such a data set for retrieving the shallow velocity of the area using ambient noise seismic interferometry (ANSI), different strategies were employed to cross-correlate and stack the data: classical geometrically normalized cross-correlation (CCGN), phase cross-correlation (PCC), linear stacking, and time-frequency phase-weighted stacking (tf-PWS). Because of the unsuitable distribution of the noise sources and the acquisition geometry, spurious arrivals arise in the correlograms. We propose a simple method to attenuate these unwanted effects, which consists of applying a linear moveout (LMO) correction, stacking the data in the shot domain, and f-k filtering. The correlograms and their corresponding dispersion curves are significantly improved.
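The correlogram clean-up mentioned above (LMO correction, shot-domain stacking, and f-k filtering) can be sketched with standard array-processing building blocks. The code below is illustrative only: the flattening velocity, sampling intervals, and filter mask are assumed values, and it is not the processing code used in the paper.

```python
# Hedged sketch of an LMO correction and an f-k velocity filter applied to a
# correlogram gather (traces x time samples); parameter values are assumptions.
import numpy as np

def lmo_correct(gather, offsets, velocity, dt):
    """Shift each trace by offset/velocity so the targeted arrival is flattened."""
    out = np.zeros_like(gather)
    nt = gather.shape[1]
    for i, off in enumerate(offsets):
        shift = int(round((off / velocity) / dt))
        if shift >= 0:
            out[i, :nt - shift] = gather[i, shift:]
        else:
            out[i, -shift:] = gather[i, :nt + shift]
    return out

def fk_filter(gather, dt, dx, vmin):
    """Mute f-k components whose apparent velocity is below vmin."""
    ntr, nt = gather.shape
    spec = np.fft.fft2(gather)
    k = np.fft.fftfreq(ntr, d=dx)
    f = np.fft.fftfreq(nt, d=dt)
    kk, ff = np.meshgrid(k, f, indexing="ij")
    mask = np.abs(ff) >= vmin * np.abs(kk)      # keep |f/k| >= vmin
    return np.real(np.fft.ifft2(spec * mask))

# Typical usage (hypothetical values): flatten the spurious arrival with an LMO
# at its apparent velocity, stack gathers sharing the same virtual shot, then
# f-k filter what remains.
```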

Journal Article
Adding Prior Information in FWI through Relative Entropy
Full waveform inversion is an advantageous technique for obtaining high-resolution subsurface information. In the petroleum industry, mainly in reservoir characterisation, it is common to use information from wells as prior information to decrease the ambiguity of the obtained results. For this, we propose adding a relative entropy term to the formalism of full waveform inversion. In this context, entropy is just a nomenclature for regularisation and has the role of helping the convergence to the global minimum. The application of entropy in inverse problems usually involves formulating the problem so that it is possible to use statistical concepts. To avoid this step, we propose a deterministic application to full waveform inversion. We discuss some aspects of relative entropy and show three different ways of using it to add prior information to the inverse problem. We use a dynamic weighting scheme to add the prior information through entropy. The idea is that the prior information can help to find the path to the global minimum at the beginning of the inversion process. In all cases, the prior information can be incorporated very quickly into full waveform inversion and lead the inversion to the desired solution. When we include in the inverse problem the logarithmic weighting that constitutes entropy, we suppress the low-intensity ripples and sharpen the point events. Thus, adding relative entropy to full waveform inversion can provide a result with better resolution. In regions where salt is present in the BP 2004 model, we obtained a significant improvement by adding prior information through relative entropy for synthetic data. We show that prior information added through entropy in the full-waveform-inversion formalism proves to be a way to avoid local minima.
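A deterministic relative-entropy penalty of the kind discussed above can be written as a KL-style term between the current model and a prior (for example, well-derived) model, added to the least-squares data misfit with a weight. The sketch below shows one such formulation; the normalization, the weight `lam`, and the function names are illustrative assumptions, not the paper's exact scheme.

```python
# Hedged sketch of a relative-entropy regularization term added to an FWI
# misfit; this is an illustrative formulation, not the authors' exact one.
import numpy as np

def relative_entropy(model, prior, eps=1e-12):
    """D(m || m_prior) = sum m_i * log(m_i / p_i) for positive, normalized fields."""
    m = model / (model.sum() + eps)
    p = prior / (prior.sum() + eps)
    return np.sum(m * np.log((m + eps) / (p + eps)))

def regularized_misfit(data_residual, model, prior, lam):
    """FWI objective: least-squares data misfit plus weighted relative entropy."""
    return 0.5 * np.sum(data_residual ** 2) + lam * relative_entropy(model, prior)
```

The logarithmic weighting in the penalty is what suppresses low-intensity ripples while sharpening strong events, which is the effect the abstract refers to.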

Journal Article
Structural and sedimentary discontinuities control the generation of karst dissolution cavities in a carbonate sequence, Potiguar Basin, Brazil
Epigenetic karstic systems in carbonate rocks commonly result from progressive dissolution by acidic meteoric waters over thousands to millions of years. The generation of secondary porosity and the improvement of permeability due to dissolution in carbonate reservoirs hosting geofluids (e.g., groundwater, hydrocarbons, and CO2) can profoundly impact reservoir storage capacity and subsurface fluid flow. This study investigates the control of structural discontinuities, such as stylolites and fractures, and primary sedimentary discontinuities on the generation of multiscale karst dissolution cavities by epigenetic fluid percolation in a Late Cretaceous carbonate sequence (Jandaíra Formation) in the Potiguar Basin, Northeastern Brazil. The study relies on micro- and macroscale analyses such as stratigraphic logs, field structural investigations, rock-strength data collected in the field (Schmidt hammer), microtomographic and drone images, thin-section analyses, and laboratory measurements of porosity and permeability. The results show that bed-perpendicular stratabound and non-stratabound stylolites and fractures can be enlarged due to meteoric water percolation until they merge and form a single channel system that crosscuts all sedimentary multilayers. Bed-parallel stylolites are ubiquitous in carbonate sequences, overprinting bed interfaces and layers. Where not dissolved, bed-parallel stylolites have low porosity and permeability and thus can act as barriers to vertical fluid flow. Where dissolved, such stylolites can contribute to horizontal fluid flow and form channel porosity. The results of this study led to the formulation of a conceptual model of rock dissolution along structural and sedimentary discontinuities that affects carbonate rock successions in the subsurface.

Journal Article
Image-guided ray tracing and its applications
Eikonal solvers have important applications in seismic data processing and inversion, the so-called image-guided methods. To this day, in image-guided applications, the solution of the eikonal equation is computed using partial-differential-equation solvers, such as fast-marching or fast-sweeping methods. We have found that, alternatively, one can numerically integrate the dynamic Hamiltonian system defined by the image-guided eikonal equation and reconstruct the solution with image-guided rays. We evaluate interesting applications of image-guided ray tracing to seismic data processing, demonstrating the use of the resulting rays in image-guided interpolation and smoothing, well-log interpolation, image flattening, and residual-moveout picking. Some of these applications make use of properties of the ray-tracing system that are not directly obtained by eikonal solvers, such as ray position, ray density, wavefront curvature, and ray curvature. These ray properties open space for a different set of applications of the image-guided eikonal equation, beyond the original motivation of accelerating the construction of minimum-distance tables. We stress that image-guided ray tracing is an embarrassingly parallel problem, which makes its implementation highly efficient on massively parallel platforms. Image-guided ray tracing is advantageous for most applications involving the tracking of seismic events and image-guided interpolation. Our numerical experiments using synthetic and real data sets indicate the efficiency and robustness of image-guided rays for the selected applications.
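Ray tracing by integrating the Hamiltonian system, rather than running a fast-marching or fast-sweeping solver, reduces to an ODE integration along each ray. The sketch below uses the plain isotropic Hamiltonian H = (|p|^2 - 1/v^2)/2 for simplicity; the image-guided case replaces 1/v^2 by a structure-dependent metric that is not reproduced here, and the grid handling and Euler step are illustrative assumptions.

```python
# Minimal sketch of tracing a ray by integrating the eikonal Hamiltonian system
# dx/ds = p, dp/ds = grad(1/v^2)/2, dT/ds = 1/v^2 on a 2D slowness-squared grid.
# This is not the authors' image-guided implementation.
import numpy as np

def trace_ray(slowness_sq, x0, p0, dx, step=0.5, n_steps=2000):
    """Return the ray path and traveltime; x0, p0 are (z, x) position and slowness vector."""
    gz, gx = np.gradient(slowness_sq, dx)           # gradient of 1/v^2
    x, p, t = np.array(x0, float), np.array(p0, float), 0.0
    path = [x.copy()]
    for _ in range(n_steps):
        i = int(np.clip(x[0] / dx, 0, slowness_sq.shape[0] - 1))
        j = int(np.clip(x[1] / dx, 0, slowness_sq.shape[1] - 1))
        x = x + step * p
        p = p + step * 0.5 * np.array([gz[i, j], gx[i, j]])
        t = t + step * slowness_sq[i, j]
        path.append(x.copy())
        if not (0 <= x[0] < slowness_sq.shape[0] * dx and
                0 <= x[1] < slowness_sq.shape[1] * dx):
            break                                    # ray left the model
    return np.array(path), t

# Note: the initial slowness vector p0 should have magnitude equal to the local
# slowness 1/v at x0 so that the ray starts on the Hamiltonian's zero level set.
```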

Journal Article
Computational cost comparison between nodal and vector finite elements in the modeling of controlled source electromagnetic data using a direct solver
The Finite Element method can be implemented to model geophysical electromagnetic data using one of two methodologies, called Nodal and Vector Finite Elements. This paper presents a comparison between the two approaches, emphasizing memory usage and processing time, when simulating Marine Controlled Source Electromagnetic (MCSEM) data in three-dimensional models. The study is carried out using unstructured meshes and a direct solver. Computational cost information from both methodologies is gathered from four different 3D models, each emphasizing a different aspect of the problem. The results indicate that the Vector Finite Element methodology requires less memory and processing time to calculate the same data using the same mesh. Although the nodal method generates a smaller linear system than the vector method, the vector coefficient matrix is significantly sparser than the nodal one. The greater sparsity makes the vector approach more computationally efficient, requiring less memory and running in less time than the nodal method to generate results with the same level of accuracy.
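The memory and sparsity comparison above comes down to bookkeeping on the assembled sparse systems: number of unknowns, nonzeros, fill ratio, and storage. The snippet below sketches that bookkeeping with placeholder random sparse matrices standing in for the nodal and vector (edge) systems; the sizes and densities are arbitrary and not the paper's models.

```python
# Hedged sketch of how one might tabulate the size, sparsity, and memory of two
# assembled finite-element systems; the matrices here are random placeholders.
import scipy.sparse as sp

def system_stats(name, A):
    A = A.tocsr()
    n = A.shape[0]
    nnz = A.nnz
    mem_mb = (A.data.nbytes + A.indices.nbytes + A.indptr.nbytes) / 1e6
    print(f"{name}: {n} unknowns, {nnz} nonzeros, "
          f"fill {nnz / n**2:.2e}, {mem_mb:.1f} MB")

# Placeholder matrices: the vector system is larger but much sparser.
A_nodal = sp.random(8000, 8000, density=5e-3, format="csr")
A_vector = sp.random(12000, 12000, density=1e-3, format="csr")
system_stats("nodal", A_nodal)
system_stats("vector", A_vector)
```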