Scientific Production



Journal Article
14/03/2022

The effectiveness of spectral decomposition-based layer thickness estimation: A seismic physical modeling example
We have constructed a channel complex model at a scale of 1:10,000 by stacking 3D-printed polylactide layers with negative-relief meandering channels. This model was subjected to an ultrasonic common-offset acquisition in a water tank (with the water filling the channels), and the result was treated as a zero-offset 3D acoustic reflection seismogram, to which deterministic deconvolution and poststack migration were applied. We then developed an algorithm to yield volumes of estimated two-way-time layer thickness from multiple-frequency volumes obtained through the short-time Fourier transform. The estimated thicknesses were compared with measurements of the physical model obtained through X-ray computed tomography. Despite
the strong signal attenuation and imaging issues, the results were rather satisfactory, increasing the confidence
in using spectral decomposition for quantitative seismic analysis.
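As a rough illustration of the kind of spectral-decomposition workflow described above, the sketch below builds a synthetic thin-bed trace, takes its short-time Fourier transform with SciPy, and converts the local peak frequency into a crude two-way-time thickness estimate via the tuning relation dt ≈ 1/(2·f_peak). All parameters (wavelet frequency, sampling rate, layer thickness) are hypothetical, not values from the paper.

```python
import numpy as np
from scipy.signal import stft

fs = 4000.0                          # sampling rate, Hz (assumed)
t = np.arange(0.0, 0.5, 1.0 / fs)
dt_layer = 0.004                     # "true" two-way-time thickness, s

def ricker(tau, fm=100.0):
    """Ricker wavelet of dominant frequency fm centered at time tau."""
    a = (np.pi * fm * (t - tau)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Thin bed: top and base reflections with opposite polarity
trace = ricker(0.25) - ricker(0.25 + dt_layer)

f, tt, Z = stft(trace, fs=fs, nperseg=256)
spec = np.abs(Z)
i_t = np.argmin(np.abs(tt - 0.25))   # spectrum nearest the layer
f_peak = f[np.argmax(spec[:, i_t])]  # local peak frequency
est = 1.0 / (2.0 * f_peak)           # tuning-based thickness estimate
print(f"peak frequency {f_peak:.1f} Hz -> ~{est * 1e3:.2f} ms thickness")
```

The estimate is only approximate, since the peak frequency also inherits the wavelet's spectral shape, which is why a workflow like the one above is calibrated against independent thickness measurements, here provided by X-ray computed tomography.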

Journal Article
14/03/2022

Core plug and 2D/3D-image integrated analysis for improving permeability estimation based on the differences between micro- and macroporosity in Middle East carbonate rocks
Carbonate rocks are porous systems, with pores and pore throats of varying morphologies that result from
depositional and diagenetic processes. Such heterogeneity produces a complex arrangement between the grains
and pores that affects the petrophysical properties while limiting the utility of measurement techniques. Petrophysical properties are generally acquired by conventional laboratory methods, although to provide accurate results, core plugs need to be recovered intact. Two-dimensional digital image analysis (2D DIA) enables the
processing of any core cut and requires minimal data manipulation and computation when compared to three-dimensional approaches. In DIA, permeability is calculated using models described in the literature that often do not provide good predictions for carbonate rocks. Often, the permeability, which is controlled by the size and shape of the pores and pore throats, is related to porosity values; however, the porosity of a rock varies from the micro to the macro level, resulting in enormous uncertainty in estimating permeability. In this article, we present a new strategy to improve the prediction of permeability by using pore-shape parameters from 2D DIA, which, given the optical resolution, provides data related to the macropores. A conventional gas technique measures the absolute permeability, which is used as a calibration parameter, and the total porosity, which is used to calculate the microporosity. The test samples are from Oman outcrops of the Huqf Supergroup and Salalah Formation, which are analogous to the carbonates of the giant reservoirs in the Middle East. Microporosity was
characteristic of all the samples due to the calcite mud matrix, recrystallized calcite cement, microcracks and the
crystalline texture caused by dolomitization. The pore-shape parameters from the 2D DIA improved the
permeability prediction and were found to relate to the pore types that make up the rock, whereas the 3D
technique did not provide a good result. The R2 of the 2D data was 0.96, demonstrating the efficiency of the
procedures applied in mitigating the uncertainties of the models for the set of samples studied.
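A minimal sketch of the calibration idea, permeability regressed on image-derived predictors, is shown below on purely synthetic data. The predictors (macroporosity and a generic pore-shape factor), the coefficients, and the samples are all hypothetical, not the paper's measurements.

```python
import numpy as np

# Synthetic "2D DIA" predictors for 20 hypothetical samples
rng = np.random.default_rng(0)
n = 20
phi_macro = rng.uniform(0.05, 0.25, n)   # image-derived macroporosity
shape = rng.uniform(1.0, 5.0, n)         # pore-shape (irregularity) factor
# Synthetic "measured" permeability: grows with porosity, drops with shape
log_k = 2.0 + 8.0 * phi_macro - 0.5 * shape + rng.normal(0, 0.1, n)

# Fit log10(k) = a + b*phi_macro + c*shape by linear least squares
A = np.column_stack([np.ones(n), phi_macro, shape])
coef, *_ = np.linalg.lstsq(A, log_k, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((log_k - pred) ** 2) / np.sum((log_k - log_k.mean()) ** 2)
print(f"coefficients: {coef}, R^2 = {r2:.3f}")
```

In the paper's workflow the "measured" values would come from gas permeametry on the core plugs, with the fit quality (the R² above) indicating how much the pore-shape parameters improve the prediction.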

Journal Article
14/03/2022

Dynamics of the nonequilibrium flow in a duct with obstruction
This work aims to numerically simulate the dynamics of a channel flow with an obstruction from the moment a fluid with a homogeneous velocity profile is injected. The simulations use open-source tools from the OpenFOAM platform, the pisoFoam solver and an LES turbulence model,
describing in detail the velocity profiles of laminar and turbulent flows. We also perform a boundary-layer mapping in the presence of an obstacle. We used three different domains to follow the evolution of the velocity profile while the fluid progresses downstream and passes the
obstruction. The results reproduce the well-known results of laminar flow in a channel, as well as the average velocity profile in the turbulent regime and the flow reattachment caused by the obstruction. These preliminary results are used to validate the solvers and the mesh used. Next, an analysis of the velocity-profile dynamics determined an exponential decay of the root-mean-square deviations in the transition from the homogeneous to the parabolic profile, and then to the turbulent regime in the channel.
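The final step, extracting an exponential decay of the RMS deviation between the evolving profile and its asymptotic form, amounts to a simple log-linear fit; the data below are synthetic and the decay length is an assumed value, not a result from the paper.

```python
import numpy as np

x = np.linspace(0.0, 10.0, 50)     # downstream distance (arbitrary units)
rms = 0.8 * np.exp(-x / 2.5)       # synthetic RMS-deviation data (assumed)

# A least-squares fit in log space recovers the decay length L
slope, intercept = np.polyfit(x, np.log(rms), 1)
L = -1.0 / slope
print(f"decay length L = {L:.2f}")  # ~2.5 for this noiseless synthetic data
```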

Journal Article
14/03/2022

A wavefield domain dynamic approach: Application in reverse time migration
This paper proposes a novel technique to handle the wavefield domain involved in the procedures of seismic
modeling, reverse-time migration (RTM), and full-waveform inversion (FWI). This method considers that the size of the wavefield domain varies with time; in other words, it expands concomitantly with the propagation. However, in the geophysical literature, this dynamism has always been neglected, as the wavefield domain is conventionally considered fixed, representing what we call a static approach (SA). This assumption may incur unnecessary use of available computational resources, thereby compromising application performance. Herein, we create a so-called dynamic approach (DA), capable of obtaining truly significant gains in terms of memory consumption and computational time. This new methodology is based on the application of an empirical filter that delimits the wavefront. This filter functions as a window and is applied at each time step until the wavefront reaches the model's boundaries, selecting the area where the seismic wavefield exists. This approach tries to approximate the computational domain to the propagation domain in order to obtain valuable computational gains, eliminating unnecessary work and thus reducing the effort needed to perform forward and backward propagation. We compare both approaches using the Pluto model. The seismic data generated from the Pluto model are very large, and it was not possible to apply the static approach relying only on the random-access memory (RAM) of the hardware used. In order to perform the conventional RTM, we implement and compare the effective boundary technique for wavefield reconstruction with the RTM using the proposed dynamic approach. With the dynamic approach, it was possible to perform RTM of 2D seismic data obtained from the Pluto model using only the RAM of the computational nodes and without the need for reconstruction techniques.
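A toy version of the dynamic-approach idea (not the paper's implementation) is sketched below: a 2D acoustic finite-difference update is applied only inside a square window that grows with the wavefront at the medium velocity, instead of over the whole static grid. Grid size, velocity, and source are all assumed.

```python
import numpy as np

nx = nz = 401
dx = 10.0                 # grid spacing, m
c = 2000.0                # constant velocity, m/s (assumed)
dt = 0.001                # s; c*dt/dx = 0.2 satisfies the 2D CFL limit
sz = sx = nz // 2         # source at the center of the grid

p0 = np.zeros((nz, nx))   # wavefield at time step n-1
p1 = np.zeros((nz, nx))   # wavefield at time step n
p1[sz, sx] = 1.0          # impulsive source (hypothetical)
r2 = (c * dt / dx) ** 2

for it in range(1, 300):
    # Dynamic window: wavefront radius in cells plus a small safety margin
    r = int(np.ceil(c * (it + 2) * dt / dx)) + 4
    i0, i1 = max(1, sz - r), min(nz - 1, sz + r)
    j0, j1 = max(1, sx - r), min(nx - 1, sx + r)
    # Second-order leapfrog update restricted to the window
    lap = (p1[i0-1:i1-1, j0:j1] + p1[i0+1:i1+1, j0:j1] +
           p1[i0:i1, j0-1:j1-1] + p1[i0:i1, j0+1:j1+1] -
           4.0 * p1[i0:i1, j0:j1])
    p2 = np.zeros_like(p1)
    p2[i0:i1, j0:j1] = 2.0 * p1[i0:i1, j0:j1] - p0[i0:i1, j0:j1] + r2 * lap
    p0, p1 = p1, p2
```

In this constant-velocity toy the window radius follows directly from c·t; the paper's empirical filter plays the same role for heterogeneous models, where the wavefront position must be bounded rather than computed exactly.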

Journal Article
14/03/2022

Multiphase flow mobility impact on oil reservoir recovery: An open-source simulation
This work uses Computational Fluid Dynamics (CFD) to simulate two-phase flow (oil and water) through a reservoir represented by a sandbox model. We investigated the influence on the flow of water having higher and lower mobility than the oil. To accomplish this, we also
developed a dedicated solver, with the appropriate equations and representative models implemented in the open-source CFD OpenFOAM platform. In this solver, the black-oil model represented the oil. The results show that the Buckley–Leverett water-flood equation is a good
approximation for the three-dimensional flow. We observe that the water front mixes to some extent with the oil and evolves obeying an exponential law. Water with mobility lower than that of oil is not common. However, in this case, the oil recovery is improved and the amount of injected water is reduced. The results comparing different mobilities show that a careful economic assessment should be performed before the field development. We have shown that low water mobility can increase, as in this studied example, the water-front saturation from 0.57 to 0.73, giving a substantial improvement in the oil recovery. The reservoir simulation can provide all the process information needed to perform an economic assessment in oil-field
exploration.
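The mobility effect on the water front can be illustrated with the classical Buckley–Leverett/Welge tangent construction that the abstract invokes. The Corey-type relative-permeability curves, endpoint saturations, and viscosities below are assumed for illustration, not taken from the paper.

```python
import numpy as np

S_wc, S_or = 0.2, 0.2                    # connate water / residual oil (assumed)
S = np.linspace(S_wc + 1e-4, 1.0 - S_or, 2000)
Se = (S - S_wc) / (1.0 - S_wc - S_or)    # normalized water saturation
krw, kro = Se**2, (1.0 - Se)**2          # Corey curves, exponent 2 (assumed)

def front_saturation(mu_w, mu_o):
    """Welge tangent point: the S that maximizes f_w(S) / (S - S_wc)."""
    fw = (krw / mu_w) / (krw / mu_w + kro / mu_o)
    return S[np.argmax(fw / (S - S_wc))]

print(front_saturation(mu_w=1.0, mu_o=2.0))  # water more mobile: lower front saturation
print(front_saturation(mu_w=4.0, mu_o=2.0))  # water less mobile: higher front saturation
```

Raising the water viscosity (lowering its mobility) moves the tangent point to a higher front saturation, which is the mechanism behind the improved recovery discussed in the abstract.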

Journal Article
09/03/2022

Cloud-computing approach for an environmental, social, and corporate governance focus in universities and businesses
There is an increasing demand for high-performance computing in geophysical exploration applications, which implies more carbon emissions due to higher energy consumption, further increasing concern about the environmental and social impact this can generate. We show how cloud computing can handle these challenges simultaneously and thereby assist business leaders in their decision-making. Cloud computing is a paradigm in which users rent computing capacity from providers on a pay-as-you-go basis, thereby reducing the carbon footprint by up to 88%. It can run software for years uninterrupted using the same capital required to acquire and run on-premises infrastructure, even if such infrastructure has over a thousand graphics processing units. However, managers must consider the challenges that arise from using the cloud, such as trusting their data to a third-party server and expenses over the years, especially with storage.

Journal Article
09/03/2022

Faster and cheaper: How graphics processing units on spot-market instances minimize turnaround time and budget
Cloud computing is enabling users to instantiate and access high-performance computing clusters quickly. However, without proper knowledge of the type of application and the nature of the instances, it can become quite expensive. Our objective is to indicate that adequately choosing the instances provides a fast execution, which, in turn, leads to a low execution price, using the pay-as-you-go model on cloud computing. We have used graphics processing unit instances on the spot market to execute a seismic-data set interpolation job and compared their performance with regular on-demand central processing unit (CPU) instances. Furthermore, we explored how scaling could also improve the execution times at small price differences. The experiments have shown that, by using an instance with eight accelerators on the spot market, we obtain up to a 300 times speedup compared with the on-demand CPU options, while being 100 times cheaper. Finally, our results have shown that seismic-imaging processing can be sped up by an order of magnitude with a low budget, resulting in faster
and cheaper processing turnaround and enabling new imaging techniques.

Journal Article
09/03/2022

Introduction of the Hessian in joint migration inversion and improved recovery of structural information using image-based regularization
Joint migration inversion (JMI) is a method based on one-way wave equations that aims at fitting seismic reflection data to estimate an image and a background velocity. The depth-migrated image describes the high spatial-frequency content of the subsurface and, in principle, is true amplitude. The background velocity model accounts mainly for the large spatial-scale kinematic effects of the wave propagation. Looking for a deeper understanding of the method, we briefly review the continuous equations that
compose the forward-modeling engine of JMI for acoustic media and angle-independent scattering. Then, we use these equations together with the first-order adjoint-state method to arrive at a new formulation of the model gradients. To estimate the image, we combine the second-order adjoint-state method with the truncated-Newton method to obtain the image updates. For the model
(velocity) estimation, in comparison to the image update, we reduce the computational cost by adopting a diagonal preconditioner for the corresponding gradient in combination with an image-based regularizing function. Based on this formulation, we build our implementation of the JMI algorithm. Our image-based regularization of the model estimate allows us to carry over structural information from the estimated image to the jointly estimated background model. As demonstrated by our numerical experiments, this procedure can help to improve the resolution of the estimated model and make it more consistent with the image.
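The computational trade-off mentioned above, a cheap diagonal preconditioner for the gradient versus a full second-order update, can be illustrated on a toy quadratic misfit. The operator A, step sizes, and iteration counts below are hypothetical and unrelated to JMI's actual operators.

```python
import numpy as np

# Toy misfit 0.5*||A m - d||^2 with a badly scaled forward operator A
A = np.array([[1.0, 0.3],
              [0.0, 10.0]])
d = np.array([1.0, 1.0])
h_diag = np.sum(A**2, axis=0)        # diag(A^T A): the diagonal preconditioner

def descend(step, precond, iters=50):
    m = np.zeros(2)
    for _ in range(iters):
        g = A.T @ (A @ m - d)        # gradient of the quadratic misfit
        m -= step * (g / h_diag if precond else g)
    return np.linalg.norm(A @ m - d)

r_plain = descend(step=0.015, precond=False)
r_precond = descend(step=1.0, precond=True)
print(r_plain, r_precond)            # preconditioning converges far faster
```

Because the preconditioner rescales each model parameter by its own curvature estimate, a near-unit step becomes usable and the badly scaled direction no longer throttles convergence, at the cost of a single elementwise division per iteration.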

Journal Article
09/03/2022

Characterization of Seismic Noise in an Oil Field Using Passive Seismic Data from a Hydraulic Fracturing Operation
We use a 5-h-long experiment with 182 vertical 2-Hz velocity sensors deployed on the surface to characterize noise before and during a hydraulic fracturing monitoring experiment in the Potiguar Basin, NE Brazil. We observe that the seismic noise comes mainly from electromagnetic induction and machinery vibration near the wellhead and, within 2 km of the array center, from pumpjacks, pipelines, roads, and industrial facilities. We investigate the origin of the main recorded noise features using amplitude
decay analysis and beamforming. We also report a resonance, present only while injection takes place, most likely associated with body-wave energy coming from the treatment wellhead area. To assess the utility of such a data set for retrieving the shallow velocity structure of the area using ambient noise seismic interferometry (ANSI), different strategies were employed to cross-correlate and stack the data: classical geometrically normalized cross-correlation (CCGN), phase cross-correlation (PCC), linear stacking, and time-frequency phase-weighted stacking (tf-PWS). Because of the unsuitable distribution of the noise sources and the acquisition geometry, spurious arrivals arise in the correlograms. We propose a simple method to attenuate these unwanted effects, which consists of applying a linear moveout (LMO) correction, stacking the data in the shot domain, and f-k filtering. The correlograms and their corresponding dispersion curves are significantly improved.
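The core of the interferometric step can be sketched in a few lines: a common noise source recorded at two sensors with a relative delay produces a peak at that delay in their geometrically normalized cross-correlation (CCGN). The sampling rate, delay, and noise levels below are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250.0                           # sampling rate, Hz (assumed)
n = 50000
noise = rng.normal(size=n)           # common ambient-noise source
delay = 40                           # inter-sensor delay in samples (assumed)
u1 = noise
u2 = np.roll(noise, delay) + 0.1 * rng.normal(size=n)  # delayed + local noise

# Frequency-domain cross-correlation, normalized by the trace energies (CCGN)
U1, U2 = np.fft.rfft(u1), np.fft.rfft(u2)
cc = np.fft.irfft(U2 * np.conj(U1))
cc /= np.sqrt(np.sum(u1**2) * np.sum(u2**2))
lag = np.argmax(cc[:1000])           # search positive lags only
print(f"recovered delay: {lag} samples = {lag / fs * 1e3:.1f} ms")
```

Stacking many such correlograms, with the LMO correction and f-k filtering proposed in the paper, is what suppresses the spurious arrivals caused by an uneven source distribution.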

Journal Article
09/03/2022

Adding Prior Information in FWI through Relative Entropy
Full waveform inversion is an advantageous technique for obtaining high-resolution subsurface information. In the petroleum industry, mainly in reservoir characterisation, it is common to use information from wells as prior information to decrease the ambiguity of the obtained
results. For this, we propose adding a relative entropy term to the formalism of the full waveform inversion. In this context, entropy will be just a nomenclature for regularisation and will have the role of helping the convergence to the global minimum. The application of entropy in inverse problems usually involves formulating the problem so that it is possible to use statistical concepts. To avoid this step, we propose a deterministic application to the full waveform inversion. We
discuss some aspects of relative entropy and show three different ways of using it to add prior information in the inverse problem. We use a dynamic weighting scheme to add the prior information through entropy. The idea is that the prior information can help to find the path to the global minimum at the beginning of the inversion process. In all cases, the prior information
can be incorporated very quickly into the full waveform inversion and lead the inversion to the desired solution. When we include in the inverse problem the logarithmic weighting that constitutes entropy, we suppress the low-intensity ripples and sharpen the point events. Thus, the addition of relative entropy to full waveform inversion can provide a result with better resolution. In regions
where salt is present in the BP 2004 model, we obtained a significant improvement for synthetic data by adding prior information through the relative entropy. We show that the prior information added through entropy in the full-waveform inversion formalism proves to be a way to avoid local minima.
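A minimal deterministic sketch of the regularization idea, adding a relative-entropy (Kullback–Leibler-style) distance to a prior model on top of the least-squares data misfit, is given below. The mapping of models to normalized positive "distributions", the prior, and the weight lam are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def relative_entropy(m, m_prior, eps=1e-12):
    """KL-style divergence between models mapped to normalized distributions."""
    p = np.abs(m) / (np.sum(np.abs(m)) + eps)
    q = np.abs(m_prior) / (np.sum(np.abs(m_prior)) + eps)
    return np.sum(p * np.log((p + eps) / (q + eps)))

def regularized_misfit(residual, m, m_prior, lam=0.1):
    # least-squares data misfit + weighted relative entropy to the prior
    return 0.5 * np.sum(residual**2) + lam * relative_entropy(m, m_prior)

m_prior = np.linspace(1500.0, 4500.0, 100)   # e.g. a well-log trend (assumed)
m_close = m_prior + 10.0                     # model consistent with the prior
m_far = m_prior[::-1]                        # model inconsistent with the prior
r = np.zeros(100)                            # equal data residuals for both
print(regularized_misfit(r, m_close, m_prior),
      regularized_misfit(r, m_far, m_prior))
```

With identical data residuals, the entropy term alone ranks the prior-consistent model lower, which is how the prior can steer the early iterations toward the global minimum.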
