Currently, the focus of seismic exploration is increasingly shifting towards areas marked by dual complexities: challenging surface conditions and intricate subsurface geological structures. In these environments, seismic data often exhibit a low signal-to-noise ratio (SNR), making effective denoising strategies essential. Traditional denoising techniques largely rely on a linear assumption for the signal, an assumption that is difficult to satisfy under complex geological structures. In this abstract, we treat the wavelets in a seismic section as a template and detect the nonlinear time differences (NLTD) in the seismic signals through a template matching approach. We generate a difference matrix from the degree of discrepancy between the template and the characteristic data of the signals, and then employ the Bellman optimization method to solve for the optimal sequence of time differences. By correcting the NLTD, we make the signal comply with the linear assumption, allowing it to be denoised with linear methods. This process achieves denoising of nonlinear signals and enhances the SNR. Our numerical experiments show that the proposed template matching method detects NLTD with relatively high accuracy, and the denoising results on field data further demonstrate the superiority of our approach.
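The abstract outlines a difference matrix built from template-versus-trace discrepancy, followed by a Bellman (dynamic-programming) recursion to pick the optimal time-difference sequence. The Python sketch below illustrates one plausible reading of those two steps; the function and parameter names (detect_nltd, max_shift, the smoothness weight) and the squared-error discrepancy measure are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def detect_nltd(template, traces, max_shift, smooth=1.0):
        """Sketch of NLTD detection via template matching + Bellman recursion.

        template  : 1-D wavelet, assumed the same length as each trace
        traces    : 2-D array (n_traces, n_samples) of seismic traces
        max_shift : largest candidate time difference, in samples (assumed)
        smooth    : weight penalizing shift jumps between adjacent traces (assumed)
        Returns the optimal per-trace time differences, in samples.
        """
        n_traces, _ = traces.shape
        shifts = np.arange(-max_shift, max_shift + 1)
        n_shifts = len(shifts)

        # Difference matrix: discrepancy between each trace and the
        # template shifted by every candidate time difference.
        diff = np.empty((n_traces, n_shifts))
        for i in range(n_traces):
            for k, s in enumerate(shifts):
                diff[i, k] = np.sum((traces[i] - np.roll(template, s)) ** 2)

        # Bellman recursion: accumulate cost trace by trace, penalizing
        # abrupt changes so the time-difference sequence stays smooth.
        cost = diff.copy()
        back = np.zeros((n_traces, n_shifts), dtype=int)
        for i in range(1, n_traces):
            for k in range(n_shifts):
                prev = cost[i - 1] + smooth * (shifts - shifts[k]) ** 2
                back[i, k] = np.argmin(prev)
                cost[i, k] += prev[back[i, k]]

        # Backtrack the optimal sequence of time differences.
        path = np.empty(n_traces, dtype=int)
        path[-1] = np.argmin(cost[-1])
        for i in range(n_traces - 2, -1, -1):
            path[i] = back[i + 1, path[i + 1]]
        return shifts[path]

Once the time differences tau = detect_nltd(...) are in hand, the correction step described in the abstract amounts to shifting each trace by -tau[i] (e.g., np.roll(traces[i], -tau[i])) so the aligned section satisfies the linear assumption and standard linear denoising can be applied.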
