SUMMARY

Formulating an inverse problem in a Bayesian framework and solving it fully by stochastically constructing the posterior probability density function (PPD) surface with Markov chain Monte Carlo (MCMC) algorithms is an attractive approach. The estimated PPD can also be used to compute several measures of dispersion in model space. For many realistic applications, however, MCMC can be computationally expensive, and inexpensive approximations may yield inaccurate PPD estimates and unreliable uncertainty analyses because of strong nonlinearity and high dimensionality. In this paper, to address the fundamental issues of efficiency and accuracy in parameter estimation and PPD sampling, we incorporate several new developments into a standard Genetic Algorithm (GA) to design a more powerful MCMC sampler, a multi-scale GA-based MCMC, for practical geophysical inverse problems. In this new method, multiple MCMC chains operating at different scales run simultaneously in parallel. To gain the benefits of both the faster convergence of the coarse scale and the greater detail of the fine scale, realizations from chains at different scales are combined into intelligent proposals that facilitate exploration of the model space on the fine scale. In this study, the new MCMC method is demonstrated on an analytical example, and its performance in PPD estimation and uncertainty quantification is evaluated on a nonlinear seismic inverse problem. We find multi-scaling to be particularly attractive for addressing the model parameterization issue, especially in seismic waveform inversion.
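As a schematic illustration only, the two-scale idea can be sketched as a pair of Metropolis chains with occasional coarse-to-fine proposals. The toy forward operator, step sizes, refinement map, and simplified exchange rule below are placeholders and are not the GA-based algorithm developed in this paper.

import numpy as np

rng = np.random.default_rng(0)
n_coarse, n_fine = 5, 20     # hypothetical coarse and fine parameterizations
sigma = 0.1                  # assumed data-noise level

# Toy nonlinear forward operator standing in for an expensive seismic simulator (assumption).
A = rng.standard_normal((30, n_fine))

def forward(m_fine):
    return A @ np.tanh(m_fine)

def log_likelihood(m_fine, data):
    # Gaussian data misfit for the assumed noise level sigma.
    return -0.5 * np.sum((forward(m_fine) - data) ** 2) / sigma ** 2

def refine(m_coarse):
    # Map a coarse model onto the fine grid (piecewise-constant refinement).
    return np.repeat(m_coarse, n_fine // n_coarse)

def metropolis_step(m, logp, data, step, lift=lambda x: x):
    # Random-walk Metropolis update; `lift` maps the chain's state to a fine-grid model.
    m_new = m + step * rng.standard_normal(m.size)
    logp_new = log_likelihood(lift(m_new), data)
    if np.log(rng.random()) < logp_new - logp:
        return m_new, logp_new
    return m, logp

def multiscale_mcmc(data, n_iter=2000, exchange_every=50):
    m_c = rng.standard_normal(n_coarse)        # coarse-scale chain state
    m_f = rng.standard_normal(n_fine)          # fine-scale chain state
    lp_c = log_likelihood(refine(m_c), data)
    lp_f = log_likelihood(m_f, data)
    samples = []
    for it in range(n_iter):
        m_c, lp_c = metropolis_step(m_c, lp_c, data, step=0.2, lift=refine)
        m_f, lp_f = metropolis_step(m_f, lp_f, data, step=0.05)
        if it % exchange_every == 0:
            # Cross-scale proposal: offer the refined coarse state to the fine chain.
            # (Simplified acceptance rule; the paper's GA-based proposal differs.)
            cand = refine(m_c)
            lp_cand = log_likelihood(cand, data)
            if np.log(rng.random()) < lp_cand - lp_f:
                m_f, lp_f = cand, lp_cand
        samples.append(m_f.copy())
    return np.array(samples)

# Usage with synthetic data generated from a hypothetical "true" model.
m_true = refine(rng.standard_normal(n_coarse))
data = forward(m_true) + sigma * rng.standard_normal(A.shape[0])
posterior_samples = multiscale_mcmc(data)

The coarse chain explores the model space cheaply and quickly, while the fine chain refines detail; the occasional coarse-to-fine exchange is what lets the fine chain escape local modes.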

INTRODUCTION

A fundamental task of geophysics is to infer the physical properties of the Earth's interior quantitatively from indirectly related surface data and measurements, which constitutes an inverse problem. Because of inherent nonuniqueness, a deterministic solution to such inverse problems may be impossible. It therefore makes more sense to cast an inverse problem in a statistical framework and solve it by constructing the posterior probability distribution (PPD). In a Bayesian inference framework, the PPD gathers the models consistent with the data, together with their uncertainties, in light of the current state of knowledge, and allows us to compute directly the probability of any particular model or any particular parameter value. "Parameter estimation" and "model selection" are the two main Bayesian inference problems. The former assumes that a given parameterized model is true and seeks the values of the model parameters and their uncertainties. Bayes' theorem provides a convenient way to formulate the PPD mathematically by combining independent prior information with the available data. In many realistic applications, however, the PPD is neither analytically tractable nor easily estimated because of its high dimensionality. Although Markov chain Monte Carlo (MCMC) methods have become the favored tool for exploring the PPD in high-dimensional problems, uncertainty quantification based on the PPD can still be computationally daunting, especially when a time-consuming nonlinear forward simulator is involved. Several improvements to MCMC algorithms have recently been proposed to obtain better performance. One such idea is the temperature ladder, used to design new MCMC methods in which a sequence of distributions is simulated along a ladder of temperatures.
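For reference, and in generic notation that may differ from the paper's own, Bayes' theorem gives the PPD of a model vector $\mathbf{m}$ given observed data $\mathbf{d}$ as
$$p(\mathbf{m}\mid\mathbf{d}) \;=\; \frac{p(\mathbf{d}\mid\mathbf{m})\,p(\mathbf{m})}{\int p(\mathbf{d}\mid\mathbf{m}')\,p(\mathbf{m}')\,d\mathbf{m}'} \;\propto\; p(\mathbf{d}\mid\mathbf{m})\,p(\mathbf{m}),$$
where $p(\mathbf{m})$ is the prior and $p(\mathbf{d}\mid\mathbf{m})$ the likelihood. The temperature-ladder idea then simulates a family of flattened targets
$$\pi_k(\mathbf{m}) \;\propto\; \big[p(\mathbf{d}\mid\mathbf{m})\big]^{1/T_k}\,p(\mathbf{m}), \qquad 1 = T_1 < T_2 < \cdots < T_K,$$
so that high-temperature chains move freely across the model space while the $T_1 = 1$ chain samples the PPD itself.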
