In gradient-based automatic history matching, calculating the derivatives of all production data with respect to gridblock rock properties (sensitivities) and other model parameters is not feasible for large-scale problems. Thus, the Gauss-Newton and Levenberg-Marquardt methods, which require all sensitivities to form the Hessian, are seldom viable. For such problems, quasi-Newton and nonlinear conjugate gradient algorithms are reasonable alternatives, as neither requires explicit calculation of the complete sensitivity matrix or the Hessian. Another possibility, the one explored here, is to define a new parameterization that radically reduces the number of model parameters.
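The abstract does not give the objective function, but for the standard history-matching formulation (notation assumed here, not taken from the source: $G$ the $N_d \times N_m$ sensitivity matrix, $C_D$ and $C_M$ the data- and model-covariance matrices), the Gauss-Newton Hessian has the form

```latex
% Assumed standard notation; G is the full N_d x N_m sensitivity matrix,
% which is why forming H_GN explicitly is infeasible at large scale.
H_{\mathrm{GN}} = C_M^{-1} + G^{T} C_D^{-1} G
```

which makes clear why methods that build the Hessian need every row of $G$, whereas methods built on products $Gv$ and $G^{T}u$ do not.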

We provide a theoretical argument indicating that the principal right singular vectors of the dimensionless sensitivity matrix form an optimal basis for reparameterization of the vector of model parameters. We present two algorithms that use this parameterization and apply them to example problems. Like LBFGS, these algorithms avoid explicit computation of individual sensitivity coefficients; they do so by using a partial singular value decomposition based on a form of the Lanczos algorithm. At least for all the synthetic problems we have considered, the reliability, computational efficiency, and robustness of the methods presented here are as good as those obtained with quasi-Newton methods.
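The idea of a Lanczos-based partial SVD for reparameterization can be sketched as follows. This is a minimal illustration, not the authors' implementation: a random dense matrix stands in for the dimensionless sensitivity matrix, and `scipy.sparse.linalg.svds` (a Lanczos-type iterative solver) plays the role of the partial SVD. The operator is wrapped matrix-free, since in practice only forward products $Gv$ and adjoint products $G^{T}u$ (one simulator/adjoint run each) are available, never $G$ itself.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, svds

rng = np.random.default_rng(0)
n_data, n_model = 50, 500
G = rng.standard_normal((n_data, n_model))  # stand-in for the dimensionless sensitivity matrix

# Matrix-free operator: the Lanczos-type partial SVD only needs the actions
# v -> G v (forward/linearized run) and u -> G^T u (adjoint run).
op = LinearOperator(
    shape=(n_data, n_model),
    matvec=lambda v: G @ v,
    rmatvec=lambda u: G.T @ u,
)

k = 10                        # number of retained singular triplets
u, s, vt = svds(op, k=k)      # svds returns singular values in ascending order
Vk = vt[::-1].T               # principal right singular vectors, descending order

# Reduced parameterization: the model is updated only within span(Vk),
# m = m_prior + Vk @ alpha, so the optimization works in R^k, not R^{n_model}.
m_prior = np.zeros(n_model)
alpha = rng.standard_normal(k)
m = m_prior + Vk @ alpha
```

Because the columns of `Vk` are orthonormal, the reduced problem in `alpha` has dimension `k` rather than `n_model`, which is the source of the "radical reduction" in parameter count.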
