Abstract
Rock matrix permeability is a key parameter for characterizing source rock reservoirs and controls well performance over long production periods. During depletion of a shale gas reservoir, two important competing mechanisms, among others, affect gas flow: rock mechanical deformation, which reduces permeability as effective stress increases, and Knudsen diffusion/slippage flow, which enhances permeability at low pore pressures. It is therefore important to measure pore-pressure-dependent permeability to better characterize fluid flow during gas production. Conventional laboratory methods, based on linearized solutions to gas flow through core samples, measure pressure-dependent permeability with a “point-by-point” approach: permeability is measured at a given pore pressure under a given confining stress and then measured again at a different pore pressure, so that a pore-pressure-dependent permeability curve is assembled from a number of data points. These methods require multiple test runs and are therefore very time consuming. To improve experimental efficiency, we previously proposed a method based on the nonlinear solution to the gas flow equation that measures rock matrix permeability as a function of pore pressure in a single test run, without any presumption about the parametric form of the relationship between permeability and pore pressure (Liu et al., 2018a). The focus of this paper is to evaluate the previously proposed method by implementing it in a carefully designed experimental system (a nanopermeameter). The validity and practical usefulness of the method are demonstrated by its successful application to shale core samples and by the consistency between the measurement results and those obtained independently with other methods.
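To illustrate the two competing mechanisms summarized above, the following minimal Python sketch combines a Klinkenberg-type slippage enhancement with an exponential stress-sensitivity law and sweeps pore pressure at fixed confining stress, mimicking the “point-by-point” curve that conventional methods assemble from separate test runs. The functional forms and all parameter values here are assumptions chosen purely for illustration; they are not part of the method evaluated in this paper, which deliberately avoids presuming any such parametric relationship.

```python
import numpy as np

def apparent_permeability(pore_pressure, confining_stress,
                          k_inf=100e-21,    # intrinsic permeability at reference stress, m^2 (assumed)
                          b=2.0e6,          # Klinkenberg slippage factor, Pa (assumed)
                          gamma=5.0e-8,     # stress-sensitivity coefficient, 1/Pa (assumed)
                          sigma_ref=10.0e6  # reference effective stress, Pa (assumed)
                          ):
    """Illustrative apparent permeability combining two competing effects:
    an exponential reduction with effective stress (confining minus pore pressure)
    and a Klinkenberg-type slippage enhancement at low pore pressure.
    All forms and values are illustrative assumptions, not the paper's method."""
    effective_stress = confining_stress - pore_pressure
    k_stress = k_inf * np.exp(-gamma * (effective_stress - sigma_ref))
    return k_stress * (1.0 + b / pore_pressure)

# Sweep pore pressure at a fixed (assumed) confining stress of 30 MPa to produce
# a pore-pressure-dependent permeability curve analogous to the point-by-point data.
pore_pressures = np.linspace(1e6, 25e6, 25)            # Pa
k_app = apparent_permeability(pore_pressures, 30e6)    # m^2
for p, k in zip(pore_pressures[::6], k_app[::6]):
    print(f"p = {p/1e6:5.1f} MPa  ->  k_app = {k*1e21:7.2f} x1e-21 m^2 (~nD)")
```

Under these assumed parameters the curve shows slippage dominating at low pore pressure and stress relief dominating at high pore pressure, which is the qualitative competition the measurement method is designed to resolve directly from a single test run.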