Numerous laboratory experiments show that even a very small amount (below 100 ppm) of oil in injected water can cause severe injectivity damage. On the other hand, field experience with produced water re-injection reveals the difficulty (and high cost) of de-oiling water to very low oil concentrations. Moreover, not only the oil concentration but also the droplet size hampers the injection process. The most recent example of this problem is the commercial failure of the downhole oil-water separation (DHOWS) technique due to injectivity damage resulting from incomplete separation and residual oil in the injected water.
Although injectivity decline caused by oil droplets has been studied experimentally, there is still a lack of an easy-to-use and widely accepted model to predict the decline behavior. In this work, we developed an analytical model to predict the time-dependent advancement of water permeability reduction in linear flow by analyzing experimental data obtained from linear core flooding.
The model considers mass transfer of the oil phase from the water-oil mixture to the rock due to capture by dispersion, advection and adsorption inside the rock. As the captured oil saturation increases, permeability declines following the drainage relative permeability relationship. The reduction stabilizes when the oil saturation reaches an equilibrium value controlled by the oil droplet size and injection velocity. The model yields a distributed change of permeability versus time and distance from the point of injection that can be converted to the overall injectivity damage, or skin factor. The model is calibrated using published experimental data from prolonged core floods with oil-contaminated wastewater. Theoretical runs of the model reproduce all the effects known from experimental observations, such as an ultimate value of captured oil saturation, the effects of oil droplet size and concentration, injectivity improvement at high rate, and the rock wettability effect.
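To illustrate the chain of calculations the abstract describes (captured oil saturation approaching an equilibrium value, permeability reduction via a drainage-type relative permeability curve, and conversion of the distributed permeability to an overall skin factor), the following Python sketch uses assumed functional forms and parameter values. The exponential approach to equilibrium, the Corey-type relative permeability exponent, and all numerical constants are illustrative assumptions, not the paper's calibrated equations.

```python
import numpy as np

# Illustrative sketch only: functional forms and parameters below are
# assumptions for demonstration, not the paper's calibrated model.

def captured_oil_saturation(t, x, v=1e-5, s_eq=0.15, lam=5.0):
    """Captured oil saturation approaching an equilibrium value s_eq.
    t: time [s]; x: distance from injection point [m];
    v: injection velocity [m/s]; lam: assumed capture-rate coefficient [1/m]."""
    t_arrival = x / v                       # time for injected water to reach x
    tau = np.maximum(t - t_arrival, 0.0)    # exposure time at distance x
    # Exponential approach to equilibrium, damped with distance (assumed form)
    return s_eq * (1.0 - np.exp(-lam * v * tau)) * np.exp(-lam * x)

def water_relperm_drainage(s_o, s_or=0.0, n=3.0):
    """Corey-type drainage relative permeability of water
    as a function of captured oil saturation s_o."""
    s_eff = np.clip(1.0 - s_o - s_or, 0.0, 1.0)
    return s_eff ** n

def skin_factor(t, L=0.3, k0=1.0, nx=200):
    """Overall skin from the distributed permeability reduction along a
    linear core of length L [m], via harmonic averaging of local k."""
    dx = L / nx
    x = (np.arange(nx) + 0.5) * dx          # cell midpoints
    k = k0 * water_relperm_drainage(captured_oil_saturation(t, x))
    k_avg = L / np.sum(dx / np.maximum(k, 1e-12))  # harmonic mean
    return k0 / k_avg - 1.0

# Skin starts near zero and grows as oil is captured over time
print(skin_factor(0.0), skin_factor(1e6))
```

The harmonic averaging reflects that, in linear flow at fixed rate, the damaged zone acts as resistances in series; the skin grows monotonically with injection time until the captured saturation stabilizes at its equilibrium profile.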