Although waterflooding is a common method for increasing oil recovery, certain production problems may arise after water breakthrough. In addition to affecting oil recovery, these problems may also have an adverse impact on the environment. One particular production problem is mineral scale formation due to the mixing of incompatible brines. For example, barium sulphate forms in the production well and tubulars in many oilfields when sulphate-rich injection water (usually seawater) mixes with barium-rich formation water close to or in the wellbore. Understanding where the scale forms is important when attempting to prevent it from damaging productivity. However, there is some field evidence which suggests that scale deposition deep within the reservoir may occur without any adverse effect on oil recovery. This paper addresses the two related issues of
(1) where sulphate scales form, and
(2) whether any operator action is required where wells are producing lower concentrations of scaling ions than would be expected if scale formation were completely inhibited.
These issues were raised by two papers presented at the 1st SPE International Symposium on Oilfield Scale in January 1999, and it was the ensuing discussion which initiated the present study.
All of the principal mechanisms of brine mixing in waterflood displacements are considered and modelled. This leads to a fuller understanding of where, and to what extent, brine mixing will occur within the reservoir. As a consequence, we may address the question of why scale deposition deep within the reservoir occurs in certain circumstances, while in other situations it is probably very limited. It is demonstrated that, although connate water in the oil zone is mobilised by injected water, typically little mixing would be expected far from the wellbore. Conversely, as the different brine types, which have retained their scaling ions, approach the wellbore, mixing and scale deposition are to be expected. Mixing in the aquifer, where brine mobility can be much higher, may however result in lower than expected ion concentrations. The reduced ion concentrations may be caused by scale dropout, by mixing of formation brines with different compositions, or by a combination of both mechanisms. In either case, modelling of such processes has confirmed that they are possible explanations for low barium levels. The impact of scale dropout deep in the reservoir on the design of inhibitor squeeze treatments is discussed.
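The degree of barium depletion on mixing can be illustrated with a simple equilibrium calculation. The Python sketch below solves the solubility-product equation for an ideal mixture of a sulphate-rich injection brine and a barium-rich formation water. The Ksp value and brine compositions are illustrative assumptions only, and activity-coefficient (ionic strength) corrections, which a full scale prediction model would include, are deliberately ignored.

```python
import math

# Illustrative solubility product for BaSO4 at 25 C (mol^2/L^2);
# ideal-solution assumption, no ionic strength correction.
KSP_BASO4 = 1.08e-10

def baso4_precipitated(ba0, so4_0, ksp=KSP_BASO4):
    """Moles per litre of BaSO4 dropped out of a mixed brine.

    ba0, so4_0: post-mixing molar concentrations of Ba2+ and SO4(2-).
    Solves (ba0 - x)(so4_0 - x) = ksp for the physically meaningful root.
    """
    if ba0 * so4_0 <= ksp:
        return 0.0  # undersaturated: no precipitation
    b = ba0 + so4_0
    # quadratic x^2 - b*x + (ba0*so4_0 - ksp) = 0; take the smaller root
    return (b - math.sqrt(b * b - 4.0 * (ba0 * so4_0 - ksp))) / 2.0

# Hypothetical 50/50 mix of seawater (SO4 ~ 0.029 M, negligible Ba)
# with a barium-rich formation water (Ba ~ 1.8e-3 M, negligible SO4).
ba_mix, so4_mix = 0.5 * 1.8e-3, 0.5 * 0.029
x = baso4_precipitated(ba_mix, so4_mix)
ba_remaining = ba_mix - x  # essentially all barium drops out
```

With sulphate in large excess, as in a seawater-dominated mixture, the calculation shows the dissolved barium falling by many orders of magnitude, which is consistent with scale dropout deep in the reservoir being one possible explanation for low produced barium levels.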
Modelling tools are routinely applied in the design of a scale prevention strategy1,2 in order to obtain a better understanding of the system to be treated, and to minimise the risks in the provision of flow assurance. For example, there are five simulation techniques that can be used to optimise the design of an inhibitor squeeze treatment3. These are used to:
1. Predict water production. Reservoir simulation models are used to predict when, where and what type of water production should be expected.
2. Predict scaling tendency. Scale prediction models can calculate the type and mass of scale that will be deposited.
3. Derive inhibitor-rock interaction table. A core flood model can derive the adsorption isotherm that relates the mass of inhibitor adsorbed to the concentration in solution.
4. Predict inhibitor placement. Reservoir or near-well models can predict chemical placement under bullheading, or with mechanical or chemical diversion. Placement of chemical diverter can also be modelled.
5. Optimise treatment. Near-wellbore or full-field models can calculate squeeze lifetimes, and are used to design and optimise treatments by sensitivity analysis.