Where wells are protected by scale inhibitor squeeze treatments, an effective means of determining whether the produced fluids are inhibited against scale formation is critical. Conventionally, this is achieved by determining in the laboratory the minimum concentration of inhibitor required to prevent scale formation, and then sending wellhead samples to an offsite laboratory for analysis of the inhibitor concentration they contain. Whilst this method can be an effective monitoring technique for inhibitor concentrations in excess of the minimum inhibitor concentration (MIC), it introduces risk when wells are close to the MIC, owing to the time required to turn samples around. Additional problems arise from poor sample preservation and the often low frequency of sampling. A more robust approach to managing producing-well scale risk is to utilise an on-site test that can be performed simply on freshly taken wellhead samples. The greatest value, moreover, arises from a test that is independent of laboratory-determined measures of scale inhibition effectiveness.

In this paper we report the development, field validation, and routine application on the Miller Field in the North Sea of a direct test for the degree of scale inhibition exhibited by wellhead samples. The test relies upon stressing samples with excess barium or sulphate ions and monitoring the turbidity that develops in the solution. In contrast to other stress tests reported previously, the test described relies upon simple, robust equipment and is straightforward to perform, allowing samples to be taken routinely and monitored in near-real time.
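The pass/fail logic of such a turbidity-based stress test can be sketched as follows. This is an illustrative sketch only: the turbidity threshold, the reading schedule, and the function name are hypothetical placeholders, not values or procedures from the Miller Field programme.

```python
# Illustrative sketch of interpreting a turbidity-based stress test.
# The 10 NTU threshold is a hypothetical placeholder, not a value
# taken from the paper or from any field procedure.
def classify_stress_test(turbidity_readings, threshold_ntu=10.0):
    """Given turbidity readings (NTU) taken at intervals after spiking
    a wellhead sample with excess barium (or sulphate) ions, flag the
    sample as under-inhibited if the turbidity rise above the initial
    baseline exceeds the threshold (i.e. scale is precipitating)."""
    baseline = turbidity_readings[0]
    rise = max(turbidity_readings) - baseline
    return "inhibited" if rise < threshold_ntu else "under-inhibited"

# A well-inhibited sample stays clear; an under-inhibited one clouds rapidly.
print(classify_stress_test([1.2, 1.5, 2.0, 2.3]))
print(classify_stress_test([1.2, 8.0, 25.0, 60.0]))
```

In practice the design choice here is that the test reports directly on inhibition state, so the decision does not depend on a laboratory-derived MIC.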

The recent introduction of the stress testing technology on the BP operated Miller Field has already provided significant rewards with respect to HSE and scale management in one of the harshest barium sulphate scaling environments in the world.


Downhole mineral scaling is either a product of the self-scaling of the formation water (carbonate scaling) or the mixing of incompatible waters (injection water and formation water). Downhole scale control is, to a large extent, a question of economics. Two main strategies are available: prevention and remediation. The capex and opex of prevention have to be compared, on a risk-weighted basis, with the capex and opex of remediation. In conditions where individual well values and the cost of intervention are low, a remedial strategy may give the best return. However, as well values and/or the cost of intervention increase, preventative strategies predominate. For offshore fields, prevention is usually the favoured option1.
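The risk-weighted comparison described above can be expressed as a simple expected-cost calculation. This is a minimal sketch; all cost figures and the risk weight below are hypothetical placeholders chosen purely for illustration.

```python
# Illustrative sketch of a risk-weighted prevention-vs-remediation comparison.
# All monetary values and the risk weight are hypothetical placeholders.
def expected_cost(capex, opex, risk_weight=1.0):
    """Risk-weighted total cost of a scale-control strategy."""
    return risk_weight * (capex + opex)

# Prevention: squeeze programme with regular treatment opex.
prevention = expected_cost(capex=2.0e6, opex=0.5e6)
# Remediation: lower upfront cost, but expensive interventions,
# weighted by the probability that intervention is actually needed.
remediation = expected_cost(capex=0.5e6, opex=4.0e6, risk_weight=0.8)

strategy = "prevention" if prevention < remediation else "remediation"
print(strategy)
```

With high well values and costly offshore interventions, the remediation term grows and prevention is favoured, consistent with the text above.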

By far the most widely employed method of downhole scale control is the inhibitor ‘squeeze’ treatment. In this technique, scale inhibitor is ‘squeezed’, or injected, into the near-wellbore region, where the chemical is retained by adsorption and/or precipitation and is subsequently returned in the produced water by elution from the rock surface1.

The current industry standard for monitoring scale inhibitor squeeze performance relies upon determining the product concentration (often referred to as the inhibitor residual) in the produced brine. The inhibitor residual is then correlated with physical evidence of scale deposition, or with the results of laboratory testing, to determine the minimum concentration of product necessary to prevent scale deposits from occurring. This is the minimum inhibitor concentration (MIC).
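The routine decision made under this conventional approach amounts to comparing the measured residual against the MIC, usually with a safety margin to trigger resqueezing before protection is lost. The following sketch illustrates that logic only; the MIC value, safety factor, and status labels are hypothetical, not taken from any field programme.

```python
# Illustrative sketch of the conventional residual-vs-MIC check.
# The MIC, safety factor, and status labels are hypothetical placeholders.
def squeeze_status(residual_ppm, mic_ppm, safety_factor=2.0):
    """Classify a produced-brine inhibitor residual against the MIC."""
    if residual_ppm < mic_ppm:
        return "under-inhibited"          # below MIC: scale risk now
    if residual_ppm < safety_factor * mic_ppm:
        return "approaching MIC"          # schedule a resqueeze
    return "protected"                    # comfortable margin above MIC

print(squeeze_status(residual_ppm=3.0, mic_ppm=5.0))
print(squeeze_status(residual_ppm=8.0, mic_ppm=5.0))
print(squeeze_status(residual_ppm=25.0, mic_ppm=5.0))
```

The weakness the paper highlights is that each input to this check arrives late (offsite analysis) and depends on a laboratory-derived MIC, which is what the direct stress test avoids.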
