The corrosion of carbon steel tubulars is an important completion integrity management issue. Controlling it is critical for the safe lifetime containment of well fluids, avoiding risk to people, property and the environment, and avoiding costly workover and repair during the well's life.
An effective tubing lifetime corrosion prediction model (VirtualCalliper) was developed to improve the management of integrity in carbon steel wells. The model has proved to be an order of magnitude more accurate than many widely used corrosion models when predicting the internal corrosion of producer and injector wells.
The main reasons for this are:
In addition to the standard parameters considered in most corrosion prediction models, VirtualCalliper accounts for the influences of well architecture, water cut development, water salinity, erosion corrosion and H2S corrosion.
The model has been rigorously tuned with extensive field corrosion inspection data from a large number of well calliper surveys.
Using well performance simulator output as the input file for key well parameters ensures accurate modelling of temperature, pressure and flow characteristics across the whole well profile.
The software is capable of time stepping to consider the impact of historical and future changes to parameters such as lift gas, oil, gas and water rates.
The corrosion model output can be integrated with completion stress analysis (CSA), to enable the remaining tubing life to be predicted and to assess potential problems during well operations or workover.
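The time-stepping and remaining-life concepts above can be illustrated with a short sketch. This is not the VirtualCalliper implementation; the function names, parameter names and the example corrosion rates are all illustrative assumptions. It shows only the general bookkeeping: accumulating metal loss over successive production periods with differing conditions, then estimating the time until the wall reaches a minimum allowable thickness (such as one derived from completion stress analysis).

```python
# Illustrative sketch only: invented names and rates, not the paper's model.
from dataclasses import dataclass

@dataclass
class TimeStep:
    years: float            # duration of this production period
    rate_mm_per_yr: float   # corrosion rate predicted for the period's conditions

def remaining_wall(initial_wall_mm: float, steps: list[TimeStep]) -> float:
    """Accumulate metal loss over successive production periods."""
    loss = sum(s.years * s.rate_mm_per_yr for s in steps)
    return initial_wall_mm - loss

def remaining_life_years(wall_mm: float, min_wall_mm: float,
                         future_rate_mm_per_yr: float) -> float:
    """Years until the wall thins to the minimum allowable thickness,
    at an assumed constant future corrosion rate."""
    if future_rate_mm_per_yr <= 0:
        return float("inf")
    return max(0.0, (wall_mm - min_wall_mm) / future_rate_mm_per_yr)

# Example: 7 mm wall, two historical periods, then a predicted future rate.
history = [TimeStep(5.0, 0.1), TimeStep(3.0, 0.3)]   # rates are illustrative
wall_now = remaining_wall(7.0, history)               # ~5.6 mm left
life = remaining_life_years(wall_now, 4.0, 0.2)       # ~8 years to minimum wall
```

In practice each period's rate would come from the corrosion model evaluated at that period's temperature, pressure, flow and fluid composition, and the minimum wall would come from the stress analysis rather than being a fixed number.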
VirtualCalliper has been proven in a number of fields. It has been used to plan a risk-based sequence of well workover priorities in a large, mature North Sea field. This has been achieved without the cost, HSE risk and lost production associated with well interventions to obtain calliper survey data.
This paper describes the structure and main functions of the model. Three case studies, used during the model development to compare the predicted tubing metal loss from VirtualCalliper and tubing metal loss measured from calliper surveys, are also presented.
Corrosion of carbon steel tubulars is an important completion integrity management issue. At the design stage the correct selection of metallurgy is critical for safe lifetime containment of well fluids. This avoids risk to people, property and the environment, and also costly workovers and well repair activities. In addition, by reliably predicting the corrosion rate, considerable CAPEX can be avoided by ensuring that high-grade tubulars are not selected unnecessarily.
Reliable corrosion prediction avoids the cost, HSE risk and lost production associated with a well intervention for a calliper survey. Applied across a number of wells, it makes it possible to identify the wells, and the production, most at risk, and to develop the optimal workover strategy for the field. Corrosion prediction is also critical as part of the due diligence process for mature asset acquisition or divestment.
Many models are available for predicting carbon steel corrosion [1-9]. The majority of these models were developed based on laboratory corrosion tests. The advantage of these tests is that most of the corrosion-related factors can be controlled. Some of these models have been adopted for corrosion prediction of production and injection tubulars in wells. However, the difference between the predicted and observed tubing corrosion rates can vary by up to a factor of 10.
Many oilfield produced fluids can be very corrosive as they contain CO2, H2S, and high salinity water. These corrosive fluids are typically produced in high temperature, high flow velocity environments, in three or even four phases if sand is produced. Under these conditions it is difficult to apply laboratory-based corrosion prediction models. Therefore, the oil and gas industry relies heavily on mechanical or ultrasonic calliper surveys. Data from such calliper surveys is only valid at the time it is acquired and cannot be extrapolated to reliably predict future corrosion.