Abstract

The volume of cuttings and produced waste, particularly produced water, will increase as brownfield development progresses. Missing environmental and economic targets represents a major investment exposure. As a result, there is increasing demand on engineering to mitigate process risks and ensure safe injection and sound subsurface waste management.

This paper describes the outcome of pressure analysis successfully used for Cuttings Re-Injection (CRI) and disposal of produced water, based on close monitoring of injection pressure and pressure decline. It addresses the root-cause engineering diagnostic processes that were implemented when abnormal pressure increases and injection restrictions were observed during annulus injection in North Sea projects.

Severe risks were mitigated even when injection pressures were observed, on occasion, above the overburden, maintaining safe injection assurance. A methodology was developed and implemented as a result of process mapping based on signature pressures from pressure-decline analysis. Characterizing these signatures represents a major step in mitigating the risks posed by waste injection, providing not only engineering understanding valuable to overall drilling operations planning, but also minimizing potential environmental impact through sound subsurface waste management.

Introduction

In the early 1990s, Cuttings Re-Injection (CRI) emerged as a new technology that had the potential to provide an environmentally safe and economically sound solution to the disposal of oily drill cuttings for offshore operations. Through the years, this disposal process has become the preferred method for Drilling Waste Management (DWM). Millions of barrels of slurry have been safely injected around the world, and the approach has been able to keep pace with continuously tightening regulations by achieving zero discharge.

In most cases, the injection process proceeds without incident. It is not uncommon, however, for the injection well or the annulus to become plugged, most often because of inadequate slurry design or poor implementation of operating guidelines. It is well known that slurry migrating through natural fractures, micro-annuli, poor cement jobs, or even fractures growing to surface releases material to the environment. These incidents can cause serious environmental harm, creating clean-up liabilities for the operator, problems with regulatory agencies (injection licenses can be revoked), and the need for remediation, which can even include drilling a new well, an outcome with enormous economic impact on the operation.

For all these reasons, a risk management process that evaluates system vulnerabilities and the threats facing the system is an essential part of any CRI program. CRI risk management implements a methodology that identifies early warnings of risks during actual operations, their probable consequences, and the available mitigation options, providing the basis for establishing a safe and cost-effective injection program.

Understanding the injection pressure signatures recorded during CRI operations, together with the rheological properties of the slurry, is the key to preserving long-term safe injection assurance. Statistics on the overall pressure trend, combined with pressure-decline analysis, allow characterization of subsurface injection behavior. This knowledge is a valuable tool for building a firm understanding of CRI processes. Continuous monitoring of the well's response to waste injection not only expands knowledge of fracture behavior, but also adds a powerful risk management tool for avoiding potential operational failures, which in turn reduces possible environmental impacts and related costs.
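To make the pressure-decline step concrete, the sketch below illustrates one common fall-off diagnostic: locating fracture closure on a square-root-of-shut-in-time plot, where pressure declines roughly linearly while the fracture is open and the slope changes at closure. The synthetic data, thresholds, and function names are illustrative assumptions, not the methodology reported in this paper; field studies typically complement this with G-function and log-log derivative diagnostics.

```python
# Minimal sketch of square-root-of-time pressure-decline analysis for a CRI
# shut-in (fall-off) period. All inputs are synthetic and the deviation
# threshold is an assumption chosen for illustration only.
import numpy as np

def estimate_closure_pressure(t_shut_in_s, p_psi, plateau_frac=0.2, tol=0.15):
    """Flag fracture closure where dP/d(sqrt(t)) departs from its
    early-time plateau (fracture-open) slope."""
    sqrt_t = np.sqrt(t_shut_in_s)
    dp_dst = np.gradient(p_psi, sqrt_t)            # derivative w.r.t. sqrt(t)
    n_plateau = max(3, int(plateau_frac * len(p_psi)))
    ref = np.median(dp_dst[:n_plateau])            # early-time slope
    dev = np.abs(dp_dst - ref) > tol * np.abs(ref) # departure from plateau
    idx = (np.argmax(dev[n_plateau:]) + n_plateau
           if dev[n_plateau:].any() else len(p_psi) - 1)
    return p_psi[idx], t_shut_in_s[idx]

# Hypothetical fall-off: linear decline on sqrt(t) until ~900 s, then flatter
t = np.linspace(1.0, 3600.0, 400)                  # shut-in time, s
p = np.where(t < 900.0,
             5200.0 - 8.0 * np.sqrt(t),
             5200.0 - 8.0 * np.sqrt(900.0) - 1.5 * (np.sqrt(t) - np.sqrt(900.0)))

p_closure, t_closure = estimate_closure_pressure(t, p)
print(f"Estimated closure: {p_closure:.0f} psi after {t_closure:.0f} s shut-in")
```

Repeating this analysis after each injection batch yields the pressure signatures referred to above; a drifting closure estimate over successive batches is one of the early warnings a CRI risk management process watches for.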

Overview of Pressure Analysis in CRI

Lack of reliable field data in CRI projects (e.g., in-situ stresses) is one of the main uncertainties that significantly affect the reliability and accuracy of hydraulic fracture simulations, which are necessary to predict fracture containment and disposal capacity before CRI operations begin. Injection formation parameters can be inferred from wireline logs and through laboratory fluid and core testing procedures. However, the reliability of the data inferred from these methods is reduced by factors such as the scale and accuracy of the measurements, variability in the geological environment and tectonics, assumptions made during log interpretation, and significant deviations of the test environment from in-situ formation conditions. Uncertainties in these parameters can result in inadequate prediction of disposal fracture geometry and, consequently, limited well disposal capacity, fracture breaching to surface, or, in the worst-case scenario, complete failure during injection operations.
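As a rough illustration of how such parameter uncertainty propagates, the sketch below runs a simple Monte Carlo containment screen over assumed in-situ stress distributions. All distributions, numbers, and the screening criterion are hypothetical; an actual assessment would rely on a calibrated hydraulic fracture simulator and measured stress data.

```python
# Minimal Monte Carlo sketch: propagate in-situ stress uncertainty into a
# fracture-containment estimate. Every distribution below is an illustrative
# assumption, not a field-calibrated value.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed (hypothetical) uncertain inputs, psi:
s_min_sand = rng.normal(5500.0, 250.0, n)        # min stress, disposal sand
s_min_shale = rng.normal(6300.0, 400.0, n)       # min stress, bounding shale
p_net = rng.triangular(200.0, 400.0, 700.0, n)   # net pressure during injection

# Simple containment screen: the fracture stays in the sand while the pressure
# it sees (closure stress + net pressure) remains below the shale stress.
contained = (s_min_sand + p_net) < s_min_shale

print(f"P(contained) ~ {contained.mean():.2f}")
print(f"Median stress contrast: {np.median(s_min_shale - s_min_sand):.0f} psi")
```

Even this toy screen shows why the stress uncertainties listed above matter: widening the input distributions quickly erodes the probability of containment, which is the kind of sensitivity a pre-injection simulation study must quantify.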
