ABSTRACT

This paper discusses measurement resolution for in-line inspection (ILI) of pipelines by intelligent pigs. In the early days of ILI, defect sensing was limited to a modest number of sensor channels, and inspection reports typically placed corrosion defects into one of only three coarse categories, such as "light," "moderate," and "severe."

Today's high-resolution ILI tools have shrunk the coverage of individual sensor channels to, in some cases, a few millimeters. This, among other improvements, has made it possible for inspection vendors to promise defect depth measurement accuracies of 5 to 10 percent of wall thickness and defect length measurement accuracies of several millimeters.

Once the defect configuration has been estimated from ILI data, assessment procedures are invoked to forecast the remaining strength of the pipe. In this paper, we consider the effect of measurement resolution on the accuracy of assessment calculations and quantify the error introduced into failure-pressure calculations made from "low-resolution" data compared to "high-resolution" data.
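The assessment procedures are not named in this section; as one concrete illustration, the sketch below uses the widely applied Modified B31G (0.85·dL) criterion to show how an error in measured defect depth propagates into the computed failure pressure. This is a minimal sketch, not the paper's own calculation, and the pipe dimensions, grade, and defect sizes in the usage example are hypothetical.

```python
import math

def modified_b31g_failure_pressure(D, t, d, L, smys):
    """Failure pressure (MPa) of a corroded pipe per the Modified B31G
    (0.85*dL) criterion.

    D    : outside diameter, mm
    t    : wall thickness, mm
    d    : maximum defect depth, mm
    L    : axial defect length, mm
    smys : specified minimum yield strength, MPa
    """
    flow_stress = smys + 69.0           # flow stress = SMYS + 69 MPa (10 ksi)
    z = L * L / (D * t)                 # normalized defect length
    if z <= 50.0:
        M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z * z)  # Folias factor
    else:
        M = 0.032 * z + 3.3
    ratio = d / t
    failure_stress = flow_stress * (1.0 - 0.85 * ratio) / (1.0 - 0.85 * ratio / M)
    return 2.0 * failure_stress * t / D

# Hypothetical 750-mm X52 pipe: compare the failure pressure computed from
# the true depth with one computed from a depth in error by 10% of t.
D, t, L, smys = 750.0, 10.0, 100.0, 359.0   # mm, mm, mm, MPa
true_d, measured_d = 4.0, 5.0               # 40% vs. 50% of wall thickness
p_true = modified_b31g_failure_pressure(D, t, true_d, L, smys)
p_meas = modified_b31g_failure_pressure(D, t, measured_d, L, smys)
print(f"true: {p_true:.2f} MPa, measured: {p_meas:.2f} MPa, "
      f"error: {100.0 * (p_meas - p_true) / p_true:+.1f}%")
```

For these hypothetical numbers, overstating the depth by 10 percent of wall thickness lowers the computed failure pressure by roughly 5 percent, which is the kind of resolution-driven error the paper sets out to quantify.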

BACKGROUND

This paper discusses measurement resolution of pipeline inspection by intelligent pigs, primarily magnetic flux leakage (MFL) pigs. There are two dimensions to ILI measurement resolution: the axial and the circumferential extent of the corrosion defect. Intelligent pigs provide coverage of the entire pipe wall by carrying an array of sensors, generally displaced circumferentially with respect to each other.

There are decisions to be made about the selection of "standard-resolution" tools vs. "high-resolution" tools; the characteristics of each are summarized in the literature. Figure 1 shows a sensor layout that is typical of the conventional (low-resolution) inspection pig. In these pigs, defect sensing was limited to a modest number of channels, typically 24. This limit was dictated by the data recording system that was used, for example, a 28-channel analog tape recorder. Each sensor channel was connected to a single recording channel that saved the entire analog waveform from that sensor.

Because the number of channels was limited even for the large pipe sizes, each sensor had to cover a substantial portion of the pipe surface to assure total coverage and to avoid missing any defects. For example, a 750-mm pipe, with its 2.4-m circumference, required that each of the 24 channels cover approximately 100 mm of pipe circumference, with the result that the ability to resolve defects narrower than 100 mm in the circumferential direction was seriously compromised. Since the MFL response is closely related to the percentage of metal missing from a given cross section, a 75-mm-wide defect that was 20-percent deep would produce about the same signal as a 25-mm-wide defect that was 60-percent deep, as the sketch below illustrates. Fortunately for data analysts, reasonable assumptions could be made about corrosion morphology, and some estimate of defect width could be made from the circumferential extent of the signal. This helped reduce the errors of measurement, but it placed an increased burden on the analyst, adding a larger human element to the decisions.
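To make the ambiguity concrete, the short sketch below idealizes the low-resolution MFL response as proportional to the cross-sectional metal-loss area seen by one ~100-mm channel, consistent with the description above; the defect numbers are those of the example in the text.

```python
def channel_signal(width_mm: float, depth_frac: float) -> float:
    """Idealized low-resolution MFL response: proportional to the
    cross-sectional area of metal removed within one sensor channel
    (units of mm times wall-thickness fraction)."""
    return width_mm * depth_frac

CHANNEL_WIDTH_MM = 2400.0 / 24          # ~100 mm per channel on a 750-mm pipe

# Two very different defects, each narrower than one channel...
wide_shallow = channel_signal(width_mm=75.0, depth_frac=0.20)
narrow_deep = channel_signal(width_mm=25.0, depth_frac=0.60)

# ...produce the same idealized signal, so depth and width cannot be
# separated from a single channel's response.
print(wide_shallow, narrow_deep)        # 15.0 15.0
```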

Despite the limitations of the early low-resolution systems, they did exhibit one strong point: they did not miss any of the inspected surface. The overlapping sensors, together with continuous analog recording, meant that there were no gaps in the measurement grid.
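The no-gaps property follows from simple geometry: an array of equally spaced sensors covers the full circumference whenever each sensor's footprint is at least as wide as the sensor pitch. A minimal check of that condition, with hypothetical footprint widths, is sketched below.

```python
def covers_circumference(circumference_mm: float, n_sensors: int,
                         sensor_width_mm: float) -> bool:
    """True if n equally spaced sensors of the given footprint width
    leave no circumferential gaps (i.e., footprint >= pitch)."""
    pitch = circumference_mm / n_sensors
    return sensor_width_mm >= pitch

# 24 overlapping ~110-mm sensors on a 2.4-m circumference: no gaps.
print(covers_circumference(2400.0, 24, 110.0))   # True
# Narrow "point" detectors at the same pitch would leave gaps.
print(covers_circumference(2400.0, 24, 15.0))    # False
```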

When the industry made the transition to high-resolution systems, the problem of sensor width was essentially solved: there would no longer be confusion between narrow, deep defects and wide, shallow ones. However, high-resolution systems almost universally employed sensors with "point" detectors.
