Abstract

Statistical and machine learning approaches to pipeline leak detection can benefit from augmentation with simple physical models that predict line pack, particularly when the fluids involved change density strongly with temperature and pressure. Determining the system-wide temperature changes required by these compensation models is not trivial, because the temperature measurements taken on the system are often representative only of conditions near the sensors. This paper presents a machine learning approach to obtaining the temperature differences needed for compensation and then applying them.

The compensation method is tested against the uncompensated method in three cases involving natural gas liquids (NGLs): one simulated and two drawn from real pipeline data. The compensated imbalance shows a marked improvement, reducing false alarms while increasing sensitivity to leaks.
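Only the abstract is available here, so as a rough illustration of the idea it describes, the Python sketch below computes a metered volume imbalance corrected for line-pack changes estimated from per-segment temperatures and pressures. The linearized density model, the coefficient values, and all function and parameter names (linepack_std_volume, alpha, kappa, and so on) are illustrative assumptions rather than the authors' method, and the machine learning step that infers the segment temperature changes is not shown.

import numpy as np

def linepack_std_volume(seg_volumes, temps_K, pressures_kPa,
                        rho_ref, alpha=2.0e-3, kappa=5.0e-6,
                        T_ref=288.15, P_ref=101.325):
    # Estimate line pack expressed as standard volume.
    # Density is approximated with a simple linearized correction
    # (thermal expansion alpha, compressibility kappa); a real
    # implementation would use an NGL fluid-property package.
    rho = rho_ref * (1.0 - alpha * (temps_K - T_ref)
                     + kappa * (pressures_kPa - P_ref))
    return np.sum(seg_volumes * rho) / rho_ref

def compensated_imbalance(v_in, v_out, seg_volumes,
                          temps_K, pressures_kPa, rho_ref):
    # Metered imbalance (inflow minus outflow per scan) corrected for
    # the change in estimated line pack between consecutive scans.
    # temps_K and pressures_kPa are (time x segment) arrays.
    lp = np.array([linepack_std_volume(seg_volumes, T, P, rho_ref)
                   for T, P in zip(temps_K, pressures_kPa)])
    d_lp = np.diff(lp, prepend=lp[0])
    return (v_in - v_out) - d_lp

Under these assumptions, a leak-detection alarm would be raised on the compensated imbalance rather than on the raw meter difference, which is what allows the threshold to be tightened without increasing false alarms.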
