We follow up on our previous paper describing an API RP 1130 compliant test method that imposes simulation-based leak perturbation signatures on archived SCADA data. The perturbation-based simulated leak testing (PSLT) approach combines a faithful representation of the leak with the real-world impacts of noise, calculation uncertainties, and measurement errors. The current paper expands on the previous work by providing analysis and testing of hydraulic limitations, as well as benchmarking against other commonly employed leak testing methods. The results confirm the earlier conclusions regarding the method's relatively low cost, high hydraulic fidelity, coverage, and flexibility, while producing extensive output metrics that include detailed leak detection sensitivity maps, false alarm rates, and supporting statistical analysis.
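The core perturbation idea can be sketched in a few lines: a simulated leak signature is superimposed on archived SCADA measurements, so the perturbed series inherit the real data's noise and measurement errors. The function name, the first-order ramp transient, and the pressure sensitivity coefficient below are illustrative assumptions for this sketch, not the paper's actual simulation model.

```python
import numpy as np

def impose_leak_signature(flow_out, pressure, leak_rate, onset_idx, tau=5.0):
    """Superimpose a hypothetical leak perturbation on archived SCADA series.

    flow_out   : archived downstream flow measurements (same units as leak_rate)
    pressure   : archived pressure measurements near the assumed leak site
    leak_rate  : steady-state leak flow to simulate
    onset_idx  : sample index at which the simulated leak begins
    tau        : assumed first-order ramp time constant, in samples
    """
    t = np.arange(len(flow_out))
    # Simplified transient: leak flow ramps from zero to leak_rate after onset.
    ramp = np.where(t >= onset_idx, 1.0 - np.exp(-(t - onset_idx) / tau), 0.0)
    leak = leak_rate * ramp
    # Downstream of the leak, metered flow drops by the leak flow; the pressure
    # response is sketched here with an assumed linear sensitivity of 0.02.
    flow_out_perturbed = flow_out - leak
    pressure_perturbed = pressure - 0.02 * leak
    return flow_out_perturbed, pressure_perturbed, leak
```

Because the perturbation is added to real archived data rather than to a clean simulation, the resulting test cases retain the actual instrumentation noise and SCADA artifacts that a leak detection system would face in operation.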
A leak detection system (LDS) is a safety and integrity-critical component of an operating pipeline, designed to mitigate the negative consequences of an unplanned commodity release, that is, a breach in pipeline hydraulic integrity and its resulting spill. Rapidly detecting the leak and determining its most probable location enables the pipeline operator to respond quickly, effectively, and with precision, thereby reducing the spill's size and negative impacts. Note that the most commonly applied method for leak detection is via Computational Pipeline Monitoring (CPM) systems, which are the explicit focus of this document.
As part of the operator's overall spill response plan, the organization should be able to quantify the leak detection system's predicted performance. This allows the operator to identify areas where further leak detection improvements are desirable and to refine location-specific response plans. It also provides a mechanism by which the LDS performance can be monitored and tracked over time.