A better understanding of how to integrate applications that require high-speed data exchange with the realtime control system is paramount to deploying these technologies successfully. This paper addresses the implications of data exchange between realtime and application systems, and the associated impact on the data requirements specific to a pipeline simulation application.

Throughout today's petrochemical industry, a growing number of applications are required to ensure pipeline safety and operational integrity. These applications rely on information from the realtime control system, and more than ever before, this information must be obtained and processed in realtime or near-realtime fashion. Not only must process values be retrieved; their associated status, quality, and time-stamp information must be obtained as well. Taken together, this information determines an individual data element's value within the context of the application. It is required for all critical simulation-relevant values such as pressures, flows, temperatures, valve positions, and pump status. Many applications may require this information, but pipeline simulation products live and die by its frequency and quality.

As a company's operations change through acquisition and through improved monitoring capacity, the simulation application will continue to grow and require ever more information. The more data that is available from the real world, the more accurate and timely are the resulting applications, such as leak detection and batch tracking. Timely and accurate leak detection is paramount to assuring pipeline safety and minimizing any potential crisis situation. Batch tracking accuracy provides for more accurate cuts or transfers, with less contamination and fewer missed delivery amounts, which translates into bottom-line dollars.

Server-side issues that will be addressed include:

  • Impact to the realtime server by providing all required data

  • Consideration of push versus pull transfers as an exchange methodology

  • Exchange protocols such as OPC, ODBC, DDE, or proprietary interfaces

  • Write-back approaches for alarms, events, and other model-generated data

The client-side issues pertain primarily to how the data is retrieved and how the information is processed:

  • Proper distillation of the various status and quality flags to determine a value's actual data quality in terms of the application's requirements

  • Frequency of data acquisition

  • Data specification

  • Aggressive quantification of the value of each data element, thereby bounding the amount of data to be exchanged

The visualization of information becomes the next issue. Each simulation vendor has developed a proprietary HMI, but experience has shown that many customers desire an integrated solution. Following the exchange of data, how the resulting information is presented ultimately defines how valuable the application is to Operations. In general, the topics for this portion include:

    • Display-Integration versus Separate-But-Equal

    • Display characteristics for data presentation and distribution

    • Security and how to account for it
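To make the push-versus-pull distinction above concrete, the following is a minimal sketch only, not any vendor's actual interface: the `RealtimeServer` class, the tag name, and the change-detection rule are all assumptions invented for illustration. Pull load on the server scales with the client's poll rate whether or not values change; push (exception reporting) notifies subscribers only on change.

```python
from typing import Callable, Dict, List

class RealtimeServer:
    """Hypothetical realtime server offering both exchange styles."""

    def __init__(self) -> None:
        self._tags: Dict[str, float] = {}
        self._subs: Dict[str, List[Callable[[str, float], None]]] = {}

    # --- pull side: the simulation polls for the current value each cycle ---
    def read(self, tag: str) -> float:
        return self._tags[tag]

    # --- push side: the simulation registers interest once ---
    def subscribe(self, tag: str, cb: Callable[[str, float], None]) -> None:
        self._subs.setdefault(tag, []).append(cb)

    def update(self, tag: str, value: float) -> None:
        """Called by the control system; pushes only when the value changes."""
        changed = self._tags.get(tag) != value
        self._tags[tag] = value
        if changed:  # exception reporting: unchanged scans cost subscribers nothing
            for cb in self._subs.get(tag, []):
                cb(tag, value)
```

A subscriber sees one callback per change, while a polling client pays the read cost every cycle regardless; that trade-off is the heart of the server-impact question raised in the first bullet list.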

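The client-side "distillation of status and quality flags" might be sketched as follows. This is an illustrative example under stated assumptions: the three-level `AppQuality` grading, the 30-second staleness limit, and the `Measurement` record are inventions for this sketch; only the quality-class encoding (bits 7-6 of the OPC DA quality word: bad, uncertain, good) follows the OPC Data Access convention.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from enum import Enum

class AppQuality(Enum):
    USABLE = "usable"    # feed directly into the simulation
    SUSPECT = "suspect"  # usable, but degrade model confidence
    REJECT = "reject"    # fall back to last-good or estimated value

@dataclass
class Measurement:
    value: float
    opc_quality: int     # OPC DA quality word; bits 7-6 encode the class
    timestamp: datetime

MAX_AGE = timedelta(seconds=30)  # assumed staleness limit for this example

def distill(m: Measurement, now: datetime) -> AppQuality:
    """Collapse quality bits plus timestamp age into one app-level grade."""
    major = (m.opc_quality >> 6) & 0b11  # 0 = bad, 1 = uncertain, 3 = good
    if major == 0:
        return AppQuality.REJECT
    if now - m.timestamp > MAX_AGE:      # stale data is suspect even if "good"
        return AppQuality.SUSPECT
    return AppQuality.USABLE if major == 3 else AppQuality.SUSPECT
```

The point of such a mapping is that the simulation never reasons about raw SCADA flags directly; every value arrives pre-graded in terms the application's own requirements define.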