ABSTRACT

Until now, subsea pipeline design has been the preserve of skilled experts, able to discriminate between technical solutions using specific software or calculation models to reach a conclusion. However, with the increasing level of detail justifiably required by oil companies, this experience is not enough when coupled with outdated tools. In a day and age where data is created, stored, shared and improved every second, the methodologies used for the past few decades will not suffice. This sets the stage for a profound transformation, at the heart of which is the SPIDEV suite.

INTRODUCTION

In a context of low barrel prices and cutting-edge designs, the current trend in the oil & gas sector calls for shorter project timeframes as well as more refined analyses, with more precise calculations and additional sensitivities required. In this environment, the linear workflow approach widely used in the industry up to now becomes a critical point in project execution. Many engineering activities have been carried out as a rather sequential, individual process: calculations and sensitivities are performed using local, self-contained files (e.g. spreadsheets, local finite element models and isolated macros) stored on engineers' workstations.

This isolated approach shows its limits:

  • Data transmission and data duplication: a process such as a sealine design entails various activities which need to be broken down into several computation tools. Any transition from one tool to the next requires data to be copied, which, if done manually and repeatedly, as is the case when multiple sensitivities are performed, carries a significant risk. How can data be traced? How do we ensure the consistency of identical data sets across all activities? A minimal illustration of a shared, single-source data set is sketched after this list;

  • Tool and model versioning: the very nature of the pipeline design activity (and oil field development as a whole) makes a "one size fits all" approach incompatible with a client's requirements. As a contractor, we aim to fulfil our clients' particular requirements, which is synonymous with modifications or upgrades to the existing tools. On the contractor's side, however, these improvements and features must be recorded so that all future projects can use them, should the need arise. In this context, an isolated approach leads to tools branching off one another, hindering the capitalization of know-how. The result is tools and models holding unshared improvements, and centralized tools that are never quite up to date, see Fig. 1;

  • Isolated tools are generally perceived as means of calculation only. As such, they are devoid of import and export solutions, rendering the reporting or communication of results quite tedious and comparison across several models flat-out impossible. As a result, the natural process of discussing a potential solution with a client, another engineering center or even within the same team is significantly slowed down;

  • Finally, an isolated tool is considered, in itself, of too little interest to warrant protection of the intellectual property it contains. As with the point above, sharing becomes more of a headache.
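
To illustrate the first point above, the sketch below shows, in plain Python, one way of holding a single consistent data set that every calculation activity references rather than copies; a sensitivity then becomes an explicit variant of that data set instead of a manually edited local file. This is an illustrative sketch only, under the assumption of a simple in-memory data model: the class and function names (PipelineData, hoop_stress_check, submerged_weight_report) are hypothetical and are not part of the SPIDEV suite.

    # Illustrative sketch only: one source of truth for design data, shared by
    # several calculation activities instead of being copied between local files.
    # All names are hypothetical and do not describe SPIDEV internals.
    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class PipelineData:
        """One consistent data set, referenced (never copied) by every activity."""
        outer_diameter_m: float
        wall_thickness_m: float
        steel_grade_smys_pa: float
        design_pressure_pa: float
        water_depth_m: float

    def hoop_stress_check(data: PipelineData) -> float:
        """Barlow (thin-wall) hoop stress expressed as a fraction of SMYS."""
        hoop = data.design_pressure_pa * data.outer_diameter_m / (2.0 * data.wall_thickness_m)
        return hoop / data.steel_grade_smys_pa

    def submerged_weight_report(data: PipelineData) -> str:
        """A second activity reading the *same* data set: no duplication, no drift."""
        return (f"D = {data.outer_diameter_m:.4f} m, "
                f"t = {data.wall_thickness_m * 1000:.1f} mm, "
                f"water depth = {data.water_depth_m:.0f} m")

    if __name__ == "__main__":
        # A sensitivity is an explicit, traceable variant of the base data set,
        # not a silently edited local copy on someone's workstation.
        base = PipelineData(0.3239, 0.0159, 450e6, 25e6, 1500.0)
        thicker_wall = replace(base, wall_thickness_m=0.0191)
        for case in (base, thicker_wall):
            print(submerged_weight_report(case),
                  "| hoop utilisation =", round(hoop_stress_check(case), 3))

Because the data set is immutable, any change must go through an explicit variant, which is precisely the traceability that manual copy-paste between isolated tools cannot provide.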
