Abstract

Traditionally within the energy industry, much emphasis has been placed upon the gathering and management of geoscientific data for easy access, analysis, and decision-making purposes. Typically, however, much less emphasis has been placed on the management of operational data and its integration with geoscientific, economic, and public-domain data sets. As a result, much of the data is never captured or stored, and attempting to access these data is time-consuming and sometimes unsuccessful.

In 1995, Halliburton Energy Services launched a major project to design and implement an environment for managing operational and technical data with integrated applications to access, retrieve, and analyze the data. The foundation of this project was Halliburton's technical data model (HTDM), which has been under development since 1991. While the initial goal was to provide operational and technical data to Halliburton's internal users, it quickly became apparent that efficient management of wellsite data from the point of acquisition presents a major opportunity to the industry for more effective and efficient operations. Informed, timely, optimized decision making is enabled by easy access to these data and by the analysis of these data with integrated application software. Halliburton's plan includes development of a globally distributed relational database with application and communications methods that allow acquisition, access, and analysis of data from any location.

This paper also presents examples of successful implementations of the system that lower costs and improve efficiency through effective data management.

Introduction

For years, the energy industry has gathered and managed geoscientific data for easy access, detailed analysis, and critical decision-making. Such data have included those derived from seismic, geologic, petrophysical, and well-test surveys. Typically, however, the industry has placed much less emphasis on managing operational data and integrating those data with geoscientific, economic, and public-domain data sets. Although much operational data has been captured, they are usually scattered across disconnected and frequently incompatible computer databases. Just as often, operational data are found in hardcopy files in widely separated locations. Thus, the difficulty of gathering operational data from these disjointed sources has made attempts at comprehensive analysis of such data expensive, time-consuming, and often impractical.

Today, operators and service companies concentrate on overall reservoir management and seek the most cost-effective methods to achieve and maintain optimal reservoir performance. A high-quality, well-populated, easily accessible operational database can be analyzed to help minimize operational costs and determine the most successful operational procedures.

In 1995, Halliburton Energy Services launched a major project to design and implement an environment for managing operational and technical data with integrated applications to access and analyze the data. This project was named TIMS (Technical Information Management Systems). To implement the operational aspects of the TIMS project, project goals were clarified and workflow processes were reviewed. Then, a comprehensive relational data model and an associated workflow scheme were developed, and software and hardware were selected to implement the database and workflow scheme. The database and workflow are currently being phased into field operations.
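To illustrate the kind of relational model and analysis workflow described above, the sketch below uses Python's built-in sqlite3 module. All table and column names here are hypothetical and chosen for illustration only; they are not taken from Halliburton's HTDM or the TIMS project.

```python
import sqlite3

# Illustrative sketch of a relational schema for wellsite operational data.
# Table and column names are hypothetical, not drawn from HTDM.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A well entity links operational jobs to other data sets about the well.
cur.execute("""
    CREATE TABLE well (
        well_id INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        field   TEXT
    )
""")

# Each service job captured at the wellsite references its well, so job
# experience can later be queried across product-service lines.
cur.execute("""
    CREATE TABLE job (
        job_id       INTEGER PRIMARY KEY,
        well_id      INTEGER NOT NULL REFERENCES well(well_id),
        service_line TEXT NOT NULL,   -- e.g. cementing, stimulation
        job_date     TEXT NOT NULL,
        cost_usd     REAL
    )
""")

cur.execute("INSERT INTO well VALUES (1, 'Example-1', 'Example Field')")
cur.executemany(
    "INSERT INTO job (well_id, service_line, job_date, cost_usd) "
    "VALUES (?, ?, ?, ?)",
    [
        (1, "cementing", "1995-06-01", 42000.0),
        (1, "stimulation", "1995-08-15", 85000.0),
    ],
)

# Analysis example: job count and total job cost per well, the kind of
# query a well-populated operational database makes straightforward.
cur.execute("""
    SELECT w.name, COUNT(*), SUM(j.cost_usd)
    FROM well w
    JOIN job j ON j.well_id = w.well_id
    GROUP BY w.well_id
""")
row = cur.fetchone()
print(row)  # ('Example-1', 2, 127000.0)
conn.close()
```

The point of the sketch is the structure, not the specific columns: once job records are captured in normalized tables keyed to the well, cross-job and cross-well analyses reduce to ordinary relational queries.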

Project Goals

The database and workflow scheme that were to be developed through the TIMS project would:

  • support and improve mainstay workflow processes across all the company's product-service lines (PSLs)

  • capture and manage job experience to support decision functions
