This article, written by Technology Editor Dennis Denney, contains highlights of paper OTC 17729, "The Key to Unlocking Innovation in the Upstream? Data Standards," by S.C. Comstock, ExxonMobil, prepared for the 2005 Offshore Technology Conference, Houston, 2-5 May. Copyright 2005 Offshore Technology Conference. Reproduced by permission.

The key to unlocking innovation in the upstream is standards. What? Does this sound backward and counterintuitive? Do we not usually associate innovation with freewheeling, no-boundaries approaches? How can people feel free to innovate if you tell them to “do it this way”? It all depends on where you think the value comes from.

Background

Technical computing in the upstream oil and gas industry has evolved into extremely powerful, yet extremely complex, systems. Varied and convoluted connectivity networks underlie a myriad of platforms: Unix workstations, PCs, mainframes, supercomputers, server farms, thin clients, Blackberries, and the list goes on. At the other end of these systems, a multitude of user interfaces becomes the most visible aspect of these complexities to the user community. At the center of such complex technical computing systems is a variety of software applications, including geoscience, engineering, operations, business, personal productivity, and information technology (IT), usually supplemented out of necessity by a veritable foundry of line-coded tools, agents, or macros. These line codes are written by any of several million developers, sometimes by end users themselves, and in many different languages. The fuel and lifeblood of these technical computing systems are data. These data come in many formats, sizes, and shapes. They come in exponentially increasing volumes, in numbers that make “a million” seem a quaint and archaic term.
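To make the “line code” problem concrete, here is a minimal sketch (not from the paper) of the kind of one-off glue script described above: a throwaway converter that reshapes one vendor's fixed-width log export into a spreadsheet-friendly CSV. The file names and column positions are hypothetical.

```python
# Hypothetical one-off "line code": reshape a proprietary fixed-width
# well-log export into CSV for a spreadsheet. Every such script bakes
# in one vendor's layout, so each new format needs another script.
import csv

def convert(src_path="vendor_export.txt", dst_path="log.csv"):
    with open(src_path) as src, open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["depth_m", "gamma_api", "porosity_frac"])
        for line in src:
            # Column positions guessed from one sample file: exactly
            # the fragile, undocumented knowledge these macros encode.
            writer.writerow([float(line[0:10]),
                             float(line[10:18]),
                             float(line[18:26])])
```

Multiply that script by every format, tool, and asset team, and the scale of the maintenance burden becomes clear.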

Overload

With all of these elements and their interrelationships complicating the systems, complexity has become the enemy of innovation. Too many engineers, geoscientists, field operators, accountants, and managers (not to mention IT specialists) spend too much time dealing with the complexity and moving parts of the system and, so far, too little time evaluating, analyzing, solving, and innovating in the environment.

While there are many targets for simplification in these technical computing systems, data are the most critical element. In the upstream environment, data are at the core of the systems, resting on the networked infrastructure and feeding the applications and user interfaces. Data are what the systems were built for and around. After all, scientific software applications without data are, in essence, just theory. And computer infrastructure is not needed if data do not exist; software without data does not need it either. Of all the elements of the system, data are probably in the worst shape in terms of their wild and complex nature.
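As an illustration of the alternative, here is a sketch (again, not the paper's example) of reading a well record expressed in a single agreed-upon XML schema. The element and attribute names are hypothetical, only loosely modeled on the industry's WITSML family of standards; the point is that one reader against one shared schema can replace a drawer full of scripts like the one above.

```python
# Minimal sketch: one parser for one agreed-upon schema.
# Element and attribute names are hypothetical, loosely WITSML-flavored.
import xml.etree.ElementTree as ET

SAMPLE = """<well uid="W-001">
  <name>Example 1</name>
  <log>
    <point depth="1500.0" gamma="85.2" porosity="0.21"/>
    <point depth="1500.5" gamma="88.0" porosity="0.20"/>
  </log>
</well>"""

def read_points(xml_text):
    """Yield each log point as a dict of floats."""
    root = ET.fromstring(xml_text)
    for pt in root.iter("point"):
        yield {name: float(value) for name, value in pt.attrib.items()}

for row in read_points(SAMPLE):
    print(row)
```

Once the schema is fixed, every application, agent, and macro can share this one reader, and the effort now spent on format plumbing can go toward analysis.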
