1-13 of 13

Keywords: input parameter


Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 73 (03): 53–54.

Paper Number: SPE-0321-0053-JPT

Published: 01 March 2021

Abstract

This article, written by JPT Technology Editor Judy Feder, contains highlights of paper SPE 201662, “A Well-Flux Surveillance and Ramp-Up Method for Openhole Standalone Screen Completion,” by Mehmet Karaaslan and George K. Wong, University of Houston, and Kevin L. Soter, SPE, Shell, et al., prepared for the 2020 SPE Annual Technical Conference and Exhibition, originally scheduled to be held in Denver, Colorado, 5-7 October. The paper has not been peer reviewed. Production and surveillance engineers need practical models to help maximize production while avoiding ramping up the well to an extent that the completion is damaged, causing well impairment or failure. The complete paper presents a well-flux surveillance method to monitor and ramp up production for openhole standalone screen (OH-SAS) completions that optimizes production by considering risks of production impairment and screen-erosion failure. Challenges of Increased Production vs. Well Failure The tension between increased production and the risk of well impairment or failure is especially pressing for sand-control wells in deepwater, where projects tend to have a small number of high-rate wells. In such environments, any well impairments or failures greatly affect the project economics. Following unloading, well surveillance faces the critical step of ramping up toward the well’s designed peak rate for the first time, when the actual well performance is uncertain. To reduce risk of well impairment or failure, surveillance information and models are needed to help make adjustments during the ramp-up process. Available models range from simple to complex and from small to large in required input data and computational effort. Simple nonsurveillance models use field-derived operating limits of completion pressure drop and flow velocity or flux. They are nonsurveillance models in the sense that no direct linkage exists between surveillance results and updated flux calculations.
Simple surveillance models use pressure transient analysis (PTA) results and completion information to evaluate changing well performance and adjust the ramp-up and long-term surveillance operations. The complex surveillance model evaluates well performance and adjusts well operations using probabilistic completion failure risks and coupled reservoir and completion simulations. These models mainly focus on cased-hole gravel pack and frac-pack applications. For openhole completions with sand control, the literature offers limited ramp-up surveillance references. The objective of the well-flux model described in the complete paper is to ramp up the well safely and optimize production using PTA results as surveillance inputs to calculate completion fluxes for well impairment or failure assessment. The method follows an approach presented in the literature.
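The flux-based ramp-up idea summarized above can be sketched as a simple check: convert a measured rate into a completion flux and compare it against operating limits. A minimal sketch, assuming an openhole screen geometry; the area calculation, limit values, and thresholds below are illustrative assumptions, not the paper's model.

```python
# Hypothetical flux check during ramp-up of an openhole standalone
# screen completion. All geometry and limit values are assumed.
import math

def completion_flux(rate_m3_per_day, screen_od_m, interval_m, open_fraction):
    # Flux = volumetric rate / effective open-to-flow screen area.
    area = math.pi * screen_od_m * interval_m * open_fraction
    return rate_m3_per_day / area   # m3/day per m2, i.e. m/day

EROSION_LIMIT = 150.0   # m/day, assumed screen-erosion threshold
IMPAIR_LIMIT = 100.0    # m/day, assumed impairment threshold

flux = completion_flux(rate_m3_per_day=4000.0, screen_od_m=0.14,
                       interval_m=300.0, open_fraction=0.5)
status = ("ok" if flux < IMPAIR_LIMIT
          else "impairment risk" if flux < EROSION_LIMIT
          else "erosion risk")
print(f"flux = {flux:.1f} m/day -> {status}")
```

In the paper's method the flux estimate would instead be updated from pressure-transient-analysis surveillance results rather than fixed geometry alone.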

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 68 (07): 63–64.

Paper Number: SPE-0716-0063-JPT

Published: 01 July 2016

Abstract

This article, written by Special Publications Editor Adam Wilson, contains highlights of paper SPE 169405, “Use of Emulator Methodology for Uncertainty-Reduction Quantification,” by C. Ferreira, Universidade Estadual de Campinas; I. Vernon, Durham University; D.J. Schiozer, SPE, Universidade Estadual de Campinas; and M. Goldstein, Durham University, prepared for the 2014 SPE Latin American and Caribbean Petroleum Engineering Conference, Maracaibo, Venezuela, 21–23 May. The paper has not been peer reviewed. Most simulation models go through a series of iterations before being judged as giving an adequate representation of the physical system. This can be difficult because the input space to be searched may be high dimensional, the collection of outputs to be matched may be very large, and each single evaluation may take a long time. Because the uncertainty analysis is complex and time consuming, in this paper, a stochastic representation of the computer model, called an emulator, was constructed to quantify the reduction in the parameter input space. Introduction Reservoir simulators are important and widely used in reservoir management. They are used in reservoir-performance prediction and for decision making. These simulators are computer implementations of high-dimensional mathematical models for reservoirs, where the model inputs are physical parameters and the outputs are observable characteristics such as well-pressure measurements and fluid production. Uncertainties are always present in the reservoir-characterization process; thus, input parameters are usually uncertain and so is the simulator output. The procedure to calibrate the reservoir-simulation model is called history matching. On the basis of observed data, a set of possible input choices for the reservoir model is identified. Two different procedures can be used to perform the history matching: deterministic and probabilistic approaches. 
The deterministic approach involves running the initial simulation model with different input values to obtain one simulation model between many probable matches to the field data. In a probabilistic approach, in which several reservoir-model scenarios are considered, the uncertainty analysis procedure is used. Identifying the input parameters for which the simulation outputs match the observed data can be a difficult task because the input space to be searched may be high dimensional, the collection of outputs to be matched may be very large, and each single evaluation may take a long time.
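The probabilistic history-matching procedure described above can be sketched as a simple rejection loop: sample candidate input parameters, run the simulator, and keep candidates whose output matches observed data within tolerance. A toy sketch only; the linear stand-in "simulator", the parameter ranges, and the tolerance are illustrative assumptions, not the paper's emulator methodology.

```python
# Toy probabilistic history match by rejection sampling. The cheap
# linear function below stands in for an expensive reservoir simulator.
import random

random.seed(1)

def simulator(perm, poro):
    # Stand-in: predicted cumulative production (arbitrary units)
    # as a simple function of the uncertain inputs.
    return 120.0 * perm + 300.0 * poro

observed = 150.0   # "field" cumulative production, assumed
tolerance = 5.0    # acceptable mismatch, assumed

accepted = []
for _ in range(5_000):
    perm = random.uniform(0.5, 1.5)    # permeability multiplier
    poro = random.uniform(0.05, 0.25)  # porosity fraction
    if abs(simulator(perm, poro) - observed) <= tolerance:
        accepted.append((perm, poro))

print(f"{len(accepted)} of 5000 candidate models match the history")
```

An emulator, as in the paper, replaces the simulator call with a fast statistical approximation so far more candidates can be screened.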

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 68 (01): 60–61.

Paper Number: SPE-0116-0060-JPT

Published: 01 January 2016

Abstract

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 173295, “Method for Adaptive Optimization of EOR Performance Under Uncertainty,” by N. Chugunov, T.S. Ramakrishnan, A. Lukyanov, and W.J. Bailey, Schlumberger, and D. Wilkinson, Efficient Solutions, prepared for the 2015 SPE Reservoir Simulation Symposium, Houston, 23–25 February. The paper has not been peer reviewed. Enhanced-oil-recovery (EOR) performance may be assessed quantitatively through a variety of metrics. Because formation and fluid properties are often poorly characterized, however, performance predictions of EOR processes are uncertain. The authors present a method to reduce such uncertainty in EOR performance adaptively while identifying an optimal operational strategy for a given tolerance to risk. The proposed approach allows one to reduce uncertainty progressively in the predicted performance of an iteratively optimized EOR strategy by systematic reduction of uncertainty in identified properties of the reservoir. Introduction Classic approaches to optimization under uncertainty use a mean-variance approach. However, they do not provide necessary insight into the underlying nature of the uncertainties inherent in the optimized model. More importantly, they do not provide any specific quantitative guidance on reducing such uncertainty, which is necessary from an operational point of view. This issue is addressed in this study directly by combining global sensitivity analysis (GSA) with optimization under uncertainty, in an adaptive work flow, for systematic uncertainty reduction of the optimized model prediction. Application of GSA to address various problems arising in the industry has been discussed in a number of studies. However, these studies focused mainly on quantifying uncertainty for specific physical quantities and using that analysis to gain insight into the subsequent measurement-program design and interpretation.
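The classic mean-variance approach the paper builds on can be sketched as scoring each candidate operating strategy by mean outcome minus a risk-tolerance multiple of its spread. The NPV stand-in, the candidate strategies, and all numbers are illustrative assumptions, not the paper's adaptive GSA workflow.

```python
# Mean-variance selection over candidate strategies under an uncertain
# reservoir property. All functions and values are hypothetical.
import random
import statistics

random.seed(0)

def npv(rate, perm):
    # Stand-in NPV: higher rate pays off only in higher-permeability
    # realizations; the quadratic term penalizes overinvestment.
    return rate * perm - 0.5 * rate ** 2

strategies = [0.5, 1.0, 1.5, 2.0]                        # candidate rates
perms = [random.gauss(1.2, 0.4) for _ in range(2_000)]   # uncertain property

risk_tolerance = 1.0   # how strongly spread is penalized
scores = {}
for rate in strategies:
    vals = [npv(rate, k) for k in perms]
    scores[rate] = statistics.fmean(vals) - risk_tolerance * statistics.pstdev(vals)

best = max(scores, key=scores.get)
print(f"best rate under mean-variance: {best}")
```

The paper's point is that this baseline gives no guidance on *which* uncertainty to reduce; global sensitivity analysis supplies that missing step.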

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 66 (01): 91–92.

Paper Number: SPE-0114-0091-JPT

Published: 01 January 2014

Abstract

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 166788, "Uncertainty Evaluation of Wellbore-Stability-Model Predictions," by John Emeka Udegbunam, Bernt Sigve Aadnoy, SPE, and Kjell Kare Fjelde, SPE, University of Stavanger, prepared for the 2013 SPE/IADC Middle East Drilling Technology Conference and Exhibition, Dubai, 7-9 October. The paper has not been peer reviewed. The purpose of this work is to investigate typical fracture and collapse models with respect to accuracies in the input data. Uncertainties in the input data will be considered to show how they contribute to the cumulative uncertainties in model predictions. In this approach, the input parameters are assigned appropriate probability distributions. The distributions are then applied in the wellbore-stability models. By means of Monte Carlo simulation, the uncertainties are propagated and outputs, which follow a probability distribution, are generated. Introduction Wellbore-stability analysis is necessary for safe drilling operations, especially now that oil and gas operators venture into more-challenging environments. A wide range of parameters is required for accurate study, many of which are subject to uncertainties caused by measurement errors. Error also can be introduced into data through the methods of interpretation used. Epistemic error, arising from imperfect human knowledge of a system, is another source of input uncertainties. Analytical models used for wellbore-stability analysis are also often associated with uncertainties. Mathematical modeling algorithms only try to approximate physical processes and are not true representations of the problems under study. The modelers should be aware of the imprecision and limitations of these physical models. Thus, output uncertainty stems from the variations in input data and uncertainties caused by wellbore-stability-modeling processes. Expected values give no information about uncertainty.
Deterministic estimation of the downhole pressure limits provides only single-point values that lack variability information. Instead, probability distributions can be used. With this approach, cumulative uncertainties in the output predictions can be quantified, leading to a more-informed decision. In-Situ Stress Field For a given formation, the starting point in wellbore-stability analysis is the in-situ or pre-existing stress state. Knowledge of the stress state is key to handling borehole problems such as fracturing, lost circulation, collapse, and sand production. The in-situ stress state is normally assumed to coincide with vertical and horizontal directions. In relaxed depositional basins, the values of these horizontal stresses are usually lower than those of the vertical stress. The horizontal-stress magnitudes, however, may exceed those of the vertical stress in strongly tectonic regions. A stress state can be defined as normal-fault, reverse-fault, or strike/slip-fault state of stress. The normal-fault stress state is assumed in this work. If the magnitudes of the three principal stresses and the direction of one of the stresses are known, then the stress state can be specified. The stress concentration is usually very high around the borehole wall. This effect decreases rapidly away from the hole. At a long distance from the wellbore, the principal in-situ stresses are undisturbed and lie along their in-situ directions.
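The propagation step described above — assign distributions to the inputs, push them through the wellbore-stability model, read percentiles off the output — can be sketched in a few lines. The Kirsch-type fracture-pressure expression and every distribution parameter below are illustrative assumptions, not the authors' calibrated models.

```python
# Monte Carlo propagation of input uncertainty through a simple
# fracture-pressure model (vertical well, non-penetrating case).
# All stress/pressure distributions are hypothetical, in MPa.
import random
import statistics

random.seed(0)

def fracture_pressure(sh_min, sh_max, pore_p, tensile):
    # Kirsch-type fracture-initiation pressure around a vertical hole.
    return 3.0 * sh_min - sh_max - pore_p + tensile

N = 20_000
samples = []
for _ in range(N):
    sh_min = random.gauss(60.0, 3.0)    # min horizontal stress
    sh_max = random.gauss(70.0, 4.0)    # max horizontal stress
    pore_p = random.gauss(40.0, 2.0)    # pore pressure
    tensile = random.gauss(5.0, 1.0)    # tensile strength
    samples.append(fracture_pressure(sh_min, sh_max, pore_p, tensile))

samples.sort()
p10, p50, p90 = (samples[int(q * N)] for q in (0.10, 0.50, 0.90))
mean = statistics.fmean(samples)
print(f"mean={mean:.1f}  P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f}")
```

The P10–P90 band is exactly the "variability information" the abstract says a single deterministic estimate lacks.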

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 62 (07): 41–42.

Paper Number: SPE-0710-0041-JPT

Published: 01 July 2010

Abstract

This article, written by Senior Technology Editor Dennis Denney, contains highlights of paper SPE 124815, "Pros and Cons of Applying Proxy Models as a Substitute for Full Reservoir Simulations," by D.I. Zubarev, SPE, SPT Group, prepared for the 2009 SPE Annual Technical Conference and Exhibition, New Orleans, 4-7 October. The paper has not been peer reviewed. A comparative study was made of proxy-modeling methods (also known as surrogate modeling or metamodeling) as a computationally inexpensive alternative to full numerical simulation in assisted history matching, production optimization, and forecasting. The study demonstrated the solution-space complexity for different simulation models and the applicability of the proxy models to mimic it. Focus was given to practical aspects of model construction and to limitations of which engineers should be aware. Introduction Recent improvements in computational hardware and software have advanced reservoir modeling. However, for many workflows in uncertainty quantification and optimization with application to reservoir simulation, the availability of computing resources is still a limiting factor. Here, a "proxy model" is a mathematically or statistically defined function that replicates the simulation-model output for selected input parameters. Typical applications of proxy models in reservoir simulation include the following:

- Sensitivity analysis of uncertainty variables
- Probabilistic forecasting and risk analysis
- Conditioning a simulation model to observed data (history matching)
- Field-development planning and production optimization

Proxy models, combined with design-of-experiment techniques, are used widely for sensitivity analysis. Application scenarios include the traditional one-parameter-at-a-time approach for linear-sensitivity analyses and advanced experimental designs that are capable of resolving correlation and higher-order effects.
For probabilistic forecasting, proxy models are used routinely as input to a Monte Carlo sampling process. The high computational efficiency of proxy models enables exhaustive sampling rates. This study investigated the predictive quality and computational effort required for different proxy-modeling algorithms to provide acceptable results in solving reservoir-simulation problems. Three simulation models of different complexities were selected to understand the effect of the simulation-model structure on the efficiency of the proxy-model performance. Different application workflows [i.e., history matching, production optimization, hydrocarbons initially in place (HCIIP), and oil-recovery forecasting] were used to investigate the effect of the simulation-model output-data type on the proxy-model performance.
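The proxy-plus-Monte-Carlo workflow described above can be sketched end to end: run the "expensive" simulator at a few design points, fit a cheap polynomial proxy, then sample the proxy exhaustively. The recovery-factor function and the uniform input distribution are illustrative assumptions, not any model from the study.

```python
# Fit a polynomial proxy to a handful of design-of-experiments runs,
# then Monte Carlo-sample the proxy instead of the simulator.
import numpy as np

rng = np.random.default_rng(0)

def expensive_simulator(x):
    # Stand-in for a full simulation run: recovery factor vs. a
    # normalized uncertain input x in [0, 1].
    return 0.15 + 0.30 * x - 0.10 * x ** 2

# Design of experiments: five simulator runs.
x_design = np.linspace(0.0, 1.0, 5)
y_design = expensive_simulator(x_design)

# Quadratic proxy fitted to the design runs.
coeffs = np.polyfit(x_design, y_design, 2)

# Exhaustive Monte Carlo on the cheap proxy.
x_mc = rng.uniform(0.0, 1.0, 100_000)
y_mc = np.polyval(coeffs, x_mc)
p10, p90 = np.percentile(y_mc, [10, 90])
print(f"proxy P10={p10:.3f}  P90={p90:.3f}")
```

Here the proxy is exact because the stand-in response is itself quadratic; the paper's caution is precisely that real simulation responses may be too complex for a low-order proxy to mimic.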

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 61 (04): 59–61.

Paper Number: SPE-0409-0059-JPT

Published: 01 April 2009

Abstract

This article, written by Assistant Technology Editor Karen Bybee, contains highlights of paper SPE 115251, "Reducing Time to First Gas: Lessons Learned in Expediting and Informing International Investment Chain Decisions," by Deborah D. Resley, SPE, and Christopher Reinsvold, SPE, Decision Strategies, originally prepared for the 2008 SPE Asia Pacific Oil and Gas Conference and Exhibition, Perth, Australia, 20-22 October. The paper has not been peer reviewed. The monetization of gas, particularly in liquefied natural gas (LNG) projects, requires large capital investment in most links of the gas value chain from production to end use. Companies that invest in a single component of the value chain, such as nonintegrated gas producers, must understand the value of and risks inherent in every link in the chain, because gas will be priced in relation to its final point of sale. Final sale price as well as the effects of processing and capacity limitations ripple their way upstream to alter the timing, demand, and viability of the upstream investment. Introduction LNG projects are good examples of complex global gas-development projects with large capital-investment requirements. To build a commercial LNG project from gas production to point of final sale requires investment in and development of several capital projects. These linked or integrated projects serve to produce, treat, transport, and sell the gas in specific markets. These projects are joined in a value chain by at least 16 major commercial contracts, beginning with a production-sharing contract at the wellhead and ending with a gas-sales agreement in the end-use market. The entire set of investment decisions in the LNG chain can be driven by objectives such as the ability to book gas reserves by a certain date, optimizing the entire gas portfolio of assets, and in the case of companies with a portfolio of LNG assets, the potential for arbitrage and the ability to leverage and trade each LNG asset. 
LNG projects now have even more complex business chains than 10 years ago, involving multiple investors in various parts of each chain. Successfully commercializing gas in integrated LNG projects and achieving acceptable returns requires numerous investment decisions. Building such a chain involves high-level coordination and integration of both technical and commercial processes, the option of reducing costs through increasing throughput, and leveraging economies of scale to reduce plant costs.

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 60 (12): 59–61.

Paper Number: SPE-1208-0059-JPT

Published: 01 December 2008

Abstract

This article, written by Technology Editor Dennis Denney, contains highlights of paper SPE 110765, "Modeling the Economic Impact of Cognitive Biases on Oil and Gas Decisions," by M.B. Welsh, SPE, and S.H. Begg, SPE, University of Adelaide, and R.B. Bratvold, SPE, University of Stavanger, prepared for the 2007 SPE Annual Technical Conference and Exhibition, Anaheim, California, 11-14 November. The paper has not been peer reviewed. Cognitive biases are known to affect decisions made under conditions of uncertainty. Previous demonstrations of these biases have focused on their effect on a single-parameter, typically technical, judgment rather than examining the potential effect on economics when applied to all of the judgments involved in a complex oil and gas (O&G) decision such as reservoir characterization. Three individual biases—overconfidence, trust heuristic, and availability heuristic—were modeled to measure their effect on economic outcome as represented by net present value (NPV). Introduction Cognitive biases are unconscious errors in judgments and decisions (particularly those made under conditions of uncertainty) that arise from the inherent structure and functioning of the brain's cognitive architecture. These errors affect most subjective judgments of probability and value and can be multiplicative—the same biases affecting multiple parameters, thus increasing the effect with the complexity of calculations made with those parameters. The effects that such biases can have on the complex decisions in the O&G industry, however, have not been assessed. Part of the reason is the perceived difference in scale between simple judgments, on which biases operate, and large exploration and development decisions. This difference is, however, largely illusory because the largest industry decisions, at their base, depend on the judgments of individuals and are, thus, vulnerable to any biases that affect those individuals. 
Many industry decisions are made by teams, but studies suggest that, in many cases, the "team" decision actually is based largely on the judgment of the most trusted member of that team. Decisions that seem particularly vulnerable to biases are choices of distributions for input parameters that go into simple volumetrics calculations. Such parameters include average porosity, net-/gross-thickness ratio, and formation volume factor. Given the high degree of subjectivity in assessing these distributions and the difficulty of calibrating the assessments, they are extremely susceptible to cognitive bias. The problem for large O&G decisions is that they rely on many such subjective estimates, all of which are likely to be affected by bias. Modeling Biases. Three biases were chosen for modeling: overconfidence, trust heuristic, and availability heuristic. The models used for overconfidence and availability heuristic differ slightly but were run using an offshore-development decision regarding a large, but economically marginal, field. To limit the scope of the problem, bias and uncertainty were restricted to parameters affecting volumetrics. The development characteristics were determined by the interplay of reserves according to realistic rules (including determining the number of wells appropriate to a discovery of a certain size) and by capacity limits, development schedule, and pressure depletion.
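The effect of overconfidence on the volumetrics parameters named above can be illustrated with a small Monte Carlo: honest input distributions versus artificially narrowed (overconfident) ones. The OOIP formula is standard (rock volume × net/gross × porosity × oil saturation / formation volume factor), but every number below is an assumption for illustration, not data from the paper.

```python
# Overconfidence demo: narrowing input distributions shrinks the
# predicted P10-P90 range of original oil in place (OOIP).
import random

def ooip_samples(spread, n=20_000, seed=0):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        phi = rng.gauss(0.22, 0.02 * spread)    # average porosity
        ntg = rng.gauss(0.70, 0.05 * spread)    # net/gross ratio
        bo = rng.gauss(1.30, 0.05 * spread)     # formation volume factor
        rock_vol = 1.0e8                         # gross rock volume, m3 (fixed)
        sw = 0.25                                # water saturation (fixed)
        out.append(rock_vol * ntg * phi * (1.0 - sw) / bo)
    out.sort()
    return out

calibrated = ooip_samples(spread=1.0)     # honest input uncertainty
overconfident = ooip_samples(spread=0.4)  # biased, too-narrow inputs

def p10_p90_width(s):
    return s[int(0.9 * len(s))] - s[int(0.1 * len(s))]

print("P10-P90 width, calibrated:    %.3g" % p10_p90_width(calibrated))
print("P10-P90 width, overconfident: %.3g" % p10_p90_width(overconfident))
```

The narrowed case understates the true range of outcomes, which is how the bias propagates into NPV in the paper's model.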

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 59 (09): 80–83.

Paper Number: SPE-0907-0080-JPT

Published: 01 September 2007

Abstract

This article, written by Technology Editor Dennis Denney, contains highlights of paper SPE 100944, "Practical Approach To Achieve Accuracy in Sanding Prediction," by K. Qiu, SPE, J.R. Marsden, SPE, J. Alexander, SPE, and A. Retnanto, SPE, Schlumberger, and O.A. Abdelkarim, SPE, and M. Shatwan, Agoco, prepared for the 2006 SPE Asia Pacific Oil & Gas Conference and Exhibition, Adelaide, Australia, 11–13 September. Sand production can reduce oil production, cause erosion in downhole and surface facilities, require additional separation and disposal, and lead to significant economic loss. Precautionary, but unnecessary, sand prevention will result in unwarranted reduction in productivity. Overestimating or underestimating sanding risk increases the chances of serious economic loss. Reliable sanding-prediction analysis provides a basis for designs that achieve appropriate sand-management strategies and maximize economic production. Introduction The giant Messla field is in the southeast portion of Sirte basin in Libya, approximately 500 km southeast from Benghazi. The field has been producing for more than 30 years, and since the mid-1980s, some wells have suffered massive sanding, while others have not. A geomechanics and sanding study was initiated in 2005 to investigate this variation, to evaluate the severity of sanding risk in other wells, and to provide information and interpretations needed to design appropriate completions, maximize economical production, and optimize future reservoir management. Sanding analyses were conducted by use of a proprietary sanding-prediction application. This application incorporates a novel analytical model that integrates a simple linear elastic analysis and then accounts for rock failure, plasticity effects, and scale effects. Information is from simple laboratory tests and the results of previous extensive numerical modeling and field validation. 
Significant factors contributing to sand production include stresses, rock strength, drawdown and depletion, and completion type and geometry; accounting for these allows the analysis to achieve practicability and fit-for-purpose accuracy. The implementation allows modifying the analytical elastic calculations to achieve a more realistic and accurate prediction. The remaining major uncertainties in the predictive capacity of the model arise from uncertainties of the input parameters rather than from the model itself. Therefore, to achieve the desired accuracy in the analyses and predictions, attention should focus on decreasing uncertainties of the input parameters. Sanding-Prediction Approaches There are three main sources of uncertainty in sanding prediction: the predictive model itself, the input data for the model, and processing uncertainties such as truncation and round-off errors. Reliable analyses are obtained by generating a well-constrained mechanical Earth model (MEM) to minimize uncertainties in the input data, by accounting for all the main influences, and by simplifying processing. Empirical Approach. This popular approach relies on data collected from the field, possibly supplemented by laboratory experiments on cores. In most instances, only one or two parameters are used to assess sanding risk and to establish cutoffs for conditions of sanding or no sanding. For example, a cutoff depth or a cutoff compressional-slowness criterion might be used to determine whether or not sand control might be needed in a particular field.
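The empirical cutoff approach described above reduces to a one-parameter screen. A toy version, assuming a compressional-slowness cutoff; the cutoff value and the well data are hypothetical, not field calibrations from the paper.

```python
# Toy empirical sanding screen: slowness above a cutoff implies weaker
# rock, so sand control is recommended. All values are assumed.
SLOWNESS_CUTOFF_US_PER_FT = 100.0   # assumed cutoff, microseconds/ft

def needs_sand_control(compressional_slowness):
    # Higher slowness -> slower, weaker rock -> sanding risk.
    return compressional_slowness > SLOWNESS_CUTOFF_US_PER_FT

wells = {"A-1": 85.0, "A-2": 112.0, "A-3": 98.0}   # us/ft, hypothetical
flagged = [w for w, dt in wells.items() if needs_sand_control(dt)]
print("sand control recommended for:", flagged)
```

The abstract's point is that such one-parameter cutoffs ignore stresses, drawdown, and completion geometry, which is why the paper argues for a well-constrained MEM instead.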

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol* 58 (06): 69–71.

Paper Number: SPE-0606-0069-JPT

Published: 01 June 2006

Abstract

This article, written by Technology Editor Dennis Denney, contains highlights of paper SPE 94357, "Treating Uncertainties in Reservoir-Performance Prediction With Neural Networks," by J.P. Lechner, SPE, OMV A.G., and G. Zangl, SPE, Schlumberger Information Solutions, prepared for the 2005 SPE Europec/EAGE Annual Conference, Madrid, Spain, 13–16 June. In development projects, reservoir parameters are known only within certain ranges, which results in various realizations of the subsurface. Because of the computational time involved, simulation models to obtain a probability distribution of possible outcomes cannot cover all possible parameter combinations. Creating a response surface based on a reduced number of simulation runs becomes necessary. Such a response surface can be used to approximate results for several variations of input parameters. An approach in which reservoir response is captured by an artificial neural network (ANN) has been investigated. The trained ANN model was used in Monte Carlo simulations to generate the probability distribution of possible outcomes. Introduction Often, simulation models are built to design field-development plans. Because of the low level of information during the early development stage, a range of uncertainty in the prediction scenarios must be considered. Such uncertainties can be handled with ANNs. Numerical models (simulation models) are detailed and powerful predictive tools that can answer many questions regarding subsurface engineering. However, especially in the early development phase, uncertainties are large, and prediction results will span a broad range. To take into account the whole range of possible outcomes of reservoir simulation, optimization routines would have to run the numerical models perhaps thousands of times. Although the most likely solution that respects subsurface uncertainties is of interest, the most profitable solution for reservoir management must be found. 
Because of the computational time involved, these methodologies often are not used. Building a response surface that can predict many possible outcomes of a numerical simulation by processing a large variety of input parameters could provide a solution to this problem. The response surface must cover the nonlinear dependencies between input and output parameters. Experimental design and response-surface modeling have been used to delineate subsurface uncertainty. Common applications include determining the probability distribution of original oil in place from geological models, quantifying uncertainty in predevelopment or preredevelopment projects, predicting production performance from different realizations of a geological model, and optimizing the location of new wells to maximize net profit. Methodology Fig. 1 shows the proposed workflow. As a first step, a limited number of simulation runs is carried out to define the most sensitive input parameters. In these runs, parameters are altered one at a time, either to the minimum or to the maximum. After selecting the most influential input parameters, three values (minimum, best-estimate, and maximum) are defined, and an experimental design is set up. The experimental design aims at obtaining the maximum information with a minimum number of simulation runs. However, practice showed that a minimum number of training data sets is required by the ANN to be able to make sufficiently accurate predictions. With these data sets containing varying input parameters and the corresponding output, an ANN is trained and tested with several additional simulation runs. After defining the probability distribution of the input parameters, the trained ANN is used as a proxy function for a Monte Carlo simulation. Thereby, a probability distribution of the desired output parameter (e.g., cumulative production after a certain number of years) is obtained.
The quality of the ANN is measured by the root-mean-square error of the deviation between the predicted and the actual output.
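As a rough sketch of this workflow (not the authors' implementation), the snippet below trains a small neural-network proxy on a limited set of runs of a stand-in "simulator," measures its quality by the root-mean-square error between predicted and actual output, and then drives a Monte Carlo simulation through the cheap proxy. The simulator function, input ranges, and network size are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an expensive reservoir simulator: cumulative
# production as a smooth function of two input parameters.
def simulator(x):
    perm, poro = x[:, 0], x[:, 1]
    return 100.0 * np.tanh(perm) * poro + 10.0 * poro

# Step 1: a limited number of "simulation runs" as training data.
X = rng.uniform([0.1, 0.05], [2.0, 0.35], size=(200, 2))
y = simulator(X)

# Standardize inputs and output so plain gradient descent behaves well.
Xm, Xs = X.mean(0), X.std(0)
ym, ys = y.mean(), y.std()
Xn, yn = (X - Xm) / Xs, (y - ym) / ys

# Step 2: train a one-hidden-layer ANN proxy by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(Xn @ W1 + b1)                 # hidden activations
    err = (h @ W2 + b2).ravel() - yn          # residuals (grad of 0.5*MSE, up to 1/n)
    dh = (err[:, None] * W2.T) * (1.0 - h**2) # backprop through tanh
    W2 -= lr * (h.T @ err[:, None]) / len(Xn)
    b2 -= lr * err.mean(keepdims=True)
    W1 -= lr * (Xn.T @ dh) / len(Xn)
    b1 -= lr * dh.mean(axis=0)

def proxy(x):
    xn = (x - Xm) / Xs
    return (np.tanh(xn @ W1 + b1) @ W2 + b2).ravel() * ys + ym

# Quality measure from the article: RMSE of predicted vs. actual output.
rmse = np.sqrt(np.mean((proxy(X) - y) ** 2))

# Step 3: Monte Carlo on the cheap proxy instead of the simulator.
samples = rng.uniform([0.1, 0.05], [2.0, 0.35], size=(50_000, 2))
dist = proxy(samples)
p10, p50, p90 = np.percentile(dist, [10, 50, 90])
```

The point of the proxy is speed: 50,000 evaluations of the trained network take milliseconds, whereas 50,000 simulator runs would be infeasible.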

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol*52 (11): 82–87.

Paper Number: SPE-62415-JPT

Published: 01 November 2000

... approximate reasoning frontier formation restimulation neural network fuzzy rule operation fuzzy-decision support system artificial intelligence author series virtual-intelligence application fuzzy system petroleum engineering

**input** **parameter** Virtual-Intelligence Applications in Petroleum...
Abstract

Distinguished Author Series articles are general, descriptive representations that summarize the state of the art in an area of technology by describing recent developments for readers who are not specialists in the topics discussed. Written by individuals recognized to be experts in the area, these articles provide key references to more definitive work and present specific details only to illustrate the technology. Purpose: to inform the general readership of recent advances in various areas of petroleum engineering.

Summary

Parts 1 and 2 of this series of articles presented general overviews of artificial neural networks and evolutionary computing, respectively, and their applications in the oil and gas industry. 1,2 The focus of this article is fuzzy logic. The article provides an overview of the subject and its potential application in solving petroleum-engineering-related problems. As the previous articles mentioned, the most successful applications of intelligent systems, especially when solving engineering problems, have been achieved by using different intelligent tools in concert, as a hybrid system. This article reviews the application of fuzzy logic for restimulation-candidate selection in a tight-gas formation in the Rocky Mountains. We chose this particular application because it uses fuzzy logic in a hybrid manner, integrated with neural networks and genetic algorithms.

Background

The science of today is based on Aristotle's crisp logic, formed more than 2,000 years ago. Aristotelian logic looks at the world in a bivalent manner: black and white, yes and no, 0 and 1. The set theory developed in the late 19th Century by German mathematician Cantor was based on Aristotle's bivalent logic and made this logic accessible to modern science. Subsequent superimposition of probability theory made the bivalent logic reasonable and workable. Cantor's theory defines sets as collections of definite, distinguishable objects. Fig. 1 is a simple example of Cantor's set theory and its most common operations, such as complement, intersection, and union.

The first work on vagueness dates back to the first decade of the 20th Century, when American philosopher Peirce noted that "vagueness is no more to be done away with in the world of logic than friction in mechanics." 3 In the early 1920's, Polish mathematician and logician Lukasiewicz 4 developed three-valued logic and wrote about many-valued, or multivalued, logic. In 1937, quantum philosopher Black 5 published a paper on vague sets. These scientists built the foundation on which fuzzy logic was later developed.

Zadeh, 6 known as the father of fuzzy logic, published his landmark paper "Fuzzy Sets" in 1965. He developed many key concepts, including membership values, and provided a comprehensive framework for applying the theory to engineering and scientific problems. This framework included the classical operations for fuzzy sets, which comprise all the mathematical tools necessary to apply fuzzy-set theory to real-world problems. Zadeh was the first to use the term "fuzzy," which provoked much opposition. A tireless spokesperson for the field, he was often harshly criticized. At a 1972 conference, Kalman stated that "Fuzzification is a kind of scientific permissiveness; it tends to result in socially appealing slogans unaccompanied by the discipline of hard scientific work." 7 (Note that Kalman is a former student of Zadeh's and the inventor of the famous Kalman filter, a major statistical tool in electrical engineering. The Kalman filter is the technology behind the Patriot missiles used in the Gulf War, and claims have been made that use of fuzzy logic can significantly increase the accuracy of these missiles. 8,9 ) Despite all its adversaries, fuzzy logic has continued to flourish and has become a major force behind many advances in intelligent systems.

The word "fuzzy" carries a negative connotation in Western culture, and "fuzzy logic" seems to misdirect attention and to celebrate mental fog. 10 On the other hand, Eastern culture embraces the concept of the coexistence of contradictions, as it appears in the yin/yang symbol (Fig. 2). While Aristotelian logic preaches A or Not-A, Buddhism is all about A and Not-A. Many believe that the tolerance of Eastern culture for such ideas is the main reason behind the success of fuzzy logic in Japan. While fuzzy logic was being attacked in the U.S., Japanese industries were busy building a multibillion-dollar industry around it. Today, the Japanese hold more than 2,000 fuzzy-related patents. They have used fuzzy technology to build intelligent household appliances, such as washing machines and vacuum cleaners (Matsushita and Hitachi), rice cookers (Matsushita and Sanyo), air conditioners (Mitsubishi), and microwave ovens (Sharp, Sanyo, and Toshiba), to name a few. Matsushita used fuzzy technology to develop its digital image stabilizer for camcorders. Adaptive fuzzy systems (a hybrid with neural networks) can be found in many Japanese cars. Nissan patented a fuzzy automatic transmission that is now very popular with many other manufacturers, such as Mitsubishi and Honda. 10
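Zadeh's membership values and the classical fuzzy-set operations (complement, intersection, union) are easy to sketch. In the snippet below, a hypothetical drawdown value belongs partly to "low" and partly to "high" at the same time, which is exactly the A-and-Not-A situation crisp bivalent logic cannot express; the linguistic variable, axis, and breakpoints are invented for illustration, not taken from the article.

```python
import numpy as np

# Triangular membership function: a fuzzy set assigns each value a degree
# of membership between 0 and 1 rather than a crisp in/out answer.
def tri(x, a, b, c):
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical linguistic variable "drawdown" for a restimulation screen.
dd = np.linspace(0.0, 1000.0, 2001)     # drawdown axis, psi
low  = tri(dd, 0.0, 150.0, 400.0)       # fuzzy set "low drawdown"
high = tri(dd, 300.0, 800.0, 1000.0)    # fuzzy set "high drawdown"

# Zadeh's classical operations mirror Cantor's crisp complement,
# intersection, and union:
not_low      = 1.0 - low
low_and_high = np.minimum(low, high)    # a point can be partly in both
low_or_high  = np.maximum(low, high)

# At 350 psi the drawdown is simultaneously somewhat "low" and somewhat
# "high" (A and Not-A).
m_low  = float(tri(350.0, 0.0, 150.0, 400.0))     # 0.2
m_high = float(tri(350.0, 300.0, 800.0, 1000.0))  # 0.1
```

In a hybrid candidate-selection system, such membership values would feed fuzzy rules whose outputs are combined with neural-network and genetic-algorithm components, as the article describes.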

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol*47 (06): 504–505.

Paper Number: SPE-30887-JPT

Published: 01 June 1995

... of Petroleum Engineers drilling performance drilling time problem day risk and uncertainty assessment output distribution historical data problem time afe estimate engineer risk management time estimate upstream oil & gas information risk analysis risk assessment

**input** **parameter** ...
Abstract

Summary This paper presents a method for developing an authorization-for-expenditure (AFE)-generating model and illustrates the technique with a specific offshore-field-development case study. The model combines Monte Carlo simulation with statistical analysis of historical drilling data to generate more-accurate, risked AFE estimates. In addition to the general method, two examples of making AFE time estimates for North Sea wells with the presented techniques are given.
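A minimal sketch of the kind of risked AFE time estimate the paper describes: per-phase durations are drawn from distributions fitted to historical drilling data, summed per trial, and the AFE is quoted as percentiles of the resulting distribution rather than a single deterministic number. The phase names, the lognormal shape, and every number below are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-phase drilling-time statistics (days), as if fitted to
# historical offset-well data.
phases = {                      # name: (median days, log-space sigma)
    "mobilization": (3.0, 0.20),
    "top_hole":     (6.0, 0.30),
    "intermediate": (9.0, 0.35),
    "reservoir":    (7.0, 0.40),
    "completion":   (5.0, 0.30),
}

# Monte Carlo: draw every phase duration per trial and sum to a well time.
n = 100_000
total = np.zeros(n)
for median, sigma in phases.values():
    total += rng.lognormal(mean=np.log(median), sigma=sigma, size=n)

# A risked AFE quotes the distribution, e.g. P10/P50/P90 days.
p10, p50, p90 = np.percentile(total, [10, 50, 90])
```

The spread between P10 and P90 is what distinguishes a risked estimate from the traditional single-number AFE; correlations between phases (ignored here for brevity) would widen it further.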

Journal Articles

#### Maximum Information at Minimum Cost: A North Sea Field Development Study With an Experimental Design

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol*44 (12): 1350–1356.

Paper Number: SPE-23139-PA

Published: 01 December 1992

... obtained when one parameter is varied at a time can be obtained with significantly fewer simulation runs. Interactions among the various

**input** **parameters** can be identified and estimated with a more elaborate design. The technique can be applied without profound statistical insight with commercially...
Abstract

Summary

Statistical design of experiments is a technique for maximizing the information obtained from a minimum number of experiments. This technique, however, has not been used extensively in the oil industry. With a well-designed setup, the same information obtained when one parameter is varied at a time can be obtained with significantly fewer simulation runs. Interactions among the various input parameters can be identified and estimated with a more-elaborate design. The technique can be applied without profound statistical insight with commercially available packages for statistical analysis.

Introduction

In the evaluation and planning of a reservoir development, the common approach is to build the expected geological model with the most-representative set of dynamic parameters and the best set of well locations. The platform and production facilities are optimized for this model. This combination of geological model, dynamic parameters, and technical design constitutes the base, or reference, case. A reservoir simulation is then performed, giving the base-case production profile and recovery factor. Finally, this production profile is combined with a fixed scenario for future oil and/or gas prices and investments to obtain the economic indicators [net present value (NPV) and return on investment (ROI)] for the project. Fig. 1 shows the different elements of this procedure.

Most factors in this process are uncertain. The geological model is wrong, the dynamic parameters are incorrect, the well locations and facilities are not optimal, and the price scenario certainly will not be fulfilled. Thus, only one thing is certain: the results from the base-case calculations are definitely wrong. To evaluate how the various parameters that enter the process influence the results, a sensitivity study usually is performed. The standard approach is to vary one parameter at a time, keeping all other parameters at the base-case value. Two runs (with an optimistic and a pessimistic setting) are required for most parameters. The number of runs required for a full sensitivity study quickly becomes prohibitive. Fig. 1 gives an example (somewhat exaggerated) for a North Sea gas field and shows the sensitivities of potential interest. If there are 5 gas-sales scenarios, 5 handling capacities, 5 different numbers of wells, 2 different horizontal-well lengths, 2 vertical start positions, and 15 geological parameters with uncertainties of interest, it will take 15,500 simulations to investigate these sensitivities by varying one parameter at a time. Because a single run of this particular full-field reservoir model takes some 20 CPU hours on a Cray, the study would take about 40 years and would be finished just a few years before the scheduled field shutdown. Thus, any technique that can reduce the number of runs required will be very useful.

If two or more parameters differ from their base-case values simultaneously, the effect on production may differ from the cumulative effect of varying one parameter at a time. The effect of varying several parameters simultaneously cannot be investigated by varying one parameter at a time; thus, a technique that can shed some light on these simultaneous effects will be of value. In experimental design, several parameters are varied simultaneously according to a predefined pattern. With this technique, the same information as that obtained with the one-parameter-at-a-time method can be developed with significantly fewer simulation runs. Some understanding of the possible interactions among parameters also can be obtained.

The theory behind experimental design was developed in the 1920's for agricultural applications. Box et al. 1 present a basic but more-modern reference, while Ross 2 presents the topic within the framework of modern Japanese ideas about quality control. Sacks et al. 3 introduce experimental design to computer experiments, and Morris 4 extends their ideas. Experimental design has not been applied widely in the petroleum industry. Sawyer et al. 5 is a very early reference, and Chu 6 shows a recent application. Egeland et al. 7 extend the ideas presented here.

This paper presents the basic concepts of experimental design. The technique is then applied to a real case study from the North Sea, illustrating that it is possible to obtain the basic sensitivities with a minimum number of simulations, plus additional information about possible interaction effects from a few more runs. The results from the analysis also can be used as input to a simple Monte Carlo simulation to obtain approximate uncertainty ranges for the parameters under study. Øvreberg et al. 8 present the context for such an analysis.
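The one-parameter-at-a-time versus designed-experiment tradeoff can be made concrete with a small sketch. Below, a full two-level factorial over three coded factors estimates all three main effects, plus one interaction, from the same eight runs of a stand-in "simulator"; the factor names and the response function are invented for illustration and are not from the case study.

```python
import itertools
import numpy as np

# Full two-level (2^3) factorial: each factor is coded -1 (pessimistic)
# or +1 (optimistic).
factors = ["well_count", "perm_mult", "aquifer"]
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 runs

def run_simulation(x):          # stand-in for one reservoir-simulation run
    wc, km, aq = x
    # Hidden structure: main effects plus a well-count/permeability interaction.
    return 50.0 + 4.0 * wc + 3.0 * km + 1.0 * aq + 2.0 * wc * km

response = np.array([run_simulation(row) for row in design])

# Main effect = mean(response at +1) - mean(response at -1); the balanced
# design lets all three be estimated from the same eight runs.
effects = {
    name: response[design[:, i] == 1].mean() - response[design[:, i] == -1].mean()
    for i, name in enumerate(factors)
}

# The interaction is estimated from the product column wc*km, which
# one-parameter-at-a-time variation cannot see at all.
prod = design[:, 0] * design[:, 1]
wc_km = response[prod == 1].mean() - response[prod == -1].mean()
```

Because the design is balanced, the interaction term averages out of each main-effect estimate, so the eight runs cleanly separate all four quantities; a one-at-a-time study of the same three factors would need comparable effort yet miss the interaction entirely.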

Journal Articles

Journal:
Journal of Petroleum Technology

Publisher: Society of Petroleum Engineers (SPE)

*J Pet Technol*38 (10): 1111–1112.

Paper Number: SPE-14488-PA

Published: 01 October 1986

... monte carlo simulation technique simulation technique SPE 14488

**input** **parameter** acceptable solution reservoir simulation oil industry operation simulation model random number Principles of Simulation Applied to Oilfield Venture Analysis With Systems Approach Tarkeshwar Kumar,* SPE, Indian...
Abstract

Summary The application of the systems approach to problem solving in general, and to the Monte Carlo simulation technique in particular, is discussed. The use and development of a simulation model, with examples from oil-industry operations, are also presented.

Introduction The Monte Carlo simulation technique is generally used for evaluating uncertainty and involves a number of input variables, each one having an estimated probability distribution and one or more interrelationships among these variables. Oil-industry operations are carried out under conditions of uncertainty. Exploration, drilling, development, and workovers face a high degree of uncertainty in their outcomes. The systems approach, used with the complex and uncertain financial situation of oilfield ventures, provides a quick, more reliable solution.

System Concept and Approach The systems approach can be applied when any type of problem (economic, engineering, social, or behavioral) is being solved. In the oil field, it can be related to drilling, production, pipeline, oil-reservoir, processing, or marketing systems. In the systems approach to problem solving, a conscious attempt is made to understand the relationships between the various parts of the system and to find alternative solutions after the objectives, constraints, and restraints are identified. Essentially, the systems approach is based on the conviction that, before any functional solution is implemented, one must examine its ultimate effect on other functional areas and on the entire system. After the system is conceptualized, the systems approach is used in 10 steps to solve the problem systematically. These steps constitute the basic structure of the decision model.

1. Identifying the problem. This requires conscious and systematic recognition of the problem based on a scientific approach and systems orientation. The ultimate purpose is the precise definition of the problem.
2. Formulating the goals and objectives.
3. Listing the constraints and restraints on the basis of assumptions and environment, such as financial, legal, and operational factors.
4. Devising various alternative solutions to the problem with the factors present.
5. Developing the decision model, or running the simulation model when that is the chosen method.
6. Determining the solution or the simulated behavior of the system under various options with the decision model.
7. Selecting the acceptable solution or behavior of the system on the basis of the acceptable criteria.
8. Implementing the decision.
9. Monitoring the various control parameters.
10. Evaluating the information collected during monitoring, and improving or redefining the system by changing control parameters or by critically examining the validity of the stated goals and objectives.

Monte Carlo Simulation Model This simulation is based on the probability of the occurrence of a number of parameters that constitute a solution. It is assumed, with good reason, that in the possible outcome of a given system it is unnecessary for all favorable or unfavorable parameters to occur at one time. The outcome is based on experimentation, i.e., running the model many times to simulate the probability distribution of outcomes. Therefore, the Monte Carlo simulation model has been recognized as a very powerful tool for oil-exploration ventures, reservoir-parameter estimations, and financial analyses of all types of projects that involve risk and uncertainty.

Development of the Simulation Model Development of the decision model for simulation involves the identification of (1) the system that is being studied (e.g., reservoir hydrocarbon systems); (2) components of the system (fluid properties, reservoir rock boundary, and production data); (3) criterion variables (e.g., recoverable oil); (4) decision or controllable variables (e.g., rock volume, porosity, water saturation, and oil FVF's); and (5) functional or system relationships. In each simulation experiment, the elements of the decision model must be fully specified before the model is run. Development of the simulation model has to be based on the best available knowledge of the actual system.
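The decision-model elements above map naturally onto a small Monte Carlo experiment. The sketch below treats rock volume, porosity, water saturation, and oil FVF as the decision (controllable) variables and recoverable oil as the criterion variable; every distribution and number is an illustrative assumption, not field data.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Decision (controllable) variables as probability distributions
# (min, most-likely, max of hypothetical triangular distributions).
rock_volume = rng.triangular(2.0e8, 3.0e8, 5.0e8, n)   # m^3
porosity    = rng.triangular(0.12, 0.18, 0.25, n)
water_sat   = rng.triangular(0.20, 0.30, 0.45, n)
bo          = rng.triangular(1.10, 1.20, 1.35, n)      # oil FVF, rm^3/sm^3
recovery    = rng.triangular(0.15, 0.30, 0.45, n)

# Criterion variable: recoverable oil (sm^3) per trial of the decision model.
recoverable = rock_volume * porosity * (1.0 - water_sat) / bo * recovery

# Running the model many times simulates the probability distribution of
# outcomes; no single trial requires all parameters to be favorable at once.
p10, p50, p90 = np.percentile(recoverable, [10, 50, 90])
```

Quoting P10/P50/P90 from such a run is exactly the "acceptable solution" step of the decision model: the analyst selects a development option against the whole outcome distribution rather than a single deterministic estimate.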