Information Management and Systems: 1–20 of 25 results
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Unconventional Resources Conference, April 1–3, 2014
Paper Number: SPE-168972-MS
Abstract
This paper describes a comprehensive methodology to rank drilling locations in a very large, unconventional tight-oil area based on both surface and subsurface characteristics. A North American case study is presented. Operators entering new prospect areas with little well control must make decisions on locating limited numbers of exploratory wells. Large prospect areas can be characterized by differing mineral rights, surface-permitting obligations, uncertain subsurface geology, complex topography, varying available infrastructure, potential operational issues, and environmental and regulatory challenges. The decision to appraise an acreage position is therefore complex and carries a high level of risk. One solution for ranking and risking candidate drilling locations is a procedure that considers the most critical variables needed for appraisal. The variables are used to generate optimized drilling location scenarios under various uncertainties and constraints, using a mathematical probabilistic-optimization procedure and a ‘scorecard' of surface cultural characteristics.
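To make the ranking idea concrete, here is a minimal Python sketch of a scorecard-weighted, Monte Carlo ranking of candidate locations. The attribute names, weights, and EUR ranges are hypothetical illustrations, not values from the paper:

```python
import random

# Hypothetical surface "scorecard" weights and subsurface EUR uncertainty;
# the paper's actual variables and weights are not given in the abstract.
SCORECARD_WEIGHTS = {"access": 0.3, "permitting": 0.4, "infrastructure": 0.3}

def risked_score(location, n_trials=10_000):
    """Expected score: surface scorecard weighted by Monte Carlo mean EUR."""
    surface = sum(SCORECARD_WEIGHTS[k] * location["surface"][k]
                  for k in SCORECARD_WEIGHTS)
    low, mode, high = location["eur_mboe"]          # triangular EUR uncertainty
    mean_eur = sum(random.triangular(low, high, mode)
                   for _ in range(n_trials)) / n_trials
    return surface * mean_eur

candidates = [
    {"name": "A", "surface": {"access": 0.9, "permitting": 0.5, "infrastructure": 0.7},
     "eur_mboe": (80, 150, 300)},
    {"name": "B", "surface": {"access": 0.6, "permitting": 0.9, "infrastructure": 0.4},
     "eur_mboe": (120, 200, 260)},
]
for score, name in sorted(((risked_score(c), c["name"]) for c in candidates),
                          reverse=True):
    print(name, round(score, 1))
```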
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Unconventional Resources Conference, April 1–3, 2014
Paper Number: SPE-168978-MS
Abstract
In this paper, we present a methodology to predict the performance of horizontal gas wells in unconventional reservoirs using publicly available completion data. Our process combines public domain data with statistical analysis and probabilistic simulation methods to forecast well performance without a detailed reservoir characterization. We have tested our methodology using a 425-well dataset from the unconventional Montney resource play in British Columbia, Canada. We believe this workflow can be applied to other resource plays with similar data. In our SPE Paper 167154 [1], we determined the sensitivity of production performance to completion parameters using multivariate regression analysis on the same 425-well dataset from the Montney formation. We found that the number of fracture stages and the number of perforation clusters per stage were the most influential predictors of well performance. In this paper, we discuss how we combined the regression analysis results with probabilistic methods to predict well performance. The model converts the deterministic regression coefficients into probabilistic distributions to account for parameters not considered in the original regression analysis, including reservoir properties. The results of our study show that, using this model, we can match the range of actual well performance outcomes at the 95% confidence level. Considering the importance of shale gas resources to the North American energy supply and the difficulty of characterizing shale gas reservoirs, this methodology offers a distinct advantage by providing a predictive model for well performance without the need for a detailed reservoir characterization. It could also be a beneficial tool in scoping studies where rapid, high-level evaluation is required.
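A minimal sketch of the core idea: convert deterministic regression coefficients into distributions and Monte Carlo sample a forecast. The model form and all numbers below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical regression: log(IP) ~ b0 + b1*stages + b2*clusters_per_stage.
# Coefficient means/sds are illustrative, not the paper's fitted values.
coef_mean = np.array([1.2, 0.045, 0.030])
coef_sd   = np.array([0.30, 0.010, 0.008])  # widened to absorb unmodelled reservoir effects

def forecast(stages, clusters, n=20_000):
    b = rng.normal(coef_mean, coef_sd, size=(n, 3))   # sample coefficient vectors
    log_ip = b[:, 0] + b[:, 1] * stages + b[:, 2] * clusters
    ip = np.exp(log_ip)                               # back-transform to rate
    return np.percentile(ip, [10, 50, 90])

p10, p50, p90 = forecast(stages=20, clusters=4)
print(f"P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f}  (illustrative units)")
```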
Proceedings Papers
Forecasting Well Performance in a Discontinuous Tight Oil Reservoir Using Artificial Neural Networks
Y. Bansal, T. Ertekin, Z. Karpyn, L. Ayala, A. Nejad, Fnu Suleen, O. Balogun, D. Liebmann, Q. Sun
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Unconventional Resources Conference-USA, April 10–12, 2013
Paper Number: SPE-164542-MS
Abstract
Improving the economics of the production and development of an unconventional reservoir system is key to meeting increased demand for hydrocarbons in the near future. In general, reservoir development is vastly assisted by using hard-computing models to evaluate the potential of the formation. These models have been used to identify infill drilling locations and forecast production. However, preparing simulation models for discontinuous tight oil reservoir systems poses a challenge with hard-computing protocols. This paper discusses a methodology developed to depict the production characteristics of a reservoir via its geological properties. The methodology is time efficient and has proven to generate reliable results. It utilizes Artificial Neural Networks (ANNs) to map the existing complex relationships between seismic data, well logs, completion parameters and production characteristics. The ANNs developed in this work are used to forecast cumulative oil, water and gas production over a two-year period. The results are also extended to identify potential infill drilling locations. This work enables the practicing engineer and geoscientist to analyze an entire reservoir in a time-efficient manner. The workflow is demonstrated on a discontinuous tight oil reservoir located in West Texas. The results discussed in the paper show the robust nature of the methodology. The workflow also helps improve the resolution of the production surfaces, which helps identify productive, yet undrilled, locations in the reservoir. The production surface for the entire field (~6,600 locations) is forecast in about one minute. The method developed will help in avoiding low-producing wells prior to drilling and is thus expected to aid the economic development of complex tight oil reservoirs.
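The workflow can be pictured with a small sketch using scikit-learn's MLPRegressor as a stand-in for the paper's ANNs; the features and targets are synthetic placeholders, since the real inputs (seismic attributes, logs, completion parameters) are only named in the abstract:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Synthetic stand-ins for seismic attributes, log properties and completion
# parameters; the paper's real feature set is not listed in the abstract.
X = rng.normal(size=(500, 6))                 # 6 input attributes per location
y = np.column_stack([                          # 2-yr cumulative oil, water, gas
    X @ rng.normal(size=6) + rng.normal(scale=0.1, size=500)
    for _ in range(3)])

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=2000, random_state=0))
model.fit(X, y)

# Scoring ~6600 undrilled locations is a single vectorised call, which is
# what makes the "production surface in about a minute" claim plausible.
grid = rng.normal(size=(6600, 6))
surface = model.predict(grid)                  # shape (6600, 3)
print(surface.shape)
```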
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the Tight Gas Completions Conference, November 2–3, 2010
Paper Number: SPE-138277-MS
Abstract
This paper discusses a data-mining project on a database of more than 1,000 frac stages in the Barnett Shale. The database includes fracture pressure and other job data, as well as the fracture shape mapped by microseismic monitoring. The original intent of the data-mining project was to seek relationships between hydraulic fractures staying in zone, the generation of a fracture network, and the net-pressure trend. Results from the data-mining exercise were revealing. The implications of this work potentially yield answers to questions such as: Can a frac model be developed that predicts fracture network development in a shale? Is using poroelastic-based physics in a model appropriate for shales? Can a model of this type be used to analyze frac data? What is the ultimate value of microseismic mapping in shale reservoirs? This paper is the first in a series on the database and will answer most of these questions. It will also propose future work on analysis methods that may help predict, from geological and geographic data, whether a fracture network will develop rather than a planar frac geometry.
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the CIPC/SPE Gas Technology Symposium 2008 Joint Conference, June 16–19, 2008
Paper Number: SPE-114454-MS
Abstract
Successful prediction of the future performance of condensate reservoirs requires accurate values of dew point pressure for the reservoirs of interest. Although the dew point pressure can be determined experimentally from collected laboratory samples, these measurements are frequently not available. In these cases, reservoir fluid properties are determined with empirical correlations or iteratively using an equation of state (EOS). The objective of this paper is to present an application of the genetic programming (GP)-Orthogonal Least Squares (OLS) algorithm to generate a linear-in-parameters dew point pressure model represented by tree structures. The GP-OLS gas condensate dew point pressure model was generated as a function of reservoir fluid composition (mole fractions of methane through heptanes-plus, nitrogen, carbon dioxide, and hydrogen sulphide, and the molecular weight of the heptanes-plus fraction) and reservoir temperature. The new model was developed using experimental measurements of 245 gas condensate systems covering a wide range of gas properties and reservoir temperatures. One hundred thirty-five gas condensate systems that were not used in building the new model were used to test and validate it against earlier published correlations. The validity test shows that the new model is more accurate than the other tested correlations, with the lowest average absolute relative error. Therefore, the new model can be considered an alternative method for estimating the dew point pressure when experimental measurements are not available. Introduction Gas condensate is considered a very complex reservoir fluid because of its high gas-oil ratio [1]. Hence, a large number of components (13 to 18) are typically needed to properly characterize the condensation behavior. One of the most important factors to be evaluated by engineers in planning the development of a gas condensate reservoir is the dewpoint pressure (DPP) of the original fluid. The DPP is the pressure at which a substantial gas phase exists in equilibrium with an infinitesimal amount of liquid phase. The determination of gas-condensate DPP is essential for fluid characterization, gas reservoir performance calculations, and the design of production systems. In practice, well productivity often declines rapidly when near-wellbore pressure drops below the DPP. Thus, it is very important to determine the DPP accurately for gas condensate reservoirs. The experimental determination of DPP at reservoir temperature for gas condensate reservoirs is relatively time consuming and expensive [2]. Alternatively, the DPP can be determined with empirical correlations, which to date have accuracy limitations, or iteratively using an equation of state (EOS). Therefore, a high-accuracy mathematical model for predicting the DPP of gas condensate reservoirs is highly desirable. This paper presents an application of the GP-OLS algorithm to generate a DPP model represented by tree structures, as a function of reservoir fluid composition (methane through heptanes-plus mole fractions, nitrogen, carbon dioxide, hydrogen sulphide, and the molecular weight of the heptanes-plus fraction) and reservoir temperature. Dewpoint Pressure Determination Olds et al. [3] studied experimentally the behavior of five paired samples of oil and gas obtained from wells in the San Joaquin fields of California. Their investigations resulted in a rough correlation relating the retrograde DPP to the gas-oil ratio, temperature, and stock-tank API oil gravity. The results were presented in tabulated and graphical form. This correlation is applicable only for gas-oil ratios of 15,000–40,000 scf/STB, temperatures of 100–220°F, and API oil gravities of 52–64°.
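As an illustration of the "linear-in-parameters" structure GP-OLS produces, the sketch below fits OLS coefficients to a fixed set of tree-like candidate terms on synthetic data; in the actual method the candidate terms themselves are evolved by genetic programming:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-ins: composition-derived inputs and measured DPPs. The real model
# uses mole fractions C1..C7+, N2, CO2, H2S, the MW of C7+, and temperature.
X = rng.uniform(0.1, 1.0, size=(245, 3))       # x0, x1, x2
dpp = (3000 + 1500 * X[:, 0] * X[:, 1] - 800 * np.log(X[:, 2])
       + rng.normal(scale=50, size=245))

# GP proposes tree-structured candidate terms; OLS then keeps a linear-in-
# parameters subset. Here the candidate basis is fixed for illustration.
basis = np.column_stack([np.ones(245),
                         X[:, 0] * X[:, 1],     # product term (one "tree")
                         np.log(X[:, 2]),       # log term (another "tree")
                         X[:, 0] ** 2])         # extra candidate term
coefs, *_ = np.linalg.lstsq(basis, dpp, rcond=None)
pred = basis @ coefs
aare = np.mean(np.abs((pred - dpp) / dpp)) * 100   # average absolute rel. error
print(f"coefficients: {np.round(coefs, 1)}, AARE = {aare:.2f}%")
```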
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, April 30–May 2, 2002
Paper Number: SPE-75515-MS
Abstract
Intelligent reservoir management requires assessment of risks caused by geologic heterogeneity and examination of alternative development schemes. Numerical simulation is a powerful tool for integrating geologic and development models, but simulation is often too expensive or time-consuming if there are many sensitivities and scenarios to examine. In this paper, response surface methods are used to approximate the relationship between gas recovery responses and reservoir and production parameters. The response models are based on a set of numerical simulations selected using experimental design, which efficiently estimates the effects of many factors across a broad range of uncertainty. The three-dimensional reservoir model examined in this study is from an outcrop study of the Permian-age Bell Canyon, which is interpreted as a channel-levee-lobe turbidite system. Examination of realistic reservoir models is complicated by the large number of factors and by correlations between model factors. Principal component analysis can simplify the modeling process by using the correlations to reduce the dimensionality of the factor space. A large number of reservoir geologic parameters are reduced to a small set of principal components while most of the original geological information is preserved. The principal component design also spans the scatter-cloud of correlated reservoir factors more efficiently. Based on these principal components and several engineering factors, reservoir simulations are selected using a central composite design. Response surface models relating the simulation responses and the design factors are estimated via linear regression. The validity of the models is verified by reservoir simulation of randomly selected points within the feasible region of geological parameters. Uncertainty analysis, including parameter sensitivity estimation and recovery uncertainty assessment, is done conveniently and inexpensively using response surface models and Monte Carlo simulation. Quality maps identify optimal well locations within the complex depositional setting documented at Willow Mountain. Introduction Deep-water clastic reservoirs are important targets in worldwide petroleum exploration and production. These reservoirs often have complex architecture and lithologic variations. These complexities, combined with the often high cost of deep-water development, place a premium on reservoir characterization and recovery predictions. Studies of the deep-water sandstones of the Permian-age Bell Canyon Formation, West Texas, from well-exposed outcrops [1] provide insight into turbidite reservoir architecture. A detailed 3-D geological model of a single high-order cycle within the Bell Canyon Formation (referred to as Willow Mountain) was created, which is the basis for further work in this paper. Numerical reservoir simulation is often used to investigate sensitivities, recovery performance and optimal well placement. However, this process is computationally intensive and often expensive, especially for large, fine-grid models and numerous simulation runs. For this reason, geological models often cannot be used directly as input to reservoir simulation before they are upscaled to a feasible level, which loses some geological detail. On the other hand, even the most sophisticated reservoir simulation is an imperfect representation of reality, so reservoir simulations must be viewed in the context of these geologic and engineering uncertainties [2, 3].
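A compact sketch of the PCA-plus-response-surface workflow on synthetic correlated factors; here the design points are random rather than a central composite design, and the "simulator" is a toy quadratic function, so all numbers are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Correlated geologic factors (synthetic); PCA reduces them to a few
# orthogonal components before the experimental design is built.
geo = rng.multivariate_normal([0, 0, 0, 0],
                              [[1, .8, .5, .2], [.8, 1, .6, .3],
                               [.5, .6, 1, .4], [.2, .3, .4, 1]], size=200)
pcs = PCA(n_components=2).fit_transform(geo)

# Stand-in for simulator output at the design points; a second-order
# polynomial response surface is then fitted to "recovery".
recovery = (0.4 + 0.05 * pcs[:, 0] - 0.02 * pcs[:, 1] ** 2
            + rng.normal(scale=0.01, size=200))
poly = PolynomialFeatures(degree=2)
rs = LinearRegression().fit(poly.fit_transform(pcs), recovery)

# Monte Carlo over the factor space using the cheap surrogate.
mc = rng.normal(size=(50_000, 2))
dist = rs.predict(poly.transform(mc))
print(np.percentile(dist, [10, 50, 90]))      # recovery P10/P50/P90
```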
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, April 30–May 2, 2002
Paper Number: SPE-75702-MS
Abstract
An excessively high treatment pressure may be required to execute a production-enhancing hydraulic fracture treatment in a tight-gas reservoir, particularly if the treatment is designed with inappropriate values for the treatment parameters: fracturing fluid viscosity, injection rate, injection time and proppant concentration. Such a high treatment pressure may not only exceed the delivering capacity of the specified surface equipment (pump, pressure rating devices and downhole tubing), but may also cause multiple fracture initiation. Multiple fracture initiation may cause near-wellbore tortuosity complexities and large fluid loss, and thus may damage the formation irreversibly, resulting in productivity lower than that of even unfractured wells. This paper presents an integrated model for multivariate fracture treatment optimization with adequate trade-offs between production enhancement, equipment capacity and formation compatibility requirements. The model considers both fracture geometry (length, height, width, etc.) and treatment parameters as free design variables. Compatibility relationships between reservoir properties, treatment parameters and fracture growth are formulated using a modified pseudo-3D fracture model. Design constraints are formulated to ensure that the final optimum design is compatible with the specified equipment and formation characteristics, so as to avoid the above-mentioned fracture complexities. The optimal design of fracturing treatments to maximize cumulative production is demonstrated in the paper by a series of applications of the model to a tight-gas reservoir. Sensitivity results for various parameters are also presented. Introduction Despite many successes, hydraulic fracturing has on many occasions displayed poor post-frac productivity that has not been commensurate with the added expense of fracturing the well. In the literature, this poor performance has been attributed to unfavorable formation characteristics and in-situ stresses, such as the presence of natural fractures, coal laminations adjacent to reservoirs, relatively high tensile strength of the formation, relatively high deviatoric stresses, and a reverse faulting stress regime for which a conventional vertical well is not appropriate. Disoriented fracture initiation and propagation due to inappropriate perforation practice has also been pointed out. These factors are believed to cause near-wellbore fracture tortuosity that eventually results in high fracture initiation and propagation pressures, high frictional pressure drops, fracture width constrictions, low in-place proppant concentrations, premature screen-outs, etc. While these issues have been investigated in many recent works, this paper stems from the motivation that many of these unfavorable effects may result from inferior treatment design, and that an integrated model is thus needed for optimizing treatment parameters with constraints so that the unfavorable effects may be avoided.
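The constrained-optimization structure can be sketched as below, with toy surrogates standing in for the pseudo-3D fracture model; the production and pressure functions, the equipment rating, and all numbers are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative surrogate: cumulative production rises with fracture
# half-length xf and proppant concentration c, with diminishing returns.
def neg_production(x):
    xf, c = x
    return -(np.log1p(xf) * 50 + np.log1p(c) * 20)

# Treating-pressure surrogate: grows with half-length and concentration and
# must stay under the equipment rating. All coefficients are hypothetical.
P_RATING = 9000.0                                  # psi, surface equipment limit
def pressure_margin(x):
    xf, c = x
    p_treat = 4000 + 3.5 * xf + 400 * c
    return P_RATING - p_treat                      # feasible when >= 0

result = minimize(neg_production, x0=[300.0, 4.0],
                  bounds=[(50, 1500), (0.5, 12)],
                  constraints=[{"type": "ineq", "fun": pressure_margin}])
xf_opt, c_opt = result.x
print(f"optimal half-length ~{xf_opt:.0f} ft, proppant ~{c_opt:.1f} ppa")
```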
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE/CERI Gas Technology Symposium, April 3–5, 2000
Paper Number: SPE-59752-MS
Abstract
In this study, the rheology of foams was investigated using aqueous foams and gelled foams and employing a pipe-type viscometer. Water was used as the liquid phase for the aqueous foams, while guar was used as the gelling agent for the gelled foams. Surfactant at 0.5% (vol.) concentration was used as the foaming agent. The flow data showed that foams behave like Herschel-Bulkley model fluids. New empirical correlations have been developed from the experimental data to predict foam fluid apparent viscosity. These correlations are functions of liquid phase properties and foam quality, and are applicable to the foam systems tested over the wide range of shear rates and temperatures investigated. The use of these new correlations will provide more accurate estimation of foam fluid rheological properties. Introduction Currently, the use of foam fluids is increasing rapidly in the petroleum industry because these fluids exhibit properties that are desirable in many field operations [1]. In drilling, the reduced density of foam fluids, their high carrying capacity, and their minimum filtrate and circulation losses are among the desirable properties of drilling fluids during underbalanced drilling operations. The use of foams as a fracturing fluid also presents several advantages. The high carrying capacity, the minimum amount of fluid placed in the formation, and the excellent fluid recovery after treatment are some of the advantages that foam fluids present when used during fracturing operations. Foams have also been used successfully in oil and gas fields in well stimulation, clean-up, and fishing operations. Foams are complex mixtures of a gas, a liquid, and a surfactant whose rheological properties are strongly influenced by parameters like temperature, absolute pressure, foam quality, texture, foam-channel wall interactions, liquid phase properties, and the type and concentration of surfactant [2, 3]. Therefore, the rheology of foams is more complex than that of simpler fluids. When used in oil field operations, foam exists in an unsteady state due to constant changes in temperature, pressure, composition, and shear rates. These changes affect the flow properties of the foam fluids and may drastically alter their hydraulic behavior. Consequently, all the variable values predicted during the design process depend on the changes of the foam properties. For this reason, the calculation and prediction of friction losses in laminar and turbulent flow for foam fluids is a major challenge in oil field operations. In order to improve the prediction of friction losses when pumping foams in tubular goods, a comprehensive study of the effects of liquid phase type, shear rate, pipe diameter, temperature, and quality on the rheological properties of foams was conducted using the new foam loop of the Well Construction Technology Center (WCTC) of The University of Oklahoma. Tests were run in the laminar flow regime for aqueous and gelled foams at 1000 psia, for qualities ranging from zero to 80%, and temperatures ranging from 75°F to 175°F, using two different pipe sizes.
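For reference, the Herschel-Bulkley model named in the abstract gives apparent viscosity directly; a short sketch, with parameter values that are illustrative rather than the paper's correlations:

```python
# Herschel-Bulkley model: tau = tau0 + K * gamma_dot**n.
# Apparent viscosity is tau / gamma_dot. Parameter values are illustrative;
# the paper correlates them with liquid-phase properties and foam quality.
def apparent_viscosity(gamma_dot, tau0=5.0, K=0.8, n=0.6):
    """Apparent viscosity (Pa.s) at shear rate gamma_dot (1/s)."""
    tau = tau0 + K * gamma_dot ** n     # shear stress, Pa
    return tau / gamma_dot

for rate in (10, 100, 1000):
    print(f"{rate:>5} 1/s -> {apparent_viscosity(rate)*1000:.1f} mPa.s")
```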
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE/CERI Gas Technology Symposium, April 3–5, 2000
Paper Number: SPE-59748-MS
Abstract
This paper documents attempts to automatically unload gas wells in the Western Sedimentary Basin at various depths and producing conditions. The results include production increases of 25 percent, with reduced operating costs. The criteria for application of this technology and a cost/benefit analysis are presented. This work is being incorporated in a large-scale project currently underway in Oklahoma, which is discussed briefly as an example of integrating automation technology with expert systems to provide field-wide optimization of a major gas field. Introduction In upstream oil and gas production, supervisory control and data acquisition (SCADA) systems assist operators by constantly monitoring entire fields. These systems will even telephone the operator in the middle of the night if a serious problem occurs. The operator certainly wants to be alerted when a problem occurs, but wellhead automation now has the capability to proactively avert problems, literally allowing the operator to sleep better at night. For gas wells, the production of water causes many operational problems, up to and including chronic loss of production. Operators combat water problems by blowing the well down, dropping soap sticks, stopcocking the well, or perhaps rocking the well if the problem is severe. If a well is in a remote area, the operator may not be able to attend the well often enough to avoid problems. Automation equipment has been adapted to perform some of these routine operations. Wellhead equipment can now be set up to detect the presence of liquids in the wellbore and unload the well automatically, adjusting the unloading times as water production changes. The advantages of automated unloading are numerous. Unloading the well regularly allows the liquids to be sent down the flowline, reducing venting of gas. The average amount of liquid in the well is reduced, increasing production. The operation is performed day or night, whenever the well needs unloading, as opposed to when the operator is available. Wells in remote areas receive the attention they need constantly, without stressing the operator in times of severe weather. Venting to atmosphere is eliminated, reducing environmental problems. In economic terms, often the largest advantage is that the wells can be flowed up the casing, reducing friction pressure in the wellbore and increasing production.
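The control logic described can be sketched as a simple rule-based controller; the trigger variables and thresholds below are illustrative assumptions, since real wellhead RTUs use vendor-specific criteria:

```python
from dataclasses import dataclass

# Illustrative controller logic only; actual equipment combines critical-rate,
# pressure-differential and timer triggers in vendor-specific ways.
@dataclass
class UnloadController:
    critical_rate: float        # Mscf/d below which liquids accumulate
    unload_minutes: float       # how long to flow the well unloaded

    def step(self, gas_rate, casing_tubing_dp, dp_limit=50.0):
        """Decide the next action from current measurements."""
        if gas_rate < self.critical_rate or casing_tubing_dp > dp_limit:
            return f"UNLOAD for {self.unload_minutes:.0f} min"   # liquids detected
        return "PRODUCE normally"

ctrl = UnloadController(critical_rate=300.0, unload_minutes=20)
print(ctrl.step(gas_rate=250.0, casing_tubing_dp=35.0))   # -> UNLOAD
print(ctrl.step(gas_rate=450.0, casing_tubing_dp=20.0))   # -> PRODUCE
```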
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE/CERI Gas Technology Symposium, April 3–5, 2000
Paper Number: SPE-59749-MS
Abstract
To facilitate field and wellsite optimization, natural gas producers have come to rely on the strategic placement of compressors to increase production. Mature fields, with depleting reservoirs and/or low wellhead pressures, are ideally suited for this cost-effective gas recovery solution. The decision process by which the production engineer quantifies the placement of a compressor is of significant relevance. Several strategies are available to the production engineer to identify wellsites best suited for optimization: Total Systems Analysis simulation software, Decision Tree Analysis, and Real Time Selection Opportunities (RTSO). This paper reviews these decision alternatives and the application of the RTSO Risk Matrix to select ideal well compression candidates. Introduction The technology of utilizing field compression has been widely adopted by the natural gas industry as an optimization tool. The challenge is to select the optimal field compressor location. Decision-makers can make use of several different methodologies to achieve this result. Computer simulation allows production engineers to identify locations, and it provides a clear understanding of the dynamic interaction of each component within a gathering system. Hahn, Moss, Sood and Carley [1] developed a decision tree that integrates all the necessary parameters to identify wellsite compression candidates. An alternative approach that relies upon a combination of real-time data, field operators' input and/or pilot testing has also proved to be an effective process. This paper reviews these decision strategies and the application of the RTSO Risk Matrix approach to specific case studies in Southern Alberta, Canada. Decision Strategies An optimization strategy requires an assessment of the overall field opportunities. It is important to clearly state the optimization objectives in the project scope; adherence to basic project management principles increases the likelihood of project success. This preliminary review will direct you to the most applicable decision tool. Table 1 summarizes the main characteristics of each decision strategy. It describes the necessary inputs, the analytical requirements and the final outcome of each solution. Once the requirements for each decision strategy are identified and determined to be available, the decision-maker is able to select an appropriate methodology. Total Systems Analysis Simulation Software. Trick [2] identifies the application of computer models to optimize gas field production and reservoir management. The computer simulator requires information about every component of the gathering system. All this information is entered into a single data file that can be used to evaluate various scenarios or forecasts. Trick also describes the factors that must be included in a Total Systems Analysis model: reservoir performance, sandface inflow performance, wellbore and pipeline losses, as well as field facility performance data. Simulation software provides an understanding of the dynamic interaction of each component of the gathering system, and facilitates the optimal placement of a compressor. Decision Tree Analysis. The selection strategy developed by Hahn et al. identified two key parameters: gas well cumulative production history (> 28 × 10⁶ m³) and peak production rate (> 30 × 10³ m³/D). Wellsite nodal analysis is then combined with material balance techniques to further the selection process.
This methodology has proven to be an effective tool in accurately predicting individual well gas rates.
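The screening step of the decision-tree approach reduces to a simple test against the two thresholds quoted above; a sketch (the follow-on nodal analysis and material balance steps are not reproduced):

```python
# Screening thresholds quoted from Hahn et al. in the abstract.
CUM_THRESHOLD = 28e6     # m3, cumulative gas production
PEAK_THRESHOLD = 30e3    # m3/d, peak production rate

def is_compression_candidate(cum_production_m3, peak_rate_m3d):
    """First decision-tree node: both thresholds must be exceeded."""
    return cum_production_m3 > CUM_THRESHOLD and peak_rate_m3d > PEAK_THRESHOLD

wells = {"W-1": (45e6, 41e3), "W-2": (12e6, 55e3), "W-3": (30e6, 28e3)}
for name, (cum, peak) in wells.items():
    verdict = "candidate" if is_compression_candidate(cum, peak) else "screened out"
    print(name, verdict)
```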
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE/CERI Gas Technology Symposium, April 3–5, 2000
Paper Number: SPE-59767-MS
Abstract
This paper summarizes the development of a methodology for restimulation candidate selection in tight gas sands. The methodology incorporates virtual intelligence techniques (artificial neural networks, genetic algorithms and fuzzy logic) to achieve this objective. Artificial neural networks are used to develop a representative model of the completion and hydraulic fracturing process in a specific field. Genetic algorithms are used as a search and optimization tool to identify the missed incremental production based on the neural network model. Finally, fuzzy logic is used to capture the unique field experience of the engineers, as well as detrimental parameters (if such parameters are indeed present), and incorporate them in the decision-making process. An approximate reasoning approach is used at the decision-making level to identify the restimulation candidates. Once the methodology is introduced, it is applied to an actual tight sand field in the Rocky Mountain region and the results are presented. Statement of the Problem In 1996, the Gas Research Institute (GRI) performed a scoping study to investigate the potential for natural gas production enhancement via restimulation in the United States (lower 48 onshore). The results indicated that the potential was substantial (over a Tcf in five years), particularly in the tight sand formations of the Rocky Mountains, Mid-Continent and South Texas regions. However, it was also determined that industry's experience with restimulation was mixed, and that considerable effort is required in candidate selection, problem diagnosis, and treatment selection/design/implementation for a restimulation program to be successful. Given a lack of both specialized (restimulation) technology and "spare" engineering manpower to focus on restimulation, GRI initiated a subsequent R&D project in 1998 with several objectives: to develop efficient, cost-effective, reliable methodologies to identify wells with high restimulation potential; to identify and investigate various mechanisms leading to well underperformance; and to develop and test restimulation techniques tailored to each cause of well underperformance [1]. Addressing the first of the project objectives, an integrated methodology has been developed to select high-potential restimulation candidates in a reliable, cost-effective manner. The technique involves several steps. First, sophisticated statistical approaches are utilized to identify both obvious and subtle differences in well performance, and provide initial insights into potential candidate wells. Second, virtual intelligence techniques (a hybrid of artificial neural networks, genetic algorithms, and fuzzy logic) are used to recognize patterns in well performance as they relate to both geologic/reservoir conditions and completion/stimulation operations. With this information, controllable well performance "drivers" can be identified, and this information can in turn be used to select candidate wells, identify possible causes of well underperformance, and begin the treatment selection process. Third, engineering methods such as type-curves are used to high-grade potential restimulation candidates by providing a (relative) indication of reservoir quality and completion efficiency, and hence restimulation potential. Finally, high-potential candidates are individually screened for mechanical integrity, reservoir pressure and other important historical information that may not be uncovered in the previous steps.
Lastly, low-cost candidate verification tests are performed to confirm the selected candidates' restimulation potential.
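A toy sketch of the ANN-plus-GA step: a stand-in surrogate for the trained network is searched by a tiny genetic algorithm for the production the well could have achieved, and the gap to actual production flags candidates. Everything here, from the surrogate to the parameter bounds, is illustrative:

```python
import random

random.seed(4)

# Stand-in for the trained neural-network model of the completion/fracturing
# process: maps controllable treatment parameters to expected production.
def nn_surrogate(proppant_lb, fluid_gal):
    return 0.8 * (proppant_lb / 1e5) ** 0.5 + 0.4 * (fluid_gal / 1e5) ** 0.3

# Tiny genetic algorithm searching the treatment space for the production the
# well *could* have achieved; a large gap to actual output flags a candidate.
def ga_max(pop=30, gens=40):
    popn = [(random.uniform(1e4, 5e5), random.uniform(1e4, 3e5))
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda ind: nn_surrogate(*ind), reverse=True)
        elite = popn[: pop // 3]                             # selection
        popn = elite + [(random.choice(elite)[0] * random.uniform(0.9, 1.1),
                         random.choice(elite)[1] * random.uniform(0.9, 1.1))
                        for _ in range(pop - len(elite))]    # mutation
    return max(nn_surrogate(*ind) for ind in popn)

actual = 1.1                                  # normalized actual production
missed = ga_max() - actual
print(f"missed production ~ {missed:.2f} (restimulation candidate if large)")
```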
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE/CERI Gas Technology Symposium, April 3–5, 2000
Paper Number: SPE-59789-MS
Abstract
Can an independent producer fulfill a vision to develop and produce a coalbed methane gas field that covers two hundred square miles in a remote desert plateau in the western United States? This paper will answer that question as well as provide answers to other pertinent questions, such as, "What tools would be available to the operator to efficiently and safely produce the wells?" And, "Would the tools allow for future expansion and/or allow new technologies to be incorporated into the system cost-effectively?" This case study will show how River Gas Corporation was able to successfully meet their needs with an automation system that was jointly developed with an engineering/manufacturing company. All phases of the project, from its inception in 1991 with a 13-well pilot project to a planned 600-well development, will be discussed. Innovative solutions successfully answered the above questions and allowed the system to evolve to meet the continuing changes in expansion and production needs. Introduction River Gas Corporation, a coalbed methane (CBM) producer from western Alabama, began evaluating a new CBM project in central Utah. Empowered with the knowledge and experience gained from their 535-well Black Warrior project, they decided in 1991 to invest in a thirteen-well pilot CBM project. They named the project Drunkard's Wash, after the local landmark. Before drilling, River Gas carefully considered how they were going to operate the wells with a minimal staff while maintaining a high degree of production measurement accuracy. Other operational areas of concern included bad weather conditions, minimal environmental impact, and an increased degree of production safety. Bottom line, the evaluation and justification to invest fully in the development of this new field would depend upon how much gas they could safely and economically extract. When the Drunkard's Wash project was in its earliest stages of evaluation, River Gas determined that an electronics-based method of flow measurement was needed. Several factors helped form River Gas' decision:
- Inherent inaccuracy of a chart recorder
- Added expense of chart integration
- Lower cost of electronic-based systems
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, April 28–May 1, 1996
Paper Number: SPE-35646-MS
Abstract
A knowledge-based system has been developed which predicts relative permeabilities to describe the flow of fluids in oil, gas or condensate reservoirs. The software applies heuristic knowledge and artificial intelligence techniques to identify the appropriate experimental methods for measuring the relative permeabilities, and to decide on the mathematical models and computational steps to use to generate the data. The selected models and computational steps are used together with the inbuilt database to generate relative permeability data which honour the physics of the flow system. Rules that relate the combination of field development scenario, fluid PVT properties, rock lithology and petrophysical properties are included in the knowledge base. The paper describes the parts of the software which address the complex problems associated with relative permeability predictions in gas condensate reservoirs undergoing pressure depletion. The current version of the software runs on a PC under the Microsoft Windows operating system and fully exploits the graphical user interface for data input and output. Introduction The increasing emphasis on optimising recovery from gas condensate fields and the extensive development and use of reservoir simulators for predicting reservoir performance are together creating a widespread need for reliable basic data on rock flow behaviour. In general, in reservoir studies involving two-phase flow, the relative permeability is the parameter with the major control on reservoir performance. Relative permeabilities provide a basic description of the way in which the phases will move in the reservoir. Definition of the flow process can have a significant effect on the predicted gas/oil production rate and duration, and is important in calculating the volume of recoverable hydrocarbon reserves. The predicted production rates, the plateau level and duration, plus the expected water cut will all influence development plans. The number of wells, the balance between injectors and producers, the sizing of separation equipment, and the design of facilities in general can all be impacted by the multiphase flow properties of the reservoir in the near-wellbore region. Ultimately, together with many other inputs, relative permeability assists in determining reservoir economics, and hence guides investment decisions. Laboratory measurement of representative relative permeability data on a reservoir core-fluid system is a complex task. The experiments are costly, typically more than $100,000 each, and time consuming, often taking up to six months to complete. Accuracy is limited to the specific core samples and is bounded by narrow saturation limits. A fundamental theoretical approach to modelling multiphase fluid flow in porous rocks is prevented by the complex nature of the problem. Major difficulties arise in mathematically describing flow through a porous system where the lengths, diameters and connectivity of channels are largely unquantifiable. For gas condensate systems the issue is complicated further, as the thermodynamic behaviour of a multicomponent system close to its critical region needs to be taken into account. As a result, the experimentally determined gas condensate relative permeabilities are few and usually show wide scatter. Consequently, it is very difficult to determine a representative average function on any basis, with a reservoir-unit basis being the most difficult.
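A minimal sketch of the rule-based idea, with Corey-type curves standing in for the system's inbuilt models; the rule entries and exponents are hypothetical:

```python
import numpy as np

# Illustrative rule base; the real system encodes much richer heuristics
# linking lithology, fluid system and development scenario to models.
RULES = {
    ("sandstone", "gas-condensate"): {"n_g": 2.0, "n_o": 3.0, "swc": 0.20},
    ("carbonate", "gas-condensate"): {"n_g": 2.5, "n_o": 3.5, "swc": 0.15},
}

def corey_curves(lithology, fluid, points=6):
    """Corey-type curves on a normalized saturation grid chosen by rule."""
    p = RULES[(lithology, fluid)]
    se = np.linspace(0.0, 1.0, points)   # normalized saturation; the actual
                                         # saturation is s = swc + se*(1 - swc)
    return se ** p["n_g"], (1.0 - se) ** p["n_o"]   # krg, kro

krg, kro = corey_curves("sandstone", "gas-condensate")
print(np.round(krg, 3), np.round(kro, 3))
```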
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, April 28–May 1, 1996
Paper Number: SPE-35616-MS
Abstract
Continuous flow gas lift is one of the most common artificial lift methods in use. It works by injecting a continuous volume of high pressure gas into the tubing or tubing/casing annulus to lighten the column of oil and thus facilitate extraction. Excessive gas input is costly because of high gas prices and compression costs. Inefficient gas allocation in a field with limited gas availability also reduces production and profitability. Historically, a sensible allocation was achieved using a technique called the equal slope allocation method. This method uses graphical procedures to optimally allocate gas to a group of wells in order to optimize total oil production. However, the method has a disadvantage: it cannot be applied to wells where there is no instantaneous response to gas injection. The purpose of this paper is to present a methodology for automatically determining the optimum gas injection rate for a group of wells in order to maximize the total oil production rate for a given total amount of gas. The search algorithm proposed is a combination of stochastic domain exploration and a heuristic calculation of a descent direction, in order to avoid stopping the algorithm at a local optimum. Introduction Continuous flow gas lift works by injecting a continuous volume of high pressure gas into the tubing or tubing/casing annulus to lighten the column of oil and thus facilitate extraction. Ideally, if there were no restriction on the total amount of gas available, sufficient gas could be injected into an individual well until maximum production is achieved. However, in most cases, the total injection gas volume available for the system of wells is insufficient to reach the maximum oil production for every well, forcing the operator to approach maximum production cautiously. Inefficient gas allocation in a field with limited gas availability also reduces profitability. Excessive gas input is costly because of high gas prices and compression costs. Therefore, it is necessary to allocate a limited amount of gas to each well in an optimal way to get the maximum oil production rate from the field. In view of this situation, the development of a procedure to determine the optimal point at which to produce a group of wells was recognized as having potential for gas lift design improvement. Kanu et al. established the method of equal slope allocation under both unlimited and limited gas supply. They presented the formulation of the economic slope and the use of this slope to allocate a total amount of gas at the optimal economic point for a group of wells in a step-by-step procedure. This method works well under the following hypothesis:
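The equal-slope idea can be sketched as a greedy allocation of gas increments to whichever well shows the largest marginal oil gain, which drives all funded wells toward equal slopes on their performance curves; the curves below are illustrative, where real ones come from nodal analysis or well tests:

```python
import numpy as np

# Illustrative gas-lift performance curves: oil rate vs injection rate for
# each well, concave so that the marginal response declines with injection.
def oil_rate(a, b, q_inj):
    return a * np.log1p(q_inj / b)

wells = [(900, 400), (650, 300), (500, 250)]      # (a, b) per well, illustrative
TOTAL_GAS, STEP = 3000.0, 10.0                     # Mscf/d available, increment

alloc = np.zeros(len(wells))
for _ in range(int(TOTAL_GAS / STEP)):
    # Give the next increment to the well with the largest marginal oil gain;
    # for concave curves this ends with all funded wells at equal slopes.
    gains = [oil_rate(a, b, q + STEP) - oil_rate(a, b, q)
             for (a, b), q in zip(wells, alloc)]
    alloc[int(np.argmax(gains))] += STEP

print("allocation (Mscf/d):", alloc)
```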
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, June 28–30, 1993
Paper Number: SPE-26188-MS
Abstract
Introduction Many wells drilled and completed to produce oil and gas must be stimulated to increase flow rates and ultimate recovery. In low permeability reservoirs, long hydraulic fractures are required to optimize depletion of the reservoir. In high permeability reservoirs, formation damage during drilling must be overcome using short, highly conductive hydraulic fractures. Most recently, in ultra-high permeability, poorly consolidated formations, a new stimulation method that combines both hydraulic fracturing and gravel packing principles (the FRACPAC) has been used successfully to stimulate oil and gas production. To optimize production, one must optimize the fracture fluid properties, the treatment volume, and the fracture conductivity. Choosing the correct fluid and additives is extremely important to ensure the proppant is placed successfully. We also want to use a fluid that breaks and cleans up properly. Ideally, every fracture treatment should be designed by an expert with adequate data available. However, most fracture treatments are not designed by experts; instead, most stimulation treatments are designed by inexperienced engineers without all the data they need. We have developed a PC-based interactive computer model to help an engineer choose the best fluid, additives, and propping agent for a given set of reservoir properties. The computer model also optimizes the treatment volume based upon reservoir performance and economics. To select the fluids, additives, and propping agents, the expert system uses rules developed by surveying stimulation experts from different companies and reviewing the literature, then incorporating the knowledge into rules using an expert system shell. Our design expert queries the user for the necessary reservoir data; then, using the knowledge base, the expert system recommends the best materials for the treatment. In this paper, we explain the logic behind the rules we have developed, and we explain our optimization procedure. Although it is not possible to include every rule we have developed, we reference the source of many of our rules and present examples to illustrate our methodology. We expect the information contained in this paper will be useful to others working on a stimulation expert system. APPLYING EXPERT SYSTEM TECHNOLOGY The information contained in knowledge bases is the core of any expert system. The most difficult task when building an expert system is knowledge acquisition. As one improves the quantity and quality of the knowledge that is programmed, the expert system will better simulate the behavior of a human expert.
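A minimal sketch of what such a rule base might look like; these particular rules are hypothetical illustrations, not the paper's surveyed knowledge base:

```python
# Hypothetical rule set in the spirit of the paper's expert system; the real
# knowledge base was compiled from expert surveys and the literature.
def select_fluid(temp_f, perm_md, water_sensitive):
    """Return a fluid recommendation from simple illustrative rules."""
    if water_sensitive:
        return "oil-based or foamed fluid"              # avoid clay swelling
    if temp_f > 250:
        return "high-temperature crosslinked gel"       # thermal stability
    if perm_md > 50:
        return "linear gel or slickwater, high-conductivity proppant"
    return "borate-crosslinked guar"                    # default tight-gas choice

print(select_fluid(temp_f=210, perm_md=0.05, water_sensitive=False))
```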
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, June 7–9, 1989
Paper Number: SPE-19087-MS
Abstract
With recent changes within the gas transmission business, both internal (personnel, etc.) and external (open access, markets, etc.), companies are faced with how best to utilize their storage capabilities in this new environment. Natural, faced with this situation, had several options available. Initially, some discussion was given to purchasing a commercial simulator program. However, with 13 reservoirs, and given the time and investment involved along with the uncertainty of the final product, Natural took a different approach: organizing data so that both trend and analytical analysis can be performed quickly and easily. Given past problems with rigid databases, where data was too structured and inflexible for review, the decision was made to put as much data into spreadsheet programs as possible. From there, data can be reviewed or extracted into another spreadsheet for analysis. This paper will track data from the wellhead (type and frequency) to planning models (the final product). Intermediate steps show how the data is stored and analyzed to determine how each reservoir is behaving. From these intermediate analyses, individual reservoirs are then combined into a big picture for all reservoirs, which is in turn combined into the overall system plan. Examples, analysis, and conclusions drawn from various P/z vs inventory graphs, field data plots, and historical trends are shown and described. Introduction Many reservoir engineering and underground gas storage publications have outlined mathematical simulation techniques to describe the past and future performance of a gas reservoir. How a company actually monitors and manipulates data is very important. This paper presents an overview of how Natural Gas Pipeline monitors and manipulates data from its 13 gas storage reservoirs to obtain the past, present, and future performance of these reservoirs. From a reservoir engineer's viewpoint, certain relationships in this paper may not be totally correct because they are oversimplified. However, by simplifying data presentation, two benefits are derived: field and non-technical personnel can grasp the concepts faster, and more data can be analyzed in its raw form before engineering corrections are made. HISTORY OF DATA COMPUTERIZATION AT NATURAL GAS PIPELINE Until recently, computers were not used by Natural to analyze storage field data. Several attempts were made at data storage and analysis involving programmable calculators and early simulation programs. However, these attempts proved unsuccessful. The programmable calculators were used mainly for BHP, z factor, and metering calculations. While the calculators eliminated some mundane labor, they were limited in data storage, data retrieval, and flexibility. The simulation studies, performed in the mid '70s, were also found to be of limited benefit because the reservoirs were run more to fit the simulator program than the simulator fit the data. In the early '80s, individual mainframe work stations became available at Natural. Attempts were then made to organize and process the storage field data using programs written in Basic and Fortran. These attempts were somewhat successful. However, engineers ended up doing mostly programming work instead of engineering.
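The P/z-versus-inventory analysis mentioned above is a straight-line fit for a volumetric reservoir, since P/z = (Pi/zi)(1 - Gp/G); a spreadsheet-style sketch with illustrative numbers:

```python
import numpy as np

# P/z vs inventory plots as a straight line for a volumetric gas storage
# reservoir; fitting the trend is exactly the spreadsheet analysis described.
inventory = np.array([10.0, 14.0, 18.0, 22.0, 26.0])          # Bcf in storage
p_over_z = np.array([980.0, 1345.0, 1725.0, 2090.0, 2470.0])  # psia

slope, intercept = np.polyfit(inventory, p_over_z, 1)
inventory_at = lambda pz: (pz - intercept) / slope   # invert the trend line
print(f"P/z per Bcf: {slope:.1f}; inventory at P/z=2800: {inventory_at(2800):.1f} Bcf")
```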
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, June 7–9, 1989
Paper Number: SPE-19081-MS
Abstract
In considering the typical experimental conditions used for studies of methane oxidation to oxygenates or hydrocarbons in the presence of catalysts, it is clear that a significant potential exists for the appearance of non-catalyzed, thermally activated gas-phase reactions of methane and oxygen. Experimentally, in a quartz reactor at 800°C with a 3:1 methane-to-oxygen ratio and residence times of the order of 5–10 sec, significant conversion of methane (30%) to light hydrocarbons and COx, with appreciable selectivity (25%) to C2+ components, is observed due solely to these background reactions. In order to identify and describe the contribution of the gas phase processes during catalyzed reactions, a chemical kinetic model (HCT) developed at this Laboratory has been employed to describe these homogeneous gas phase reactions. Overall, the model predicts very well the trends and steady state results observed when tested against a series of experimental reactions comparing the effects of various reaction parameters such as residence time, temperature, and gas composition. The ability to accurately predict the magnitude of these background reactions should provide a means to begin to dissect the contributions of thermal gas phase chemistry from those of solely catalytic action. New catalyst materials containing niobium and lanthanum have been synthesized. It is observed that niobium alone at high loading levels is an oxidation catalyst facilitating the production of carbon dioxide. This behavior is modified when lanthanum is incorporated into the catalyst material. In that case, improved selectivity to oxidative coupling products of methane is observed. These results appear to be a consequence of catalyst interaction with the existing background gas phase reactions. Introduction A significant economic incentive exists to discover technologies that directly convert methane to higher value fuels and chemicals. This incentive has motivated active research efforts aimed at describing catalyzed reactions that facilitate a partial oxidation or oxidative coupling reaction between methane and oxygen. Our efforts have been focussed on the synthesis, characterization, and reactions of new catalysts that perform these oxidative processes with methane. During the study of some of these new catalysts, we observed, for even chemically quite distinct materials, similar reactivity patterns. An examination of the literature shows that despite the wide range of catalyst materials that have been studied under an equally wide variety of experimental conditions, a common reactivity pattern emerges. This pattern, characterized by the inverse relationship between methane conversion and product selectivity, has been observed for both partial oxidation and oxidative coupling reactions. In considering the typical experimental conditions used for such catalyst studies, it is clear that significant potential exists for the appearance of non-catalyzed, thermally activated gas-phase reactions of methane and oxygen. These reactions can occur in the absence of catalysts and, under the appropriate experimental conditions, can be quite important. Experimentally, we have observed significant conversion of methane (30%) to light hydrocarbons and COx with appreciable selectivity (25%) to C2+ components due solely to gas-phase reactions. In recognition of these thermal reactions, more experimental consideration is being given to describing the thermally induced, homogeneous gas phase ("background") reactions of methane and oxygen occurring during catalysis studies. In order to better quantify the contribution of these background reactions, and to determine whether gas-phase reactions may in part be responsible for the common reactivity observed during catalyzed reactions, we have employed a chemical kinetic model (HCT) developed at this Laboratory to describe the overall homogeneous gas phase reaction of methane and oxygen. Here we report a description and experimental verification of the model, applied to the case of methane oxidative coupling, in a series of tests comparing the effects of various experimental parameters such as temperature, residence time, and feed gas composition.
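The conversion and selectivity figures quoted (30% and 25%) follow from the standard definitions; a short sketch with illustrative mole rates chosen to reproduce them:

```python
# Standard definitions used in the abstract: methane conversion and C2+
# selectivity. Flows are illustrative mole rates in arbitrary units.
def conversion_and_selectivity(ch4_in, ch4_out, c2plus_carbon_out):
    converted = ch4_in - ch4_out
    conversion = converted / ch4_in
    # Selectivity: fraction of converted methane carbon ending up in C2+.
    selectivity = c2plus_carbon_out / converted
    return conversion, selectivity

conv, sel = conversion_and_selectivity(ch4_in=100.0, ch4_out=70.0,
                                       c2plus_carbon_out=7.5)
print(f"conversion = {conv:.0%}, C2+ selectivity = {sel:.0%}")   # 30%, 25%
```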
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, June 13–15, 1988
Paper Number: SPE-17732-MS
Abstract
Natural gas, as a raw material, is produced in large quantities in oil and gas wells in Nigeria. Although produced in large quantities, the lack of effective transmission and distribution systems results in a greater percentage of this energy source being flared off, resulting in wastage of valuable and scarce foreign exchange for the country. In 1983 alone, 78.65% of the total gas produced was flared, amounting to approximately 106.36 T kcal/year (1.241 GSCF/D). In an attempt to curtail this colossal wastage of energy and valuable raw material, the Federal Government of Nigeria passed a law on gas flaring and imposed a fine of thirty kobo (30 kobo per m³ in 1979 was equivalent to about 45 cents/m³) per cubic metre of gas flared. This yielded little or no result, since the oil producing companies found it more convenient and economical to flare the produced gases. Based on the above premises, a pipeline network for natural gas distribution for Port Harcourt city was designed with the aid of a Hewlett-Packard HP-85 computer. A static design model was used, and optimization of the network was carried out on the main pipeline diameter such that no pressure or technological specification was violated. The distribution network was designed to have a capacity of 522 MMscf/D (14.62 × 10⁶ m³/D) based on an estimated gas supply/demand for thirty years in Port Harcourt city. The distribution pattern to the residential areas was based on the population density of the city. The total length of the network was 16.6 miles (26.72 km). The maximum and minimum pressures specified at any point in the main pipeline were 300 psi (2067 kPa) and 100 psi (689 kPa), respectively. The design was carried out within a compression ratio range of 1 to 1.6 and main pipeline diameters of 16 to 38 in (40.64 to 96.52 cm), for inlet pressures ranging from 150 to 200 psi (1033.5 to 1378 kPa). Introduction Nigerian gas can be described as "solution gas" because it dissolves naturally in the oil being produced and occurs in a large number of small, widely scattered reservoirs. It is concentrated in the Niger Delta, which covers an area of about 41,000 sq. miles (106,189.50 km²). Nigeria's proven and probable reserves form about 1.1 percent of the world's proven reserves and are estimated at about 85 × 10¹² scf (2.407 × 10¹² m³), or about 17 billion barrels of oil equivalent. Of Nigeria's total proven reserves, 70% is located on land while 30% is offshore. About 60% is located east of the River Niger, while the rest is to the west. Nigeria has undiscovered oil/gas reserves of about 65 × 10¹² scf (1.841 × 10¹² m³). Associated gas accounts for about 50% of the proven reserves; of this, about 75% exists as cap gas. Experts estimate that the reserves locked in Nigerian soil are enough to last as long as 500 years, fuelling industries and homes and supporting exports. From Table 1, it will be observed that Nigerian gas contains chiefly methane, while the inert or non-hydrocarbon components are present only in traces. Table 2 gives the analysis of gas produced, utilised and flared between 1978 and 1984. On closer examination of the table, it will be seen that about 84.93%, amounting to about 149.367 T kcal/year (4147.784 G scf/year), of the total gas produced in Nigeria is flared yearly.
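A sizing calculation of this kind can be sketched with one common form of the Weymouth equation; the coefficient, gas properties, and the reading of the design capacity as 522 MMscf/D are assumptions for illustration:

```python
import math

# One common field-units form of the Weymouth equation (Q in scf/d, D in
# inches, L in miles, P in psia, T in degR); the leading coefficient varies
# slightly between references, so treat results as indicative only.
def weymouth_q(d_in, p1, p2, length_mi, sg=0.65, t_avg=520.0, z=0.9,
               tb=520.0, pb=14.7, c=433.5):
    return (c * (tb / pb) * d_in ** (8 / 3)
            * math.sqrt((p1 ** 2 - p2 ** 2) / (sg * t_avg * length_mi * z)))

DEMAND = 522e6          # scf/d, the design capacity quoted in the abstract
for d in (16, 20, 24, 30, 34, 38):          # candidate diameters, inches
    q = weymouth_q(d, p1=300, p2=100, length_mi=16.6)
    if q >= DEMAND:
        print(f"smallest adequate diameter: {d} in (~{q/1e6:.0f} MMscf/d)")
        break
```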
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, June 13–15, 1988
Paper Number: SPE-17734-MS
Abstract
Gas companies have long felt the need for flow modelling in natural gas transmission and distribution systems. With the emergence of digital processing of the information arising from in-depth physical study of gas flow phenomena, it became possible to develop network calculations, and the evolution of data processing (DP) hardware, combined with its ever-decreasing cost, has expanded their use. Two main categories of applications make use of these calculations: pipeline sizing studies, and pipeline management (network monitoring and control). The first category corresponds to static applications, where the totality of the network's component parts must be considered in order to find a technical and economic optimum for capital investments, both for network reinforcements and for network extensions. The second category entails the analysis of dynamic physical quantities, such as flow rates and instantaneous pressures, acquired and processed in real time. In this case, the aim is to give the monitoring station operator a snapshot of his network and a prediction of its behavior under various operating configurations. The present paper analyzes the DP hardware and software suitable for these two kinds of application and pays special attention to interactive and ergonomic aspects.
Introduction
The Gaz de France Distribution Division operates three categories of networks within different pressure ranges. Within an urban center, the primary network, called the feeder network, with a pressure between 14 × 10 and 20 × 10 Pa, carries the gas coming from the high pressure delivery stations on the transmission grid to the various districts or neighborhoods. The secondary networks, often meshed and with a pressure between 4 × 10 and 12 × 10 Pa or at 21 × 10 Pa, then take over, running along the streets, to serve the industrial and residential customers. Finally, a fair number of dead-end extensions or spur lines, making up the tertiary network and operated at the same pressure as the secondary network, have been built to feed customers in new neighborhoods situated on the outskirts of the city. Whether concerned with transmission or distribution networks, gas companies have been obliged to study and model the flows in these networks in order to ensure their customers quality and security of service. Numerous empirical pressure drop formulae have been established for gaseous flows in the various Reynolds-number domains. Thus, Gaz de France uses Renouard's formula, based on an explicit calculation of the pressure drop coefficient, for distribution networks, and Colebrook's formula, for entirely turbulent flows, for transmission. The Renouard and Colebrook formulae furnish, each in its own domain, the pressure drop over a pipeline of a given diameter and a given length; it then remains to extend these results to the totality of the pipelines making up the gas distribution or transmission network.
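The two per-pipe relations the introduction names can be sketched concretely. The snippet below uses one commonly quoted form of Renouard's quadratic formula (P in bar absolute, L in km, Q in Nm³/h, D in mm) and the standard Colebrook-White relation, solved by fixed-point iteration since it is implicit in the friction factor. The paper does not give Gaz de France's exact coefficients, so the Renouard constants here are an assumption drawn from the textbook form.

```python
import math

# Illustrative per-pipe pressure-drop calculations under stated assumptions;
# not a reproduction of the Gaz de France implementation described in the paper.

def renouard_outlet_pressure(p1_bar, s, length_km, q_nm3h, d_mm):
    """Renouard (quadratic form): P1^2 - P2^2 = 48.6 * s * L * Q^1.82 / D^4.82,
    where s is the gas relative density (air = 1)."""
    dp_sq = 48.6 * s * length_km * q_nm3h**1.82 / d_mm**4.82
    return math.sqrt(p1_bar**2 - dp_sq)

def colebrook_friction_factor(reynolds, rel_roughness, iters=50):
    """Colebrook-White: 1/sqrt(f) = -2 log10(eps/(3.7 D) + 2.51/(Re sqrt(f))),
    solved by fixed-point iteration (fully turbulent transmission flow)."""
    f = 0.02  # initial guess
    for _ in range(iters):
        f = (-2.0 * math.log10(rel_roughness / 3.7
                               + 2.51 / (reynolds * math.sqrt(f)))) ** -2
    return f

# Example: a 1 km, 100 mm distribution main carrying 250 Nm^3/h of 0.6-density gas
print(renouard_outlet_pressure(4.0, 0.6, 1.0, 250.0, 100.0))
print(colebrook_friction_factor(1e6, 1e-4))
```

Extending these per-pipe results to a whole meshed network, as the paper notes, is the network-calculation step: the same relations are applied link by link while enforcing flow balance at every node.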
Proceedings Papers
Publisher: Society of Petroleum Engineers (SPE)
Paper presented at the SPE Gas Technology Symposium, June 13–15, 1988
Paper Number: SPE-17735-MS
Abstract
Today's world of gas brokers, spot markets, and interruptible service is not conducive to the monthly cycles of billing and customer reconciliation of imbalances that supported the business of transporting gas for large, established, long-term transportation customers. Today's gas transmission company must be able to monitor, control, and report transportation business within shorter time constraints. Traditional divisions between real-time pipeline control and custody transfer billing must be removed, and the computer systems supporting these functions must reflect these business changes. GSAM is an integrated computer system designed to support real-time and custody transfer transportation management. Through the integration of meter facility, customer, transportation agreement, SCADA, gas composition, and custody transfer volume data, daily monitoring of customer activities and monthly calculations for commodity billing are accomplished, providing pipeline system and customer imbalance control. To function in the deregulated North American gas industry and remain competitive, gas transportation companies need not only to automate but also to integrate the traditionally segregated areas of real-time pipeline control and custody transfer billing. NOVA Corporation of Alberta is meeting this challenge with GSAM, a gas transportation management computer system integrating the Gas Supply and Measurement areas.
Introduction
Integrating all gas transportation management software systems under the umbrella of one central database system is a concept that has become both achievable and desirable with the recent changes in focus of the natural gas industry. An integrated gas transportation administrative system is especially applicable to the operation of a deregulated, competitive gas transportation pipeline with a rapidly growing number of transportation customers and with demands that each operating department have access to current data as soon as it is available. The GSAM (Gas Supply and Measurement) system is an integrated gas transportation management system developed by one gas pipeline company to handle this need. The development and functions of this particular system are reviewed in this paper to show how natural gas pipelines can benefit from a similar approach. The paper defines the approach taken, explores the functions the integrated system performs, discusses the background behind the need to create it, and describes how an integrated system can contribute to the smooth operation of the gas transportation business.
BACKGROUND AND APPROACH
Deregulation in the gas industry has meant a significant difference in the way gas pipeline systems are operated. In the case being studied, the requirement for an integrated gas transportation software package became evident with an increase of over 2,000 percent in the number of gas transportation customers, an expanded variety of transportation services, and dramatic growth in the number of transportation agreements being administered.
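The daily imbalance monitoring the abstract describes reduces, at its core, to comparing each customer's agreed volumes against measured custody transfer volumes. The Python sketch below illustrates that comparison only; the data model, field names, and the 2% tolerance are invented for illustration, since the paper does not specify GSAM's actual schema or business rules.

```python
from collections import defaultdict

# Hypothetical sketch of a daily customer-imbalance check: aggregate nominated
# (agreed) and measured volumes per customer for one gas day and flag any
# imbalance exceeding a tolerance. All names and thresholds are assumptions.
TOLERANCE = 0.02  # flag imbalances larger than 2% of nomination (assumption)

def daily_imbalances(nominations, measurements):
    """nominations/measurements: iterables of (customer_id, volume) pairs."""
    nom, met = defaultdict(float), defaultdict(float)
    for cust, vol in nominations:
        nom[cust] += vol
    for cust, vol in measurements:
        met[cust] += vol
    report = {}
    for cust in set(nom) | set(met):
        imbalance = met[cust] - nom[cust]  # positive: customer over-delivered
        flagged = nom[cust] > 0 and abs(imbalance) / nom[cust] > TOLERANCE
        report[cust] = (imbalance, flagged)
    return report

# Example: customer "A" nominated 1000 units but 1055 were measured
print(daily_imbalances([("A", 1000.0)], [("A", 1055.0)]))
```

In an integrated system such as the one described, the same volume records would also feed the monthly commodity billing calculations, which is precisely the benefit of removing the division between real-time control data and custody transfer data.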