Support-Vector Regression for Permeability Prediction in a Heterogeneous Reservoir: A Comparative Study
- Ammal F. Al-Anazi (University of Calgary) | Ian D. Gates (University of Calgary)
- Publisher: Society of Petroleum Engineers
- Source: SPE Reservoir Evaluation & Engineering
- Publication Date: June 2010
- Document Type: Journal Paper
- Pages: 485-495
- Copyright 2010, Society of Petroleum Engineers
- Disciplines: 5.1 Reservoir Characterisation, 5.6.4 Drillstem/Well Testing, 5.6.1 Open-hole/Cased-hole Log Analysis, 5.8.7 Carbonate Reservoir, 7.6.6 Artificial Intelligence, 1.6.9 Coring, Fishing, 4.3.4 Scale, 6.1.5 Human Resources, Competence and Training, 2.4.3 Sand/Solids Control
- Keywords: support vector machines, multilinear regression, core data, linear regression, log data
- Downloads: 3 in the last 30 days; 1,084 since 2007
- SPE Member Price: USD 12.00
- SPE Non-Member Price: USD 35.00
Permeability is a key parameter in reservoir-engineering computations, and the relationship between rock petrophysical properties and permeability is often complex and difficult to capture with conventional statistical methods. Neural-network-based methods can be used to develop more-accurate permeability correlations, but these correlations have limited generalizability, and global correlations are usually less accurate than local ones. The objective of this research is to build a permeability model with strong generalization performance. Recently, support-vector machines (SVMs), based on statistical-learning theory, have been proposed as a new intelligence technique for both prediction and classification tasks. The SVM formulation embodies the structural-risk-minimization (SRM) principle, which has been shown to be superior to the traditional empirical-risk-minimization (ERM) principle employed by conventional neural networks: SRM minimizes an upper bound on the expected risk, whereas ERM minimizes only the error on the training data. The formulation uses kernel functions, which project the data into higher-dimensional feature spaces and thereby allow more-complex nonlinear problems to be solved. It is this difference that gives SVMs a greater ability to generalize, which is the goal of statistical learning in reservoir characterization. This study introduces the support-vector-regression (SVR) algorithm to intelligent well-log analysis: an SVR model that predicts permeability from well logs in a heterogeneous sandstone reservoir is developed, and the basic ideas underlying support-vector machines for function estimation are reviewed. To demonstrate the potential of the proposed SVM regression technique for permeability prediction, its performance was compared with that of multilayer-perceptron, generalized-regression, and radial-basis-function neural networks. Accuracy and robustness were investigated, and statistical-error analysis reveals that the SVM approach is superior to the other methods in generalizing to previously unseen permeability data.
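To make the SRM and kernel discussion in the abstract concrete: in standard $\varepsilon$-insensitive SVR (the Vapnik formulation), the model minimizes $\frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}(\xi_i + \xi_i^{*})$ subject to $\lvert y_i - \langle w, \phi(x_i)\rangle - b\rvert \le \varepsilon + \xi_i^{(*)}$ with $\xi_i, \xi_i^{*} \ge 0$. The $\lVert w\rVert^{2}$ term bounds model capacity (the SRM side), the $C$-weighted slack sum penalizes training error (the ERM side), and the feature map $\phi$ enters only through a kernel $k(x_i, x_j) = \langle\phi(x_i), \phi(x_j)\rangle$. The sketch below is a minimal illustration of this workflow in Python using scikit-learn, not the paper's implementation; the well-log feature names and the synthetic data are assumptions made for the example.

```python
# Minimal sketch of kernel SVR for permeability prediction, assuming
# scikit-learn. The features below (porosity, gamma ray, resistivity)
# are illustrative stand-ins for the paper's well-log inputs, and the
# target is synthetic log10-permeability standing in for core data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

n = 200
X = np.column_stack([
    rng.uniform(0.05, 0.30, n),   # porosity (fraction)
    rng.uniform(20.0, 150.0, n),  # gamma ray (API)
    rng.uniform(1.0, 200.0, n),   # deep resistivity (ohm-m)
])
y = (8.0 * X[:, 0] - 0.01 * X[:, 1] + 0.3 * np.log10(X[:, 2])
     + rng.normal(0.0, 0.2, n))  # noisy synthetic log10(k)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize inputs: RBF-kernel SVR is sensitive to feature scales.
scaler = StandardScaler().fit(X_train)

# epsilon sets the width of the insensitive tube; C trades training
# error against the capacity (flatness) term in the SRM objective.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(scaler.transform(X_train), y_train)

print("held-out R^2:", model.score(scaler.transform(X_test), y_test))
```

In practice, $C$, $\varepsilon$, and the RBF-kernel width would be tuned by cross-validation on core-calibrated training data rather than fixed as above.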
- File Size: 1 MB
- Number of Pages: 11