Abstract
We recently proposed a novel Support Vector Regression (SVR) algorithm, called Lagrangian Support Vector Regression (LSVR), which reformulates standard linear support vector regression as the minimization of an unconstrained differentiable convex function. During incremental training, the inverse of the augmented matrix is computed from the previously obtained inverse, so the whole training set does not have to be relearned, which reduces the computational cost. In this paper we implement incremental LSVR and test it on the Mackey-Glass time series, comparing its performance with that of other algorithms. The experimental results show that the method achieves high-quality time series prediction.
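The incremental step described in the abstract rests on updating a matrix inverse from the previously computed inverse rather than re-inverting the augmented matrix from scratch. As a rough, self-contained illustration of that general idea only (not the paper's LSVR update formula), the NumPy sketch below applies the Sherman-Morrison identity for a rank-one correction; the function and variable names are our own.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Return inv(A + u v^T) computed from inv(A) without refactorizing A.

    A_inv : (n, n) inverse of the current matrix
    u, v  : (n, 1) column vectors of the rank-one correction
    """
    Au = A_inv @ u                        # A^{-1} u
    vA = v.T @ A_inv                      # v^T A^{-1}
    denom = 1.0 + (v.T @ Au).item()       # 1 + v^T A^{-1} u
    return A_inv - (Au @ vA) / denom

# Toy check: the incremental update matches a full re-inversion.
rng = np.random.default_rng(0)
n = 5
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
A_inv = np.linalg.inv(A)
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

incremental = sherman_morrison_update(A_inv, u, v)
full = np.linalg.inv(A + u @ v.T)
print(np.allclose(incremental, full))    # True (up to round-off)
```

Each such update costs O(n^2) operations instead of the O(n^3) of a full re-inversion, which is the kind of saving the incremental scheme aims at when new training samples arrive.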
Cite this paper
Duan, H., Hou, W., He, G., Zeng, Q. (2007). Predicting Time Series Using Incremental Langrangian Support Vector Regression. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72395-0_99