Abstract
Input variable selection is a key preprocessing step in any input/output modelling problem. Better generalization performance is normally obtained when the unneeded parameters introduced by irrelevant or redundant variables are eliminated. Information theory, through the concept of mutual information, provides a robust theoretical framework for performing input variable selection. Nevertheless, estimating the mutual information between the input variables and the output is usually more difficult for continuous variables than for classification problems. This paper presents an approach to input variable selection for continuous variables, adapted from a previous approach for classification problems, that makes use of a mutual information estimator based on the k-nearest neighbors.
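As a minimal illustration of the building block the abstract describes (not the paper's exact selection procedure), the sketch below ranks candidate inputs by their estimated mutual information with a continuous output, using scikit-learn's k-nearest-neighbors-based estimator, which follows the Kraskov et al. approach. The function name rank_inputs_by_mi and the toy data are assumptions for demonstration only.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def rank_inputs_by_mi(X, y, n_neighbors=3):
    """Rank candidate inputs by estimated mutual information with a
    continuous output, using scikit-learn's k-NN-based estimator
    (which follows Kraskov et al., 2004). Hypothetical helper, not
    the paper's algorithm."""
    mi = mutual_info_regression(X, y, n_neighbors=n_neighbors)
    order = np.argsort(mi)[::-1]  # most informative variable first
    return order, mi[order]

# Toy data: y depends on x0 and x1; x2 is irrelevant noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

order, scores = rank_inputs_by_mi(X, y)
print("variables ranked by MI:", order)        # expect x0, x1 before x2
print("estimated MI (nats):", np.round(scores, 3))
```

A simple MI ranking such as this treats each variable in isolation; methods in the spirit of the paper go further by also accounting for redundancy among the selected inputs.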
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Herrera, L.J., Pomares, H., Rojas, I., Verleysen, M., Guillén, A. (2006). Effective Input Variable Selection for Function Approximation. In: Kollias, S.D., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840817_5
DOI: https://doi.org/10.1007/11840817_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-38625-4
Online ISBN: 978-3-540-38627-8