Abstract
Feature selection and dimensionality reduction are important for high-dimensional signal processing and pattern recognition problems. Feature selection can be achieved with a filter approach, in which a chosen criterion is optimized. Using the mutual information (MI) between feature vectors and class labels as this criterion, we proposed an ICA-MI framework for feature selection. In this paper, we compare linear ICA and local linear ICA in terms of MI estimation accuracy, and study the bias-variance trade-off in feature projections and ranking.
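The core idea of MI-based feature ranking can be illustrated with a minimal sketch: score each candidate feature by an estimate of its mutual information with the class label and sort by that score. The histogram (plug-in) MI estimator and the synthetic two-class data below are illustrative assumptions, not the estimator used in the paper; in the full ICA-MI framework the features would first be transformed by a linear or local linear ICA projection before scoring.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of I(X; Y) in nats, for a 1-D
    continuous feature x and discrete class labels y."""
    # joint histogram over (feature bins) x (class labels)
    c_xy, _, _ = np.histogram2d(x, y, bins=(bins, len(np.unique(y))))
    p_xy = c_xy / c_xy.sum()                  # joint distribution
    p_x = p_xy.sum(axis=1, keepdims=True)     # feature marginal
    p_y = p_xy.sum(axis=0, keepdims=True)     # class marginal
    nz = p_xy > 0                             # avoid log(0)
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)
# feature 0 is informative (class-dependent mean); feature 1 is pure noise
X = np.column_stack([y + 0.5 * rng.standard_normal(1000),
                     rng.standard_normal(1000)])

scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]  # highest-MI feature first
```

On this toy data the informative feature receives a clearly higher MI score than the noise feature and is ranked first; the bias-variance trade-off studied in the paper enters through how the density (here, the histogram bin count) is estimated and how locally the ICA projection is fit.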
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Lan, T., Huang, Y., Erdogmus, D. (2006). A Comparison of Linear ICA and Local Linear ICA for Mutual Information Based Feature Ranking. In: Rosca, J., Erdogmus, D., Príncipe, J.C., Haykin, S. (eds) Independent Component Analysis and Blind Signal Separation. ICA 2006. Lecture Notes in Computer Science, vol 3889. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11679363_102
Print ISBN: 978-3-540-32630-4
Online ISBN: 978-3-540-32631-1
eBook Packages: Computer Science (R0)