
A Comparison of Linear ICA and Local Linear ICA for Mutual Information Based Feature Ranking

  • Conference paper
Independent Component Analysis and Blind Signal Separation (ICA 2006)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 3889)

Abstract

Feature selection and dimensionality reduction are important for high-dimensional signal processing and pattern recognition problems. Feature selection can be achieved by a filter approach, in which a chosen criterion is optimized. Using the mutual information (MI) between feature vectors and class labels as this criterion, we previously proposed an ICA-MI framework for feature selection. In this paper, we compare linear ICA and local linear ICA in terms of MI estimation accuracy, and study the bias-variance trade-off in feature projections and ranking.
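The ICA-MI filter idea described above can be illustrated with a short sketch: project the data through a linear ICA transform, estimate the MI between each independent component and the class label, and rank components by that MI. This is only a minimal illustration under stated assumptions, not the authors' implementation: it uses scikit-learn's FastICA and its k-NN-based `mutual_info_classif` estimator, and a stand-in dataset; the paper's own MI estimator and its local linear ICA variant differ.

```python
# Hedged sketch of MI-based feature ranking after a linear ICA projection.
# Assumptions (not from the paper): scikit-learn's FastICA as the ICA
# algorithm, mutual_info_classif as the MI estimator, Iris as a stand-in
# for the UCI datasets cited in the references.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_classif

X, y = load_iris(return_X_y=True)

# Learn a linear ICA projection: rows of S are the independent components
# evaluated at each sample.
ica = FastICA(n_components=4, random_state=0, max_iter=1000)
S = ica.fit_transform(X)

# Estimate MI between each independent component and the class label,
# then rank components by decreasing MI (the filter criterion).
mi = mutual_info_classif(S, y, random_state=0)
ranking = np.argsort(mi)[::-1]
print("MI per component:", np.round(mi, 3))
print("Component ranking (best first):", ranking)
```

A local linear variant, as compared in the paper, would partition the input space and fit one such ICA-MI model per region, trading estimation bias for variance as the regions shrink.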

References

  1. Fano, R.M.: Transmission of Information: A Statistical Theory of Communications. Wiley, New York (1961)

  2. Hellman, M.E., Raviv, J.: Probability of Error, Equivocation and the Chernoff Bound. IEEE Transactions on Information Theory 16, 368–372 (1970)

  3. Torkkola, K.: Feature Extraction by Non-Parametric Mutual Information Maximization. Journal of Machine Learning Research 3, 1415–1438 (2003)

  4. Battiti, R.: Using Mutual Information for Selecting Features in Supervised Neural Net Learning. IEEE Transactions on Neural Networks 5(4), 537–550 (1994)

  5. Al-Ani, A., Deriche, M.: An Optimal Feature Selection Technique Using the Concept of Mutual Information. In: Proceedings of ISSPA, pp. 477–480 (2001)

  6. Kwak, N., Choi, C.-H.: Input Feature Selection for Classification Problems. IEEE Transactions on Neural Networks 13(1), 143–159 (2002)

  7. Yang, H.H., Moody, J.: Feature Selection Based on Joint Mutual Information. In: Advances in Intelligent Data Analysis and Computational Intelligent Methods and Application (1999)

  8. Yang, H.H., Moody, J.: Data Visualization and Feature Selection: New Algorithms for Nongaussian Data. In: Advances in NIPS, pp. 687–693 (2000)

  9. Lan, T., Erdogmus, D., Adami, A., Pavel, M.: Feature Selection by Independent Component Analysis and Mutual Information Maximization in EEG Signal Classification. In: Proceedings of IJCNN 2005, Montreal, Canada, August 2005, pp. 3011–3016 (2005)

  10. Parra, L., Sajda, P.: Blind Source Separation via Generalized Eigenvalue Decomposition. Journal of Machine Learning Research 4, 1261–1269 (2003)

  11. Learned-Miller, E.G., Fisher III, J.W.: ICA Using Spacings Estimates of Entropy. Journal of Machine Learning Research 4, 1271–1295 (2003)

  12. UCI Machine Learning Repository, http://www.ics.uci.edu/~mlearn/MLRepository.html

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lan, T., Huang, Y., Erdogmus, D. (2006). A Comparison of Linear ICA and Local Linear ICA for Mutual Information Based Feature Ranking. In: Rosca, J., Erdogmus, D., Príncipe, J.C., Haykin, S. (eds) Independent Component Analysis and Blind Signal Separation. ICA 2006. Lecture Notes in Computer Science, vol 3889. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11679363_102

  • DOI: https://doi.org/10.1007/11679363_102

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-32630-4

  • Online ISBN: 978-3-540-32631-1

  • eBook Packages: Computer Science (R0)
