
Pairwise Selection of Features and Prototypes

  • Conference paper
Computer Recognition Systems

Part of the book series: Advances in Soft Computing (AINSC, volume 30)

Abstract

Learning from given patterns is realized by learning from their appropriate representations. This is usually done either by defining a set of features or by measuring proximities between pairs of objects. Both approaches are problem-dependent and aim at constructing a representation space in which discrimination functions can be defined.

In most situations, some feature reduction or prototype selection is mandatory. In this paper, a pairwise selection procedure for creating a suitable representation space is proposed. To determine an informative set of features (or prototypes), the correlations between feature pairs are taken into account. In this way, dependencies between features are detected, while overtraining is avoided because the criterion is evaluated in two-dimensional feature spaces. Several experiments show that for small-sample-size problems, the proposed algorithm can outperform traditional selection methods.
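
To make the pairwise idea concrete, the following is a minimal sketch, assuming a simple two-dimensional criterion (cross-validated 1-NN accuracy per feature pair) and a greedy collection of features from the best-scoring pairs. The criterion, the function name pairwise_feature_selection, and the synthetic data are illustrative assumptions, not the exact algorithm evaluated in the paper.

    # Illustrative sketch only: rank feature pairs by a simple two-dimensional
    # criterion (cross-validated 1-NN accuracy here) and collect features from
    # the best-scoring pairs. Criterion, names and data are assumptions, not
    # the paper's exact method.
    from itertools import combinations

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier


    def pairwise_feature_selection(X, y, n_features, cv=5):
        """Select `n_features` feature indices via pairwise 2-D scoring."""
        scores = []
        for i, j in combinations(range(X.shape[1]), 2):
            # Evaluate the criterion in the two-dimensional subspace (i, j),
            # which keeps each estimation problem small.
            acc = cross_val_score(KNeighborsClassifier(n_neighbors=1),
                                  X[:, [i, j]], y, cv=cv).mean()
            scores.append((acc, i, j))

        # Greedily accumulate features from the highest-scoring pairs.
        selected = []
        for _, i, j in sorted(scores, reverse=True):
            for f in (i, j):
                if f not in selected:
                    selected.append(f)
                if len(selected) == n_features:
                    return selected
        return selected


    # Usage on synthetic small-sample data (hypothetical example):
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 20))             # few samples, many features
        y = (X[:, 3] + X[:, 7] > 0).astype(int)   # only two informative features
        print(pairwise_feature_selection(X, y, n_features=4))

Evaluating the criterion only in two-dimensional subspaces is the property the abstract credits with detecting pairwise dependencies while limiting overtraining in small-sample settings.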





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pekalska, E., Harol, A., Lai, C., Duin, R.P.W. (2005). Pairwise Selection of Features and Prototypes. In: Kurzyński, M., Puchała, E., Woźniak, M., żołnierek, A. (eds) Computer Recognition Systems. Advances in Soft Computing, vol 30. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-32390-2_31


  • DOI: https://doi.org/10.1007/3-540-32390-2_31

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-25054-8

  • Online ISBN: 978-3-540-32390-7

  • eBook Packages: Engineering, Engineering (R0)
