
Direct Minimization of Error Rates in Multivariate Classification


Summary

We propose a computer-intensive method for linear dimension reduction that minimizes the classification error directly. Simulated annealing (Bohachevsky et al. 1986), a modern optimization technique, is used to solve this problem effectively. The approach also makes it easy to incorporate user preferences by means of penalty terms. Simulations and a real-world example demonstrate the superiority of this optimal classification over classical discriminant analysis (McLachlan 1992). Special emphasis is given to the case in which discriminant analysis collapses.
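The following is a minimal sketch (not the authors' implementation) of the idea described above: simulated annealing searches over unit-norm projection vectors for one that directly minimizes the training misclassification rate of a simple nearest-class-mean rule in the projected space. All names and settings (error_rate, anneal_projection, the cooling schedule, the step size) are illustrative assumptions; the penalty terms mentioned above would enter as additional terms in the objective.

import numpy as np

def error_rate(a, X, y):
    """Misclassification rate of a nearest-class-mean rule on the 1-D projection X @ a."""
    z = X @ a
    means = {c: z[y == c].mean() for c in np.unique(y)}
    pred = np.array([min(means, key=lambda c: abs(v - means[c])) for v in z])
    return float(np.mean(pred != y))

def anneal_projection(X, y, n_iter=5000, temp0=1.0, step=0.1, seed=0):
    """Simulated annealing over unit-norm projection vectors (illustrative settings)."""
    rng = np.random.default_rng(seed)
    a = rng.normal(size=X.shape[1])
    a /= np.linalg.norm(a)
    cur_a, cur_err = a, error_rate(a, X, y)
    best_a, best_err = cur_a, cur_err
    for t in range(1, n_iter + 1):
        temp = temp0 / np.log(t + 1)                 # slow (logarithmic) cooling
        cand = cur_a + step * rng.normal(size=cur_a.size)
        cand /= np.linalg.norm(cand)                 # stay on the unit sphere
        err = error_rate(cand, X, y)
        # accept downhill moves always, uphill moves with Metropolis probability
        if err < cur_err or rng.random() < np.exp(-(err - cur_err) / temp):
            cur_a, cur_err = cand, err
            if err < best_err:
                best_a, best_err = cand, err
    return best_a, best_err

# Example: two Gaussian classes in five dimensions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (100, 5)), rng.normal(0.8, 1.0, (100, 5))])
y = np.repeat([0, 1], 100)
a, err = anneal_projection(X, y)
print("projection:", np.round(a, 2), "training error rate:", err)

Because the misclassification rate is a piecewise-constant function of the projection vector, gradient-based optimizers are of little use for this objective, which is precisely why a stochastic search such as simulated annealing is attractive here.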


References

  • I. O. Bohachevsky, M. E. Johnson, M. L. Stein, Generalized Simulated Annealing for Function Optimization, Technometrics, 28, 3, 209–217 (1986).

  • K. Fukunaga, Introduction to Statistical Pattern Recognition, second edition, New York: Academic Press (1990).

  • U. Heilemann and H. J. Münch, West German Business Cycles 1963–1994: A Multivariate Discriminant Analysis, CIRET-Conference in Singapore, CIRET-Studien 50 (1996).

  • R. E. Lucas, Understanding business cycles, in: Studies in Business-Cycle Theory, Cambridge, Mass. and London, 215–240 (1983).

  • G. J. McLachlan, Discriminant Analysis and Statistical Pattern Recognition, John Wiley & Sons (1992).

  • C. Posse, Projection Pursuit Discriminant Analysis for Two Groups, Commun. Statist. - Theory and Methods, 21, 1, 1–19 (1992).

  • J. Polzehl, Projection pursuit discriminant analysis, Comp. Stat. and Data Analysis, 20, 141–157 (1995).

  • W. H. Press, B. P. Flannery, S. A. Teukolsky and W. T. Vetterling, Numerical Recipes in C, 2nd edition, 444–455, Cambridge University Press, Cambridge (1992).

  • M. C. Röhl and C. Weihs, Optimal vs. Classical Linear Dimension Reduction, in: W. Gaul, H. Locarek-Junge (eds.), Classification in the Information Age, Studies in Classification, Data Analysis, and Knowledge Organization, Springer, 252–259 (1999).


Acknowledgment

This work was supported by the Collaborative Research Centre “Reduction of Complexity in Multivariate Data Structures” (SFB 475) of the German Research Foundation (DFG). We particularly thank Dipl.-Stat. D. Steuer for his constructive criticism. We also thank two anonymous referees for their helpful comments.


Cite this article

Röhl, M.C., Weihs, C. & Theis, W. Direct Minimization of Error Rates in Multivariate Classification. Computational Statistics 17, 29–46 (2002). https://doi.org/10.1007/s001800200089
