
META-DES

Published: 01 May 2015

Abstract

Dynamic ensemble selection systems work by estimating the level of competence of each classifier from a pool of classifiers. Only the most competent ones are selected to classify a given test sample. This is achieved by defining a criterion to measure the level of competence of a base classifier, such as its accuracy in local regions of the feature space around the query instance. However, a single criterion about the behavior of a base classifier is not sufficient to accurately estimate its level of competence. In this paper, we present a novel dynamic ensemble selection framework using meta-learning. We propose five distinct sets of meta-features, each corresponding to a different criterion for measuring the level of competence of a classifier for the classification of input samples. The meta-features are extracted from the training data and used to train a meta-classifier to predict whether or not a base classifier is competent enough to classify an input instance. During the generalization phase, the meta-features are extracted from the query instance and passed as input to the meta-classifier, which estimates whether a base classifier is competent enough to be added to the ensemble. Experiments are conducted over several small-sample-size classification problems, i.e., problems with a high degree of uncertainty due to the lack of training data. Experimental results show that the proposed meta-learning framework greatly improves classification accuracy compared with current state-of-the-art dynamic ensemble selection techniques.

Highlights

  • We propose a novel dynamic ensemble selection framework using meta-learning.
  • We present five sets of meta-features to measure the competence of a classifier.
  • Results demonstrate that the proposed framework outperforms current techniques.
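The selection-by-meta-learning idea in the abstract can be illustrated with a minimal sketch. This is not the paper's META-DES implementation: the five sets of meta-features are reduced here to just two (accuracy in the local region of competence and global accuracy), the base classifiers are toy decision stumps, the meta-classifier is a simple perceptron, and all names, the 2-D dataset, and the parameter values are illustrative assumptions.

```python
import random

def make_data(n, seed=0):
    # Toy 2-D problem: label is 1 iff x0 + x1 > 1.
    rng = random.Random(seed)
    pts = [[rng.uniform(0, 1), rng.uniform(0, 1)] for _ in range(n)]
    return [(x, 1 if x[0] + x[1] > 1 else 0) for x in pts]

class Stump:
    # One-feature threshold classifier, standing in for a pool member.
    def __init__(self, feat, thresh):
        self.feat, self.thresh = feat, thresh
    def predict(self, x):
        return 1 if x[self.feat] > self.thresh else 0

def region_of_competence(x, train, k):
    # k nearest training samples to the query (squared Euclidean distance).
    return sorted(train, key=lambda p: sum((a - b) ** 2
                                           for a, b in zip(x, p[0])))[:k]

def meta_features(clf, x, train, k):
    # Two toy meta-features: accuracy in the local region, global accuracy.
    region = region_of_competence(x, train, k)
    local_acc = sum(clf.predict(px) == py for px, py in region) / k
    global_acc = sum(clf.predict(px) == py for px, py in train) / len(train)
    return [local_acc, global_acc]

def train_meta(pool, train, k, epochs=10, lr=0.1):
    # Perceptron meta-classifier: target is 1 when the base classifier is
    # correct on the training sample ("competent"), 0 otherwise.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in train:
            for clf in pool:
                f = meta_features(clf, x, train, k)
                target = 1 if clf.predict(x) == y else 0
                out = 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0
                err = target - out
                w[0] += lr * err * f[0]
                w[1] += lr * err * f[1]
                b += lr * err
    return w, b

def des_predict(pool, x, train, k, w, b):
    # Keep only the classifiers the meta-classifier judges competent for x,
    # then combine them by majority vote (whole pool as a fallback).
    votes = []
    for clf in pool:
        f = meta_features(clf, x, train, k)
        if w[0] * f[0] + w[1] * f[1] + b > 0:
            votes.append(clf.predict(x))
    if not votes:
        votes = [clf.predict(x) for clf in pool]
    return 1 if sum(votes) * 2 >= len(votes) else 0
```

A pool such as `[Stump(f, t) for f in (0, 1) for t in (0.3, 0.5, 0.7)]` with `k = 7` is enough to exercise the pipeline end to end; the point of the sketch is only the structure (meta-feature extraction, meta-training, per-query selection), not the accuracy figures reported in the paper.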



Published In

Pattern Recognition  Volume 48, Issue 5
May 2015
360 pages

Publisher

Elsevier Science Inc.

United States

Author Tags

  1. Classifier competence
  2. Dynamic ensemble selection
  3. Ensemble of classifiers
  4. Meta-learning

Qualifiers

  • Research-article


Cited By

  • (2024) "Learn Together Stop Apart: An Inclusive Approach to Ensemble Pruning", Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 1166-1176. doi:10.1145/3637528.3672018
  • (2024) "A dynamic multiple classifier system using graph neural network for high dimensional overlapped data", Information Fusion, 103:C. doi:10.1016/j.inffus.2023.102145
  • (2024) "A scalable dynamic ensemble selection using fuzzy hyperboxes", Information Fusion, 102:C. doi:10.1016/j.inffus.2023.102036
  • (2024) "Recent advances in applications of machine learning in reward crowdfunding success forecasting", Neural Computing and Applications, 36:26, pp. 16485-16501. doi:10.1007/s00521-024-09886-6
  • (2023) "Security Relevant Methods of Android's API Classification: A Machine Learning Empirical Evaluation", IEEE Transactions on Computers, 72:11, pp. 3273-3285. doi:10.1109/TC.2023.3291998
  • (2023) "Dynamic ensemble learning for multi-label classification", Information Sciences, 623:C, pp. 94-111. doi:10.1016/j.ins.2022.12.022
  • (2023) "The Krypteia ensemble", Information Fusion, 90:C, pp. 283-297. doi:10.1016/j.inffus.2022.09.021
  • (2023) "Exploring diversity in data complexity and classifier decision spaces for pool generation", Information Fusion, 89:C, pp. 567-587. doi:10.1016/j.inffus.2022.09.001
  • (2023) "Dynamic ensemble pruning algorithms fusing meta-learning with heuristic parameter optimization for time series prediction", Expert Systems with Applications, 225:C. doi:10.1016/j.eswa.2023.120148
  • (2023) "An empirical study of dynamic selection and random under-sampling for the class imbalance problem", Expert Systems with Applications, 221:C. doi:10.1016/j.eswa.2023.119703
