Abstract
Traditional research on data stream mining emphasizes building classifiers with high accuracy, which often yields classifiers whose accuracy drops sharply when the underlying concept drifts. In this paper, we present our RTRC system, which maintains good classification accuracy under concept drift once enough samples have been scanned from the data stream. Using a Markov chain and the least-squares method, the system predicts not only what the next concept will be but also when the concept will drift. Experimental results confirm the advantages of our system over Weighted Bagging and CVFDT, two representative systems in streaming data mining.
This work is supported by NSF 60373108.
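The paper's full mechanism is not shown on this page, but the two-part idea in the abstract — a Markov chain over recurring concepts to predict *which* concept comes next, and a least-squares fit over past drift points to predict *when* the next drift occurs — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the assumption that drift times are roughly evenly spaced (so a linear fit extrapolates them) are mine.

```python
import numpy as np

def transition_matrix(history, n_concepts):
    """Estimate a first-order Markov transition matrix from an observed
    sequence of concept labels; each row is normalized to sum to 1."""
    counts = np.zeros((n_concepts, n_concepts))
    for a, b in zip(history, history[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1  # leave all-zero rows as uniform zeros
    return counts / row_sums

def predict_next_concept(history, n_concepts):
    """Most likely next concept given the concept currently active."""
    P = transition_matrix(history, n_concepts)
    return int(np.argmax(P[history[-1]]))

def predict_next_drift(drift_times):
    """Least-squares linear fit of observed drift times against their
    index, extrapolated one step ahead to estimate the next drift."""
    idx = np.arange(len(drift_times))
    slope, intercept = np.polyfit(idx, drift_times, 1)
    return slope * len(drift_times) + intercept

# Example: concepts alternate 0 -> 1 -> 0 -> 1 -> 0, drifts every 100 samples.
history = [0, 1, 0, 1, 0]
print(predict_next_concept(history, n_concepts=2))   # -> 1
print(predict_next_drift([100, 200, 300]))           # -> 400.0
```

With a recurring-context history like the alternating one above, the transition matrix concentrates mass on the 0→1 and 1→0 entries, so the next concept is predicted correctly; the least-squares fit recovers the 100-sample drift spacing exactly.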
References
Fan, W.: Systematic Data Selection to Mine Concept-Drifting Data Streams. In: Proceedings of KDD, pp. 128–137 (2004)
Chu, F., Zaniolo, C.: Fast and Light Boosting for Adaptive Mining of Data Streams. In: Dai, H., Srikant, R., Zhang, C. (eds.) PAKDD 2004. LNCS (LNAI), vol. 3056, Springer, Heidelberg (2004)
Kolter, J., Maloof, M.: Dynamic Weighted Majority: A New Ensemble Method for Tracking Concept Drift. In: Proceedings of ICDM (2003)
Hulten, G., Spencer, L., Domingos, P.: Mining Time-Changing Data Streams. In: Proceedings of ACM SIGKDD (2001)
Street, W., Kim, Y.: A Streaming Ensemble Algorithm (SEA) for Large-Scale Classification. In: Proceedings of SIGKDD (2001)
Zhu, X., Wu, X., Yang, Y.: Effective Classification of Noisy Data Streams with Attribute-Oriented Dynamic Classifier Selection. In: Proceedings of ICDM (2004)
Widmer, G., Kubat, M.: Learning in the Presence of Concept Drift and Hidden Contexts. Machine Learning, 69–101 (1996)
Salganicoff, M.: Tolerating Concept and Sampling Shift in Lazy Learning Using Prediction Error Context Switching. AI Review, Special Issue on Lazy Learning 11(1-5), 133–155 (1997)
Harries, M., Sammut, C., Horn, K.: Extracting Hidden Context. Machine Learning 32(2), 101–126 (1998)
Yang, Y., Wu, X., Zhu, X.: Combining Proactive and Reactive Predictions for Data Streams. In: Proceedings of KDD (2005)
Wang, H., Fan, W., Yu, P., Han, J.: Mining Concept-Drifting Data Streams Using Ensemble Classifiers. In: Proceedings of SIGKDD (2003)
Domingos, P., Hulten, G.: Mining High-Speed Data Streams. In: Proceedings of KDD, pp. 71–80 (2000)
Jin, R., Agrawal, G.: Efficient Decision Tree Construction on Streaming Data. In: Proceedings of ACM SIGKDD (2003)
John, G., Langley, P.: Estimating Continuous Distributions in Bayesian Classifiers. In: Proceedings of the Conference on Uncertainty in Artificial Intelligence, pp. 338–345. Morgan Kaufmann, San Francisco (1995)
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Wang, Y., Li, Z., Zhang, Y., Zhang, L., Jiang, Y. (2006). Improving the Performance of Data Stream Classifiers by Mining Recurring Contexts. In: Li, X., Zaïane, O.R., Li, Z. (eds) Advanced Data Mining and Applications. ADMA 2006. Lecture Notes in Computer Science(), vol 4093. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11811305_119
DOI: https://doi.org/10.1007/11811305_119
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37025-3
Online ISBN: 978-3-540-37026-0
eBook Packages: Computer Science (R0)