Abstract
In this paper we introduce a novel EM-based algorithm for Gaussian Mixture Models with an unknown number of components. Although the EM (Expectation-Maximization) algorithm yields the maximum likelihood solution, it has two well-known drawbacks: (i) it requires a careful initialization of the parameters; and (ii) the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density function) associated with each kernel to measure the quality of a given mixture model, together with a modification of the classical EM algorithm that finds the optimal number of kernels in the mixture. We apply our algorithm to the unsupervised color image segmentation problem.
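The idea can be illustrated with a minimal sketch (a hypothetical illustration, not the authors' implementation): a basic EM loop for a k-component 1-D Gaussian mixture, plus the differential entropy of a Gaussian kernel, which is the maximum-entropy bound that an entropy-based quality criterion can compare against. The quantile-based initialization and the helper names `em_gmm_1d` and `gaussian_entropy` are assumptions made for this sketch.

```python
import numpy as np

def em_gmm_1d(x, k, iters=100):
    """Basic EM for a 1-D Gaussian mixture with k components.

    Quantile-based initialization is an illustrative choice, not the
    paper's initialization scheme.
    """
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means over the data
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each kernel for each sample
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        n = resp.sum(axis=0)
        pi = n / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / n
        var = np.maximum(var, 1e-6)  # guard against collapsing kernels
    return pi, mu, var

def gaussian_entropy(var):
    """Differential entropy of a 1-D Gaussian: 0.5 * log(2*pi*e*var).

    A Gaussian maximizes entropy for a given variance, so a kernel whose
    assigned data fall short of this bound is poorly modeled by a single
    Gaussian and is a candidate for splitting into more components.
    """
    return 0.5 * np.log(2 * np.pi * np.e * var)
```

In this spirit, one would fit the mixture with EM, compare each kernel's empirical entropy against `gaussian_entropy(var)`, and increase the number of kernels where the gap indicates non-Gaussian structure.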
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Peñalver, A., Escolano, F., Sáez, J.M. (2006). Color Image Segmentation Through Unsupervised Gaussian Mixture Models. In: Sichman, J.S., Coelho, H., Rezende, S.O. (eds) Advances in Artificial Intelligence - IBERAMIA-SBIA 2006. Lecture Notes in Computer Science, vol 4140. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11874850_19
DOI: https://doi.org/10.1007/11874850_19
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-45462-5
Online ISBN: 978-3-540-45464-9