Abstract
In this paper we address the problem of finding the optimal number of reference vectors in vector quantization (VQ) from the point of view of the Minimum Description Length (MDL) principle. We formulate VQ in terms of the MDL principle and then derive different instantiations of the algorithm depending on the coding procedure. Moreover, we develop an efficient algorithm (similar to EM-type algorithms) for optimizing the MDL criterion. In addition, we show how the MDL principle can be used to increase the robustness of the training algorithm. To visualize the behavior of the algorithm, we illustrate our approach on 2D clustering problems and present applications to image coding. Finally, we outline various ways to extend the algorithm.
This work was supported by a grant from the Austrian National Fonds zur Förderung der wissenschaftlichen Forschung (No. S7002MAT). A. L. also acknowledges the support of the Ministry of Science and Technology of the Republic of Slovenia (Projects J2-8829 and J2-0414).
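The abstract's core idea is to select the codebook size by minimizing a description-length criterion rather than fixing it in advance. The Python sketch below is only a rough, hedged illustration of that general idea: it scores candidate codebook sizes with a simple two-part code length (codebook cost plus index and residual cost) and uses a plain Lloyd/k-means loop as the EM-style inner optimizer. The Gaussian residual model, the 16-bit-per-parameter cost, and all function names are illustrative assumptions, not the coding procedure or optimization scheme of the paper itself.

```python
# Illustrative sketch only: MDL-style selection of the number of reference
# vectors in vector quantization. The code-length terms (Gaussian residual
# coding, fixed bits per codebook parameter) are assumptions for this demo.
import numpy as np


def description_length(data, codebook, labels, bits_per_param=16):
    """Two-part code length: model cost (codebook) + data cost (indices, residuals)."""
    n, d = data.shape
    k = codebook.shape[0]
    model_bits = k * d * bits_per_param          # transmit the codebook itself
    index_bits = n * np.log2(max(k, 2))          # transmit one index per sample
    residuals = data - codebook[labels]          # quantization error
    var = residuals.var() + 1e-12
    residual_bits = 0.5 * n * d * np.log2(2 * np.pi * np.e * var)
    return model_bits + index_bits + residual_bits


def kmeans(data, k, iters=50, seed=0):
    """Plain EM-style (Lloyd) iteration used as the inner optimizer."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)].astype(float)
    labels = np.zeros(len(data), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((data[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return centers, labels


def mdl_vq(data, k_max=20):
    """Return (score, codebook, labels) for the codebook size with the lowest MDL score."""
    best = None
    for k in range(1, k_max + 1):
        centers, labels = kmeans(data, k)
        dl = description_length(data, centers, labels)
        if best is None or dl < best[0]:
            best = (dl, centers, labels)
    return best
```

The trade-off driving the selection is the usual MDL one: adding reference vectors increases the model and index cost but shrinks the residual cost, and the chosen codebook size is the one where the total code length stops decreasing.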
Copyright information
© 1999 Springer-Verlag London Limited
About this paper
Cite this paper
Bischof, H., Leonardis, A. (1999). Vector Quantization and Minimum Description Length. In: Singh, S. (eds) International Conference on Advances in Pattern Recognition. Springer, London. https://doi.org/10.1007/978-1-4471-0833-7_36
DOI: https://doi.org/10.1007/978-1-4471-0833-7_36
Publisher Name: Springer, London
Print ISBN: 978-1-4471-1214-3
Online ISBN: 978-1-4471-0833-7