Abstract
Learning based on kernel machines is widely recognized as a powerful tool in various fields of information science, such as pattern recognition and regression estimation. The efficacy of a kernel machine depends on the distance between the unknown true function and the linear subspace of the reproducing kernel Hilbert space (RKHS), corresponding to the adopted kernel, that is spanned by the training data set. In this paper, we propose a framework for model selection of kernel-based learning machines that incorporates a class of kernels with an invariant metric.
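The quantity the abstract refers to — the RKHS distance between a true function and the subspace spanned by the training inputs — can be computed in closed form via Gram matrices when the true function itself lies in the RKHS. The following is a minimal numerical sketch (not the paper's method): it assumes a Gaussian kernel and a synthetic true function f = Σ_j α_j k(·, z_j), and measures ‖f − Pf‖², where P is the orthogonal projection onto span{k(·, x_i)}. All names (`gram`, `subspace_distance_sq`, the points `z`, `x`) are illustrative.

```python
import numpy as np

def gram(a, b, s=1.0):
    """Gaussian-kernel Gram matrix k(x, y) = exp(-|x - y|^2 / (2 s^2))."""
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * s**2))

# Synthetic "true" function f = sum_j alpha_j k(., z_j), assumed to lie in the RKHS.
rng = np.random.default_rng(0)
z = rng.uniform(-2.0, 2.0, size=5)
alpha = rng.normal(size=5)

def subspace_distance_sq(x):
    """Squared RKHS distance from f to span{k(., x_i)}.

    The projection coefficients c solve K_xx c = K_xz alpha, and
    ||f - Pf||^2 = ||f||^2 - ||Pf||^2 = alpha' K_zz alpha - c' K_xx c.
    """
    K_xx = gram(x, x)
    K_xz = gram(x, z)
    K_zz = gram(z, z)
    c = np.linalg.lstsq(K_xx, K_xz @ alpha, rcond=None)[0]
    return alpha @ K_zz @ alpha - c @ K_xx @ c

# The distance is non-negative, and vanishes once the training
# inputs x include the centers z (so f itself lies in the subspace).
x_small = rng.uniform(-2.0, 2.0, size=3)
d_small = subspace_distance_sq(x_small)
d_full = subspace_distance_sq(np.concatenate([x_small, z]))
print(d_small, d_full)
```

The distance shrinks as the training set grows, which is exactly why model (kernel) selection matters: a well-chosen kernel places the true function close to the data-spanned subspace.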
Keywords
- True Function
- Reproducing Kernel Hilbert Space
- Kernel Machine
- Parametric Projection
- Machine Learning Problem
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Tanaka, A., Sugiyama, M., Imai, H., Kudo, M., Miyakoshi, M. (2006). Model Selection Using a Class of Kernels with an Invariant Metric. In: Yeung, D.Y., Kwok, J.T., Fred, A., Roli, F., de Ridder, D. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2006. Lecture Notes in Computer Science, vol 4109. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11815921_95
DOI: https://doi.org/10.1007/11815921_95
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37236-3
Online ISBN: 978-3-540-37241-7