Abstract
Subspace learning in a Reproducing Kernel Hilbert Space (RKHS) is among the most popular approaches to domain adaptation. The key goal is to embed the source and target domain samples into a common RKHS subspace where their distributions match better. However, most existing domain adaptation measures are based either on first-order statistics, which cannot accurately quantify the difference between non-Gaussian distributions, or on complicated covariance matrices, which are difficult to use and optimize. In this paper, we propose a neat and effective RKHS subspace domain adaptation measure, the Minimum Distribution Gap (MDG), for which a rigorous mathematical formula can be derived to learn the weighting matrix of the optimized orthogonal Hilbert subspace basis via the Lagrange multiplier method. To show the efficiency of the proposed MDG measure, extensive numerical experiments on different datasets have been performed, and comparisons with four other state-of-the-art algorithms in the literature show that the proposed MDG measure is very promising.
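The paper's exact MDG objective is not reproduced in this excerpt. As an illustrative sketch of the general idea only — measuring a distribution gap between source and target samples after projection onto a shared orthonormal subspace basis — the following computes a standard kernel-based gap (the biased empirical squared-MMD estimate with an RBF kernel, which is not necessarily the authors' measure; the basis `W` here is random rather than learned):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def distribution_gap(Xs, Xt, W, sigma=1.0):
    # Project both samples onto the subspace spanned by the columns of W,
    # then return a biased empirical estimate of the squared MMD.
    Zs, Zt = Xs @ W, Xt @ W
    Kss = rbf_kernel(Zs, Zs, sigma)
    Ktt = rbf_kernel(Zt, Zt, sigma)
    Kst = rbf_kernel(Zs, Zt, sigma)
    return Kss.mean() + Ktt.mean() - 2.0 * Kst.mean()

rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.normal(size=(3, 2)))   # an orthonormal 2-D basis
Xs = rng.normal(0.0, 1.0, size=(200, 3))       # source sample
Xt_near = rng.normal(0.0, 1.0, size=(200, 3))  # target with matching distribution
Xt_far = rng.normal(3.0, 1.0, size=(200, 3))   # target shifted away from the source
# the shifted target should yield the larger gap
print(distribution_gap(Xs, Xt_near, W) < distribution_gap(Xs, Xt_far, W))
```

In a subspace-learning method of the kind the abstract describes, `W` would be the optimization variable — chosen to minimize such a gap subject to orthogonality constraints — rather than fixed in advance.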
Funding
No funding was received to assist with the preparation of this manuscript.
Ethics declarations
Conflict of interest
The authors have no relevant financial or non-financial interests to disclose.
Appendix A: Identical random variables
Two random variables \(Y\) and \(Y'\) with finite second-order moments are identical if and only if their mean square error is zero,
\[
\mathbb {E}\left[ \left( Y-Y'\right) ^2\right] = 0. \tag{22}
\]
To demonstrate this, the expectation in Eq. (22) can be expanded over the joint probability density \(p(y, y')\) as follows
\[
\mathbb {E}\left[ \left( Y-Y'\right) ^2\right] = \iint _{\Omega } \left( y-y'\right) ^2 p(y, y')\, \textrm{d}y\, \textrm{d}y'. \tag{23}
\]
Because both \(\left( y-y'\right) ^2\) and \( p(y, y')\) are non-negative, Eq. (23) is zero only when at least one of the following two conditions is met at every point of the probability domain \(\Omega \),
\[
\left( y-y'\right) ^2 = 0 \quad \text {or} \quad p(y, y') = 0. \tag{24}
\]
Eq. (24) confines all probability mass to the diagonal \(y = y'\), which is equivalent to the joint probability \(p(y, y') = f(y) \delta (y-y')\). Integrating out either variable then shows that the marginal probabilities of \(Y\) and \(Y'\) are both equal to \(f\), hence identical, and Eq. (22) is proved.
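The zero-MSE criterion is stronger than merely having equal marginals: it forces \(Y = Y'\) almost surely. A quick numerical illustration (an informal check, not part of the proof) contrasts a copy of \(Y\), whose mean square error is exactly zero, with an independent variable sharing the same marginal, whose mean square error is close to \(2\,\textrm{Var}(Y)\):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(size=100_000)        # Y ~ N(0, 1)

# Y' = Y: the mean square error is exactly zero
print(np.mean((y - y)**2))

# Y' independent of Y but with the same marginal N(0, 1):
# the mean square error concentrates near 2 * Var(Y) = 2, not zero
yp = rng.normal(size=100_000)
print(np.mean((y - yp)**2))
```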
About this article
Cite this article
Qiu, Y., Zhang, C., Xiong, C. et al. RKHS subspace domain adaption via minimum distribution gap. Pattern Anal Applic 26, 1425–1439 (2023). https://doi.org/10.1007/s10044-023-01170-y