Abstract
Transfer learning aims to borrow transferable knowledge from a source domain (a related domain) to build an adapter for a target domain. Since the adapter is built on the source domain, the robustness and generalization of a single adapter are likely to be limited. To further improve adapter performance, in this paper we propose a parallel ensemble strategy based on evidence theory. Specifically, we first quantify an adaptation degree for each instance of the source domain based on evidence theory. Second, we redefine Determinantal Point Process (DPP) sampling with the adaptation degree, and use the improved DPP sampling to generate k different subsets. Finally, we select and combine the base adapters trained on these subsets. In the proposed ensemble strategy, the adaptation degree ensures the high transferability of the base adapters, while DPP sampling increases the diversity among them. The strategy thus reduces the conflict between accuracy and diversity, and improves the robustness and generalization of the adapters. Numerical experiments on real-world applications comprehensively demonstrate the effectiveness and efficiency of the proposed ensemble strategy; the results show that it improves transfer performance.
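The sampling step described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the RBF similarity kernel, the adaptation-degree vector `q`, and the greedy MAP approximation of the DPP are all assumptions made for the sketch.

```python
# Hedged sketch: quality-weighted DPP subset selection for training a
# diverse ensemble of base adapters. The similarity kernel, adaptation
# degrees `q`, and greedy MAP approximation are illustrative assumptions.
import numpy as np

def dpp_kernel(X, q):
    """L-ensemble kernel L = diag(q) @ S @ diag(q): q encodes per-instance
    quality (here, the adaptation degree), S an RBF similarity matrix."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    S = np.exp(-d2 / (np.median(d2) + 1e-12))
    return q[:, None] * S * q[None, :]

def greedy_dpp_subset(L, size, rng):
    """Greedy MAP approximation of a k-DPP: start from a random item and
    repeatedly add the item that most increases the log-determinant of
    the selected principal submatrix (trading off quality and diversity)."""
    n = L.shape[0]
    selected = [int(rng.integers(n))]  # random seed item -> varied subsets
    while len(selected) < size:
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if sign > 0 and logdet > best_gain:
                best, best_gain = i, logdet
        if best is None:  # numerical fallback: take any unselected item
            best = next(i for i in range(n) if i not in selected)
        selected.append(best)
    return selected

def sample_k_subsets(X, q, k, size, seed=0):
    """Draw k index subsets, one per base adapter to be trained."""
    rng = np.random.default_rng(seed)
    L = dpp_kernel(X, q)
    return [greedy_dpp_subset(L, size, rng) for _ in range(k)]
```

Each sampled subset favors instances with a high adaptation degree while penalizing near-duplicates, so base adapters trained on the subsets tend to be both transferable and mutually diverse.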
© 2021 Springer Nature Switzerland AG
Cite this paper
Lv, Y., Zhang, B., Yue, X., Xu, Z., Liu, W. (2021). Ensemble of Adapters for Transfer Learning Based on Evidence Theory. In: Denœux, T., Lefèvre, E., Liu, Z., Pichon, F. (eds) Belief Functions: Theory and Applications. BELIEF 2021. Lecture Notes in Computer Science(), vol 12915. Springer, Cham. https://doi.org/10.1007/978-3-030-88601-1_7
Print ISBN: 978-3-030-88600-4
Online ISBN: 978-3-030-88601-1