
Ensemble of Adapters for Transfer Learning Based on Evidence Theory

  • Conference paper
  • First Online:
Belief Functions: Theory and Applications (BELIEF 2021)

Abstract

Transfer learning aims to borrow transferable knowledge from a source domain (a related domain) to build an adapter for a target domain. Because the adapter is built on the source domain, the robustness and generalization of a single adapter are likely to be limited. To further improve adapter performance, we propose in this paper a parallel ensemble strategy based on evidence theory. Specifically, we first quantify an adaptation degree for each instance of the source domain based on evidence theory. Second, we redefine Determinantal Point Process (DPP) sampling with the adaptation degree, and use the improved DPP sampling to generate k different subsets. Finally, we select and combine the base adapters trained on these subsets. In the proposed ensemble strategy, the adaptation degree ensures the higher transferability of the base adapters, while DPP sampling increases the diversity among them. The strategy thus reduces the conflict between accuracy and diversity and improves the robustness and generalization of the adapters. Numerical experiments on real-world applications comprehensively demonstrate the effectiveness and efficiency of the proposed ensemble strategy; the results show that it improves transfer performance.
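The pipeline the abstract describes — weight each source instance by a quality score (here, the adaptation degree) and draw diverse subsets via a quality-weighted DPP — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an RBF similarity kernel, folds the quality scores into an L-ensemble `L = diag(q) S diag(q)`, and uses a greedy log-determinant (MAP-style) selection as a simple stand-in for exact k-DPP sampling; the function name and all parameters are hypothetical.

```python
import numpy as np

def greedy_quality_dpp(features, quality, k):
    """Greedily pick k indices approximately maximizing the determinant of the
    quality-weighted L-ensemble L = diag(q) @ S @ diag(q), where S is an RBF
    similarity matrix over the source instances. High quality (adaptation
    degree) raises an item's marginal probability; the determinant penalizes
    picking similar items, which encourages diversity among the subsets."""
    n = len(quality)
    # Squared Euclidean distances between instances, then RBF similarity
    # with a median-heuristic bandwidth.
    sq = np.sum(features ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * features @ features.T
    bandwidth = np.median(d2[d2 > 0]) + 1e-12
    S = np.exp(-d2 / (2.0 * bandwidth))
    L = quality[:, None] * S * quality[None, :]
    selected = []
    for _ in range(k):
        best, best_logdet = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            sub = L[np.ix_(idx, idx)]
            # Small jitter keeps slogdet stable on near-singular submatrices.
            _, logdet = np.linalg.slogdet(sub + 1e-9 * np.eye(len(idx)))
            if logdet > best_logdet:
                best, best_logdet = i, logdet
        selected.append(best)
    return selected
```

Running this k times (or sampling with different random restarts) yields the k diverse, high-adaptation subsets on which the base adapters would be trained; exact k-DPP sampling as in Kulesza and Taskar would replace the greedy loop with eigendecomposition-based sampling.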



Author information


Corresponding authors

Correspondence to Bofeng Zhang or Xiaodong Yue .


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Lv, Y., Zhang, B., Yue, X., Xu, Z., Liu, W. (2021). Ensemble of Adapters for Transfer Learning Based on Evidence Theory. In: Denœux, T., Lefèvre, E., Liu, Z., Pichon, F. (eds) Belief Functions: Theory and Applications. BELIEF 2021. Lecture Notes in Computer Science(), vol 12915. Springer, Cham. https://doi.org/10.1007/978-3-030-88601-1_7


  • DOI: https://doi.org/10.1007/978-3-030-88601-1_7

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-88600-4

  • Online ISBN: 978-3-030-88601-1

  • eBook Packages: Computer Science, Computer Science (R0)
