Multi-domain transfer learning for text classification

X Su, R Li, X Li. CCF International Conference on Natural Language Processing and Chinese Computing, 2020. Springer.
Abstract
Leveraging data from multiple related domains to enhance model generalization is critical for transfer learning in text classification. However, most existing approaches separate features into shared and private spaces regardless of the correlations between domains, resulting in inadequate feature sharing among the most closely related domains. In this paper, we propose a generic dual-channel multi-task learning framework for multi-domain text classification that captures global-shared, local-shared, and private features simultaneously. Our framework incorporates an adversarial network and a mixture of experts into a neural network for multi-domain text classification, enabling richer feature sharing across domains. Extensive experiments on real-world text classification datasets spanning 16 domains demonstrate that our approach outperforms five state-of-the-art techniques.
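The abstract gives no implementation details, so the following is a minimal PyTorch sketch of one plausible reading of the dual-channel design: a globally shared encoder trained adversarially through gradient reversal against a domain discriminator, a domain-gated mixture of experts for local sharing, and per-domain private encoders. All module names, dimensions, and wiring are assumptions for illustration, not the authors' code.

```python
# Minimal sketch of the dual-channel idea, assuming a PyTorch setting.
# All names, dimensions, and wiring are illustrative assumptions;
# this is not the authors' implementation.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    # Identity on the forward pass, negated gradient on the backward pass:
    # the standard trick for training an encoder adversarially against a
    # domain discriminator.
    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output


class DualChannelMTL(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim,
                 n_domains, n_experts, n_classes):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, emb_dim)
        # Channel 1: globally shared encoder; the gradient-reversed domain
        # discriminator pushes its features to be domain-invariant.
        self.global_enc = nn.Sequential(nn.Linear(emb_dim, hid_dim), nn.ReLU())
        self.domain_disc = nn.Linear(hid_dim, n_domains)
        # Channel 2: mixture of experts with a domain-conditioned gate, so
        # closely related domains can share experts (local-shared features).
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(emb_dim, hid_dim), nn.ReLU())
             for _ in range(n_experts)])
        self.gate = nn.Embedding(n_domains, n_experts)
        # One private encoder per domain for domain-specific features.
        self.private = nn.ModuleList(
            [nn.Sequential(nn.Linear(emb_dim, hid_dim), nn.ReLU())
             for _ in range(n_domains)])
        self.classifier = nn.Linear(3 * hid_dim, n_classes)

    def forward(self, tokens, offsets, domain_id):
        x = self.embed(tokens, offsets)                        # (B, emb_dim)
        g = self.global_enc(x)                                 # global-shared
        domain_logits = self.domain_disc(GradReverse.apply(g))
        gate = torch.softmax(self.gate(domain_id), dim=-1)     # (B, n_experts)
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)
        local = (gate.unsqueeze(-1) * expert_out).sum(dim=1)   # local-shared
        priv = torch.stack([self.private[d](row)               # private
                            for d, row in zip(domain_id.tolist(), x)])
        logits = self.classifier(torch.cat([g, local, priv], dim=-1))
        return logits, domain_logits
```

Under this reading, training would minimize the classification loss on `logits` plus an adversarial domain loss on `domain_logits`; the gradient reversal makes the discriminator's objective push the global encoder toward domain-invariant features, while the gated experts let related domains share capacity selectively.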