
DCAT: Combining Multisemantic Dual-Channel Attention Fusion for Text Classification

Published: 01 July 2023, in IEEE Intelligent Systems, vol. 38, no. 4, July-Aug. 2023

Abstract

Text classification occupies a fundamental and central position in natural language processing. Many solutions to the text classification problem exist, but few combine semantics from multiple perspectives to improve classification performance. This article proposes DCAT, a dual-channel attention network model that exploits the complementarity between semantic views to remedy gaps in understanding. Specifically, DCAT first captures the logical semantics of the text through transductive learning over a graph structure. Then, in the attention fusion layer (channel), the logical semantics guide joint training with the other semantic views, incrementally correcting the predictions for unlabeled test data. Experiments show that DCAT achieves more accurate classification on a wide range of text classification datasets, which is vital for subsequent text mining tasks.
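The abstract does not spell out the fusion mechanism, but its description, a graph-based "logical" channel combined with a second semantic channel through attention, suggests a layer along the following lines. This is a minimal sketch under stated assumptions, not the authors' implementation: the module name DualChannelAttentionFusion, the per-document channel weighting, and the random stand-in features are all illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualChannelAttentionFusion(nn.Module):
    """Hypothetical sketch of a dual-channel attention fusion layer.

    Channel A carries graph-based "logical" semantics (e.g., a GCN output);
    channel B carries sequence-based semantics (e.g., a BERT output).
    The two views are fused with a learned attention weight per document.
    """

    def __init__(self, dim: int, num_classes: int):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 2)        # scores the two channels jointly
        self.classifier = nn.Linear(dim, num_classes)

    def forward(self, h_graph: torch.Tensor, h_seq: torch.Tensor) -> torch.Tensor:
        # h_graph, h_seq: (batch, dim) document representations from each channel
        scores = self.attn(torch.cat([h_graph, h_seq], dim=-1))    # (batch, 2)
        weights = F.softmax(scores, dim=-1)                        # attention over channels
        fused = weights[:, :1] * h_graph + weights[:, 1:] * h_seq  # convex combination
        return self.classifier(fused)                              # class logits

# Toy usage with random features standing in for GCN / BERT outputs.
if __name__ == "__main__":
    model = DualChannelAttentionFusion(dim=768, num_classes=4)
    h_g = torch.randn(8, 768)   # graph-channel document embeddings
    h_s = torch.randn(8, 768)   # sequence-channel document embeddings
    logits = model(h_g, h_s)
    print(logits.shape)         # torch.Size([8, 4])
```

A single softmax over the two channel scores keeps the fused representation a convex combination of the two views, mirroring the interpolation used in transductive BERT-plus-GCN models such as BertGCN; whether DCAT weights channels this way or element-wise is not stated in the abstract.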

