Abstract
Aspect-level sentiment analysis has proven significant for classifying the sentiment polarity of consumer reviews. For target-specific sentiment analysis, we propose a co-attentive deep learning method modeled on the way humans read. First, GRUs extract hidden states from the word embeddings of the target and the context. Then, through the interactive learning of a co-attention network, representations of the target and the context are obtained. Finally, attention weights computed by a self-attention mechanism update the final representations. Experimental results on the SemEval 2014 and Twitter datasets provide strong evidence of the method's high accuracy.
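The paper's exact formulation is not reproduced here, but the core co-attention step the abstract describes can be sketched in plain Python: compute a pairwise affinity matrix between target and context hidden states (here simple dot products), turn it into attention weights in each direction, and pool. Function names, the max-pooling over the affinity matrix, and the toy vectors are illustrative assumptions, not the authors' implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def co_attention(ctx, tgt):
    """Interactive (co-)attention between context and target hidden states.

    ctx: list of context hidden-state vectors (e.g. from a GRU)
    tgt: list of target hidden-state vectors
    Returns attention-pooled context and target representations.
    """
    # Affinity matrix: similarity of every (context word, target word) pair.
    aff = [[dot(c, t) for t in tgt] for c in ctx]

    # Context attention: relevance of each context word to the target
    # (max over target positions, then softmax across context positions).
    ctx_scores = softmax([max(row) for row in aff])
    # Target attention: symmetric, max over context positions per target word.
    tgt_scores = softmax([max(aff[i][j] for i in range(len(ctx)))
                          for j in range(len(tgt))])

    # Attention-weighted pooling into fixed-size representations.
    dim = len(ctx[0])
    ctx_repr = [sum(w * v[k] for w, v in zip(ctx_scores, ctx)) for k in range(dim)]
    tgt_repr = [sum(w * v[k] for w, v in zip(tgt_scores, tgt)) for k in range(dim)]
    return ctx_repr, tgt_repr
```

In a full model, `ctx` and `tgt` would be GRU outputs and the pooled vectors would then pass through the self-attention layer described in the abstract before classification; the sketch only shows the interactive weighting itself.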
Notes
- 1.
A detailed introduction to this task is available at: http://alt.qcri.org/semeval2014/task4/.
- 2.
Pre-trained GloVe word vectors can be obtained from http://nlp.stanford.edu/projects/glove/.
Acknowledgments
This work was supported by the National Natural Science Foundation of China under Grant No. 61876205, and the Science and Technology Plan Project of Guangzhou under Grant Nos. 201802010033 and 201804010433.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Li, H., Xue, Y., Zhao, H., Hu, X., Peng, S. (2019). Co-attention Networks for Aspect-Level Sentiment Analysis. In: Tang, J., Kan, MY., Zhao, D., Li, S., Zan, H. (eds) Natural Language Processing and Chinese Computing. NLPCC 2019. Lecture Notes in Computer Science(), vol 11839. Springer, Cham. https://doi.org/10.1007/978-3-030-32236-6_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-32235-9
Online ISBN: 978-3-030-32236-6