
Co-attention Networks for Aspect-Level Sentiment Analysis

  • Conference paper
Natural Language Processing and Chinese Computing (NLPCC 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11839)

Abstract

Aspect-level sentiment analysis has proven significant for classifying the sentiment polarity of consumer reviews. For target-specific sentiment analysis, we propose a co-attention deep learning method modeled on how humans read. First, GRUs extract hidden states from the word embeddings of the target and the context. Then, through the interactive learning of the co-attention network, representations of the target and the context are obtained. Finally, attention weights computed by a self-attention mechanism update the final representations. Experimental results on the SemEval 2014 and Twitter datasets provide strong evidence of the method's high accuracy.
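The interactive co-attention step described in the abstract can be sketched in plain Python. This is a minimal illustration, not the authors' exact formulation: the dot-product affinity matrix, the max-pooling of interaction scores, and the function names are all assumptions made for the sketch.

```python
import math


def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]


def co_attention(Hc, Ht):
    """Interactively attend over context states Hc (n x d) and target
    states Ht (m x d), e.g. GRU hidden states.

    Returns (context_repr, target_repr), each a d-dimensional weighted sum.
    """
    # Affinity matrix: A[i][j] = dot(Hc[i], Ht[j]) measures how strongly
    # context word i interacts with target word j.
    A = [[sum(c * t for c, t in zip(hc, ht)) for ht in Ht] for hc in Hc]
    # Score each context word by its strongest interaction with any target word,
    # and each target word by its strongest interaction with any context word.
    ctx_scores = [max(row) for row in A]
    tgt_scores = [max(A[i][j] for i in range(len(Hc))) for j in range(len(Ht))]
    alpha = softmax(ctx_scores)  # attention over context words
    beta = softmax(tgt_scores)   # attention over target words
    d = len(Hc[0])
    context_repr = [sum(alpha[i] * Hc[i][k] for i in range(len(Hc))) for k in range(d)]
    target_repr = [sum(beta[j] * Ht[j][k] for j in range(len(Ht))) for k in range(d)]
    return context_repr, target_repr
```

In the full model, the two pooled representations would then be refined by self-attention and fed to a classifier; the max-pooling here stands in for whichever pooling the interactive learning actually uses.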


Notes

  1. A detailed introduction to this task is available at http://alt.qcri.org/semeval2014/task4/.

  2. Pre-trained GloVe word vectors can be obtained from http://nlp.stanford.edu/projects/glove/.


Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant No. 61876205, and the Science and Technology Plan Project of Guangzhou under Grant Nos. 201802010033 and 201804010433.

Author information

Corresponding author: Yun Xue.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Li, H., Xue, Y., Zhao, H., Hu, X., Peng, S. (2019). Co-attention Networks for Aspect-Level Sentiment Analysis. In: Tang, J., Kan, M.-Y., Zhao, D., Li, S., Zan, H. (eds.) Natural Language Processing and Chinese Computing. NLPCC 2019. Lecture Notes in Computer Science, vol. 11839. Springer, Cham. https://doi.org/10.1007/978-3-030-32236-6_17


  • DOI: https://doi.org/10.1007/978-3-030-32236-6_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32235-9

  • Online ISBN: 978-3-030-32236-6

  • eBook Packages: Computer Science; Computer Science (R0)
