Triplet Embedding Convolutional Recurrent Neural Network for Long Text Semantic Analysis

  • Conference paper
Web Information Systems Engineering – WISE 2022 (WISE 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13724)


Abstract

Deep recurrent neural networks perform excellently in sentence-level semantic analysis. However, due to the computational curse of dimensionality, their application to long texts is limited. We therefore propose a Triplet Embedding Convolutional Recurrent Neural Network for long text analysis. First, a triplet is extracted from each sentence of the long text. The most crucial head entity is then fed into the CRNN network, composed of CNN and Bi-GRU networks, while the relation and tail entities are input to a CNN network through three splicing layers. Finally, the outputs are passed into a global pooling layer to obtain the final result. Entity fusion and entity replacement are also applied before triplet extraction to retain the text’s structural and semantic information. We conducted experiments on a large-scale criminal case dataset; the results show that our model significantly improves performance on the judgment prediction task.
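The two-branch pipeline the abstract describes can be sketched roughly as follows. This is a minimal NumPy sketch, not the authors' implementation: all dimensions, weights, and entity lengths are illustrative assumptions, the three splicing layers are collapsed to a single concatenation, and training is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB, CONV_OUT, HID, N_CLASSES = 16, 8, 8, 4  # illustrative sizes

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv1d(x, w):
    """Valid 1-D convolution with ReLU: x is (T, d_in), w is (k, d_in, d_out)."""
    k = w.shape[0]
    out = np.stack([np.einsum('kd,kdo->o', x[t:t + k], w)
                    for t in range(x.shape[0] - k + 1)])
    return np.maximum(out, 0.0)

class GRU:
    """A plain GRU cell unrolled over a sequence."""
    def __init__(self, d_in, d_h):
        s = 1.0 / np.sqrt(d_h)
        self.W = rng.uniform(-s, s, (3, d_in, d_h))  # update, reset, candidate
        self.U = rng.uniform(-s, s, (3, d_h, d_h))
        self.d_h = d_h

    def run(self, x):
        h, hs = np.zeros(self.d_h), []
        for x_t in x:
            z = sigmoid(x_t @ self.W[0] + h @ self.U[0])
            r = sigmoid(x_t @ self.W[1] + h @ self.U[1])
            h_tilde = np.tanh(x_t @ self.W[2] + (r * h) @ self.U[2])
            h = (1 - z) * h + z * h_tilde
            hs.append(h)
        return np.stack(hs)

def bi_gru(x, fwd, bwd):
    """Concatenate forward and (re-reversed) backward hidden states per step."""
    return np.concatenate([fwd.run(x), bwd.run(x[::-1])[::-1]], axis=1)

# Hypothetical triplet from one sentence; each element is a (length, EMB) matrix.
head = rng.normal(size=(6, EMB))
relation = rng.normal(size=(3, EMB))
tail = rng.normal(size=(5, EMB))

# Head branch: CNN followed by Bi-GRU (the CRNN of the abstract).
w_head = rng.normal(scale=0.1, size=(3, EMB, CONV_OUT))
head_feats = bi_gru(conv1d(head, w_head), GRU(CONV_OUT, HID), GRU(CONV_OUT, HID))

# Relation/tail branch: splice (concatenate along the sequence axis), then CNN.
w_rt = rng.normal(scale=0.1, size=(3, EMB, 2 * HID))
rt_feats = conv1d(np.concatenate([relation, tail], axis=0), w_rt)

# Global max pooling over each branch, then a softmax classifier.
pooled = np.concatenate([head_feats.max(axis=0), rt_feats.max(axis=0)])
logits = pooled @ rng.normal(scale=0.1, size=(pooled.size, N_CLASSES))
probs = np.exp(logits - logits.max())
probs /= probs.sum()  # shape (N_CLASSES,), sums to 1
```

Global max pooling makes the class scores independent of the entity sequence lengths, which is what lets a fixed classifier sit on top of variable-length long-text input.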



Acknowledgements

This work was supported by the “Six Talent Peaks” High Level Talents of Jiangsu Province (XYDXX-204), the Key R&D Program of Jiangsu Province (BE2020026), XJTLU Research Development Funding (RDF-20-02-10), the Suzhou Science and Technology Development Planning Programme, Key Industrial Technology Innovation, Prospective Applied Basic Research Project (SGC2021086), and the Special Patent Research Project of the China National Intellectual Property Office (Y220702).

Corresponding author

Correspondence to Huakang Li.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Liu, J., Zhu, M., Ouyang, H., Sun, G., Li, H. (2022). Triplet Embedding Convolutional Recurrent Neural Network for Long Text Semantic Analysis. In: Chbeir, R., Huang, H., Silvestri, F., Manolopoulos, Y., Zhang, Y. (eds) Web Information Systems Engineering – WISE 2022. WISE 2022. Lecture Notes in Computer Science, vol 13724. Springer, Cham. https://doi.org/10.1007/978-3-031-20891-1_43

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-20891-1_43

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20890-4

  • Online ISBN: 978-3-031-20891-1

  • eBook Packages: Computer Science, Computer Science (R0)
