Abstract
Deep recurrent neural networks perform very well in sentence-level semantic analysis. However, owing to the curse of computational dimensionality, their application to long texts has been limited. We therefore propose a Triplet Embedding Convolutional Recurrent Neural Network for long-text analysis. First, a triplet is extracted from each sentence of the long text. The most crucial element, the head entity, is then fed into the CRNN, which is composed of CNN and Bi-GRU networks, while the relation and tail entities are fed into a CNN through three splicing layers. Finally, the outputs of both branches pass through a global pooling layer to produce the final result. Before triplet extraction, entity fusion and entity replacement are applied to preserve the text's structural and semantic information. Experiments on a large-scale criminal case dataset show that our model significantly improves the judgment prediction task.
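The two-branch architecture sketched in the abstract can be illustrated in code. The following is a minimal, hypothetical PyTorch sketch, not the authors' implementation: all layer sizes, the single splicing step for the relation/tail branch, and the use of global max pooling are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TripletCRNN(nn.Module):
    """Hypothetical sketch of the triplet-embedding CRNN from the abstract.

    Head-entity tokens pass through a CNN followed by a Bi-GRU; relation and
    tail-entity tokens are concatenated ("spliced") and pass through a CNN.
    Both branch outputs are globally pooled and merged into class logits.
    """

    def __init__(self, vocab_size=1000, emb_dim=64, hidden=64, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Head-entity branch: 1-D convolution, then a bidirectional GRU.
        self.head_conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
        self.head_gru = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        # Relation/tail branch: spliced embeddings through a CNN.
        self.rt_conv = nn.Conv1d(emb_dim, 2 * hidden, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveMaxPool1d(1)  # global max pooling
        self.classifier = nn.Linear(4 * hidden, num_classes)

    def forward(self, head_ids, rel_ids, tail_ids):
        # Each *_ids argument: (batch, seq_len) tensor of token ids.
        h = self.head_conv(self.embed(head_ids).transpose(1, 2))      # (B, H, T)
        h, _ = self.head_gru(h.transpose(1, 2))                        # (B, T, 2H)
        h = self.pool(h.transpose(1, 2)).squeeze(-1)                   # (B, 2H)
        rt = self.embed(torch.cat([rel_ids, tail_ids], dim=1))         # splice rel + tail
        rt = self.pool(self.rt_conv(rt.transpose(1, 2))).squeeze(-1)   # (B, 2H)
        return self.classifier(torch.cat([h, rt], dim=1))              # (B, classes)

model = TripletCRNN()
logits = model(torch.randint(0, 1000, (2, 8)),   # head-entity tokens
               torch.randint(0, 1000, (2, 4)),   # relation tokens
               torch.randint(0, 1000, (2, 4)))   # tail-entity tokens
print(tuple(logits.shape))  # (2, 10)
```

The design choice worth noting is that only the head-entity branch carries the recurrent (Bi-GRU) component; the relation/tail branch stays purely convolutional, which keeps the sequential modeling cost bounded by the short head-entity span rather than the full document.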
Acknowledgements
This work was supported by the "Six talent peaks" High Level Talents of Jiangsu Province (XYDXX-204), Province Key R&D Program of Jiangsu (BE2020026), XJTLU Research Development Funding (RDF-20-02-10), Suzhou Science and Technology Development Planning Programme-Key Industrial Technology Innovation-Prospective Applied Basic Research Project (SGC2021086), and the Special Patent Research Project of the China National Intellectual Property Office (Y220702).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Liu, J., Zhu, M., Ouyang, H., Sun, G., Li, H. (2022). Triplet Embedding Convolutional Recurrent Neural Network for Long Text Semantic Analysis. In: Chbeir, R., Huang, H., Silvestri, F., Manolopoulos, Y., Zhang, Y. (eds) Web Information Systems Engineering – WISE 2022. WISE 2022. Lecture Notes in Computer Science, vol 13724. Springer, Cham. https://doi.org/10.1007/978-3-031-20891-1_43
Print ISBN: 978-3-031-20890-4
Online ISBN: 978-3-031-20891-1
eBook Packages: Computer Science, Computer Science (R0)