Abstract
The task of entity linking aims to correctly link mentions in a text fragment to a reference knowledge base. Most existing methods apply a single neural network model to learn semantic representations at all granularities of contextual information, neglecting the distinct traits of different granularities. Moreover, these purely representation-based methods measure semantic matching through abstract vector representations, which frequently miss concrete matching signals. To better capture contextual information, this paper proposes a new neural network model called Hybrid Semantic Matching (HSM) for the entity linking task. The model captures two different aspects of semantic information via representation-based and interaction-based neural semantic matching models. Furthermore, to account for the global consistency of entities, a recurrent random walk is applied to propagate entity linking evidence among related decisions. Evaluation was conducted on three publicly available standard datasets. Results show that the proposed HSM model is more effective than a range of baseline models.
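To make the two matching signals and the evidence-propagation step concrete, the following is a minimal sketch in Python/NumPy. It assumes simple mean-pooled placeholder embeddings, a cosine word-by-word interaction matrix, a hypothetical mixing weight alpha, and a row-stochastic entity-relatedness matrix; it is an illustration under these assumptions, not the authors' exact HSM architecture or training procedure.

```python
import numpy as np

def rep_score(ctx_vecs, ent_vecs):
    """Representation-based matching: cosine similarity between the
    mean-pooled context embedding and the mean-pooled entity-description
    embedding (placeholder pooling, not the paper's exact encoder)."""
    c = ctx_vecs.mean(axis=0)
    e = ent_vecs.mean(axis=0)
    return float(c @ e / (np.linalg.norm(c) * np.linalg.norm(e) + 1e-8))

def int_score(ctx_vecs, ent_vecs):
    """Interaction-based matching: build a word-by-word cosine similarity
    matrix and summarize it (max over entity-description words, then mean
    over context words), capturing concrete token-level matches."""
    c = ctx_vecs / (np.linalg.norm(ctx_vecs, axis=1, keepdims=True) + 1e-8)
    e = ent_vecs / (np.linalg.norm(ent_vecs, axis=1, keepdims=True) + 1e-8)
    sim = c @ e.T  # shape: (num_context_words, num_entity_words)
    return float(sim.max(axis=1).mean())

def hybrid_score(ctx_vecs, ent_vecs, alpha=0.5):
    """Combine the two signals with a hypothetical weight alpha."""
    return alpha * rep_score(ctx_vecs, ent_vecs) + \
           (1 - alpha) * int_score(ctx_vecs, ent_vecs)

def propagate_evidence(local_scores, transition, lam=0.7, steps=10):
    """Random-walk-style propagation of linking evidence among candidate
    entities: s <- lam * local + (1 - lam) * T^T s, iterated a few steps.
    `transition` is a row-stochastic entity-to-entity relatedness matrix."""
    s = local_scores.copy()
    for _ in range(steps):
        s = lam * local_scores + (1 - lam) * transition.T @ s
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ctx = rng.normal(size=(12, 300))   # 12 context words, 300-d embeddings
    ent = rng.normal(size=(30, 300))   # 30 entity-description words
    print(hybrid_score(ctx, ent))
```

The two functions model the complementary views named in the abstract: `rep_score` compares abstract pooled vectors, while `int_score` preserves token-level interactions; `propagate_evidence` illustrates how local candidate scores can be smoothed toward globally consistent decisions.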
Acknowledgements
This work was supported by the National Natural Science Foundation of China (No. 61772146) and the Natural Science Foundation of Guangdong Province (No. 2021A1515011339).