
A Hybrid Semantic Matching Model for Neural Collective Entity Linking

  • Conference paper
  • Web and Big Data (APWeb-WAIM 2021)

Abstract

The task of entity linking aims to correctly link mentions in a text fragment to their corresponding entities in a reference knowledge base. Most existing methods apply a single neural network model to learn semantic representations at all granularities of contextual information, neglecting the distinct characteristics of each granularity. Moreover, these purely representation-based methods measure semantic matching through abstract vector representations, which frequently miss concrete matching signals. To better capture contextual information, this paper proposes a new neural network model called Hybrid Semantic Matching (HSM) for the entity linking task. The model captures two complementary aspects of semantic information via representation-based and interaction-based neural semantic matching models. Furthermore, to account for the global consistency of entities, a recurrent random walk is applied to propagate entity linking evidence among related decisions. Evaluation was conducted on three publicly available standard datasets. Results show that the proposed HSM model is more effective than a list of baseline models.
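To make the abstract's two components concrete, the following is a minimal sketch, assuming a PyTorch setup: a representation-based matcher that compresses mention context and candidate entity description into single vectors before comparing them, an interaction-based matcher that keeps the word-by-word similarity matrix so concrete matching signals survive, and a plain random-walk iteration that propagates local linking evidence over an entity-entity relatedness matrix. All class names, layer sizes, pooling choices, and the combination weight alpha are assumptions for illustration, not the paper's actual architecture.

```python
# Illustrative sketch only (assumed architecture, not the paper's exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F


class RepresentationMatcher(nn.Module):
    """Representation-based matching: encode each side into one vector, then compare."""

    def __init__(self, emb_dim=300, hidden=128):
        super().__init__()
        self.context_enc = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.entity_enc = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)

    def forward(self, context_emb, entity_emb):
        # context_emb: (batch, ctx_len, emb_dim); entity_emb: (batch, ent_len, emb_dim)
        ctx, _ = self.context_enc(context_emb)
        ent, _ = self.entity_enc(entity_emb)
        ctx_vec = ctx.mean(dim=1)          # pool token states into one abstract vector
        ent_vec = ent.mean(dim=1)
        return F.cosine_similarity(ctx_vec, ent_vec, dim=-1)       # (batch,)


class InteractionMatcher(nn.Module):
    """Interaction-based matching: keep token-level similarities, then aggregate them."""

    def __init__(self, channels=8):
        super().__init__()
        self.conv = nn.Conv2d(1, channels, kernel_size=3, padding=1)
        self.score = nn.Linear(channels, 1)

    def forward(self, context_emb, entity_emb):
        # Word-by-word similarity matrix preserves concrete matching signals.
        sim = torch.einsum("bce,bde->bcd",
                           F.normalize(context_emb, dim=-1),
                           F.normalize(entity_emb, dim=-1))         # (batch, ctx_len, ent_len)
        feat = F.relu(self.conv(sim.unsqueeze(1)))                  # (batch, ch, ctx, ent)
        pooled = feat.amax(dim=(2, 3))                              # strongest local matches
        return self.score(pooled).squeeze(-1)                       # (batch,)


def hybrid_score(rep_score, int_score, alpha=0.5):
    # Assumed linear combination of the two matching views.
    return alpha * rep_score + (1.0 - alpha) * int_score


def random_walk_propagation(local_scores, relatedness, steps=3, lam=0.8):
    """Propagate local linking evidence among related decisions (random-walk sketch).

    local_scores: (num_candidates,) local hybrid scores for all candidates in a document.
    relatedness:  (num_candidates, num_candidates) non-negative entity-entity relatedness.
    """
    trans = relatedness / relatedness.sum(dim=1, keepdim=True).clamp(min=1e-8)
    scores = local_scores.clone()
    for _ in range(steps):
        # Interpolate between propagated evidence and the original local evidence.
        scores = lam * (trans.t() @ scores) + (1.0 - lam) * local_scores
    return scores
```

In the paper the random walk is part of a learned recurrent network; the fixed number of propagation steps above only illustrates how linking decisions for related mentions can reinforce one another.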

Notes

  1. https://nlp.stanford.edu/projects/glove/.

  2. https://dumps.wikimedia.org/.

  3. https://github.com/wikipedia2vec/wikipedia2vec.
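The three footnoted resources are typically consumed as pretrained embeddings. Below is a minimal loading sketch; the file names are placeholders, the GloVe reader simply parses the plain-text vector format, and the commented wikipedia2vec calls follow that library's documented `Wikipedia2Vec.load` / `get_entity_vector` interface. Nothing here reflects the authors' actual preprocessing pipeline.

```python
# Illustrative loading of the footnoted resources; file paths are placeholders.
import numpy as np


def load_glove(path="glove.6B.300d.txt"):
    """Read GloVe word vectors (footnote 1) from the plain-text distribution format."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors


# Entity embeddings (footnote 3), trained on a Wikipedia dump (footnote 2):
# from wikipedia2vec import Wikipedia2Vec
# wiki2vec = Wikipedia2Vec.load("enwiki_300d.pkl")              # placeholder model file
# entity_vec = wiki2vec.get_entity_vector("Scarlett Johansson") # per-entity embedding
```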

Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61772146) and the Natural Science Foundation of Guangdong Province (No. 2021A1515011339).

Author information

Corresponding author

Correspondence to Tianyong Hao.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Lei, B., Li, W., Wong, L.P., Lee, L.K., Wang, F.L., Hao, T. (2021). A Hybrid Semantic Matching Model for Neural Collective Entity Linking. In: U, L.H., Spaniol, M., Sakurai, Y., Chen, J. (eds) Web and Big Data. APWeb-WAIM 2021. Lecture Notes in Computer Science, vol 12859. Springer, Cham. https://doi.org/10.1007/978-3-030-85899-5_9

  • DOI: https://doi.org/10.1007/978-3-030-85899-5_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85898-8

  • Online ISBN: 978-3-030-85899-5

  • eBook Packages: Computer Science, Computer Science (R0)
