
A Trio Neural Model for Dynamic Entity Relatedness Ranking

Tu Nguyen, Tuan Tran, Wolfgang Nejdl


Abstract
Measuring entity relatedness is a fundamental task for many natural language processing and information retrieval applications. Prior work often studies entity relatedness in a static setting and in an unsupervised manner. However, real-world entities are often involved in many different relationships, and consequently entity relations change dynamically over time. In this work, we propose a neural network-based approach that leverages public attention as supervision. Our model is capable of learning rich and different entity representations in a joint framework. Through extensive experiments on large-scale datasets, we demonstrate that our method achieves better results than competitive baselines.
Anthology ID:
K18-1004
Volume:
Proceedings of the 22nd Conference on Computational Natural Language Learning
Month:
October
Year:
2018
Address:
Brussels, Belgium
Editors:
Anna Korhonen, Ivan Titov
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
31–41
URL:
https://aclanthology.org/K18-1004
DOI:
10.18653/v1/K18-1004
Cite (ACL):
Tu Nguyen, Tuan Tran, and Wolfgang Nejdl. 2018. A Trio Neural Model for Dynamic Entity Relatedness Ranking. In Proceedings of the 22nd Conference on Computational Natural Language Learning, pages 31–41, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Trio Neural Model for Dynamic Entity Relatedness Ranking (Nguyen et al., CoNLL 2018)
PDF:
https://aclanthology.org/K18-1004.pdf