
Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking

Samuel Broscheit


Abstract
A typical architecture for end-to-end entity linking systems consists of three steps: mention detection, candidate generation and entity disambiguation. In this study we investigate the following questions: (a) Can all those steps be learned jointly with a model for contextualized text representations, i.e., BERT? (b) How much entity knowledge is already contained in pretrained BERT? (c) Does additional entity knowledge improve BERT's performance in downstream tasks? To this end we propose an extreme simplification of the entity linking setup that works surprisingly well: simply cast it as per-token classification over the entire entity vocabulary (over 700K classes in our case). We show on an entity linking benchmark that (i) this model improves the entity representations over plain BERT, (ii) that it outperforms entity linking architectures that optimize the tasks separately, and (iii) that it only comes second to the current state of the art, which does mention detection and entity disambiguation jointly. Additionally, we investigate the usefulness of entity-aware token representations in the text-understanding benchmark GLUE, the question answering benchmarks SQuAD v2 and SWAG, and the EN-DE WMT14 machine translation benchmark. To our surprise, we find that most of those benchmarks do not benefit from additional entity knowledge, except for a task with very small training data, the RTE task in GLUE, which improves by 2%.
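The abstract's central idea is to collapse mention detection, candidate generation and disambiguation into a single per-token classification over the full entity vocabulary. The following is a minimal sketch of what such a classifier could look like, assuming the Hugging Face transformers library; the class name, the entity vocabulary size constant, and the dummy labels are illustrative placeholders, not the paper's released implementation (see samuelbroscheit/entity_knowledge_in_bert for that).

# Sketch (not the paper's exact code): per-token entity classification
# over a large entity vocabulary on top of BERT. Every token receives an
# entity label or a "no entity" label, so the whole pipeline reduces to
# one cross-entropy loss.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

# ~700K entity classes plus a "no entity" class (illustrative size;
# the resulting linear layer is large, roughly 2 GB in fp32).
ENTITY_VOCAB_SIZE = 700_001

class BertEntityLinker(nn.Module):
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        # One linear projection from each token representation to the
        # entity vocabulary: mention detection, candidate generation and
        # disambiguation all collapse into this single classification.
        self.classifier = nn.Linear(self.bert.config.hidden_size,
                                    ENTITY_VOCAB_SIZE)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Shape: (batch, seq_len, ENTITY_VOCAB_SIZE)
        return self.classifier(hidden)

# Training-step sketch with dummy labels (real labels would assign each
# token its gold entity id, or the "no entity" id, here assumed to be 0).
model = BertEntityLinker()
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
batch = tokenizer(["Samuel Broscheit works on entity linking."],
                  return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
labels = torch.zeros(batch["input_ids"].shape, dtype=torch.long)
loss = nn.functional.cross_entropy(logits.view(-1, ENTITY_VOCAB_SIZE),
                                   labels.view(-1))

At inference, an argmax over the last dimension yields one entity (or "no entity") prediction per token, which is why the paper can skip an explicit candidate-generation stage entirely.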
Anthology ID:
K19-1063
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
677–685
URL:
https://aclanthology.org/K19-1063
DOI:
10.18653/v1/K19-1063
Cite (ACL):
Samuel Broscheit. 2019. Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 677–685, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking (Broscheit, CoNLL 2019)
PDF:
https://aclanthology.org/K19-1063.pdf
Code
samuelbroscheit/entity_knowledge_in_bert
Data
AIDA CoNLL-YAGO, CoNLL, GLUE, SWAG