
Towards Incremental Learning of Word Embeddings Using Context Informativeness

Alexandre Kabbach, Kristina Gulordava, Aurélie Herbelot


Abstract
In this paper, we investigate the task of learning word embeddings from very sparse data in an incremental, cognitively plausible way. We focus on the notion of ‘informativeness’, that is, the idea that some content is more valuable to the learning process than other content. We further highlight the challenges of online learning and argue that previous systems fall short of implementing incrementality. Concretely, we incorporate informativeness into a previously proposed model of nonce learning, using it both for context selection and for learning rate modulation. We test our system on the task of learning new words from definitions, as well as from potentially uninformative contexts. We demonstrate that informativeness is crucial to obtaining state-of-the-art performance in a truly incremental setup.
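The abstract only sketches the mechanism, so below is a minimal Python sketch of what informativeness-driven context selection and learning-rate modulation could look like in an online, one-shot setting. This is not the paper's actual Nonce2Vec implementation: the self-information score, the threshold, the learning-rate cap, and all function names are illustrative assumptions.

import math
from collections import Counter

def informativeness(word: str, freq: Counter, total: int) -> float:
    # Self-information -log p(word): rare context words are treated as
    # more informative than frequent ones (an assumption of this sketch,
    # not necessarily the paper's exact measure).
    p = freq.get(word, 1) / total  # back off to count 1 for unseen words
    return -math.log(p)

def incremental_update(nonce_vec, sentence, word_vecs, freq, total,
                       base_lr=0.5, threshold=5.0, cap=2.0):
    # One online update of a nonce embedding from a single sentence.
    # Context selection: words scoring below `threshold` are skipped.
    # Learning-rate modulation: the update step is scaled by the score.
    for w in sentence:
        if w not in word_vecs:
            continue
        score = informativeness(w, freq, total)
        if score < threshold:
            continue  # uninformative context word: ignore it
        lr = base_lr * min(score / threshold, cap)
        # Move the nonce vector a step toward the informative word's vector.
        nonce_vec = [n + lr * (c - n) for n, c in zip(nonce_vec, word_vecs[w])]
    return nonce_vec

In this toy version, a nonce occurring in an informative definition moves quickly toward the vectors of its neighbours, while an uninformative sentence leaves it nearly untouched, which mirrors the behaviour the abstract describes.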
Anthology ID:
P19-2022
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Fernando Alva-Manchego, Eunsol Choi, Daniel Khashabi
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
162–168
URL:
https://aclanthology.org/P19-2022
DOI:
10.18653/v1/P19-2022
Bibkey:
kabbach-etal-2019-towards
Cite (ACL):
Alexandre Kabbach, Kristina Gulordava, and Aurélie Herbelot. 2019. Towards Incremental Learning of Word Embeddings Using Context Informativeness. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 162–168, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Towards Incremental Learning of Word Embeddings Using Context Informativeness (Kabbach et al., ACL 2019)
PDF:
https://aclanthology.org/P19-2022.pdf
Code:
minimalparts/nonce2vec