
Beyond Bilingual: Multi-sense Word Embeddings using Multilingual Context

Shyam Upadhyay, Kai-Wei Chang, Matt Taddy, Adam Kalai, James Zou


Abstract
Word embeddings, which represent a word as a point in a vector space, have become ubiquitous in NLP tasks. A recent line of work uses bilingual (two-language) corpora to learn a different vector for each sense of a word, exploiting crosslingual signals to aid sense identification. We present a multi-view Bayesian non-parametric algorithm which improves multi-sense word embeddings by (a) using multilingual (i.e., more than two languages) corpora to significantly improve sense embeddings beyond what one achieves with bilingual information, and (b) using a principled approach to learn a variable number of senses per word, in a data-driven manner. Ours is the first approach with the ability to leverage multilingual corpora efficiently for multi-sense representation learning. Experiments show that multilingual training significantly improves performance over monolingual and bilingual training, by allowing us to combine different parallel corpora to leverage multilingual context. Multilingual training yields comparable performance to a state-of-the-art monolingual model trained on five times more training data.
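The data-driven, variable number of senses described in the abstract can be illustrated with a minimal sketch: a word spawns a new sense vector whenever no existing sense explains its current (possibly multilingual) context well enough. This is a hypothetical simplification using a hard similarity threshold, not the paper's Bayesian non-parametric model; the function name `assign_sense` and the threshold value are assumptions for illustration.

```python
import numpy as np

def assign_sense(senses, context_vec, threshold=0.5):
    """Return the index of the best-matching sense for one word occurrence,
    creating a new sense when no existing one fits.

    senses: list of np.ndarray sense vectors for a single word (mutated in place)
    context_vec: averaged embedding of surrounding words (any language)
    threshold: minimum cosine similarity to reuse an existing sense
    """
    # Normalize the context so comparisons are cosine similarities.
    context_vec = context_vec / np.linalg.norm(context_vec)
    if senses:
        sims = [float(s @ context_vec / np.linalg.norm(s)) for s in senses]
        best = int(np.argmax(sims))
        if sims[best] >= threshold:
            return best  # reuse an existing sense
    # No sense explains this context: add one, so the sense count
    # grows with the data rather than being fixed in advance.
    senses.append(context_vec.copy())
    return len(senses) - 1
```

Because contexts from parallel corpora in multiple languages all feed the same `senses` list, additional languages sharpen the clusters that each sense vector represents.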
Anthology ID:
W17-2613
Volume:
Proceedings of the 2nd Workshop on Representation Learning for NLP
Month:
August
Year:
2017
Address:
Vancouver, Canada
Editors:
Phil Blunsom, Antoine Bordes, Kyunghyun Cho, Shay Cohen, Chris Dyer, Edward Grefenstette, Karl Moritz Hermann, Laura Rimell, Jason Weston, Scott Yih
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
101–110
URL:
https://aclanthology.org/W17-2613
DOI:
10.18653/v1/W17-2613
Cite (ACL):
Shyam Upadhyay, Kai-Wei Chang, Matt Taddy, Adam Kalai, and James Zou. 2017. Beyond Bilingual: Multi-sense Word Embeddings using Multilingual Context. In Proceedings of the 2nd Workshop on Representation Learning for NLP, pages 101–110, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Beyond Bilingual: Multi-sense Word Embeddings using Multilingual Context (Upadhyay et al., RepL4NLP 2017)
PDF:
https://aclanthology.org/W17-2613.pdf