In this work, we propose a novel framework to align contextual embeddings at the sense level by leveraging cross-lingual signal from bilingual dictionaries only.
Cross-lingual word embeddings (CLWE) provide a shared representation space for knowledge transfer between languages, yielding state-of-the-art performance on downstream tasks.
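A common baseline for building such a shared space (this is a generic CLWE technique, not the sense-level method of this paper) is to learn an orthogonal map between two monolingual embedding spaces from a seed bilingual dictionary via the Procrustes solution. A minimal sketch with synthetic vectors, where `X` and `Y` stand in for hypothetical source- and target-language embeddings of translation pairs:

```python
import numpy as np

# Toy sketch (not the paper's method): align a "source" space to a "target"
# space with orthogonal Procrustes, a standard CLWE baseline.
rng = np.random.default_rng(0)
W_true = np.linalg.qr(rng.normal(size=(4, 4)))[0]  # hidden ground-truth rotation
X = rng.normal(size=(50, 4))                       # source-language vectors
Y = X @ W_true                                     # target-language vectors

# Procrustes solution: W = U V^T, where U S V^T = svd(X^T Y)
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

print(np.allclose(X @ W, Y))  # → True: the mapping recovers the rotation
```

Because the learned `W` is orthogonal, it preserves distances within the source space while rotating it onto the target space.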
This is the source code of our method proposed in the paper "Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings", accepted by COLING 2022.
Contextual representations of the same token cluster together: the average distance between different tokens is larger than the average distance among occurrences of the same token.