
Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis

Dongfang Xu, Egoitz Laparra, Steven Bethard


Abstract
Recent studies have shown that pre-trained contextual word embeddings, which assign the same word different vectors in different contexts, improve performance in many tasks. But while contextual embeddings can also be trained at the character level, the effectiveness of such embeddings has not been studied. We derive character-level contextual embeddings from Flair (Akbik et al., 2018), and apply them to a time normalization task, yielding major performance improvements over the previous state-of-the-art: 51% error reduction in news and 33% in clinical notes. We analyze the sources of these improvements, and find that pre-trained contextual character embeddings are more robust to term variations, infrequent terms, and cross-domain changes. We also quantify the size of context that pre-trained contextual character embeddings take advantage of, and show that such embeddings capture features like part-of-speech and capitalization.
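The character-level contextual embeddings studied here come from Flair's pretrained character language models. As a rough illustration only (not the authors' exact pipeline), the sketch below uses the open-source flair package to embed a sentence containing a time expression with the pretrained news-forward character LM; the vectors Flair's public API returns are taken from the character LM's hidden states at token boundaries, and a per-character variant would read those hidden states at every character position.

# Minimal sketch, assuming the open-source `flair` package (pip install flair);
# this is an illustration of Flair character-LM embeddings, not the paper's code.
from flair.data import Sentence
from flair.embeddings import FlairEmbeddings

# 'news-forward' is a forward character-level language model trained on news text.
char_lm_embeddings = FlairEmbeddings("news-forward")

# A sentence containing a time expression of the kind time normalization targets.
sentence = Sentence("The inspection took place on January 3rd, 2019.")

# Flair runs the character LM over the raw character stream and extracts hidden
# states at token boundaries, so each token vector is conditioned on its full
# character-level context (sub-word spelling, capitalization, surroundings).
char_lm_embeddings.embed(sentence)

for token in sentence:
    print(token.text, token.embedding.shape)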
Anthology ID: S19-1008
Volume: Proceedings of the Eighth Joint Conference on Lexical and Computational Semantics (*SEM 2019)
Month: June
Year: 2019
Address: Minneapolis, Minnesota
Editors: Rada Mihalcea, Ekaterina Shutova, Lun-Wei Ku, Kilian Evang, Soujanya Poria
Venue: *SEM
SIGs: SIGLEX | SIGSEM
Publisher: Association for Computational Linguistics
Pages: 68–74
URL: https://aclanthology.org/S19-1008
DOI: 10.18653/v1/S19-1008
Cite (ACL): Dongfang Xu, Egoitz Laparra, and Steven Bethard. 2019. Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis. In Proceedings of the Eighth Joint Conference on Lexical and Computational Semantics (*SEM 2019), pages 68–74, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal): Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis (Xu et al., *SEM 2019)
PDF: https://aclanthology.org/S19-1008.pdf