
GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling

Yijin Liu, Fandong Meng, Jinchao Zhang, Jinan Xu, Yufeng Chen, Jie Zhou


Abstract
Current state-of-the-art systems for sequence labeling are typically based on the family of Recurrent Neural Networks (RNNs). However, the shallow connections between consecutive hidden states of RNNs and insufficient modeling of global information restrict the potential performance of those models. In this paper, we address these issues by proposing a Global Context enhanced Deep Transition architecture for sequence labeling, named GCDT. We deepen the state transition path at each position in a sentence, and further assign every token a global representation learned from the entire sentence. Experiments on two standard sequence labeling tasks show that, given only training data and the ubiquitous word embeddings (GloVe), our GCDT achieves 91.96 F1 on the CoNLL03 NER task and 95.43 F1 on the CoNLL2000 Chunking task, which outperforms the best reported results under the same settings. Furthermore, by leveraging BERT as an additional resource, we establish new state-of-the-art results with 93.47 F1 on NER and 97.30 F1 on Chunking.
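The core idea of the global representation can be sketched as follows. This is a deliberate simplification, not the paper's actual architecture: GCDT learns the sentence summary with a dedicated deep-transition encoder, whereas here mean-pooling stands in for that learned summary, and the function name is illustrative.

```python
import numpy as np

def add_global_context(token_embeddings):
    """Concatenate a sentence-level summary vector to every token embedding.

    token_embeddings: array of shape (seq_len, dim)
    returns: array of shape (seq_len, 2 * dim)
    """
    # Mean-pool over the sentence as a simple stand-in for GCDT's learned
    # global encoder; each token then sees the whole sentence, not just
    # its local RNN context.
    global_ctx = token_embeddings.mean(axis=0)
    # Broadcast the summary to every position and concatenate feature-wise.
    tiled = np.broadcast_to(global_ctx, token_embeddings.shape)
    return np.concatenate([token_embeddings, tiled], axis=-1)

# Example: 5 tokens with 8-dim embeddings -> 16-dim context-augmented inputs
# that would feed the (deep-transition) tagging RNN.
x = np.random.randn(5, 8)
y = add_global_context(x)
print(y.shape)  # (5, 16)
```

Every position receives the same global vector, so the downstream tagger can condition each labeling decision on sentence-wide information.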
Anthology ID:
P19-1233
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2431–2441
URL:
https://aclanthology.org/P19-1233
DOI:
10.18653/v1/P19-1233
Cite (ACL):
Yijin Liu, Fandong Meng, Jinchao Zhang, Jinan Xu, Yufeng Chen, and Jie Zhou. 2019. GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2431–2441, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling (Liu et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1233.pdf
Code
 Adaxry/GCDT
Data
CoNLL 2003