
Continual Event Extraction with Semantic Confusion Rectification

Zitao Wang, Xinyi Wang, Wei Hu


Abstract
We study continual event extraction, which aims to extract continually emerging event information while avoiding forgetting. We observe that semantic confusion over event types stems from annotations of the same text being updated over time. Imbalance between event types further aggravates this issue. This paper proposes a novel continual event extraction model with semantic confusion rectification. We mark pseudo labels for each sentence to alleviate semantic confusion. We transfer pivotal knowledge between the current and previous models to enhance the understanding of event types. Moreover, we encourage the model to focus on the semantics of long-tailed event types by leveraging other associated types. Experimental results show that our model outperforms state-of-the-art baselines and performs well on imbalanced datasets.
Anthology ID:
2023.emnlp-main.732
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11945–11955
URL:
https://aclanthology.org/2023.emnlp-main.732
DOI:
10.18653/v1/2023.emnlp-main.732
Cite (ACL):
Zitao Wang, Xinyi Wang, and Wei Hu. 2023. Continual Event Extraction with Semantic Confusion Rectification. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11945–11955, Singapore. Association for Computational Linguistics.
Cite (Informal):
Continual Event Extraction with Semantic Confusion Rectification (Wang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.732.pdf
Video:
https://aclanthology.org/2023.emnlp-main.732.mp4