
CorefPrompt: Prompt-based Event Coreference Resolution by Measuring Event Type and Argument Compatibilities

Sheng Xu, Peifeng Li, Qiaoming Zhu


Abstract
Event coreference resolution (ECR) aims to group event mentions referring to the same real-world event into clusters. Most previous studies adopt the “encoding first, then scoring” framework, making the coreference judgment rely on event encoding. Furthermore, current methods struggle to leverage human-summarized ECR rules, e.g., coreferential events should have the same event type, to guide the model. To address these two issues, we propose a prompt-based approach, CorefPrompt, to transform ECR into a cloze-style MLM (masked language model) task. This allows for simultaneous event modeling and coreference discrimination within a single template, with a fully shared context. In addition, we introduce two auxiliary prompt tasks, event-type compatibility and argument compatibility, to explicitly demonstrate the reasoning process of ECR, which helps the model make final predictions. Experimental results show that our method CorefPrompt performs well in a state-of-the-art (SOTA) benchmark.
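
To make the cloze-style formulation concrete, below is a minimal sketch (not the authors' released code) of casting a pairwise event-coreference decision as a masked-LM prediction. The template wording, the "yes"/"no" verbalizer, and the bert-base-uncased backbone are illustrative assumptions; CorefPrompt's actual templates additionally encode event types and arguments and are trained end-to-end with the auxiliary compatibility tasks.

    # Minimal sketch, assuming HuggingFace Transformers and a BERT MLM backbone.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
    model.eval()

    def coref_score(mention_a: str, mention_b: str) -> float:
        """Score one event-mention pair: P("yes") - P("no") at the [MASK] slot."""
        prompt = (
            f"Event 1: {mention_a} Event 2: {mention_b} "
            f"Question: do the two events refer to the same real-world event? "
            f"Answer: {tokenizer.mask_token}."
        )
        inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits  # (1, seq_len, vocab_size)
        # Locate the [MASK] position and compare verbalizer-word probabilities there.
        mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
        probs = logits[0, mask_pos].softmax(dim=-1).squeeze(0)
        yes_id, no_id = tokenizer.convert_tokens_to_ids(["yes", "no"])
        return (probs[yes_id] - probs[no_id]).item()

    # Two mentions of the same bombing event should score higher than an unrelated pair.
    print(coref_score("A bomb exploded near the embassy on Friday.",
                      "The blast killed two guards outside the building."))
    print(coref_score("A bomb exploded near the embassy on Friday.",
                      "The company hired a new CEO last month."))

Because both event mentions sit in a single template, the encoder contextualizes them jointly, which is the "fully shared context" the abstract refers to, in contrast to encoding each mention separately and scoring afterwards.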
Anthology ID:
2023.emnlp-main.954
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15440–15452
URL:
https://aclanthology.org/2023.emnlp-main.954
DOI:
10.18653/v1/2023.emnlp-main.954
Cite (ACL):
Sheng Xu, Peifeng Li, and Qiaoming Zhu. 2023. CorefPrompt: Prompt-based Event Coreference Resolution by Measuring Event Type and Argument Compatibilities. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 15440–15452, Singapore. Association for Computational Linguistics.
Cite (Informal):
CorefPrompt: Prompt-based Event Coreference Resolution by Measuring Event Type and Argument Compatibilities (Xu et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.954.pdf
Video:
https://aclanthology.org/2023.emnlp-main.954.mp4