
Entity-aware Multi-task Training Helps Rare Word Machine Translation

Matiss Rikters, Makoto Miwa


Abstract
Named entities (NE) are integral for preserving context and conveying accurate information in the machine translation (MT) task. Challenges often lie in handling NE diversity, ambiguity, and rarity, and in ensuring alignment and consistency. In this paper, we explore the effect of NE-aware model fine-tuning on the handling of NEs in MT. We generate data for NE recognition (NER) and NE-aware MT using common NER tools from spaCy, and align entities in parallel data. Experiments with fine-tuning variations of pre-trained T5 models on NE-related generation tasks between English and German show promising results, with increased numbers of NEs in the output and BLEU score improvements over the non-tuned baselines.
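The abstract mentions aligning entities in parallel data after extracting them with an NER tool such as spaCy. As a minimal illustrative sketch (not the authors' actual method), the snippet below pairs source and target entities by exact surface-form match, exploiting the fact that names are often copied verbatim into the German translation; the entity spans and the toy sentence pair are hypothetical stand-ins for real NER output.

```python
def align_entities(src_entities, tgt_entities):
    """Pair source and target entities that share a surface form.

    Each entity is a (text, label) tuple, as produced by an NER tool.
    Person and place names are frequently identical across English and
    German, so exact string matching recovers many alignments.
    Returns a list of ((src_text, src_label), (tgt_text, tgt_label)) pairs.
    """
    tgt_by_text = {text: (text, label) for text, label in tgt_entities}
    pairs = []
    for text, label in src_entities:
        if text in tgt_by_text:
            pairs.append(((text, label), tgt_by_text[text]))
    return pairs


# Toy parallel sentence pair with pre-extracted entity spans.
src = [("Angela Merkel", "PER"), ("Berlin", "LOC")]
tgt = [("Angela Merkel", "PER"), ("Berlin", "LOC")]

aligned = align_entities(src, tgt)
print(aligned)
```

A real pipeline would fall back to fuzzier matching (lemmatization, transliteration, or word alignment) for entities whose surface forms differ between languages.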
Anthology ID: 2024.inlg-main.5
Volume: Proceedings of the 17th International Natural Language Generation Conference
Month: September
Year: 2024
Address: Tokyo, Japan
Editors: Saad Mahamood, Nguyen Le Minh, Daphne Ippolito
Venue: INLG
SIG: SIGGEN
Publisher: Association for Computational Linguistics
Pages: 47–54
URL: https://aclanthology.org/2024.inlg-main.5
Cite (ACL): Matiss Rikters and Makoto Miwa. 2024. Entity-aware Multi-task Training Helps Rare Word Machine Translation. In Proceedings of the 17th International Natural Language Generation Conference, pages 47–54, Tokyo, Japan. Association for Computational Linguistics.
Cite (Informal): Entity-aware Multi-task Training Helps Rare Word Machine Translation (Rikters & Miwa, INLG 2024)
PDF: https://aclanthology.org/2024.inlg-main.5.pdf