
An Empirical Study on the Robustness of Massively Multilingual Neural Machine Translation

Supryadi Supryadi, Leiyu Pan, Deyi Xiong


Abstract
Massively multilingual neural machine translation (MMNMT) has been proven to enhance the translation quality of low-resource languages. In this paper, we empirically investigate the translation robustness of Indonesian-Chinese translation in the face of various types of naturally occurring noise. To assess this, we create a robustness evaluation benchmark dataset for Indonesian-Chinese translation. This dataset is automatically translated into Chinese using four NLLB-200 models of different sizes. We conduct both automatic and human evaluations. Our in-depth analysis reveals the correlations between translation error types and the types of noise present, how these correlations change across different model sizes, and the relationships between automatic evaluation indicators and human evaluation indicators. The dataset is publicly available at https://github.com/tjunlp-lab/ID-ZH-MTRobustEval.
Anthology ID:
2024.lrec-main.97
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
1086–1097
URL:
https://aclanthology.org/2024.lrec-main.97
Cite (ACL):
Supryadi Supryadi, Leiyu Pan, and Deyi Xiong. 2024. An Empirical Study on the Robustness of Massively Multilingual Neural Machine Translation. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 1086–1097, Torino, Italia. ELRA and ICCL.
Cite (Informal):
An Empirical Study on the Robustness of Massively Multilingual Neural Machine Translation (Supryadi et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.97.pdf