Abstract
Machine Translation (MT) has been one of the classic AI tasks since the early days of the field. Portuguese and Chinese are languages with very large numbers of native speakers, but this does not carry through to the amount of literature on their processing, or to the amount of resources available for them, in particular when compared with English. In this paper, we address the feasibility of creating an MT system for Portuguese-Chinese using only freely available resources, by experimenting with various approaches to pairing source and target parallel data during training. These approaches are (i) using a model for each source-target language pair, (ii) using an intermediate pivot language, and (iii) using a single model that can translate from any language seen on the source side to any language seen on the target side. We find approaches whose performance is higher than that of a strong baseline, an MT service provided by an IT industry giant, for the pair Portuguese-Chinese.
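The three data-pairing strategies from the abstract can be illustrated with a minimal sketch. This is not the paper's code: the toy callables standing in for trained models are assumptions for illustration, and the prepended target-language token in the many-to-many case follows the scheme of Johnson et al. (2017).

```python
# Toy illustration of the three source-target pairing strategies.
# Each "model" is just a callable sentence -> sentence; real systems
# would be trained neural MT models.

def direct(model_pt_zh, sentence):
    """(i) A dedicated model trained on the Portuguese-Chinese pair."""
    return model_pt_zh(sentence)

def pivot(model_pt_en, model_en_zh, sentence):
    """(ii) Translate into a pivot language (here English), then out of it."""
    return model_en_zh(model_pt_en(sentence))

def many_to_many(model, sentence, target="zh"):
    """(iii) One multilingual model; the target language is requested by
    prepending a token to the source, as in Johnson et al. (2017)."""
    return model(f"<2{target}> {sentence}")
```

The pivot route composes two independently trained models, which is why its errors can compound, while the many-to-many route shares one set of parameters across all language pairs.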
Notes
- 1. If creating an MT system for many languages, this approach only requires two models per language, a much lower number than when using a model for each language pair.
- 4. Despite this 50% reduction in the size of the corpus, training the many-to-many model took around 808 GPU hours (more than 33 days) to converge.
- 5. In the most recent WMT 2018 [2], 33 of the 38 systems used deep neural models, and 29 of these 33 were based on the Transformer model.
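The model count behind Note 1 can be made concrete with a short calculation. This is a sketch under the assumption of one model per translation direction; the function name is illustrative, not from the paper.

```python
def models_needed(n_languages):
    """Number of translation models for n mutually translatable languages.

    direct: one dedicated model per ordered (source, target) pair.
    pivot:  one model into and one model out of the pivot language
            for each of the other languages, i.e. two per language.
    """
    direct = n_languages * (n_languages - 1)
    pivot = 2 * (n_languages - 1)
    return direct, pivot
```

For ten languages this gives 90 direct pairwise models versus 18 with a pivot, which is the quadratic-versus-linear saving the note alludes to.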
References
Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. In: Proceedings of the International Conference on Learning Representations (ICLR) (2015). arXiv preprint arXiv:1409.0473
Bojar, O., et al.: Findings of the 2018 conference on machine translation (WMT18). In: Proceedings of the Third Conference on Machine Translation, Volume 2: Shared Task Papers, pp. 272–307 (2018)
Chao, L.S., Wong, D.F., Ao, C.H., Leal, A.L.: UM-PCorpus: a large Portuguese-Chinese parallel corpus. In: Proceedings of the LREC 2018 Workshop “Belt & Road: Language Resources and Evaluation”, pp. 38–43 (2018)
Johnson, M., et al.: Google’s multilingual neural machine translation system: enabling zero-shot translation. Trans. Assoc. Comput. Linguist. 5, 339–351 (2017)
Junczys-Dowmunt, M., et al.: Marian: fast neural machine translation in C++. In: Proceedings of ACL 2018, System Demonstrations, pp. 116–121 (2018)
Liu, S., Wang, L., Liu, C.H.: Chinese-Portuguese machine translation: a study on building parallel corpora from comparable texts. In: Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), pp. 1485–1492 (2018)
Luong, M.T., Pham, H., Manning, C.D.: Effective approaches to attention-based neural machine translation. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp. 1412–1421 (2015)
Papineni, K., Roukos, S., Ward, T., Zhu, W.J.: BLEU: a method for automatic evaluation of machine translation. In: Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, pp. 311–318 (2002)
Sennrich, R., Haddow, B., Birch, A.: Neural machine translation of rare words with subword units. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 1715–1725 (2016)
Sutskever, I., Vinyals, O., Le, Q.V.: Sequence to sequence learning with neural networks. In: Neural Information Processing Systems, pp. 3104–3112 (2014)
Tian, L., Wong, D.F., Chao, L.S., Quaresma, P., Oliveira, F., Yi, L.: UM-Corpus: a large English-Chinese parallel corpus for statistical machine translation. In: Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC 2014), pp. 1837–1842 (2014)
Tiedemann, J.: Parallel data, tools and interfaces in OPUS. In: Proceedings of the Eight International Conference on Language Resources and Evaluation (LREC 2012), pp. 2214–2218 (2012)
Vaswani, A., et al.: Attention is all you need. In: Neural Information Processing Systems, pp. 5998–6008 (2017)
Acknowledgements
The research results presented here were supported by FCT (Foundation for Science and Technology of Portugal) and MOST (Ministry of Science and Technology of China), through the project Chinese-Portuguese Deep Machine Translation in eCommerce Domain (441.00 CHINA-BILATERAL), the PORTULAN CLARIN Infrastructure for the Science and Technology of Language, the National Infrastructure for Distributed Computing (INCD) of Portugal, and the ANI/3279/2016 grant. Deyi Xiong was supported by the National Natural Science Foundation of China (Grants No. 61622209 and 61861130364).
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Santos, R., Silva, J., Branco, A., Xiong, D. (2019). The Direct Path May Not Be The Best: Portuguese-Chinese Neural Machine Translation. In: Moura Oliveira, P., Novais, P., Reis, L. (eds) Progress in Artificial Intelligence. EPIA 2019. Lecture Notes in Computer Science(), vol 11805. Springer, Cham. https://doi.org/10.1007/978-3-030-30244-3_62
DOI: https://doi.org/10.1007/978-3-030-30244-3_62
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-30243-6
Online ISBN: 978-3-030-30244-3
eBook Packages: Computer Science (R0)