
Hou Hongxu


2021

Low-Resource Machine Translation based on Asynchronous Dynamic Programming
Jia Xiaoning | Hou Hongxu | Wu Nier | Li Haoran | Chang Xin
Proceedings of the 20th Chinese National Conference on Computational Linguistics

Reinforcement learning has proved effective for low-resource machine translation tasks, and different sampling methods in reinforcement learning affect the performance of the model. The reward for generating a translation is determined by the scalability and iteration of the sampling strategy, so it is difficult for the model to achieve a bias-variance trade-off. Therefore, given the model's limited ability to analyze sequence structure in low-resource tasks, this paper proposes a parameter optimization method for neural machine translation based on an asynchronous dynamic programming training strategy. Under the experience-priority view of the current policy, each selectively sampled experience not only improves the value of the experience state but also avoids the high computational resource consumption inherent in traditional valuation methods (such as full dynamic programming). We verify the Mongolian-Chinese and Uyghur-Chinese tasks on CCMT2019. The results show that our method improves the quality of the low-resource neural machine translation model compared with general reinforcement learning methods, which fully demonstrates its effectiveness.
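The abstract's core idea of prioritizing which experiences to update, rather than sweeping all states as classical dynamic programming does, can be illustrated with a generic prioritized-sweeping sketch. This is not the paper's model; it is a minimal, self-contained toy example of asynchronous value updates on an assumed deterministic MDP, where states are revisited in order of their Bellman error so computation concentrates on high-priority experiences.

```python
import heapq
from collections import defaultdict

def prioritized_value_iteration(transitions, rewards, gamma=0.9, theta=1e-4):
    """Asynchronous (prioritized) value updates on a toy deterministic MDP.

    transitions[s][a] -> next state; rewards[(s, a)] -> immediate reward.
    Instead of sweeping every state each iteration, states are revisited in
    order of their Bellman error, so computation focuses on the experiences
    whose values are most in need of an update.
    """
    V = defaultdict(float)
    predecessors = defaultdict(set)
    for s, acts in transitions.items():
        for a, s_next in acts.items():
            predecessors[s_next].add(s)

    def bellman_backup(s):
        return max(rewards[(s, a)] + gamma * V[s_next]
                   for a, s_next in transitions[s].items())

    # Seed the priority queue with every state's initial Bellman error.
    pq = []
    for s in transitions:
        err = abs(bellman_backup(s) - V[s])
        if err > theta:
            heapq.heappush(pq, (-err, s))

    while pq:
        _, s = heapq.heappop(pq)
        V[s] = bellman_backup(s)
        # Only predecessors of the updated state can change, so only they
        # are re-prioritized -- this is the asynchronous part: no full sweep.
        for p in predecessors[s]:
            err = abs(bellman_backup(p) - V[p])
            if err > theta:
                heapq.heappush(pq, (-err, p))
    return dict(V)

# Tiny 3-state chain: 0 -> 1 -> 2 (absorbing via self-loop).
transitions = {0: {"go": 1}, 1: {"go": 2}, 2: {"stay": 2}}
rewards = {(0, "go"): 0.0, (1, "go"): 1.0, (2, "stay"): 0.0}
print(prioritized_value_iteration(transitions, rewards))
```

In this sketch only states whose values can actually change are pushed back onto the queue, which mirrors the abstract's point that selective sampling of high-priority experiences avoids the cost of exhaustive valuation over all states.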

2004

An EBMT system based on word alignment
Hou Hongxu | Deng Dan | Zou Gang | Yu Hongkui | Liu Yang | Xiong Deyi | Liu Qun
Proceedings of the First International Workshop on Spoken Language Translation: Evaluation Campaign