Learning to Parse and Translate Improves Neural Machine Translation

ACL 2017

Abstract: Relatively little attention has been paid to incorporating linguistic priors into neural machine translation, and much of the previous work was further constrained to considering linguistic priors on the source side only. In this paper, we propose a hybrid model, called NMT+RNNG, that learns to parse and translate by combining a recurrent neural network grammar with attention-based neural machine translation. Our approach encourages the neural machine translation model to incorporate linguistic priors during training, and lets it translate on its own afterward. Extensive experiments with four language pairs show the effectiveness of the proposed NMT+RNNG.

Authors: Akiko Eriguchi, Yoshimasa Tsuruoka (The University of Tokyo), Kyunghyun Cho (New York University)
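The core idea of the abstract is a multi-task objective: the decoder is trained to predict both the next target word and the next RNNG transition action over the target sentence, while at test time only the translation component is used. Below is a minimal sketch of such a joint loss in PyTorch; it is not the authors' code, and all names (JointParseTranslateLoss, parse_weight, etc.) are hypothetical illustrations of the technique.

    import torch
    import torch.nn as nn

    class JointParseTranslateLoss(nn.Module):
        """Weighted sum of a translation loss and a parsing-action loss,
        sketching the joint training signal described in the abstract."""

        def __init__(self, parse_weight: float = 1.0):
            super().__init__()
            self.parse_weight = parse_weight  # trade-off between objectives
            self.ce = nn.CrossEntropyLoss()

        def forward(self, word_logits, word_targets, action_logits, action_targets):
            # Translation objective: predict the next target-side word.
            translation_loss = self.ce(word_logits, word_targets)
            # Parsing objective: predict the next RNNG transition action
            # (e.g., SHIFT / REDUCE / NT(X)) over the target sentence.
            parsing_loss = self.ce(action_logits, action_targets)
            return translation_loss + self.parse_weight * parsing_loss

    # Usage with dummy data (shapes: [batch, vocab] and [batch, n_actions]):
    criterion = JointParseTranslateLoss(parse_weight=1.0)
    word_logits = torch.randn(8, 100, requires_grad=True)
    action_logits = torch.randn(8, 5, requires_grad=True)
    loss = criterion(word_logits, torch.randint(0, 100, (8,)),
                     action_logits, torch.randint(0, 5, (8,)))
    loss.backward()

At inference, only the word-prediction path would be decoded, matching the abstract's claim that the model "translates on its own afterward".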
