CUNI NMT System for WAT 2017 Translation Tasks

Publication at Faculty of Mathematics and Physics | 2017

Abstract

The paper presents this year's CUNI submissions to the WAT 2017 Translation Task, focusing on Japanese-English translation in the Scientific Papers, Patents, and Newswire subtasks. We compare two neural network architectures: the standard sequence-to-sequence model with attention (Seq2Seq) (Bahdanau et al., 2014) and an architecture using a convolutional sentence encoder (FBConv2Seq) described by Gehring et al. (2017), both implemented in Neural Monkey, an NMT framework whose development we currently participate in.
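To illustrate the architectural contrast at a high level, the following is a minimal sketch (not the paper's Neural Monkey code) of the two encoder types: a recurrent encoder as used in attention-based Seq2Seq, and a convolutional sentence encoder in the spirit of Gehring et al. (2017). The vocabulary size, embedding and hidden dimensions, and layer counts are illustrative assumptions only.

```python
import torch
import torch.nn as nn

VOCAB, EMB, HID = 8000, 256, 256  # assumed sizes, not taken from the paper


class RNNEncoder(nn.Module):
    """Bidirectional GRU encoder producing one state per source token."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True, bidirectional=True)

    def forward(self, src):                  # src: (batch, src_len)
        states, _ = self.rnn(self.emb(src))  # (batch, src_len, 2*HID)
        return states                        # attended over by the decoder


class ConvEncoder(nn.Module):
    """Stacked 1-D convolutions over embeddings; no recurrence, so all
    source positions are encoded in parallel."""
    def __init__(self, layers=4, kernel=3):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.convs = nn.ModuleList(
            nn.Conv1d(EMB, EMB, kernel, padding=kernel // 2)
            for _ in range(layers))

    def forward(self, src):                  # src: (batch, src_len)
        x = self.emb(src).transpose(1, 2)    # (batch, EMB, src_len)
        for conv in self.convs:
            x = torch.relu(conv(x)) + x      # residual connection
        return x.transpose(1, 2)             # (batch, src_len, EMB)


if __name__ == "__main__":
    src = torch.randint(0, VOCAB, (2, 7))    # dummy batch of token ids
    print(RNNEncoder()(src).shape)           # torch.Size([2, 7, 512])
    print(ConvEncoder()(src).shape)          # torch.Size([2, 7, 256])
```

In both cases the decoder attends over the per-token encoder states; the main practical difference is that the convolutional encoder processes all positions in parallel rather than sequentially.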

We also compare several types of preprocessing of the source Japanese sentences and evaluate their impact on the overall results. Furthermore, we report the results of our experiments with out-of-domain data obtained by combining the corpora provided for the individual subtasks.
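As a hedged illustration of what preprocessing of Japanese source text can mean in practice, the sketch below contrasts character-level segmentation with word-level segmentation by a morphological analyser. The abstract does not specify which tools the authors used; MeCab and its wakati output mode are an assumed example here.

```python
import MeCab

sentence = "私は機械翻訳を研究しています。"

# 1) Character-level segmentation: every character becomes a token.
char_tokens = list(sentence)

# 2) Word-level segmentation with a morphological analyser
#    (MeCab with -Owakati prints space-separated surface forms).
tagger = MeCab.Tagger("-Owakati")
word_tokens = tagger.parse(sentence).split()

# Subword segmentation (e.g., BPE over either of the above) would be a
# further option; it is omitted here to keep the sketch self-contained.
print(" ".join(char_tokens))
print(" ".join(word_tokens))
```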