Issue #83 – Selective Attention for Context-aware Neural Machine Translation

21 May 2020


Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic

Introduction

One of the next frontiers for Neural Machine Translation (NMT) is moving beyond the sentence-by-sentence translation that is currently the norm to context-aware, document-level translation. Incorporating extra-sentential context means that discourse phenomena, such as expressions referring back to previously mentioned entities, can be modelled, resulting in better translation of references, for example. At present the engine has no awareness of anything mentioned in a previous sentence, which leads to many errors, for instance around the gender of referring expressions (mistranslations of ‘it’, ‘he’, ‘she’): translating ‘it’ into French, say, requires choosing between ‘il’ and ‘elle’ according to the gender of a noun that may only appear in an earlier sentence.

While there has been some work on context-aware or document-level MT, integrating various aspects of discourse into machine translation, it is still not the dominant paradigm. Work on context-aware translation has largely been restricted to a context of a few sentences (Miculicich et al., 2018). One approach is fully document-level, although the context is static, computed once for the sentence being translated (Maruf & Haffari, 2018); this research builds on that work. For a full survey on document-level
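To make the mechanism concrete, here is a minimal sketch, in PyTorch, of attention over static document context in the spirit of Maruf & Haffari (2018): the encoding of the current sentence attends over one precomputed vector per context sentence. This is an illustrative sketch, not the authors' implementation, and the names (context_attention, doc_vectors) are invented for the example. The selective attention of the title would sparsify the attention weights (e.g. with sparsemax) so that irrelevant sentences receive exactly zero weight, rather than the dense softmax used here.

import torch
import torch.nn.functional as F

def context_attention(query, context):
    # query:   (d,)   encoding of the sentence currently being translated
    # context: (n, d) one static vector per other sentence in the document,
    #                 computed once per document (as in Maruf & Haffari 2018)
    # returns: (d,)   weighted sum of the context-sentence vectors
    scores = context @ query            # (n,) relevance of each context sentence
    weights = F.softmax(scores, dim=0)  # dense attention weights; selective
                                        # attention would sparsify these instead
    return weights @ context            # (d,) document-context summary

# Toy usage: a 6-sentence document with 16-dimensional sentence vectors.
torch.manual_seed(0)
doc_vectors = torch.randn(6, 16)        # static context, computed once
current = torch.randn(16)               # sentence being translated
ctx = context_attention(current, doc_vectors)
print(ctx.shape)                        # torch.Size([16])

Under these assumptions, the resulting context vector can be fed into the decoder alongside the usual sentence-level representations; because the context vectors are precomputed and static, the document-level overhead at translation time is a single attention pass per sentence.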
