Issue #83 – Selective Attention for Context-aware Neural Machine Translation
21 May 2020
Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic

Introduction

One of the next frontiers for Neural Machine Translation (NMT) is moving beyond the sentence-by-sentence translation that is currently the norm to context-aware, document-level translation. Including extra-sentential context means that discourse elements (such as expressions referring back to previously mentioned entities) can be integrated, resulting in better translation of references, for example. Currently the engine has no […]