Issue #31 – Context-aware Neural MT

28 March 2019


Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

In this week’s post, we take a look at ‘context-aware’ machine translation. This topic deals with how Neural MT engines can make use of external information to determine what translation to produce – “external information” meaning information other than the words in the sentence being translated. Other modalities, such as speech, images, and videos, or even other sentences in the source document, may contain relevant information that could help to resolve ambiguities and improve the overall quality of the translation.

As such, one application of context-aware Neural MT is document-level translation, a topic we covered in Issue #15. We saw approaches that take several previous sentences into account when translating the source sentence. This context was modelled in a cache or as an additional neural network: an encoder or an attention network. In the latter case, the attention allows the network to focus on different words and sentences depending on the requirement, which is why it is called a hierarchical attention network (HAN). In this case, the document context is used in both the encoder and the decoder.
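To make the two-level idea concrete, here is a minimal sketch of hierarchical attention over previous sentences: word-level attention summarises each context sentence, then sentence-level attention combines those summaries. This is an illustrative toy in numpy, not the actual HAN implementation; the function names and dimensions are our own assumptions.

```python
import numpy as np

def attention(query, keys, values):
    # Scaled dot-product attention: weight each value vector by
    # the softmax-normalised similarity of its key to the query.
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values

def hierarchical_context(query, context_sentences):
    # Word level: attend over the word vectors of each context sentence,
    # producing one summary vector per sentence.
    sentence_summaries = np.stack(
        [attention(query, sent, sent) for sent in context_sentences]
    )
    # Sentence level: attend over the per-sentence summaries,
    # producing a single document-context vector.
    return attention(query, sentence_summaries, sentence_summaries)

# Toy example: a 4-dimensional decoder query and two context sentences
# of different lengths (rows are word vectors).
rng = np.random.default_rng(0)
query = rng.standard_normal(4)
context = [rng.standard_normal((3, 4)),   # sentence with 3 words
           rng.standard_normal((5, 4))]   # sentence with 5 words
ctx_vector = hierarchical_context(query, context)
print(ctx_vector.shape)  # (4,)
```

In the real model the queries come from the encoder or decoder states, and the resulting context vector is combined with the current sentence representation, which is how the document context reaches both sides of the network.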

Some recent approaches

Recently,
