Issue #24 – Exploring language models for Neural MT
07 Feb 2019 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Monolingual language models were a critical component of phrase-based Statistical Machine Translation systems. They are also used in unsupervised Neural MT systems ("unsupervised" meaning that no parallel data is available to supervise training; only monolingual data is used). However, they are not used in standard supervised Neural MT engines, and language model training has disappeared from common […]
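To make the idea concrete, here is a toy sketch of the kind of monolingual language model used in phrase-based SMT: a bigram model with add-one smoothing that assigns higher probability to fluent word orders. This example is purely illustrative and not taken from the article; the corpus, function names, and smoothing choice are all assumptions.

```python
import math
from collections import Counter

def train_bigram_lm(sentences):
    """Count unigrams and bigrams over a (tiny) monolingual corpus."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def log_prob(sentence, unigrams, bigrams, vocab_size):
    """Add-one smoothed log-probability of a sentence under the bigram LM."""
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    lp = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        lp += math.log((bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab_size))
    return lp

corpus = ["the cat sat", "the dog sat", "the cat ran"]
uni, bi = train_bigram_lm(corpus)
V = len(uni)
# A fluent word order scores higher than a scrambled one.
print(log_prob("the cat sat", uni, bi, V) > log_prob("sat the cat", uni, bi, V))
```

In phrase-based SMT, a score like this (over much larger corpora and higher-order n-grams) was one feature among several used to rank candidate translations, which is why monolingual data mattered so much in that paradigm.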