Issue #49 – Representation Bottleneck in Neural MT
08 Aug 2019 · Author: Raj Patel, Machine Translation Scientist @ Iconic

In Neural MT, lexical features are fed to the network as lexical representations (a.k.a. word embeddings) at the first layer of the encoder and are refined as they propagate through the deep stack of hidden layers. In this post we'll try to understand how the lexical representation changes as it goes deeper in the network, and investigate whether this affects translation quality. Representation […]
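The flow described above can be sketched in a toy form: token IDs are looked up in an embedding table (the lexical representation), and each encoder layer produces a refined hidden state from the one below it. This is a minimal illustrative sketch, not the actual model from the post; the vocabulary size, dimensions, and the simple residual feed-forward block standing in for a real encoder layer are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a 10-token vocabulary, embedding size 8,
# and a 4-layer "encoder" of simple residual feed-forward blocks.
vocab_size, d_model, n_layers = 10, 8, 4

embeddings = rng.normal(size=(vocab_size, d_model))  # lexical representations
layer_weights = [rng.normal(scale=0.1, size=(d_model, d_model))
                 for _ in range(n_layers)]

def encode(token_ids):
    """Return the hidden state at every depth for a token sequence."""
    h = embeddings[token_ids]           # layer 0: the raw word embeddings
    states = [h]
    for W in layer_weights:             # refined as they propagate upward
        h = h + np.maximum(h @ W, 0.0)  # toy residual + ReLU block
        states.append(h)
    return states

states = encode([1, 4, 7])
# One hidden state per depth, each of shape (sequence_length, d_model)
print(len(states), states[-1].shape)
```

Comparing `states[0]` (the pure lexical representation) against the deeper entries is exactly the kind of layer-wise probing the post's question implies: how much of the original lexical information survives at each depth.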