Issue #13 – Evaluation of Neural MT Architectures
11 Oct 2018 | Author: Raj Nath Patel, Machine Translation Scientist @ Iconic

What are the different approaches to Neural MT? Since its relatively recent advent, the underlying technology has been based on one of three main architectures:

- Recurrent Neural Networks (RNN)
- Convolutional Neural Networks (CNN)
- Self-Attention Networks (Transformer)

For various language pairs, the non-recurrent architectures (CNN and Transformer) have outperformed RNNs, but there has been no solid explanation as to why. In this post, we'll evaluate […]
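To make the structural contrast between the three families concrete, here is a minimal PyTorch sketch (not from the original post; the dimensions and layer choices are illustrative assumptions) showing one encoder layer of each kind applied to the same dummy input:

import torch
import torch.nn as nn

# Illustrative dimensions, chosen only for the example.
d_model, seq_len, batch = 512, 20, 4
x = torch.randn(seq_len, batch, d_model)  # dummy source-token embeddings

# 1. Recurrent (RNN): tokens are processed sequentially, one hidden state at a time.
rnn = nn.LSTM(input_size=d_model, hidden_size=d_model)
rnn_out, _ = rnn(x)  # (seq_len, batch, d_model)

# 2. Convolutional (CNN): fixed-width filters see a local window of tokens, in parallel.
cnn = nn.Conv1d(in_channels=d_model, out_channels=d_model, kernel_size=3, padding=1)
cnn_out = cnn(x.permute(1, 2, 0)).permute(2, 0, 1)  # reshape to/from Conv1d's (batch, channels, length)

# 3. Self-attention (Transformer): every token attends to every other token directly.
attn = nn.TransformerEncoderLayer(d_model=d_model, nhead=8)
attn_out = attn(x)  # (seq_len, batch, d_model)

print(rnn_out.shape, cnn_out.shape, attn_out.shape)

All three layers map a sequence of embeddings to a sequence of the same shape; the difference lies in how information flows between positions (sequentially, through a local window, or globally via attention).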