Issue #41 – Deep Transformer Models for Neural MT
13 Jun 2019 · Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

The Transformer is a state-of-the-art Neural MT model, as we covered previously in Issue #32. So what happens when something works well with neural networks? We try to go wider and deeper! Two research directions look promising for enhancing the Transformer model: building wider networks by increasing the size of the word representation and attention vectors, or building […]
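To make the "wider vs. deeper" trade-off concrete, here is a minimal back-of-the-envelope sketch (not from the article) that estimates how the weight count of a standard Transformer encoder scales with each choice. The formula assumes the usual layout: four d_model x d_model attention projections per layer and a feed-forward block with hidden size 4 x d_model; biases, embeddings, and layer norms are ignored.

```python
def transformer_encoder_params(d_model: int, num_layers: int) -> int:
    """Rough weight count for a standard Transformer encoder stack.

    Per layer: self-attention uses four d_model x d_model projections
    (Q, K, V, output) = 4*d^2 weights; the feed-forward block with the
    conventional d_ff = 4*d_model uses two matrices of size
    d_model x 4*d_model = 8*d^2 weights.
    """
    per_layer = 4 * d_model**2 + 8 * d_model**2  # = 12 * d_model^2
    return per_layer * num_layers

base = transformer_encoder_params(d_model=512, num_layers=6)    # baseline
wide = transformer_encoder_params(d_model=1024, num_layers=6)   # wider
deep = transformer_encoder_params(d_model=512, num_layers=12)   # deeper

print(wide // base)  # 4: doubling width quadruples the weight count
print(deep // base)  # 2: doubling depth only doubles it
```

The asymmetry is the point: widening grows parameters quadratically in d_model, while deepening grows them only linearly in the number of layers, which is one reason deep (rather than wide) Transformers are an attractive direction.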