How Does Attention Work in Encoder-Decoder Recurrent Neural Networks
Last Updated on August 7, 2019

Attention is a mechanism that was developed to improve the performance of the Encoder-Decoder RNN on machine translation. In this tutorial, you will discover the attention mechanism for the Encoder-Decoder model.

After completing this tutorial, you will know:

- About the Encoder-Decoder model and the attention mechanism for machine translation.
- How to implement the attention mechanism step-by-step.
- Applications of and extensions to the attention mechanism.

Kick-start your project with my new book Deep Learning for Natural Language […]
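As a taste of what the tutorial covers, the core of attention can be sketched in a few lines: score each encoder hidden state against the current decoder state, normalize the scores with a softmax, and take the weighted sum as the context vector. This is a minimal NumPy sketch assuming simple dot-product scoring (the full tutorial may use a learned additive scoring function instead); the function name and shapes are illustrative, not from the original.

```python
import numpy as np

def attention_context(enc_states, dec_state):
    """Compute an attention context vector (dot-product scoring sketch).

    enc_states: (T, d) encoder hidden states, one per source timestep
    dec_state:  (d,)   current decoder hidden state
    Returns the (d,) context vector and the (T,) attention weights.
    """
    # Alignment scores: how well each encoder state matches the decoder state.
    scores = enc_states @ dec_state              # shape (T,)
    # Softmax turns scores into weights that are positive and sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of the encoder states.
    context = weights @ enc_states               # shape (d,)
    return context, weights
```

The decoder then conditions its next output on this context vector rather than on a single fixed encoding of the whole source sentence.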