What is Teacher Forcing for Recurrent Neural Networks?
Last Updated on August 14, 2019
Teacher forcing is a method for quickly and efficiently training recurrent neural network models, in which the ground-truth output from a prior time step is fed back as input instead of the model's own prediction.
It is a network training method critical to the development of deep learning language models used in machine translation, text summarization, and image captioning, among many other applications.
In this post, you will discover teacher forcing as a method for training recurrent neural networks.
After reading this post, you will know:
- The problem with training recurrent neural networks that use output from prior time steps as input.
- The teacher forcing method for addressing slow convergence and instability when training these types of recurrent networks.
- Extensions to teacher forcing that allow trained models to better handle open-loop applications of this type of network.
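To make the core idea concrete, here is a minimal sketch of a decoder loop in plain Python. The `model_step` function is a hypothetical stand-in for one RNN step (deliberately imperfect, so the two modes diverge); the only point illustrated is how the next input is chosen: the ground-truth token under teacher forcing, versus the model's own prediction in open-loop (free-running) generation.

```python
def model_step(prev_token, state):
    # Hypothetical one-step "RNN": returns (prediction, new_state).
    # Made deliberately inaccurate (off by one) so that feeding back
    # its own predictions compounds the error over time.
    return prev_token + 2, state

def decode(target, start_token=0, teacher_forcing=True):
    """Run a decoder over `target`, choosing each next input either from
    the ground truth (teacher forcing) or the model's own prediction."""
    state = None
    prev = start_token
    outputs = []
    for truth in target:
        pred, state = model_step(prev, state)
        outputs.append(pred)
        # Teacher forcing: the ground-truth token becomes the next input.
        # Open loop: the model's own (possibly wrong) prediction does.
        prev = truth if teacher_forcing else pred
    return outputs

target = [1, 2, 3, 4]
print(decode(target, teacher_forcing=True))   # errors stay bounded: [2, 3, 4, 5]
print(decode(target, teacher_forcing=False))  # errors compound: [2, 4, 6, 8]
```

With teacher forcing, each step starts from the correct token, so every prediction is off by a constant amount; in the open-loop run, each mistake is fed back in and the outputs drift further from the target. That compounding is exactly the instability teacher forcing avoids during training.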
Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.