Replicating Human Memory Structures in Neural Networks to Create Precise NLU Algorithms
Introduction
Machine Learning and Artificial Intelligence developments are happening at breakneck speed! At such a pace, you need to understand these developments at multiple levels: you obviously need to know the underlying tools and techniques, but you also need an intuitive understanding of what is happening.
By the end of this article, you will develop an intuitive understanding of RNNs, especially LSTM & GRU.
Ready? Let’s go!
Table of Contents
- Simple exercise – Tweet classification
- How does our brain process the English language?
- Notations in this article
- Let’s start with RNN
- Consider the Gated Recurrent Unit (GRU) first
- Understanding Long Short-Term Memory (LSTM)
- A short note on Bidirectional RNN
- LSTM vs. GRU: who wins?
Let’s start with a simple exercise – tweet classification
Have a look at this article on NLP. I took a handful of tweets and used the word count