Recurrent Neural Networks (RNNs) and Transformers have emerged as two popular deep learning models, especially in the field of Natural Language Processing (NLP). Although research on RNNs goes back to the 1980s, the Transformer was only introduced in a 2017 research paper, and Transformer-based models have become highly successful in recent years. In this post, we will compare Transformers vs RNNs and go through their similarities and differences point by point.
Recurrent Neural Networks (RNN)
Recurrent Neural Networks (RNNs) are a type of neural network designed to handle sequential data such as text, time series, and audio. RNNs were first proposed in 1986 but remained mostly confined to theoretical research due to a lack of modern computing power. An RNN uses a recurrent connection: it maintains a hidden state that is updated at each time step, so information from earlier elements of the sequence can influence how later elements are processed.
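To make the hidden-state idea concrete, here is a minimal NumPy sketch of a vanilla RNN cell computing h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h). The weight names and dimensions below are illustrative assumptions, not part of any particular library or the original post.

```python
import numpy as np

# Minimal sketch of a vanilla RNN cell (illustrative; all sizes are hypothetical).
# Update rule: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h)

rng = np.random.default_rng(0)

input_size, hidden_size, seq_len = 4, 8, 5            # assumed toy dimensions
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input-to-hidden weights
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_size)                                   # hidden bias

x_seq = rng.standard_normal((seq_len, input_size))    # a toy input sequence
h = np.zeros(hidden_size)                             # initial hidden state

for x_t in x_seq:
    # The same weights are reused at every time step; the hidden state
    # carries information forward from earlier elements of the sequence.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)

print(h.shape)  # (8,) -- final hidden state summarizing the whole sequence
```

Note how the loop processes the sequence one step at a time and reuses the same weights throughout; this sequential dependence is exactly what the Transformer later replaces with attention.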