Notes on "Recent Advances in Recurrent Neural Networks" by Hojjat Salehinejad et al.
Written by Jaganadh Gopinadhan

The paper "Recent Advances in Recurrent Neural Networks" by Hojjat Salehinejad, Julianne Baarbe, Sharan Sankar, Joseph Barfett, Errol Colak, and Shahrokh Valaee [1] provides a comprehensive literature review on various aspects of RNNs. The authors give a concise yet in-depth analysis of the evolution of RNNs.
The paper starts with a brief discussion of the theory of Artificial Neural Networks (ANNs) and then of Recurrent Neural Networks (RNNs). The notable contribution in this part is a short table of noteworthy research papers, ranging from Elman's early work to Jing et al. (2017).
A quick walkthrough of the theory of RNNs is provided in the second section of the paper. Model architecture, activation functions, and loss functions are explained in this section.
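To make that formalism concrete, here is a minimal sketch in NumPy of the Elman-style forward pass the section builds on: a tanh hidden-state update followed by a softmax output at each time step. The dimensions and variable names are illustrative choices of mine, not taken from the paper.

```python
import numpy as np

def rnn_forward(x_seq, h0, W_xh, W_hh, W_hy, b_h, b_y):
    """Vanilla (Elman) RNN: tanh state update, softmax output per step."""
    h = h0
    outputs = []
    for x in x_seq:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)    # hidden-state recurrence
        logits = W_hy @ h + b_y
        e = np.exp(logits - logits.max())         # numerically stable softmax
        outputs.append(e / e.sum())
    return outputs, h

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 3                      # toy sizes, not from the paper
W_xh = rng.normal(0, 0.1, (n_hid, n_in))
W_hh = rng.normal(0, 0.1, (n_hid, n_hid))
W_hy = rng.normal(0, 0.1, (n_out, n_hid))
x_seq = [rng.normal(size=n_in) for _ in range(5)] # a length-5 input sequence
ys, h_T = rnn_forward(x_seq, np.zeros(n_hid), W_xh, W_hh, W_hy,
                      np.zeros(n_hid), np.zeros(n_out))
```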
The next section provides a detailed view of training various RNN architectures. It starts with weight and bias initialization and then discusses fundamental concepts such as gradient-based training, backpropagation through time (BPTT), stochastic gradient descent (SGD), and other optimization techniques.
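As a rough illustration of this training recipe, the PyTorch sketch below (a toy setup of my own, not the paper's experiments) trains a small RNN with SGD. Calling backward() on the loss unrolls the computation graph through time, which is exactly BPTT, and clipping the gradient norm is a common guard against the exploding-gradient problem the paper discusses.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.RNN(input_size=4, hidden_size=8, batch_first=True)
head = nn.Linear(8, 3)                    # classify each sequence into 3 classes
params = list(model.parameters()) + list(head.parameters())
optimizer = torch.optim.SGD(params, lr=0.1)
criterion = nn.CrossEntropyLoss()

x = torch.randn(16, 5, 4)                 # 16 sequences, length 5, 4 features
y = torch.randint(0, 3, (16,))            # one label per sequence

for step in range(100):
    optimizer.zero_grad()
    out, h_n = model(x)                   # out: (16, 5, 8); h_n: final state
    logits = head(out[:, -1, :])          # predict from the last time step
    loss = criterion(logits, y)
    loss.backward()                       # autograd unrolls through time: BPTT
    nn.utils.clip_grad_norm_(params, 1.0) # cap the gradient norm
    optimizer.step()                      # SGD parameter update
```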
The next section surveys prominent RNN architectures. The discussion starts with deep RNNs with MLPs, then covers bidirectional RNNs, recurrent convolutional neural networks, multidimensional RNNs, LSTM and its variants, GRU, memory networks, and structurally constrained RNNs. It includes views on the pros and cons of each architecture and its possible application areas. A chart describing the pros and cons of various LSTM architectures is a notable feature of this section, and a comparison of the major RNN architectures is also provided.
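One quick way to get a feel for how these architectures differ is to compare parameter counts at the same input and hidden sizes. The PyTorch sketch below (arbitrary toy dimensions, my own comparison rather than the paper's) does exactly that.

```python
import torch.nn as nn

def n_params(m):
    return sum(p.numel() for p in m.parameters())

common = dict(input_size=4, hidden_size=8, batch_first=True)
rnn  = nn.RNN(**common)                       # single tanh recurrence
gru  = nn.GRU(**common)                       # two gates: update, reset
lstm = nn.LSTM(**common)                      # three gates plus a cell state
bi   = nn.LSTM(bidirectional=True, **common)  # forward + backward passes

for name, m in [("RNN", rnn), ("GRU", gru), ("LSTM", lstm), ("BiLSTM", bi)]:
    print(f"{name:7s} parameters: {n_params(m)}")
# Gating multiplies the size (GRU ~3x, LSTM ~4x a vanilla RNN of the same
# width), and the bidirectional model doubles it again.
```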
The next section discusses regularization in RNNs, and the final section discusses applications of RNNs in text, speech, image, and video processing.
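As a minimal illustration of RNN regularization (my example, not a scheme prescribed by the paper), the snippet below wires up two widely used regularizers in PyTorch: dropout between stacked LSTM layers and L2 weight decay applied through the optimizer.

```python
import torch
import torch.nn as nn

# dropout=0.3 applies dropout to the outputs of all but the last LSTM layer.
model = nn.LSTM(input_size=4, hidden_size=8, num_layers=2,
                dropout=0.3, batch_first=True)
# weight_decay adds an L2 penalty on the weights to the SGD update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
```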
The authors did an excellent job of summarising significant research in the area of RNNs to date. The presentation of the article is exquisite and useful for students, academicians, and Deep Learning professionals in general.
Some additional references along similar lines include "A Critical Review of Recurrent Neural Networks for Sequence Learning" by Zachary C. Lipton, John Berkowitz, and Charles Elkan [2], and "Comparative Study of CNN and RNN for Natural Language Processing" by Wenpeng Yin, Katharina Kann, Mo Yu, and Hinrich Schütze [3].
[1] https://arxiv.org/abs/1801.01078
[2] https://arxiv.org/abs/1506.00019
[3] https://arxiv.org/abs/1702.01923
Tags: RNN, Deep Learning, Artificial Intelligence, Machine Learning