What You Must Know Before Building a Model with an RNN (Recurrent Neural Network)!!

Mustafa M
4 min read · Oct 17, 2021

What is a Recurrent Neural Network?

Before we discuss further, I assume you have a basic understanding of how neural networks work. A Recurrent Neural Network (RNN) is a type of neural network applied mostly to sequential data and time-series data.

These algorithms are commonly used for ordinal or temporal problems, and they form the backbone of applications such as:

Image captioning, language translation, Natural Language Processing (NLP), speech recognition, etc.

Like many other deep learning algorithms, the RNN was introduced long ago, back in the 1980s, but only in recent times have we seen its true potential. Ever-increasing computational power, the huge amounts of data we now process, and the invention of the LSTM have really brought recurrent neural networks to the foreground, with many different applications coming into real life.

Because of its internal memory, an RNN can memorize data, i.e., store the information it receives. This specific feature makes it more precise at predicting what the next piece of information will be in a sequence. That is what makes the RNN unique for working with sequential data and building applications on it.

Let's take an overview of the RNN!

The RNN is one type of neural network, used on sequential data. It is derived from the feedforward network, and it processes sequences somewhat like the human brain does.

RNN architecture (image credit: Bing Images)

Here are some of its applications: Siri, Google Translate,…

How does an RNN work?

Sequential data is ordered data in which each value depends on previous (and sometimes future) values. Financial data such as stock-market prices is a good example: it is just a sequence of values over time.
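To make this concrete, here is a minimal sketch of how an ordered series can be turned into (past window → next value) training pairs, the form in which sequential data is usually fed to an RNN. The price numbers below are made up for illustration.

```python
import numpy as np

def make_windows(series, window=3):
    """Split an ordered series into (past window, next value) pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])   # the previous `window` values
        y.append(series[i + window])     # the value that follows them
    return np.array(X), np.array(y)

# a made-up daily closing-price sequence
prices = [101.0, 102.5, 101.8, 103.2, 104.0, 103.5]
X, y = make_windows(prices, window=3)
print(X.shape, y.shape)  # (3, 3) (3,)
```

Each row of `X` is a short history, and the matching entry of `y` is the value the model should learn to predict next.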

Feedforward neural networks vs. RNNs

In a feedforward neural network, information moves in only one direction: from the input layer, through the hidden layers, to the output layer. The information moves straight through the network and never touches a node twice.

Feedforward neural networks have no memory of the inputs they receive, which makes them bad at predicting sequences. Unlike a feedforward network, an RNN has both the present input and the recent past available: its output is fed back in again as part of the next step's input. This feature is crucial whenever what came before affects what is going to come next.
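The difference can be sketched in a few lines of NumPy: an RNN keeps a hidden state that is fed back in at every step, via a hidden-to-hidden weight matrix that a feedforward layer does not have. The sizes and random weights here are illustrative, not a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8

# weights: input-to-hidden, hidden-to-hidden (the "memory" path), bias
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
b_h = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    # the hidden state from the previous step is fed back in here
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# run a length-5 input sequence through the recurrence
h = np.zeros(n_hidden)
for t in range(5):
    x_t = rng.normal(size=n_in)
    h = rnn_step(x_t, h)

print(h.shape)  # (8,)
```

Dropping the `W_hh @ h_prev` term would turn this back into an ordinary feedforward layer with no memory of earlier inputs.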

In neural networks, forward propagation and backward propagation both play a vital role in the architecture.

In a neural network, we run forward propagation to get the model's output and verify whether that output is correct or incorrect against the actual value. Backpropagation is nothing but going backwards through your neural network to find the partial derivatives of the error with respect to the weights, which enables us to subtract (a fraction of) those values from the weights.

A gradient is a partial derivative of a function with respect to its inputs. It measures how much the output of the function changes when you change the inputs, so it is also known as the slope of the function. The higher the gradient, the steeper the slope and the faster the model can learn; but if the slope is small or zero, the model stops learning.

The gradient is measured as the change in error with respect to a change in the weights.
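As a toy illustration of "change in error with respect to a change in the weights", here is one forward pass, gradient computation, and weight update for a single-weight model y = w·x with a squared error. All the numbers are made up.

```python
# toy model: y_pred = w * x, error E = (y_pred - y)**2
x, y = 2.0, 10.0     # one made-up training example
w = 1.0              # initial weight
lr = 0.1             # learning rate

y_pred = w * x                 # forward propagation: y_pred = 2.0
error = (y_pred - y) ** 2      # E = (2 - 10)**2 = 64.0

# backpropagation: dE/dw = 2 * (y_pred - y) * x
grad = 2 * (y_pred - y) * x    # = 2 * (-8) * 2 = -32.0

# subtract a fraction of the gradient from the weight
w = w - lr * grad              # 1.0 - 0.1 * (-32.0) = 4.2

print(w)  # 4.2
```

Repeating this loop moves `w` toward 5.0, the value that makes the error zero for this example.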

Vanishing and Exploding Gradients

Vanishing Gradient?

A vanishing gradient occurs when the value of the gradient becomes minimal and the model stops learning, or learns too slowly to produce results. This is a major concern for RNNs, but we can solve this problem by using the LSTM.
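The effect is easy to see numerically: when backpropagating through many time steps, the gradient is a product of many per-step derivatives, and if those factors are below 1 the product shrinks toward zero. The factor 0.5 below is just an illustrative value.

```python
grad = 1.0
per_step_derivative = 0.5   # an illustrative per-step factor below 1.0

# backpropagating through 50 time steps multiplies 50 such factors
for _ in range(50):
    grad *= per_step_derivative

print(grad)  # about 8.9e-16: effectively zero, so early steps stop learning
```

With a factor above 1.0 instead, the same loop would blow up toward infinity, which is exactly the exploding-gradient case discussed next.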

Exploding Gradient?

An exploding gradient occurs when the algorithm, without much reason, assigns excessively high importance to the weights. We can overcome this problem by squashing or truncating the gradients (gradient clipping).
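One common form of truncating the gradients is norm clipping: if the gradient vector's norm exceeds a threshold, rescale it so the norm equals the threshold. A minimal NumPy sketch (the threshold of 5.0 is an arbitrary choice):

```python
import numpy as np

def clip_by_norm(grad, max_norm=5.0):
    """Rescale grad so its L2 norm never exceeds max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

exploding = np.array([30.0, 40.0])        # L2 norm = 50.0
clipped = clip_by_norm(exploding, max_norm=5.0)
print(clipped, np.linalg.norm(clipped))   # [3. 4.] 5.0
```

Clipping preserves the gradient's direction while capping its magnitude, so the update step stays stable.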

Long Short-Term Memory (LSTM)

The LSTM is an extension of the RNN, used as a building block for the layers of an RNN. The LSTM assigns weights that help the RNN decide whether to keep information, forget it, or let it through with enough strength to impact the output.

LSTM working functionality (image credit: Bing Images)

An LSTM has a memory cell, a forget gate, an input gate, and an output gate. To generate text, it refers to some text and memorizes that information, adds in the information we previously fed it, and starts producing the generated text sequentially, one step at a time.
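Those gates can be written out directly. Below is a minimal single-step LSTM cell in NumPy with randomly initialized weights, just to show how the forget, input, and output gates combine with the memory cell; it is an illustration, not a trained model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_h = 3, 5
# one weight matrix per gate, acting on [h_prev, x] concatenated
W_f, W_i, W_o, W_c = (rng.normal(scale=0.1, size=(n_h, n_h + n_in))
                      for _ in range(4))

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([h_prev, x])
    f = sigmoid(W_f @ z)          # forget gate: what to drop from the cell
    i = sigmoid(W_i @ z)          # input gate: what new info to store
    o = sigmoid(W_o @ z)          # output gate: what to expose as output
    c_tilde = np.tanh(W_c @ z)    # candidate new cell contents
    c = f * c_prev + i * c_tilde  # updated memory cell
    h = o * np.tanh(c)            # new hidden state
    return h, c

h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_h), np.zeros(n_h))
print(h.shape, c.shape)  # (5,) (5,)
```

The key line is `c = f * c_prev + i * c_tilde`: the forget gate decides how much old memory survives, and the input gate decides how much new information gets written in.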

Here is a complete article on the LSTM, and some more useful links:

Understanding LSTM Networks — colah’s blog

Recurrent Neural Networks (RNN): What It Is & How It Works | Built In

Let's do a basic-level implementation of RNN preprocessing using Python

# import dependencies
import pandas as pd
import numpy as np
import re
from nltk.corpus import stopwords

# read the data file from local disk (raw string so the backslashes
# are not treated as escape sequences)
my_data = pd.read_csv(r"\userd\Sentimentdata.csv")

stCollection = stopwords.words('english')

After that, we do some cleaning of the data.
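As a sketch of that cleaning step: lowercase the text, strip non-letters with `re`, and drop stopwords. A tiny hard-coded stopword set is used here so the snippet runs standalone; in the code above, `stCollection` from NLTK plays this role, and the exact columns of the CSV are not shown, so no column name is assumed.

```python
import re

# tiny stand-in for NLTK's English stopword list, so this runs standalone;
# in the code above, stCollection = stopwords.words('english') plays this role
st_collection = {"the", "is", "a", "and", "it", "this"}

def clean_text(text):
    text = text.lower()                      # normalize case
    text = re.sub(r"[^a-z\s]", " ", text)    # keep letters and spaces only
    words = [w for w in text.split() if w not in st_collection]
    return " ".join(words)

print(clean_text("This movie is GREAT, 10/10!"))  # "movie great"
```

The cleaned strings can then be tokenized and fed into an RNN or LSTM layer.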

Summary of RNNs and LSTMs….