Rectifying the Vanishing Gradient Problem Using the ReLU Activation Function in a BLSTM Neural Network
Pinagadi. Venkateswararao1, S. Murugavalli2
1Pinagadi. Venkateswararao, Research Scholar, Sathyabama University, Chennai, India.
2S. Murugavalli, Professor & Head, Department of Computer Science and Engineering, Panimalar Engineering College, Chennai, India.

Manuscript received on 01 April 2019 | Revised Manuscript received on 05 May 2019 | Manuscript published on 30 May 2019 | PP: 2615-2618 | Volume-8 Issue-1, May 2019 | Retrieval Number: A1264058119/19©BEIESP
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license.

Abstract: Character recognition is a difficult task to apply to handwritten documents, and keyword spotting is currently the best solution. Keyword spotting plays a major role in extracting characters or manuscripts from unconstrained written text and recognizing them based on the probability of each character, letter, or manuscript. It performs template-free spotting with the help of the CTC token-passing algorithm. The main problem arises while performing backpropagation through the neural network: the vanishing gradient problem. The recognition rate drops because the error rate grows, and continuing to increase the number of hidden layers lengthens the output time of the network. A large amount of information is lost when the output value is carried from one layer to the next through the activation function. The problem can be simulated with an activation function such as the sigmoid, but the accuracy of character recognition is then very low. Using the rectified linear unit (ReLU) instead of the sigmoid ensures that the gradient of an updated neuron never decays to zero at the output of the hidden layer.
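The contrast the abstract draws between sigmoid and ReLU can be sketched numerically. The snippet below is a minimal illustration (not the paper's implementation): backpropagation multiplies one local activation gradient per layer, so with sigmoid, whose derivative is at most 0.25, the product shrinks geometrically, while ReLU contributes a gradient of exactly 1 on the active path. The depth of 20 layers is an assumed value for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative s * (1 - s); its maximum value is 0.25, at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative is 1 for positive inputs, 0 otherwise.
    return 1.0 if x > 0 else 0.0

# Backpropagation multiplies one local gradient per layer.
layers = 20  # assumed depth, for illustration only
sigmoid_product = 1.0
relu_product = 1.0
for _ in range(layers):
    sigmoid_product *= sigmoid_grad(0.0)  # best case for sigmoid: 0.25
    relu_product *= relu_grad(1.0)        # active ReLU unit: 1.0

print(f"sigmoid gradient after {layers} layers: {sigmoid_product:.3e}")
print(f"ReLU    gradient after {layers} layers: {relu_product:.3e}")
```

Even in the sigmoid's best case, the gradient reaching the earliest layers is on the order of 1e-13 after 20 layers, which is why the paper replaces sigmoid with ReLU in the BLSTM's hidden layers.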
Keywords: Keyword Spotting, BLSTM, ReLU, CTC
Scope of the Article: Problem Solving and Planning