Sum of Step Approximation of a Novel Non Linear Activation Function
Venkatappareddy P.1, M. Deepthi2

1Venkatappareddy.P, Department of Electronics and Communication Engineering, Vignan’s Foundation For Science, Technology and Research, Vadlamudi, Guntur (A.P), India.
2M. Deepthi, Department of Electronics and Communication Engineering, Vasireddy Venkatadri Institute of Technology, Nambur Guntur (A.P), India.
Manuscript received on 13 February 2019 | Revised Manuscript received on 04 March 2019 | Manuscript Published on 08 June 2019 | PP: 246-250 | Volume-7 Issue-5S4, February 2019 | Retrieval Number: E10490275S419/19©BEIESP
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: In this manuscript, we propose a sum-of-steps approximation of a novel nonlinear activation function, the tunable ReLU, for VLSI architecture implementations of neural networks. The characteristics of the proposed activation function depend on a tunable parameter and on the values of the input data set. We also propose a linear-in-the-parameters model of the proposed activation function using an even mirror Fourier nonlinear (EMFN) filter. Finally, simulation results are presented to show the performance of the proposed activation function on various data sets and to demonstrate its superiority over other activation functions.
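As a rough illustration of the idea (not the authors' implementation), a ramp-like activation can be approximated by a staircase, i.e., a sum of shifted unit step functions, which maps naturally onto comparator-based VLSI hardware. The sketch below assumes a tunable ReLU of the form f(x) = max(0, a·x) with tunable slope a, and a uniform breakpoint grid on [x_min, x_max]; both the functional form and the grid are assumptions for illustration only.

```python
def unit_step(x):
    """Heaviside unit step: 1 for x >= 0, else 0."""
    return 1.0 if x >= 0 else 0.0

def tunable_relu(x, a=1.0):
    """Assumed tunable ReLU: slope `a` on the positive side, 0 elsewhere."""
    return max(0.0, a * x)

def sum_of_steps(x, a=1.0, x_min=0.0, x_max=4.0, n_steps=64):
    """Staircase approximation on [x_min, x_max]:
    f(x) ~ sum_i h * u(x - x_i), each shifted step adding height h."""
    h = a * (x_max - x_min) / n_steps          # uniform step height
    xs = [x_min + i * (x_max - x_min) / n_steps for i in range(n_steps)]
    return sum(h * unit_step(x - xi) for xi in xs)

# The staircase converges to the ramp as n_steps grows:
print(round(tunable_relu(2.0, a=0.5), 3))              # 1.0
print(round(sum_of_steps(2.0, a=0.5, n_steps=64), 3))  # 1.031
print(sum_of_steps(-1.0))                              # 0.0
```

With 64 steps the worst-case error equals one step height h; increasing `n_steps` trades hardware cost (more comparators) for accuracy, which is the usual knob in such approximations.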
Keywords: Activation Function, Perceptron, Tunable ReLU, Deep Neural Network, EMFN Filter.
Scope of the Article: Software Defined Networking and Network Function Virtualization