Activation Functions
Abhishek Zirange
NOTE: I would recommend reading up on the basics of Artificial Neural Networks before reading this article, for a better understanding.
Activation functions are essential for an artificial neural network to learn complex, non-linear functional mappings between the inputs and the response variable. They introduce non-linear properties into the network.
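To see why the non-linearity matters, consider a quick NumPy sketch (the layer sizes and random weights below are purely illustrative): stacking linear layers without an activation between them collapses into a single linear layer, no matter how deep the stack.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))      # a toy input vector (illustrative)
W1 = rng.normal(size=(3, 4))   # first linear layer's weights
W2 = rng.normal(size=(2, 3))   # second linear layer's weights

# Two linear layers stacked with no activation in between...
deep = W2 @ (W1 @ x)

# ...are exactly one linear layer with combined weights W2 @ W1.
shallow = (W2 @ W1) @ x

print(np.allclose(deep, shallow))  # True: depth adds no expressive power here
```

An activation function between the layers is what breaks this collapse and lets depth pay off.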
Such a network computes the sum of products of the inputs (X) and their corresponding weights (W), applies an activation function f(x) to get the output of that layer, and feeds that output as the input to the next layer.
So what does an artificial neuron do? To keep it simple, it calculates a “weighted sum” of its inputs, adds a bias, and then decides whether it should “fire” or not.
Y = Σ ( weight * input ) + bias
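As a minimal sketch of this idea (the function names and the choice of sigmoid as the activation are my own illustrative assumptions, not a fixed convention), a single neuron's forward pass could look like:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, plus the bias...
    z = np.dot(weights, inputs) + bias
    # ...passed through an activation to decide how strongly to "fire".
    return sigmoid(z)

# Toy example with made-up numbers
print(neuron(np.array([0.5, -1.2, 3.0]),
             np.array([0.4, 0.1, -0.6]),
             bias=0.2))
```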
Activation functions can be divided into two types:
- Linear Activation Function
- Non-linear Activation Functions
Some of the most common choices of activation function are listed below; a short code sketch follows the list:
- Sigmoid
- ReLU (rectified linear unit)
- Leaky ReLU
- Generalized ReLU
- Softmax
- Tanh
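For concreteness, here is a rough NumPy sketch of several of these functions (the leaky-ReLU slope of 0.01 is just a common default; making that slope a learnable parameter gives the parametric/generalized ReLU variants):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    # Zero for negative inputs, identity for positive ones
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # A small negative slope keeps gradients alive for z < 0
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    # Subtract the max for numerical stability; outputs sum to 1
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z), relu(z), softmax(z))
```

Note that softmax, unlike the others, acts on a whole vector rather than element-wise, which is why it usually appears only in the output layer of a classifier.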
I hope you now have a basic understanding of activation functions and of why they are used in an artificial neural network.
Future articles will look at each activation function in depth, with code examples, so stay tuned for new posts.