Neural Network
Neural networks are a set of algorithms designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering raw input.
Neural networks learn by example; they can't be programmed to perform a specific task.
Motivation Behind Neural Networks:
The building block of a neural network is the neuron. An artificial neuron works much the same way a biological one does. Like a biological neuron, an artificial neuron has inputs (which feed data into the neuron), a cell body (where all the inputs are summed and passed through an activation function), and an output (which carries the result forward).
As you know, our brain is made up of roughly 86 billion neurons, and a neural network is really just a composition of artificial neurons, connected in different ways and operating on different activation functions.
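To make this concrete, here is a minimal sketch of a single artificial neuron in Python. The input values, weights, bias, and the choice of a sigmoid activation are illustrative assumptions, not part of the article's example.

```python
import math

def sigmoid(z):
    # Squash the summed input into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # Cell body: weighted sum of the inputs plus a bias.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Output: the activation function applied to that sum.
    return sigmoid(z)

# Made-up inputs, weights, and bias, just to show the flow of data.
print(neuron([0.5, 0.8, 0.2], weights=[0.4, 0.7, -0.3], bias=0.1))
```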
Parts of Neural Networks:
An artificial neural network has three main parts (a small code sketch follows this list):
- Input Layer: The first layer of the network, which receives the raw input.
- Output Layer: The last layer of the network, which produces the final output.
- Hidden Layers: The layers between the input layer and the output layer.
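Here is a rough sketch of how the three layers fit together in code. The layer sizes, random weights, and ReLU/linear activations are arbitrary assumptions for illustration only.

```python
import random

def relu(z):
    # Common hidden-layer activation: keep positives, clip negatives to zero.
    return max(0.0, z)

def layer(inputs, weights, biases, activation):
    # Each neuron in the layer takes a weighted sum of all inputs plus its bias.
    return [activation(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

random.seed(0)
n_in, n_hidden, n_out = 3, 4, 1  # layer sizes (arbitrary)

# Randomly initialised weights: one list of input weights per neuron.
w_hidden = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
b_hidden = [0.0] * n_hidden
w_out = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_out)]
b_out = [0.0] * n_out

x = [0.2, 0.5, 0.9]                      # input layer: the raw features
h = layer(x, w_hidden, b_hidden, relu)   # hidden layer
y = layer(h, w_out, b_out, lambda z: z)  # output layer (linear here)
print(y)
```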
Work of Each Layer in a Neural Network:
For a better understanding, let's take the example of detecting the digit 8 with a neural network.
- The first layer takes an image of 8 as input and passes it to the second layer.
- In the second layer, small parts such as curves and lines are detected, and their outputs are passed to the next layer.
- In the third layer, these smaller detections are merged to form shapes (such as o and x).
- In the final layer, the shapes are merged to form an 8, and the network gives 8 as its output.
Now you may be wondering: how do these neurons detect the shapes and activate themselves after detection? The answer is the activation function.
What is an Activation Function?
The activation function decides whether a neuron should be activated or not by calculating the weighted sum of its inputs and adding a bias to it. The purpose of the activation function is to introduce non-linearity into the output of a neuron.
There are mainly three types of activation functions.
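As a quick illustration, here are three commonly used activation functions; the specific choice of step, sigmoid, and ReLU here is just an example, not a claim about which three the article has in mind.

```python
import math

def step(z, threshold=0.0):
    # Binary step: the neuron fires only if the sum crosses the threshold.
    return 1.0 if z > threshold else 0.0

def sigmoid(z):
    # Smooth, non-linear squashing of the sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    # Pass positive values through unchanged, clip negatives to zero.
    return max(0.0, z)

for z in (-2.0, 0.5, 3.0):
    print(z, step(z), round(sigmoid(z), 3), relu(z))
```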
Let's take the analogy of buying a mobile phone.
Suppose you want to buy a mobile phone. Your selection will depend on many features, such as:
- Camera Quality
- Battery Life
- Processor Speed
So your input layer will take these three features as inputs, and your complete network will combine them into a single buy-or-don't-buy decision, as sketched below.
NOTE: here the activation function is the threshold summation > 80; the neuron activates only if the weighted sum exceeds 80.
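Here is a small sketch of that buying neuron. Only the summation > 80 rule comes from the example; the feature scores and weights below are made-up numbers.

```python
def buy_mobile(scores, weights, threshold=80):
    # Weighted sum of the feature scores (the "cell body" of the neuron).
    total = sum(s * w for s, w in zip(scores, weights))
    # Threshold activation: the neuron fires (buy) only if summation > 80.
    return total > threshold

# Hypothetical scores out of 100 and importance weights.
scores  = [90, 70, 85]       # camera quality, battery life, processor speed
weights = [0.4, 0.3, 0.3]
print(buy_mobile(scores, weights))  # True, because 36 + 21 + 25.5 = 82.5 > 80
```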
Training a Neural Network:
The most common deep learning algorithm for supervised training of the multi-layer perceptron is known as backpropagation. In it, after the weighted sum of inputs has passed through the activation function, the error is propagated backward through the network and the weights are updated to reduce it.
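As a minimal sketch of the idea, here is one backpropagation-style update for a single weight with a squared error; the model y = w * x, the data point, and the learning rate are illustrative assumptions.

```python
def train_step(w, x, target, lr=0.01):
    # Forward pass: prediction with a single weight and no activation, y = w * x.
    y = w * x
    error = y - target
    # Backward pass: gradient of (y - target)^2 with respect to w is 2 * error * x.
    grad = 2 * error * x
    # Update the weight in the direction that reduces the error.
    return w - lr * grad

w = 3.0
for _ in range(50):
    w = train_step(w, x=1.0, target=2.0)
print(round(w, 3))  # the weight moves from 3 toward 2
```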
For our example, we have taken a small dataset.
Training-1:
Let's set the initial weight of the model to 3 and look at the model's output and the error (absolute and squared).
Training-2:
Let's update the weight value and make it 4.
From both training runs you can observe that as the weight value increases, the error increases.
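You can reproduce that trend with a quick sketch. The toy dataset below is an assumption: the targets follow target = 2 * input, which is consistent with the minimum-error weight of 2 found later.

```python
# Assumed toy dataset where target = 2 * input.
data = [(1, 2), (2, 4), (3, 6), (4, 8)]

def errors(w):
    abs_err = sum(abs(w * x - t) for x, t in data)    # absolute error
    sq_err = sum((w * x - t) ** 2 for x, t in data)   # squared error
    return abs_err, sq_err

for w in (3, 4):
    a, s = errors(w)
    print(f"weight={w}  absolute error={a}  squared error={s}")
```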
The above experiments show that we have to decrease the weight value. But by how much should we decrease it, and how is the neural network going to find the weight value with the least error? That is where gradient descent comes into the picture.
Gradient Descent:
Gradient descent is an optimization algorithm used to find the values of the parameters (coefficients) of a function (f) that minimize a cost function.
Gradient descent is best used when the parameters cannot be calculated analytically (e.g. using linear algebra) and must be searched for by an optimization algorithm.
For our example, 2 is the weight value with the minimal error, and with that weight our model will have good accuracy.
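Here is a rough gradient descent sketch on the same assumed toy dataset (target = 2 * input); the learning rate and starting weight are arbitrary, but the loop settles on the minimum-error weight of 2.

```python
# Gradient descent on the assumed toy dataset where target = 2 * input.
data = [(1, 2), (2, 4), (3, 6), (4, 8)]

def gradient(w):
    # Derivative of the total squared error sum((w*x - t)^2) with respect to w.
    return sum(2 * (w * x - t) * x for x, t in data)

w, lr = 4.0, 0.01
for _ in range(100):
    w -= lr * gradient(w)   # step downhill on the error surface
print(round(w, 4))  # converges to 2, the minimum-error weight
```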
Hope you have enjoyed learning this. Feel free to connect with me!
YouTube: https://www.youtube.com/channel/UCmF8qppe02J1ot4Jfwl_lFg
LinkedIn: https://www.dhirubhai.net/in/jagwithyou/
Medium: https://medium.com/@jagwithyou