The intuition behind backpropagation
In Day 26 of the ML: Teach by Doing Project, we peek behind the curtain of backpropagation.
Along with the mathematics, it is essential to understand the meaning of backpropagation and build a broad intuition for how it works.
One of the best ways to understand this is to ask yourself the question:
“What is the effect of one training example on how the weights and biases change?”
Let’s say we are building a cat-vs-dog classifier based on 3 measurements: pupil diameter, ear-flappiness index and whisker length.
The aim is to classify any new animal as a cat or a dog from these measurements.
For that, we will need to train a neural network.
For training the neural network, we will need to change the weights and biases based on our training examples.
That’s where backpropagation really comes into the picture.
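To make the setup concrete, here is a minimal sketch of such a network in Python. Everything here is an illustrative assumption (the layer sizes, the made-up values, the use of sigmoid activations), not the exact network from this series:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Three measurements for one animal (made-up values):
# pupil diameter, ear-flappiness index, whisker length.
x = np.array([0.7, 0.3, 0.9])

# A tiny network: one hidden layer of 4 neurons, one output neuron.
# The weights start random; training must nudge them into shape.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

# Forward pass: each layer computes sigmoid(W @ input + b).
hidden = sigmoid(W1 @ x + b1)
output = sigmoid(W2 @ hidden + b2)

print("output:", output[0])
print("prediction:", "cat" if output[0] > 0.5 else "dog")
```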
Let’s take one training example. Let’s say this cat:
We ask ourselves: What is the effect of this cat on how the weights and biases change?
To answer this question, we look at the last layer. It looks as follows:
The output of the last layer is 0.2. This is not good, because in our convention, an output > 0.5 means it’s a cat and an output < 0.5 means it’s a dog.
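To see in numbers why 0.2 is bad, here is a tiny sketch using a squared-error loss (the choice of loss function is an assumption for illustration):

```python
# The correct label for this example is "cat", i.e. target = 1.0.
target = 1.0
output = 0.2  # what the network currently produces for this cat

# Squared-error loss: large when the output is far from the target.
loss = (output - target) ** 2
print(loss)  # 0.64

# Nudging the output up toward 1.0 shrinks the loss:
print((0.6 - target) ** 2)  # 0.16
```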
So we need to increase the output of the last layer.
How can we do this?
We have 3 ways:
1. Increase the bias.
2. Increase the weights, in proportion to the activations coming from the previous layer.
3. Increase the activations (outputs) of the previous layer, in proportion to the weights.
Note that we cannot change the previous layer’s activations directly: to increase them, we have to go one layer back and change the weights and biases of that earlier layer.
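The output neuron computes sigmoid(w · a + b), where a holds the previous layer’s activations. A minimal sketch of the three knobs (all values made up):

```python
import numpy as np

a = np.array([0.2, 0.8, 0.5])   # previous-layer activations
w = np.array([0.4, -0.6, 1.1])  # weights into the output neuron
b = 0.1                         # bias of the output neuron

z = w @ a + b  # pre-activation; output = sigmoid(z), which rises with z

# The three knobs, read off from the partial derivatives of z:
print("dz/db =", 1.0)  # 1. increasing the bias always raises z
print("dz/dw =", a)    # 2. raising w_i helps most where a_i is large
print("dz/da =", w)    # 3. raising a_i helps where w_i > 0; where w_i < 0,
                       #    that activation should instead decrease
```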
Thus, the last layer tells us how to nudge its own weights and biases, and also how it wants the previous layer’s activations to change, so that the output moves in the right direction.
Those desired activation changes, in turn, tell the second-to-last layer how to nudge its weights and biases, and what it wants from the layer before it.
This continues recursively until we reach the first layer.
That’s why it’s called “backpropagation”.
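Here is a compact sketch of that backward recursion for a two-layer network, assuming sigmoid activations and a squared-error loss (both are illustrative choices, not necessarily what the series uses):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.7, 0.3, 0.9])  # one training example: our cat
target = 1.0                   # convention: 1.0 means cat

# A tiny two-layer network (sizes are illustrative).
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

# Forward pass, keeping intermediate values for the backward pass.
a1 = sigmoid(W1 @ x + b1)
a2 = sigmoid(W2 @ a1 + b2)
loss = (a2[0] - target) ** 2

# Backward pass: start at the last layer.
# sigmoid'(z) = a * (1 - a), so dL/dz for the output neuron is:
delta2 = 2 * (a2 - target) * a2 * (1 - a2)
dW2 = np.outer(delta2, a1)  # weight nudges scale with incoming activations
db2 = delta2

# Propagate the error backwards through W2 to the earlier layer.
delta1 = (W2.T @ delta2) * a1 * (1 - a1)
dW1 = np.outer(delta1, x)
db1 = delta1

# Nudge every weight and bias opposite to its gradient.
lr = 0.5
W2 -= lr * dW2; b2 -= lr * db2
W1 -= lr * dW1; b1 -= lr * db1

# The same cat now produces a slightly better output (lower loss).
a2_new = sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)
print(loss, "->", (a2_new[0] - target) ** 2)
```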
In formal terms, these nudges to the weights and biases are proportional to the negative gradient of the loss function with respect to those weights and biases.
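In symbols, with learning rate $\eta$ (this is the standard gradient-descent update rule):

$$\Delta w = -\eta\,\frac{\partial L}{\partial w}, \qquad \Delta b = -\eta\,\frac{\partial L}{\partial b}$$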
I made a video to explain this in detail. You can find it here:
Stay tuned for Day 27!