Neural networks from scratch series update
I have begun implementing neural networks entirely from scratch.
17 videos have already been uploaded to YouTube, and they have received a great response so far. (A few minimal code sketches of the kind of material covered follow the list below.)
(1) Coding a single neuron and a layer: https://lnkd.in/gfSRKuxt
(2) The beauty of numpy and the dot product in coding neurons and layers: https://lnkd.in/grBjwTu4
(3) Coding multiple neural network layers: https://lnkd.in/gSwmEnZP
(4) Implementing the Dense Layer class in Python: https://lnkd.in/gSEeZzTZ
(5) Broadcasting and Array Summation in Python: https://lnkd.in/gt9u5hca
(6) Coding Neural Network Activation Functions from scratch: https://lnkd.in/gxav-8-2
(7) Coding one neural network forward pass: https://lnkd.in/geyZAvAn
(8) Coding the cross entropy loss in Python (from scratch): https://youtu.be/QgfINhYPnH0
(9) Introduction to Optimization in Neural Network training: https://youtu.be/WVFMxpTD09w
(10) Partial Derivatives and Gradient in Neural Networks: https://youtu.be/Utvna4t0NmA
(11) Understanding the Chain Rule - the backbone of Neural Networks: https://youtu.be/_rVPkUag1YU
(12) Backpropagation from scratch on a single neuron: https://youtu.be/iE1lccrHfok
(13) Backpropagation through an entire layer of neurons - from scratch: https://youtu.be/wLLuNyZM8Sw
(14) Role of matrices in backpropagation: https://youtu.be/cuveqaYX1bw
(15) Finding derivatives of inputs in backpropagation and why we need them: https://youtu.be/sjbs-usnjIY
(16) Coding Backpropagation building blocks in Python: https://youtu.be/UOgFWMLrWcA
(17) Backpropagation on the ReLU activation class: https://youtu.be/PmqHkytaRSU
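To give a flavor of the first part of the series (single neurons, numpy dot products, dense layers, activations, and a forward pass), here is a minimal sketch of my own. Class and variable names like DenseLayer and ReLU are illustrative choices, not the exact code from the lectures:

```python
import numpy as np

class DenseLayer:
    """A fully connected layer: output = inputs . weights + biases."""
    def __init__(self, n_inputs, n_neurons):
        # Small random weights, one column per neuron; biases start at zero
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # np.dot handles the whole batch at once; broadcasting adds the biases row-wise
        self.output = np.dot(inputs, self.weights) + self.biases

class ReLU:
    """Rectified linear activation: max(0, x) element-wise."""
    def forward(self, inputs):
        self.output = np.maximum(0, inputs)

# A tiny forward pass over a batch of 3 samples with 4 features each
X = np.array([[1.0, 2.0, 3.0, 2.5],
              [2.0, 5.0, -1.0, 2.0],
              [-1.5, 2.7, 3.3, -0.8]])

layer1 = DenseLayer(4, 5)
activation1 = ReLU()
layer1.forward(X)
activation1.forward(layer1.output)
print(activation1.output.shape)  # (3, 5)
```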
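For the loss video (item 8), a minimal categorical cross-entropy sketch. The function name and test values are my own for illustration:

```python
import numpy as np

def categorical_cross_entropy(predictions, targets):
    """Mean negative log-likelihood of the correct class.

    predictions: (batch, n_classes) softmax probabilities
    targets:     (batch,) integer class labels
    """
    # Clip to avoid log(0)
    clipped = np.clip(predictions, 1e-7, 1 - 1e-7)
    # Probability assigned to the correct class for each sample
    correct_confidences = clipped[np.arange(len(targets)), targets]
    return np.mean(-np.log(correct_confidences))

probs = np.array([[0.70, 0.20, 0.10],
                  [0.10, 0.50, 0.40],
                  [0.02, 0.90, 0.08]])
labels = np.array([0, 1, 1])
print(categorical_cross_entropy(probs, labels))  # ~0.385
```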
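And for the backpropagation half of the series (chain rule, single neuron, ReLU), a sketch of the chain rule applied to one neuron, with made-up input values:

```python
import numpy as np

# Forward pass through one neuron with a ReLU activation
x = np.array([1.0, -2.0, 3.0])   # inputs
w = np.array([-3.0, -1.0, 2.0])  # weights
b = 1.0                          # bias

z = np.dot(x, w) + b             # weighted sum: 1*(-3) + (-2)*(-1) + 3*2 + 1 = 6.0
y = max(z, 0.0)                  # ReLU output

# Backward pass: suppose the gradient arriving from the next layer is 1.0
dvalue = 1.0
drelu_dz = 1.0 if z > 0 else 0.0   # ReLU derivative
dz = dvalue * drelu_dz             # chain rule through the activation

# Gradients with respect to weights, bias, and inputs
dw = dz * x     # d(z)/d(w_i) = x_i
db = dz * 1.0   # d(z)/d(b) = 1
dx = dz * w     # d(z)/d(x_i) = w_i  (needed to keep backpropagating to earlier layers)

print(dw, db, dx)  # [ 1. -2.  3.] 1.0 [-3. -1.  2.]
```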
I have spent a lot of time and effort making these lectures. I first work through everything on a whiteboard and then implement it in Python code.
Nothing is assumed. Everything is spelled out.
Of course, I could just use TensorFlow and code everything in 10 lines and 10 minutes.
However, that is no fun! The only way to truly understand something is to break it apart and build it from scratch.
Learn and enjoy!