Neural Networks (easy introduction)
Neural networks are an interesting form of machine learning. Like any other model, they have inputs and outputs, but their layered structure makes them unique. In simple terms, a feedforward network (the simplest kind of neural network) is computed using two processes. The first is forward propagation. Forward propagation takes the inputs, multiplies them by weights (think coefficients), adds a bias (think intercept), and then passes the result through an activation function (such as the logistic function) to produce the output that feeds the next layer. This is repeated for every layer, from the input layer to the output layer.
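To make that concrete, here is a minimal sketch of forward propagation in Python with NumPy. The network size, the random weights, and the sample input are all made-up illustrative values, not anything from a real trained model; the only fixed idea is the weights-times-inputs-plus-bias step followed by the logistic activation, applied layer by layer.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: squashes any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    # `layers` is a list of (W, b) pairs, one per layer.
    # Each layer computes sigmoid(W @ a + b): multiply by weights,
    # add the bias, then apply the activation function.
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

# Hypothetical 2-input, 3-hidden, 1-output network with random weights.
rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(3, 2)), rng.normal(size=3)),  # input -> hidden
    (rng.normal(size=(1, 3)), rng.normal(size=1)),  # hidden -> output
]
print(forward(np.array([0.5, -1.2]), layers))
```

The output of the last layer is the network's prediction; with random weights it is meaningless until the weights are learned, which is where back propagation comes in.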
The unique part of neural networks is how they learn. This process is called back propagation. In back propagation, the goal is to choose weights and biases so that the error (actual minus predicted) is as small as possible. To adjust the weights and biases in the right direction, gradient descent is used. In simple terms, gradient descent finds the direction of steepest descent toward the minimum of the error function. Once that direction is found, all of the weights and biases are adjusted by a step scaled by the learning rate, which is typically a small number. After the weights and biases are updated, forward propagation is run again and the errors are recalculated. If the errors are still larger than the threshold we desire, back propagation is conducted again, and this loop (each full pass over the training data is called an epoch in machine learning) continues until the model's errors are small enough or there is no further improvement. If there is no improvement, then the model structure (number of neurons, layers, and types of input variables) needs to be adjusted.
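Here is a rough sketch of that loop on a toy problem (XOR), again with assumed values throughout: the hidden-layer size, learning rate of 0.5, epoch limit, and error threshold are arbitrary choices for illustration, and squared error with the logistic activation is just one possible loss/activation pairing. The gradients are hand-derived via the chain rule, which is all back propagation really is.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny made-up dataset: XOR, a classic problem a single layer cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # input -> hidden
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # hidden -> output
lr = 0.5                                        # learning rate (assumed)

for epoch in range(10000):
    # Forward propagation.
    h = sigmoid(X @ W1.T + b1)          # hidden activations, shape (4, 3)
    p = sigmoid(h @ W2.T + b2)          # predictions, shape (4, 1)
    err = p - y                         # predicted minus actual

    # Back propagation: the chain rule gives each gradient
    # (squared-error gradient; constant factors folded into lr).
    dp = err * p * (1 - p)              # error signal at the output
    dW2 = dp.T @ h
    db2 = dp.sum(axis=0)
    dh = (dp @ W2) * h * (1 - h)        # error pushed back to hidden layer
    dW1 = dh.T @ X
    db1 = dh.sum(axis=0)

    # Gradient descent step: move each weight against its gradient.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

    # Stop once the error falls below our (assumed) threshold.
    if (err ** 2).mean() < 1e-3:
        break

print(epoch, p.round(2).ravel())
```

Whether and how fast this converges depends on the random initialization and the learning rate: too small a rate and learning crawls, too large and the steps overshoot the minimum, which is why the loop also needs an epoch limit as an escape hatch.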
For a better example, check out my video on YouTube with pictures and some basic math. If you like the video, subscribe to my channel for future videos that will cover examples of how to actually calculate a neural network by hand.