Activation functions in Deep Neural Networks:

Hyperparameters are the adjustable variables that determine a deep neural network's structure (e.g., the number of hidden units) and how the network is trained (e.g., the learning rate). In other words, they are the settings that control a model's learning process, and tuning them allows us to improve the accuracy and performance of the learning.

Activation functions introduce nonlinearity into a model, which allows deep learning models to learn nonlinear prediction boundaries. To do that, the activation function decides whether a neuron should be activated based on the input from the previous layer; without one, any stack of layers would collapse into a single linear transformation. In real-world terms, it acts like a carrot-and-stick mechanism in the learning process.
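
That collapse is easy to demonstrate: composing two weight matrices with no nonlinearity between them is equivalent to a single matrix. A minimal NumPy sketch (matrix sizes are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation function in between.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Applying the layers one after the other...
two_layer_out = W2 @ (W1 @ x)

# ...is identical to a single linear layer with weights W2 @ W1,
# so extra depth adds no expressive power without a nonlinearity.
one_layer_out = (W2 @ W1) @ x

print(np.allclose(two_layer_out, one_layer_out))  # True
```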

The activation function enables the network to adjust its weights and biases to minimize errors, improving the accuracy of the learning process. Common activation functions include ReLU, Sigmoid, and TanH. Choosing the right one can significantly impact the model's performance, including its accuracy, its ability to generalize to new data, and its convergence rate.
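
For reference, here is a minimal NumPy sketch of the three functions just named (the test values are illustrative):

```python
import numpy as np

def relu(x):
    # max(0, x): passes positive values through, zeroes out the rest.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes inputs into (0, 1); often used for binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1); zero-centered, unlike sigmoid.
    return np.tanh(x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))     # [0.  0.  0.  0.5 2. ]
print(sigmoid(x))  # values strictly between 0 and 1
print(tanh(x))     # values strictly between -1 and 1
```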

For example, in one of my capstone projects I ran a CNN model on chest X-ray data, using the Adam and SGD (Stochastic Gradient Descent) optimizers. For the activation function, I tried ReLU, TanH, and Sigmoid. The combination of ReLU and Sigmoid activations with the Adam optimizer proved to be the best.
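
The capstone model itself isn't shown here, but a hypothetical Keras sketch of that combination might look like the following. The input shape, layer sizes, and binary (e.g., pneumonia vs. normal) framing are all assumptions for illustration, not the actual capstone configuration:

```python
import tensorflow as tf

# Hypothetical sketch of the combination described above:
# ReLU activations in the hidden layers, a Sigmoid output,
# and the Adam optimizer. All sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 1)),              # grayscale X-ray
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # binary output
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```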

The ReLU activation function introduces nonlinearity into a neural network, helps mitigate the vanishing gradient problem during training, and enables the network to learn more complex relationships in the data.
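
The vanishing-gradient point can be checked directly: Sigmoid's derivative never exceeds 0.25 and decays toward zero for large inputs, while ReLU's derivative is exactly 1 for any positive input. A small NumPy sketch (sample inputs chosen for illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])

# Sigmoid's derivative, sigma(x) * (1 - sigma(x)), peaks at 0.25 and
# shrinks toward 0 for large |x|, so gradients vanish when multiplied
# through many layers.
sig_grad = sigmoid(x) * (1.0 - sigmoid(x))

# ReLU's derivative is exactly 1 for any positive input, so the
# gradient passes through active units undiminished.
relu_grad = (x > 0).astype(float)

print(sig_grad)   # never exceeds 0.25
print(relu_grad)  # [0. 0. 0. 1. 1.]
```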


One of the best activation function videos on YouTube:

https://lnkd.in/gF5x9gKD

ReLU Activation Function explained:

https://lnkd.in/gGtH6dD8

#ActivationFunction #ReLU #TanH #Sigmoid #SoftMax #StepFunction
