List of Activation Functions in Neural Networks and Their Applications


Activation functions play a crucial role in neural networks by introducing non-linearity, enabling the network to learn complex patterns. Below is a categorized list of activation functions and their recommended use cases.


1. Non-Linear Activation Functions

A) Sigmoid-Based Functions

  1. Sigmoid (Logistic) Function
  2. Tanh (Hyperbolic Tangent) Function
  3. Softmax Function
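
To make these concrete, here is a minimal NumPy sketch of the three sigmoid-based functions above (the function names and test values are my own, purely for illustration):

```python
import numpy as np

def sigmoid(x):
    # Logistic function: squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: zero-centred, squashes inputs into (-1, 1)
    return np.tanh(x)

def softmax(x):
    # Turns a vector of scores into probabilities that sum to 1
    # (subtracting the max improves numerical stability)
    z = x - np.max(x)
    e = np.exp(z)
    return e / np.sum(e)

print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # ~[0.119, 0.5, 0.881]
print(softmax(np.array([1.0, 2.0, 3.0])))   # ~[0.09, 0.24, 0.67]
```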


B) ReLU-Based Functions (Most Common in Deep Learning)

  1. ReLU (Rectified Linear Unit) Function
  2. Leaky ReLU
  3. Parametric ReLU (PReLU)
  4. Exponential Linear Unit (ELU)
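
The differences between these variants are easiest to see side by side. Below is a small NumPy sketch; the alpha values are common illustrative defaults, not prescribed ones:

```python
import numpy as np

def relu(x):
    # Passes positive inputs through, outputs 0 otherwise
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Keeps a small fixed slope for negative inputs to avoid "dead" units
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # Same shape as Leaky ReLU, but alpha is learned during training
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Smoothly saturates toward -alpha for large negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.     0.     0.     1.5  ]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5  ]
print(elu(x))         # [-0.865 -0.393  0.     1.5  ]
```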


C) Advanced Activation Functions

  1. Swish (Self-Gated Activation by Google Brain)
  2. Mish
  3. Softplus

  • Pros: Smooth approximation of ReLU.
  • Cons: Slower computation compared to ReLU.
  • Use Case: Alternative to ReLU when smooth activation is required.
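
A minimal sketch of these smooth activations, assuming NumPy and the standard definitions (Swish: x · sigmoid(βx), Mish: x · tanh(softplus(x))):

```python
import numpy as np

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x), always positive
    return np.log1p(np.exp(x))

def swish(x, beta=1.0):
    # Self-gated: x * sigmoid(beta * x); beta = 1 gives the SiLU variant
    return x / (1.0 + np.exp(-beta * x))

def mish(x):
    # x * tanh(softplus(x)); smooth and non-monotonic, like Swish
    return x * np.tanh(softplus(x))

x = np.array([-3.0, 0.0, 3.0])
print(softplus(x))  # ~[0.049, 0.693, 3.049]
print(swish(x))     # ~[-0.142, 0.0,   2.858]
print(mish(x))      # ~[-0.146, 0.0,   2.987]
```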


2. Linear Activation Functions

  1. Linear (Identity) Activation f(x) = x

  • Pros: No transformation; allows unrestricted output.
  • Cons: Introduces no non-linearity; a stack of purely linear layers collapses to a single linear transformation (demonstrated in the sketch below).
  • Use Case: Regression tasks (output layer).
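
The following NumPy sketch (weight shapes chosen arbitrarily for illustration) shows both the identity activation and why linear-only networks are limited:

```python
import numpy as np

def linear(x):
    # Identity activation: f(x) = x
    return x

# Two stacked linear layers are equivalent to one linear layer,
# which is why non-linear activations are needed in hidden layers.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

two_layer = W2 @ linear(W1 @ x)  # "deep" but purely linear
one_layer = (W2 @ W1) @ x        # equivalent single layer
print(np.allclose(two_layer, one_layer))  # True
```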


Comparison Table

  Function    | Output Range        | Typical Use Case
  Sigmoid     | (0, 1)              | Binary classification (output layer)
  Tanh        | (-1, 1)             | Hidden layers where zero-centred outputs help
  Softmax     | (0, 1), sums to 1   | Multi-class classification (output layer)
  ReLU        | [0, ∞)              | Default for hidden layers
  Leaky ReLU  | (-∞, ∞)             | Hidden layers prone to "dying ReLU"
  PReLU       | (-∞, ∞)             | Like Leaky ReLU, with a learned negative slope
  ELU         | (-α, ∞)             | Hidden layers needing smooth negative saturation
  Swish       | ≈ (-0.28, ∞)        | Deep networks, as a ReLU alternative
  Mish        | ≈ (-0.31, ∞)        | Deep networks, as a ReLU alternative
  Softplus    | (0, ∞)              | When a smooth, strictly positive activation is needed
  Linear      | (-∞, ∞)             | Regression (output layer)


Final Recommendations

  • For hidden layers: Use ReLU (default), Leaky ReLU, or ELU.
  • For classification (output layer): Use Sigmoid for binary classification and Softmax for multi-class classification.
  • For continuous output (regression): Use Linear activation.
  • For deep learning optimizations: Consider Swish or Mish.
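
Assuming a TensorFlow/Keras setup (the layer sizes and class count below are illustrative, not taken from this article), these recommendations translate roughly into choices like the following:

```python
import tensorflow as tf

# Hidden layers: ReLU as the default choice
hidden = [
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
]

# Output layer depends on the task:
regression_head = tf.keras.layers.Dense(1, activation="linear")     # continuous output
binary_head = tf.keras.layers.Dense(1, activation="sigmoid")        # binary classification
multiclass_head = tf.keras.layers.Dense(10, activation="softmax")   # e.g. 10 classes

# Example: a small regression model
model = tf.keras.Sequential(hidden + [regression_head])
model.compile(optimizer="adam", loss="mse")
```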


For consulting or to purchase machine-learning-enabled automated products, contact us at [email protected] or on WhatsApp at +91-7993651356

#mathnalanalytics #analytics #analysis #supplychain #supplychainanalysis #supplychainconsulting #supplychainoptimization #supplychaintech #inventoryoptimization

