2a. Which activation function to use?

We mainly have the following activation functions:

  1. Sigmoid
  2. Softmax
  3. Tanh
  4. ReLU
  5. Linear activation
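The five activations above can be sketched in a few lines of NumPy. These are illustrative definitions written for this article, not taken from any particular library:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); common for binary-classification outputs.
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Turns a vector of scores into a probability distribution.
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

def tanh(x):
    # Squashes inputs into (-1, 1); zero-centered, unlike sigmoid.
    return np.tanh(x)

def relu(x):
    # Passes positive values through unchanged and zeroes out negatives.
    return np.maximum(0.0, x)

def linear(x):
    # Identity function; typical for regression outputs.
    return x
```

For example, `relu(np.array([-2.0, 3.0]))` gives `[0.0, 3.0]`, and `softmax` of any score vector sums to 1.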

Hidden Layer Activation Function

ReLU is the usual default for hidden layers; tanh and sigmoid are older alternatives that can slow training in deep networks because of vanishing gradients.

Output Layer Activation Function

The output activation follows the task: sigmoid for binary classification, softmax for multi-class classification, and linear activation for regression.
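Putting the hidden- and output-layer choices together, here is a toy forward pass with a ReLU hidden layer and a softmax output. The dimensions and random weights are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy shapes, chosen only for this sketch: 4 input features,
# 8 hidden units, 3 output classes.
x = rng.normal(size=4)
W1 = rng.normal(size=(8, 4))   # hidden-layer weights
W2 = rng.normal(size=(3, 8))   # output-layer weights

hidden = np.maximum(0.0, W1 @ x)   # ReLU in the hidden layer

scores = W2 @ hidden
probs = np.exp(scores - scores.max())
probs /= probs.sum()               # softmax in the output layer

# probs is now a valid probability distribution over the 3 classes.
```

Swapping the final softmax for a sigmoid (binary classification) or leaving the scores as-is (regression) changes only the last two lines.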
