Tanh Activation Function
The tanh activation function is very similar to the sigmoid activation function, but it returns values in the range (-1, 1) and is zero-centered: tanh(x) = (e^x - e^-x) / (e^x + e^-x).
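A minimal sketch of the function from its definition, using only the standard library (note that directly exponentiating very large x would overflow; for production code, math.tanh or a library implementation is the safe choice):

```python
import math

def tanh(x):
    # tanh(x) = (e^x - e^-x) / (e^x + e^-x)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

# Zero-centered: tanh(0) = 0, and outputs stay strictly inside (-1, 1)
print(tanh(0.0))   # 0.0
print(tanh(2.0))   # ~0.964
print(tanh(-2.0))  # ~-0.964
```

Note the symmetry tanh(-x) = -tanh(x), which is what makes the output zero-centered, unlike sigmoid, whose outputs always lie in (0, 1).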


Advantages:
  • Because its output is zero-centered, tanh is a common choice for hidden layers.
  • With sigmoid, every output is positive, so the gradients of all weights feeding into a neuron share the same sign, which leads to inefficient zig-zag updates; tanh's zero-centered output avoids this problem.
Disadvantages:
  • Like sigmoid, tanh saturates for large |x|: its gradient approaches zero, which can slow or stall learning (the vanishing gradient problem).
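The saturation problem can be seen directly from the derivative, d/dx tanh(x) = 1 - tanh(x)^2, which collapses toward zero as |x| grows. A small illustration (the threshold values below are only for demonstration):

```python
import math

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2
    t = math.tanh(x)
    return 1.0 - t * t

# Gradient is 1 at the origin but vanishes quickly away from it
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x = {x:5.1f}  grad = {tanh_grad(x):.2e}")
```

At x = 0 the gradient is exactly 1, but by x = 5 it has already dropped below 0.001, so neurons driven into this region learn extremely slowly.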
