Activation Functions

What is an Activation Function?

  • Activation functions are mathematical equations that determine the output of a neural network. An activation function is applied to every neuron and decides whether that neuron should be activated or deactivated.

  • For example, if hot water is poured on your left hand, you suddenly pull your left hand away, but not your right one. This is because only the neurons for your left hand are activated, while the rest of the neurons stay deactivated.

  • These functions also normalize the output of each neuron, typically to a range between 0 and 1 or between -1 and 1, as the sketch below shows.
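
As a minimal sketch in plain NumPy (the variable names are just for illustration), sigmoid squashes arbitrary neuron outputs into (0, 1), while tanh maps them into (-1, 1):

```python
import numpy as np

def sigmoid(x):
    """Squash any real input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Pre-activations (raw neuron outputs) can be any real number.
z = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])

print(sigmoid(z))   # ~[0.018 0.269 0.5 0.731 0.982] -> all in (0, 1)
print(np.tanh(z))   # ~[-0.999 -0.762 0. 0.762 0.999] -> all in (-1, 1)
```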

Why Activation Function?

  • We provide a set of training samples to the computer, and during training the computer adjusts the weights and bias values so as to minimize the error in the output.
  • While the network is training on image samples, we would like it to change the weights and bias values gradually, so that fewer and fewer images are recognized wrongly. But without the right activation, the output does not change little by little; a small change in a weight can make the output jump abruptly, with no continuous change.
  • So, to make the output change continuously, we need a continuous function whose derivative we can compute. That function is the activation function; the sketch below makes the contrast concrete.
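
As a small illustration (plain NumPy; the helper names are just for this sketch), compare a hard step function, whose derivative is zero almost everywhere and therefore gives gradient descent no signal, with the smooth, everywhere-differentiable sigmoid:

```python
import numpy as np

def step(z):
    # Hard threshold: output jumps from 0 to 1, and the derivative is 0
    # almost everywhere, so weight updates get no direction to follow.
    return (z > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    # Smooth everywhere: sigma'(z) = sigma(z) * (1 - sigma(z)), so a small
    # change in the weights produces a small, predictable change in output.
    s = sigmoid(z)
    return s * (1.0 - s)

z = np.linspace(-2, 2, 5)       # [-2. -1.  0.  1.  2.]
print(step(z))                  # [0. 0. 0. 1. 1.] -> abrupt jump, no gradient
print(sigmoid_grad(z))          # nonzero everywhere -> usable training signal
```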

Types of Activation Functions:

  1. Sigmoid or Logistic Activation Function
  2. tanh Activation Function
  3. ReLU Activation Function
  4. Maxout Activation
  5. SoftMax Activation
  6. SoftPlus Activation
  7. SoftSign Activation
  8. ELU Activation
  9. Exponential Activation
  10. SELU Activation
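
For reference, here is a hedged sketch of most of these functions in plain NumPy, written directly from their standard formulas rather than from any particular framework's API. Maxout is omitted because it takes the maximum over several learned linear pieces rather than applying a fixed formula; the SELU constants are the ones published by Klambauer et al. (2017):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))          # squashes into (0, 1)

def tanh(x):
    return np.tanh(x)                         # squashes into (-1, 1)

def relu(x):
    return np.maximum(0.0, x)                 # zero for negatives, identity otherwise

def softmax(x):
    e = np.exp(x - np.max(x))                 # subtract max for numerical stability
    return e / e.sum()                        # outputs are probabilities summing to 1

def softplus(x):
    return np.logaddexp(0.0, x)               # log(1 + e^x), a smooth ReLU

def softsign(x):
    return x / (1.0 + np.abs(x))

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * np.expm1(x))

def exponential(x):
    return np.exp(x)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Constants from the SELU paper (Klambauer et al., 2017).
    return scale * np.where(x > 0, x, alpha * np.expm1(x))

z = np.array([-2.0, 0.0, 3.0])
print(relu(z))      # [0. 0. 3.]
print(softmax(z))   # probabilities summing to 1
```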
