Activation Function

An activation function (for example, ReLU or sigmoid) allows a network to learn complex decision boundaries: it takes the weighted sum of all inputs from the previous layer, applies a (typically nonlinear) transformation, and passes the resulting output value to the next layer.

Activation functions are applied at each node of a layer to determine to what extent the signal from the previous layer's nodes is transmitted to the nodes of the next layer.
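As a minimal sketch of this mechanism in NumPy (the inputs, weights, and bias below are hypothetical values, not from the original text), a single node computes the weighted sum of its inputs and then applies an activation function before passing the result on:

```python
import numpy as np

def relu(z):
    # ReLU: pass positive values through, clamp negatives to zero
    return np.maximum(0.0, z)

# Hypothetical inputs from the previous layer, plus one node's weights and bias
inputs = np.array([0.5, -1.2, 3.0])
weights = np.array([0.4, 0.1, -0.6])
bias = 0.2

z = np.dot(weights, inputs) + bias  # weighted sum of all inputs
output = relu(z)                    # activation applied before passing to the next layer
print(output)
```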

Some commonly used activation functions (sketched in code after the list) are:

  1. ReLU
  2. Sigmoid
  3. tanh
  4. Softmax
  5. Linear
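A minimal sketch of each listed activation in NumPy: softmax operates on a whole vector of scores, while the others are applied elementwise. The max-subtraction in softmax is a standard numerical-stability trick, an implementation detail rather than part of the definition above.

```python
import numpy as np

def relu(z):
    # Zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes inputs into the range (-1, 1)
    return np.tanh(z)

def softmax(z):
    # Converts a vector of scores into probabilities that sum to 1;
    # subtracting the max avoids overflow in np.exp
    e = np.exp(z - np.max(z))
    return e / e.sum()

def linear(z):
    # Identity: no nonlinearity applied
    return z

z = np.array([-2.0, 0.0, 2.0])
for fn in (relu, sigmoid, tanh, softmax, linear):
    print(fn.__name__, fn(z))
```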