Activation Functions in Neural Networks

When we create a Neural Network, one of the steps is to pass the data through an Activation Function before returning the output.

In short, when we create a Neural Network, the data flows through a series of steps:

Input Data -> Hidden Layer -> Activation Function -> Output
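
The sketch below walks through this flow once, assuming a single hidden layer with illustrative weights and ReLU as the activation; the specific numbers and layer sizes are made up for demonstration.

```python
# A minimal NumPy sketch of the flow above: input -> hidden layer -> activation -> output.
# Weights, biases, and the choice of ReLU here are illustrative, not a real trained network.
import numpy as np

def relu(z):
    return np.maximum(0, z)           # activation applied element-wise

x = np.array([0.5, -1.2, 3.0])        # input data (3 features)
W = np.random.randn(4, 3) * 0.1       # hidden-layer weights (4 neurons, 3 inputs)
b = np.zeros(4)                       # hidden-layer biases

z = W @ x + b                         # hidden layer: linear transformation
a = relu(z)                           # activation function
output = a.sum()                      # stand-in for a simple output layer
print(output)
```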

The Activation Function decides whether a neuron in the Neural Network should be activated or not, and it introduces non-linearity, depending on which type of Activation Function we use.
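
As a quick illustration of this "activated or not" idea, ReLU zeroes out neurons whose pre-activation is negative and passes positive values through unchanged, which is exactly what makes the layer non-linear. The values below are arbitrary examples.

```python
# Illustrative only: ReLU "deactivates" neurons with negative pre-activations
# (their output becomes 0) and passes positive ones through unchanged.
import numpy as np

pre_activations = np.array([-2.0, -0.1, 0.0, 0.7, 3.5])
activated = np.maximum(0, pre_activations)   # ReLU
print(activated)                             # [0.  0.  0.  0.7 3.5]
```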

There are several types of Activation Functions. These are some of the common ones (sketched in code after the list):

  • Sigmoid Function
  • Rectified Linear Unit (ReLU)
  • Softmax Function
  • TanH Function
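
Below is a short NumPy sketch of the four functions listed above, written from their standard definitions for clarity; in practice, deep learning frameworks ship tested implementations of each.

```python
# Sketch implementations of the listed activation functions (standard definitions).
import numpy as np

def sigmoid(z):
    # Squashes each value into (0, 1): 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Outputs z for positive inputs, 0 otherwise
    return np.maximum(0, z)

def softmax(z):
    # Converts a vector into a probability distribution that sums to 1;
    # subtracting the max before exponentiating is a common numerical-stability step
    e = np.exp(z - np.max(z))
    return e / e.sum()

def tanh(z):
    # Squashes each value into (-1, 1)
    return np.tanh(z)

z = np.array([-1.0, 0.0, 2.0])
print(sigmoid(z), relu(z), softmax(z), tanh(z), sep="\n")
```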
