Using Activation Functions in Deep Learning Models

A deep learning model, in its simplest form, is a stack of perceptron layers connected in tandem. Without activation functions, such a stack is just a sequence of matrix multiplications with limited power, no matter how many layers it has, because the composition of linear maps is itself linear. Activation functions are the reason a neural network can approximate a wide variety of nonlinear functions. PyTorch provides many activation functions for use in your deep learning models (a short sketch after the list below shows a few of them in action). In this post, you will see how the choice of activation function can impact the model. Specifically:

  • What the common activation functions are
  • The nature of activation functions
  • How different activation functions impact the rate of learning
  • How the choice of activation function can mitigate the vanishing gradient problem (illustrated in the second sketch below)
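Before getting into those details, here is a minimal sketch, assuming a standard PyTorch installation, that applies three of the most common activations to a small tensor. The input values and tensor size are arbitrary choices for illustration:

```python
import torch
import torch.nn.functional as F

# Arbitrary sample inputs spanning negative and positive values
x = torch.linspace(-3.0, 3.0, steps=7)

print("input:  ", x)
print("ReLU:   ", F.relu(x))         # max(0, x): negatives become zero
print("Sigmoid:", torch.sigmoid(x))  # squashes every value into (0, 1)
print("Tanh:   ", torch.tanh(x))     # squashes every value into (-1, 1)
```

Running this makes the nonlinearity visible: ReLU clips the negative half of the range while sigmoid and tanh saturate at both ends, and it is exactly this bending of the input that lets stacked layers express more than a single matrix multiplication.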

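As a preview of the vanishing gradient discussion, here is a hedged sketch of the effect. It builds two deep stacks of small linear layers, one interleaved with sigmoid activations and one with ReLU, and compares the average gradient magnitude that reaches the input after backpropagation. The depth (20), width (16), and random seed are arbitrary illustrative choices, not values taken from this post:

```python
import torch
import torch.nn as nn

def input_gradient_magnitude(activation_cls):
    """Mean absolute gradient at the input of a deep stack of
    Linear layers, each followed by the given activation."""
    torch.manual_seed(0)  # same weight initialization for both runs
    layers = []
    for _ in range(20):
        layers += [nn.Linear(16, 16), activation_cls()]
    net = nn.Sequential(*layers)

    x = torch.randn(1, 16, requires_grad=True)
    net(x).sum().backward()  # backpropagate through the whole stack
    return x.grad.abs().mean().item()

print("sigmoid:", input_gradient_magnitude(nn.Sigmoid))  # typically vanishingly small
print("relu:   ", input_gradient_magnitude(nn.ReLU))     # typically orders of magnitude larger
```

Because the sigmoid's derivative never exceeds 0.25, every layer shrinks the backpropagated signal, so the sigmoid stack usually reports a far smaller input gradient than the ReLU stack; this is the vanishing gradient problem that the choice of activation function can mitigate.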