Using Activation Functions in Neural Networks
Activation functions play an integral role in neural networks by introducing nonlinearity. This nonlinearity allows neural networks to learn complex representations and functions of their inputs, which would not be possible with a simple linear regression model: without a nonlinearity between them, any stack of linear layers collapses into a single linear transformation.
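You can verify this collapse in a few lines of NumPy. The sketch below uses arbitrary, randomly chosen weight matrices and shapes purely for illustration; it shows that two stacked linear layers are equivalent to one linear layer, and that inserting a ReLU between them breaks that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))    # a small batch of 4 inputs with 3 features
W1 = rng.normal(size=(3, 5))   # weights of a first "layer" (arbitrary shapes)
W2 = rng.normal(size=(5, 2))   # weights of a second "layer"

# Two stacked linear layers are equivalent to a single linear layer
# whose weight matrix is the product W1 @ W2.
two_linear = (x @ W1) @ W2
one_linear = x @ (W1 @ W2)
print(np.allclose(two_linear, one_linear))  # True: no extra expressive power

# Applying a nonlinearity (ReLU here) between the layers breaks that
# equivalence, which is what lets deeper networks model complex functions.
with_relu = np.maximum(x @ W1, 0) @ W2
print(np.allclose(with_relu, one_linear))   # False
```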
Many different nonlinear activation functions have been proposed throughout the history of neural networks. In this post, you will explore three popular ones: sigmoid, tanh, and ReLU.
After reading this post, you will learn:
- Why nonlinearity is important in a neural network
- How different activation functions can contribute to the vanishing gradient problem
- Sigmoid, tanh, and ReLU activation functions
- How to use different activation functions in your TensorFlow model (see the sketch below for a quick preview)
Let’s get started.
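As a quick preview of where we are heading, here is a minimal sketch of how an activation function is typically attached to a layer in a Keras model. The layer widths and input size are arbitrary placeholders, not values tied to any specific dataset:

```python
import tensorflow as tf

# A small fully connected network: each Dense layer accepts an
# `activation` argument naming the nonlinearity it applies.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),                       # placeholder: 8 input features
    tf.keras.layers.Dense(16, activation="sigmoid"),
    tf.keras.layers.Dense(16, activation="tanh"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),                         # linear output layer
])
model.summary()
```

The rest of this post looks at each of these three activations in turn.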