Application of differentiation in neural networks

Differential calculus is an important tool in machine learning algorithms. In neural networks in particular, the gradient descent algorithm depends on the gradient, a quantity computed by differentiation.
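To make the connection concrete, here is a minimal sketch (not taken from the tutorial) of gradient descent on a one-variable loss, where the derivative obtained by differentiation drives each update. The loss function and learning rate are illustrative assumptions.

```python
# Minimal gradient descent sketch: minimise f(w) = (w - 3)^2.
# Differentiation gives f'(w) = 2 * (w - 3), which gradient descent
# uses to move w downhill at every step.

def f(w):
    return (w - 3.0) ** 2

def df(w):
    # Derivative of f, obtained analytically by differentiation.
    return 2.0 * (w - 3.0)

w = 0.0              # initial guess (assumed)
learning_rate = 0.1  # step size (assumed)

for _ in range(50):
    w = w - learning_rate * df(w)  # gradient descent update rule

print(w)  # approaches the minimiser w = 3
```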

In this tutorial, we will see how the back-propagation technique is used to find the gradients in neural networks.
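As a preview, the sketch below (an assumed toy example, not the tutorial's network) shows the idea behind back-propagation on a single sigmoid unit: the forward pass computes the loss, and the backward pass applies the chain rule step by step to obtain the gradients with respect to the weight and bias.

```python
import math

# Toy example (assumed values): y = sigmoid(w * x + b) with squared-error loss.
# Back-propagation is the chain rule applied from the loss back to w and b.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 0.5, 1.0   # one training example (assumed)
w, b = 0.2, 0.1        # parameters (assumed initial values)

# Forward pass
z = w * x + b
y = sigmoid(z)
loss = 0.5 * (y - target) ** 2

# Backward pass: chain rule applied one factor at a time
dloss_dy = y - target          # d(loss)/dy
dy_dz = y * (1.0 - y)          # derivative of the sigmoid
dloss_dz = dloss_dy * dy_dz    # d(loss)/dz
dloss_dw = dloss_dz * x        # d(loss)/dw
dloss_db = dloss_dz * 1.0      # d(loss)/db

print(loss, dloss_dw, dloss_db)
```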

After completing this tutorial, you will know:

  • What a total differential and a total derivative are
  • How to compute the total derivatives in neural networks
  • How back-propagation helps in computing the total derivatives

Let’s get started.

Application of differentiation in neural networks. Photo by Freeman Zhou, some rights reserved.

Tutorial overview

This

