Applications of Derivatives

The derivative defines the rate at which one variable changes with respect to another. It is an important concept that proves extremely useful in many applications: in everyday life, the derivative can tell you the speed at which you are driving, or help you predict fluctuations in the stock market; in machine learning, derivatives are important for function optimization. This tutorial will explore different applications of derivatives, starting with the more familiar ones before moving to machine learning. We will […]
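
As a concrete illustration of the derivative as a rate of change, the sketch below estimates driving speed from a position function using a finite difference; the quadratic position function and step size are hypothetical choices, not taken from the tutorial.

```python
# A minimal sketch of the derivative as a rate of change: estimating speed
# from a position function with a central-difference approximation.
# The position function below is a hypothetical example (constant acceleration).

def position(t):
    """Position (in metres) of a car at time t (in seconds)."""
    return 3.0 * t ** 2

def derivative(f, t, h=1e-5):
    """Central-difference estimate of f'(t)."""
    return (f(t + h) - f(t - h)) / (2.0 * h)

# Instantaneous speed at t = 2 s; for 3t^2 the exact derivative is 6t = 12 m/s.
print(derivative(position, 2.0))  # ~12.0
```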

Read more

A Gentle Introduction to Multivariate Calculus

It is often desirable to study functions that depend on many variables.  Multivariate calculus provides us with the tools to do so by extending the concepts that we find in calculus, such as the computation of the rate of change, to multiple variables. It plays an essential role in the process of training a neural network, where the gradient is used extensively to update the model parameters.  In this tutorial, you will discover a gentle introduction to multivariate calculus.  After […]
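
The excerpt's point about the gradient updating model parameters can be made concrete with a minimal sketch; the quadratic loss, starting parameters, and learning rate below are assumed for illustration.

```python
import numpy as np

# A minimal sketch of how the gradient of a multivariate function drives a
# parameter update, as in neural network training. The loss and learning
# rate here are hypothetical illustrations.

def loss(w):
    """Simple quadratic loss in two variables: L(w) = w0^2 + 2*w1^2."""
    return w[0] ** 2 + 2.0 * w[1] ** 2

def grad_loss(w):
    """Analytic gradient of the loss: (dL/dw0, dL/dw1)."""
    return np.array([2.0 * w[0], 4.0 * w[1]])

w = np.array([1.0, 1.0])   # initial parameters
lr = 0.1                   # learning rate (assumed)
w = w - lr * grad_loss(w)  # one gradient-based update step
print(w)                   # parameters move toward the minimum at (0, 0)
```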

Read more

Differential and Integral Calculus – Differentiate with Respect to Anything

Integral calculus was one of the greatest discoveries of Newton and Leibniz. Their independent work led to the proof of the fundamental theorem of calculus, which links integrals to derivatives, and to the recognition of its importance. With the discovery of integrals, areas and volumes could thereafter be studied. Integral calculus is the second half of the calculus journey that we will be exploring. In this tutorial, you will discover the relationship between differential and integral calculus. After completing this tutorial, you […]
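
A quick way to see the link the fundamental theorem describes is to integrate a function symbolically and then differentiate the result; the sketch below uses SymPy, and the choice of cos(x) is an arbitrary illustration.

```python
import sympy as sp

# A minimal sketch of the fundamental theorem of calculus: differentiating
# the integral of a function recovers the function. The choice of f is an
# arbitrary illustration.

x = sp.Symbol('x')
f = sp.cos(x)

F = sp.integrate(f, x)   # an antiderivative of f
print(sp.diff(F, x))     # cos(x): differentiation undoes integration

# The definite integral gives the area under the curve: F(b) - F(a).
print(sp.integrate(f, (x, 0, sp.pi / 2)))  # 1
```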

Read more

A Gentle Introduction To Vector Valued Functions

Vector valued functions are often encountered in machine learning, computer graphics, and computer vision algorithms. They are particularly useful for defining the parametric equations of space curves. It is important to gain a basic understanding of vector valued functions to grasp more complex concepts. In this tutorial, you will discover what vector valued functions are, how to define them, and some examples. After completing this tutorial, you will know: the definition of vector valued functions; derivatives of vector valued functions. Let’s […]
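
As a sketch of a vector valued function and its derivative, the helix below is a standard parametric space curve; the specific curve r(t) = (cos t, sin t, t) is an illustrative choice, not necessarily the one used in the tutorial.

```python
import sympy as sp

# A minimal sketch of a vector valued function: a helix as a parametric
# space curve r(t) = (cos t, sin t, t), whose derivative gives the tangent
# vector. The specific curve is an illustrative choice.

t = sp.Symbol('t')
r = sp.Matrix([sp.cos(t), sp.sin(t), t])  # vector valued function r(t)

r_prime = r.diff(t)  # componentwise derivative: the tangent vector
print(r_prime)       # Matrix([[-sin(t)], [cos(t)], [1]])

# Tangent vector at t = 0.
print(r_prime.subs(t, 0))  # Matrix([[0], [1], [1]])
```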

Read more

A Gentle Introduction To Partial Derivatives and Gradient Vectors

Partial derivatives and gradient vectors are used very often in machine learning algorithms for finding the minimum or maximum of a function. Gradient vectors are used in the training of neural networks, logistic regression, and many other classification and regression problems. In this tutorial, you will discover partial derivatives and the gradient vector. After completing this tutorial, you will know: functions of several variables; level sets, contours and graphs of a function of two variables; partial derivatives of a function […]
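
A minimal sketch of partial derivatives and the gradient vector, assuming the illustrative function f(x, y) = x^2 + y^2, whose level sets are circles:

```python
import sympy as sp

# A minimal sketch of partial derivatives and the gradient vector for a
# function of two variables. f(x, y) = x^2 + y^2 is an illustrative choice;
# its gradient points away from the minimum at the origin.

x, y = sp.symbols('x y')
f = x ** 2 + y ** 2

fx = sp.diff(f, x)  # partial derivative with respect to x: 2*x
fy = sp.diff(f, y)  # partial derivative with respect to y: 2*y
grad = sp.Matrix([fx, fy])

print(grad)                     # Matrix([[2*x], [2*y]])
print(grad.subs({x: 1, y: 2}))  # gradient vector at the point (1, 2)
```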

Read more

A Gentle Introduction To Gradient Descent Procedure

The gradient descent procedure is a method of paramount importance in machine learning. It is often used for minimizing error functions in classification and regression problems. It is also used in training neural networks and deep learning architectures. In this tutorial, you will discover the gradient descent procedure. After completing this tutorial, you will know: the gradient descent method; the importance of gradient descent in machine learning. Let’s get started.
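
The procedure itself reduces to repeatedly stepping against the gradient. Below is a minimal sketch on a one-variable function; the function, learning rate, start point, and iteration count are all assumed values for illustration.

```python
# A minimal sketch of the gradient descent procedure on the one-variable
# function f(x) = (x - 3)^2. The learning rate, start point, and iteration
# count are hypothetical choices, not values from the tutorial.

def grad(x):
    """Derivative of f(x) = (x - 3)^2."""
    return 2.0 * (x - 3.0)

x = 0.0   # starting point (assumed)
lr = 0.1  # learning rate (assumed)
for _ in range(50):
    x = x - lr * grad(x)  # step against the gradient

print(x)  # converges toward the minimizer x = 3
```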

Read more

Higher-Order Derivatives

Higher-order derivatives can capture information about a function that first-order derivatives on their own cannot capture. First-order derivatives can capture important information, such as the rate of change, but on their own they cannot distinguish between local minima and maxima, where the rate of change is zero for both. Several optimization algorithms address this limitation by exploiting higher-order derivatives, such as Newton’s method, where the second-order derivatives are used to reach the local minimum of an […]
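
To see how a second-order derivative helps, the sketch below runs Newton's method on an assumed function f(x) = x^4 - 3x^2 + 2; the positive second derivative at the result confirms a local minimum.

```python
# A minimal sketch of Newton's method for optimization, which uses the
# second-order derivative to reach and confirm a local minimum.
# f(x) = x^4 - 3x^2 + 2 is an illustrative function, not from the article.

def f_prime(x):
    return 4.0 * x ** 3 - 6.0 * x  # first derivative

def f_double_prime(x):
    return 12.0 * x ** 2 - 6.0     # second derivative

x = 2.0  # starting point (assumed)
for _ in range(10):
    x = x - f_prime(x) / f_double_prime(x)  # Newton update

# x converges to sqrt(1.5); a positive second derivative => local minimum.
print(x, f_double_prime(x) > 0)
```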

Read more

A Gentle Introduction to the Jacobian

In the literature, the term Jacobian is often used interchangeably to refer to both the Jacobian matrix and its determinant. Both the matrix and the determinant have useful and important applications: in machine learning, the Jacobian matrix aggregates the partial derivatives that are necessary for backpropagation; the determinant is useful when changing between variables. In this tutorial, you will review a gentle introduction to the Jacobian. After completing this tutorial, you will know: The Jacobian matrix collects […]
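
A minimal sketch of both uses, computing the Jacobian matrix and its determinant for the standard polar-to-Cartesian change of variables (an illustrative choice, not necessarily the tutorial's example):

```python
import sympy as sp

# A minimal sketch of the Jacobian matrix and its determinant for the polar
# to Cartesian change of variables x = r*cos(theta), y = r*sin(theta).

r, theta = sp.symbols('r theta')
mapping = sp.Matrix([r * sp.cos(theta), r * sp.sin(theta)])  # (x, y)

J = mapping.jacobian([r, theta])  # matrix of partial derivatives
print(J)
print(sp.simplify(J.det()))       # determinant r: the area scaling factor
```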

Read more

A Gentle Introduction To Hessian Matrices

Hessian matrices belong to a class of mathematical structures that involve second-order derivatives. They are often used in machine learning and data science algorithms for optimizing a function of interest. In this tutorial, you will discover Hessian matrices, their corresponding discriminants, and their significance. All concepts are illustrated via an example. After completing this tutorial, you will know: Hessian matrices; discriminants computed via Hessian matrices; what information is contained in the discriminant. Let’s get started.
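
A minimal sketch of a Hessian matrix and its discriminant, assuming the illustrative function f(x, y) = x^2 + y^2, which has a single critical point at the origin:

```python
import sympy as sp

# A minimal sketch of a Hessian matrix and its discriminant (determinant)
# for a function of two variables. f(x, y) = x^2 + y^2 is an illustrative
# choice, not necessarily the tutorial's example.

x, y = sp.symbols('x y')
f = x ** 2 + y ** 2

H = sp.hessian(f, (x, y))  # matrix of second-order partial derivatives
print(H)                   # Matrix([[2, 0], [0, 2]])

disc = H.det()             # the discriminant
# A positive discriminant with positive f_xx indicates a local minimum.
print(disc, H[0, 0])       # 4, 2
```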

Read more

A Gentle Introduction to the Laplacian

The Laplace operator was first applied to the study of celestial mechanics, or the motion of objects in outer space, by Pierre-Simon de Laplace, and as such has been named after him. The Laplace operator has since been used to describe many different phenomena, from electric potentials, to the diffusion equation for heat and fluid flow, to quantum mechanics. It has also been recast in discrete space, where it has been used in applications related to image processing and […]
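
A minimal sketch of the discrete Laplace operator applied to an image-like array, using SciPy's ndimage.laplace; the tiny array below is a hypothetical stand-in for a grayscale image.

```python
import numpy as np
from scipy.ndimage import laplace

# A minimal sketch of the discrete Laplace operator on an image-like array,
# as used in image processing (e.g., edge detection). The array is a
# hypothetical stand-in for a grayscale image.

image = np.zeros((5, 5))
image[2, 2] = 1.0  # a single bright pixel

# scipy's laplace applies the standard second-difference stencil in 2-D,
# responding strongly where intensity changes abruptly.
print(laplace(image))
```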

Read more