Lagrange Multiplier Approach with Inequality Constraints

In a previous post, we introduced the method of Lagrange multipliers for finding the local minima or local maxima of a function subject to equality constraints. The same method can be extended to problems with inequality constraints. In this tutorial, you will discover how the method of Lagrange multipliers is applied to find the local minimum or maximum of a function when inequality constraints are present, optionally together with equality constraints. After completing this tutorial, you will know: how to find points […]

Read more
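To make the idea concrete, here is a minimal worked example of the kind of problem the tutorial covers (our illustration, not taken from the post): minimizing f(x, y) = x² + y² subject to the inequality constraint x + y ≥ 1, written as g(x, y) = 1 − x − y ≤ 0.

```latex
% Minimal sketch of the inequality-constrained case (our example, not the post's):
% minimize f(x, y) = x^2 + y^2 subject to g(x, y) = 1 - x - y <= 0.
\begin{align*}
L(x, y, \lambda) &= x^2 + y^2 + \lambda (1 - x - y) \\
2x - \lambda &= 0, \qquad 2y - \lambda = 0 \\
\lambda (1 - x - y) &= 0, \qquad \lambda \ge 0
\end{align*}
% Taking lambda > 0 forces x + y = 1, giving x = y = 1/2 and lambda = 1.
% The complementary slackness condition lambda * g = 0 is what distinguishes
% this from the plain equality-constrained method.
```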

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 1: The Separable Case)

This tutorial is designed for anyone looking for a deeper understanding of how Lagrange multipliers are used in building the model behind support vector machines (SVMs). SVMs were initially designed to solve binary classification problems and were later extended to regression and unsupervised learning. They have proved successful on many complex machine learning classification problems. In this tutorial, we’ll look at the simplest SVM, which assumes that the positive and negative examples can be completely separated via […]

Read more
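For orientation, this is the standard hard-margin optimization problem the separable case builds toward (a sketch of the textbook setup, not an excerpt from the tutorial), with training points xᵢ and labels yᵢ ∈ {−1, +1}:

```latex
% Hard-margin SVM: maximize the margin 2 / ||w|| by minimizing ||w||^2 / 2,
% while every example sits on the correct side of the margin.
\begin{align*}
\min_{\mathbf{w},\, b} \quad & \tfrac{1}{2} \lVert \mathbf{w} \rVert^2 \\
\text{subject to} \quad & y_i \left( \mathbf{w}^\top \mathbf{x}_i + b \right) \ge 1,
\quad i = 1, \dots, n
\end{align*}
% One Lagrange multiplier alpha_i >= 0 attaches to each constraint:
% L(w, b, alpha) = ||w||^2 / 2 - sum_i alpha_i [ y_i (w^T x_i + b) - 1 ].
```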

Application of Differentiation in Neural Networks

Differential calculus is an important tool in machine learning algorithms. In neural networks in particular, the gradient descent algorithm depends on the gradient, a quantity computed by differentiation. In this tutorial, we will see how the back-propagation technique is used to find the gradients in neural networks. After completing this tutorial, you will know: what a total differential and a total derivative are; how to compute the total derivatives in neural networks; and how back-propagation helps in computing the total derivatives […]

Read more
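As a hedged illustration of what those total derivatives look like in practice, here is a minimal NumPy sketch of back-propagation through a single hidden layer; the network shape, loss, and variable names are our own choices, not the tutorial's:

```python
import numpy as np

# Minimal back-propagation sketch (our illustration): one hidden tanh layer,
# squared-error loss, gradients assembled via the chain rule.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))        # 4 samples, 3 features
y = rng.normal(size=(4, 1))        # regression targets
W1 = rng.normal(size=(3, 5)); b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1)); b2 = np.zeros(1)

# Forward pass.
h = np.tanh(x @ W1 + b1)           # hidden activations
y_hat = h @ W2 + b2                # network output
loss = 0.5 * np.mean((y_hat - y) ** 2)

# Backward pass: each gradient is a total derivative built by the chain rule.
d_yhat = (y_hat - y) / len(x)      # dL/dy_hat
dW2 = h.T @ d_yhat                 # dL/dW2
db2 = d_yhat.sum(axis=0)           # dL/db2
d_h = d_yhat @ W2.T                # propagate back through the output layer
d_pre = d_h * (1 - h ** 2)         # tanh'(z) = 1 - tanh(z)^2
dW1 = x.T @ d_pre                  # dL/dW1
db1 = d_pre.sum(axis=0)            # dL/db1
# A gradient-descent step would then be, e.g., W2 -= lr * dW2.
```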

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 2: The Non-Separable Case)

This tutorial is an extension of Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 1: The Separable Case) and explains the non-separable case. In real-life problems, positive and negative training examples may not be completely separable by a linear decision boundary. This tutorial explains how a soft margin can be built that tolerates a certain amount of error. In this tutorial, we’ll cover the basics of a linear SVM. We won’t go into the details of non-linear […]

Read more
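For reference, this is the standard soft-margin problem the non-separable case leads to (a sketch of the usual setup, not an excerpt from the tutorial): slack variables ξᵢ absorb margin violations, and the hyperparameter C trades margin width against those violations.

```latex
% Soft-margin SVM: xi_i = 0 means example i respects the margin;
% xi_i > 0 measures how far it strays onto the wrong side.
\begin{align*}
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \quad
& \tfrac{1}{2} \lVert \mathbf{w} \rVert^2 + C \sum_{i=1}^{n} \xi_i \\
\text{subject to} \quad
& y_i \left( \mathbf{w}^\top \mathbf{x}_i + b \right) \ge 1 - \xi_i,
\qquad \xi_i \ge 0
\end{align*}
```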

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 3: Implementing An SVM From Scratch In Python)

The mathematics that powers a support vector machine (SVM) classifier is beautiful. It is important not only to learn the basic model of an SVM but also to know how you can implement the entire model from scratch. This is a continuation of our series of tutorials on SVMs. In Part 1 and Part 2 of this series, we discussed the mathematical model behind a linear SVM. In this tutorial, we’ll show how you can build a linear SVM classifier using the optimization routines […]

Read more
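To show the shape such an implementation can take, here is a compact from-scratch sketch that solves the SVM dual problem with scipy.optimize.minimize; the solver choice (SLSQP), the helper name fit_linear_svm, and the toy data are our assumptions and may differ from the routines the tutorial itself uses.

```python
import numpy as np
from scipy.optimize import minimize

# From-scratch linear SVM sketch via the dual problem (our assumptions:
# scipy's SLSQP solver; the tutorial's own routine may differ).
def fit_linear_svm(X, y, C=1.0):
    n = len(y)
    Q = (y[:, None] * X) @ (y[:, None] * X).T    # Q_ij = y_i y_j x_i . x_j

    def neg_dual(a):                             # minimize the negated dual
        return 0.5 * a @ Q @ a - a.sum()

    res = minimize(
        neg_dual,
        x0=np.zeros(n),
        jac=lambda a: Q @ a - np.ones(n),
        bounds=[(0, C)] * n,                     # box constraints 0 <= a_i <= C
        constraints={"type": "eq", "fun": lambda a: a @ y},  # sum_i a_i y_i = 0
        method="SLSQP",
    )
    a = res.x
    w = (a * y) @ X                              # w = sum_i a_i y_i x_i
    sv = (a > 1e-6) & (a < C - 1e-6)             # on-margin support vectors
    b = np.mean(y[sv] - X[sv] @ w)               # bias from y_i (w.x_i + b) = 1
    return w, b

# Toy usage on a separable set; labels must be in {-1, +1}.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = fit_linear_svm(X, y)
print(np.sign(X @ w + b))                        # expect [ 1.  1. -1. -1.]
```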

Calculus Books for Machine Learning

Knowledge of calculus is not required to get results and solve problems in machine learning or deep learning. However, knowing some calculus will help you in a number of ways, such as reading the mathematical notation in books and papers, understanding the terms used to describe model fitting, like “gradient,” and understanding the learning dynamics of models fit via optimization, such as neural networks. Calculus is a challenging topic as taught at the university level, but you […]

Read more