Calculus in Action: Neural Networks

An artificial neural network is a computational model that approximates a mapping between inputs and outputs. It is inspired by the structure of the human brain, in that it is similarly composed of a network of interconnected neurons that propagate information upon receiving stimuli from neighbouring neurons. Training a neural network employs the backpropagation and gradient descent algorithms in tandem. As we will see, both of these algorithms make extensive use of calculus. […]
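The excerpt stops short of code, but a minimal sketch of the underlying idea, a gradient computed via the chain rule driving a gradient-descent update, might look like the following. The single linear neuron and the toy data are illustrative assumptions, not the article's example.

```python
import numpy as np

# Toy data: learn y = 2x with a single linear neuron and squared-error loss.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x

w, lr = 0.0, 0.1                         # initial weight and learning rate
for _ in range(100):
    y_hat = w * x                        # forward pass
    grad = np.mean(2 * (y_hat - y) * x)  # dL/dw from the chain rule
    w -= lr * grad                       # gradient-descent update
print(round(w, 3))                       # converges to roughly 2.0
```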

Read more

A Gentle Introduction To Sigmoid Function

Whether you implement a neural network yourself or use a built-in library for neural network learning, it is of paramount importance to understand the significance of the sigmoid function. The sigmoid function is the key to understanding how a neural network learns complex problems. It has also served as a basis for discovering other activation functions that lead to efficient and good solutions for supervised learning in deep learning architectures. In this tutorial, you will discover the sigmoid function […]
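As a quick illustration (not the tutorial's own code), a sketch of the sigmoid and the derivative that backpropagation relies on could be:

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid, 1 / (1 + exp(-z)), squashing any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    """Derivative sigma'(z) = sigma(z) * (1 - sigma(z)), used in backpropagation."""
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid(0.0))       # 0.5
print(sigmoid_grad(0.0))  # 0.25, the maximum of the derivative
```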

Read more

Lagrange Multiplier Approach with Inequality Constraints

In a previous post, we introduced the method of Lagrange multipliers to find local minima or maxima of a function with equality constraints. The same method can be applied to problems with inequality constraints as well. In this tutorial, you will discover how the method of Lagrange multipliers is applied to find the local minimum or maximum of a function when inequality constraints are present, optionally together with equality constraints. After completing this tutorial, you will know: How to find points […]
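The tutorial works the conditions out analytically; as a rough numerical cross-check on a toy inequality-constrained problem of my own choosing (not the tutorial's example), SciPy's SLSQP solver can be used along these lines:

```python
from scipy.optimize import minimize

# Toy problem: minimize x^2 + y^2 subject to x + y >= 1.
# The KKT conditions place the constrained minimum at x = y = 0.5.
objective = lambda v: v[0] ** 2 + v[1] ** 2
constraint = {"type": "ineq", "fun": lambda v: v[0] + v[1] - 1.0}  # g(v) >= 0

result = minimize(objective, x0=[1.0, 1.0], constraints=[constraint], method="SLSQP")
print(result.x)  # approximately [0.5, 0.5]
```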

Read more

A Gentle Introduction to Particle Swarm Optimization

Particle swarm optimization (PSO) is one of the bio-inspired algorithms, and a simple one for searching for an optimal solution in the solution space. It differs from other optimization algorithms in that only the objective function is needed; it does not depend on the gradient or any other differential form of the objective. It also has very few hyperparameters. In this tutorial, you will learn the rationale of PSO and its algorithm with an […]
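A bare-bones sketch of the PSO update rule, with made-up swarm parameters and a toy objective rather than the tutorial's settings, might look like this:

```python
import numpy as np

# Minimize f(x, y) = x^2 + y^2 with a small swarm (illustrative parameters only).
rng = np.random.default_rng(0)
f = lambda p: np.sum(p ** 2, axis=1)

n, dim, iters = 20, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration coefficients
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), f(pos)      # personal bests
gbest = pbest[np.argmin(pbest_val)]        # global best

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = f(pos)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest)  # close to [0, 0]
```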

Read more

Training-validation-test split and cross-validation done right

One crucial step in machine learning is the choice of model. A suitable model with suitable hyperparameters is the key to a good prediction result. When we are faced with a choice between models, how should the decision be made? This is why we have cross-validation. In scikit-learn, there is a family of functions that help us do this. But quite often, we see cross-validation used improperly, or its results not interpreted correctly. In […]
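As a hedged sketch of the pattern the tutorial argues for (hold out a test set first, cross-validate only on the training portion), using a synthetic dataset rather than the tutorial's data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split, cross_val_score

X, y = make_classification(n_samples=500, random_state=42)

# Hold out a test set first, then cross-validate only on the training portion.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
scores = cross_val_score(LogisticRegression(max_iter=1000), X_train, y_train, cv=5)
print(scores.mean())                 # cross-validated estimate, used for model selection

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(model.score(X_test, y_test))   # final score on the untouched test set
```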

Read more

How to Learn Python for Machine Learning

Python has become the de facto lingua franca of machine learning. It is not a difficult language to learn, but if you are not particularly familiar with it, there are some tips that can help you learn faster or better. In this post, you will discover the right way to learn a programming language and how to get help. After reading this post, you will know: The right mentality to learn Python for use in machine learning […]

Read more

Optimization for Machine Learning Crash Course

Optimization for Machine Learning Crash Course. Find function optima with Python in 7 days. All machine learning models involve optimization. As practitioners, we optimize for the most suitable hyperparameters or the subset of features. The decision tree algorithm optimizes for the splits, and neural networks optimize for the weights. Most likely, we use computational algorithms to optimize. There are many ways to optimize numerically. SciPy has a number of handy functions for this. We can also try to implement the optimization algorithms […]
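For instance, one of SciPy's optimization helpers, scipy.optimize.minimize_scalar, can be tried on a toy function like this (the function itself is an assumption chosen for illustration):

```python
from scipy.optimize import minimize_scalar

# Find the minimum of f(x) = (x - 2)^2 + 1 numerically.
result = minimize_scalar(lambda x: (x - 2.0) ** 2 + 1.0)
print(result.x, result.fun)  # approximately 2.0 and 1.0
```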

Read more

Principal Component Analysis for Visualization

Principal component analysis (PCA) is an unsupervised machine learning technique. Perhaps the most popular use of principal component analysis is dimensionality reduction. Besides using PCA as a data preparation technique, we can also use it to help visualize data. A picture is worth a thousand words. With the data visualized, it is easier for us to get some insights and decide on the next step in our machine learning models. In this tutorial, you will discover how to visualize data […]
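A minimal sketch of that workflow, assuming the familiar Iris dataset and matplotlib for the scatter plot (neither is necessarily what the tutorial uses):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)
X_2d = PCA(n_components=2).fit_transform(X)   # project 4-D data onto 2 components

plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y)      # colour points by class
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.show()
```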

Read more

A Gentle Introduction to Vector Space Models

Vector space models consider the relationship between data represented as vectors. They are popular in information retrieval systems but also useful for other purposes. Generally, they allow us to compare the similarity of two vectors from a geometric perspective. In this tutorial, we will see what a vector space model is and what it can do. After completing this tutorial, you will know: What a vector space model is and the properties of cosine similarity How […]
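A short sketch of cosine similarity on made-up term-count vectors (not the tutorial's data) could look like:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: a.b / (|a| |b|)."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

doc1 = np.array([1, 2, 0, 3])   # toy term-count vectors
doc2 = np.array([1, 1, 0, 2])
print(cosine_similarity(doc1, doc2))  # close to 1 means similar direction
```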

Read more

Using Singular Value Decomposition to Build a Recommender System

Singular value decomposition is a very popular linear algebra technique for breaking down a matrix into the product of a few smaller matrices. In fact, it is a technique that has many uses. One example is that we can use SVD to discover relationships between items. A recommender system can easily be built from this. In this tutorial, we will see how a recommender system can be built using only linear algebra techniques. After completing this tutorial, you will know: […]
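As a rough sketch under the assumption of a tiny hand-made rating matrix (not the tutorial's dataset), the item-to-item relationships SVD can surface might be computed like this:

```python
import numpy as np

# Toy user-item rating matrix (rows = users, columns = items).
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2                                   # keep the 2 strongest latent factors
item_factors = Vt[:k].T                 # each row is an item in latent space

# Items whose latent vectors point in similar directions are "related".
sim = item_factors @ item_factors.T
print(np.round(sim, 2))
```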

Read more