Training-validation-test split and cross-validation done right

One crucial step in machine learning is the choice of model. A suitable model with suitable hyperparameters is the key to good prediction results. When we are faced with a choice between models, how should the decision be made? This is why we have cross-validation. In scikit-learn, there is a family of functions that help us do this. But quite often, we see cross-validation used improperly, or its results interpreted incorrectly. In […]

Read more
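As a quick illustration of the workflow described in the post above, here is a minimal sketch of comparing two candidate models with k-fold cross-validation in scikit-learn; the iris dataset and the two candidate models are assumptions chosen only for demonstration.

```python
# Minimal sketch: compare two candidate models with 5-fold cross-validation.
# The dataset and the models are illustrative assumptions, not the post's exact example.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
cv = KFold(n_splits=5, shuffle=True, random_state=42)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("decision tree", DecisionTreeClassifier(random_state=42))]:
    scores = cross_val_score(model, X, y, cv=cv)   # one accuracy score per fold
    print(f"{name}: mean accuracy {scores.mean():.3f} (std {scores.std():.3f})")
```

Reporting both the mean and the spread of the fold scores matters: the comparison should be read as an estimate with uncertainty, not as a single definitive number.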

How to Learn Python for Machine Learning

Python has become the de facto lingua franca for machine learning. It is not a difficult language to learn, but if you are not particularly familiar with it, there are some tips that can help you learn faster or better. In this post, you will discover the right way to learn a programming language and how to get help. After reading this post, you will know: The right mentality to learn Python for use in machine learning […]

Read more

Optimization for Machine Learning Crash Course

Optimization for Machine Learning Crash Course. Find function optima with Python in 7 days. All machine learning models involve optimization. As practitioners, we optimize for the most suitable hyperparameters or the subset of features. Decision tree algorithms optimize for the splits. Neural networks optimize for the weights. Most likely, we use computational algorithms to optimize. There are many ways to optimize numerically. SciPy has a number of functions handy for this. We can also try to implement the optimization algorithms […]

Read more
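To make the idea concrete, here is a minimal sketch of numerical optimization with SciPy's `minimize`; the Rosenbrock test function and the Nelder-Mead method are assumptions chosen only for illustration.

```python
# Minimal sketch: find the minimum of a test function numerically with SciPy.
# The objective and the solver choice are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Rosenbrock function; its minimum is at (1, 1)
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

result = minimize(objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print("optimum found at", result.x, "with value", result.fun)
```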

Principal Component Analysis for Visualization

Principal component analysis (PCA) is an unsupervised machine learning technique. Perhaps the most popular use of principal component analysis is dimensionality reduction. Besides using PCA as a data preparation technique, we can also use it to help visualize data. A picture is worth a thousand words. With the data visualized, it is easier for us to get some insights and decide on the next step in our machine learning models. In this tutorial, you will discover how to visualize data […]

Read more
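As a sketch of the visualization use case from the post above, the snippet below projects a dataset onto its first two principal components and plots them; the iris dataset and the matplotlib scatter plot are assumptions for illustration.

```python
# Minimal sketch: reduce a dataset to 2D with PCA and plot it.
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)
X_2d = PCA(n_components=2).fit_transform(X)   # keep the two largest components

plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y, cmap="viridis")
plt.xlabel("first principal component")
plt.ylabel("second principal component")
plt.show()
```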

A Gentle Introduction to Vector Space Models

Vector space models consider the relationship between data that are represented as vectors. They are popular in information retrieval systems but also useful for other purposes. Generally, this allows us to compare the similarity of two vectors from a geometric perspective. In this tutorial, we will see what a vector space model is and what it can do. After completing this tutorial, you will know: What a vector space model is and the properties of cosine similarity; How […]

Read more
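To show the geometric comparison mentioned above, here is a minimal sketch that turns a few toy documents into count vectors and measures their cosine similarity; the documents themselves are assumptions for illustration.

```python
# Minimal sketch: represent documents as vectors and compare them with cosine similarity.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["the cat sat on the mat",
        "a cat lay on a mat",
        "stock prices rose sharply today"]
vectors = CountVectorizer().fit_transform(docs)   # one count vector per document

# values near 1 mean two documents point in nearly the same direction
print(cosine_similarity(vectors))
```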

Using Singular Value Decomposition to Build a Recommender System

Singular value decomposition is a very popular linear algebra technique to break down a matrix into the product of a few smaller matrices. In fact, it is a technique that has many uses. One example is that we can use SVD to discover relationships between items. A recommender system can be built easily from this. In this tutorial, we will see how a recommender system can be built using just linear algebra techniques. After completing this tutorial, you will know: […]

Read more
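As a rough sketch of the idea, the snippet below factorizes a tiny user-item rating matrix with SVD and uses a low-rank reconstruction as predicted ratings; the matrix values and the choice of two latent factors are assumptions for illustration.

```python
# Minimal sketch: low-rank SVD reconstruction of a rating matrix as a recommender.
import numpy as np

# rows are users, columns are items; 0 marks an unrated item
ratings = np.array([[5, 4, 0, 1],
                    [4, 5, 1, 0],
                    [1, 0, 5, 4],
                    [0, 1, 4, 5]], dtype=float)

U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
k = 2                                            # keep the two strongest latent factors
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # reconstructed matrix

# reconstructed entries serve as predicted ratings for the unrated items
print(np.round(approx, 2))
```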

Face Recognition using Principal Component Analysis

Recent advances in machine learning have made face recognition no longer a difficult problem. But in the past, researchers made various attempts and developed various techniques to make computers capable of identifying people. One of the early attempts with moderate success is eigenface, which is based on linear algebra techniques. In this tutorial, we will see how we can build a primitive face recognition system with some simple linear algebra techniques such as principal component analysis. After completing this tutorial, […]

Read more
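For a taste of the eigenface approach, here is a minimal sketch that runs PCA on flattened face images and recognizes test faces by nearest neighbour in the reduced space; the Olivetti faces dataset (downloaded by scikit-learn) and the 50-component choice are assumptions for illustration.

```python
# Minimal sketch of eigenfaces: PCA on face images plus nearest-neighbour matching.
import numpy as np
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split

faces = fetch_olivetti_faces()                    # downloads the dataset on first use
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.2, random_state=42)

pca = PCA(n_components=50).fit(X_train)           # the components are the "eigenfaces"
train_proj = pca.transform(X_train)
test_proj = pca.transform(X_test)

# match each test face to the closest training face in eigenface space
distances = np.linalg.norm(test_proj[:, None, :] - train_proj[None, :, :], axis=2)
predictions = y_train[np.argmin(distances, axis=1)]
print("accuracy:", (predictions == y_test).mean())
```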

Using CNN for financial time series prediction

Convolutional neural networks have their roots in image processing. An early and famous example is LeNet, which was used to recognize the MNIST handwritten digits. However, convolutional neural networks are not limited to handling images. In this tutorial, we are going to look at an example of using a CNN for time series prediction with an application from financial markets. By way of this example, we are going to explore some techniques in using Keras for model training as well. After completing this tutorial, you […]

Read more
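As a minimal sketch of the model-building side, the snippet below defines a small 1D convolutional network in Keras for next-step prediction from a window of past observations; the synthetic random data, window length, and layer sizes are all assumptions for illustration.

```python
# Minimal sketch: a 1D CNN in Keras for windowed time series regression.
import numpy as np
from tensorflow import keras

timesteps, features = 30, 1
X = np.random.rand(200, timesteps, features)   # 200 windows of past observations
y = np.random.rand(200, 1)                     # next-step target for each window

model = keras.Sequential([
    keras.layers.Input(shape=(timesteps, features)),
    keras.layers.Conv1D(filters=16, kernel_size=3, activation="relu"),
    keras.layers.MaxPooling1D(pool_size=2),
    keras.layers.Flatten(),
    keras.layers.Dense(1),                     # single regression output
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1]))
```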

Visualizing the vanishing gradient problem

Deep learning is a relatively recent invention. Partially, this is due to improved computation power that allows us to use more layers of perceptrons in a neural network. But at the same time, we can train a deep network only after we know how to work around the vanishing gradient problem. In this tutorial, we visually examine why the vanishing gradient problem exists. After completing this tutorial, you will know: What a vanishing gradient is; Which configuration of neural network will be […]

Read more
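To see the effect numerically, here is a minimal sketch that multiplies one sigmoid derivative per layer, as the chain rule does in backpropagation; it deliberately ignores the weight terms, and the depth and input value are assumptions for illustration.

```python
# Minimal sketch: the sigmoid derivative is at most 0.25, so a product of one
# derivative per layer shrinks the gradient rapidly with depth (weights ignored).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = 0.5
grad = 1.0
for layer in range(1, 11):
    s = sigmoid(x)
    grad *= s * (1 - s)                        # chain rule picks up one factor per layer
    print(f"after layer {layer:2d}: gradient factor {grad:.2e}")
```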

Method of Lagrange Multipliers: The Theory Behind Support Vector Machines (Part 1: The Separable Case)

This tutorial is designed for anyone looking for a deeper understanding of how Lagrange multipliers are used in building up the model for support vector machines (SVMs). SVMs were initially designed to solve binary classification problems and later extended and applied to regression and unsupervised learning. They have proven successful in solving many complex machine learning classification problems. In this tutorial, we’ll look at the simplest SVM that assumes that the positive and negative examples can be completely separated via […]

Read more
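As a small companion sketch for the separable case, the snippet below fits a linear SVM with a very large C (approximating a hard margin) on toy separable data and prints the support vectors, whose dual coefficients are the nonzero Lagrange multipliers scaled by the labels; the toy points and the C value are assumptions for illustration.

```python
# Minimal sketch: hard-margin-like linear SVM on separable toy data,
# inspecting the support vectors and their dual coefficients.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [2, 1], [1, 2],          # positive class
              [4, 4], [5, 4], [4, 5]])         # negative class
y = np.array([1, 1, 1, -1, -1, -1])

svm = SVC(kernel="linear", C=1e6).fit(X, y)    # large C approximates a hard margin
print("support vectors:\n", svm.support_vectors_)
print("alpha_i * y_i:", svm.dual_coef_)        # nonzero multipliers only
print("w:", svm.coef_, "b:", svm.intercept_)
```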