K-Nearest Neighbors for Machine Learning

Last Updated on August 15, 2020 In this post you will discover the k-Nearest Neighbors (KNN) algorithm for classification and regression. After reading this post you will know: The model representation used by KNN. How a model is learned using KNN (hint: it's not). How to make predictions using KNN. The many names for KNN, including how different fields refer to it. How to prepare your data to get the most from KNN. Where to look to learn more about […]
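The full post has the details, but the core prediction step of KNN classification can be sketched in a few lines of plain Python. This is a minimal illustration, not the post's own code; the function and dataset names are mine:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (features, label) pairs. There is no training step:
    KNN simply stores the data and does all the work at prediction time.
    """
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Tiny illustrative dataset: two well-separated clusters.
train = [((1.0, 1.0), "a"), ((1.2, 0.8), "a"),
         ((5.0, 5.0), "b"), ((4.8, 5.2), "b")]
label = knn_predict(train, (1.1, 1.0), k=3)  # majority of the 3 nearest is "a"
```

Note that "learning" here is just storing the training data, which is why the post hints that no model is learned at all.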

Read more

Learning Vector Quantization for Machine Learning

Last Updated on August 15, 2020 A downside of K-Nearest Neighbors is that you need to hang on to your entire training dataset. The Learning Vector Quantization algorithm (or LVQ for short) is an artificial neural network algorithm that lets you choose how many training instances to hang onto and learns exactly what those instances should look like. In this post you will discover the Learning Vector Quantization algorithm. After reading this post you will know: The representation used by […]
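The key idea the excerpt describes, a small set of codebook vectors learned instead of the whole training set, comes down to one update rule: pull the best-matching vector toward a training sample when their labels agree, push it away when they disagree. A rough sketch of that step (my own structure, not the post's code):

```python
def lvq_update(codebooks, x, label, lr=0.3):
    """One LVQ training step (sketch): find the best-matching codebook vector,
    then pull it toward x if labels agree, or push it away if they differ."""
    bmu = min(codebooks,
              key=lambda c: sum((w - xi) ** 2 for w, xi in zip(c["w"], x)))
    sign = 1.0 if bmu["label"] == label else -1.0
    bmu["w"] = [w + sign * lr * (xi - w) for w, xi in zip(bmu["w"], x)]

# Two codebook vectors stand in for the entire training dataset.
codebooks = [{"w": [0.0, 0.0], "label": "a"},
             {"w": [5.0, 5.0], "label": "b"}]
lvq_update(codebooks, [1.0, 1.0], "a", lr=0.5)  # pulls the "a" vector halfway toward the sample
```

After training, predictions work like KNN but against the handful of codebook vectors rather than every training instance.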

Read more

Support Vector Machines for Machine Learning

Last Updated on August 15, 2020 Support Vector Machines are perhaps one of the most popular and talked about machine learning algorithms. They were extremely popular around the time they were developed in the 1990s and continue to be a go-to method when you need high performance with little tuning. In this post you will discover the Support Vector Machine (SVM) machine learning algorithm. After reading this post you will know: How to disentangle the many names used to refer to […]
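Whatever name it goes by, a trained linear SVM ultimately predicts with a very simple rule: the sign of a weighted sum plus a bias. A minimal sketch of that decision function, assuming the weights `w` and bias `b` have already been learned (learning them is the hard part the post covers):

```python
def svm_predict(w, b, x):
    """Linear SVM decision rule (sketch): return the sign of w·x + b.

    Points on the positive side of the separating hyperplane get class +1,
    points on the negative side get class -1.
    """
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

# Hypothetical learned hyperplane: x1 - x2 = 0.
w, b = [1.0, -1.0], 0.0
```

The hyperplane itself is chosen during training to maximize the margin to the nearest points of each class, the support vectors.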

Read more

Bagging and Random Forest Ensemble Algorithms for Machine Learning

Last Updated on August 15, 2020 Random Forest is one of the most popular and most powerful machine learning algorithms. It is a type of ensemble machine learning algorithm called Bootstrap Aggregation or bagging. In this post you will discover the Bagging ensemble algorithm and the Random Forest algorithm for predictive modeling. After reading this post you will know about: The bootstrap method for estimating statistical quantities from samples. The Bootstrap Aggregation algorithm for creating multiple different models from a […]
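The two building blocks the excerpt names, bootstrap sampling and aggregation, are easy to sketch in plain Python. This is an illustrative outline, not the post's implementation; the stand-in "models" are just callables:

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Draw a bootstrap sample: same size as the data, drawn with replacement,
    so some rows repeat and others are left out entirely."""
    return [rng.choice(data) for _ in data]

def bagged_predict(models, x):
    """Aggregate an ensemble's classifications by majority vote."""
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

rng = random.Random(42)
sample = bootstrap_sample([1, 2, 3, 4, 5], rng)

# Each model would normally be a tree trained on its own bootstrap sample.
models = [lambda x: "yes", lambda x: "no", lambda x: "yes"]
prediction = bagged_predict(models, None)  # majority vote -> "yes"
```

Random Forest adds one more trick on top of bagging: each tree also considers only a random subset of features at each split, which decorrelates the trees and strengthens the vote.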

Read more

Boosting and AdaBoost for Machine Learning

Last Updated on August 15, 2020 Boosting is an ensemble technique that attempts to create a strong classifier from a number of weak classifiers. In this post you will discover the AdaBoost Ensemble method for machine learning. After reading this post, you will know: What the boosting ensemble method is and generally how it works. How to learn to boost decision trees using the AdaBoost algorithm. How to make predictions using the learned AdaBoost model. How to best prepare your […]
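The heart of AdaBoost is a single round of re-weighting: compute the weak learner's weighted error, derive its say in the final vote (alpha), then up-weight the examples it got wrong so the next learner focuses on them. A sketch of one such round (my own function, not code from the post):

```python
import math

def adaboost_round(weights, preds, labels):
    """One AdaBoost round (sketch): weighted error -> learner weight alpha ->
    re-weighted, normalized example weights."""
    err = sum(w for w, p, y in zip(weights, preds, labels) if p != y)
    err = max(min(err, 1 - 1e-10), 1e-10)  # guard against division by zero
    alpha = 0.5 * math.log((1 - err) / err)
    # Mistakes are scaled up by e^alpha, correct examples down by e^-alpha.
    new = [w * math.exp(alpha if p != y else -alpha)
           for w, p, y in zip(weights, preds, labels)]
    total = sum(new)
    return alpha, [w / total for w in new]

# 4 examples, uniform weights; the weak learner misclassifies the third one.
alpha, new_w = adaboost_round([0.25] * 4, [1, 1, -1, -1], [1, 1, 1, -1])
```

After re-weighting, the one misclassified example carries half of the total weight, which is exactly the pressure that forces the next weak learner to handle it.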

Read more

6 Questions To Understand Any Machine Learning Algorithm

Last Updated on August 12, 2019 There are a lot of machine learning algorithms and each algorithm is an island of research. You have to choose the level of detail at which you study machine learning algorithms. There is a sweet spot if you are a developer interested in applied predictive modeling. This post describes that sweet spot and gives you a template that you can use to quickly understand any machine learning algorithm. Kick-start your project with my new book […]

Read more

Machine Learning Algorithms Mini-Course

Last Updated on August 12, 2019 Machine learning algorithms are a very large part of machine learning. You have to understand how they work to make any progress in the field. In this post you will discover a 14-part machine learning algorithms mini-course that you can follow to finally understand machine learning algorithms. We are going to cover a lot of ground in this course and you are going to have a great time. Kick-start your project with my […]

Read more

Embrace Randomness in Machine Learning

Last Updated on August 12, 2019 Why Do You Get Different Results On Different Runs Of An Algorithm With The Same Data? Applied machine learning is a tapestry of breakthroughs and mindset shifts. Understanding the role of randomness in machine learning algorithms is one of those breakthroughs. Once you get it, you will see things differently, in a whole new light: things like choosing between one algorithm and another, hyperparameter tuning and reporting results. You will also start to see the abuses […]
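The question in the excerpt's headline is easy to demonstrate with a toy "algorithm" whose result depends on random initialization. This is a made-up illustration of the general point, not an example from the post:

```python
import random

def noisy_estimate(seed):
    """Toy stand-in for a stochastic learning algorithm: the 'result' depends
    on the random number generator's initial state."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(10)) / 10

same = noisy_estimate(1) == noisy_estimate(1)        # fixed seed: repeatable
different = noisy_estimate(1) != noisy_estimate(2)   # new seed: a different result
```

Real algorithms behave the same way through random weight initialization, random data shuffling, and random sampling, which is why results should be reported as a distribution over repeated runs rather than a single number.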

Read more

Stop Coding Machine Learning Algorithms From Scratch

Last Updated on August 12, 2019 You Don’t Have To Implement Algorithms…if you’re a beginner and just getting started. Stop. Are you implementing a machine learning algorithm at the moment? Why? Implementing algorithms from scratch is one of the biggest mistakes I see beginners make. In this post you will discover: The algorithm implementation trap that beginners fall into. The very real difficulty of engineering world-class implementations of machine learning algorithms. Why you should be using off-the-shelf implementations. Kick-start your […]

Read more

A Gentle Introduction to Concept Drift in Machine Learning

Last Updated on August 12, 2019 Data can change over time. This can result in poor and degrading predictive performance in predictive models that assume a static relationship between input and output variables. This problem of changing underlying relationships in the data is called concept drift in the field of machine learning. In this post, you will discover the problem of concept drift and ways you may be able to address it in your own predictive modeling problems. […]
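One simple way to catch the degrading performance the excerpt describes is to watch recent accuracy against long-run accuracy and flag a drop. This is a rough monitoring sketch under my own assumptions (window size, drop threshold), not a method taken from the post:

```python
from collections import deque

def drift_monitor(window=50, drop=0.15):
    """Very simple drift check (sketch): flag drift when accuracy over the
    most recent `window` predictions falls `drop` below long-run accuracy."""
    recent = deque(maxlen=window)
    history = []

    def update(correct):  # correct: 1 if the model was right, 0 if wrong
        recent.append(correct)
        history.append(correct)
        long_run = sum(history) / len(history)
        windowed = sum(recent) / len(recent)
        return len(history) >= window and (long_run - windowed) > drop
    return update

check = drift_monitor(window=10, drop=0.2)
stable = [check(1) for _ in range(40)]    # model stays accurate: no flags
drifting = [check(0) for _ in range(10)]  # accuracy collapses: drift flagged
```

More principled detectors (DDM, ADWIN, and similar) refine this same idea with statistical tests, and the usual responses are periodic retraining or weighting recent data more heavily.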

Read more