Model Prediction Accuracy Versus Interpretation in Machine Learning

Last Updated on August 15, 2020 In their book Applied Predictive Modeling, Kuhn and Johnson comment early on the trade-off of model prediction accuracy versus model interpretation. For a given problem, it is critical to have a clear idea of which is the priority, accuracy or explainability, so that this trade-off can be made explicitly rather than implicitly. In this post you will discover and consider this important trade-off. Model Accuracy vs Explainability. Photo by Donald Hobern, some rights reserved […]

Read more

Improve Model Accuracy with Data Pre-Processing

Last Updated on August 15, 2020 Data preparation can make or break the predictive ability of your model. In Chapter 3 of their book Applied Predictive Modeling, Kuhn and Johnson introduce the process of data preparation. They refer to it as the addition, deletion or transformation of training set data. In this post you will discover the data pre-processing steps that you can use to improve the predictive ability of your models. Kick-start your project with my new book Data […]
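As a taste of what such a transformation looks like, here is a minimal sketch of centering and scaling the iris measurements with the caret package; the choice of caret and of these two methods is an illustrative assumption, not necessarily one of the recipes covered in the post.

# Minimal sketch: center and scale numeric attributes with caret (illustrative assumption)
library(caret)
data(iris)
# Estimate the transform parameters (means and standard deviations) from the data
params <- preProcess(iris[, 1:4], method = c("center", "scale"))
# Apply the transform to produce the pre-processed training data
transformed <- predict(params, iris[, 1:4])
summary(transformed)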

Read more

Clever Application Of A Predictive Model

Last Updated on August 15, 2020 What if you could use a predictive model to find new combinations of attributes that do not exist in the data but could be valuable? In Chapter 10 of Applied Predictive Modeling, Kuhn and Johnson provide a case study that does just this. It’s a fascinating and creative example of how to use a predictive model. In this post we will discover this less obvious use of a predictive model and the types of […]
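To make the idea concrete, here is a minimal sketch of the general pattern, not the book's case study: fit a model, generate synthetic attribute combinations that do not appear in the data, and score them with the model. The dataset, model and grid values are illustrative assumptions.

# Illustrative sketch only: score synthetic attribute combinations with a fitted model
library(rpart)
data(iris)
# Fit a regression tree predicting Petal.Width from the other measurements
fit <- rpart(Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length, data = iris)
# Generate combinations of attribute values that do not appear in the data
grid <- expand.grid(
  Sepal.Length = seq(4, 8, by = 0.5),
  Sepal.Width  = seq(2, 4.5, by = 0.5),
  Petal.Length = seq(1, 7, by = 0.5)
)
# Score each synthetic combination and inspect the most promising ones
grid$predicted <- predict(fit, grid)
head(grid[order(-grid$predicted), ])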

Read more

Linear Classification in R

Last Updated on August 22, 2019 In this post you will discover recipes for 3 linear classification algorithms in R. All recipes in this post use the iris flowers dataset provided with R in the datasets package. The dataset describes the measurements of iris flowers and requires classification of each observation to one of three flower species. Kick-start your project with my new book Machine Learning Mastery With R, including step-by-step tutorials and the R source code files for all […]
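As an example of the recipe format, here is a minimal sketch of Linear Discriminant Analysis on the iris dataset using the MASS package; treat it as an illustration of the pattern rather than a verbatim recipe from the post.

# Minimal sketch: Linear Discriminant Analysis on the iris dataset (illustrative)
library(MASS)
data(iris)
# Fit the model: Species as a function of the four measurements
fit <- lda(Species ~ ., data = iris)
# Make predictions on the training data and summarize accuracy
predictions <- predict(fit, iris[, 1:4])$class
table(predictions, iris$Species)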

Read more

Non-Linear Classification in R

Last Updated on August 22, 2019 In this post you will discover 8 recipes for non-linear classification in R. Each recipe is ready for you to copy and paste and modify for your own problem. All recipes in this post use the iris flowers dataset provided with R in the datasets package. The dataset describes the measurements of iris flowers and requires classification of each observation to one of three flower species. Kick-start your project with my new book Machine Learning Mastery With […]
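To show the shape of such a recipe, here is a minimal sketch of one non-linear method, a CART decision tree with the rpart package, on the iris dataset; it illustrates the pattern and is not necessarily one of the 8 recipes from the post.

# Minimal sketch: classification tree (CART) on the iris dataset (illustrative)
library(rpart)
data(iris)
# Fit the tree: Species as a function of the four measurements
fit <- rpart(Species ~ ., data = iris)
# Make predictions on the training data and summarize accuracy
predictions <- predict(fit, iris[, 1:4], type = "class")
table(predictions, iris$Species)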

Read more

How To Get Better At Machine Learning

Last Updated on August 15, 2020 Colorado Reed from Metacademy wrote a great post recently titled “Level-Up Your Machine Learning” to answer a question he often receives: What should I do if I want to get ‘better’ at machine learning, but I don’t know what I want to learn? In this post you will discover a summary of Colorado’s recommendations and a breakdown of his roadmap. Level-Up Your Machine Learning. Photo by Helgi Halldórsson, some rights reserved […]

Read more

How to Kick Ass in Competitive Machine Learning

Last Updated on June 7, 2016 David Kofoed Wind posted an article to the Kaggle blog No Free Hunch titled “Learning from the best”. In the post, David summarized 6 key areas related to participating and doing well in competitive machine learning, with quotes from top-performing Kagglers. In this post you will discover the key heuristics for doing well in competitive machine learning distilled from that post. Learning from the best. Photo by Lida, some rights reserved. Learning from Kaggle […]

Read more

Going Beyond Predictions

Last Updated on June 7, 2016 The predictions you make with a predictive model do not matter; it is the use of those predictions that matters. Jeremy Howard was the President and Chief Scientist of Kaggle, the competitive machine learning platform. In 2012 he presented at the O’Reilly Strata conference on what he called the Drivetrain Approach for building “data products” that go beyond just predictions. In this post you will discover Howard’s Drivetrain Approach and how you can use […]

Read more

Master Kaggle By Competing Consistently

Last Updated on June 7, 2016 How do you get good at Kaggle competitions? It is a common question I get asked. The best advice for getting started and getting good is to consistently participate in competitions. You cannot help but get better at machine learning. A recent post by Triskelion titled “Reflecting back on one year of Kaggle contests” bears this out. He started out as a machine learning beginner and finished up as a “master” level Kaggle competitor […]

Read more

5 Benefits of Competitive Machine Learning

Last Updated on June 7, 2016 Jeremy Howard, formerly of Kaggle, gave a presentation at the University of San Francisco in mid-2013. In that presentation he touched on some of the broader benefits of machine learning competitions like those held on Kaggle. In this post you will discover 5 points I extracted from this talk that will motivate you to want to start participating in machine learning competitions. Competitive Machine Learning is a Meritocracy. Photo by PaulBarber, some rights reserved […]

Read more