Tune Hyperparameters for Classification Machine Learning Algorithms
Last Updated on August 28, 2020
Machine learning algorithms have hyperparameters that allow you to tailor the behavior of the algorithm to your specific dataset.
Hyperparameters are different from parameters, which are the internal coefficients or weights for a model found by the learning algorithm. Unlike parameters, hyperparameters are specified by the practitioner when configuring the model.
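To make the distinction concrete, here is a minimal sketch using scikit-learn's LogisticRegression on a synthetic dataset (the specific model and values are illustrative, not taken from this tutorial): the argument C is a hyperparameter set before training, whereas the coefficients are parameters learned from the data.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# small synthetic classification dataset for demonstration
X, y = make_classification(n_samples=100, n_features=5, random_state=1)

# C and solver are hyperparameters: chosen by the practitioner before fitting
model = LogisticRegression(C=1.0, solver='lbfgs')
model.fit(X, y)

# coef_ and intercept_ are parameters: learned from the data by the algorithm
print(model.coef_)
print(model.intercept_)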
Typically, it is challenging to know what values to use for the hyperparameters of a given algorithm on a given dataset; therefore, it is common to use random or grid search strategies over a range of hyperparameter values.
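As a brief illustration, a grid search over a small set of candidate values can be run with scikit-learn's GridSearchCV, evaluating each combination with repeated stratified k-fold cross-validation. This is a minimal sketch; the model and the grid of values shown here are assumptions for demonstration only.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold

# synthetic binary classification dataset for demonstration
X, y = make_classification(n_samples=1000, n_features=20, random_state=1)

# define the model and a small grid of candidate hyperparameter values
model = LogisticRegression(solver='liblinear')
grid = {'C': [0.01, 0.1, 1.0, 10.0], 'penalty': ['l1', 'l2']}

# evaluate each combination with repeated stratified k-fold cross-validation
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)
search = GridSearchCV(model, grid, scoring='accuracy', cv=cv, n_jobs=-1)
result = search.fit(X, y)

print('Best accuracy: %.3f' % result.best_score_)
print('Best hyperparameters: %s' % result.best_params_)

A random search works the same way with RandomizedSearchCV, sampling a fixed number of combinations rather than trying them all.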
The more hyperparameters of an algorithm that you need to tune, the slower the tuning process. Therefore, it is desirable to select a minimum subset of model hyperparameters to search or tune.
Not all model hyperparameters are equally important. Some hyperparameters have an outsized effect on the behavior, and in turn, the performance of a machine learning algorithm.
As a machine learning practitioner, you must know which hyperparameters to focus on to get a good result quickly.
In this tutorial, you will discover those hyperparameters that are most important for some of the top machine learning algorithms.