Hyperparameter Optimization With Random Search and Grid Search
Machine learning models have hyperparameters that you must set in order to customize the model to your dataset.
The general effects of hyperparameters on a model are often known, but how best to set a hyperparameter, and combinations of interacting hyperparameters, for a given dataset is challenging. At best, there are general heuristics or rules of thumb for configuring hyperparameters.
A better approach is to objectively search different values for model hyperparameters and choose the combination that results in a model with the best performance on a given dataset. This is called hyperparameter optimization or hyperparameter tuning, and it is available in the scikit-learn Python machine learning library. The result of a hyperparameter optimization is a single set of well-performing hyperparameters that you can use to configure your model.
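For example, a minimal sketch of a grid search with scikit-learn's GridSearchCV class might look as follows. The synthetic dataset, the logistic regression model, and the candidate hyperparameter values here are illustrative assumptions, not the exact configurations used later in the tutorial.

```python
# minimal sketch of grid search hyperparameter optimization with scikit-learn
# (the dataset, model, and hyperparameter values are illustrative assumptions)
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# define a small synthetic classification dataset
X, y = make_classification(n_samples=100, n_features=10, n_informative=5, random_state=1)

# define the model whose hyperparameters will be tuned
model = LogisticRegression(max_iter=1000)

# define the hyperparameter values to evaluate
grid = {
    'solver': ['lbfgs', 'liblinear'],
    'C': [0.01, 0.1, 1.0, 10.0, 100.0],
}

# evaluate every combination of values with 5-fold cross-validation
search = GridSearchCV(model, grid, scoring='accuracy', cv=5, n_jobs=-1)
result = search.fit(X, y)

# report the best-performing configuration
print('Best Score: %.3f' % result.best_score_)
print('Best Hyperparameters: %s' % result.best_params_)
```

A random search follows the same pattern with the RandomizedSearchCV class, which samples a fixed number of configurations from the supplied values or distributions instead of evaluating every combination.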
In this tutorial, you will discover hyperparameter optimization for machine learning in Python.
After completing this tutorial, you will know:
- Hyperparameter optimization is required to get the most out of your machine learning models.
- How to configure random and grid search hyperparameter optimization for classification tasks.
- How to configure random and grid search hyperparameter optimization for regression tasks.
Let’s get started.