Combined Algorithm Selection and Hyperparameter Optimization (CASH Optimization)
Machine learning model selection and configuration may be the biggest challenge in applied machine learning.
Controlled experiments must be performed to discover what works best for a given classification or regression predictive modeling task. This can feel overwhelming given the large number of data preparation schemes, learning algorithms, and model hyperparameters that could be considered.
The common approach is to use a shortcut, such as using a popular algorithm or testing a small number of algorithms with default hyperparameters.
A modern alternative is to consider the selection of data preparation, learning algorithm, and algorithm hyperparameters one large global optimization problem. This characterization is generally referred to as Combined Algorithm Selection and Hyperparameter Optimization, or “CASH Optimization” for short.
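To make this concrete, the joint search space can be sketched as a set of candidate algorithms, each paired with its own hyperparameter ranges, with a single search procedure optimizing over both at once. The sketch below uses naive random search with scikit-learn; the specific algorithms, hyperparameter ranges, and search strategy are illustrative assumptions, not the API of any particular AutoML library (real CASH systems typically use Bayesian optimization over this space).

```python
# A minimal sketch of CASH: one "configuration" bundles both the choice of
# learning algorithm and that algorithm's hyperparameters, and a single
# search optimizes over the combined space. Random search is used here for
# simplicity; the algorithms and ranges are illustrative assumptions.
import random

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Joint (conditional) search space: each algorithm has its own
# hyperparameter samplers, so the space is hierarchical.
SEARCH_SPACE = {
    "logistic": (LogisticRegression, {"C": lambda: 10 ** random.uniform(-3, 2),
                                      "max_iter": lambda: 1000}),
    "forest": (RandomForestClassifier, {"n_estimators": lambda: random.randint(10, 100),
                                        "max_depth": lambda: random.randint(2, 10)}),
    "knn": (KNeighborsClassifier, {"n_neighbors": lambda: random.randint(1, 15)}),
}

def sample_config():
    """Sample one point from the combined algorithm + hyperparameter space."""
    name = random.choice(list(SEARCH_SPACE))
    cls, samplers = SEARCH_SPACE[name]
    params = {key: sampler() for key, sampler in samplers.items()}
    return name, cls, params

def cash_random_search(X, y, n_trials=20, seed=1):
    """Naive CASH via random search: score each sampled configuration with
    cross-validation and keep the best (algorithm, hyperparameters) pair."""
    random.seed(seed)
    best_score, best_config = -1.0, None
    for _ in range(n_trials):
        name, cls, params = sample_config()
        score = cross_val_score(cls(**params), X, y, cv=3).mean()
        if score > best_score:
            best_score, best_config = score, (name, params)
    return best_score, best_config

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
best_score, best_config = cash_random_search(X, y)
print(best_config[0], round(best_score, 3))
```

The key idea is that the search never commits to an algorithm up front: every trial re-samples the algorithm along with its hyperparameters, so a well-configured "weaker" algorithm can beat a poorly configured "stronger" one.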
In this post, you will discover the challenge of machine learning model selection and the modern solution referred to as CASH Optimization.
After reading this post, you will know:
- The challenge of machine learning model and hyperparameter selection.
- The shortcuts of using popular models or making a series of sequential decisions.
- The characterization of Combined Algorithm Selection and Hyperparameter Optimization that underlies modern AutoML.
Let’s get started.