A Gentle Introduction to XGBoost Loss Functions

XGBoost is a powerful and popular implementation of the gradient boosting ensemble algorithm.

An important aspect of configuring XGBoost models is the choice of loss function that is minimized during training.

The loss function must be matched to the predictive modeling problem type, just as we must choose an appropriate loss function based on the problem type when training deep learning neural networks.
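As a minimal sketch of what this looks like in practice (assuming the xgboost Python library and its scikit-learn wrapper are installed), the loss is chosen via the objective argument when the model is defined:

```python
# Minimal sketch: the loss (objective) is specified when defining the model.
# Assumes the xgboost library with its scikit-learn wrapper is available.
from xgboost import XGBClassifier

# 'binary:logistic' is the logistic loss used for binary classification
model = XGBClassifier(objective='binary:logistic')
print(model.objective)
```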

In this tutorial, you will discover how to configure loss functions for XGBoost ensemble models.

After completing this tutorial, you will know:

  • Specifying the loss function used when training XGBoost ensembles is a critical step, much like it is for neural networks.
  • How to configure XGBoost loss functions for binary and multi-class classification tasks.
  • How to configure XGBoost loss functions for regression predictive modeling tasks (a short sketch of each case follows this list).
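The sketch below shows one way each of these cases can be configured, assuming the xgboost scikit-learn wrapper; the objective strings are the library's standard choices for each problem type rather than something taken from the excerpt above:

```python
# Sketch: matching the XGBoost objective (loss) to the problem type.
from xgboost import XGBClassifier, XGBRegressor

# binary classification: logistic loss
binary_model = XGBClassifier(objective='binary:logistic')

# multi-class classification: softmax cross-entropy, predicting class probabilities
multiclass_model = XGBClassifier(objective='multi:softprob')

# regression: squared error loss
regression_model = XGBRegressor(objective='reg:squarederror')
```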

     

     
