7 Step Mini-Course to Get Started with XGBoost in Python
Last Updated on April 24, 2020
XGBoost With Python Mini-Course.
XGBoost is an implementation of gradient boosting that has been used to win many machine learning competitions.
It is powerful, but it can be hard to get started with.
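To give a flavor of the scikit-learn style API that the lessons build on, below is a minimal sketch that fits an XGBoost classifier on a small synthetic dataset. This example assumes the xgboost and scikit-learn packages are installed; the synthetic data is for illustration only and is not the dataset used later in the course.

# minimal sketch: train and evaluate an XGBoost classifier with the scikit-learn API
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# create a small synthetic binary classification problem (illustrative only)
X, y = make_classification(n_samples=500, n_features=10, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=7)

# fit a gradient boosted tree ensemble and evaluate accuracy on the held-out split
model = XGBClassifier(n_estimators=100)
model.fit(X_train, y_train)
predictions = model.predict(X_test)
print("Accuracy: %.2f%%" % (accuracy_score(y_test, predictions) * 100.0))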
In this post, you will discover a 7-part crash course on XGBoost with Python.
This mini-course is designed for Python machine learning practitioners who are already comfortable with scikit-learn and the SciPy ecosystem.
Kick-start your project with my new book XGBoost With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
- Update Jan/2017: Updated to reflect changes in scikit-learn API version 0.18.1.
- Update Mar/2018: Added alternate link to download the dataset as the original appears to have been taken down.
(Tip: you might want to print or bookmark this page so that you can refer back to it later.)
Who Is This Mini-Course For?
Before we get started, let’s make sure you are in the right place.