Cost-Sensitive SVM for Imbalanced Classification
Last Updated on August 21, 2020
The Support Vector Machine algorithm is effective for balanced classification, but by default it performs poorly on imbalanced datasets.
The SVM algorithm finds a hyperplane decision boundary that best splits the examples into two classes. The split is made soft through the use of a margin that allows some points to be misclassified. By default, this margin favors the majority class on imbalanced datasets, although it can be updated to take the importance of each class into account and dramatically improve the performance of the algorithm on datasets with skewed class distributions.
This modification of SVM, which weights the margin penalty in proportion to class importance, is often referred to as weighted SVM or cost-sensitive SVM.
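As a minimal sketch of this idea, scikit-learn's SVC exposes a class_weight argument that scales the C misclassification penalty per class. The dataset below is synthetic and its 1:100 skew is illustrative only, not from this tutorial:

```python
# Sketch: a cost-sensitive SVM via scikit-learn's class_weight argument.
# The synthetic dataset and its ~1:100 skew are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, RepeatedStratifiedKFold
from sklearn.svm import SVC

# synthetic imbalanced dataset: roughly 99% majority, 1% minority
X, y = make_classification(n_samples=1000, n_features=2, n_redundant=0,
                           n_clusters_per_class=1, weights=[0.99],
                           flip_y=0, random_state=4)

# 'balanced' sets each class weight inversely proportional to its frequency,
# so errors on the minority class are penalized more heavily
model = SVC(gamma='scale', class_weight='balanced')

# evaluate with repeated stratified k-fold cross-validation and ROC AUC
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=2, random_state=1)
scores = cross_val_score(model, X, y, scoring='roc_auc', cv=cv, n_jobs=-1)
print('Mean ROC AUC: %.3f' % scores.mean())
```

Passing a dict such as `class_weight={0: 1, 1: 100}` instead of `'balanced'` gives explicit control over how much each class's margin violations cost.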
In this tutorial, you will discover weighted support vector machines for imbalanced classification.
After completing this tutorial, you will know:
- How the standard support vector machine algorithm is limited for imbalanced classification.
- How the support vector machine algorithm can be modified to weight the margin penalty proportional to class importance during training.
- How to configure class weight for the SVM and how to grid search different class weight configurations.
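The last point above can be sketched with scikit-learn's GridSearchCV; the candidate weightings below are illustrative choices for a synthetic dataset, not prescribed values:

```python
# Sketch: grid searching class_weight configurations for SVC.
# The candidate weightings and synthetic dataset are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold
from sklearn.svm import SVC

# synthetic imbalanced dataset: roughly 99% majority, 1% minority
X, y = make_classification(n_samples=1000, n_features=2, n_redundant=0,
                           n_clusters_per_class=1, weights=[0.99],
                           flip_y=0, random_state=4)

# each dict maps class label -> multiplier applied to the C penalty
param_grid = {'class_weight': [{0: 1, 1: 1}, {0: 1, 1: 10},
                               {0: 1, 1: 100}, 'balanced']}

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=2, random_state=1)
grid = GridSearchCV(SVC(gamma='scale'), param_grid, scoring='roc_auc', cv=cv)
grid.fit(X, y)
print('Best class_weight: %s' % grid.best_params_['class_weight'])
print('Best mean ROC AUC: %.3f' % grid.best_score_)
```

Because ROC AUC is the scoring metric, the search selects the weighting that best ranks minority-class examples rather than the one that maximizes raw accuracy.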