A custom implementation of a support vector machine (SVM) classifier with support for the Gaussian RBF, polynomial, and linear kernels. Training uses a fast gradient descent algorithm to minimize the smoothed hinge loss.
The package includes a wrapper class, OneVsOneClassifier, for performing multiclass classification with the SVM.
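The smoothed hinge loss is not spelled out above; a common smooth surrogate is the squared hinge, $\ell(t) = \max(0, 1-t)^2$. The sketch below assumes that form (with L2 regularization) and is illustrative only, not the package's internal code:

```python
import numpy as np

def squared_hinge_loss(beta, X, y, lam):
    # Smoothed (squared) hinge loss with L2 penalty:
    # mean(max(0, 1 - y_i * x_i @ beta)^2) + lam * ||beta||^2
    margins = np.maximum(0.0, 1.0 - y * (X @ beta))
    return np.mean(margins ** 2) + lam * beta @ beta

def squared_hinge_grad(beta, X, y, lam):
    # Gradient of the loss above; differentiable everywhere,
    # which is what makes fast (accelerated) gradient descent applicable.
    margins = np.maximum(0.0, 1.0 - y * (X @ beta))
    return (-2.0 / len(y)) * X.T @ (y * margins) + 2.0 * lam * beta
```

Because the squared hinge has a continuous gradient, plain or accelerated gradient descent converges without the subgradient machinery the plain hinge would require.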
The Gaussian Radial Basis Function kernel is given by
$$k(x,y) = \exp\left(-\frac{1}{2\sigma^2}\|x-y\|^2\right)$$
where $\sigma$ is a hyperparameter that needs to be set. The default value of $\sigma$ is 0.5.
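For concreteness, the RBF kernel matrix between two data matrices can be computed with NumPy via the expanded squared-distance identity; a minimal sketch (the function name is illustrative, not this package's API):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=0.5):
    # ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x^T y, computed for every row pair
    sq_dists = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma ** 2))
```

The expansion avoids an explicit double loop, so the full kernel matrix is built from three vectorized operations.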
The Polynomial kernel is given by
$$k(x,y) = (x^{T}y + b)^{p}$$
where the bias $b$ and the power $p$ are hyperparameters that need to be set. The default value of $b$ is 1 and the default value of $p$ is 2.
The Linear kernel is given by
$$k(x,y) = x^{T}y$$
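Both dot-product kernels evaluate on full data matrices in a single NumPy expression; a minimal sketch, with function names chosen here for illustration:

```python
import numpy as np

def polynomial_kernel(X, Y, bias=1.0, power=2):
    # k(x, y) = (x^T y + b)^p for every pair of rows; defaults b=1, p=2
    return (X @ Y.T + bias) ** power

def linear_kernel(X, Y):
    # k(x, y) = x^T y; the polynomial kernel with b=0, p=1
    return X @ Y.T
```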
For a demo of SVM on a simple simulated dataset (generated using the scikit-learn library):

```shell
python demo_simulated.py
```

For a demo of SVM on a real-world dataset (Digits dataset from the scikit-learn library):

```shell
python demo_digits.py
```

For a demo of the comparison of the custom implementation of SVM vs. the scikit-learn implementation on a real-world dataset (Digits dataset from the scikit-learn library):

```shell
python demo_compare_digits.py
```

For a demo of SVM on a real-world multiclass dataset (Vowel dataset from the book Elements of Statistical Learning) using the one vs. one multiclass classification strategy:

```shell
python demo_vowel.py
```
```python
from models.svm import CustomSVM

clf = CustomSVM()
clf.fit(X_train, y_train)                 # train on the training split
predictions = clf.predict(X_test)         # predicted class labels
accuracy = clf.score(X_test, y_test)      # mean accuracy on the test split
```
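The OneVsOneClassifier wrapper mentioned above implements the standard one-vs-one strategy: one binary classifier per pair of classes, with predictions decided by majority vote. Its exact interface isn't shown here, so the sketch below is a self-contained illustration of the voting scheme (class and parameter names are hypothetical, not this package's API; `binary_factory` stands in for a constructor such as `CustomSVM`):

```python
from itertools import combinations
import numpy as np

class OneVsOne:
    """Train one binary classifier per class pair; predict by majority vote."""

    def __init__(self, binary_factory):
        self.binary_factory = binary_factory  # callable returning a fit/predict object

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = {}
        for a, b in combinations(self.classes_, 2):
            mask = (y == a) | (y == b)
            # relabel the pair as -1 / +1 for the binary learner
            clf = self.binary_factory()
            clf.fit(X[mask], np.where(y[mask] == a, -1, 1))
            self.models_[(a, b)] = clf
        return self

    def predict(self, X):
        idx = {c: i for i, c in enumerate(self.classes_)}
        votes = np.zeros((len(X), len(self.classes_)), dtype=int)
        for (a, b), clf in self.models_.items():
            pred = clf.predict(X)
            votes[pred == -1, idx[a]] += 1
            votes[pred == 1, idx[b]] += 1
        return self.classes_[np.argmax(votes, axis=1)]
```

With k classes this trains k(k-1)/2 pairwise classifiers, each on only the two classes involved, which keeps the individual binary problems small.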
- Python 3
- joblib
- scikit-learn
- numpy
- pandas
- matplotlib
- tqdm
- scipy