This repository contains two comprehensive SVM implementations using Python and NumPy:
- Kernel SVM (from scratch) using a polynomial kernel, hinge loss, and dual optimization.
- Soft Margin SVM using Quadratic Programming (cvxopt) for non-linearly separable data.
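To make the kernel idea concrete, here is a minimal sketch of a polynomial kernel and the Gram matrix it induces (the function name, degree, and `coef0` values are illustrative, not taken from the repository code):

```python
import numpy as np

def polynomial_kernel(x, y, degree=3, coef0=1.0):
    """Polynomial kernel K(x, y) = (x . y + coef0) ** degree."""
    return (np.dot(x, y) + coef0) ** degree

# Gram matrix on a tiny toy set
X = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[polynomial_kernel(a, b) for b in X] for a in X])
print(K)  # diagonal entries (1 + 1)^3 = 8, off-diagonal (0 + 1)^3 = 1
```

In the dual formulation, every inner product between samples is replaced by such a kernel evaluation, which is what lets a linear optimizer fit a non-linear boundary.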
This project implements a Soft Margin Support Vector Machine (SVM) using the Lagrange Dual formulation, solved through Quadratic Programming (QP) with the cvxopt library. It is built entirely from scratch, showcasing how optimization and machine learning theory come together.
In real-world datasets, classes are often not linearly separable. A Soft Margin SVM tolerates some misclassification, with the trade-off controlled by a penalty parameter C, and is formulated using Lagrange multipliers (α).
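For reference, the standard soft-margin dual problem that the QP solver handles can be written as:

```latex
\max_{\alpha} \; \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
    \alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
\quad \text{subject to} \quad
0 \le \alpha_i \le C, \qquad \sum_{i=1}^{n} \alpha_i y_i = 0
```

The box constraint `0 ≤ α_i ≤ C` is exactly where the penalty parameter C enters: it caps how much influence any single (possibly mislabeled) point can have.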
This implementation does the following:
- Generates a noisy dataset (not perfectly separable)
- Formulates the SVM dual optimization using Lagrange multipliers
- Solves the dual with `cvxopt.solvers.qp()`
- Visualizes the decision boundary and margins