Support Vector Machines (SVMs) are powerful supervised learning algorithms used for both classification and regression. SVMs find the optimal decision boundary (hyperplane) that maximises the margin between classes, making them particularly effective for high-dimensional data and problems with clear class separation.
Imagine you have two classes of data points on a 2D plane. There are infinitely many lines that could separate them. An SVM finds the line that maximises the margin — the distance between the decision boundary and the nearest data points from each class.
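The maximum-margin idea can be written down directly. For a hyperplane defined by $w \cdot x + b = 0$, the margin is $2 / \lVert w \rVert$, so maximising the margin is equivalent to the standard hard-margin optimisation problem:

```math
\min_{w,\, b} \; \tfrac{1}{2}\lVert w \rVert^2 \quad \text{subject to} \quad y_i \,(w \cdot x_i + b) \ge 1 \;\; \text{for all } i
```

where each label $y_i \in \{-1, +1\}$. Minimising $\lVert w \rVert$ makes the margin $2 / \lVert w \rVert$ as large as possible, while the constraints keep every training point on the correct side of the margin.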
| Concept | Description |
|---|---|
| Hyperplane | The decision boundary that separates the classes (a line in 2D, a plane in 3D, a hyperplane in higher dimensions) |
| Margin | The distance between the hyperplane and the nearest data points from each class |
| Support Vectors | The data points closest to the hyperplane — they "support" and define the margin |
| Maximum Margin | The objective SVM optimises: choosing, among all separating hyperplanes, the one whose margin is largest |
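The concepts in the table can be seen in a few lines of code. This is a minimal sketch using scikit-learn's `SVC` (an assumption — the lesson doesn't specify a library); the toy points and the large `C` value are illustrative choices, with `C` set high to approximate a hard margin on cleanly separated data.

```python
import numpy as np
from sklearn.svm import SVC

# Two clearly separated classes on a 2D plane (illustrative toy data)
X = np.array([[1, 2], [2, 3], [2, 1],       # class 0 cluster
              [6, 5], [7, 7], [8, 6]],      # class 1 cluster
             dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

# kernel="linear" searches for the maximum-margin hyperplane directly;
# a very large C approximates the hard-margin formulation
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

# The support vectors are the points nearest the hyperplane —
# they alone define the margin
print("support vectors:\n", clf.support_vectors_)
print("prediction for [3, 2]:", clf.predict([[3.0, 2.0]])[0])
```

Note that `clf.support_vectors_` typically contains only a small subset of the training points: removing any non-support vector would leave the decision boundary unchanged, which is what makes SVMs compact at prediction time.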